
WO2025146929A1 - Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level - Google Patents

Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level

Info

Publication number
WO2025146929A1
WO2025146929A1 (PCT/KR2024/017848)
Authority
WO
WIPO (PCT)
Prior art keywords
illuminance
image
display
region
electronic device
Prior art date
Legal status
Pending
Application number
PCT/KR2024/017848
Other languages
English (en)
Korean (ko)
Inventor
이민우
김웅규
이서영
김동휘
김광태
염동현
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020240007319A (KR20250106155A)
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2025146929A1

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/10 File systems; File servers
    • G06F 16/14 Details of searching files based on file metadata
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters

Definitions

  • the following descriptions relate to electronic devices, methods, and non-transitory computer-readable storage media for controlling brightness levels.
  • An electronic device may include a display.
  • the display may be used to display an image.
  • the electronic device may support a function for controlling a brightness level of the display based on changes in the illuminance around the electronic device.
  • the electronic device may include a memory, including one or more storage media, storing instructions.
  • the electronic device may include an ambient light sensor.
  • the electronic device may include a display.
  • the electronic device may include at least one processor, including one or more processing circuits.
  • the instructions, when individually or collectively executed by the at least one processor, may cause the electronic device to detect an event for displaying a first image.
  • the instructions, when individually or collectively executed by the at least one processor, may cause the electronic device to obtain, based on the event, a second image, the second image including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region.
  • the instructions, when individually or collectively executed by the at least one processor, may cause the electronic device to control a brightness level of the display that performs display of the second image provided in response to the event by setting a value for compensating for the fourth region being darker than the second region to a first value.
  • the instructions, when individually or collectively executed by the at least one processor, may cause the electronic device to control the brightness level of the display by setting the value to a second value lower than the first value based on recognizing, through the ambient light sensor, that the illuminance surrounding the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
  • a method is described.
  • the method can be executed in an electronic device having an illuminance sensor and a display.
  • the method can include an operation of detecting an event for displaying a first image.
  • the method can include an operation of acquiring a second image, based on the event, including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region.
  • the method can include an operation of controlling a brightness level of the display, which executes display of the second image provided in response to the event, by setting a value for compensating for the fourth region being darker than the second region to a first value.
  • the method can include an operation of controlling the brightness level of the display by setting the value to a second value lower than the first value based on recognizing through the illuminance sensor that the illuminance around the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
  • the non-transitory computer-readable storage medium may store one or more programs.
  • the one or more programs may include instructions that, when executed by an electronic device having a light sensor and a display, cause the electronic device to detect an event for displaying a first image.
  • the one or more programs may include instructions that, when executed by the electronic device, cause the electronic device to obtain, based on the event, a second image including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region.
  • the one or more programs may include instructions that, when executed by the electronic device, cause the electronic device to control a brightness level of the display to perform display of the second image provided in response to the event by setting a value for compensating for the fourth region being darker than the second region to a first value.
  • the one or more programs may include instructions that, when executed by the electronic device, cause the electronic device to control the brightness level of the display by setting the value to a second value lower than the first value based on recognizing through the light sensor that the illuminance around the electronic device has changed from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
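  • As a purely illustrative aid (not part of the disclosed subject matter), the following Kotlin sketch models the control flow recited above; the names, units, and numeric values are hypothetical. A compensation value starts at a first value when the tone-expanded second image is displayed and drops to a lower second value once the ambient illuminance rises above the threshold illuminance.

```kotlin
// Minimal, illustrative sketch only; names, units, and values are hypothetical.

data class BrightnessPolicy(
    val thresholdLux: Float,  // threshold illuminance
    val firstValue: Float,    // compensation value used while illuminance stays below the threshold
    val secondValue: Float    // lower value applied once illuminance rises above the threshold
)

class BrightnessController(private val policy: BrightnessPolicy) {
    var compensation: Float = policy.firstValue
        private set

    // Called when the event for displaying the first image is detected and the
    // second (tone-expanded) image starts being displayed: use the first value.
    fun onSecondImageDisplayed() {
        compensation = policy.firstValue
    }

    // Called for each ambient-illuminance sample while the second image is displayed.
    fun onIlluminanceChanged(previousLux: Float, currentLux: Float) {
        val crossedThreshold = previousLux < policy.thresholdLux && currentLux >= policy.thresholdLux
        if (crossedThreshold) {
            compensation = policy.secondValue  // illuminance rose above the threshold
        }
    }
}

fun main() {
    val controller = BrightnessController(BrightnessPolicy(thresholdLux = 5000f, firstValue = 1.5f, secondValue = 1.1f))
    controller.onSecondImageDisplayed()
    println(controller.compensation)            // 1.5 -> first value
    controller.onIlluminanceChanged(2000f, 8000f)
    println(controller.compensation)            // 1.1 -> second value
}
```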
  • Figure 1 illustrates an example of controlling the brightness level of a display of an electronic device depending on the illuminance surrounding the electronic device.
  • Figure 2 is a simplified block diagram of an exemplary electronic device.
  • FIG. 3 is a flowchart illustrating an exemplary method for controlling the brightness level of a display that displays a second image generated from a first image according to changes in illuminance.
  • Figure 4 shows an example of an event.
  • Figure 5 illustrates an exemplary method for generating a second image by applying map information to a first image.
  • FIG. 6 illustrates an exemplary method for controlling the brightness level of a display displaying a second image depending on changes in illuminance.
  • FIG. 7 is a flowchart illustrating an exemplary method for controlling the brightness level of a display by stopping display of a second image and executing display of a first image.
  • FIG. 8 is a block diagram of an electronic device within a network environment according to various embodiments.
  • FIG. 9 is a block diagram of a display module according to various embodiments.
  • a touch input (410) may cause a change from a state (400) to a state (450).
  • at least one processor (210) may change the state (400) to a state (450) based on (or in response to) the touch input (410).
  • at least one processor (210) may display, on the display (120), a second image (402) generated from the first image within the state (450).
  • At least one processor (210) may generate the second image (402) by applying the map information (510) to the first image (511).
  • a third area (513) of the second image (402) may be brighter than a first area (501) of the first image (511).
  • a maximum value of the second grayscale range may be higher than a maximum value of the first grayscale range.
  • a fourth area (514) of the second image (402) may be darker than a second area (502) of the first image (511).
  • At least one processor (210) may control the brightness level of the display (120) using the first value so that the brightness of the fourth area (514) of the second image (402) displayed on the display (120) corresponds to the brightness of the second area (502) of the first image (511) based on the event.
  • when the illuminance surrounding the electronic device (100) before (or immediately before) detecting the event is the first illuminance, the brightness level of the display (120) may be set to a first brightness level corresponding to the first illuminance (e.g., the first brightness level exemplified in the description of FIG. 3) according to the function.
  • the brightness level of the display (120) displaying the second image (402) may be set to a second brightness level higher than the first brightness level.
  • the first value may be included in the metadata within the file including the first image (511).
  • the metadata may indicate that the value is the first value.
  • at least one processor (210) may obtain the first value from the metadata based on the event, and control the brightness level of the display (120) to perform display of the second image provided in response to the event by using the first value (or by setting the value to the first value).
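  • For illustration only, the following Kotlin sketch assumes a hypothetical representation of the first image (a per-pixel luminance plane) and hypothetical metadata fields (a per-pixel gain map and a compensation value) to show how applying map information can yield a second image whose third region is brighter, and whose fourth region is darker, than the corresponding regions of the first image.

```kotlin
// Illustrative sketch only; the image representation and metadata fields are hypothetical.

data class ImagePlane(val width: Int, val height: Int, val luma: FloatArray)

data class FileMetadata(
    val gainMap: FloatArray,       // "map information": per-pixel gain (hypothetical layout)
    val compensationValue: Float   // the "first value" carried alongside the first image (hypothetical)
)

// Applying the map information: gains above 1 brighten pixels (third region),
// gains below 1 darken pixels (fourth region), yielding the second image.
fun applyMapInformation(first: ImagePlane, meta: FileMetadata): ImagePlane {
    require(meta.gainMap.size == first.luma.size) { "gain map must match image size" }
    val expanded = FloatArray(first.luma.size) { i ->
        (first.luma[i] * meta.gainMap[i]).coerceIn(0f, 4f)  // allow an expanded tone range
    }
    return ImagePlane(first.width, first.height, expanded)
}

fun main() {
    val first = ImagePlane(width = 2, height = 1, luma = floatArrayOf(0.6f, 0.3f))  // first / second regions
    val meta = FileMetadata(gainMap = floatArrayOf(2.0f, 0.5f), compensationValue = 1.5f)
    val second = applyMapInformation(first, meta)
    println(second.luma.toList())    // [1.2, 0.15]: third region brighter, fourth region darker
    println(meta.compensationValue)  // first value used to set the display brightness level
}
```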
  • At operation 304, at least one processor (210) may determine, verify, identify, or monitor, based on recognizing the change in the illuminance, whether the illuminance is less than a threshold illuminance.
  • operation 304 may include, for example, comparing the illuminance represented by data acquired via the illuminance sensor (230) to the threshold illuminance.
  • operation 304 may be executed under a condition that the function is activated within the electronic device (100) and the second image is displayed on the display (120) in response to an event with respect to the first image.
  • At least one processor (210) may execute operation 305 based on recognizing, through the light sensor (230), that the illuminance changes from the first illuminance lower than the threshold illuminance to the second illuminance higher than the threshold illuminance while the second image is displayed according to operation 303, and execute operation 306 based on recognizing, through the light sensor (230), that the illuminance changes from the first illuminance lower than the threshold illuminance to the third illuminance higher than the first illuminance and lower than the threshold illuminance while the second image is displayed according to operation 303.
  • FIG. 6 illustrates an exemplary method for controlling the brightness level of a display displaying a second image depending on changes in illuminance.
  • a chart (600) represents a change in brightness of a display (120) (a display (120) displaying the second image (e.g., the second image (402))) that changes according to a change in illuminance.
  • the horizontal axis of the chart (600) represents the illuminance, and the vertical axis of the chart (600) represents the brightness.
  • operation 305 executed by at least one processor (210) may be represented as a line (601) within an illuminance range (621) (e.g., including the second illuminance) and a line (602) within the illuminance range (621).
  • a minimum illuminance A of the illuminance range (621) represents the threshold illuminance.
  • the difference (or ratio) between the first luminance represented by the line (601) within the illuminance range (621) and the second luminance represented by the line (602) within the illuminance range (621) corresponds to the second value being lower than the first value.
  • At least one processor (210) can control the brightness level of the display (120) by acquiring the second image including the fourth region having the second brightness higher than the first brightness of the fourth region of the second image acquired while the illuminance was the first illuminance, based on recognizing that the illuminance changes from the first illuminance to the second illuminance while the second image is displayed.
  • At least one processor (210) can control the brightness level of the display (120) by setting the value to the second value based on recognizing that the illuminance changes from the first illuminance to the second illuminance while the second image is displayed, upon acquiring the second image including the fourth region having the second brightness higher than the first brightness of the fourth region of the second image acquired while the illuminance is at the first illuminance.
  • At least one processor (210) can control the brightness level of the display (120) based on recognizing that the illuminance changes from the first illuminance to the second illuminance while the second image is displayed, upon acquiring the second image including the third region having the fourth brightness corresponding to (or substantially the same as) the third brightness of the third region of the second image acquired while the illuminance is at the first illuminance and the fourth region having the second brightness.
  • At least one processor (210) can control the brightness level of the display (120) by setting the value to the second value based on recognizing that the illuminance changes from the first illuminance to the second illuminance while the second image is displayed, upon obtaining the second image including the third region having the fourth brightness and the fourth region having the second brightness.
  • the electronic device (100) may support another function for controlling a contrast ratio of an image displayed on a display (120) of the electronic device (100) located outdoors to enhance the readability of the image.
  • the other function may be activated (or executed) when the illuminance is B within the illuminance range (621).
  • a difference (611) (or ratio (611)) between a first luminance D1 represented by line (601) when the illuminance is B and a second luminance D3 represented by line (602) when the illuminance is B may be set to a value for reducing occurrence of side effects (e.g., false contours) in the second image displayed on the display (120) according to the other function.
  • At operation 306 at least one processor (210) can control the brightness level of the display (120) by maintaining the value at the first value under a condition where the illuminance changes from the first illuminance to the third illuminance that is higher than the first illuminance and lower than the threshold illuminance.
  • This operation is exemplified in the description of FIG. 6.
  • the operation 306 executed by at least one processor (210) may be represented as a line (601) within an illuminance range (622) (e.g., including the first illuminance and the third illuminance) and a line (602) within the illuminance range (622).
  • a difference (or ratio) between the first luminance represented by the line (601) within the illuminance range (622) and the second luminance represented by the line (602) within the illuminance range (622) corresponds to the first value.
  • At least one processor (210) can control the brightness level of the display (120) displaying the second image by maintaining the value at the first value independently of an increase in the illuminance recognized by the illuminance sensor (230) within the illuminance range (622).
  • maintaining the value at the first value indicates only that the ratio between the first luminance indicated by line (601) and the second luminance indicated by line (602) within the illuminance range (622) is maintained; it does not indicate that those two luminances are themselves fixed.
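  • The piecewise behavior of chart (600) can be sketched as follows (illustration only; the curve shape, threshold, and values are hypothetical and not taken from the disclosure): the ratio between line (601) and line (602) equals the first value throughout illuminance range (622) even as both luminances rise, and equals the lower second value within illuminance range (621).

```kotlin
import kotlin.math.ln

// Illustrative sketch of chart (600); curve shapes and numbers are hypothetical.

const val THRESHOLD_LUX = 5000f  // illuminance A, the start of range (621)
const val FIRST_VALUE = 1.5f
const val SECOND_VALUE = 1.1f

// Line (601): brightness level selected by the auto-brightness function for the current illuminance.
fun baseLuminance(lux: Float): Float = 80f + 60f * ln(1f + lux / 100f)

// Line (602): brightness level while the second image is displayed,
// i.e. line (601) scaled by the compensation value for the current illuminance range.
fun compensatedLuminance(lux: Float): Float {
    val value = if (lux < THRESHOLD_LUX) FIRST_VALUE else SECOND_VALUE
    return baseLuminance(lux) * value
}

fun main() {
    for (lux in listOf(200f, 2000f, 4999f, 5001f, 20000f)) {
        val d1 = baseLuminance(lux)
        val d3 = compensatedLuminance(lux)
        // Ratio prints 1.50 below the threshold and 1.10 above it.
        println("lux=%.0f  line601=%.1f  line602=%.1f  ratio=%.2f".format(lux, d1, d3, d3 / d1))
    }
}
```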
  • the electronic device (100) can enhance the quality of the service provided through the display (120) by executing the operations exemplified in the description of FIG. 3 according to the change in illumination that occurs while displaying the second image.
  • the electronic device (100) may be positioned in an environment having an illuminance that is (much) higher than the second illuminance (e.g., the fourth illuminance within the description of FIG. 7).
  • the electronic device (100) may support operation according to another threshold illuminance that is higher than the threshold illuminance exemplified above for display within the environment. Such operation is exemplified within the description of FIG. 7.
  • FIG. 7 is a flowchart illustrating an exemplary method for controlling the brightness level of a display by stopping display of a second image and executing display of a first image.
  • At operation 701 at least one processor (210) may determine, verify, identify, or monitor whether the illuminance higher than the threshold illuminance is lower than the other threshold illuminance based on recognizing a change in the illuminance via the illuminance sensor (230).
  • operation 701 may include, for example, comparing the illuminance represented by the data acquired via the illuminance sensor (230) with the other threshold illuminance.
  • operation 701 may be executed under a condition that the function is activated within the electronic device (100) and the second image is displayed on the display (120) according to an event for the first image.
  • At least one processor (210) may execute operation 702 based on recognizing, through the light sensor (230), that the illuminance changes from the first illuminance to the second illuminance that is higher than the threshold illuminance and lower than the other threshold illuminance while the second image is displayed according to operation 303, and execute operation 703 based on recognizing, through the light sensor (230), that the illuminance changes from the first illuminance to a fourth illuminance that is higher than the other threshold illuminance while the second image is displayed according to operation 303.
  • At least one processor (210) can control the brightness level of the display (120) by setting the value to a second value lower than the first value under a condition where the illuminance changes from the first illuminance to the second illuminance.
  • operation 702 can correspond to operation 305 of FIG. 3.
  • At least one processor (210) can control the brightness level of the display (120) by stopping the display of the second image and executing the display of the first image under the condition that the illuminance changes from the first illuminance to the fourth illuminance. This operation is exemplified in the description of FIG. 6.
  • the operation 703 executed by at least one processor (210) may be represented by a line (601) that is not defined within the illuminance range (623) and a line (602) within the illuminance range (623) (e.g., including the fourth illuminance).
  • C, the minimum illuminance of the illuminance range (623), corresponds to the other threshold illuminance.
  • the at least one processor (210) may stop displaying the second image on the display (120) and display the first image on the display (120) based on recognizing the fourth illuminance through the illuminance sensor (230).
  • at least one processor (210) may increase the brightness level of the display (120), which now displays the first image in place of the second image, as the illuminance recognized by the illuminance sensor (230) increases within the illuminance range (623).
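  • For illustration only, the branching described for FIG. 3 and FIG. 7 can be summarized in the following Kotlin sketch with hypothetical threshold values: below the threshold illuminance the first value is kept, between the threshold and the other threshold the second value is used, and above the other threshold the display falls back to the first image.

```kotlin
// Illustrative sketch combining the branches of FIG. 3 and FIG. 7; thresholds are hypothetical.

sealed interface BrightnessDecision
data class KeepFirstValue(val value: Float) : BrightnessDecision   // operation 306
data class UseSecondValue(val value: Float) : BrightnessDecision   // operations 305 / 702
object ShowFirstImageInstead : BrightnessDecision {                // operation 703
    override fun toString() = "ShowFirstImageInstead"
}

fun decide(
    newLux: Float,
    thresholdLux: Float,
    otherThresholdLux: Float,
    firstValue: Float,
    secondValue: Float
): BrightnessDecision = when {
    newLux < thresholdLux -> KeepFirstValue(firstValue)        // e.g. the third illuminance
    newLux < otherThresholdLux -> UseSecondValue(secondValue)  // e.g. the second illuminance
    else -> ShowFirstImageInstead                              // e.g. the fourth illuminance
}

fun main() {
    for (lux in listOf(3000f, 10000f, 80000f)) {
        val decision = decide(lux, thresholdLux = 5000f, otherThresholdLux = 40000f, firstValue = 1.5f, secondValue = 1.1f)
        println("$lux lx -> $decision")
    }
}
```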
  • the electronic device (100) can support a first function of displaying an SDR (standard dynamic range) image so that it appears as an HDR (high dynamic range) image, and a second function of changing a brightness level of the display (120) according to a change in the illuminance around the electronic device (100).
  • the electronic device (100) can provide an enhanced service by executing the operations exemplified above under a condition in which both the first function and the second function are executed.
  • FIG. 8 is a block diagram of an electronic device (801) in a network environment (800) according to various embodiments.
  • the electronic device (801) may communicate with the electronic device (802) via a first network (898) (e.g., a short-range wireless communication network) or may communicate with at least one of the electronic device (804) or the server (808) via a second network (899) (e.g., a long-range wireless communication network).
  • the electronic device (801) may communicate with the electronic device (804) via the server (808).
  • the electronic device (801) may include a processor (820), a memory (830), an input module (850), an audio output module (855), a display module (860), an audio module (870), a sensor module (876), an interface (877), a connection terminal (878), a haptic module (879), a camera module (880), a power management module (888), a battery (889), a communication module (890), a subscriber identification module (896), or an antenna module (897).
  • the electronic device (801) may omit at least one of these components (e.g., the connection terminal (878)), or may have one or more other components added.
  • some of these components (e.g., the sensor module (876), the camera module (880), or the antenna module (897)) may be integrated into one component (e.g., the display module (860)).
  • the processor (820) may control at least one other component (e.g., a hardware or software component) of the electronic device (801) connected to the processor (820) by executing, for example, software (e.g., a program (840)), and may perform various data processing or calculations. According to one embodiment, as at least a part of the data processing or calculations, the processor (820) may store a command or data received from another component (e.g., a sensor module (876) or a communication module (890)) in the volatile memory (832), process the command or data stored in the volatile memory (832), and store result data in the nonvolatile memory (834).
  • the processor (820) may include a main processor (821) (e.g., a central processing unit or an application processor) or an auxiliary processor (823) (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently or together with the main processor (821).
  • the auxiliary processor (823) may be configured to use less power than the main processor (821) or to be specialized for a given function.
  • the auxiliary processor (823) may be implemented separately from the main processor (821) or as a part thereof.
  • the auxiliary processor (823) may control at least a portion of functions or states associated with at least one of the components of the electronic device (801) (e.g., the display module (860), the sensor module (876), or the communication module (890)), for example, on behalf of the main processor (821) while the main processor (821) is in an inactive (e.g., sleep) state, or together with the main processor (821) while the main processor (821) is in an active (e.g., application execution) state.
  • the auxiliary processor (823) may include a hardware structure specialized for processing artificial intelligence models.
  • the artificial intelligence models may be generated through machine learning. Such learning may be performed, for example, in the electronic device (801) itself on which the artificial intelligence model is executed, or may be performed through a separate server (e.g., server (808)).
  • the learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the examples described above.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • the artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-networks, or a combination of two or more of the above, but is not limited to the examples described above.
  • the artificial intelligence model may additionally or alternatively include a software structure.
  • the memory (830) can store various data used by at least one component (e.g., the processor (820) or the sensor module (876)) of the electronic device (801).
  • the data can include, for example, software (e.g., the program (840)) and input data or output data for commands related thereto.
  • the memory (830) can include a volatile memory (832) or a nonvolatile memory (834).
  • the program (840) may be stored as software in the memory (830) and may include, for example, an operating system (842), middleware (844), or an application (846).
  • the input module (850) can receive commands or data to be used for a component of the electronic device (801) (e.g., a processor (820)) from an external source (e.g., a user) of the electronic device (801).
  • the input module (850) can include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
  • the audio output module (855) can output an audio signal to the outside of the electronic device (801).
  • the audio output module (855) can include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive an incoming call. According to one embodiment, the receiver can be implemented separately from the speaker or as a part thereof.
  • the display module (860) can visually provide information to an external party (e.g., a user) of the electronic device (801).
  • the display module (860) can include, for example, a display, a holographic device, or a projector and a control circuit for controlling the device.
  • the display module (860) can include a touch sensor configured to detect a touch, or a pressure sensor configured to measure a strength of a force generated by the touch.
  • the audio module (870) can convert sound into an electrical signal, or vice versa, convert an electrical signal into sound. According to one embodiment, the audio module (870) can obtain sound through the input module (850), or output sound through an audio output module (855), or an external electronic device (e.g., an electronic device (802)) (e.g., a speaker or a headphone) directly or wirelessly connected to the electronic device (801).
  • the sensor module (876) can detect an operating state (e.g., power or temperature) of the electronic device (801) or an external environmental state (e.g., user state) and generate an electric signal or data value corresponding to the detected state.
  • the sensor module (876) can include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface (877) may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device (801) with an external electronic device (e.g., the electronic device (802)).
  • the interface (877) may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal (878) may include a connector through which the electronic device (801) may be physically connected to an external electronic device (e.g., the electronic device (802)).
  • the connection terminal (878) may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
  • the haptic module (879) can convert an electrical signal into a mechanical stimulus (e.g., vibration or movement) or an electrical stimulus that a user can perceive through a tactile or kinesthetic sense.
  • the haptic module (879) can include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module (880) can capture still images and moving images.
  • the camera module (880) can include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module (888) can manage power supplied to the electronic device (801).
  • the power management module (888) can be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery (889) can power at least one component of the electronic device (801).
  • the battery (889) can include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module (890) may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device (801) and an external electronic device (e.g., the electronic device (802), the electronic device (804), or the server (808)), and performance of communication through the established communication channel.
  • the communication module (890) may operate independently from the processor (820) (e.g., the application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
  • the communication module (890) may include a wireless communication module (892) (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module (894) (e.g., a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module among these communication modules may communicate with an external electronic device (804) via a first network (898) (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network (899) (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
  • the wireless communication module (892) may use subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module (896) to identify or authenticate the electronic device (801) within a communication network such as the first network (898) or the second network (899).
  • the wireless communication module (892) can support a 5G network and next-generation communication technology after a 4G network, for example, NR access technology (new radio access technology).
  • the NR access technology can support high-speed transmission of high-capacity data (eMBB (enhanced mobile broadband)), terminal power minimization and connection of multiple terminals (mMTC (massive machine type communications)), or high reliability and low latency (URLLC (ultra-reliable and low-latency communications)).
  • the wireless communication module (892) can support, for example, a high-frequency band (e.g., mmWave band) to achieve a high data transmission rate.
  • the wireless communication module (892) may support various technologies for securing performance in a high-frequency band, such as beamforming, massive multiple-input and multiple-output (MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna.
  • the wireless communication module (892) may support various requirements specified in the electronic device (801), an external electronic device (e.g., the electronic device (804)), or a network system (e.g., the second network (899)).
  • the wireless communication module (892) may support a peak data rate (e.g., 20 Gbps or more) for eMBB realization, a loss coverage (e.g., 164 dB or less) for mMTC realization, or a U-plane latency (e.g., 0.5 ms or less for downlink (DL) and uplink (UL) each, or 1 ms or less for round trip) for URLLC realization.
  • the antenna module (897) can transmit or receive signals or power to or from the outside (e.g., an external electronic device).
  • the antenna module (897) can include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
  • the antenna module (897) can include a plurality of antennas (e.g., an array antenna).
  • at least one antenna suitable for a communication method used in a communication network, such as the first network (898) or the second network (899) can be selected from the plurality of antennas by, for example, the communication module (890).
  • a signal or power can be transmitted or received between the communication module (890) and the external electronic device through the selected at least one antenna.
  • according to one embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as a part of the antenna module (897).
  • the antenna module (897) can form a mmWave antenna module.
  • the mmWave antenna module can include a printed circuit board, an RFIC positioned on or adjacent a first side (e.g., a bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) positioned on or adjacent a second side (e.g., a top side or a side) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • at least some of the above-described components may be connected to each other through a communication scheme between peripheral devices (e.g., a bus, a general-purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device (801) and an external electronic device (804) via a server (808) connected to a second network (899).
  • Each of the external electronic devices (802, 804) may be a device of the same type as, or a different type from, the electronic device (801).
  • all or part of the operations executed in the electronic device (801) may be executed in one or more of the external electronic devices (802, 804, or 808). For example, when the electronic device (801) is to perform a certain function or service automatically or in response to a request from a user or another device, the electronic device (801) may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that receive the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device (801).
  • the electronic device (801) may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device (801) may provide an ultra-low latency service by using, for example, distributed computing or mobile edge computing.
  • the external electronic device (804) may include an IoT (Internet of Things) device.
  • the server (808) may be an intelligent server using machine learning and/or a neural network.
  • the external electronic device (804) or the server (808) may be included in the second network (899).
  • the electronic device (801) can be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • FIG. 9 is a block diagram (900) of a display module (860) according to various embodiments.
  • the display module (860) may include a display (910) and a display driver IC (DDI) (930) for controlling the display (910).
  • the DDI (930) may include an interface module (931), a memory (933) (e.g., a buffer memory), an image processing module (935), or a mapping module (937).
  • the DDI (930) may receive image information including, for example, image data or an image control signal corresponding to a command for controlling the image data, from another component of the electronic device (801) through the interface module (931).
  • image information may be received, for example, from the processor (820) (e.g., the main processor (821) (e.g., an application processor)) or from an auxiliary processor (823) (e.g., a graphics processing unit) that operates independently of the function of the main processor (821).
  • the DDI (930) may communicate with a touch circuit (950) or a sensor module (876) through the interface module (931).
  • the DDI (930) may store at least some of the received image information in the memory (933), for example, in units of frames.
  • the image processing module (935) may perform preprocessing or postprocessing (e.g., resolution, brightness, or size adjustment) on at least some of the image data based on at least the characteristics of the image data or the characteristics of the display (910), for example.
  • the mapping module (937) may generate a voltage value or a current value corresponding to the image data that has been preprocessed or postprocessed through the image processing module (935).
  • the generation of the voltage value or the current value may be performed, for example, at least in part based on properties of the pixels of the display (910) (e.g., the arrangement of the pixels (RGB stripe or PenTile structure), or the size of each of the sub-pixels).
  • At least some pixels of the display (910) may be driven at least in part based on, for example, the voltage value or current value, so that visual information (e.g., text, image, or icon) corresponding to the image data may be displayed through the display (910).
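  • For illustration only, the DDI data path described above can be sketched as follows, with hypothetical types, scaling, and drive voltages: frames are buffered, post-processed (here, a simple brightness adjustment), and mapped from grayscale codes to per-pixel drive values.

```kotlin
// Illustrative sketch of the DDI data path; types, scaling, and voltages are hypothetical.

class Frame(val pixels: IntArray)  // 8-bit grayscale codes, 0..255

class DisplayDriver(private val brightnessScale: Float, private val maxDriveMillivolts: Float) {
    private val frameBuffer = ArrayDeque<Frame>()

    // interface module (931) -> memory (933): buffer incoming image data frame by frame
    fun receive(frame: Frame) {
        frameBuffer.addLast(frame)
    }

    // image processing module (935): brightness adjustment as one example of post-processing
    private fun postProcess(frame: Frame) = Frame(
        IntArray(frame.pixels.size) { i -> (frame.pixels[i] * brightnessScale).toInt().coerceIn(0, 255) }
    )

    // mapping module (937): map each grayscale code to a drive voltage for the pixel
    fun nextDriveValues(): FloatArray? {
        val frame = frameBuffer.removeFirstOrNull() ?: return null
        val processed = postProcess(frame)
        return FloatArray(processed.pixels.size) { i -> processed.pixels[i] / 255f * maxDriveMillivolts }
    }
}

fun main() {
    val ddi = DisplayDriver(brightnessScale = 1.2f, maxDriveMillivolts = 4600f)
    ddi.receive(Frame(intArrayOf(0, 128, 255)))
    println(ddi.nextDriveValues()?.toList())  // approximately [0.0, 2760.0, 4600.0]
}
```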
  • the display module (860) may further include a touch circuit (950).
  • the touch circuit (950) may include a touch sensor (951) and a touch sensor IC (953) for controlling the same.
  • the touch sensor IC (953) may control the touch sensor (951) to detect, for example, a touch input or a hovering input for a specific location of the display (910).
  • the touch sensor IC (953) may detect the touch input or the hovering input by measuring a change in a signal (e.g., voltage, light amount, resistance, or charge amount) for a specific location of the display (910).
  • the touch sensor IC (953) may provide information (e.g., location, area, pressure, or time) about the detected touch input or hovering input to the processor (820).
  • at least a portion of the touch circuit (950) may be included as part of the display driver IC (930), or as part of the display (910), or as part of another component (e.g., the auxiliary processor (823)) disposed external to the display module (860).
  • the display module (860) may further include at least one sensor (e.g., a fingerprint sensor, an iris sensor, a pressure sensor, or an illuminance sensor) of the sensor module (876), or a control circuit therefor.
  • the at least one sensor or the control circuit therefor may be embedded in a part of the display module (860) (e.g., the display (910) or the DDI (930)) or a part of the touch circuit (950).
  • the biometric sensor may obtain biometric information (e.g., a fingerprint image) associated with a touch input through a part of the display (910).
  • when the sensor module (876) embedded in the display module (860) includes a pressure sensor, the pressure sensor may obtain pressure information associated with a touch input through a part or the entire area of the display (910).
  • the touch sensor (951) or sensor module (876) may be positioned between pixels of a pixel layer of the display (910), or above or below the pixel layer.
  • an electronic device may include a memory (e.g., the memory (220)), including one or more storage media, storing instructions, an illuminance sensor (e.g., the illuminance sensor (230)), a display (e.g., the display (120)), and at least one processor (e.g., the at least one processor (210)) including one or more processing circuits.
  • the at least one processor may be configured to: detect an event for displaying a first image (e.g., operation 301); acquire, based on the event, a second image including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region (e.g., operation 302); control a brightness level of the display that executes display of the second image provided in response to the event by setting a value for compensating for the fourth region being darker than the second region to a first value (e.g., operation 303); and control the brightness level of the display by setting the value to a second value lower than the first value based on recognizing, through the illuminance sensor, that the illuminance around the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed (e.g., operation 305).
  • the at least one processor may be configured to control the brightness level of the display by maintaining the value at the first value based on recognizing through the light sensor that the illuminance changes from the first illuminance to a third illuminance that is higher than the first illuminance and lower than the threshold illuminance while the second image is displayed (e.g., at operation 306).
  • the second illuminance may be lower than another threshold illuminance that is higher than the threshold illuminance.
  • the at least one processor may be configured to control the brightness level of the display by stopping the display of the second image and executing the display of the first image based on recognizing through the illuminance sensor that the illuminance changes from the first illuminance to a fourth illuminance that is higher than the other threshold illuminance while the second image is displayed (e.g., at operation 703).
  • the at least one processor may be configured to control the brightness level of the display by setting the value to the second value based on recognizing that the illuminance has changed from the first illuminance to the second illuminance, and acquiring the second image including the fourth region having a second brightness higher than the first brightness of the fourth region of the second image acquired while the illuminance was at the first illuminance (e.g., operation 305).
  • the at least one processor may be configured to control the brightness level of the display by setting the value to the second value based on recognizing that the illuminance changes from the first illuminance to the second illuminance, and acquiring the second image including the third region having the fourth brightness corresponding to the third brightness of the third region of the second image acquired while the illuminance is the first illuminance, and the fourth region having the second brightness (e.g., operation 305).
  • the first value may correspond to the difference between the third brightness of the second region and the first brightness.
  • the at least one processor may be configured to, prior to detecting the event, generate the second image, store the generated second image in the memory, and, based on the event, obtain the second image from the memory (e.g., operation 302).
  • the at least one processor may be configured to obtain the second image by generating the second image based on the event (e.g., operation 302).
  • the first value may be obtained from metadata within a file, stored within the memory, that includes the first image.
  • the at least one processor may be configured to obtain the second image by generating the second image by applying map information in the metadata for the third region having a second tone range wider than a first tone range of the first region to the first image based on the event, or obtain, from the memory, the second image generated by applying the map information for the third region for the second tone range to the first image (e.g., operation 302).
  • the map information may be applied to the first image for the third region that is visually emphasized with respect to the first region.
  • a method can be executed in an electronic device having a light sensor and a display.
  • the method can include an operation of detecting an event for displaying a first image, an operation of acquiring a second image based on the event, the second image including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region, an operation of controlling a brightness level of the display, which executes display of the second image provided in response to the event, by setting a value for compensating for the fourth region being darker than the second region to a first value, and an operation of controlling the brightness level of the display by setting the value to a second value lower than the first value based on recognizing through the light sensor that the illuminance around the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
  • the method may include controlling the brightness level of the display by maintaining the value at the first value based on recognizing through the illuminance sensor that the illuminance has changed from the first illuminance to a third illuminance that is higher than the first illuminance and lower than the threshold illuminance while the second image is displayed.
  • the second illuminance may be lower than another threshold illuminance that is higher than the threshold illuminance.
  • the method may include controlling the brightness level of the display by stopping the display of the second image and executing the display of the first image based on recognizing through the illuminance sensor that the illuminance changes from the first illuminance to a fourth illuminance that is higher than the other threshold illuminance while the second image is displayed.
  • the operation of controlling the brightness level of the display based on recognizing that the illuminance has changed from the first illuminance to the second illuminance may include the operation of controlling the brightness level of the display by setting the value to the second value according to acquiring the second image including the fourth region having a second brightness higher than the first brightness of the fourth region of the second image acquired while the illuminance was at the first illuminance, based on recognizing that the illuminance has changed from the first illuminance to the second illuminance.
  • the operation of controlling the brightness level of the display based on recognizing that the illuminance has changed from the first illuminance to the second illuminance may include the operation of controlling the brightness level of the display by setting the value to the second value according to acquiring the second image, the second image including the third region having the fourth brightness corresponding to the third brightness of the third region of the second image acquired while the illuminance was at the first illuminance, and the fourth region having the second brightness.
  • the non-transitory computer-readable storage medium as described above may store one or more programs.
  • the one or more programs may include instructions that, when executed by an electronic device having a light sensor and a display, cause the electronic device to detect an event for displaying a first image, obtain a second image based on the event, including a third region corresponding to a first region of the first image and brighter than the first region and a fourth region corresponding to a second region of the first image and darker than the second region, and set a value for compensating for the fourth region darker than the second region to a first value, thereby controlling a brightness level of the display to execute display of the second image provided in response to the event, and control the brightness level of the display by setting the value to a second value lower than the first value based on recognizing through the light sensor that the illuminance around the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
  • the one or more programs may include instructions that cause the electronic device to control the brightness level of the display by maintaining the value at the first value based on recognizing through the light sensor that the illuminance has changed from the first illuminance to a third illuminance that is higher than the first illuminance and lower than the threshold illuminance while the second image is displayed.
  • the second illuminance may be lower than another threshold illuminance that is higher than the threshold illuminance.
  • the one or more programs may include instructions that cause the electronic device to control the brightness level of the display by stopping the display of the second image and executing the display of the first image based on recognizing through the illuminance sensor that the illuminance has changed from the first illuminance to a fourth illuminance that is higher than the other threshold illuminance while the second image is displayed.
  • the one or more programs may include instructions that cause the electronic device to control the brightness level of the display by acquiring the second image including the fourth region having a second brightness higher than the first brightness of the fourth region of the second image acquired while the illuminance was at the first illuminance, based on recognizing that the illuminance has changed from the first illuminance to the second illuminance.
  • the electronic devices according to various embodiments disclosed in this document may be devices of various forms.
  • the electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliance devices.
  • the electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as "first" and "second" may be used merely to distinguish one component from another, and do not limit the components in any other respect (e.g., importance or order).
  • when a component (e.g., a first component) is referred to as being "coupled" or "connected" to another component (e.g., a second component), with or without the term "functionally" or "communicatively", it means that the component can be connected to the other component directly (e.g., by wire), wirelessly, or via a third component.
  • the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit, for example.
  • a module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
  • a module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (e.g., a program (840)) including one or more instructions stored in a storage medium (e.g., an internal memory (836) or an external memory (838)) readable by a machine (e.g., an electronic device (801)).
  • for example, a processor (e.g., the processor (820)) of the machine (e.g., the electronic device (801)) may call at least one of the one or more instructions stored in the storage medium and execute it, which enables the machine to be operated to perform at least one function according to the called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
  • ‘non-transitory’ simply means that the storage medium is a tangible device and does not contain signals (e.g. electromagnetic waves), and the term does not distinguish between cases where data is stored semi-permanently or temporarily on the storage medium.
  • the method according to various embodiments disclosed in the present document may be provided as included in a computer program product.
  • the computer program product may be traded between a seller and a buyer as a commodity.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play StoreTM) or directly between two user devices (e.g., smart phones).
  • at least a part of the computer program product may be at least temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or an intermediary server.
  • each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately arranged in other components.
  • one or more components or operations of the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • the multiple components (e.g., modules or programs) may be integrated into a single component.
  • the integrated component may perform one or more functions of each of the multiple components identically or similarly to those performed by the corresponding component of the multiple components before the integration.
  • the operations performed by the module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order, omitted, or one or more other operations may be added.
  • the devices described above may be implemented as hardware components, software components, and/or a combination of hardware components and software components.
  • the devices and components described in the embodiments may be implemented using one or more general-purpose computers or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing instructions and responding to them.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may access, store, manipulate, process, and generate data in response to the execution of the software.
  • the processing device is sometimes described as being used alone, but those skilled in the art will appreciate that the processing device may include multiple processing elements and/or multiple types of processing elements.
  • the processing device may include multiple processors, or a processor and a controller.
  • Other processing configurations, such as parallel processors, are also possible.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, which may configure a processing device to perform a desired operation or may independently or collectively command the processing device.
  • the software and/or data may be embodied in any type of machine, component, physical device, computer storage medium, or device for interpretation by the processing device or for providing instructions or data to the processing device.
  • the software may be distributed over network-connected computer systems and stored or executed in a distributed manner.
  • the software and data may be stored on one or more computer-readable recording media.
  • the method according to one embodiment may be implemented in the form of program commands that can be executed through various computer means and recorded on a computer-readable medium.
  • the medium may be one that continuously stores a program executable by a computer, or one that temporarily stores it for execution or downloading.
  • the medium may be various recording means or storage means in the form of a single or multiple hardware combinations, and is not limited to a medium directly connected to a computer system, and may be distributed on a network. Examples of the medium may include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and ROMs, RAMs, flash memories, etc., configured to store program commands.
  • examples of other media may include recording media or storage media managed by app stores that distribute applications, sites that supply or distribute various software, servers, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention relates to an electronic device that may comprise a memory, an illuminance sensor, a display, and at least one processor. Instructions, when executed individually or collectively by the at least one processor, may cause the electronic device to: detect an event for displaying a first image; acquire, based on the event, a second image comprising a third region corresponding to a first region of the first image and brighter than the first region, and a fourth region corresponding to a second region of the first image and darker than the second region; control a brightness level of the display, which executes display of the second image provided in response to the event, by setting a value for compensating for the fourth region, which is darker than the second region, to a first value; and control the brightness level of the display by setting the value to a second value lower than the first value, based on recognizing, via the illuminance sensor, that the illuminance around the electronic device changes from a first illuminance lower than a threshold illuminance to a second illuminance higher than the threshold illuminance while the second image is displayed.
PCT/KR2024/017848 2024-01-02 2024-11-12 Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level Pending WO2025146929A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20240000596 2024-01-02
KR10-2024-0000596 2024-01-02
KR1020240007319A KR20250106155A (ko) Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level
KR10-2024-0007319 2024-01-17

Publications (1)

Publication Number Publication Date
WO2025146929A1 true WO2025146929A1 (fr) 2025-07-10

Family

ID=96300514

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2024/017848 Pending WO2025146929A1 (fr) 2024-01-02 2024-11-12 Dispositif électronique, procédé et support de stockage non transitoire lisible par ordinateur pour réguler un niveau de luminosité

Country Status (1)

Country Link
WO (1) WO2025146929A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140108780A * 2013-02-28 2014-09-15 LG Display Co., Ltd. Gamma correction device and gamma correction method
KR101629825B1 * 2014-12-04 2016-06-22 Hyundai Mobis Co., Ltd. Display device and method for a vehicle using an HDR function
KR101764943B1 * 2010-02-04 2017-08-03 Microsoft Technology Licensing, LLC High dynamic range image generation and rendering
KR20200080542A * 2018-12-27 2020-07-07 LG Electronics Inc. Image display device
KR102139751B1 * 2015-04-21 2020-07-31 Samsung Electronics Co., Ltd. Display device and control method thereof

Similar Documents

Publication Publication Date Title
WO2022080614A1 Electronic device comprising a display with a variable screen size and method for compensating for degradation of the display
WO2020060218A1 Electronic device for improving the visual recognition phenomenon in a partial display region
WO2020091491A1 Electronic device for controlling the position or region of an image display according to a change in image content
WO2022030921A1 Electronic device and method for controlling its screen
WO2019143207A1 Electronic device and display for reducing leakage current
WO2023214675A1 Electronic device and method for processing touch input
WO2022255789A1 Electronic device comprising a touch screen and operating method thereof
WO2022030998A1 Electronic device comprising a display unit and operating method thereof
WO2024154920A1 Electronic device and method for changing a display state
WO2024076031A1 Electronic device comprising a display driver circuit controlling the clock frequency
WO2025146929A1 Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level
WO2022231168A1 Method and device for facial recognition by color inversion on a screen
WO2022158798A1 Method for controlling a display at multiple driving frequencies and electronic device implementing same
WO2022114648A1 Electronic device for setting a background screen and operating method of said device
WO2022005003A1 Electronic device comprising a display having a variable refresh rate, and operating method therefor
WO2022010092A1 Electronic device for supporting content sharing
WO2025164919A1 Electronic device and method for requesting an image for multi-frequency driving of a display panel, and non-transitory computer-readable storage medium
WO2024177250A1 Electronic device, method, and computer-readable storage medium for changing a display state
WO2024101684A1 Electronic device, method, and non-transitory computer-readable storage medium for changing a driving frequency
WO2025100798A1 Electronic device, method, and storage medium for controlling the brightness of a display screen
WO2025058227A1 Wearable device having a display and method therefor
WO2024071562A1 Electronic device, method, and non-transitory computer-readable storage medium for identifying a brightness level according to an active pixel ratio
WO2024072053A1 Electronic device and method for controlling memory in a display device
KR20250106155A (ko) Electronic device, method, and non-transitory computer-readable storage medium for controlling a brightness level
WO2023287057A1 Electronic device for rapidly updating a screen when input is received from a peripheral device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24915513

Country of ref document: EP

Kind code of ref document: A1