WO2025018531A1 - Wearable electronic device comprising an infrared sensing camera - Google Patents
- Publication number: WO2025018531A1 (PCT/KR2024/006029)
- Authority: WIPO (PCT)
- Prior art keywords: electronic device, camera, display panel, user, wearable electronic
- Prior art date
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G02B27/01 — Head-up displays
- G02B27/28 — Optical systems or apparatus for polarising, not provided for by groups G02B1/00-G02B26/00 or G02B30/00
- H04N13/332 — Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/383 — Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
- H04N23/20 — Cameras or camera modules comprising electronic image sensors, for generating image signals from infrared radiation only
- H04N23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55 — Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/56 — Cameras or camera modules provided with illuminating means
- H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Definitions
- Various embodiments of this document relate to a wearable electronic device including an infrared sensing camera.
- Wearable electronic devices include, for example, virtual reality (VR) devices, augmented reality (AR) devices, and/or mixed reality (MR) devices.
- Wearable electronic devices can be configured to be worn on a user's head.
- wearable electronic devices can take the form of glasses or goggles.
- a wearable electronic device may include a housing configured to be worn on a user's head, a display panel positioned in the housing to generate visible light in at least a first direction, a lens assembly positioned on a side of the display panel in the first direction, at least one infrared light source positioned on the display panel to irradiate infrared light toward an eye of the user, and a camera positioned on a side of the display panel in a second direction opposite to the first direction to sense infrared light reflected from the eye of the user.
- the camera may be positioned outside an effective field of view of the display panel.
- a display assembly configured to be applied to a wearable electronic device may include a display panel emitting visible light in at least a first direction, a lens assembly positioned on a side of the display panel in the first direction, at least one infrared light source positioned on the display panel to irradiate infrared light toward an eye of a user, and a camera positioned on a side of the display panel in a second direction opposite to the first direction to sense infrared light reflected from the eye of the user.
- the camera may be positioned outside an effective viewing area of the display panel.
- a wearable electronic device may include a housing configured to be worn on a user's head, a display panel positioned in the housing to generate visible light in at least a first direction, a lens assembly positioned on a side of the display panel in the first direction, a polarizing plate positioned on the side of the lens assembly in the first direction, at least one infrared light source positioned on the display panel to irradiate infrared light toward an eye of the user, and a camera positioned on a second direction side of the display panel opposite to the first direction to sense infrared light reflected from the eye of the user.
- the camera may be positioned on the second direction side of the display panel outside an effective field of view of the display panel using an under display camera (UDC) structure.
- the lens assembly may be configured to provide a first infrared path through which infrared light irradiated from the infrared light source reaches the user's eye and a second infrared path through which infrared light reflected from the user's eye reaches the camera.
- the polarizing plate may form the first infrared path or the second infrared path.
- the performance of the camera can be improved and user convenience can be enhanced by positioning the camera on the back of the display panel so that the camera is positioned in the direction of the user's gaze.
- FIG. 1 is a block diagram of an electronic device within a network environment according to one embodiment.
- FIG. 2A is a diagram showing the front side of a wearable electronic device according to one embodiment.
- FIG. 2B is a diagram showing the rear side of a wearable electronic device according to one embodiment.
- FIG. 3A is a schematic cross-sectional view of a display assembly according to one embodiment, illustrating a visible light path.
- FIG. 3B is a schematic cross-sectional view of a display assembly according to one embodiment, showing an infrared path.
- FIG. 4A is a front view illustrating a state in which an infrared light source is arranged on a display panel according to one embodiment.
- FIG. 4B is a front view illustrating a state in which an infrared light source is arranged on a display panel according to one embodiment.
- FIG. 5A is a front view illustrating a state in which a camera is arranged on a display panel according to one embodiment.
- FIG. 5B is a front view illustrating a state in which a camera is arranged on a display panel according to one embodiment.
- FIG. 6 is a cross-sectional view of a display panel and a camera according to one embodiment.
- FIG. 7 is a schematic cross-sectional view of a display assembly according to one embodiment, showing an infrared path.
- FIG. 1 is a block diagram of an electronic device (101) in a network environment (100) according to an embodiment.
- the electronic device (101) may communicate with the electronic device (102) via a first network (198) (e.g., a short-range wireless communication network) or may communicate with at least one of the electronic device (104) or the server (108) via a second network (199) (e.g., a long-range wireless communication network).
- the electronic device (101) may communicate with the electronic device (104) via the server (108).
- the electronic device (101) may include a processor (120), a memory (130), an input module (150), an audio output module (155), a display module (160), an audio module (170), a sensor module (176), an interface (177), a connection terminal (178), a haptic module (179), a camera module (180), a power management module (188), a battery (189), a communication module (190), a subscriber identification module (196), or an antenna module (197).
- in one embodiment, at least one of these components (e.g., the connection terminal (178)) may be omitted from the electronic device (101), or one or more other components may be added. In one embodiment, some of these components (e.g., the sensor module (176), the camera module (180), or the antenna module (197)) may be integrated into one component (e.g., the display module (160)).
- the processor (120) may include a main processor (121) (e.g., a central processing unit or an application processor) or an auxiliary processor (123) (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor.
- the auxiliary processor (123) may be configured to use lower power than the main processor (121) or to be specialized for a given function.
- the auxiliary processor (123) may be implemented separately from the main processor (121) or as a part thereof.
- the auxiliary processor (123) may control at least a part of functions or states associated with at least one of the components of the electronic device (101) (e.g., the display module (160), the sensor module (176), or the communication module (190)), for example, while the main processor (121) is in an inactive (e.g., sleep) state, or together with the main processor (121) while the main processor (121) is in an active (e.g., application execution) state.
- the auxiliary processor (123) (e.g., an image signal processor or a communication processor) may be implemented as a part of another component (e.g., the camera module (180) or the communication module (190)) that is functionally related thereto.
- the auxiliary processor (123) may include a hardware structure specialized for processing an artificial intelligence model.
- the artificial intelligence model may be generated through machine learning. Such learning may be performed, for example, in the electronic device (101) itself on which the artificial intelligence model is executed, or may be performed through a separate server (e.g., server (108)).
- the learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the examples described above.
- the artificial intelligence model may include a plurality of artificial neural network layers.
- the artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), deep Q-networks, or a combination of two or more of the above, but is not limited to the examples described above.
- the artificial intelligence model may additionally or alternatively include a software structure.
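- As an illustrative aside (not part of the patent text), the "plurality of artificial neural network layers" mentioned above can be pictured as a minimal feed-forward stack; the sketch below is a generic example, and all names and layer sizes in it are assumptions.

```python
# Hedged sketch: a tiny feed-forward network with a plurality of layers,
# illustrating the kind of artificial intelligence model structure
# described above. Layer sizes are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def relu(x: np.ndarray) -> np.ndarray:
    return np.maximum(x, 0.0)

# Two weight matrices = two artificial neural network layers.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x: np.ndarray) -> np.ndarray:
    """Forward pass through the layer stack (a DNN in miniature)."""
    return relu(x @ W1 + b1) @ W2 + b2

print(forward(rng.normal(size=4)))  # two output values
```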
- the memory (130) can store various data used by at least one component (e.g., processor (120) or sensor module (176)) of the electronic device (101).
- the data can include, for example, software (e.g., program (140)) and input data or output data for commands related thereto.
- the memory (130) can include volatile memory (132) or nonvolatile memory (134).
- the program (140) may be stored as software in memory (130) and may include, for example, an operating system (142), middleware (144), or an application (146).
- the input module (150) can receive commands or data to be used in a component of the electronic device (101) (e.g., a processor (120)) from an external source (e.g., a user) of the electronic device (101).
- the input module (150) can include, for example, a microphone, a mouse, a keyboard, a key (e.g., a button), or a digital pen (e.g., a stylus pen).
- the audio output module (155) can output an audio signal to the outside of the electronic device (101).
- the audio output module (155) can include, for example, a speaker or a receiver.
- the speaker can be used for general purposes such as multimedia playback or recording playback.
- the receiver can be used to receive an incoming call. According to one embodiment, the receiver can be implemented separately from the speaker or as a part thereof.
- the audio module (170) can convert sound into an electrical signal, or vice versa, convert an electrical signal into sound. According to one embodiment, the audio module (170) can obtain sound through an input module (150), or output sound through an audio output module (155), or an external electronic device (e.g., an electronic device (102)) (e.g., a speaker or a headphone) directly or wirelessly connected to the electronic device (101).
- the sensor module (176) can detect an operating state (e.g., power or temperature) of the electronic device (101) or an external environmental state (e.g., user state) and generate an electric signal or data value corresponding to the detected state.
- the sensor module (176) can include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface (177) may support one or more designated protocols that may be used to directly or wirelessly connect the electronic device (101) with an external electronic device (e.g., the electronic device (102)).
- the interface (177) may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- connection terminal (178) may include a connector through which the electronic device (101) may be physically connected to an external electronic device (e.g., the electronic device (102)).
- the connection terminal (178) may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
- the haptic module (179) can convert an electrical signal into a mechanical stimulus (e.g., vibration or movement) or an electrical stimulus that a user can perceive through a tactile or kinesthetic sense.
- the haptic module (179) can include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module (180) can capture still images and moving images.
- the camera module (180) can include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module (188) can manage power supplied to the electronic device (101).
- the power management module (188) can be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
- the battery (189) can power at least one component of the electronic device (101).
- the battery (189) can include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
- the communication module (190) may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device (101) and an external electronic device (e.g., the electronic device (102), the electronic device (104), or the server (108)), and performance of communication through the established communication channel.
- the communication module (190) may operate independently from the processor (120) (e.g., the application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
- the communication module (190) may include a wireless communication module (192) (e.g., a cellular communication module, a short-range wireless communication module, or a GNSS (global navigation satellite system) communication module) or a wired communication module (194) (e.g., a local area network (LAN) communication module, or a power line communication module).
- a corresponding communication module may communicate with an external electronic device (104) via a first network (198) (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network (199) (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
- the wireless communication module (192) may use subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module (196) to identify or authenticate the electronic device (101) within a communication network such as the first network (198) or the second network (199).
- the wireless communication module (192) can support a 5G network and next-generation communication technology after a 4G network, for example, NR access technology (new radio access technology).
- the NR access technology can support high-speed transmission of high-capacity data (eMBB (enhanced mobile broadband)), terminal power minimization and connection of multiple terminals (mMTC (massive machine type communications)), or high reliability and low latency (URLLC (ultra-reliable and low-latency communications)).
- the wireless communication module (192) can support, for example, a high-frequency band (e.g., mmWave band) to achieve a high data transmission rate.
- the wireless communication module (192) may support various technologies for securing performance in a high-frequency band, such as beamforming, massive multiple-input and multiple-output (MIMO), full dimensional MIMO (FD-MIMO), array antenna, analog beam-forming, or large scale antenna.
- the wireless communication module (192) may support various requirements specified in an electronic device (101), an external electronic device (e.g., an electronic device (104)), or a network system (e.g., a second network (199)).
- the wireless communication module (192) may support a peak data rate (e.g., 20 Gbps or more) for eMBB realization, a loss coverage (e.g., 164 dB or less) for mMTC realization, or a U-plane latency (e.g., 0.5 ms or less for downlink (DL) and uplink (UL) each, or 1 ms or less for round trip) for URLLC realization.
- the antenna module (197) can transmit or receive signals or power to or from the outside (e.g., an external electronic device).
- the antenna module (197) may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (e.g., a PCB).
- the antenna module (197) may include a plurality of antennas (e.g., an array antenna).
- at least one antenna suitable for a communication method used in a communication network, such as the first network (198) or the second network (199) may be selected from the plurality of antennas by, for example, the communication module (190).
- a signal or power may be transmitted or received between the communication module (190) and the external electronic device through the selected at least one antenna.
- according to one embodiment, another component (e.g., a radio frequency integrated circuit (RFIC)) other than the radiator may be additionally formed as a part of the antenna module (197).
- the antenna module (197) can form a mmWave antenna module.
- the mmWave antenna module can include a printed circuit board, an RFIC disposed on or adjacent a first side (e.g., a bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., a mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent a second side (e.g., a top side or a side) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
- at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, a general purpose input and output (GPIO), a serial peripheral interface (SPI), or a mobile industry processor interface (MIPI)) and exchange signals (e.g., commands or data) with each other.
- a command or data may be transmitted or received between the electronic device (101) and an external electronic device (104) via a server (108) connected to a second network (199).
- Each of the external electronic devices (102 or 104) may be the same or a different type of device as the electronic device (101).
- all or part of the operations executed in the electronic device (101) may be executed in one or more of the external electronic devices (102, 104, or 108). For example, when the electronic device (101) is to perform a certain function or service automatically or in response to a request from a user or another device, the electronic device (101) may, instead of executing the function or service by itself or in addition, request one or more external electronic devices to perform at least a part of the function or service.
- One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device (101).
- the electronic device (101) may provide the result, as is or additionally processed, as at least a part of a response to the request.
- cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
- the electronic device (101) may provide an ultra-low latency service by using distributed computing or mobile edge computing, for example.
- the external electronic device (104) may include an IoT (Internet of Things) device.
- the server (108) may be an intelligent server using machine learning and/or a neural network.
- the external electronic device (104) or the server (108) may be included in the second network (199).
- the electronic device (101) can be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
- An electronic device may be a device of various forms.
- the electronic device may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
- first, second, or first or second may be used merely to distinguish one component from another, and do not limit the components in any other respect (e.g., importance or order).
- when a component (e.g., a first component) is referred to as being "coupled" or "connected" to another component (e.g., a second component), with or without the term "functionally" or "communicatively", it means that the component may be connected to the other component directly (e.g., by wire), wirelessly, or via a third component.
- module used in one embodiment of this document may include a unit implemented in hardware, software or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit, for example.
- a module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
- a module may be implemented in the form of an application-specific integrated circuit (ASIC).
- An embodiment of the present document may be implemented as software (e.g., a program (140)) including one or more instructions stored in a storage medium (e.g., an internal memory (136) or an external memory (138)) readable by a machine (e.g., an electronic device (101)).
- for example, a processor (e.g., the processor (120)) of the machine (e.g., the electronic device (101)) may call at least one of the one or more stored instructions from the storage medium and execute it, which enables the machine to be operated to perform at least one function according to the called instruction.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
- ‘non-transitory’ simply means that the storage medium is a tangible device and does not contain signals (e.g. electromagnetic waves), and the term does not distinguish between cases where data is stored semi-permanently or temporarily on the storage medium.
- the method according to one embodiment disclosed in the present document may be provided as included in a computer program product.
- the computer program product may be traded between a seller and a buyer as a commodity.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play StoreTM) or directly between two user devices (e.g., smart phones).
- at least a part of the computer program product may be at least temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or an intermediary server.
- each of the components may include a single or multiple entities, and some of the multiple entities may be separated and arranged in other components.
- one or more of the components or operations of the aforementioned components may be omitted, or one or more other components or operations may be added.
- according to one embodiment, multiple components (e.g., modules or programs) among the aforementioned components may be integrated into one component.
- the integrated component may perform one or more functions of each of the multiple components identically or similarly to those performed by the corresponding component of the multiple components before the integration.
- the operations performed by the module, program or other component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order, omitted, or one or more other operations may be added.
- FIGS. 2A and 2B are diagrams showing the front and back of a wearable electronic device (200) according to one embodiment.
- FIG. 2B may correspond to the appearance seen by the user's eyes when the device is worn.
- the electronic device (101) of FIG. 1 may include a wearable electronic device (200) that provides a service that provides an extended reality (XR) experience to a user.
- XR or XR service may be defined as a service that collectively refers to virtual reality (VR), augmented reality (AR), and/or mixed reality (MR).
- the wearable electronic device (200) may have a form factor for being worn on a user's head.
- the wearable electronic device (200) may mean a head-mounted device or a head-mounted display worn on the user's head, but may also be configured in the form of at least one of glasses, goggles, a helmet, or a hat.
- the wearable electronic device (200) may be an OST (optical see-through) type configured to allow external light to reach the user's eyes through glasses when worn, or a VST (video see-through) type configured to block external light when worn so that light emitted from a display reaches the user's eyes while external light does not.
- the wearable electronic device (200) may be worn on the user's head and provide the user with an image related to an extended reality (XR) service.
- the wearable electronic device (200) may provide XR content (hereinafter, referred to as an XR content image) that outputs at least one virtual object to be superimposed on a display area or an area determined as the user's field of view (FoV).
- the XR content may mean an image or video related to a real space acquired through a camera (e.g., a camera for taking pictures), or an image or video in which at least one virtual object is superimposed on a virtual space.
- the wearable electronic device (200) may provide XR content based on a function being performed by the wearable electronic device (200) and/or a function being performed by one or more external electronic devices (e.g., the electronic devices (102, 104, or 108) of FIG. 1).
- the wearable electronic device (200) may be at least partially controlled by an external electronic device (e.g., the electronic devices (102 or 104) of FIG. 1); at least one function may be performed under the control of the external electronic device, or may be performed independently.
- the wearable electronic device (200) may include a housing (210) in which at least some of the configurations of FIG. 1 are arranged.
- the housing (210) may be configured to be wearable on a user's head.
- the housing (210) may include a strap and/or a wearing member for being fixed on a body part of the user.
- the user may wear the wearable electronic device (200) on the head so as to face the first direction (1) of the wearable electronic device (200).
- a fourth function camera (e.g., a face recognition camera) (225, 226, 227) and/or a display assembly (300) may be disposed in a first direction (1) of the housing (210) facing the user's face.
- a first function camera (e.g., a recognition camera) (215), a second function camera (e.g., a shooting camera) (211, 212), a depth sensor (217), and/or a touch sensor (213) may be disposed in a second direction (2) of the housing (210) facing away from the user's face.
- the housing (210) may include a memory (e.g., a memory (130) of FIG. 1) and a processor (e.g., a processor (120) of FIG. 1), and may further include other configurations illustrated in FIG. 1.
- the display assembly (300) may be positioned in the first direction (1) of the wearable electronic device (200).
- the display assembly (300) may be positioned toward the user's face.
- the display assembly (300) may include a display panel (e.g., the display module (160) of FIG. 1 and/or the display panel (310) of FIG. 3A) and a lens assembly (e.g., the lens assembly (320) of FIG. 3A).
- the display assembly (300) may include a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS), an organic light emitting diode (OLED), or a micro light emitting diode (micro LED).
- the wearable electronic device (200) may include a light source that irradiates light (e.g., visible light) to a screen output area of the display assembly (300).
- the display assembly (300) can generate light (e.g., visible light) on its own. For example, when the display assembly (300) is formed of an organic light-emitting diode (OLED) or a micro LED, the wearable electronic device (200) can provide the user with good-quality XR content images even without a separate light source.
- when the display assembly (300) is implemented with an OLED or a micro LED, a separate light source is unnecessary, and thus the wearable electronic device (200) can be lightweight.
- the display assembly (300) may include a first display assembly (300a) and/or a second display assembly (300b).
- the first display assembly (300a) may be arranged to face the user's left eye in the fourth direction (4), and the second display assembly (300b) may be arranged to face the user's right eye in the third direction (3).
- the first function cameras can acquire images while the wearable electronic device (200) is worn by the user.
- the first function cameras (215) can be used for the purpose of detecting user movements or recognizing user gestures.
- the first function cameras (215) can be used for at least one of hand detection, hand tracking, recognition of user gestures (e.g., hand movements), and/or space recognition.
- the first function cameras (215) may mainly use GS (global shutter) cameras, which offer superior performance compared to RS (rolling shutter) cameras for detecting and tracking hand movements and fine finger movements, and can be configured as a stereo camera including two or more GS cameras for head tracking and space recognition.
- the first function cameras (215) can be used for 3DoF, 6DoF head tracking, location (space, environment) recognition and/or movement recognition.
- the first function camera (215) can perform a simultaneous localization and mapping (SLAM) function for recognizing information (e.g., location and/or direction) related to the surrounding space through space recognition for 6DoF and depth shooting.
- the second function cameras (211, 212) can also be used for hand detection and tracking and user gestures.
- the second function camera can obtain an image related to the surrounding environment of the wearable electronic device (200).
- the second function cameras (211, 212) can be used to capture the outside, generate a corresponding image or video, and transmit it to a processor (e.g., the processor (120) of FIG. 1).
- the processor (120) can display the image provided from the second function camera (211, 212) on the display assembly (300).
- the second function cameras (211, 212) may also be referred to as HR (high resolution) or PV (photo video) cameras and may include a high-resolution camera.
- the second function camera (211, 212) may include a color camera equipped with functions for obtaining high-quality images, such as an auto focus (AF) function and an optical image stabilizer (OIS), but is not limited thereto, and the second function camera (211, 212) may also include a GS camera or an RS camera.
- a third function camera (e.g., a gaze tracking camera) (e.g., the camera (340) of FIG. 3B) may be positioned on the display assembly (300) (or inside the housing (210)) so that the camera lens faces the user's eyes when the user wears the wearable electronic device (200).
- the third function camera (340) may be used for detecting and tracking (eye tracking, ET) the pupils and/or for recognizing the user's irises.
- the processor (120) may track the movements of the user's left and right eyes in the images received from the third function camera (340) to determine the gaze direction.
- the processor (120) may track the position of the pupil in the images so that the center of the XR content image displayed in the screen display area is positioned according to the direction in which the pupil is gazing.
- a GS camera may be used as the third function camera (340) to detect the pupil and track the movement of the pupil.
- the third function camera (340) can be installed for the left eye and the right eye respectively, and cameras with the same performance and specifications can be used.
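- As an illustrative aside (not from the patent), one simple way a processor could derive a gaze direction from the third function camera's infrared eye images is dark-pupil thresholding followed by a centroid computation. The sketch below is a rough sketch under that assumption; real eye trackers typically also use corneal glints and per-user calibration.

```python
# Hedged sketch: locate the pupil in an infrared eye image by thresholding
# dark pixels and taking their centroid; the offset of the pupil center
# from the image center then serves as a crude gaze-direction proxy.
import numpy as np

def pupil_center(ir_image: np.ndarray, threshold: int = 40):
    """Return the (x, y) centroid of pixels darker than `threshold`, or None."""
    ys, xs = np.nonzero(ir_image < threshold)  # the pupil appears dark in IR
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def gaze_offset(ir_image: np.ndarray):
    """Pupil displacement from the image center (a simple gaze estimate)."""
    center = pupil_center(ir_image)
    if center is None:
        return None
    h, w = ir_image.shape
    return center[0] - w / 2.0, center[1] - h / 2.0
```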
- the fourth function camera (e.g., a face recognition camera) may be used to detect and track (face tracking, FT) a user's facial expression when the user wears the wearable electronic device (200).
- the fourth function camera may be used to recognize the user's face, or may recognize and/or track the user's two eyes.
- the depth sensor (or depth camera) (217) may be used for the purpose of checking the distance to an object (e.g., a subject), for example by time of flight (TOF).
- the depth sensor (217) may be configured to transmit a signal (e.g., near-infrared light, ultrasound, or laser) and receive the signal reflected from a subject.
- the first camera (245a, 245b, 245c, 245d) may check the distance to the object.
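- As an illustrative aside, the time-of-flight (TOF) principle mentioned above reduces to distance = (signal speed × round-trip time) / 2. The sketch below applies it for a light-based signal; the constant would change for ultrasound.

```python
# Hedged sketch: a TOF sensor estimates distance from the round-trip travel
# time of an emitted signal: distance = speed * round_trip_time / 2.

SPEED_OF_LIGHT_M_S = 299_792_458  # for near-infrared or laser signals

def tof_distance_m(round_trip_s: float,
                   speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Distance to the subject from the measured round-trip time."""
    return speed_m_s * round_trip_s / 2

# Example: a 6.67 ns round trip corresponds to roughly 1 m.
print(tof_distance_m(6.67e-9))  # ~1.0
```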
- the touch sensor (213) may be placed in the second direction (2) of the housing (210).
- the touch sensor (213) may be implemented as a single type or a left/right separated type depending on the shape of the housing (210), but is not limited thereto.
- when the touch sensor (213) is implemented as a left/right separated type as shown in FIG. 2A and the user wears the wearable electronic device (200), the first touch sensor (213a) may be placed at the user's left eye position, such as in the fourth direction (4), and the second touch sensor (213b) may be placed at the user's right eye position, such as in the third direction (3).
- the touch sensor (213) can recognize a touch input in at least one of, for example, a capacitive, a pressure-sensitive, an infrared, or an ultrasonic manner.
- the capacitive touch sensor (213) can recognize a physical touch (or contact) input or a hovering input (or proximity) of an external object.
- the wearable electronic device (200) may utilize a proximity sensor (not shown) to recognize proximity of an external object.
- the touch sensor (213) has a two-dimensional surface and can transmit touch data (e.g., touch coordinates) of an external object (e.g., a user's finger) that comes into contact with the touch sensor (213) to a processor (e.g., the processor (120) of FIG. 1).
- the touch sensor (213) can detect a hovering input for an external object (e.g., a user's finger) that approaches within a first distance from the touch sensor (213), or detect a touch input that touches the touch sensor (213).
- the touch sensor (213) may provide two-dimensional information about the point of contact as "touch data" to the processor (120) when an external object touches the touch sensor (213).
- the touch data may be described as a "touch mode".
- when an external object is located within a first distance from the touch sensor (213) (or in proximity, hovering above the touch sensor), the touch sensor (213) may provide the processor (120) with hovering data about the time point and location of the hover.
- the hovering data may be described as a "hovering mode/proximity mode".
- the wearable electronic device (200) may obtain hovering data using at least one of a touch sensor (213), a proximity sensor (not shown), and/or a depth sensor (217) to generate information about a distance, location, or time point between the touch sensor (213) and an external object.
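- As an illustrative aside, the distinction above between touch data ("touch mode") and hovering data ("hovering mode/proximity mode") can be pictured as a simple classification of sensor readings. The names and the threshold in the sketch below are assumptions, not values from the patent.

```python
# Hedged sketch: dispatch touch-sensor readings into touch vs. hover modes
# based on whether the external object contacts the sensor surface or stays
# within a first distance of it.
from dataclasses import dataclass

HOVER_THRESHOLD_MM = 20.0  # assumed "first distance"

@dataclass
class SensorReading:
    x: float            # 2D coordinates on the sensor surface
    y: float
    distance_mm: float  # 0.0 means physical contact
    timestamp_ms: int

def classify(reading: SensorReading) -> str:
    if reading.distance_mm == 0.0:
        return "touch"  # touch data: 2D coordinates of the contact point
    if reading.distance_mm <= HOVER_THRESHOLD_MM:
        return "hover"  # hovering data: location and time of the approach
    return "none"

print(classify(SensorReading(x=12.0, y=30.5, distance_mm=0.0, timestamp_ms=100)))
```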
- the interior of the housing (210) may include components of FIG. 1, for example, a processor (e.g., processor (120) of FIG. 1) and memory (e.g., memory (130) of FIG. 1).
- the memory (130) may store various instructions that may be performed by the processor (120).
- the instructions may include arithmetic and logical operations, data movement, or control commands such as input/output that may be recognized by the processor (120).
- the memory (130) may temporarily or permanently store various data, including volatile memory (e.g., volatile memory (132) of FIG. 1) and nonvolatile memory (e.g., nonvolatile memory (134) of FIG. 1).
- the processor (120) may be operatively, functionally, and/or electrically connected to each component of the wearable electronic device (200) and may be configured to perform operations or data processing related to control and/or communication of each component. Operations performed by the processor (120) may be stored as instructions in the memory (130) that, when executed, cause the processor (120) to operate.
- among the functions the processor (120) can implement on the wearable electronic device (200), a series of operations related to the XR content service function will be described.
- the operations of the processor (120) described below can be performed by executing instructions stored in the memory (130).
- the processor (120) may generate a virtual object based on virtual information derived from image information.
- the processor (120) may output a virtual object related to an XR service together with background space information through the display assembly (300).
- the processor (120) may capture an image related to a real space corresponding to a field of view of a user wearing the wearable electronic device (200) through the second function camera (211, 212) to obtain image information or generate a virtual space for a virtual environment.
- FIG. 3A is a schematic cross-sectional view of a display assembly according to an embodiment, illustrating a visible light path.
- FIG. 3B is a schematic cross-sectional view of a display assembly according to an embodiment, illustrating an infrared path.
- FIG. 4A is a front view illustrating a state in which an infrared light source is arranged on a display panel according to an embodiment.
- FIG. 4B is a front view illustrating a state in which an infrared light source is arranged on a display panel according to an embodiment.
- FIG. 5A is a front view illustrating a state in which a camera is arranged on a display panel according to an embodiment.
- FIG. 5B is a front view illustrating a state in which a camera is arranged on a display panel according to an embodiment.
- a display assembly (300) may be applied to a wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B).
- a first direction (1) may be a direction toward a user's face (e.g., an eyeball (E))
- a second direction (2) may be a direction opposite to the first direction (1).
- the display assembly (300) may include a display panel (310), a lens assembly (320), an infrared light source (330), a camera (340), and/or a polarizing plate (not shown).
- the display panel (310) can generate light (e.g., visible light) at least in the first direction (1).
- the display panel (310) can be positioned inside a housing (e.g., the housing (210) of FIG. 2B) of a wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B).
- the display panel (310) can be fixedly connected to a second direction (2) side of the lens assembly (320). Visible light generated from the display panel (310) can pass through the lens assembly (320) and be transmitted to the user's eye (E).
- although the display panel (310) is illustrated as being rectangular in FIGS. 4A to 5B, this is an example and the shape of the display panel (310) is not limited thereto.
- the display panel (310) can be formed in a circular or arbitrary polygonal shape.
- the lens assembly (320) may be positioned on the first direction (1) side of the display panel (310).
- the lens assembly (320) may provide a path through which visible light and/or infrared light pass.
- the lens assembly (320) may be formed to transmit both visible light and infrared light.
- the lens assembly (320) may be formed to filter out regions other than a visible light region and an infrared region.
- the lens assembly (320) may be configured to substantially function as a dual wavelength band-pass filter.
- the lens assembly (320) may include a lens barrel and/or at least one lens.
- the lens assembly (320) may include at least one of a Fresnel lens, a pancake lens, a convex lens, or a multi-channel lens.
- this is exemplary, and the types included in the lens assembly (320) are not limited thereto.
- the structure and/or shape of the lens assembly (320) illustrated in FIGS. 3a and 3b are exemplary, and the structure and/or shape of the lens assembly (320) is not limited thereto.
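- As an illustrative aside, the lens assembly acting "substantially as a dual wavelength band-pass filter" means it passes the visible band and an infrared band while blocking wavelengths outside both. The band edges in the sketch below are assumed, not specified by the patent.

```python
# Hedged sketch: a dual wavelength band-pass filter transmits the visible
# band and the near-infrared band while filtering the regions between and
# beyond them. Band edges are illustrative assumptions.
VISIBLE_NM = (380.0, 700.0)    # assumed visible pass band
INFRARED_NM = (750.0, 1000.0)  # assumed near-infrared pass band

def transmits(wavelength_nm: float) -> bool:
    """True if the wavelength falls in either pass band."""
    in_visible = VISIBLE_NM[0] <= wavelength_nm <= VISIBLE_NM[1]
    in_infrared = INFRARED_NM[0] <= wavelength_nm <= INFRARED_NM[1]
    return in_visible or in_infrared

assert transmits(550.0)      # green display light reaches the eye
assert transmits(850.0)      # typical eye-tracking IR passes
assert not transmits(720.0)  # the gap between the bands is filtered
```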
- the lens assembly (320) may be configured to provide a visible light path (V) through which visible light generated from the display panel (310) reaches the user's eye (E).
- the lens assembly (320) may magnify and/or reduce content displayed on the display panel (310) and transmit it to the user's eye (E).
- the lens assembly (320) may adjust a focus so that content displayed on the display panel (310) is visible to the user.
- the visible light path (V) illustrated in FIG. 3A is simplified for convenience of explanation, and the visible light path (V) is not limited thereto.
- the infrared light source (330) can irradiate infrared light toward the user's eye (E).
- the infrared light source (330) can generate infrared light toward at least the first direction (1).
- At least one infrared light source (330) can be provided.
- the infrared light source (330) can be located at the display panel (310).
- the infrared light source (330) can be located at the outside or inside of the display panel (310).
- At least one infrared light source (330a) may be positioned in an outer edge area (e.g., a bezel area (B)) of the display panel (310).
- a plurality of infrared light sources (330a) may be positioned spaced apart at a specified interval along the bezel area (B) of the display panel (310).
- the space where the infrared light source (330a) is positioned can be saved by utilizing the bezel area (B) of the display panel (310).
- since at least one infrared light source (330a) is positioned inside the housing (e.g., the housing (210) of FIG. 2B), the appearance of the wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B) can be improved. Since the distance between the infrared light source (330a) and the user's eyes (E) can be maintained at a predetermined distance, harm to the human body can be reduced.
- At least one infrared light source (330b) may be positioned inside the display panel (310).
- at least one infrared light source (330b) may be positioned adjacent to an outer edge (e.g., a bezel area (B)) of the display panel (310).
- a plurality of infrared light sources (330b) may be positioned at a specified interval inside the display panel (310) along the outer edge of the display panel (310).
- at least one infrared light source (330b) may be configured as a part of a pixel of the display panel (310).
- the pixels of the display panel (310) may be set to irradiate infrared rays.
- the pixels set to irradiate infrared rays may be positioned outside a visible area of the display panel (310).
- the space where the infrared light source (330b) is placed can be saved by utilizing the internal area of the display panel (310). By positioning the infrared light source (330b) adjacent to the outer edge of the display panel (310), interference between the display area of the display panel (310) and the area of the infrared light source (330b) can be reduced.
- since the infrared light source (330b) is positioned inside the display panel (310), the appearance of the wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B) can be improved. Since the distance between the infrared light source (330b) and the user's eye (E) can be maintained at a certain distance, harm to the human body can be reduced.
- the positions, sizes, and/or numbers of the infrared light sources (330) illustrated in FIGS. 4A and 4B are exemplary and are not limited thereto.
- the arrangement of at least one infrared light source (330a) and at least one infrared light source (330b) is not limited to the arrangement of the embodiment illustrated in the drawings, and may be formed in a structure in which at least one infrared light source (330a) is arranged in the bezel area (B) and at least one infrared light source (330b) is arranged inside the display panel (310).
- at least one infrared light source (330a) and at least one infrared light source (330b) may be formed together in one display panel (310).
- the infrared light source (330) may be positioned on the front surface (e.g., the surface in the first direction (1)) and/or the back surface (e.g., the surface in the second direction (2)) of the display panel (310) so as to be adjacent to the outer edge of the display panel (310).
- infrared light generated from an infrared light source (330) may pass through a lens assembly (320) and be transmitted to a user's eye (E). Infrared light reaching the user's eye (E) may be reflected by the user's eye (E), and the reflected infrared light may again pass through the lens assembly (320) and be transmitted to the camera (340).
- the lens assembly (320) may be configured to provide a first infrared path (R1) through which infrared light irradiated from the infrared light source (330) reaches the user's eye (E) and a second infrared path (R2) through which infrared light reflected from the user's eye (E) reaches the camera (340).
- the second infrared path (R2) may be formed at one area (e.g., the upper area) of the lens assembly (320).
- the first infrared path (R1) and the second infrared path (R2) illustrated in FIG. 3b are simplified for convenience of explanation, and the first infrared path (R1) and the second infrared path (R2) are not limited thereto.
- the visible light path (V) and the second infrared path (R2) may not overlap each other.
- the second infrared path (R2) may be formed in one area (e.g., the upper area) of the lens assembly (320), and the visible light path (V) may be formed in the remaining area of the lens assembly (320).
- a partition guide may be formed in the lens assembly (320) to separate the visible light path (V) and the second infrared path (R2).
- one area (e.g., the upper area) of the lens assembly (320) may function as an infrared filter that passes only infrared light, and the remaining area of the lens assembly (320) may function as a visible light filter that passes only visible light.
- the camera (340) can sense an image of the user's eye (E).
- the camera (340) can sense infrared rays reflected from the user's eye (E).
- the camera (340) can sense a reflected image (glint image) of the user's eye (E) and/or an iris image.
- the camera (340) can be used for tracking the user's gaze and/or recognizing the user's iris.
- a processor (e.g., the processor (120) of FIG. 1) can be configured to track the user's gaze or recognize the user's iris using the sensing information of the camera (340).
- the camera (340) may be positioned on the back side (e.g., the side in the second direction (2)) of the display panel (310).
- the camera (340) may be positioned on the back side (e.g., the side in the second direction (2)) of the display panel (310) using an under display camera (UDC) structure.
- At least one camera (340a) may be positioned at a central portion (e.g., a top central portion) of one side of the display panel (310).
- the at least one camera (340a) may be positioned at any location of the display panel (310).
- the at least one camera (340a) may be positioned at any one of the top, center, and bottom of the display panel (310).
- the at least one camera (340a) may be positioned at at least a part of the bezel area (B) of the display panel.
- At least one camera (340b) may be positioned outside the effective viewing area (F) of the display panel (310).
- the effective viewing area (F) may refer to a display area visible to the user's field of view.
- at least one camera (340b) may be positioned near a corner of the display panel (310) so as to be positioned outside the effective viewing area (F) of the display panel (310).
- this is exemplary, and at least one camera (340b) may be positioned at any location outside the effective viewing area (F) of the display panel (310).
- when the camera (340b) is positioned outside the effective viewing area (F) of the display panel (310), the camera (340b) is not visible within the user's field of view, so the deterioration of the image quality of the display panel (310) perceived by the user can be reduced.
- the performance of the camera (340) for sensing the image of the user's eye (E) can be improved.
- in the wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B), the camera (340) positioned on the back surface (e.g., the surface in the second direction (2)) of the display panel (310) can be positioned in the direction of the user's gaze.
- since the camera (340) is positioned in the direction of the user's gaze, the camera (340) can sense the user's eye (E) image more easily and accurately.
- since the camera (340) is positioned on the back surface (e.g., the surface in the second direction (2)) of the display panel (310), iris recognition can be performed while the user's gaze is directed toward the front.
- in a comparative example where the camera is positioned outside the display assembly, the user may have to turn his or her gaze outward and gaze outside the display assembly (e.g., outside the housing of the wearable electronic device) for iris recognition.
- in the wearable electronic device (200), since the camera (340) is positioned in the direction of the user's gaze, iris recognition can be performed while the user is looking straight ahead, and thus user convenience can be improved.
- the wearable electronic device may further include a sensor position compensation unit (not shown).
- the sensor position compensation unit may be configured to compensate for the position of an image sensor (e.g., the image sensor (342) of FIG. 6) in response to shaking of the wearable electronic device (200).
- the sensor position compensation unit may include an optical image stabilization (OIS) structure.
- the sensor position compensation unit may be positioned together with the camera (340) or inside the camera (340).
- the sensor position compensation unit may be positioned on the back side (e.g., the surface in the second direction (2)) of the display panel (310).
- the wearable electronic device (200) in the comparative example where the camera is positioned outside the display assembly (e.g., outside the housing of the wearable electronic device), it may be difficult to install the sensor position correction unit due to the narrow space, but in the wearable electronic device (200) according to one embodiment, it may be possible to install the sensor position correction unit using the space on the back side (e.g., the surface in the second direction (2)) of the display panel (310). Accordingly, since shaking can be responded to through the sensor position correction unit, the performance of the camera (340) can be improved. Meanwhile, this is exemplary, and the structure for correcting the shaking of the wearable electronic device (200) is not limited thereto. In one embodiment, in order to correct the shaking of the wearable electronic device (200), a position correction unit configured to move the lens assembly (e.g., the barrel (341) of FIG. 6) of the camera (340) may be provided.
- FIG. 6 is a cross-sectional view of a display panel and a camera according to one embodiment.
- the camera (340) may be positioned in the camera hole area (311) of the display panel (310).
- the camera hole area (311) may be an area on the back surface (e.g., the surface in the second direction (2)) of the display panel (310) where the camera (340) is positioned.
- infrared rays reflected from the eye may be incident on the camera hole area (311).
- the camera hole area (311) may be configured to have an infrared transmittance (e.g., transmittance at a wavelength of 750 nm or more) of 40% or more.
- this is exemplary, and the infrared transmittance is not limited thereto.
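- purely as a worked illustration of the figure above (the spectrum values are invented), the 40% criterion could be checked from a measured transmittance curve as follows:

```python
import numpy as np

def meets_ir_transmittance(wavelengths_nm, transmittance,
                           cutoff_nm=750.0, threshold=0.40):
    """True if mean transmittance at wavelengths >= cutoff_nm reaches threshold."""
    ir_band = wavelengths_nm >= cutoff_nm
    return bool(np.mean(transmittance[ir_band]) >= threshold)

# Hypothetical spectrum: low transmittance in the visible, higher in the infrared.
wl = np.arange(400, 1001, 50)            # 400-1000 nm sample points
t = np.where(wl >= 750, 0.55, 0.10)      # invented values
print(meets_ir_transmittance(wl, t))     # True
```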
- the camera (340) may include a barrel (341), an image sensor (342), a first infrared filter (343), and/or a second infrared filter (344).
- the camera (340) may be configured with at least 4 million pixels for iris recognition of a user. However, this is exemplary, and the pixel count of the camera (340) is not limited thereto.
- the barrel (341) may form at least a portion of the housing of the camera (340).
- the image sensor (342) may be positioned inside the barrel (341).
- the image sensor (342) may convert incident infrared light into an electrical image signal.
- the first infrared filter (343) may be positioned on the first direction (1) side of the barrel (341).
- the first infrared filter (343) may be attached to the first direction (1) end of the barrel (341).
- the first infrared filter (343) may be positioned between the barrel (341) and the display panel (310).
- the first infrared filter (343) may include a cover glass.
- the second infrared filter (344) may be positioned on the first direction (1) side of the image sensor (342).
- the second infrared filter (344) may cover the image sensor (342) in the first direction (1).
- the second infrared filter (344) may include a cover film.
- the first infrared filter (343) and the second infrared filter (344) may refer to filters configured to allow only an infrared region to pass through.
- since a noise region (e.g., a visible light region) is blocked by the filters, only an infrared region may be incident on the image sensor (342). Accordingly, the quality of the image acquired by the image sensor (342) may be improved.
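- for intuition (standard optics, not stated in the disclosure): two filters in series multiply their transmittances, so visible-light leakage is suppressed twice over,

$$T_{\mathrm{total}}(\lambda) = T_{343}(\lambda)\,T_{344}(\lambda).$$

For example, if each filter passed 1% of visible light, the stack would pass only 0.01%, while infrared transmission would fall only from, say, 90% per filter to 81% for the pair.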
- the structures of the first infrared filter (343) and/or the second infrared filter (344) described above are exemplary and are not limited thereto.
- the infrared filter may be positioned as a separate structure outside the camera (340).
- a filter that integrates the first infrared filter (343) and the second infrared filter (344) may be applied.
- in one embodiment, the display assembly (300) may further include a polarizing plate (not shown). The polarizing plate may be configured to form at least one of a visible light path (V), a first infrared path (R1), or a second infrared path (R2).
- the polarizing plate may form the first infrared path (R1) or the second infrared path (R2).
- the polarizing plate may be positioned on the first direction (1) side of the lens assembly (320).
- the polarizing plate may be positioned between the lens assembly (320) and the user's eye (E).
- the polarizing plate may include at least one of a full-wave plate, a half-wave plate, a quarter-wave plate, a multiple-order waveplate, or a zero-order waveplate.
- this is exemplary, and the type of the polarizing plate is not limited thereto.
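- as general optical background (not specific to this disclosure), such waveplates can be described by Jones matrices; a quarter-wave plate with a horizontal fast axis is, up to an overall phase,

$$W_{\lambda/4} = \begin{pmatrix} 1 & 0 \\ 0 & i \end{pmatrix},$$

which converts light linearly polarized at 45° into circularly polarized light. Separating outgoing and returning polarization states in this way is one conventional means by which a polarizing element could distinguish an illumination path (e.g., R1) from a sensing path (e.g., R2).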
- an infrared light source (330) may irradiate infrared rays toward the user's eye (E). At least a portion of the infrared rays generated from the infrared light source (330) may be transmitted to the user's eye (E) along a first infrared path (R1) provided by a lens assembly (320). The infrared rays transmitted to the user's eye (E) may be reflected to generate an eye image (e.g., a reflection image and/or an iris image) of an infrared component.
- the reflected infrared rays may be transmitted to a camera (340) positioned on a rear surface (e.g., a surface in the second direction (2)) of the display panel (310) along a second infrared path (R2) provided by the lens assembly (320).
- a processor (e.g., the processor (120) of FIG. 1) can extract a code based on a unique iris-related pattern by analyzing the size, shape, and color of the iris, the retina, and/or the capillaries.
- the user's iris can be recognized based on this code.
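- the disclosure does not detail how this code is computed; a common approach in the literature is Daugman-style iris coding, which quantizes the phase of filtered iris texture and compares codes by Hamming distance. The sketch below is a minimal, hypothetical stand-in that uses row-wise Fourier coefficients in place of the Gabor filters of the published method; all names and parameters are illustrative.

```python
import numpy as np

def iris_code(normalized_iris: np.ndarray, n_freqs: int = 4) -> np.ndarray:
    """Binary code from the phase quadrant of row-wise Fourier coefficients.

    normalized_iris: 2D array of the iris unwrapped into (radius x angle)
    coordinates. Returns 2 bits (real/imaginary sign) per kept coefficient.
    """
    spectrum = np.fft.rfft(normalized_iris, axis=1)[:, 1:n_freqs + 1]
    return np.stack([spectrum.real > 0, spectrum.imag > 0]).ravel()

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of disagreeing bits; small values suggest the same iris."""
    return float(np.mean(code_a != code_b))

# Invented data: two noisy captures of the same underlying iris pattern.
rng = np.random.default_rng(0)
iris = rng.random((16, 128))
code_1 = iris_code(iris + 0.01 * rng.random((16, 128)))
code_2 = iris_code(iris + 0.01 * rng.random((16, 128)))
print(hamming_distance(code_1, code_2))  # near 0 for the same iris
```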
- a gaze, or a direction in which the user is looking, can be extracted from the captured iris image.
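- likewise, the gaze-extraction method is not spelled out in the disclosure; a standard technique under infrared illumination is pupil-center/corneal-glint estimation, in which the offset between the pupil center and the glint (the reflection of the infrared light source) is mapped to a gaze angle. The minimal sketch below assumes both features have already been located in the infrared image, and uses an invented linear calibration gain.

```python
import numpy as np

def gaze_angles(pupil_px: np.ndarray, glint_px: np.ndarray,
                gain_rad_per_px: np.ndarray) -> np.ndarray:
    """Map the pupil-to-glint pixel offset to horizontal/vertical gaze angles.

    The gain would normally come from a calibration in which the user
    fixates known targets; here it is an assumed constant.
    """
    return gain_rad_per_px * (pupil_px - glint_px)

# Invented example: pupil slightly left of and above the glint.
pupil = np.array([312.0, 240.0])
glint = np.array([320.0, 244.0])
gain = np.array([0.004, 0.004])            # hypothetical rad/pixel
print(gaze_angles(pupil, glint, gain))     # approx. [-0.032 -0.016] rad
```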
- the sensor position compensation unit can compensate for the position of the camera (340) to improve the accuracy of gaze tracking and/or iris recognition.
- FIG. 7 is a schematic cross-sectional view of a display assembly according to one embodiment, showing an infrared path.
- the display assembly (300-1) may include a display panel (310), a lens assembly (320-1), an infrared light source (e.g., the infrared light source (330) of FIG. 4A and/or the infrared light source (330) of FIG. 4B), a camera (340-1), and/or a polarizing plate (not shown).
- the camera (340-1) may be positioned adjacent to the display panel (310) on an outer side of the display panel (310).
- the camera (340-1) may be positioned adjacent to a side (e.g., a border) of the display panel (310).
- the camera (340-1) may be positioned substantially on the same plane as the display panel (310) or on a plane adjacent thereto.
- the camera (340-1) may be positioned adjacent to an upper border of the display panel (310).
- the camera (340-1) may be arranged at an angle with respect to the display panel (310).
- the camera (340-1) may be arranged at an angle such that the front side (e.g., the surface in the first direction (1)) of the camera (340-1) faces away from the center of the lens assembly (320-1).
- the front side (e.g., the surface in the first direction (1)) of the camera (340-1) may be arranged at an angle so that it faces upward.
- the angle and/or direction of inclination of the camera (340-1) is not limited thereto.
- the angle and/or direction of inclination of the camera (340-1) may be set in various ways.
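- as a purely illustrative geometry note (the disclosure does not specify an angle): if the camera (340-1) sits a lateral distance $d$ from the lens axis and the eye (E) lies a distance $L$ away along that axis, a tilt of

$$\theta = \arctan\!\left(\frac{d}{L}\right)$$

points the camera's optical axis at the eye; for assumed values $d = 15\ \mathrm{mm}$ and $L = 45\ \mathrm{mm}$, this gives $\theta \approx 18.4^\circ$.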
- the lens assembly (320-1) may be configured to provide a third infrared path (R3) through which infrared light irradiated from an infrared light source (e.g., the infrared light source (330) of FIG. 4A and/or the infrared light source (330) of FIG. 4B) reaches the user's eye (E) and a fourth infrared path (R4) through which infrared light reflected from the user's eye (E) reaches the camera (340-1).
- the fourth infrared path (R4) may be formed in one area (e.g., the upper area) of the lens assembly (320-1).
- the third infrared path (R3) and the fourth infrared path (R4) illustrated in FIG. 7 are simplified for convenience of explanation, and the third infrared path (R3) and the fourth infrared path (R4) are not limited thereto.
- by positioning the camera (340-1) adjacent to the display panel (310), the camera (340-1) can be positioned in the direction of the user's gaze, and accordingly, the performance of the camera (340-1) for sensing the image of the user's eye (E) can be improved.
- in the wearable electronic device (e.g., the wearable electronic device (200) of FIG. 2B), the camera (340-1) positioned adjacent to the display panel (310) can be positioned in the direction of the user's gaze.
- since the camera (340-1) is positioned in the direction of the user's gaze, the camera (340-1) can sense the image of the user's eye (E) more easily and accurately.
- since the camera (340-1) is positioned adjacent to the display panel (310), iris recognition can be performed while the user's gaze is directed toward the front.
- in a comparative example, the user may have to turn his or her gaze outward, toward the outside of the display assembly (e.g., outside the housing of the wearable electronic device), for iris recognition.
- in the wearable electronic device (200), since the camera (340-1) is positioned in the direction of the user's gaze, iris recognition can be performed while the user is looking toward the front, and thus user convenience can be improved.
- a wearable electronic device (200) may include a housing (210) configured to be worn on a user's head, a display panel (310) positioned in the housing (210) to generate visible light in at least a first direction, a lens assembly (320) positioned on a side of the display panel (310) in the first direction, at least one infrared light source (330) positioned in the display panel (310) to irradiate infrared light toward an eye (E) of the user, and a camera (340) positioned on a side of the display panel (310) in a second direction opposite to the first direction to sense infrared light reflected from the eye (E) of the user.
- the camera (340) may be positioned outside an effective viewing area (F) of the display panel (310).
- the camera (340) may be positioned on the second direction side of the display panel (310) using an under display camera (UDC) structure.
- the infrared light source (330) may be positioned in the bezel area (B) of the display panel (310).
- the infrared light source (330) may be configured as a part of a pixel of the display panel (310).
- the infrared light source (330) may be positioned adjacent to an outer edge of the display panel (310).
- the camera (340) may include a barrel (341), an image sensor (342) positioned inside the barrel (341), a first infrared filter (343) positioned on the first direction side of the barrel (341), and a second infrared filter (344) positioned on the first direction side of the image sensor (342).
- the lens assembly (320) may be configured to provide a first infrared path (R1) through which infrared rays irradiated from the infrared light source (330) reach the user's eye (E) and a second infrared path (R2) through which infrared rays reflected from the user's eye (E) reach the camera (340).
- the lens assembly (320) may be configured to provide a visible light path (V) through which visible light generated from the display panel (310) reaches the user's eye (E).
- the visible light path (V) and the second infrared path (R2) may not overlap each other.
- the wearable electronic device (200) may further include a polarizing plate for forming the first infrared path (R1) or the second infrared path (R2).
- the polarizing plate may be positioned on the first direction side of the lens assembly (320).
- the wearable electronic device (200) may further include a sensor position compensation unit configured to compensate for the position of the image sensor (342) in response to shaking of the wearable electronic device (200).
- the wearable electronic device (200) may further include a processor configured to track the user's gaze or recognize the user's iris using sensing information of the camera (340).
- the camera (340) may be configured with at least 4 million pixels.
- an area (311) of the display panel (310) where the camera (340) is positioned may have an infrared transmittance of 40% or more.
- a display assembly (300) configured to be applied to a wearable electronic device (200) may include a display panel (310) that generates visible light toward at least a first direction, a lens assembly (320) positioned on a side of the display panel (310) in the first direction, at least one infrared light source (330) positioned on the display panel (310) to irradiate infrared light toward an eye (E) of a user, and a camera (340) positioned on a side of the display panel (310) in a second direction opposite to the first direction to sense infrared light reflected from the eye (E) of the user.
- the camera (340) may be positioned outside an effective viewing area (F) of the display panel (310).
- the camera (340) may be positioned on the second direction side of the display panel (310) using an under display camera (UDC) structure.
- the infrared light source (330) may be positioned in the bezel area (B) of the display panel (310).
- the infrared light source (330) may be configured as a part of a pixel of the display panel (310).
- a wearable electronic device (200) may include a housing (210) configured to be worn on a user's head, a display panel (310) positioned in the housing (210) to generate visible light toward at least a first direction, a lens assembly (320) positioned on a side of the display panel (310) in the first direction, a polarizing plate positioned on a side of the lens assembly (320) in the first direction, at least one infrared light source (330) positioned in the display panel (310) to irradiate infrared light toward an eye (E) of the user, and a camera (340) positioned on a side of the display panel (310) in a second direction opposite to the first direction to sense infrared light reflected from the eye (E) of the user.
- the camera (340) may be positioned on the second direction side of the display panel (310) using an under display camera (UDC) structure outside the effective viewing area (F) of the display panel (310).
- the lens assembly (320) may be configured to provide a first infrared path (R1) through which infrared rays irradiated from the infrared light source (330) reach the user's eye (E) and a second infrared path (R2) through which infrared rays reflected from the user's eye (E) reach the camera (340).
- the polarizing plate may form the first infrared path (R1) or the second infrared path (R2).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
In one embodiment of the invention, the wearable electronic device may include: a housing configured to be worn on a user's head; a display panel positioned in the housing to generate visible light in at least a first direction; a lens assembly positioned on the first-direction side of the display panel; at least one infrared light source positioned on the display panel to emit infrared light toward the user's eyes; and a camera positioned on a side of the display panel in a second direction opposite to the first direction to sense the infrared light reflected from the user's eyes. The camera may be positioned outside the effective viewing area of the display panel.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR20230094700 | 2023-07-20 | ||
| KR10-2023-0094700 | 2023-07-20 | ||
| KR10-2023-0112715 | 2023-08-28 | ||
| KR1020230112715A KR20250015607A (ko) | 2023-07-20 | 2023-08-28 | 적외선 센싱 카메라를 포함하는 웨어러블 전자 장치 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025018531A1 true WO2025018531A1 (fr) | 2025-01-23 |
Family
ID=94281824
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2024/006029 Pending WO2025018531A1 (fr) | 2023-07-20 | 2024-05-03 | Dispositif électronique portable comprenant une caméra de détection infrarouge |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025018531A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20220063467A (ko) * | 2020-11-10 | 2022-05-17 | 삼성전자주식회사 | 디스플레이를 포함하는 웨어러블 전자 장치 |
| WO2022114535A1 (fr) * | 2020-11-24 | 2022-06-02 | 삼성전자 주식회사 | Dispositif électronique portable à réalité augmentée comprenant une caméra |
| KR20220128726A (ko) * | 2021-03-15 | 2022-09-22 | 삼성전자주식회사 | 머리 착용형 디스플레이 장치, 그 장치에서의 동작 방법 및 저장매체 |
| KR102468129B1 (ko) * | 2016-06-01 | 2022-11-22 | 삼성전자주식회사 | 전자 장치 및 전자 장치 제작 방법 |
| KR20230071769A (ko) * | 2021-11-15 | 2023-05-23 | 우한 차이나 스타 옵토일렉트로닉스 테크놀로지 컴퍼니 리미티드 | 차량 탑재용 디스플레이 장치 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2022169255A1 (fr) | Dispositif électronique et son procédé de suivi du regard d'un utilisateur et de fourniture d'un service de réalité augmentée | |
| WO2022050638A1 (fr) | Procédé de modification de paramètres d'affichage et dispositif électronique | |
| WO2024096485A1 (fr) | Dispositif habitronique qui ajuste la transmittance de lumière en fonction de l'éclairement d'une source de lumière externe et son procédé de commande | |
| WO2022114535A1 (fr) | Dispositif électronique portable à réalité augmentée comprenant une caméra | |
| WO2022154338A1 (fr) | Dispositif électronique portable comprenant une caméra miniature | |
| WO2022119105A1 (fr) | Dispositif électronique pouvant être porté comprenant unité électroluminescente | |
| US11863945B2 (en) | Augmented reality wearable electronic device and case | |
| WO2025018531A1 (fr) | Dispositif électronique portable comprenant une caméra de détection infrarouge | |
| WO2024049110A1 (fr) | Dispositif électronique et procédé de commande permettant de corriger un objet virtuel en utilisant des informations de profondeur d'un objet réel | |
| WO2023106895A1 (fr) | Dispositif électronique destiné à utiliser un dispositif d'entrée virtuel, et procédé de fonctionnement dans un dispositif électronique | |
| WO2022177209A1 (fr) | Procédé de suivi de l'œil et dispositif électronique | |
| WO2022108150A1 (fr) | Dispositif électronique portable à réalité augmentée empêchant une interférence lumineuse | |
| WO2025063445A1 (fr) | Ensemble affichage et dispositif électronique habitronique le comprenant | |
| KR20250015607A (ko) | 적외선 센싱 카메라를 포함하는 웨어러블 전자 장치 | |
| WO2024196221A1 (fr) | Dispositif électronique pour prendre en charge un contenu xr, et procédé de support de mode d'entrée s'y rapportant | |
| WO2025023439A1 (fr) | Appareil et procédé de commande de sources de lumière pour un suivi des yeux | |
| WO2025183394A1 (fr) | Dispositif portable, procédé et support d'enregistrement non transitoire lisible par ordinateur pour suivi du regard sur la base de caractéristiques personnelles | |
| WO2024043438A1 (fr) | Dispositif électronique portable commandant un modèle de caméra et son procédé de fonctionnement | |
| KR20250044061A (ko) | 디스플레이 어셈블리 및 이를 포함하는 웨어러블 전자 장치 | |
| WO2024237753A1 (fr) | Dispositif électronique portable de suivi de regard et de visage | |
| WO2025023626A1 (fr) | Dispositif électronique habitronique | |
| WO2025023741A1 (fr) | Dispositif électronique comprenant une caméra infrarouge | |
| WO2023027276A1 (fr) | Dispositif électronique pour exécuter une pluralité de fonctions à l'aide d'un stylet et son procédé de fonctionnement | |
| WO2024219934A1 (fr) | Dispositif et procédé de réalité augmentée pour prévenir un éclat de lumière dans une image de service de réalité augmentée | |
| WO2025028974A1 (fr) | Dispositif électronique habitronique comprenant un affichage transparent |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24843254; Country of ref document: EP; Kind code of ref document: A1 |