WO2025127562A1 - Projector device and electronic device comprising same - Google Patents
Projector device and electronic device comprising same
- Publication number
- WO2025127562A1 (PCT/KR2024/019435)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lens
- optical axis
- light source
- light
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/28—Reflectors in projection beam
Definitions
- The invention relates to a projector device and an electronic device comprising the same.
- Virtual reality (VR) refers to a specific environment or situation, or the technology itself, that is similar to reality but not real and is created using artificial technology such as computers.
- Augmented reality (AR) is a technology that synthesizes virtual objects or information into the real environment so that they appear to exist in the original environment.
- Mixed reality (MR), or hybrid reality, refers to the creation of a new environment or new information by combining the virtual world and the real world.
- The term mixed reality is used in particular when there is real-time interaction between things that exist in reality and the virtual world.
- The created virtual environment or situation stimulates the user's five senses and, by providing spatial and temporal experiences similar to reality, allows the user to move freely between reality and imagination.
- The user can not only immerse themselves in this environment but also interact with the things implemented in it, operating them or giving commands using real devices.
- The embodiment provides a projector device used for AR (Augmented Reality), and an electronic device including the same, that are miniaturized and made compact by adjusting the positions of a light source, a mirror, a lens, a prism, and a light modulator within the projector device.
- A projector device includes a light source unit that emits light; a projection lens unit that transmits the light emitted from the light source unit; and a light modulator that modulates and reflects the light passing through the projection lens unit, wherein the projection lens unit includes a plurality of lenses, the light modulator includes a substrate and a plurality of mirrors arranged on the substrate, the plurality of lenses of the projection lens unit include at least three lenses aligned in an optical axis direction, the substrate is tilted with respect to a plane perpendicular to the optical axis direction, and a surface of a lens of the projection lens unit adjacent to the light modulator may be convex toward the light modulator.
- the projection lens unit includes a plurality of lenses
- the light modulator includes a substrate and a plurality of mirrors arranged on the substrate
- the plurality of lenses of the projection lens unit include at least three lenses aligned in an optical axis direction
- the substrate is tilted with respect to a plane perpendicular to the optical axis direction
- The plurality of lenses may include first to seventh lenses sequentially arranged from the light source side to the light modulator side.
- The sixth lens may include a first outer surface adjacent to the light source and a second outer surface spaced apart from the light source.
- The seventh lens may include a third outer surface adjacent to the light source and a fourth outer surface spaced apart from the light source.
- The first and fourth outer surfaces are not rotationally symmetric with respect to the optical axis, and the third outer surface and the fourth outer surface can be tilted with respect to a plane perpendicular to the optical axis direction.
- The center of the first outer surface and the center of the fourth outer surface may not be on the optical axis, and the center of the second outer surface and the center of the third outer surface may be on the optical axis.
- The projection lens unit includes an optical axis passing through the centers of the first lens to the third lens, and the center of the first outer surface and the center of the fourth outer surface can be spaced apart from the optical axis by a predetermined distance in a direction perpendicular to the optical axis.
- A projector device includes an optical axis passing through the centers of the first lens to the third lens of the projection lens unit; a center of the first outer surface, a center of the third outer surface, and a center of the fourth outer surface are spaced apart from the optical axis by a predetermined distance in a direction perpendicular to the optical axis; and the third outer surface and the fourth outer surface can be tilted with respect to a plane perpendicular to the optical axis direction.
- The difference between the angle at which the fourth outer surface is tilted relative to a plane perpendicular to the optical axis direction and the angle at which the substrate is tilted relative to that plane may be -5° to +5°.
- The tilt angle of the substrate relative to a plane perpendicular to the optical axis direction may be 6.5° to 7°.
- The angle formed by the mirror with respect to the substrate may be 16.5° to 17.5°.
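The claimed angular relationships above (substrate tilt 6.5° to 7°, mirror-to-substrate angle 16.5° to 17.5°, and fourth-outer-surface tilt within ±5° of the substrate tilt) can be collected into a single consistency check. The sketch below is illustrative only; the function and parameter names are assumptions, not taken from the patent.

```python
# Hypothetical consistency check for the tilt-angle ranges stated in the
# claims. All angles are in degrees and, except mirror_to_substrate_deg
# (measured against the substrate itself), are measured relative to a
# plane perpendicular to the optical axis direction.

def tilt_angles_within_claims(substrate_tilt_deg: float,
                              mirror_to_substrate_deg: float,
                              fourth_surface_tilt_deg: float) -> bool:
    """Return True if a candidate geometry satisfies the claimed ranges."""
    substrate_ok = 6.5 <= substrate_tilt_deg <= 7.0
    mirror_ok = 16.5 <= mirror_to_substrate_deg <= 17.5
    # The fourth outer surface may differ from the substrate tilt by -5 to +5 degrees.
    diff_ok = -5.0 <= (fourth_surface_tilt_deg - substrate_tilt_deg) <= 5.0
    return substrate_ok and mirror_ok and diff_ok

# A geometry at the middle of each range passes; a 6.25-degree surface/substrate
# difference falls outside the claimed +/-5-degree window and fails.
print(tilt_angles_within_claims(6.75, 17.0, 8.0))   # True
print(tilt_angles_within_claims(6.75, 17.0, 13.0))  # False
```

Note that the check treats the ranges as inclusive, which is one reasonable reading of the claim language.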
- the surface of the first lens adjacent to the light source unit may be convex toward the light source unit.
- The light source unit can be placed in a first region on one side of the optical axis of the first lens.
- The second region may be a region that is on the opposite side of the first region with respect to the optical axis of the first lens and does not overlap with the first region in the optical axis direction.
- The substrate can be tilted so that the surface on which the light is reflected faces the second region.
- The third outer surface may be tilted toward the first region with respect to a plane perpendicular to the optical axis direction, and a width in the optical axis direction of the portion of the seventh lens located in the first region may be smaller than a width in the optical axis direction of the portion of the seventh lens located in the second region.
- A projector device includes a light source unit that emits light; a projection lens unit that transmits the light emitted from the light source unit; and a light modulator that modulates and reflects the light passing through the projection lens unit, wherein the projection lens unit includes first to sixth lenses that are sequentially arranged from the light source unit side toward the light modulator side, and a surface of the sixth lens adjacent to the light modulator may be convex toward the light modulator side.
- A projector device includes an optical axis passing through the centers of the first lens to the third lens of the projection lens unit, and a first axis passing through the center of a surface of the fifth lens spaced apart from the light modulator and parallel to the optical axis can be spaced apart from the optical axis by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis.
- The surface of the sixth lens spaced apart from the light modulator and the surface of the sixth lens adjacent to the light modulator can be tilted in opposite directions with respect to a plane perpendicular to the optical axis direction.
- The surface of the sixth lens spaced apart from the light modulator can be tilted by 9.0° to 9.6° with respect to a plane perpendicular to the optical axis direction, and the surface of the sixth lens adjacent to the light modulator can be tilted by 10.5° to 11.1° with respect to that plane.
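The sixth-lens claim combines three conditions: the two surfaces tilt in opposite directions, their tilt magnitudes fall in the stated ranges, and the fifth-lens axis is offset 1.0 mm to 1.2 mm from the optical axis. A minimal sketch of that combined check follows; the sign convention and names are assumptions made for illustration, not specified by the patent.

```python
# Illustrative check of the sixth-lens claim. Signed tilt angles are in
# degrees relative to a plane perpendicular to the optical axis direction;
# "front" denotes the surface spaced apart from the light modulator and
# "rear" the surface adjacent to it (an assumed labeling).

def sixth_lens_claim_satisfied(front_tilt_deg: float,
                               rear_tilt_deg: float,
                               fifth_lens_offset_mm: float) -> bool:
    # Opposite directions: the signed tilts must have opposite signs.
    opposite = front_tilt_deg * rear_tilt_deg < 0
    # Magnitudes must fall in the claimed ranges.
    front_ok = 9.0 <= abs(front_tilt_deg) <= 9.6
    rear_ok = 10.5 <= abs(rear_tilt_deg) <= 11.1
    # Fifth-lens axis offset from the optical axis, in millimeters.
    offset_ok = 1.0 <= fifth_lens_offset_mm <= 1.2
    return opposite and front_ok and rear_ok and offset_ok

print(sixth_lens_claim_satisfied(9.3, -10.8, 1.1))  # True
print(sixth_lens_claim_satisfied(9.3, 10.8, 1.1))   # False: same tilt direction
```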
- A projector device and an electronic device can be provided that are miniaturized and made compact by adjusting the positions of a light source, a mirror, a lens, a prism, and a light modulator within the projector device.
- Figure 1 is a conceptual diagram illustrating an embodiment of an AI device.
- FIG. 2 is a block diagram showing the configuration of an extended reality electronic device according to an embodiment of the present invention.
- FIG. 3 is a perspective view of an augmented reality electronic device according to an embodiment of the present invention.
- Figures 4 to 6 are conceptual diagrams for explaining various display methods applicable to a display unit according to an embodiment of the present invention.
- Fig. 7 is a perspective view of a projector device according to an embodiment.
- Fig. 8 is another perspective view of a projector device according to an embodiment.
- Figures 9 and 10 are schematic diagrams of the inside of a projector device according to an embodiment.
- Fig. 11 is a schematic diagram of the inside of a projector device according to another embodiment.
- Fig. 12 is an enlarged view of the light modulator of the projector device according to an embodiment.
- Figures 13 to 16 are schematic diagrams of the inside of a projector device according to another embodiment.
- When a component is described as being 'connected', 'coupled', or 'linked' to another component, this includes not only cases where the component is directly connected, coupled, or linked to the other component, but also cases where it is connected, coupled, or linked through another component located between the two.
- When a component is described as being formed or arranged "above" or "below" another component, this includes not only the case where the two components are in direct contact, but also the case where one or more other components are formed or arranged between them.
- When expressed as "above" or "below", this can include the downward direction as well as the upward direction based on one component.
- Figure 1 is a conceptual diagram illustrating an embodiment of an AI device.
- An AI system in which at least one of an AI server (16), a robot (11), an autonomous vehicle (12), an XR device (13), a smartphone (14), or a home appliance (15) is connected to a cloud network (10).
- a cloud network (10) may mean a network that constitutes part of a cloud computing infrastructure or exists within a cloud computing infrastructure.
- the cloud network (10) may be configured using a 3G network, a 4G or LTE (Long Term Evolution) network, a 5G network, etc.
- each device (11 to 16) constituting the AI system can be connected to each other through the cloud network (10).
- each device (11 to 16) can communicate with each other through a base station, but can also communicate with each other directly without going through a base station.
- the AI server (16) may include a server that performs AI processing and a server that performs operations on big data.
- the AI server (16) is connected to at least one of AI devices constituting the AI system, such as a robot (11), an autonomous vehicle (12), an XR device (13), a smartphone (14), or a home appliance (15), through a cloud network (10), and can assist at least part of the AI processing of the connected AI devices (11 to 15).
- the AI server (16) can train an artificial neural network according to a machine learning algorithm on behalf of the AI devices (11 to 15), and can directly store the learning model or transmit it to the AI devices (11 to 15).
- the AI server (16) can receive input data from the AI devices (11 to 15), infer a result value for the received input data using a learning model, and generate a response or control command based on the inferred result value and transmit it to the AI devices (11 to 15).
- The AI device may directly infer a result value for input data using a learning model, and generate a response or control command based on the inferred result value.
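The offloaded-inference architecture described above (a device transmits input data, the AI server infers a result with a learning model and returns a response or control command) can be sketched as a simple round trip. Everything below is a stand-in: the model, the function names, and the message format are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the device-to-server inference flow. The "learning
# model" is a trivial stand-in function rather than a trained network.

def learning_model(input_data: dict) -> str:
    # Stand-in inference: classify whether an obstacle is near the device.
    return "obstacle" if input_data.get("distance_m", 100.0) < 1.0 else "clear"

def ai_server_handle(input_data: dict) -> dict:
    # The server infers a result value for the received input data and
    # generates a control command based on the inferred result.
    result = learning_model(input_data)
    command = "stop" if result == "obstacle" else "proceed"
    return {"result": result, "command": command}

def ai_device_request(sensor_reading: dict) -> str:
    # The device transmits its input data and acts on the returned command;
    # a real system would send this over the cloud network instead of a
    # direct function call.
    response = ai_server_handle(sensor_reading)
    return response["command"]

print(ai_device_request({"distance_m": 0.5}))   # stop
print(ai_device_request({"distance_m": 20.0}))  # proceed
```

The same device could fall back to running `learning_model` locally, which corresponds to the direct-inference case described in the text.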
- Robots (11) can be implemented as guide robots, transport robots, cleaning robots, wearable robots, entertainment robots, pet robots, unmanned flying robots, etc. by applying AI technology.
- the robot (11) may include a robot control module for controlling movement, and the robot control module may mean a software module or a chip that implements the same as hardware.
- the robot (11) can obtain status information of the robot (11), detect (recognize) the surrounding environment and objects, generate map data, determine a movement path and driving plan, determine a response to user interaction, or determine an action using sensor information obtained from various types of sensors.
- the robot (11) can use sensor information acquired from at least one sensor among lidar, radar, and camera to determine a movement path and driving plan.
- the robot (11) can perform the above-described operations using a learning model composed of at least one artificial neural network.
- the robot (11) can recognize the surrounding environment and objects using the learning model, and determine operations using the recognized surrounding environment information or object information.
- the learning model can be learned directly in the robot (11) or learned from an external device such as an AI server (16).
- The robot (11) may perform an action by directly generating a result using the learning model, but may also transmit sensor information to an external device such as an AI server (16) and perform an action by receiving the result generated accordingly.
- the robot (11) can determine a movement path and driving plan using at least one of map data, object information detected from sensor information, or object information acquired from an external device, and control a driving unit to drive the robot (11) according to the determined movement path and driving plan.
- the map data may include object identification information for various objects placed in the space where the robot (11) moves.
- the map data may include object identification information for fixed objects such as walls and doors, and movable objects such as flower pots and desks.
- the object identification information may include name, type, distance, location, etc.
- the robot (11) can perform an action or drive by controlling the driving unit based on the user's control/interaction. At this time, the robot (11) can obtain the intention information of the interaction according to the user's action or voice utterance, and determine a response based on the obtained intention information to perform the action.
- Autonomous vehicles (12) can be implemented as mobile robots, vehicles, unmanned aerial vehicles, etc. by applying AI technology.
- the autonomous vehicle (12) may include an autonomous driving control module for controlling autonomous driving functions, and the autonomous driving control module may mean a software module or a chip that implements the same as hardware.
- the autonomous driving control module may be included internally as a component of the autonomous vehicle (12), but may also be configured as separate hardware and connected to the outside of the autonomous vehicle (12).
- An autonomous vehicle (12) can obtain status information of the autonomous vehicle (12), detect (recognize) the surrounding environment and objects, generate map data, determine a movement path and driving plan, or determine an operation by using sensor information obtained from various types of sensors.
- the autonomous vehicle (12) can use sensor information acquired from at least one sensor among lidar, radar, and camera, similar to the robot (11), to determine the movement path and driving plan.
- an autonomous vehicle (12) can recognize an environment or objects in an area where the field of vision is obscured or an area beyond a certain distance by receiving sensor information from external devices, or can receive information recognized directly from external devices.
- the autonomous vehicle (12) can perform the above-described operations using a learning model composed of at least one artificial neural network.
- the autonomous vehicle (12) can recognize the surrounding environment and objects using the learning model, and determine the driving route using the recognized surrounding environment information or object information.
- the learning model can be learned directly in the autonomous vehicle (12) or learned from an external device such as an AI server (16).
- The autonomous vehicle (12) may perform an operation by directly generating a result using the learning model, but may also perform an operation by transmitting sensor information to an external device such as an AI server (16) and receiving the result generated accordingly.
- An autonomous vehicle (12) can determine a movement path and driving plan by using at least one of map data, object information detected from sensor information, or object information acquired from an external device, and control a driving unit to drive the autonomous vehicle (12) according to the determined movement path and driving plan.
- Map data may include object identification information for various objects placed in a space (e.g., a road) where an autonomous vehicle (12) runs.
- map data may include object identification information for fixed objects such as streetlights, rocks, and buildings, and movable objects such as vehicles and pedestrians.
- object identification information may include name, type, distance, location, and the like.
- the autonomous vehicle (12) can perform an action or drive by controlling the driving unit based on the user's control/interaction. At this time, the autonomous vehicle (12) can obtain the intention information of the interaction according to the user's action or voice utterance, and determine a response based on the obtained intention information to perform the action.
- the XR device (13) can be implemented as an HMD (Head-Mount Display), a HUD (Head-Up Display) equipped in a vehicle, a television, a mobile phone, a smart phone, a computer, a wearable device, a home appliance, digital signage, a vehicle, a fixed robot or a mobile robot, etc. by applying AI technology.
- The XR device (13) can obtain information about the surrounding space or real objects by analyzing 3D point cloud data or image data acquired through various sensors or from an external device to generate location data and attribute data for 3D points, and can render and output an XR object. For example, the XR device (13) can output an XR object including additional information about a recognized object in correspondence with the recognized object.
- the XR device (13) can perform the above-described operations using a learning model composed of at least one artificial neural network.
- the XR device (13) can recognize a real object from 3D point cloud data or image data using the learning model, and provide information corresponding to the recognized real object.
- the learning model can be learned directly in the XR device (13) or learned in an external device such as an AI server (16).
- The XR device (13) may perform an operation by directly generating a result using the learning model, but may also transmit sensor information to an external device such as an AI server (16) and perform an operation by receiving the result generated accordingly.
- Robots (11) can be implemented as guide robots, transport robots, cleaning robots, wearable robots, entertainment robots, pet robots, unmanned flying robots, etc. by applying AI technology and autonomous driving technology.
- a robot (11) to which AI technology and autonomous driving technology are applied may refer to a robot itself with autonomous driving functions, or a robot (11) that interacts with an autonomous vehicle (12).
- a robot (11) with autonomous driving function can be a general term for devices that move on their own along a given path without user control or move by determining the path on their own.
- a robot (11) with autonomous driving function and an autonomous vehicle (12) may use a common sensing method to determine one or more of a movement path or a driving plan.
- a robot (11) with autonomous driving function and an autonomous vehicle (12) may use information sensed through a lidar, a radar, and a camera to determine one or more of a movement path or a driving plan.
- a robot (11) interacting with an autonomous vehicle (12) may exist separately from the autonomous vehicle (12), and may be linked to autonomous driving functions inside or outside the autonomous vehicle (12) or perform actions linked to a user riding in the autonomous vehicle (12).
- the robot (11) interacting with the autonomous vehicle (12) can control or assist the autonomous driving function of the autonomous vehicle (12) by acquiring sensor information on behalf of the autonomous vehicle (12) and providing it to the autonomous vehicle (12), or by acquiring sensor information and generating surrounding environment information or object information and providing it to the autonomous vehicle (12).
- A robot (11) interacting with an autonomous vehicle (12) may monitor a user riding in the autonomous vehicle (12) or control the functions of the autonomous vehicle (12) through interaction with the user. For example, if the robot (11) determines that the driver is drowsy, it may activate the autonomous driving function of the autonomous vehicle (12) or assist in the control of the driving unit of the autonomous vehicle (12).
- The functions of the autonomous vehicle (12) controlled by the robot (11) may include not only the autonomous driving function but also functions provided by a navigation system or audio system equipped inside the autonomous vehicle (12).
- a robot (11) interacting with an autonomous vehicle (12) may provide information to the autonomous vehicle (12) or assist functions from outside the autonomous vehicle (12).
- the robot (11) may provide traffic information including signal information to the autonomous vehicle (12), such as a smart traffic light, or may interact with the autonomous vehicle (12) to automatically connect an electric charger to a charging port, such as an automatic electric charger for an electric vehicle.
- Robots (11) can be implemented as guide robots, transport robots, cleaning robots, wearable robots, entertainment robots, pet robots, unmanned flying robots, drones, etc. by applying AI technology and XR technology.
- a robot (11) to which XR technology is applied may refer to a robot that is the target of control/interaction within an XR image.
- the robot (11) is distinct from the XR device (13) and can be linked with each other.
- When a robot (11) that is the target of control/interaction within an XR image obtains sensor information from sensors including a camera, the robot (11) or the XR device (13) can generate an XR image based on the sensor information, and the XR device (13) can output the generated XR image.
- the robot (11) can operate based on a control signal input through the XR device (13) or a user's interaction.
- a user can check an XR image corresponding to the viewpoint of a remotely connected robot (11) through an external device such as an XR device (13), and through interaction, adjust the autonomous driving path of the robot (11), control the operation or driving, or check information on surrounding objects.
- Autonomous vehicles (12) can be implemented as mobile robots, vehicles, unmanned aerial vehicles, etc. by applying AI technology and XR technology.
- An autonomous vehicle (12) to which XR technology is applied may refer to an autonomous vehicle equipped with a means for providing XR images, an autonomous vehicle that is the subject of control/interaction within an XR image, etc.
- an autonomous vehicle (12) that is the subject of control/interaction within an XR image is distinct from an XR device (13) and can be linked with each other.
- An autonomous vehicle (12) equipped with a means for providing XR images can obtain sensor information from sensors including cameras and output XR images generated based on the obtained sensor information.
- the autonomous vehicle (12) can provide passengers with XR objects corresponding to real objects or objects on the screen by having a HUD to output XR images.
- the XR object when the XR object is output to the HUD, at least a part of the XR object may be output so as to overlap with an actual object toward which the passenger's gaze is directed.
- the XR object when the XR object is output to a display provided inside the autonomous vehicle (12), at least a part of the XR object may be output so as to overlap with an object in the screen.
- the autonomous vehicle (12) may output XR objects corresponding to objects such as a road, another vehicle, a traffic light, a traffic sign, a two-wheeled vehicle, a pedestrian, a building, etc.
- When an autonomous vehicle (12) that is the target of control/interaction within an XR image obtains sensor information from sensors including a camera, the autonomous vehicle (12) or the XR device (13) can generate an XR image based on the sensor information, and the XR device (13) can output the generated XR image.
- the autonomous vehicle (12) can operate based on a control signal input through an external device such as the XR device (13) or a user's interaction.
- Extended reality is a general term for virtual reality (VR), augmented reality (AR), and mixed reality (MR).
- VR technology provides real-world objects and backgrounds only as CG images.
- AR technology provides virtual CG images on top of images of real objects.
- MR technology is a computer graphics technology that mixes and merges virtual objects into the real world.
- MR technology is similar to AR technology in that it shows real objects and virtual objects together. However, whereas AR technology uses virtual objects to complement real objects, MR technology treats virtual and real objects as having equal standing.
- XR technology can be applied to HMD (Head-Mount Display), HUD (Head-Up Display), mobile phones, tablet PCs, laptops, desktops, TVs, digital signage, etc., and devices to which XR technology is applied can be called XR devices.
- Figure 2 is a block diagram showing the configuration of an extended reality electronic device (20) according to an embodiment of the present invention.
- the extended reality electronic device (20) may include a wireless communication unit (21), an input unit (22), a sensing unit (23), an output unit (24), an interface unit (25), a memory (26), a control unit (27), and a power supply unit (28).
- the components illustrated in FIG. 2 are not essential for implementing the electronic device (20), and thus, the electronic device (20) described in this specification may have more or fewer components than the components listed above.
- the wireless communication unit (21) may include one or more modules that enable wireless communication between the electronic device (20) and a wireless communication system, between the electronic device (20) and another electronic device, or between the electronic device (20) and an external server.
- the wireless communication unit (21) may include one or more modules that connect the electronic device (20) to one or more networks.
- This wireless communication unit (21) may include at least one of a broadcast reception module, a mobile communication module, a wireless Internet module, a short-range communication module, and a location information module.
- the input unit (22) may include a camera or a video input unit for inputting a video signal, a microphone or an audio input unit for inputting an audio signal, and a user input unit (e.g., a touch key, a mechanical key, etc.) for receiving information from a user.
- Voice data or image data collected by the input unit (22) may be analyzed and processed into a user's control command.
- the sensing unit (23) may include one or more sensors for sensing at least one of information within the electronic device (20), information about the surrounding environment surrounding the electronic device (20), and user information.
- the sensing unit (23) may include at least one of a proximity sensor, an illumination sensor, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared sensor (IR sensor), a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., a photographing device), a microphone, a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat detection sensor, a gas detection sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric recognition sensor, etc.).
- the electronic device (20) disclosed in the present specification may utilize information sensed by at least two or more of these sensors in combination.
- the output unit (24) is for generating output related to vision, hearing, or tactile sensations, and may include at least one of a display unit, an audio output unit, a haptic module, and an optical output unit.
- The display unit may be formed as a layer structure with a touch sensor or formed integrally with it, thereby implementing a touch screen.
- This touch screen may function as a user input means that provides an input interface between the augmented reality electronic device (20) and the user, and at the same time, provide an output interface between the augmented reality electronic device (20) and the user.
- the interface unit (25) serves as a passageway between various types of external devices connected to the electronic device (20). Through the interface unit (25), the electronic device (20) can receive virtual reality or augmented reality content from an external device, and can perform mutual interaction by exchanging various input signals, sensing signals, and data.
- the interface unit (25) may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O (Input/Output) port, a video I/O (Input/Output) port, and an earphone port.
- the memory (26) stores data that supports various functions of the electronic device (20).
- the memory (26) can store a plurality of application programs (or applications) that run on the electronic device (20), data for the operation of the electronic device (20), and commands. At least some of these application programs can be downloaded from an external server via wireless communication. In addition, at least some of these application programs can exist on the electronic device (20) from the time of shipment for basic functions of the electronic device (20) (e.g., call reception, call transmission, message reception, message transmission).
- in addition to operations related to the application programs, the control unit (27) typically controls the overall operation of the electronic device (20).
- the control unit (27) can process signals, data, information, etc. input or output through the components discussed above.
- control unit (27) can control at least some of the components by driving the application program stored in the memory (26) to provide appropriate information to the user or process a function. Furthermore, the control unit (27) can operate at least two or more of the components included in the electronic device (20) in combination with each other to drive the application program.
- control unit (27) can detect the movement of the electronic device (20) or the user by using a gyroscope sensor, a gravity sensor, a motion sensor, etc. included in the sensing unit (23).
- the control unit (27) can detect an object approaching the electronic device (20) or the user by using a proximity sensor, a light sensor, a magnetic sensor, an infrared sensor, an ultrasonic sensor, etc. included in the sensing unit (23).
- control unit (27) can also detect the movement of the user by using sensors provided in a controller that operates in conjunction with the electronic device (20).
- control unit (27) can perform operations (or functions) of the electronic device (20) using an application program stored in the memory (26).
- the power supply unit (28) receives external power or internal power under the control of the control unit (27) and supplies power to each component included in the electronic device (20).
- the power supply unit (28) includes a battery, and the battery may be provided in a built-in or replaceable form.
- At least some of the above components may operate in cooperation with each other to implement the operation, control, or control method of the electronic device according to various embodiments described below.
- the operation, control, or control method of the electronic device may be implemented on the electronic device by driving at least one application program stored in the memory (26).
- embodiments of the electronic device according to the present invention may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a PDA (personal digital assistant), a PMP (portable multimedia player), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device.
- the wearable device may include a smart watch, a contact lens, VR/AR/MR Glass, and the like.
- FIG. 3 is a perspective view of an augmented reality electronic device according to an embodiment of the present invention.
- an electronic device may include a frame (100), a projector device (200), and a display unit (300).
- the electronic device may be provided as a glass type (smart glass).
- the glass type electronic device is configured to be worn on the head of the human body and may be provided with a frame (case, housing, etc.) (100) for this purpose.
- the frame (100) may be formed of a flexible material to facilitate wearing.
- the frame (100) is supported on the head and provides a space for mounting various components.
- electronic components such as a projector device (200), a user input unit (130), or an audio output unit (140) may be mounted on the frame (100).
- a lens covering at least one of the left and right eyes may be detachably mounted on the frame (100).
- the frame (100) may have a form of glasses worn on the face of the user's body as shown in the drawing, but is not necessarily limited thereto, and may also have a form such as goggles worn in close contact with the user's face.
- a frame (100) like this may include a front frame (110) having at least one opening, and a pair of side frames (120) that extend in the y direction (in FIG. 3) intersecting the front frame (110) and are parallel to each other.
- the frame (100) may have the same or different length (D1) in the x direction and length (L1) in the y direction.
- the projector device (200) is provided to control various electronic components provided in the electronic device.
- the term 'projector device (200)' may be used interchangeably with 'optical output device', 'optical projector device', 'light irradiation device', 'optical device', etc.
- the projector device (200) can generate an image or a video of a series of images that are displayed to the user.
- the projector device (200) can include an image source panel that generates an image and a plurality of lenses that diffuse and converge light generated from the image source panel.
- the projector device (200) may be secured to either of the two side frames (120).
- the projector device (200) may be secured to the inside or outside of either side frame (120), or may be integrally formed by being built into either side frame (120).
- the projector device (200) may be secured to the front frame (110) or may be provided separately from the electronic device.
- the display unit (300) may be implemented in the form of a head mounted display (HMD).
- HMD form refers to a display method that is mounted on the head and directly shows an image in front of the user's eyes.
- the display unit (300) may be positioned to correspond to at least one of the left and right eyes so that the image can be directly provided in front of the user's eyes.
- the display unit (300) is positioned in a portion corresponding to the right eye so that the image can be output toward the user's right eye.
- it is not limited thereto and may be positioned for both the left and right eyes.
- the display unit (300) can allow the user to visually perceive the external environment while simultaneously allowing the user to see an image generated by the projector device (200).
- the display unit (300) can project an image onto the display area using a prism.
- the display unit (300) may be formed to be translucent so that the projected image and the general field of view in front (the range that the user sees through their eyes) can be viewed simultaneously.
- the display unit (300) may be translucent and formed of an optical member including glass.
- the display unit (300) may be inserted and fixed into an opening included in the front frame (110), or may be positioned on the back surface of the opening (i.e., between the opening and the user) and fixed to the front frame (110).
- the display unit (300) is positioned on the back surface of the opening and fixed to the front frame (110) as an example, but the display unit (300) may be positioned and fixed at various positions of the frame (100).
- the electronic device projects image light from the projector device (200) onto one side of the display unit (300), and the image light is emitted through the other side of the display unit (300), allowing the user to see the image generated by the projector device (200).
- the user can view the external environment through the opening of the frame (100) and at the same time view the image generated by the projector device (200). That is, the image output through the display unit (300) can be seen to overlap with the general field of view.
- the electronic device can provide augmented reality (AR) that superimposes a virtual image on a real image or background and shows it as a single image.
- the external environment and the image generated by the projector device (200) can be provided to the user alternately, with a time difference short enough not to be perceived by the person.
- that is, the external environment can be provided to the person in one time interval, and the image from the projector device (200) can be provided in another.
- overlapping and time-division presentation may also be used together.
- FIGS. 4 to 6 are conceptual diagrams for explaining various display methods applicable to a display unit according to an embodiment of the present invention.
- FIG. 4 is a drawing for explaining an embodiment of a prism-type optical member
- FIG. 5 is a drawing for explaining an embodiment of a waveguide-type optical member
- FIG. 6 is a drawing for explaining an embodiment of a surface reflection-type optical member.
- a prism-type optical member may be used in the display unit (300-1) according to an embodiment of the present invention.
- as shown in (a) of FIG. 4, a flat type glass optical member, in which the surface on which image light is incident and the surface from which it is emitted (300a) are planes, may be used as the prism-type optical member; or, as shown in (b) of FIG. 4, a freeform glass optical member, in which the surface from which image light is emitted (300b) is formed as a curved surface without a constant radius of curvature, may be used.
- a flat type glass optical member can receive image light generated from a projector device (200) through a flat side, reflect the image light by a total reflection mirror (300a) provided inside, and emit the image light toward a user.
- the total reflection mirror (300a) provided inside the flat type glass optical member can be formed inside the flat type glass optical member by a laser.
- the freeform glass optical member is configured to become thinner the farther away it is from the incident surface, so that the image light generated from the projector device (200) can be received through a curved side, totally reflected internally, and emitted toward the user.
- a display unit (300-2) may use a waveguide-type optical element or a light guide optical element (LOE).
- waveguide-type or light-guide-type optical elements include a glass optical element of the segmented beam splitter type as illustrated in (a) of FIG. 5, a glass optical element of the sawtooth prism type as illustrated in (b) of FIG. 5, a glass optical element having a diffractive optical element (DOE) as illustrated in (c) of FIG. 5, a glass optical element having a hologram optical element (HOE) as illustrated in (d) of FIG. 5, a glass optical element having a passive grating as illustrated in (e) of FIG. 5, and a glass optical element having an active grating as illustrated in (f) of FIG. 5.
- a glass optical member of the segmented beam splitter type may be provided with a total reflection mirror (301a) on the side where the light image is incident and a partial reflection mirror (segmented beam splitter, 301b) on the side where the light image is emitted, as illustrated.
- the optical image generated from the projector device (200) is totally reflected by the total reflection mirror (301a) inside the glass optical member, and the totally reflected optical image is partially separated and emitted by the partial reflection mirror (301b) while guiding the light along the length direction of the glass, so that it can be recognized by the user's eyes.
- a glass optical member in the form of a sawtooth prism causes the image light of the projector device (200) to be incident diagonally on the side of the glass and is totally reflected inside the glass, and the light image is emitted outside the glass by the sawtooth-shaped unevenness (302) provided on the side from which it is emitted, so that it can be recognized by the user's eyes.
- a glass optical element having a diffractive optical element (DOE) as illustrated in (c) of FIG. 5 may be provided with a first diffractive portion (303a) on the surface on which the light image is incident and a second diffractive portion (303b) on the surface on which the light image is emitted.
- the first and second diffractive portions (303a, 303b) may be provided in a form in which a specific pattern is patterned on the surface of the glass or in a form in which a separate diffractive film is attached.
- the light image generated from the projector device (200) is diffracted upon entering through the first diffractive portion (303a), is totally reflected and guided along the length direction of the glass, and is emitted through the second diffractive portion (303b), so that it can be recognized by the user's eyes.
- a glass optical element having a hologram optical element (HOE) as illustrated in (d) of FIG. 5 may be provided with an out-coupler (304) inside the glass on the side from which the light image is emitted. Accordingly, a light image is incident from the projector device (200) in a diagonal direction through the side surface of the glass, is totally reflected, is guided along the length direction of the glass, and is emitted by the out-coupler (304) so as to be recognized by the user's eyes.
- a hologram optical element may be further subdivided into a structure having a passive grating and a structure having an active grating by slightly changing the structure.
- a glass optical member having a passive grating such as that illustrated in (e) of FIG. 5, may be provided with an in-coupler (305a) on the surface opposite to the glass surface on which an optical image is incident, and an out-coupler (305b) on the surface opposite to the glass surface on which an optical image is emitted.
- the in-coupler (305a) and the out-coupler (305b) may be provided in the form of a film having a passive grating.
- the light image incident on the incident side of the glass surface is totally reflected by the in-coupler (305a) provided on the opposite surface and guided along the length direction of the glass, and is emitted through the opposite surface of the glass by the out-coupler (305b), so that it can be recognized by the user's eyes.
- a glass optical member having an active grating such as that illustrated in (f) of FIG. 5, may be provided with an in-coupler (306a) formed as an active grating inside the glass on the side where the light image is incident, and an out-coupler (306b) formed as an active grating inside the glass on the side where the light image is emitted.
- the light image incident on the glass is guided along the length direction of the glass while being totally reflected by the in-coupler (306a), and is emitted outside the glass by the out-coupler (306b), so that it can be recognized by the user's eyes.
- a pin mirror type optical member may be used as a display unit.
- a freeform combiner-type surface reflection optical member may use, as its combiner, a freeform combiner glass in which a plurality of flat surfaces having different incident angles for the light image are formed as a single piece of glass with an overall curved shape.
- the light image can thus be output to the user with a different incident angle for each area of the freeform combiner glass.
- a surface reflection type optical member of the Flat HOE method can be provided by coating or patterning a holographic optical member (HOE, 311) on the surface of a flat glass, and an optical image incident from a projector device (200) can pass through the holographic optical member (311), be reflected from the surface of the glass, and then pass through the holographic optical member (311) again to be emitted toward the user.
- a freeform HOE-type surface reflection type optical member can be provided by coating or patterning a holographic optical member (HOE, 313) on the surface of a freeform glass, and the operating principle can be the same as that described in (b) of FIG. 6.
- FIG. 7 is a perspective view of a projector device according to an embodiment
- FIG. 8 is another perspective view of a projector device according to an embodiment
- FIGS. 9 and 10 are schematic diagrams of the inside of the projector device according to the embodiment.
- a projector device (200) may include a housing (210), a light source unit (220), a projection lens unit (230), and a light modulator (240).
- the housing (210) may have a space or housing groove in which each component of the projector device (200) is accommodated or placed.
- the housing (210) may be located on the outside of the projector device (200).
- a light source unit (220), a projection lens unit (230), and a light modulator (240) may be placed inside the housing (210).
- the housing (210) may have a structure in which one side is opened. Accordingly, each of the components described above may be assembled through the opened area or surface. In addition, light may be emitted to the outside through the opening of the housing (210).
- the housing (210) may have various shapes.
- the housing (210) may have a hexahedral structure. Accordingly, the projector device according to the embodiment can be easily mounted on an electronic device. In addition, the projector device according to the embodiment can be easily miniaturized and made compact.
- the light source unit (220) can emit light.
- the light source unit (220) can emit light in a first direction.
- the light source unit (220) can emit light of a specific wavelength band.
- the light source unit (220) can output white light.
- the light source unit (220) can output light of a red, green, or blue wavelength band.
- the light source unit (220) may be placed inside the housing (210).
- the light source unit (220) may emit light toward the projection lens unit (230).
- the light source unit (220) may be placed spaced apart from the projection lens unit (230) in a first direction.
- the light source unit (220) may be placed adjacent to the first lens (L1) of the projection lens unit (230).
- the light source unit (220) may be placed spaced apart from the light modulator (240) in the first direction.
- the light emitted by the light source unit (220) may pass through the projection lens unit (230) and reach the light modulator (240).
- the light source unit (220) may include at least one light source.
- the light source unit (220) may include a first light source to a third light source (not shown).
- the first light source to the third light source may be adjacently positioned at a predetermined distance apart from each other.
- the first light source to the third light source may emit light of different wavelength bands or colors. For example, the first light source may emit light of a blue wavelength, the second light source may emit light of a green wavelength, and the third light source may emit light of a red wavelength. Since the light source unit (220) includes the first to third light sources, light of multiple wavelengths can be irradiated from a single light source unit, and accordingly, the projector device may be miniaturized and compact.
- the first direction may correspond to the ‘X-axis direction’ in the drawing.
- the first direction may correspond to the direction from the light source unit (220) toward the projection lens unit (230).
- the first direction may correspond to the optical axis direction.
- the second direction may correspond to the Y-axis direction in the drawing.
- the second direction may be a direction perpendicular to the first direction.
- the light source unit (220) may be placed in the first region (a1).
- the first region (a1) may refer to a region located at the bottom with respect to the optical axis of the first lens (L1) of the projection lens unit (230).
- the second region (a2) may refer to a region located at the top with respect to the optical axis of the first lens (L1) of the projection lens unit (230).
- the first region (a1) and the second region (a2) may be located in a second direction perpendicular to the first direction.
- the first region (a1) and the second region (a2) may be located at the front end of the projection lens unit (230). That is, the light source unit (220) may be located at the bottom with respect to the optical axis of the first lens (L1).
- the light source unit (220) may irradiate light from the first region (a1) toward the projection lens unit (230).
- the light irradiated by the light source unit (220) in the first region (a1) can pass through the projection lens unit (230), be reflected by the light modulator (240), and then pass through the projection lens unit (230) again to be emitted into the second region (a2). Since the region where light enters and the region where light exits are arranged in the same direction, the illumination system and the imaging system can be integrated into one, and accordingly, the projector device can be made smaller and more compact.
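The folded path described above can be sketched as a toy sign model (illustrative only; the function name is hypothetical, not from the patent): light entering from the first region a1 is reflected at the modulator and leaves through the second region a2 on the opposite side of the optical axis.

```python
# Toy model (illustrative, not from the patent) of the folded light path:
# light enters the projection lens unit from the first region (a1, below
# the optical axis of L1), is reflected by the light modulator, and exits
# through the second region (a2, above the axis), so a single lens unit
# serves both the illumination system and the imaging system.
def exit_region(entry_region: str) -> str:
    """Reflection at the modulator swaps the region on either side of the optical axis."""
    return {"a1": "a2", "a2": "a1"}[entry_region]

# Light irradiated from a1 by the light source unit is emitted into a2.
assert exit_region("a1") == "a2"
```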
- the projection lens unit (230) may be arranged at the rear end of the light source unit (220). The light irradiated by the light source unit (220) may be projected onto the projection lens unit (230). The projection lens unit (230) may allow the light irradiated by the light source unit (220) to reach the light modulator (240). In addition, the projection lens unit (230) may project the light reflected by the light modulator (240) onto a screen or waveguide (or display unit). The projection lens unit (230) may be arranged spaced apart from the light source unit (220) in a first direction. The projection lens unit (230) may partially overlap the light source unit (220) or the light modulator (240) in the first direction.
- the light irradiated by the plurality of light sources of the light source unit (220) can form a plurality of unit cells after passing through the projection lens unit (230).
- Each of the plurality of unit cells can be in a state where light of each wavelength band is superimposed.
- the projection lens unit (230) can adjust the size of the image so that the light is incident within the entrance pupil diameter (EPD) of the waveguide or the like.
- the projection lens unit (230) according to the embodiment can include a lens barrel (not shown) and a plurality of lenses arranged within the lens barrel.
- the projection lens unit (230) may include a plurality of lenses.
- the plurality of lenses of the projection lens unit (230) may include at least three lenses aligned in the optical axis direction.
- the projection lens unit (230) according to the embodiment may include first to seventh lenses (L1, L2, L3, L4, L5, L6, L7).
- the first to seventh lenses (L1, L2, L3, L4, L5, L6, L7) may be arranged in a first direction.
- the first to seventh lenses (L1, L2, L3, L4, L5, L6, L7) may be sequentially arranged while being spaced apart from each other in the first direction.
- Each lens may include a surface adjacent to the light source unit (220) and a surface spaced apart from the light source unit (220).
- Each surface of the plurality of lenses may be convex or concave. Additionally, each surface of the plurality of lenses may be decentered with respect to the optical axis. Additionally, each surface of the plurality of lenses may be tilted with respect to a direction perpendicular to the optical axis.
- the first lens (L1) may be a lens positioned closest to the light source unit (220) among the plurality of lenses. Light irradiated by the light source unit (220) may pass through the first lens (L1).
- the first lens (L1) may include a first surface (S1) adjacent to the light source unit (220) and a second surface (S2) spaced apart from the light source unit (220).
- the first surface (S1) of the first lens (L1) adjacent to the light source unit (220) may be convex toward the light source unit (220).
- the second surface (S2) of the first lens (L1) spaced apart from the light source unit (220) may be concave toward the light source unit (220).
- the second lens (L2) may be arranged at the rear end of the first lens (L1).
- the second lens (L2) may be arranged between the first lens (L1) and the third lens (L3).
- the second lens (L2) may include a third surface (S3) adjacent to the light source (220) and a fourth surface (S4) spaced apart from the light source (220).
- the third surface (S3) of the second lens (L2) adjacent to the light source (220) may be convex toward the light source (220).
- the fourth surface (S4) of the second lens (L2) spaced apart from the light source (220) may be concave toward the light source (220).
- the third lens (L3) may be arranged at the rear end of the second lens (L2).
- the third lens (L3) may be arranged between the second lens (L2) and the fourth lens (L4).
- the third lens (L3) may include a fifth surface (S5) adjacent to the light source (220) and a sixth surface (S6) spaced from the light source (220).
- the fifth surface (S5) of the third lens (L3) adjacent to the light source (220) may be convex toward the light source (220).
- the sixth surface (S6) of the third lens (L3) spaced apart from the light source (220) may be concave toward the light source (220).
- the fourth lens (L4) may be arranged at the rear end of the third lens (L3).
- the fourth lens (L4) may be arranged between the third lens (L3) and the fifth lens (L5).
- the fourth lens (L4) may include a seventh surface (S7) adjacent to the light source (220) and an eighth surface (S8) spaced apart from the light source (220).
- the seventh surface (S7) of the fourth lens (L4) adjacent to the light source (220) may be concave toward the light modulator (240).
- the eighth surface (S8) of the fourth lens (L4) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the fifth lens (L5) may be arranged at the rear end of the fourth lens (L4).
- the fifth lens (L5) may be arranged between the fourth lens (L4) and the sixth lens (L6).
- the fifth lens (L5) may include a ninth surface (S9) adjacent to the light source (220) and a tenth surface (S10) spaced apart from the light source (220).
- the ninth surface (S9) of the fifth lens (L5) adjacent to the light source (220) may be convex toward the light source (220).
- the tenth surface (S10) of the fifth lens (L5) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the sixth lens (L6) may be arranged at the rear end of the fifth lens (L5).
- the sixth lens (L6) may be arranged between the fifth lens (L5) and the seventh lens (L7).
- the sixth lens (L6) may include an eleventh surface (S11) adjacent to the light source (220) and a twelfth surface (S12) spaced apart from the light source (220).
- the eleventh surface (S11) of the sixth lens (L6) adjacent to the light source (220) may be concave toward the light modulator (240).
- the twelfth surface (S12) of the sixth lens (L6) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the eleventh surface (S11) of the sixth lens (L6) may be referred to as the first outer surface, and the twelfth surface (S12) as the second outer surface. That is, the first and second outer surfaces may be the two surfaces of the lens second closest to the light modulator (240): the first outer surface is the surface of that lens spaced apart from the light modulator (240), and the second outer surface is the surface of that lens adjacent to the light modulator (240).
- the seventh lens (L7) may be arranged at the rear end of the sixth lens (L6).
- the seventh lens (L7) may be arranged between the sixth lens (L6) and the light modulator (240).
- the seventh lens (L7) may include a 13th surface (S13) adjacent to the light source (220) and a 14th surface (S14) spaced apart from the light source (220).
- the 13th surface (S13) of the seventh lens (L7) adjacent to the light source (220) may be flat.
- the 14th surface (S14) of the seventh lens (L7) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the 13th surface (S13) of the seventh lens (L7) may be referred to as the third outer surface
- the 14th surface (S14) may be referred to as the fourth outer surface. That is, the third and fourth outer surfaces may be the two surfaces of the lens closest to the light modulator (240): the third outer surface is the surface of that lens spaced apart from the light modulator (240), and the fourth outer surface is the surface of that lens adjacent to the light modulator (240).
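The surface shapes enumerated above for the seven-lens projection lens unit can be collected into a small table. The following is an illustrative data model, not part of the patent; the dictionary and helper names are assumptions made for the sketch.

```python
# Illustrative summary (not from the patent) of the fourteen surfaces
# S1-S14 of the embodiment's projection lens unit. "shape" records the
# stated curvature; the second entry records which element the stated
# convexity/concavity faces ("-" where the text states the surface is flat).
SURFACES = {
    "S1":  ("convex",  "light source"),
    "S2":  ("concave", "light source"),
    "S3":  ("convex",  "light source"),
    "S4":  ("concave", "light source"),
    "S5":  ("convex",  "light source"),
    "S6":  ("concave", "light source"),
    "S7":  ("concave", "light modulator"),
    "S8":  ("convex",  "light modulator"),
    "S9":  ("convex",  "light source"),
    "S10": ("convex",  "light modulator"),
    "S11": ("concave", "light modulator"),
    "S12": ("convex",  "light modulator"),
    "S13": ("flat",    "-"),
    "S14": ("convex",  "light modulator"),
}

def lens_of(surface: str) -> int:
    """Map surface S(2k-1) or S(2k) to its lens Lk (two surfaces per lens)."""
    return (int(surface[1:]) + 1) // 2

# Seven lenses, two surfaces each; S13/S14 belong to the seventh lens L7.
assert len(SURFACES) == 14
assert lens_of("S13") == lens_of("S14") == 7
```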
- the centers of the eleventh surface (S11) and the fourteenth surface (S14) may be decentered with respect to the optical axis.
- the optical axis may refer to an optical axis passing through the centers of the first to third lenses (L1, L2, L3).
- the centers of the eleventh surface (S11) and the fourteenth surface (S14) may be decentered in a direction perpendicular to the optical axis with respect to the optical axis.
- the center of the eleventh surface (S11) may be decentered upward with respect to the optical axis.
- the center of the eleventh surface (S11) and the center of the fourteenth surface (S14) may be spaced apart from the optical axis by a certain distance in a direction perpendicular to the optical axis. Accordingly, the center of the eleventh surface (S11) and the center of the fourteenth surface (S14) may not be on the optical axis, and the center of the twelfth surface (S12) and the center of the thirteenth surface (S13) may be on the optical axis. In addition, the eleventh surface (S11) and the fourteenth surface (S14) may not form a circular symmetry in the effective area with respect to the optical axis.
- the center of the eleventh surface (S11) may be decentered toward the second region (a2) with respect to the optical axis.
- the center of the fourteenth surface (S14) may be decentered toward the lower side with respect to the optical axis. That is, the center of the fourteenth surface (S14) may be decentered toward the first region (a1) with respect to the optical axis.
- the center of the eleventh surface (S11) and the center of the fourteenth surface (S14) may be decentered in opposite directions with respect to the optical axis.
- the distance (d1) by which the eleventh surface (S11) is decentered in a direction perpendicular to the optical axis with respect to the optical axis may be 1.0 mm to 1.2 mm.
- the center of the 11th surface (S11) can be spaced apart from the optical axis by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis based on the optical axis connecting the centers of the first lens to the third lens.
- the 11th surface (S11) can be decentered by 1.14 mm from the optical axis. Since the center of the 11th surface (S11) is spaced apart from the optical axis by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis, the resolution of the projector device and the light incidence angle of the light modulator can be secured. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the illumination system and the imaging system of the projector device can be unified into one optical system.
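The numeric relation above can be checked with a minimal sketch (illustrative only; the constant and function names are assumptions, and only the 1.14 mm example and the 1.0 mm to 1.2 mm band come from the text).

```python
# Hedged numeric check, not part of the patent: the example decenter of
# 1.14 mm for surface S11 lies within the 1.0 mm to 1.2 mm band stated
# for this embodiment.
S11_DECENTER_MM = 1.14                       # example value from the text
DECENTER_MIN_MM, DECENTER_MAX_MM = 1.0, 1.2  # stated band

def decenter_in_range(d_mm: float) -> bool:
    """True if the perpendicular offset of a surface center from the
    optical axis falls inside the embodiment's stated band."""
    return DECENTER_MIN_MM <= d_mm <= DECENTER_MAX_MM

assert decenter_in_range(S11_DECENTER_MM)
```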
- the 13th surface (S13) and the 14th surface (S14) can be tilted based on a surface perpendicular to the optical axis direction.
- the 13th surface (S13) and the 14th surface (S14) can form a predetermined angle with a surface perpendicular to the optical axis direction. That is, the 13th surface (S13) and the 14th surface (S14) can form a predetermined angle with a surface perpendicular to the optical axis along a straight line connecting the endpoints of the effective diameters. Therefore, the 13th surface (S13) and the 14th surface (S14) do not form a circular symmetry based on the optical axis passing through the centers of the first lens to the third lens.
- the 13th surface (S13) can form a first angle (θ1) with a plane perpendicular to the optical axis direction, and the 14th surface (S14) can form a second angle (θ2) with a plane perpendicular to the optical axis direction.
- the 13th surface (S13) can be tilted toward the first region (a1) with respect to a surface perpendicular to the optical axis direction. That is, in FIG. 9, the 13th surface (S13) can be tilted counterclockwise with respect to a surface perpendicular to the optical axis direction.
- the 14th surface (S14) can be tilted toward the second region (a2) with respect to a plane perpendicular to the optical axis direction. That is, in FIG. 9, the 14th surface (S14) can be tilted clockwise with respect to a plane perpendicular to the optical axis direction.
- the 13th surface (S13) and the 14th surface (S14) can be tilted in opposite directions with respect to a surface perpendicular to the optical axis direction.
- the width in the optical axis direction of the seventh lens (L7) can be different in the first region (a1) and the second region (a2).
- the width in the optical axis direction of the seventh lens (L7) in the first region (a1) may be smaller than the width in the optical axis direction of the seventh lens (L7) in the second region (a2).
- the width in the optical axis direction of the seventh lens (L7) may become narrower from the second region (a2) side toward the first region (a1) side.
- the first angle (θ1) formed by the 13th surface (S13) with respect to a plane perpendicular to the optical axis direction may be 9° to 9.6°.
- the first angle (θ1) formed by the 13th surface (S13) with respect to a plane perpendicular to the optical axis direction may be 9.35°.
- the second angle (θ2) formed by the 14th surface (S14) with respect to a plane perpendicular to the optical axis direction may be 10.5° to 11.1°.
- the second angle (θ2) formed by the 14th surface (S14) with respect to the plane perpendicular to the optical axis direction may be 10.827°.
- the F-number may be 2.2
- Fov: field of view
- EFL: effective focal length
- TTL: Through-The-Lens
- By forming the first angle (θ1) in the range of 9° to 9.6° and the second angle (θ2) in the range of 10.5° to 11.1°, the resolution of the projector device and the light incident angle of the light modulator can be secured. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the illumination system and the imaging system of the projector device can be unified into one optical system.
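As a rough illustration (not part of the disclosure), the order of magnitude of the beam deviation introduced by the 13th- and 14th-surface tilts can be estimated with the thin-prism approximation δ ≈ (n − 1)·θ; the refractive index n = 1.53 below is an assumed representative plastic-lens value, not a value from the embodiment.

```python
def thin_prism_deviation_deg(tilt_deg: float, n: float) -> float:
    """Small-angle (thin-prism) estimate, in degrees, of the beam deviation
    produced by tilting a refracting surface by tilt_deg for index n."""
    return (n - 1.0) * tilt_deg

# Tilt angles from the embodiment; n = 1.53 is an assumed index.
theta1, theta2 = 9.35, 10.827
n = 1.53
print(f"S13 deviation ~ {thin_prism_deviation_deg(theta1, n):.2f} deg")
print(f"S14 deviation ~ {thin_prism_deviation_deg(theta2, n):.2f} deg")
```

Deviations of a few degrees of this kind are consistent with the stated goal of separating the light-entry and light-exit regions of the projection lens unit.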
- Figure 11 is a schematic diagram of the interior of a projector device according to another embodiment.
- the centers of the eleventh surface (S11), the thirteenth surface (S13), and the fourteenth surface (S14) are decentered with respect to the optical axis, and the thirteenth surface (S13) and the fourteenth surface (S14) can be tilted with respect to a plane perpendicular to the optical axis direction.
- the F-number may be 1.8
- Fov: field of view
- EFL: effective focal length
- TTL: Through-The-Lens
- Fig. 12 is an enlarged view of an optical modulator of a projector device according to an embodiment.
- the light modulator (240) may be placed at the rear end of the projection lens unit (230).
- the light modulator (240) may reflect light transmitted from the projection lens unit (230) back to the projection lens unit (230).
- the light modulator (240) may partially overlap with the projection lens unit (230) in the first direction.
- the light modulator (240) may partially overlap with the light source unit (220) in the first direction.
- the optical modulator (240) can reflect incident light to project an image.
- the optical modulator (240) can emit or project an image or video based on an image signal applied through the substrate (241). That is, the optical modulator (240) can modulate light emitted from the light source unit (220).
- the optical modulator (240) reflects illumination light into patterned light, and the patterned light can pass through the projection lens unit (230) and be output to the outside of the projector device.
- the optical modulator (240) according to the embodiment can include a digital micromirror device (DMD).
- the optical modulator (240) may include a substrate (241) and a plurality of mirrors (242) arranged on the substrate (241). Each mirror (242) may reflect or block light according to a signal (e.g., a digital signal). In other words, the optical modulator (240) may control the state of each mirror (242) based on an image signal applied through the substrate (241) to project or display an image (or video) corresponding to the image signal. For example, when light is reflected by the control of the mirror (242), a bright image area may be output, and when light is blocked, a dark image area may be output.
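The on/off mirror behavior described above can be sketched as a binary mask applied to the illumination. This is an illustrative toy model only, not the actual DMD driving scheme; the function and pixel values are hypothetical.

```python
# Toy model of a DMD: each mirror is a binary switch that either reflects
# illumination toward the projection lens (1) or dumps it away (0).

def modulate(illumination, mirror_states):
    """Projected intensity per pixel: light passes only where the mirror
    is in the 'on' state, producing bright/dark image areas."""
    return [
        [lum if state else 0 for lum, state in zip(row_l, row_s)]
        for row_l, row_s in zip(illumination, mirror_states)
    ]

illum = [[255, 255], [255, 255]]   # uniform illumination
states = [[1, 0], [0, 1]]          # image signal: diagonal pattern
print(modulate(illum, states))     # prints [[255, 0], [0, 255]]
```

Grayscale in a real DMD comes from time-multiplexing these binary states, which this sketch does not model.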
- the substrate (241) can be tilted with respect to a plane perpendicular to the optical axis direction.
- the substrate (241) can be tilted at a predetermined angle (θ3) with respect to the plane perpendicular to the optical axis direction so that light is emitted to the second region (a2).
- the substrate (241) can be tilted so that the light reflection surface faces the second region (a2).
- the substrate (241) can be tilted at a third angle (θ3) with respect to the plane perpendicular to the optical axis direction.
- the third angle (θ3) can be 6.5° to 7.5°.
- the third angle (θ3) can be 7°.
- By tilting the substrate (241) at the third angle (θ3), the resolution of the projector device and the light incidence angle of the optical modulator can be secured. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the lighting system and imaging system of the projector device can be unified into a single optical system.
- the difference between the tilt angle (θ2) of the 14th surface (S14) with respect to the plane perpendicular to the optical axis direction and the tilt angle (θ3) of the substrate (241) with respect to the plane perpendicular to the optical axis direction may be -5° to +5°.
- the difference between the second angle (θ2) and the third angle (θ3) may be -5° to +5°.
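The ±5° constraint on the difference between the second angle (θ2) and the third angle (θ3) can be checked directly. The helper below is an illustrative sketch, not part of the disclosure.

```python
def angles_compatible(theta2_deg: float, theta3_deg: float,
                      tol_deg: float = 5.0) -> bool:
    """Check the embodiment's constraint that the 14th-surface tilt (theta2)
    and the substrate tilt (theta3) differ by no more than +/- tol_deg."""
    return abs(theta2_deg - theta3_deg) <= tol_deg

# Values from the embodiment: theta2 = 10.827 deg, theta3 = 7 deg.
print(angles_compatible(10.827, 7.0))  # difference is 3.827 deg -> True
```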
- the mirror (242) can form a predetermined angle with the substrate (241).
- Each of the plurality of mirrors (242) can form a predetermined angle with the substrate (241) to allow light to be emitted to the second area (a2).
- the mirror (242) can be tilted at a predetermined angle with respect to the substrate (241).
- the mirror (242) can be tilted with respect to the substrate (241) so that the mirror surface on which light is reflected faces the first area (a1). That is, the mirror (242) can be tilted in a direction opposite to the direction in which the substrate (241) is tilted.
- the mirror (242) can form a fourth angle (θ4) with respect to the substrate (241).
- the fourth angle (θ4) can be 16.5° to 17.5°.
- the fourth angle (θ4) can be 17°.
- By forming the fourth angle (θ4) between the mirror (242) and the substrate (241), the resolution of the projector device and the light incident angle of the light modulator can be secured. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the illumination system and imaging system of the projector device can be unified into one optical system.
- Figures 13 and 14 are schematic diagrams of the inside of a projector device according to another embodiment.
- the projection lens unit (230) may include first to sixth lenses (L1, L2, L3, L4, L5, L6) sequentially arranged from the light source unit (220) side to the light modulator (240) side.
- the first lens (L1) may be a lens positioned closest to the light source unit (220) among the plurality of lenses. Light irradiated by the light source unit (220) may pass through the first lens (L1).
- the first lens (L1) may include a first surface (S1) adjacent to the light source unit (220) and a second surface (S2) spaced apart from the light source unit (220).
- the first surface (S1) of the first lens (L1) adjacent to the light source unit (220) may be convex toward the light source unit (220).
- the second surface (S2) of the first lens (L1) spaced apart from the light source unit (220) may be convex toward the light modulator (240).
- the second lens (L2) may be arranged at the rear end of the first lens (L1).
- the second lens (L2) may be arranged between the first lens (L1) and the third lens (L3).
- the second lens (L2) may include a third surface (S3) adjacent to the light source (220) and a fourth surface (S4) spaced apart from the light source (220).
- the third surface (S3) of the second lens (L2) adjacent to the light source (220) may be flat.
- the fourth surface (S4) of the second lens (L2) spaced apart from the light source (220) may be concave toward the light source (220).
- the third lens (L3) may be arranged at the rear end of the second lens (L2).
- the third lens (L3) may be arranged between the second lens (L2) and the fourth lens (L4).
- the third lens (L3) may include a fifth surface (S5) adjacent to the light source (220) and a sixth surface (S6) spaced apart from the light source (220).
- the fifth surface (S5) of the third lens (L3) adjacent to the light source (220) may be concave toward the light modulator (240).
- the sixth surface (S6) of the third lens (L3) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the fourth lens (L4) may be arranged at the rear end of the third lens (L3).
- the fourth lens (L4) may be arranged between the third lens (L3) and the fifth lens (L5).
- the fourth lens (L4) may include a seventh surface (S7) adjacent to the light source (220) and an eighth surface (S8) spaced apart from the light source (220).
- the seventh surface (S7) of the fourth lens (L4) adjacent to the light source (220) may be convex toward the light source (220).
- the eighth surface (S8) of the fourth lens (L4) spaced apart from the light source (220) may be concave toward the light source (220).
- the fifth lens (L5) may be arranged at the rear end of the fourth lens (L4). Additionally, the fifth lens (L5) may be arranged at the front end of the sixth lens (L6). The fifth lens (L5) may be arranged between the fourth lens (L4) and the sixth lens (L6).
- the fifth lens (L5) may include a ninth surface (S9) adjacent to the light source (220) and a tenth surface (S10) spaced apart from the light source (220).
- the ninth surface (S9) of the fifth lens (L5) adjacent to the light source (220) may be concave toward the light modulator (240).
- the tenth surface (S10) of the fifth lens (L5) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the ninth surface (S9) of the fifth lens (L5) may be referred to as the first outer surface
- the tenth surface (S10) may be referred to as the second outer surface. That is, the first outer surface and the second outer surface may be two surfaces of the lens that is second adjacent to the optical modulator (240), and the first outer surface may be a surface of the lens that is second adjacent to the optical modulator (240) and is spaced apart from the optical modulator (240), and the second outer surface may be a surface of the lens that is second adjacent to the optical modulator (240) and is adjacent to the optical modulator (240).
- the sixth lens (L6) may be arranged at the rear end of the fifth lens (L5).
- the sixth lens (L6) may be the lens closest to the light modulator (240).
- the sixth lens (L6) may be arranged between the fifth lens (L5) and the light modulator (240).
- the sixth lens (L6) may include an eleventh surface (S11) adjacent to the light source (220) and a twelfth surface (S12) spaced apart from the light source (220).
- the eleventh surface (S11) of the sixth lens (L6) adjacent to the light source (220) may be flat.
- the twelfth surface (S12) of the sixth lens (L6) spaced apart from the light source (220) may be convex toward the light modulator (240).
- the eleventh surface (S11) of the sixth lens (L6) may be referred to as the third outer surface, and the twelfth surface (S12) may be referred to as the fourth outer surface. That is, the third outer surface and the fourth outer surface may be two surfaces of the lens that are closest to the optical modulator (240), and the third outer surface may be a surface of the lens that is closest to the optical modulator (240) and is spaced apart from the optical modulator (240), and the fourth outer surface may be a surface of the lens that is closest to the optical modulator (240) and is adjacent to the optical modulator (240).
- the centers of the ninth surface (S9) and the twelfth surface (S12) may be decentered with respect to the optical axis.
- the centers of the ninth surface (S9) and the twelfth surface (S12) may be decentered in a direction perpendicular to the optical axis with respect to the optical axis.
- the center of the ninth surface (S9) may be decentered upward with respect to the optical axis. That is, the center of the ninth surface (S9) and the center of the twelfth surface (S12) may be spaced apart from the optical axis by a certain distance in a direction perpendicular to the optical axis.
- the center of the ninth surface (S9) and the center of the twelfth surface (S12) may not be on the optical axis, and the center of the tenth surface (S10) and the center of the eleventh surface (S11) may be on the optical axis.
- the ninth surface (S9) and the twelfth surface (S12) may not form a circular symmetry in the effective diameter region with respect to the optical axis. That is, the center of the ninth surface (S9) can be decentered toward the second region (a2) with respect to the optical axis.
- the center of the twelfth surface (S12) can be decentered toward the lower side with respect to the optical axis.
- the center of the twelfth surface (S12) can be decentered toward the first region (a1) with respect to the optical axis.
- the center of the ninth surface (S9) and the center of the twelfth surface (S12) can be decentered in opposite directions with respect to the optical axis.
- the ninth surface (S9) can be decentered by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis from the optical axis. That is, the ninth surface (S9) can be spaced apart by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis with respect to the optical axis.
- the ninth surface (S9) can be decentered by 1.14 mm from the optical axis.
- the resolution of the projector device and the angle of incidence of light of the optical modulator can be secured by decentering the ninth surface (S9) by 1.0 mm to 1.2 mm in a direction perpendicular to the optical axis. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the illumination system and imaging system of the projector device can be unified into a single optical system.
- the eleventh surface (S11) and the twelfth surface (S12) can be tilted with respect to a surface perpendicular to the optical axis direction.
- the eleventh surface (S11) and the twelfth surface (S12) can form a predetermined angle with a surface perpendicular to the optical axis direction. That is, the straight line connecting the end points of the effective diameters of the eleventh surface (S11) and the twelfth surface (S12) can form a predetermined angle with a surface perpendicular to the optical axis direction.
- the eleventh surface (S11) can be tilted to face the first region (a1) with respect to the plane perpendicular to the optical axis direction. That is, in FIG. 9, the eleventh surface (S11) can be tilted counterclockwise with respect to the plane perpendicular to the optical axis direction.
- the twelfth surface (S12) can be tilted with respect to the surface perpendicular to the optical axis direction to face the second region (a2). That is, in FIG. 9, the twelfth surface (S12) can be tilted clockwise with respect to a plane perpendicular to the optical axis direction.
- the eleventh surface (S11) and the twelfth surface (S12) can be tilted in opposite directions with respect to a plane perpendicular to the optical axis direction.
- the eleventh surface (S11) can be tilted 9° to 9.6° with respect to a plane perpendicular to the optical axis direction.
- the eleventh surface (S11) can be tilted 9.35° with respect to a plane perpendicular to the optical axis direction.
- the twelfth surface (S12) can be tilted 10.5° to 11.1° with respect to a plane perpendicular to the optical axis direction.
- the twelfth surface (S12) can be tilted 10.827° with respect to a plane perpendicular to the optical axis direction.
- the F-number can be 107225
- Fov: field of view
- EFL: effective focal length
- TTL: Through-The-Lens
- By tilting the eleventh surface (S11) 9° to 9.6° and the twelfth surface (S12) 10.5° to 11.1° with respect to the plane perpendicular to the optical axis direction, the resolution of the projector device and the light incident angle of the light modulator can be secured. Accordingly, the area where light enters the projection lens unit and the area where light exits from the projection lens unit can be separated, and the illumination system and the imaging system of the projector device can be unified into one optical system.
- the projection lens unit (230) may include first to sixth lenses (L1, L2, L3, L4, L5, L6) sequentially arranged from the light source unit (220) side to the light modulator (240) side.
- the first lens (L1) may include a first surface (S1) adjacent to the light source unit (220) and a second surface (S2) spaced apart from the light source unit (220).
- the second lens (L2) may include a third surface (S3) adjacent to the light source unit (220) and a fourth surface (S4) spaced apart from the light source unit (220).
- the third lens (L3) may include a fifth surface (S5) adjacent to the light source unit (220) and a sixth surface (S6) spaced apart from the light source unit (220).
- the fourth lens (L4) may include a seventh surface (S7) adjacent to the light source (220) and an eighth surface (S8) spaced apart from the light source (220).
- the fifth lens (L5) may include a ninth surface (S9) adjacent to the light source (220) and a tenth surface (S10) spaced apart from the light source (220).
- the sixth lens (L6) may include an eleventh surface (S11) adjacent to the light source (220) and a twelfth surface (S12) spaced apart from the light source (220).
- Tables 1 to 4 show the optical characteristics of the projector device according to the embodiment of Fig. 14.
- Table 1 shows the characteristics of each surface of multiple lenses of the projector device according to the embodiment of Fig. 14.
- Table 2 shows the aspherical coefficients of multiple lenses of the projector device according to the embodiment of Fig. 14.
- Table 3 shows the refractive power, focal length, curvature, Abbe number, center thickness, airgap (spacing between lenses), etc. of multiple lenses of the projector device according to the embodiment of Fig. 14.
- f1 to f6 may represent focal lengths of the first to sixth lenses, respectively.
- CT1 to CT6 may represent center thicknesses of the first to sixth lenses, respectively.
- ET1 to ET6 may represent edge thicknesses of the first to sixth lenses, respectively.
- Table 4 shows sag data of multiple lenses of the projector device according to the embodiment of Fig. 14.
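Sag data of the kind tabulated here is conventionally computed from the even-asphere surface equation z(r) = c·r²/(1 + √(1 − (1 + k)·c²·r²)) + A4·r⁴ + A6·r⁶ + …, where c = 1/R. The sketch below implements that standard formula; the coefficient values are placeholders, not the table's actual data.

```python
import math

def asphere_sag(r, R, k=0.0, coeffs=()):
    """Sag z(r) of an even asphere: base conic term plus polynomial terms
    A4*r^4 + A6*r^6 + ...  R is the vertex radius of curvature, k the conic
    constant, coeffs = (A4, A6, ...).  Values here are illustrative only."""
    c = 1.0 / R
    z = c * r * r / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * r * r))
    for i, a in enumerate(coeffs):
        z += a * r ** (2 * i + 4)
    return z

# Sanity check: with k = 0 and no polynomial terms the sag matches the
# exact sphere formula z = R - sqrt(R^2 - r^2).
R, r = 3.335, 1.0
assert abs(asphere_sag(r, R) - (R - math.sqrt(R * R - r * r))) < 1e-12
print(asphere_sag(r, R))
```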
- Figure 15 is a schematic diagram of the inside of a projector device according to another embodiment.
- the projection lens unit (230) of the projector device may include first to fifth lenses (L1, L2, L3, L4, L5) that are sequentially arranged from the light source unit (220) side to the light modulator (240) side.
- the first lens (L1) may include a first surface (S1) adjacent to the light source unit (220) and a second surface (S2) spaced apart from the light source unit (220).
- the second lens (L2) may include a third surface (S3) adjacent to the light source unit (220) and a fourth surface (S4) spaced apart from the light source unit (220).
- the third lens (L3) may include a fifth surface (S5) adjacent to the light source unit (220) and a sixth surface (S6) spaced apart from the light source unit (220).
- the fourth lens (L4) may include a seventh surface (S7) adjacent to the light source (220) and an eighth surface (S8) spaced apart from the light source (220).
- the fifth lens (L5) may include a ninth surface (S9) adjacent to the light source (220) and a tenth surface (S10) spaced apart from the light source (220).
- Tables 5 to 8 show data of a projector device according to the embodiment of Fig. 15.
- Table 5 shows the characteristics of each surface of multiple lenses of the projector device according to the embodiment of Fig. 15.
| Element | Surface | Radius | Thickness | Semi-diameter | Diameter |
|---|---|---|---|---|---|
| Lens1 | S1 | 3.335 | 1.699 | 2.505 | 5.010 |
| | S2 | -5.342 | 0.604 | 2.586 | 5.172 |
| Lens2 | S3 | 2.385 | 0.373 | 1.970 | 3.939 |
| | S4 | 0.990 | 1.373 | 1.571 | 3.141 |
| Lens3 | S5 | -12.018 | 0.428 | 1.735 | 3.471 |
| | S6 | -4.399 | 1.049 | 1.670 | 3.340 |
| Lens4 | S7 | -5.462 | 2.486 | 2.226 | 4.452 |
| | S8 | -3.258 | 0.100 | 2.465 | 4.930 |
| Lens5 | S9 | INF | 0.887 | 2.173 | 4.346 |
| | S10 | INF | 0.415 | 2.058 | 4.115 |
| Cover | S11 | INF | 0.400 | 2.005 | 4.010 |
| | S12 | INF | 0.185 | 1.974 | 3.947 |
| Image | | | | 1.880 | 3.760 |
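Assuming the second numeric column of Table 5 is the axial thickness/air gap in mm (a common convention for lens prescription tables, not stated explicitly here), summing it gives the on-axis track from the first surface to the image plane.

```python
# Thickness/spacing column of Table 5 (assumed to be axial distances in mm).
thicknesses = [1.699, 0.604, 0.373, 1.373, 0.428, 1.049,
               2.486, 0.100, 0.887, 0.415, 0.400, 0.185]

# Summing the column gives the on-axis track from S1 to the image plane.
total_track = sum(thicknesses)
print(f"total track ~ {total_track:.3f} mm")  # ~ 9.999 mm
```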
- Table 6 shows the aspherical coefficients of multiple lenses of the projector device according to the embodiment of Fig. 15.
- Table 7 shows the refractive power, focal length, curvature, Abbe number, center thickness, airgap (spacing between lenses), etc. of multiple lenses of the projector device according to the embodiment of Fig. 15.
- f1 to f5 may represent focal lengths of the first to fifth lenses, respectively.
- CT1 to CT5 may represent center thicknesses of the first to fifth lenses, respectively.
- ET1 to ET5 may represent edge thicknesses of the first to fifth lenses, respectively.
- Table 8 shows sag data of multiple lenses of the projector device according to the embodiment of Fig. 15.
- Figure 16 is a schematic diagram of the inside of a projector device according to another embodiment.
- the projection lens unit (230) of the projector device may include first to fifth lenses (L1, L2, L3, L4, L5) that are sequentially arranged from the light source unit (220) side to the light modulator (240) side.
- the first lens (L1) may include a first surface (S1) adjacent to the light source unit (220) and a second surface (S2) spaced apart from the light source unit (220).
- the second lens (L2) may include a third surface (S3) adjacent to the light source unit (220) and a fourth surface (S4) spaced apart from the light source unit (220).
- the third lens (L3) may include a fifth surface (S5) adjacent to the light source unit (220) and a sixth surface (S6) spaced apart from the light source unit (220).
- the fourth lens (L4) may include a seventh surface (S7) adjacent to the light source (220) and an eighth surface (S8) spaced apart from the light source (220).
- the fifth lens (L5) may include a ninth surface (S9) adjacent to the light source (220) and a tenth surface (S10) spaced apart from the light source (220).
- Tables 9 to 12 show data of a projector device according to the embodiment of Fig. 16.
- Table 9 shows the characteristics of each surface of multiple lenses of the projector device according to the embodiment of Fig. 16.
| Element | Surface | Radius | Thickness | Semi-diameter | Diameter |
|---|---|---|---|---|---|
| Lens1 | S1 | 11.782 | 1.403 | 3.000 | 6.000 |
| | S2 | -3.015 | 0.695 | 2.900 | 5.800 |
| Lens2 | S3 | 6.877 | 0.412 | 2.776 | 5.551 |
| | S4 | 1.213 | 1.082 | 2.460 | 4.919 |
| Lens3 | S5 | 6.364 | 1.224 | 2.517 | 5.034 |
| | S6 | -6.379 | 1.850 | 2.512 | 5.023 |
| Lens4 | S7 | -11.624 | 1.200 | 5.085 | 10.169 |
| | S8 | -3.789 | 0.502 | 2.714 | 5.428 |
| Lens5 | S9 | INF | 1.182 | 2.616 | 5.233 |
| | S10 | INF | 0.950 | 2.818 | 5.636 |
| Image | | | | 1.823 | 3.645 |
- Table 10 shows the aspherical coefficients of multiple lenses of the projector device according to the embodiment of Fig. 16.
- Table 11 shows the refractive power, focal length, curvature, Abbe number, center thickness, airgap (spacing between lenses), etc. of multiple lenses of the projector device according to the embodiment of Fig. 16.
- f1 to f5 may represent focal lengths of the first to fifth lenses, respectively.
- CT1 to CT5 may represent center thicknesses of the first to fifth lenses, respectively.
- ET1 to ET5 may represent edge thicknesses of the first to fifth lenses, respectively.
- Table 12 shows sag data of multiple lenses of the projector device according to the embodiment of Fig. 16.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
An embodiment of the present invention relates to a projector device comprising: a light source unit for emitting light; a projection lens unit for transmitting the light emitted by the light source unit; and an optical modulator for modulating and reflecting the light that passes through the projection lens unit. The projection lens unit comprises a plurality of lenses, the optical modulator comprises a substrate and a plurality of mirrors arranged on the substrate, the plurality of lenses of the projection lens unit comprise at least three lenses aligned in the optical axis direction, the substrate is tilted with respect to a plane perpendicular to the optical axis direction, and, of the lens among the plurality of lenses of the projection lens unit that is adjacent to the optical modulator, the surface adjacent to the optical modulator is convex toward the optical modulator.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2023-0181927 | 2023-12-14 | ||
| KR20230181927 | 2023-12-14 | ||
| KR10-2024-0162897 | 2024-11-15 | ||
| KR1020240162897A KR20250092063A (ko) | 2023-12-14 | 2024-11-15 | 프로젝트 장치 및 이를 포함하는 전자 디바이스 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025127562A1 true WO2025127562A1 (fr) | 2025-06-19 |
Family
ID=96057970
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2024/019435 Pending WO2025127562A1 (fr) | 2023-12-14 | 2024-12-02 | Dispositif projecteur et dispositif électronique le comprenant |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025127562A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR100559003B1 (ko) * | 2002-03-22 | 2006-03-10 | 세이코 엡슨 가부시키가이샤 | 화상 표시 디바이스 및 프로젝터 |
| KR100724081B1 (ko) * | 2000-08-03 | 2007-06-04 | 리플렉티버티 인코퍼레이티드 | 마이크로미러 어레이, 프로젝션 시스템, 광 변조 방법, 광 마이크로기계식 소자 및 광선을 공간적으로 변조시키는 방법 |
| JP2008233641A (ja) * | 2007-03-22 | 2008-10-02 | Casio Comput Co Ltd | 投射装置 |
| US20160077319A1 (en) * | 2013-04-24 | 2016-03-17 | Hitachi Maxell, Ltd. | Projection-type video display device |
| JP2021004925A (ja) * | 2019-06-25 | 2021-01-14 | 株式会社nittoh | 投射光学系およびプロジェクタ装置 |
-
2024
- 2024-12-02 WO PCT/KR2024/019435 patent/WO2025127562A1/fr active Pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR100724081B1 (ko) * | 2000-08-03 | 2007-06-04 | 리플렉티버티 인코퍼레이티드 | 마이크로미러 어레이, 프로젝션 시스템, 광 변조 방법, 광 마이크로기계식 소자 및 광선을 공간적으로 변조시키는 방법 |
| KR100559003B1 (ko) * | 2002-03-22 | 2006-03-10 | 세이코 엡슨 가부시키가이샤 | 화상 표시 디바이스 및 프로젝터 |
| JP2008233641A (ja) * | 2007-03-22 | 2008-10-02 | Casio Comput Co Ltd | 投射装置 |
| US20160077319A1 (en) * | 2013-04-24 | 2016-03-17 | Hitachi Maxell, Ltd. | Projection-type video display device |
| JP2021004925A (ja) * | 2019-06-25 | 2021-01-14 | 株式会社nittoh | 投射光学系およびプロジェクタ装置 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2020226235A1 (fr) | Dispositif électronique | |
| WO2020138640A1 (fr) | Dispositif électronique | |
| WO2020189866A1 (fr) | Dispositif électronique pouvant être porté sur la tête | |
| WO2022092517A1 (fr) | Dispositif électronique pouvant être porté comprenant une unité d'affichage, procédé de commande d'affichage, système comprenant un dispositif électronique pouvant être porté et boîtier | |
| WO2021040117A1 (fr) | Dispositif électronique | |
| WO2021040076A1 (fr) | Dispositif électronique | |
| WO2021040116A1 (fr) | Dispositif électronique | |
| WO2022255682A1 (fr) | Dispositif électronique habitronique et procédé de commande de trajet d'alimentation de celui-ci | |
| WO2021049693A1 (fr) | Dispositif électronique | |
| WO2021040081A1 (fr) | Dispositif électronique | |
| WO2020138636A1 (fr) | Dispositif électronique | |
| WO2021049694A1 (fr) | Dispositif électronique | |
| WO2022164094A1 (fr) | Procédé de traitement d'image d'un visiocasque (hmd), et hmd pour l'exécution du procédé | |
| WO2022139424A1 (fr) | Dispositif électronique et son procédé de suivi de l'œil d'un utilisateur et de fourniture d'un service de réalité augmentée | |
| WO2025127562A1 (fr) | Dispositif projecteur et dispositif électronique le comprenant | |
| WO2021033790A1 (fr) | Dispositif électronique | |
| WO2024054055A1 (fr) | Dispositif de projection et dispositif électronique le comprenant | |
| WO2025206869A1 (fr) | Dispositif projecteur et dispositif électronique le comprenant | |
| WO2024136393A1 (fr) | Dispositif de projection et dispositif électronique le comprenant | |
| WO2025254358A1 (fr) | Dispositif projecteur et dispositif électronique le comprenant | |
| WO2025089814A1 (fr) | Dispositif projecteur et dispositif électronique le comprenant | |
| WO2025095720A1 (fr) | Dispositif de guidage optique et dispositif électronique le comprenant | |
| WO2024205289A1 (fr) | Dispositif optique et dispositif électronique le comprenant | |
| WO2024219802A1 (fr) | Dispositif de projection et dispositif électronique le comprenant | |
| WO2025095675A1 (fr) | Dispositif de guidage de lumière et dispositif électronique le comprenant |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24904127 Country of ref document: EP Kind code of ref document: A1 |