WO2025095720A1 - Light guide device and electronic device comprising same - Google Patents
Light guide device and electronic device comprising same
- Publication number
- WO2025095720A1 (PCT/KR2024/017138)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- diffractive element
- angle
- substrate
- light
- guide device
- Prior art date: 2023-11-02
- Legal status (the status is an assumption and is not a legal conclusion)
- Pending
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/18—Diffraction gratings
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
Definitions
- the embodiment relates to a light guide device and an electronic device including the same.
- VR (Virtual Reality) refers to a specific environment or situation, or the technology itself, that is similar to reality but not real, created using artificial technology such as computers.
- AR (Augmented Reality) is a technology that synthesizes virtual objects or information into the real environment so that they appear as if they existed in the original environment.
- Mixed reality (MR), or hybrid reality, refers to the creation of a new environment or new information by combining the virtual world and the real world; the term mixed reality is used in particular when it involves real-time interaction between things that exist in reality and the virtual world.
- the created virtual environment or situation stimulates the user's five senses and, by providing spatial and temporal experiences similar to reality, allows the user to move freely between reality and imagination.
- beyond simply being immersed in this environment, the user can interact with the objects implemented in it by using real devices to operate them or issue commands.
- a DMS (Driver Monitoring System) was generally implemented in a form that simply predicted the driver's condition based on the driver's steering wheel operation status or the driving time, and alerted the driver with a warning sound or message.
- ADAS (advanced driver assistance systems) include functions such as ACC (Adaptive Cruise Control) and LKAS (Lane Keeping Assist System).
- various methods are being implemented, such as filming the driver with a camera to actively determine the driver's driving concentration or physical condition (such as drowsiness), monitoring the driver's physical movements through the captured video, or monitoring the movement of the driver's eyelids to determine driving concentration or drowsiness, and then vibrating the steering wheel or generating a warning sound so that the driver avoids danger, or slowing the vehicle to a certain speed or below.
- the present invention provides a light guide device used in AR (Augmented Reality), and an electronic device including the same, in which a grating structure can be designed with degrees of freedom for all angles of the optical device and the observer.
- the embodiments provide a light guide device and an electronic device with improved optical performance.
- the embodiment provides a light guide device and a camera module that make it easier to capture images of passengers and others within a vehicle.
- A light guide device according to an embodiment includes a first substrate; and a first input diffraction element, a first transmission diffraction element, and a first output diffraction element, which are arranged on the first substrate and onto which light emitted from a projector is sequentially incident; and can satisfy Mathematical Expressions 1 and 2.
- θ_projector is the angle between the projector and the first substrate, φ_projector is the angle at which the projector faces the first substrate, α is the wrap angle of the first substrate, λ is the wavelength of light, β is the pantoscopic angle of the first substrate, Λ_IC is the grating period of the first input diffractive element, Λ_OC is the grating period of the first output diffractive element, Λ_FG is the grating period of the first transmission diffractive element, φ_IC is the grating angle of the first input diffractive element, φ_OC is the grating angle of the first output diffractive element, φ_FG is the grating angle of the first transmission diffractive element, FoV is the field of view of the light guide device, and x and y are the resolutions of the light guide device.
- the wrap angle of the first substrate is an angle formed by the first substrate with respect to a first direction, and the first direction may be a direction perpendicular to a direction in which a user views the light guide device.
- the pantoscopic angle of the first substrate is an angle formed by the first substrate with respect to a second direction, and the second direction may be a direction perpendicular to the first direction and a direction in which the user views the light guide device.
- the angle between the projector and the first substrate may be the angle formed between the direction in which the projector outputs the light and a plane perpendicular to the first substrate.
- the angle at which the projector faces the first substrate is the angle that the direction in which the projector outputs the light makes with the first axis on the first substrate, and the first axis may be the long axis of the first substrate.
- the first input diffractive element may include a first protrusion
- the first transmission diffractive element may include a second protrusion
- the first output diffractive element may include a third protrusion
- a grating period of the first input diffractive element may be a shortest distance between same side surfaces of adjacent first protrusions
- a grating period of the first transmission diffractive element may be a shortest distance between same side surfaces of adjacent second protrusions
- a grating period of the first output diffractive element may be a shortest distance between same side surfaces of adjacent third protrusions.
- the grating angle of the first input diffractive element may be an angle formed by the separation direction between the first protrusions of the first input diffractive element and the first axis
- the grating angle of the first transmission diffractive element may be an angle formed by the separation direction between the second protrusions of the first transmission diffractive element and the first axis
- the grating angle of the first output diffractive element may be an angle formed by the separation direction between the third protrusions of the first output diffractive element and the first axis.
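- as a minimal illustration of the definitions above (not part of the claim language), the grating vector of each diffractive element can be computed from its grating period and grating angle; the Python sketch below assumes the grating vector points along the separation direction of the protrusions, with the helper name and units chosen for illustration only.

```python
import math

def grating_vector(period_nm: float, angle_deg: float) -> tuple[float, float]:
    """In-plane grating vector (in nm^-1) of a diffractive element.

    Magnitude is 2*pi / period; the direction is the grating angle, i.e.,
    the separation direction of the protrusions measured from the first
    axis (x2).
    """
    k = 2 * math.pi / period_nm
    phi = math.radians(angle_deg)
    return (k * math.cos(phi), k * math.sin(phi))

# Values from the embodiment described later in the text:
k_ic = grating_vector(500.0, 315.0)   # first input diffractive element
k_fg = grating_vector(400.0, 70.0)    # first transmission diffractive element
k_oc = grating_vector(378.0, 185.35)  # first output diffractive element
```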
- the apparatus may further include a second substrate arranged to overlap the first substrate, and a second input diffractive element, a second transmission diffractive element, and a second output diffractive element arranged on the second substrate, onto which light is sequentially incident.
- the grating angles of the first input diffractive element and the second input diffractive element may be the same, the grating angles of the first transmission diffractive element and the second transmission diffractive element may be the same, and the grating angles of the first output diffractive element and the second output diffractive element may be the same.
- the field of view of the light guide device may be the angle at which the light is incident on the first input diffractive element and transmitted to the first transmission diffractive element.
- the grating periods of the second input diffractive element, the second transmission diffractive element, and the second output diffractive element may be different from the grating periods of the first input diffractive element, the first transmission diffractive element, and the first output diffractive element, respectively.
- the light guide device according to the embodiment can satisfy mathematical expressions 3 and 4.
- θ_projector is the angle between the projector and the second substrate, φ_projector is the angle at which the projector faces the second substrate, α is the wrap angle of the second substrate, λ is the wavelength of light, β is the pantoscopic angle of the second substrate, Λ_IC is the grating period of the second input diffractive element, Λ_OC is the grating period of the second output diffractive element, Λ_FG is the grating period of the second transmission diffractive element, φ_IC is the grating angle of the second input diffractive element, φ_OC is the grating angle of the second output diffractive element, φ_FG is the grating angle of the second transmission diffractive element, FoV is the field of view of the light guide device, and x and y are the resolutions of the light guide device.
- the wrap angle of the first substrate and the wrap angle of the second substrate may be the same.
- the pantoscopic angle of the first substrate and the pantoscopic angle of the second substrate may be the same.
- Light sequentially incident on the first input diffraction element, the first transmission diffraction element, and the first output diffraction element may have a different wavelength from light sequentially incident on the second input diffraction element, the second transmission diffraction element, and the second output diffraction element.
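- one way to see why the second substrate uses the same grating angles but different grating periods: in the first-order grating equation the diffraction angle is set by the ratio λ/Λ, so keeping the angles identical for a second wavelength channel requires scaling each period in proportion to the wavelength. The sketch below is a simplified illustration under that assumption (the 630 nm value is an assumed second wavelength; only 525 nm appears in the embodiment), not the patent's design rule.

```python
def scaled_period(period_nm: float, lam_from_nm: float, lam_to_nm: float) -> float:
    """Scale a grating period so a second wavelength is diffracted at the same
    angle as the first (first-order grating equation, sin(theta) = lambda/period
    at normal incidence)."""
    return period_nm * (lam_to_nm / lam_from_nm)

# Hypothetical red-channel periods derived from the 525 nm green-channel values:
for name, period in (("IC", 500.0), ("FG", 400.0), ("OC", 378.0)):
    print(name, round(scaled_period(period, 525.0, 630.0), 1), "nm")
```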
- a light guide device and a camera module can be provided that make it easier to capture images of passengers and others within a vehicle.
- FIG. 1 is a conceptual diagram illustrating an embodiment of an AI device.
- FIG. 2 is a block diagram showing the configuration of an extended reality electronic device according to an embodiment of the present invention.
- FIG. 3 is a perspective view of an augmented reality electronic device according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of a light guide device according to an embodiment of the present invention.
- FIG. 5 is a drawing showing the angle of the substrate of the light guide device according to an embodiment of the present invention.
- FIG. 6 is a drawing showing the angle between the projector and the substrate of the light guide device according to an embodiment of the present invention.
- FIG. 7 is a drawing showing a grating pattern of a diffraction element of a light guide device according to an embodiment of the present invention.
- FIG. 8 is a schematic diagram of a light guide device according to another embodiment of the present invention.
- FIG. 9 is a drawing illustrating the configuration of a camera module according to an embodiment of the present invention.
- FIG. 10 is a drawing showing the angle of the substrate of the light guide device according to an embodiment of the present invention.
- the AI server (16) can receive input data from the AI devices (11 to 15), infer a result value for the received input data using a learning model, and generate a response or control command based on the inferred result value and transmit it to the AI devices (11 to 15).
- the AI device may infer a result value for input data directly using a learning model and generate a response or control command based on the inferred result value.
- Robots (11) can be implemented as guide robots, transport robots, cleaning robots, wearable robots, entertainment robots, pet robots, unmanned flying robots, etc. by applying AI technology.
- the robot (11) may include a robot control module for controlling movement, and the robot control module may mean a software module or a chip that implements the same as hardware.
- the robot (11) can obtain status information of the robot (11), detect (recognize) the surrounding environment and objects, generate map data, determine a movement path and driving plan, determine a response to user interaction, or determine an action using sensor information obtained from various types of sensors.
- the robot (11) can use sensor information acquired from at least one sensor among lidar, radar, and camera to determine a movement path and driving plan.
- the robot (11) can perform the above-described operations using a learning model composed of at least one artificial neural network.
- the robot (11) can recognize the surrounding environment and objects using the learning model, and determine operations using the recognized surrounding environment information or object information.
- the learning model can be trained directly in the robot (11) or trained in an external device such as the AI server (16).
- the robot (11) may perform an action by generating a result directly using a learning model, but may also transmit sensor information to an external device such as the AI server (16) and perform an action by receiving the result generated accordingly.
- the robot (11) can determine a movement path and driving plan using at least one of map data, object information detected from sensor information, or object information acquired from an external device, and control a driving unit to drive the robot (11) according to the determined movement path and driving plan.
- the map data may include object identification information for various objects placed in the space where the robot (11) moves.
- the map data may include object identification information for fixed objects such as walls and doors, and movable objects such as flower pots and desks.
- the object identification information may include name, type, distance, location, etc.
- the robot (11) can perform an action or drive by controlling the driving unit based on the user's control/interaction. At this time, the robot (11) can obtain the intention information of the interaction according to the user's action or voice utterance, and determine a response based on the obtained intention information to perform the action.
- Autonomous vehicles (12) can be implemented as mobile robots, vehicles, unmanned aerial vehicles, etc. by applying AI technology.
- the autonomous vehicle (12) may include an autonomous driving control module for controlling autonomous driving functions, and the autonomous driving control module may mean a software module or a chip that implements the same as hardware.
- the autonomous driving control module may be included internally as a component of the autonomous vehicle (12), but may also be configured as separate hardware and connected to the outside of the autonomous vehicle (12).
- An autonomous vehicle (12) can obtain status information of the autonomous vehicle (12), detect (recognize) the surrounding environment and objects, generate map data, determine a movement path and driving plan, or determine an operation by using sensor information obtained from various types of sensors.
- the autonomous vehicle (12) can use sensor information acquired from at least one sensor among lidar, radar, and camera, similar to the robot (11), to determine the movement path and driving plan.
- an autonomous vehicle (12) can recognize an environment or objects in an area where the field of vision is obscured or an area beyond a certain distance by receiving sensor information from external devices, or can receive information recognized directly from external devices.
- the autonomous vehicle (12) can perform the above-described operations using a learning model composed of at least one artificial neural network.
- the autonomous vehicle (12) can recognize the surrounding environment and objects using the learning model, and determine the driving route using the recognized surrounding environment information or object information.
- the learning model can be trained directly in the autonomous vehicle (12) or trained in an external device such as the AI server (16).
- the autonomous vehicle (12) may perform an operation by generating a result directly using a learning model, but may also perform an operation by transmitting sensor information to an external device such as the AI server (16) and receiving the result generated accordingly.
- Robots (11) can be implemented as guide robots, transport robots, cleaning robots, wearable robots, entertainment robots, pet robots, unmanned flying robots, etc. by applying AI technology and autonomous driving technology.
- an electronic device may include a frame (100), a projector device (200), and a display unit (300).
- the electronic device may be provided as a glass type (smart glass).
- the glass type electronic device is configured to be worn on the head of the human body and may be provided with a frame (case, housing, etc.) (100) for this purpose.
- the frame (100) may be formed of a flexible material to facilitate wearing.
- the frame (100) is supported on the head and provides a space for mounting various components.
- electronic components such as a projector device (200), a user input unit (130), or an audio output unit (140) may be mounted on the frame (100).
- a lens covering at least one of the left and right eyes may be detachably mounted on the frame (100).
- the frame (100) may have a form of glasses worn on the face of the user's body as shown in the drawing, but is not necessarily limited thereto, and may also have a form such as goggles worn in close contact with the user's face.
- Such a frame (100) may include a front frame (110) having at least one opening, and a pair of side frames (120) that extend in the y direction (in FIG. 2) intersecting the front frame (110) and are parallel to each other.
- the frame (100) may have the same or different length (D1) in the x direction and length (L1) in the y direction.
- the projector device (200) is provided to control various electronic components provided in an electronic device.
- the term 'projector device (200)' may be used interchangeably with 'optical output device', 'optical projector device', 'light irradiation device', 'optical device', etc.
- the projector device (200) can generate an image or a video of a series of images that are displayed to the user.
- the projector device (200) can include an image source panel that generates an image and a plurality of lenses that diffuse and converge light generated from the image source panel.
- the projector device (200) may be secured to either of the two side frames (120).
- the projector device (200) may be secured to the inside or outside of either of the side frames (120), or may be integrally formed by being built into the inside of either of the side frames (120).
- the projector device (200) may be secured to the front frame (110) or may be provided separately from the electronic device.
- the display unit (300) may be implemented in the form of a head mounted display (HMD).
- HMD form refers to a display method that is mounted on the head and directly shows an image in front of the user's eyes.
- the display unit (300) may be positioned to correspond to at least one of the left and right eyes so that the image can be directly provided in front of the user's eyes.
- the display unit (300) is positioned in a portion corresponding to the right eye so that the image can be output toward the user's right eye.
- it is not limited thereto and may be positioned for both the left and right eyes.
- the display unit (300) can allow the user to visually perceive the external environment while simultaneously allowing the user to see an image generated by the projector device (200).
- the display unit (300) can project an image onto the display area using a prism.
- the display unit (300) may be formed to be translucent so that the projected image and the general field of view in front (the range that the user sees through their eyes) can be viewed simultaneously.
- the display unit (300) may be translucent and may be formed of an optical member including glass.
- the display unit (300) may be inserted and fixed into an opening included in the front frame (110), or may be positioned on the back surface of the opening (i.e., between the opening and the user) and fixed to the front frame (110).
- the display unit (300) is positioned on the back surface of the opening and fixed to the front frame (110) as an example, but the display unit (300) may be positioned and fixed at various positions of the frame (100).
- the electronic device projects image light from the projector device (200) to one side of the display unit (300)
- the image light is emitted to the other side through the display unit (300), allowing the user to see the image generated from the projector device (200).
- the user can view the external environment through the opening of the frame (100) and at the same time view the image generated by the projector device (200). That is, the image output through the display unit (300) can be seen to overlap with the general field of view.
- the electronic device can provide augmented reality (AR) that superimposes a virtual image on a real image or background and shows it as a single image.
- AR augmented reality
- alternatively, images of the external environment and images from the projector device (200) can be provided to the user with a time difference short enough not to be perceived by the person.
- that is, the external environment can be provided to the person in one section, and images from the projector device (200) can be provided to the person in another section.
- both overlap and time difference may be provided.
- the projector device according to the embodiment may have the structure described below, or may be formed by a structure further including a waveguide or/and glass in the structure.
- the projector device may include a DLP (Digital Light Processing) projector or another type of projector device.
- the projector device may be expressed as a projector.
- the projector according to the embodiment may correspond to the projector device included in the augmented reality electronic device according to the embodiment.
- the following display unit may be expressed as a light guide device.
- the light guide device according to the embodiment may correspond to the display unit included in the augmented reality electronic device according to the embodiment.
- FIG. 4 is a schematic diagram of a light guide device according to an embodiment of the present invention.
- the light guide device (300) may include a first substrate (310), a first input diffraction element (320), a first transmission diffraction element (330), a first output diffraction element (340), and a cover (301).
- the light guide device (300) can change the path of light output from the projector (200) and output the light to the outside again.
- the light guide device (300) can output the light to the outside so that it reaches the user's eyes.
- the light can be sequentially input to the first input diffraction element (320), the first transmission diffraction element (330), and the first output diffraction element (340) and output to the outside again.
- the direction of incidence of the light to the light guide device (300) can be a direction perpendicular to the first substrate (310) or a direction forming a certain angle with the direction perpendicular to the first substrate (310).
- the first substrate (310) can serve as a path for transmitting light.
- a first input diffractive element (320), a first transmission diffractive element (330), and a first output diffractive element (340) can be arranged on the first substrate (310).
- the light can be totally reflected inside the first substrate (310) and travel along the inside of the first substrate (310).
- the first substrate (310) can include a waveguide.
- the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340) can be arranged spaced apart from each other on the first substrate (310).
- the first substrate (310) may include a plurality of surfaces.
- the first substrate (310) may include a first surface and a second surface.
- the first surface may be a surface on which light is incident.
- the second surface may be a surface on which light is emitted.
- the second surface may be a surface spaced apart from the first surface.
- the first surface and the second surface may be parallel to each other.
- a first input diffraction element (320), a first transmission diffraction element (330), and a first output diffraction element (340) may be arranged on the first surface or the second surface.
- the first input diffraction element (320) can serve as a path through which light is incident.
- the first input diffraction element (320) can be placed on the first substrate (310).
- Light can be incident from the outside to the light guide device (300) through the first input diffraction element (320) and transmitted through the first substrate (310).
- the first input diffraction element (320) can change the path of light by diffracting the light.
- the first transmission diffraction element (330) can play a role in changing the path of light.
- the first transmission diffraction element (330) can be arranged on the first substrate (310).
- the first transmission diffraction element (330) can change the path of light incident through the first input diffraction element (320).
- the first transmission diffraction element (330) can change the path of light so that it is directed toward the first output diffraction element (340).
- the first transmission diffraction element (330) can change the path of light by diffracting the light.
- the first output diffraction element (340) can serve as a path through which light is emitted.
- the first output diffraction element (340) can be arranged on the first substrate (310). Light can be emitted to the outside of the light guide device (300) through the first output diffraction element (340).
- the first output diffraction element (340) can receive light whose path has been changed from the first transmission diffraction element (330) and emit it to the outside.
- the first output diffraction element (340) can change the path of light and emit it to the outside.
- the first output diffraction element (340) can change the path of light by diffracting the light.
- the cover (301) may be placed on the first substrate (310), the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340).
- the cover (301) may be placed adjacent to the projector (200) on the first substrate (310), the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340).
- Light may pass through the cover (301) and enter the first input diffractive element (320).
- the cover (301) may have an effect of protecting the interior of the light guide device (300).
- the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340) may include a plurality of protrusions.
- the plurality of protrusions may have constant widths, periods, and heights and may be arranged on the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340).
- the plurality of protrusions may protrude in a direction perpendicular to the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340).
- the plurality of protrusions may be arranged to be spaced apart from each other in a vector direction of a pattern including the protrusions.
- the paths of light after passing through the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340) may be changed differently.
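- the successive path changes can be pictured with standard grating-momentum bookkeeping: at each element, the in-plane wavevector is shifted by an integer multiple of that element's grating vector. The sketch below reuses the grating_vector helper from the earlier sketch; it illustrates the general mechanism only, not the patent's Expressions 1 and 2.

```python
def diffract(k_in: tuple[float, float],
             grating_k: tuple[float, float],
             order: int = 1) -> tuple[float, float]:
    """In-plane wavevector after diffraction: k_out = k_in + order * K."""
    return (k_in[0] + order * grating_k[0], k_in[1] + order * grating_k[1])

# Chaining input -> transmission -> output (normal incidence: no initial
# in-plane momentum); k_ic, k_fg, k_oc come from the earlier sketch.
k1 = diffract((0.0, 0.0), k_ic)  # after the first input diffractive element
k2 = diffract(k1, k_fg)          # after the first transmission diffractive element
k3 = diffract(k2, k_oc)          # after the first output diffractive element
```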
- FIG. 5 is a drawing showing the angle of the substrate of the light guide device according to an embodiment of the present invention.
- the first coordinate axis (x1, y1, z1) may be a coordinate axis set based on the user.
- the user's gaze direction may be defined as the z1 axis.
- the x1 axis and the y1 axis may be axes set in a direction perpendicular to the z1 axis.
- the x1 axis and the y1 axis may be perpendicular to each other.
- the second coordinate axis (x2, y2, z2) may be a coordinate axis set based on the light guide device.
- the direction perpendicular to the first substrate (310) of the light guide device may be defined as the z2 axis.
- the x2 axis and the y2 axis may be axes set in a direction perpendicular to the z2 axis.
- the x2 axis and the y2 axis may be perpendicular to each other.
- the wrap angle (α) may be defined as an angle formed by the first substrate (310) of the light guide device and the first direction (x1).
- the first direction (x1) may be a direction perpendicular to a direction in which a user views the light guide device.
- the direction in which the user views the light guide device may be a third direction (z1).
- the first direction (x1) may be a direction perpendicular to the third direction (z1).
- a pantoscopic angle (β) may be defined as an angle formed by a first substrate (310) of the light guide device and a second direction (y1).
- the second direction (y1) may be a direction perpendicular to a direction in which a user views the light guide device.
- the second direction (y1) may be a direction perpendicular to the first direction (x1) and the third direction (z1).
- the placement angle of the light guide device may vary depending on the wrap angle and the pantoscopic angle.
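- for illustration, the wrap and pantoscopic angles can be treated as two successive rotations between the user frame (x1, y1, z1) and the substrate frame (x2, y2, z2). The rotation axes and composition order in the sketch below are assumptions; the text only states which direction each angle is measured against.

```python
import numpy as np

def user_to_substrate(alpha_deg: float, beta_deg: float) -> np.ndarray:
    """Rotation matrix from the user frame to the substrate frame, assuming
    the wrap angle (alpha) is a rotation about y1 and the pantoscopic angle
    (beta) a rotation about x1."""
    a, b = np.radians(alpha_deg), np.radians(beta_deg)
    rot_y = np.array([[np.cos(a), 0.0, np.sin(a)],
                      [0.0,       1.0, 0.0      ],
                      [-np.sin(a), 0.0, np.cos(a)]])
    rot_x = np.array([[1.0, 0.0,        0.0       ],
                      [0.0, np.cos(b), -np.sin(b)],
                      [0.0, np.sin(b),  np.cos(b)]])
    return rot_x @ rot_y

# Substrate normal (z2) expressed in user coordinates for the embodiment's
# wrap angle of 6 degrees and pantoscopic angle of 8 degrees:
z2_in_user = user_to_substrate(6.0, 8.0).T @ np.array([0.0, 0.0, 1.0])
```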
- FIG. 6 is a drawing showing the angle between the projector and the substrate of the light guide device according to an embodiment of the present invention.
- the first angle (θ_projector) may be defined as an angle between the projector (200) and the first substrate (310).
- the angle between the projector (200) and the first substrate (310) may be an angle formed between a direction in which the projector (200) outputs light and a direction (z2) perpendicular to the first substrate (310).
- the first angle (θ_projector) may be 0° to 90°.
- the second angle (φ_projector) can be defined as the angle at which the projector (200) faces the first substrate (310).
- the angle at which the projector (200) faces the first substrate (310) can be the angle that the direction in which the projector (200) outputs light forms with the first axis (x2) on the first substrate (310).
- the first axis (x2) can be the long axis of the first substrate (310).
- the second angle (φ_projector) may be an angle formed counterclockwise from the first axis (x2).
- the second angle (φ_projector) can be between 0° and 360°.
- the arrangement relationship between the projector and the light guide device may vary depending on the first and second angles.
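- the two projector angles can likewise be illustrated as a polar/azimuth decomposition of the projector's output direction in substrate coordinates. The sketch below takes θ_projector as the tilt away from the substrate normal (z2) and φ_projector as the azimuth measured counterclockwise from the long axis (x2); both choices are assumptions made for illustration only.

```python
import math

def projector_direction(theta_deg: float, phi_deg: float) -> tuple[float, float, float]:
    """Unit vector of the projector output direction in substrate coordinates
    (x2, y2, z2), assuming theta is measured from z2 and phi from x2."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    return (math.sin(t) * math.cos(p), math.sin(t) * math.sin(p), math.cos(t))

direction = projector_direction(5.0, 7.0)  # embodiment values: 5 and 7 degrees
```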
- FIG. 7 is a drawing showing a grating pattern of a diffraction element of a light guide device according to an embodiment of the present invention.
- the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340) may each include a grating pattern in which a plurality of protrusions are repeatedly arranged.
- the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340) may each include a plurality of protrusions, and the plurality of protrusions may be arranged in parallel and spaced apart from each other by a predetermined distance.
- the grating period of the diffractive element may be defined as the shortest distance between one side of the protrusion and one side of an adjacent protrusion.
- the grating period of the diffractive element may be the shortest distance between the same side surfaces of the protrusions.
- the grating vector of the diffractive element may be set in a direction perpendicular to the direction in which the protrusions are arranged.
- the grating angle of the diffractive element may be an angle that the grating vector of the diffractive element forms with the first axis (x2). Additionally, the grating angle of the diffractive element may be the angle formed by the separation direction between the protrusions of the diffractive element and the first axis (x2).
- the grating angle (φ_IC) may be an angle formed by the separation direction between the first protrusions (321) of the first input diffractive element (320) and the first axis (x2).
- the grating period (Λ_IC) of the first input diffractive element (320) may be 480 nm to 520 nm.
- the grating period (Λ_IC) of the first input diffractive element (320) may be 500 nm.
- the grating angle (φ_IC) can be 310° to 320°.
- the grating angle (φ_IC) of the first input diffractive element (320) can be 315°.
- the first transmission diffraction element (330) may include a plurality of second protrusions (331).
- the grating period (Λ_FG) of the first transmission diffraction element (330) may be the shortest distance between one side of the second protrusion (331) and one side of the adjacent second protrusion (331).
- the grating period (Λ_FG) of the first transmission diffraction element (330) may be the shortest distance between the same side surfaces of the adjacent second protrusions (331).
- the grating angle (φ_FG) may be the angle formed by the grating vector of the first transmission diffraction element (330) with the first axis (x2).
- the grating angle (φ_FG) may be an angle formed by the separation direction between the second protrusions (331) of the first transmission diffraction element (330) and the first axis (x2).
- the grating period (Λ_FG) of the first transmission diffraction element (330) may be 380 nm to 420 nm.
- the grating period (Λ_FG) of the first transmission diffraction element (330) may be 400 nm.
- the grating angle (φ_FG) can be 65° to 75°.
- the grating angle (φ_FG) of the first transmission diffraction element (330) can be 70°.
- the first output diffraction element (340) may include a plurality of third protrusions (341).
- the grating period (Λ_OC) of the first output diffraction element (340) may be the shortest distance between one side of the third protrusion (341) and one side of an adjacent third protrusion (341).
- the grating period (Λ_OC) of the first output diffraction element (340) may be the shortest distance between the same side surfaces of adjacent third protrusions (341).
- the grating angle (φ_OC) may be the angle formed by the grating vector of the first output diffraction element (340) with the first axis (x2).
- the grating angle (φ_OC) may be an angle formed by the separation direction between the third protrusions (341) of the first output diffraction element (340) and the first axis (x2).
- the grating period (Λ_OC) of the first output diffraction element (340) may be 360 nm to 400 nm.
- the grating period (Λ_OC) of the first output diffraction element (340) may be 378 nm.
- the grating angle (φ_OC) can be 180° to 190°.
- the grating angle (φ_OC) of the first output diffraction element (340) can be 185.35°.
- the light guide device (300) according to the embodiment can satisfy mathematical expressions 1 and 2.
- θ_projector is the angle between the projector and the first substrate, φ_projector is the angle at which the projector faces the first substrate, α is the wrap angle of the first substrate, λ is the wavelength of light, β is the pantoscopic angle of the first substrate, Λ_IC is the grating period of the first input diffractive element, Λ_OC is the grating period of the first output diffractive element, Λ_FG is the grating period of the first transmission diffractive element, φ_IC is the grating angle of the first input diffractive element, φ_OC is the grating angle of the first output diffractive element, φ_FG is the grating angle of the first transmission diffractive element, FoV is the field of view of the light guide device, and x and y can be the resolutions of the light guide device.
- the field of view of the light guide device can be the angle at which light is incident on the first input diffractive element and transmitted to the first transmission diffractive element. It is possible to design a grating structure of the diffractive element that can be applied to all angles of the light guide device and the projector.
- in a light guide device according to an embodiment, Λ_IC may be 500 nm, Λ_OC may be 378 nm, Λ_FG may be 400 nm, φ_IC may be 315°, φ_OC may be 185.35°, φ_FG may be 70°, λ may be 525 nm, θ_projector may be 5°, φ_projector may be 7°, α may be 6°, and β may be 8°.
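- a quick plausibility check of these example values (a sketch only; the substrate refractive index is not stated in the text, and 1.8 is assumed here): at normal incidence, the first-order grating equation n·sin(θ_d) = λ/Λ_IC puts the in-guide ray beyond the critical angle, so the input-coupled light is totally internally reflected.

```python
import math

def guided_angle_deg(lam_nm: float, period_nm: float, n_sub: float) -> float:
    """First-order in-guide angle at normal incidence: n * sin(theta) = lam / period."""
    return math.degrees(math.asin(lam_nm / (period_nm * n_sub)))

def critical_angle_deg(n_sub: float) -> float:
    return math.degrees(math.asin(1.0 / n_sub))

n = 1.8  # assumed substrate refractive index; not stated in the text
theta_d = guided_angle_deg(525.0, 500.0, n)  # about 35.7 degrees
theta_c = critical_angle_deg(n)              # about 33.7 degrees
assert theta_d > theta_c  # the coupled ray satisfies total internal reflection
```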
- FIG. 8 is a schematic diagram of a light guide device according to another embodiment of the present invention.
- the light guide device (300) may further include a second substrate (350), a second input diffractive element (360), a second transmission diffractive element (370), and a second output diffractive element (380).
- the second substrate (350) may be arranged spaced apart from the first substrate (310) in the direction of the optical axis.
- the second substrate (350) may be arranged at the rear end of the first substrate (310) on the path of light.
- the second input diffraction element (360), the second transmission diffraction element (370), and the second output diffraction element (380) may be arranged on the second substrate (350).
- Light may be incident through the second input diffraction element (360) and may be emitted through the second output diffraction element (380).
- the second input diffraction element (360) may diffract light that passes through the first input diffraction element (320) without being diffracted by the first input diffraction element (320).
- the wrap angle of the first substrate and the wrap angle of the second substrate may be the same.
- the pantoscopic angle of the first substrate and the pantoscopic angle of the second substrate may be the same.
- the second input diffractive element (360) may diffract light having a different wavelength band from the first input diffractive element (320). Accordingly, the wavelength band of light transmitted through the second substrate (350) may be different from the wavelength band of light transmitted through the first substrate (310). Accordingly, light sequentially incident on the first input diffractive element, the first transmission diffractive element, and the first output diffractive element may have a different wavelength from light sequentially incident on the second input diffractive element, the second transmission diffractive element, and the second output diffractive element. Light having different wavelength bands may be emitted from the first output diffractive element (340) and the second output diffractive element (380), respectively, and may reach the user's eyes.
- the grating angle of the first input diffractive element (320) and the grating angle of the second input diffractive element (360) may be the same, the grating angle of the first transmission diffractive element (330) and the grating angle of the second transmission diffractive element (370) may be the same, and the grating angles of the first output diffractive element (340) and the second output diffractive element (380) may be the same.
- light of different wavelength bands can be transmitted and output in the same manner.
- the grating periods of the second input diffractive element (360), the second transmission diffractive element (370), and the second output diffractive element (380) may be different from those of the first input diffractive element (320), the first transmission diffractive element (330), and the first output diffractive element (340), respectively; because the grating periods differ, light of different wavelength bands can be diffracted.
- the light guide device according to the embodiment can satisfy mathematical expressions 3 and 4.
- θ_projector is the angle between the projector and the second substrate, φ_projector is the angle at which the projector faces the second substrate, α is the wrap angle of the second substrate, λ is the wavelength of light, β is the pantoscopic angle of the second substrate, Λ_IC is the grating period of the second input diffractive element, Λ_OC is the grating period of the second output diffractive element, Λ_FG is the grating period of the second transmission diffractive element, φ_IC is the grating angle of the second input diffractive element, φ_OC is the grating angle of the second output diffractive element, φ_FG is the grating angle of the second transmission diffractive element, FoV is the field of view of the light guide device, and x and y can be the resolutions of the light guide device. It is possible to design a grating structure of the diffractive element that can be applied to all angles of the light guide device and the projector.
- a vehicle system may include a vehicle, a passenger (driver), and an electronic device.
- the electronic device is described as a device separately provided in the vehicle (1), but the present invention is not limited thereto.
- the electronic device may be implemented as a part of the vehicle.
- a vehicle may include a body and various devices for moving the body (e.g., wheels, a driving device for driving the wheels, a starting device for starting the driving device, an engine for generating power and transmitting the generated power to the driving device, a steering device for controlling the direction of the vehicle, an accelerator for controlling the speed of the vehicle, etc.).
- the vehicle may include various electrical systems.
- the electrical system may include an engine control device for controlling the engine, a temperature control device for controlling the temperature inside the vehicle, a light control device for controlling the lights according to external conditions, etc.
- the vehicle may include a communication interface capable of communicating with an electronic device, and may include an additional processor for analyzing data transmitted through the communication interface and performing preset functions based on the analysis results.
- an electronic device may be connected to a camera module to acquire an image of a driver, analyze the acquired image, and then perform various set function processing (e.g., deceleration processing, turning on or off an emergency light, controlling a horn device, controlling vehicle vibration, controlling window opening and closing, etc.) based on the analysis result.
- various function processing may be additionally implemented.
- the driver, as a person who sits in the driver's seat and controls the steering device, can be the subject of image capture by the electronic device.
- the electronic device acquires an image of the driver seated in the driver's seat as a representative example, but the present invention is not limited thereto.
- the monitoring system can be applied to acquire images of not only the driver but also the passenger seated in the passenger seat or other seats, and to perform adjustment of the image acquisition method according to various actions of the passenger.
- at least one electronic device can be arranged in the vehicle.
- a camera module and an electronic device can be arranged so as to acquire an image of the driver seated in the driver's seat, in a configuration that supports driver monitoring only.
- multiple electronic devices may be placed within the vehicle to enable image acquisition of the driver and passenger seats.
- the electronic device can receive image information about the driver's face through the camera module. Accordingly, the electronic device can determine whether the driver is drowsy, etc. in a predetermined manner and provide various feedbacks (e.g., alarms) corresponding thereto. For example, the electronic device can determine whether the driver is drowsy based on the driver's eye tracking or the size or change in the size of the iris. In addition, the electronic device can receive image information about the driver's hands as well as the driver's face from the camera module. Accordingly, the electronic device can easily determine the driver's hands-off state. In other words, a driver monitoring system having a hands-off monitoring function can be implemented. Accordingly, the driver's concentration state or whether the driver is drowsy can be easily determined based on various driver movements according to the driver's hands-off state.
- the camera module may be placed in a location within the vehicle where it is easy to capture images of passengers.
- it may be placed in various locations within the vehicle, such as the windshield (e.g., the location where the head-up display is placed) or the bottom of the windshield, the dashboard, the instrument panel, etc., so as to obtain images of a subject seated in the driver's seat.
- the camera module may be placed in a location that is difficult for passengers to easily recognize.
- a camera module connected to an electronic device may be placed at a predetermined location within the vehicle to receive image information about passengers other than the driver of the vehicle.
- the camera module may be placed on a rearview mirror (or room mirror) or the like to detect all passengers other than the driver. As a result, the camera module may generate an image of all passengers.
- the camera module can generate image information about other passengers in the vehicle other than the driver.
- the electronic device can receive image information including passengers other than the driver.
- a camera module may include a light source unit, a light guide device, a light receiving unit, and a control unit.
- the light source unit can output light by a control signal. The light output from the light source unit can then be irradiated onto an object, and the light irradiated onto the object can be reflected and provided to the light receiving unit.
- the light source unit may include at least one light source. And at least one light source may emit light of a predetermined wavelength band or light having a predetermined center wavelength. In addition, the light source of the light source unit may emit light of a predetermined pattern according to a pre-designed algorithm. The light source unit may output light under the control of the control unit.
- output light or incident light may refer to light output from a light source and provided to an object
- input light or reflected light may refer to light that is output from a light source, reaches an object, is reflected from the object, and is input to a light receiving unit. That is, from the object's perspective, output light may be incident light, and input light may be reflected light.
- At least one light source of the light source unit can output light of a predetermined wavelength band.
- the wavelength of the light output from the light source can be infrared rays of 770 nm to 3000 nm.
- the wavelength of the light output from the light source can be visible light of 380 nm to 770 nm.
- the light source of the light source unit can emit light outside the wavelength range described above.
- the light source can irradiate light of a specific wavelength band, as described above, or light of a specific energy or lower, so as not to be harmful to occupants of the vehicle such as the driver and passengers.
- the light source may include a light emitting diode (LED), an organic light emitting diode (OLED), a laser diode (LD), a vertical-cavity surface-emitting laser (VCSEL), a plasma lamp, a fluorescent lamp, a xenon lamp, a halogen lamp, a neon lamp, or the like.
- the light source may output a wavelength of about 800 nm to 1000 nm, for example, about 850 nm or about 940 nm.
- the light guide device can be arranged adjacent to the light source and the light receiving unit.
- the light guide device can guide light irradiated from the light source and transmit it to an object.
- the light guide device can guide light reflected from the object again and provide it to the light receiving unit.
- the light guide device can be configured to control the light and move it along a desired path.
- the light guide device can perform both transmitting light to the object and receiving reflected light.
- the light guide device can be configured to help light from the camera module accurately reach the sensor or guide it along a specific path so that optical information is accurately transmitted.
- These light guide devices can be made of materials such as glass, polymer, or silicon.
- the light guide devices can include various other light-guiding materials.
- the light guide device can transmit light in a desired direction by utilizing diffraction.
- the light guide device can include an optical element for determining the path of light in a substrate that is a waveguide.
- the optical element can include various elements that operate based on diffraction.
- the light guide device may include a diffractive element, which is a holographic optical element (HOE).
- the light guide device may include an input diffractive element, an input/output diffractive element, and an output diffractive element, as described below.
- the input diffractive element, the input/output diffractive element, and the output diffractive element may be formed of a holographic optical element.
- holographic optical elements diffract light using interference patterns generated by laser interference, and through this, light of a specific wavelength can be controlled or diffracted in a desired direction.
- Bragg’s Law is applied in this diffraction process, and the diffraction angle can be determined according to the wavelength of light and the structure of the holographic optical element.
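- as a worked illustration of the Bragg condition for a volume hologram, 2·n·Λ·sin(θ_B) = m·λ; the sketch below assumes a recording-medium index of 1.5 and a 500 nm fringe spacing, neither of which is given in the text.

```python
import math

def bragg_angle_deg(lam_nm: float, spacing_nm: float, n: float = 1.5, m: int = 1) -> float:
    """Bragg angle from 2 * n * spacing * sin(theta_B) = m * lam."""
    return math.degrees(math.asin(m * lam_nm / (2.0 * n * spacing_nm)))

# e.g., an 850 nm source (one wavelength mentioned for the light source unit):
print(round(bragg_angle_deg(850.0, 500.0), 1), "degrees")  # about 34.5
```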
- a holographic optical element can be composed of an interference pattern recorded on a transparent substrate.
- the transparent substrate is a waveguide and can be made of various materials such as glass, plastic, and polymer.
- the interference pattern of the holographic optical element can be precisely designed inside or on the surface of the substrate to guide light in a specific direction.
- the holographic optical element can be classified into a transmissive type, in which light is diffracted while passing through, and a reflective type, in which light is diffracted while being reflected. Accordingly, the position of the holographic optical element on the substrate can vary depending on its type. Light can be precisely controlled by such a holographic optical element to provide a high-resolution image.
- the holographic optical element can also support high-speed data transmission through wavelength separation and combination in optical communication.
- the holographic optical element can provide a miniaturized camera module because it is lighter and thinner than a conventional lens or mirror.
- the light receiving unit can receive light transmitted through the light guide device.
- the light receiving unit can include an image sensor.
- the image sensor can receive light reflected from an object. Accordingly, the image sensor can detect the light and convert it into an electrical signal. For example, the image sensor can convert the electrical signal to generate a digital image.
- the image sensor can include a CCD (Charge-Coupled Device), a CMOS (Complementary Metal-Oxide-Semiconductor), an InGaAs (Indium Gallium Arsenide) sensor, a HgCdTe (Mercury Cadmium Telluride) sensor, a microbolometer, and the like.
- the light receiving unit may also include an image sensor that receives light of various wavelength bands other than those described above or as examples.
- the light receiving unit may be positioned adjacent to the light guide device, just like the light source unit.
- additional lenses may be positioned between the light receiving unit and the light guide device. The same may be applied between the light source unit and the light guide device.
- the control unit can control the operation of the light source unit and the light receiving unit.
- the control unit can generate depth information based on the image generated by the light receiving unit, or transmit and receive image information with other electronic devices such as vehicles.
- the control unit can control operations within the camera module, and can also communicate with a processor within an external electronic device such as a vehicle.
- the control unit may include a processor, a microcontroller (MCU), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc., and may also be implemented in the form of an application processor (AP) of various electronic devices.
- FIG. 9 is a drawing illustrating the configuration of a camera module according to an embodiment of the present invention
- FIG. 10 is a drawing illustrating the angle of a substrate of a light guide device according to an embodiment of the present invention.
- the light guide device (400) may include a cover (401), a first substrate (410), a first input diffractive element (420), a first transmission diffractive element (430), a first input/output diffractive element (440), and a first output diffractive element (450).
- the first input diffractive element (420), the first transmission diffractive element (430), the first input/output diffractive element (440), and the first output diffractive element (450) may be arranged on a first substrate (410) which is a waveguide.
- the first input diffractive element (420), the first transmission diffractive element (430), the first input/output diffractive element (440), and the first output diffractive element (450) may be either a transmission type or a reflection type, and may be positioned on either one surface (e.g., an upper surface) or the other surface (e.g., a lower surface) of the first substrate (410).
- the first input diffraction element (420), the first transmission diffraction element (430), the first input/output diffraction element (440), and the first output diffraction element (450) are diffraction elements as described above, and can be spaced apart from each other.
- the first input diffraction element (420) diffracts light provided from the light source unit (600) and guides it to the first substrate (410), and the light guided into the first substrate (410) can be provided to the first input/output diffraction element (440) through the first transmission diffraction element (430).
- the first input/output diffraction element (440) diffracts light guided from the first input diffraction element (420) to the first substrate (410) toward an object, and diffracts light reflected from the object and guides it into the substrate (410).
- the first output diffraction element (450) diffracts light reflected from an object and guided to the substrate (410) by the first input/output diffraction element (440), and guides or provides it to the light receiving unit (500). At this time, light diffracted by the first output diffraction element (450) and guided to the light receiving unit (500) may be incident on the light receiving unit (500) and converted into image information.
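- the round trip through the camera-module guide can be sketched with the same momentum bookkeeping used earlier; all periods and angles below are hypothetical placeholders (the text gives no values for the camera-module gratings), and the diffraction-order signs are assumptions. The grating_vector and diffract helpers are reused from the earlier sketches.

```python
# Hypothetical grating parameters for the camera-module guide:
k_ic2 = grating_vector(500.0, 315.0)   # first input diffractive element (420)
k_fg2 = grating_vector(400.0, 70.0)    # first transmission diffractive element (430)
k_ioc = grating_vector(420.0, 180.0)   # first input/output diffractive element (440)
k_oc2 = grating_vector(378.0, 185.35)  # first output diffractive element (450)

k = diffract((0.0, 0.0), k_ic2)        # source light coupled into the substrate
k = diffract(k, k_fg2)                 # redirected by the transmission element
k = diffract(k, k_ioc, order=-1)       # coupled out toward the object
k = diffract(k, k_ioc)                 # reflected light re-coupled into the guide
k_to_sensor = diffract(k, k_oc2, order=-1)  # coupled out toward the light receiving unit
```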
- the first transmission diffraction element (430) may not be included.
- the light guide device of the camera module according to the embodiment can satisfy mathematical expressions 5 and 6.
- θ_camera is the angle between the light receiving unit and the first substrate, φ_camera is the angle at which the light receiving unit faces the first substrate, α is the wrap angle of the first substrate, λ is the wavelength of light, β is the pantoscopic angle of the first substrate, Λ_IOC is the grating period of the first input/output diffractive element, Λ_OC is the grating period of the first output diffractive element, Λ_FG is the grating period of the first transmission diffractive element, φ_IOC is the grating angle of the first input/output diffractive element, φ_OC is the grating angle of the first output diffractive element, φ_FG is the grating angle of the first transmission diffractive element, FoV is the field of view of the light guide device, and x and y can be the resolutions of the light guide device.
- the field of view of the light guide device can be the angle at which light is emitted from the first output diffraction element and transmitted to the light receiving unit.
- the grating structure of the diffraction element that can be applied to all angles of the light guide device and the light receiving unit can be designed. (In the case where the first transmission diffraction element is not included in the light guide device according to another embodiment, the grating period (Λ_FG) and the grating angle (φ_FG) of the first transmission diffraction element in Expressions 5 and 6 can be ignored.)
- the wrap angle (α) of Mathematical Expression 5 can be defined as an angle formed between the first substrate of the light guide device and the first direction (x1).
- the first direction (x1) may be a direction perpendicular to a direction in which a user views the light guide device.
- the pantoscopic angle (β) of Mathematical Expression 6 can be defined as an angle formed between the first substrate of the light guide device and the second direction (y1).
- the second direction (y1) may be a direction perpendicular to a direction in which a user views the light guide device.
- the pantoscopic angle (β) may have a range of -(90° - FoV) to +(90° - FoV).
- the pantoscopic angle (β) according to an embodiment may be 15° to 25°.
- for example, the pantoscopic angle (β) may be 20°.
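As a quick sanity check of the range stated above, the following sketch tests whether a given pantoscopic angle falls within ±(90° − FoV); the helper name and values are hypothetical, not from the patent.

```python
def pantoscopic_angle_in_range(beta_deg: float, fov_deg: float) -> bool:
    """Check the constraint above: -(90 - FoV) <= beta <= +(90 - FoV), in degrees."""
    limit = 90.0 - fov_deg
    return -limit <= beta_deg <= limit

# Illustrative values only: a 20-degree pantoscopic angle with a 40-degree
# viewing angle passes (allowed range is then +/-50 degrees), while a
# 75-degree viewing angle shrinks the allowed range to +/-15 degrees.
assert pantoscopic_angle_in_range(20.0, 40.0)
assert not pantoscopic_angle_in_range(20.0, 75.0)
```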
- θ_camera may be defined as an angle between the light receiving unit (corresponding to 200 in Fig. 6a) and the first substrate.
- the angle between the light receiving unit and the first substrate may be an angle formed between the direction in which the light receiving unit receives light and the direction perpendicular to the first substrate (z2).
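The definition above can be expressed as a small vector computation. This sketch is an assumption-laden illustration (function, argument names, and example vectors are hypothetical): it computes θ_camera as the angle between the light-receiving direction and the substrate normal (z2).

```python
import math

def theta_camera_deg(receive_dir, substrate_normal):
    """Angle (degrees) between the direction in which the light receiving unit
    receives light and the direction perpendicular to the first substrate (z2).
    Vectors are 3-tuples; names are illustrative, not from the patent."""
    dot = sum(a * b for a, b in zip(receive_dir, substrate_normal))
    norms = math.hypot(*receive_dir) * math.hypot(*substrate_normal)
    return math.degrees(math.acos(dot / norms))

# Example: a receiver tilted 10 degrees from the substrate normal in the x-z plane.
tilt = math.radians(10.0)
print(round(theta_camera_deg((math.sin(tilt), 0.0, math.cos(tilt)),
                             (0.0, 0.0, 1.0)), 1))  # -> 10.0
```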
Abstract
The invention relates to a light guide device and an electronic device comprising the same. The light guide device is used in augmented reality (AR), in which a grating structure with degrees of freedom for all angles of an optical device and an observer can be designed, the light guide device comprising: a first substrate; a first input diffractive element which is arranged on the first substrate and on which light emitted from a projector is successively incident; and a first transmission diffractive element and a first output diffractive element.
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2023-0149975 | 2023-11-02 | ||
| KR20230149975 | 2023-11-02 | ||
| KR1020240112154A KR20250064572A (ko) | 2023-11-02 | 2024-08-21 | Light guide device and electronic device including the same |
| KR10-2024-0112154 | 2024-08-21 | ||
| KR1020240129814A KR20250064580A (ko) | 2023-11-02 | 2024-09-25 | Light guide device and electronic device including the same |
| KR10-2024-0129814 | 2024-09-25 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025095720A1 (fr) | 2025-05-08 |
Family
ID=95582798
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2024/017138 Pending WO2025095720A1 (fr) | Light guide device and electronic device comprising same | 2023-11-02 | 2024-11-04 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025095720A1 (fr) |
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140016051A1 (en) * | 2010-12-22 | 2014-01-16 | Seereal Technologies S.A. | Combined light modulation device for tracking users |
| KR20190124175A (ko) * | 2018-12-26 | 2019-11-04 | 엘지전자 주식회사 | 전자 디바이스 |
| WO2022033929A1 (fr) * | 2020-08-10 | 2022-02-17 | Essilor International | Élément optique comprenant au moins un élément de diffusion holographique |
| KR20220026472A (ko) * | 2020-08-25 | 2022-03-04 | 삼성전자주식회사 | 홀로그래픽 회절 격자 구조를 갖는 웨이브가이드에 기초한 증강현실 디바이스 및 홀로그래픽 회절 격자 구조를 기록하는 장치 |
Non-Patent Citations (1)
| Title |
|---|
| Noui, Louahab et al., "Laser beam scanner and combiner architectures", Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) II, vol. 1176508, 27 March 2021 (2021-03-27), pages 1-12, XP060140739, [retrieved on 2025-01-17], DOI: 10.1117/12.2576702 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2022102954A1 (fr) | | Wearable electronic device comprising display |
| WO2020189866A1 (fr) | | Head-wearable electronic device |
| WO2020226235A1 (fr) | | Electronic device |
| WO2021040117A1 (fr) | | Electronic device |
| WO2021040076A1 (fr) | | Electronic device |
| WO2020138640A1 (fr) | | Electronic device |
| WO2023027300A1 (fr) | | Electronic device, head-mounted display device, wearable device, and operating method thereof |
| WO2022092517A1 (fr) | | Wearable electronic device comprising display unit, display control method, system comprising wearable electronic device and case |
| WO2021040107A1 (fr) | | AR device and method for controlling same |
| WO2021040082A1 (fr) | | Electronic device |
| WO2021040116A1 (fr) | | Electronic device |
| WO2021049693A1 (fr) | | Electronic device |
| WO2021040081A1 (fr) | | Electronic device |
| WO2021049694A1 (fr) | | Electronic device |
| WO2020138636A1 (fr) | | Electronic device |
| WO2022154176A1 (fr) | | Glasses-type terminal and method for providing virtual image to glasses-type terminal |
| WO2025095720A1 (fr) | | Light guide device and electronic device comprising same |
| WO2021033790A1 (fr) | | Electronic device |
| WO2025095721A1 (fr) | | Light guide device and electronic device comprising same |
| WO2025095728A1 (fr) | | Light guide device and electronic device comprising same |
| WO2025127562A1 (fr) | | Projector device and electronic device comprising same |
| WO2025206869A1 (fr) | | Projector device and electronic device comprising same |
| WO2021025190A1 (fr) | | Electronic device and method for manufacturing same |
| WO2025164971A1 (fr) | | Wearable device and method for moving virtual object to obtain information about gaze positions |
| WO2025089814A1 (fr) | | Projector device and electronic device comprising same |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24886390 Country of ref document: EP Kind code of ref document: A1 |