
WO2018182159A1 - Smart glasses capable of processing a virtual object - Google Patents

Smart glasses capable of processing a virtual object

Info

Publication number
WO2018182159A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
smart glasses
lens
display
user
Prior art date
Legal status
Ceased
Application number
PCT/KR2018/001100
Other languages
English (en)
Korean (ko)
Inventor
문명일
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from KR1020170118599A (publication KR20180109644A)
Application filed by Individual
Publication of WO2018182159A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00 or G02B30/00
    • G02B27/01: Head-up displays

Definitions

  • Embodiments of the present invention relate to smart glasses, and more particularly to smart glasses that have a camera and a display device and are capable of processing a virtual object.
  • A head-mounted display (HMD) refers to a digital device that is worn on the face like glasses and displays a virtual image so that multimedia content appears close to the eye.
  • Beyond a simple display function, an HMD can provide various conveniences to users in combination with augmented reality technology: it can communicate with external digital devices to output their content, receive user input on their behalf, or perform work in conjunction with them. Recently, research and development on HMDs have been actively conducted.
  • Augmented reality is a hybrid virtual reality technology that combines the real and virtual environments using a technique that displays three-dimensional virtual objects overlaid on the real world.
  • Augmented reality is used in various fields such as the military, entertainment, medicine, education, film, architectural design, and tourism, and is gradually moving beyond the imaginary stage described in science fiction and film into reality.
  • Augmented reality is an area of virtual reality, and virtual reality systems can be classified into window systems, mirror systems, vehicle-based systems, and augmented reality systems.
  • the virtual reality system includes an input device and an output device.
  • Output devices are devices that allow users to perceive visual, auditory, tactile, and motion information through sensory channels; they include visual display devices, auditory display devices, tactile display devices, motion feedback devices, and image display devices.
  • Representative visual display hardware includes HMDs and smart glasses.
  • Presence refers to the sense of being in a certain environment.
  • Remote presence refers to presence in a virtual environment created by a communication medium; a virtual reality system or an augmented reality system allows the user to experience remote presence.
  • Conventional smart glasses supply images to only one eye, either the left or the right, and because the user observes multimedia content with a single eye, side effects such as dizziness can occur even after only a short viewing time.
  • Smart glasses having one small projector for supplying content to the left eye and another small projector for the right eye have been proposed to supply multimedia content to both eyes, but because a separate small projector provides each image, the content delivered to the two eyes is not identical from the user's perspective and the two images are not associated with each other, so the user experiences considerable discomfort such as dizziness and significant fatigue.
  • Accordingly, an object of the present invention is to provide smart glasses with an optical system that can greatly reduce user fatigue and dizziness by projecting the image from above or below the face.
  • Another object of the present invention is to provide smart glasses in which the multimedia content provided by the display is reflected in both directions by a mirror near the eyes, enlarged by a magnifying lens or auxiliary lens, and then partially reflected toward both eyes and partially transmitted through a beam splitter in front of the eyes, so that both eyes observe the same image content formed as a virtual image, and in which 3D content can be implemented as needed, or augmented reality or virtual reality can easily be implemented using 3D content.
  • Smart glasses according to one aspect of the present invention for solving the above technical problem include: a frame assembly having a spectacle shape; electrical components mounted to the frame assembly; a display system having a first display device and a second display device that are supported by the frame assembly, controlled by the control device of the electrical components, and disposed on the center upper side and the center lower side of the frame assembly; and an optical system, coupled to the display system in the horizontal direction through a case assembly fixed to the frame assembly, that transmits the first image and the second image output from the vertically arranged first and second display devices in both horizontal directions through a mirror positioned between the first display device and the second display device.
  • The optical system reflects the first image, input in a first direction from the center upper side toward the center lower side, on a first surface so that it is transmitted in a second direction perpendicular to the first direction, and likewise reflects the second image input from the center lower side.
  • The optical system may include a first right lens in the form of a concave lens through which the first image transmitted from the mirror passes;
  • a second right lens in the form of a convex lens through which the first image passing through the first right lens passes again.
  • The first main lens or the second main lens may include a body in the form of a cube block, and a mirror surface, a reflective surface, or a beam-splitting surface disposed inclined with respect to the projection direction of the first image passing through the second right lens or the second image passing through the second left lens.
  • The controller may control the first display device or the second display device such that the second image is shifted by a predetermined number of pixels in one direction relative to the first image so as to correspond to binocular parallax.
  • The controller may also control the first display device or the second display device such that the first image and the second image are output sequentially or alternately to display a 3D or stereoscopic image, roughly as sketched below.
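As an aside not found in the patent text, the pixel-shift and frame-pair generation described in the two items above could look like the following minimal Python sketch. Frames are assumed to be NumPy arrays, and the function names and the 8-pixel default disparity are invented for the example.

```python
import numpy as np

def shift_for_parallax(frame: np.ndarray, disparity_px: int) -> np.ndarray:
    """Return a copy of `frame` shifted horizontally by `disparity_px` pixels,
    filling the vacated columns with black (positive values shift right)."""
    shifted = np.zeros_like(frame)
    if disparity_px > 0:
        shifted[:, disparity_px:] = frame[:, :-disparity_px]
    elif disparity_px < 0:
        shifted[:, :disparity_px] = frame[:, -disparity_px:]
    else:
        shifted[:] = frame
    return shifted

def stereo_pair(frame: np.ndarray, disparity_px: int = 8):
    """Derive the two images described above from one source frame:
    the first image unshifted, the second shifted to mimic binocular parallax."""
    return frame, shift_for_parallax(frame, disparity_px)
```

The two returned frames would then be driven to the first and second display devices, either simultaneously or alternately for sequential 3D output.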
  • The first display device or the second display device may include any one of a liquid crystal display (LCD), a digital light processing (DLP) projector, a liquid crystal on silicon (LCoS) device, a plasma display panel (PDP), a field emission display (FED), an electro-luminescent display (ELD), and an organic light emitting diode (OLED) device.
  • An optical system of smart glasses according to another aspect of the present invention for solving the above technical problem is an optical system of smart glasses worn on the user's face in the form of glasses so that images are displayed in front of the user's eyes. It includes a mirror that reflects, on a first surface, the first image input in a first direction from the front center upper side toward the front center lower side of the frame assembly provided in the form of glasses, transmitting it in a second direction orthogonal to the first direction, and that likewise handles the second image input from the front center lower side toward the front center upper side;
  • a first right lens in the form of a concave lens through which the first image transmitted from the mirror passes;
  • and a second right lens in the form of a convex lens through which the first image passing through the first right lens passes again.
  • The coupling structure of the first right lens and the second right lens, or of the first left lens and the second left lens, may have the form of a single lens whose front (or one side) is concave and whose back (or other side) is convex when viewed from the mirror side.
  • The first main lens or the second main lens may include a body in the form of a cube block, and a mirror surface, a reflective surface, or a beam-splitting surface disposed inclined with respect to the projection direction of the first image passing through the second right lens or the second image passing through the second left lens.
  • The smart glasses may include a camera or an infrared projector mounted on the frame assembly, and a control board or processor mounted in or supported by the frame assembly.
  • The control board or processor recognizes an object such as a palm, the back of a hand, a forearm, a desk, a door, or a wall, searches for an area in which to display a virtual object or a user interface (UI) image based on the structural characteristics of the object, and creates the virtual object or UI image with the searched area as its background.
  • The control board or processor may recognize the tip of another object approaching the virtual object or the UI image, and may further perform a preset operation on the virtual object or UI image corresponding to the tip of that object.
  • The control board or processor may also extract the outline of the hand or extract its joints to recognize a hand gesture, roughly as sketched below.
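The fingertip/UI interaction in the preceding items could look roughly like the following sketch. This is not the patent's algorithm, just a minimal illustration using OpenCV; the skin-tone HSV range, function names, and rectangle-based hit test are all assumptions made for the example.

```python
import cv2
import numpy as np

def find_fingertip(frame_bgr: np.ndarray):
    """Crude fingertip locator: threshold skin tones in HSV, take the
    largest contour as the hand, and return its topmost point."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))  # rough skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    x, y = hand[hand[:, :, 1].argmin()][0]  # topmost contour point
    return int(x), int(y)

def fingertip_hits_ui(fingertip, ui_rect) -> bool:
    """True when the fingertip enters the UI image area, at which point
    the preset operation described above would be triggered."""
    if fingertip is None:
        return False
    x, y = fingertip
    rx, ry, rw, rh = ui_rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh
```

A real implementation would likely use the infrared projector or a learned hand model rather than a fixed color threshold, but the contour-and-hit-test flow is the same.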
  • the frame assembly having a spectacle frame form;
  • a display device supported by the frame assembly and positioned on the central portion of the face when the frame assembly is worn on the user's face;
  • and an optical system that is supported by the frame assembly and transmits the image of the display device in the direction of both of the user's eyes when the frame assembly is worn on the user's face.
  • The optical system includes at least one main lens, positioned in front of the user's eyes, that partially reflects and partially transmits the transmitted image, and the image visible to the user is an image formed as a virtual image spaced a certain distance behind the main lens.
  • An optical system that can be employed in the smart glasses according to another aspect of the present invention, as an optical system of smart glasses worn on the user's face in the form of a spectacle frame to provide an image to the user's eyes, includes a mirror reflecting the image of the display device; an auxiliary lens for enlarging the image reflected by the mirror; a main lens for partially reflecting and partially transmitting the image enlarged by the auxiliary lens; and a frame assembly in the form of a spectacle frame supporting the mirror, the auxiliary lens, and the main lens. When the frame assembly is worn on the user's head, the mirror, the auxiliary lens, and the main lens are positioned over the center of the user's face, the glabella, or the nose.
  • According to the present invention, the user can comfortably use the smart glasses for a long time with content such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). In addition, the glasses can be effectively and easily applied to various fields such as working without a monitor, shopping, using a navigation device, playing AR, VR, or MR games, watching video, or making video calls.
  • In addition, the user's eyelids may be recognized through a rear camera or a sensor, and vibrations or alarms may be generated through an actuator, a driver, or a speaker built into the device according to the recognition result, thereby warning the user of danger, especially a drowsy driver; a minimal sketch follows.
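A minimal sketch of such a drowsiness alarm loop, assuming some per-frame eyes-closed classifier and a vibration callback; the threshold, class, and method names below are invented for illustration and are not from the patent.

```python
import time
from typing import Callable, Optional

class DrowsinessMonitor:
    """Tracks how long the eyes have stayed closed and fires an alarm
    callback (e.g. driving the vibration motor) past a threshold."""

    def __init__(self, alarm_fn: Callable[[], None], closed_limit_s: float = 1.5):
        self.alarm_fn = alarm_fn
        self.closed_limit_s = closed_limit_s
        self.closed_since: Optional[float] = None

    def update(self, eyes_closed: bool, now: Optional[float] = None) -> None:
        now = time.monotonic() if now is None else now
        if not eyes_closed:
            self.closed_since = None       # eyes open: reset the timer
            return
        if self.closed_since is None:
            self.closed_since = now        # eyes just closed: start timing
        elif now - self.closed_since >= self.closed_limit_s:
            self.alarm_fn()                # closed too long: warn the user
            self.closed_since = now        # rearm after firing

# Hypothetical wiring: monitor = DrowsinessMonitor(lambda: driver.vibrate(200))
# and, per rear-camera frame: monitor.update(eyes_closed=classifier(frame))
```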
  • Heat generated when the device is used for a long time can be effectively discharged through a heat dissipation structure or heat sink disposed in the upper front portion of the main frame, providing operational stability and reliability for the smart glasses.
  • Furthermore, when the battery is removed from the smart glasses and power is supplied to them from an external control and power supply device, the weight of the smart glasses can be effectively reduced, increasing wearing comfort and ease of use.
  • An augmented reality service is provided, or a user command is input through a user interface image or a virtual object displayed as augmented reality, and a preset operation is performed accordingly.
  • FIG. 1 is a perspective view of smart glasses according to an embodiment of the present invention.
  • FIG. 2 is an exploded perspective view of the smart glasses of FIG. 1.
  • FIG. 3 is a partially exploded perspective view for explaining the principle of operation of the smart glasses of FIG. 1.
  • FIG. 4 is an enlarged perspective view illustrating a structure of a rear case of the smart glasses of FIG. 3.
  • FIG. 5 is a block diagram illustrating a main configuration of a main control board that can be employed in smart glasses according to another embodiment of the present invention.
  • FIG. 6 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 7 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 8 is a front view of the smart glasses of FIG. 7.
  • FIG. 9 is a partial projection front view for describing an internal structure of the smart glasses of FIG. 8.
  • FIG. 10 is a right side view of the smart glasses of FIG. 7.
  • FIG. 11 is a partially projected right side view showing the internal structure of the smart glasses of FIG. 10.
  • FIG. 12 is a partially exploded perspective view for explaining the components of the smart glasses of FIG. 7.
  • FIGS. 13 and 14 are diagrams for describing an optical system of the smart glasses of FIG. 7.
  • FIG. 15 is a block diagram of a control board that may be employed in the smart glasses of FIG. 7.
  • FIG. 16 is a perspective view of smart glasses and a control system according to another embodiment of the present invention.
  • FIG. 17 is a block diagram illustrating a main configuration of the smart glasses of FIG. 16.
  • FIG. 18 is a block diagram illustrating a control module that may be employed in the smart glasses of FIG. 16.
  • FIGS. 19 and 20 are flowcharts for describing a main operation principle of the smart glasses of FIG. 16.
  • FIG. 21 is a diagram illustrating a user interface that may be employed in smart glasses according to another embodiment of the present invention.
  • FIGS. 22 and 23 are exemplary views for explaining a modification of the user interface that can be employed in the smart glasses according to another embodiment of the present invention.
  • FIG. 24 is a view for explaining the principle of operation of the user interface that can be employed in the smart glasses of this embodiment.
  • FIGS. 25 and 26 are flowcharts illustrating a signal processing process of a user interface employed in smart glasses according to another embodiment of the present invention.
  • FIG. 1 is a perspective view of smart glasses according to an embodiment of the present invention.
  • FIG. 2 is an exploded perspective view of the smart glasses of FIG. 1.
  • FIG. 3 is a partially exploded perspective view for explaining the principle of operation of the smart glasses of FIG. 1, and FIG. 4 is an enlarged perspective view illustrating the structure of a rear case of the smart glasses of FIG. 3.
  • The smart glasses 100 include a frame assembly 10, electrical components, a display system 30, and an optical system 50.
  • the display system 30 may include a case assembly 60 and an optical system 50 in addition to the first display device 31 and the second display device 32.
  • The display devices, the optical system 50, and the case assembly 60 may be arranged to cross one another along three different axes of a Cartesian coordinate system, with the mirror 51 disposed at the center of the cross arrangement.
  • the first display device 31 and the second display device 32 may be disposed at both sides with the mirror 51 interposed therebetween.
  • the frame assembly 10 forms the spectacle shape by a plurality of frames.
  • The frame assembly may include an upper frame 11, a lower frame 12, a body frame 13, a right cover frame 14, and a left cover frame 15.
  • a nose pad 18 may be coupled to the body frame 13.
  • the lower frame 12 accommodates the main control board 21 in the central accommodating portion.
  • the central receiving portion may be covered by the upper frame 11.
  • The lower frame 12 can store, in a first accommodating portion in the right side extension connected to the right side of the central accommodating portion, a first printed circuit board 23 and a first battery pack 24 connected to the first printed circuit board 23.
  • the first accommodating part may be covered by the right cover frame 14.
  • The power button 22 may be installed on the right cover frame 14 with a button cover, and the power button 22 may be connected to the first printed circuit board 23.
  • the first printed circuit board 23 may connect the battery management system in the first battery pack 24 to the control device 210 of the main control board 21.
  • Similarly, the lower frame 12 can store, in a second accommodating portion in the left side extension connected to the left side of the central accommodating portion, a second printed circuit board 25 and a second battery pack 26 connected to the second printed circuit board 25.
  • the second housing part may be covered by the left cover frame 15.
  • the second printed circuit board 25 may connect the battery management system in the second battery pack 26 and the control device 210 of the main control board 21.
  • the body frame 13 is coupled to the bottom of the lower frame 12.
  • The body frame 13 is installed between the lower frame 12 and the case assembly 60 to support their coupling.
  • the body frame 13 is coupled to the lower portion of the lower frame 12 to complement and increase the overall durability of the smart glasses.
  • the electrical component may be mounted on or contained in the frame assembly 10.
  • The electrical components may include the main control board 21, the power button 22, a first control board 23, the first battery pack 24, a second control board 25, and the second battery pack 26.
  • The first control board 23 may correspond to the first printed circuit board 23, and the second control board 25 may correspond to the second printed circuit board 25.
  • the battery pack may be simply referred to as a battery.
  • the first display device 31 and the second display device 32 may be controlled by the control device of the electric component and may be disposed at the center upper side and the center lower side of the frame assembly, respectively.
  • the first display device 31 or the second display device 32 includes a liquid crystal display (LCD).
  • the first display device 31 and the second display device 32 may include backlight units 33 and 34.
  • However, the display device of the present invention is not limited to LCD; liquid crystal on silicon (LCoS), plasma display panel (PDP), field emission display (FED), electro-luminescent display (ELD), or organic light emitting diode (OLED) devices may be used instead.
  • The optical system 50 adjusts the image size while transferring the first image output from the first display device 31 and the second image output from the second display device 32 to both of the user's eyes through the first main lens 54 and the second main lens 57, respectively.
  • The optical system 50 may include a mirror 51, a first right lens 52, a second right lens 53, a first main lens 54, a first left lens 55, a second left lens 56, and a second main lens 57.
  • The mirror 51 reflects, on its first surface, the first image input in the direction from the upper front center of the smart glasses (or the center of the optical system) toward the lower center (the first direction), and transmits it in a second direction orthogonal to the first direction.
  • The mirror 51 also reflects, on a second surface opposite the first surface, the second image input in the direction from the lower center toward the upper center (the reverse of the first direction), and transmits it in the direction opposite to the second direction.
  • the first right lens 52 may have a concave lens shape and may pass a first image transmitted from the mirror 51.
  • the first right lens 52 may primarily pass the first image.
  • the second right lens 53 may have a convex lens shape and may secondly pass the first image passing through the first right lens 52.
  • The first right lens 52 and the second right lens 53 may together form a single lens shape in which light incident on the concave surface at one end exits through the convex surface at the other end.
  • The first main lens 54 may partially reflect and partially transmit the first image passing through the second right lens 53.
  • the first main lens 54 may be supported by the frame assembly or the cover and disposed on one front side of the smart glasses.
  • the first main lens 54 may be arranged in front of the right eye of the user.
  • the first left lens 55 may have a concave lens shape and may pass a second image transmitted from the mirror 51.
  • the first left lens 55 may primarily pass the second image in a direction opposite to the first right lens 52.
  • the second left lens 56 may have a convex lens shape and may secondly pass the second image passing through the first left lens 55.
  • The first left lens 55 and the second left lens 56 may likewise correspond to a single lens structure in which light incident on the concave surface at one end exits through the convex surface at the other end, but are not limited thereto.
  • The second main lens 57 may partially reflect and partially transmit the second image passing through the second left lens 56.
  • the second main lens 57 may be supported by the frame assembly or the cover and disposed at one front side of the smart glasses.
  • the second main lens 57 may be arranged in front of the user's left eye.
  • The two display devices are arranged orthogonally to the second direction (the horizontal direction) in which the components of the optical system 50 are arranged.
  • the two display devices 31 and 32 may be disposed above and below the mirror 51 of the optical system 50.
  • The mirror 51 is also located at the center of the components of the optical system 50 arranged in the second direction, allowing the components of the optical system 50 to transmit image signals in two opposite directions.
  • As shown in FIG. 4, the rear case 62 of the present embodiment is provided with a first mounting groove 620 on which the mirror 51 is seated, a first upper seating groove 621 for mounting the first display device 31, and a first lower seating recess 622 for seating the second display device 32.
  • A second upper mounting groove 631 may further be provided above the first upper seating groove 621 to accommodate the backlight unit 33.
  • a second lower seating recess for accommodating the backlight unit 34 of the second display device 32 may be provided under the first lower seating recess 622.
  • In addition, the rear case 62 may be provided with a second mounting groove 625 for seating the first right lens 52 and the second right lens 53, and a third seating groove 626 for receiving and supporting the first main lens 54. Similarly, the rear case 62 may be provided with a fourth seating groove 627 for mounting the first left lens 55 and the second left lens 56, and a fifth mounting recess 628 for receiving and supporting the second main lens 57.
  • The first main lens 54 and the second main lens 57 are primarily supported by the front case 61 and the rear case 62, and can be firmly held by the combination of the front cover 63 surrounding the outside of the front case 61 and the back cover 64 surrounding the outside of the rear case 62.
  • the rear case 62 may include a fastening part 629 for fastening with the body frame 13.
  • the fastening part 629 may have a screw hole.
  • the front cover 63 may be referred to as a first outer case, in which case the front case 61 may be referred to as a first inner case.
  • The back cover 64 may be referred to as a second outer case, in which case the rear case 62 may be referred to as a second inner case.
  • As described above, the optical system does not provide the first image and the second image from the sides of the user's face; instead it first transmits them from the forehead side and the nose side of the face toward the glabella, performing image size adjustment and focusing through the concave and convex lenses along the way, and then displays the first image and the second image, matched to the user's vision, on the first main lens 54 and the second main lens 57 in front of the eyes.
  • The smart glasses 100 can therefore run three-dimensional programs without causing inconvenience to the user. That is, gesture recognition, 3D rendering, virtual reality image reproduction, augmented reality services, and the like may be provided using smart glasses with a simplified structure. In particular, the problems of existing head-mounted displays with 3D content, such as dizziness and nausea, are resolved, which is convenient for users, and the simple structure of the device makes it easy to reduce manufacturing cost.
  • FIG. 5 is a block diagram illustrating a main configuration of a main control board that can be employed in smart glasses according to another embodiment of the present invention.
  • control device 210 may include a communication unit 211, a processor 212, and a memory 213.
  • The control device 210 may include a controller or a computing device.
  • the processor 212 may be referred to as a controller.
  • the communication unit 211 connects the control device 210 to the network.
  • The communication unit 211 may access various servers on the Internet or another network and may interoperate or cooperate with them.
  • the server may include a server supporting virtual reality, a server supporting augmented reality, and the like.
  • the communication unit 211 may include one or more wired and / or wireless communication subsystems that support one or more communication protocols.
  • Wired communication subsystems may include public switched telephone network (PSTN), asymmetric digital subscriber line (ADSL) or very-high-data-rate digital subscriber line (VDSL) networks, subsystems for PSTN emulation service (PES), Internet Protocol (IP) multimedia subsystems (IMS), and the like.
  • the wireless communication subsystem may include a radio frequency (RF) receiver, an RF transmitter, an RF transceiver, an optical (eg, infrared) receiver, an optical transmitter, an optical transceiver, or a combination thereof.
  • a wireless network basically refers to Wi-Fi, but is not limited thereto.
  • The communication unit 211 may be implemented to support at least one protocol used in various wireless networks, for example Global System for Mobile Communication (GSM), Enhanced Data GSM Environment (EDGE), Code Division Multiple Access (CDMA), Wideband CDMA (W-CDMA), Long Term Evolution (LTE), LTE-Advanced (LTE-A), Orthogonal Frequency Division Multiple Access (OFDMA), WiMAX, Wireless Fidelity (Wi-Fi), Bluetooth, and the like.
  • the processor 212 may provide various services through smart glasses by executing a software module or program stored in the internal memory or the memory 213.
  • Processor 212 may be referred to as a microprocessor.
  • the processor 212 may be implemented as a processor or microprocessor including at least one central processing unit (CPU) or core.
  • The central processing unit or core may be provided with registers that store the instructions to be processed, an arithmetic logic unit (ALU) responsible for comparison, decision, and arithmetic operations, a control unit that interprets and executes the instructions and controls the CPU internally, and an internal bus connecting them.
  • The CPU or core may be implemented as a system on chip (SoC) in which a micro control unit (MCU) and peripheral devices (integrated circuits for external expansion) are arranged together, but is not limited thereto.
  • the processor 212 may include, but is not limited to, one or more data processors, an image processor, or a codec.
  • In addition, the processor 212 may include a peripheral device interface and a memory interface.
  • The peripheral device interface may connect the processor 212 to an input/output system such as an output device or another peripheral device, and the memory interface may connect the processor 212 to the memory 213.
  • The processor 212 controls the first battery pack 24 and the second battery pack 26 so as to secure a relatively long operating time per battery charge even under operating conditions in which image display consumes a relatively large amount of power.
  • To that end, the processor 212 may be connected to and communicate with the battery management system (BMS) 240 of each battery pack; one possible control policy is sketched below.
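Purely as an illustration of dual-pack control (the patent does not specify the policy), here is a sketch that always discharges the fuller pack so both drain evenly; the BatteryStatus fields and the read_status/enable_discharge BMS calls are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class BatteryStatus:
    charge_pct: float  # state of charge reported by a pack's BMS
    temp_c: float

class DualBatteryPolicy:
    """Selects which of the two battery packs to discharge so that both
    drain evenly, extending the per-charge operating time."""

    def __init__(self, bms_right, bms_left):
        self.bms = [bms_right, bms_left]

    def select_source(self) -> int:
        """Index of the pack with the higher state of charge."""
        status = [b.read_status() for b in self.bms]       # hypothetical call
        return max(range(len(self.bms)), key=lambda i: status[i].charge_pct)

    def apply(self) -> None:
        active = self.select_source()
        for i, b in enumerate(self.bms):
            b.enable_discharge(i == active)                # hypothetical call
```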
  • the processor 212 may be connected to a control device or a timing controller of at least one or both display devices of the display system 30, thereby controlling the operation of the display device.
  • the memory 213 may store a software module for providing services such as object recognition, gesture recognition, virtual reality, and augmented reality through smart glasses.
  • The software modules may include a first module 215 for object recognition; a second module 216 for providing 3D images or 3D control; a third module 217 for providing a virtual reality display image service; and a fourth module 218 for providing an augmented reality display image service that adds virtual objects to the image of the real world currently being viewed.
  • Each module may be implemented as at least one task executed by the processor 212 to perform certain functions or operations.
  • The above-described memory 213 may be implemented as semiconductor memory such as non-volatile random access memory (NVRAM) or dynamic random access memory (DRAM), a typical volatile memory, or as a hard disk drive (HDD), an optical storage device, flash memory, or the like.
  • The memory 213 may store an operating system, programs, instruction sets, and the like in addition to the software modules for implementing the services described above.
  • Components of the smart glasses device may be implemented as functional blocks or modules mounted in various computing devices based on non-volatile memory (NVRAM).
  • The software modules stored in the memory of FIG. 5 may be stored on a computer-readable medium (recording medium) as software implementing the series of functions they perform, or stored in a storage device of a remote server device in the form of a carrier and implemented to operate on a particular computing device connected to the server device via a network.
  • The computer-readable medium may include memories or storage devices of a plurality of computer devices or cloud systems connected through a network, and at least one of the plurality of computer devices or cloud systems may store the program or source code for implementing the various services of the smart glasses of the present embodiment.
  • the computer readable medium may be embodied in the form of a single or combination of program instructions, data files, data structures, and the like.
  • the programs recorded on the computer readable medium may be those specially designed and configured for the present invention, or may include those known and available to those skilled in computer software.
  • the computer readable medium may include a hardware device specifically configured to store and execute program instructions, such as a ROM, a RAM, a flash memory, and the like.
  • the program instructions may include high-level language code that can be executed by a computer using an interpreter as well as machine code such as produced by a compiler.
  • the hardware device may be configured to operate with at least one software module for the implementation of all services that can be provided through the smart glasses of the present embodiment, and vice versa.
  • For convenience of description, the control device 210 of the present embodiment may be implemented as a controller including a processor, a memory, and a communication unit on a single substrate or in a single housing, but the present invention is not limited thereto and may be implemented in other forms.
  • FIG. 6 is a perspective view of smart glasses according to another embodiment of the present invention.
  • The smart glasses 100a include a frame assembly 10 in the form of a spectacle frame, and a case assembly 60 that integrally supports a display for outputting an image at the front center of the user's face when the smart glasses are worn, together with an optical system that reflects, enlarges, partially reflects, and partially transmits the image of the display.
  • In the present embodiment, unlike the embodiment described above, the first main lens (see 54 of FIG. 2) and the second main lens (see 57 of FIG. 2), indicated together by reference numeral 50a, are made of plate-shaped reflective and transmissive members.
  • The first main lens and the second main lens are each disposed inclined at a predetermined angle with respect to the central axis.
  • the first main lens and the second main lens may be beam splitters in the form of plate members having a thickness of several millimeters or less.
  • The first main lens and the second main lens partially reflect and partially transmit the image so that the user wearing the smart glasses sees an image formed as a virtual image, allowing the user to see a relatively large image through the smart glasses.
  • FIG. 7 is a perspective view of smart glasses according to another embodiment of the present invention.
  • FIG. 8 is a front view of the smart glasses of FIG. 7.
  • FIG. 9 is a partial projection front view for describing an internal structure of the smart glasses of FIG. 8.
  • FIG. 10 is a right side view of the smart glasses of FIG. 7.
  • FIG. 11 is a partially projected right side view showing the internal structure of the smart glasses of FIG. 10.
  • FIG. 12 is a partially exploded perspective view for explaining the components of the smart glasses of FIG. 7.
  • The smart glasses 100 may include a main frame 110, a first display 131, a second display 132, a first mirror 133, a second mirror 134, a first main lens 137, and a second main lens 138.
  • the smart glasses 100 may further include a first auxiliary lens 135 and a second auxiliary lens 136.
  • In addition, the smart glasses 100 may include a controller including a processor (see 210 of FIG. 15), a display interface 160 for the first and second displays 131 and 132, and a flexible printed circuit board (FPCB) 164 connecting them.
  • The controller may be referred to as a control and power supply device, and may be embedded or provided as separate communication and power boards.
  • the communications and power boards may include connectors or means or components for relaying or interconnecting communications or power sources.
  • the smart glasses 100 may include one or more front cameras 151 and one or more rear cameras 153.
  • When the front camera 151 includes a first front camera and a second front camera spaced apart from each other by a predetermined distance, the input or application of 3D content is facilitated.
  • the front or rear camera may be a kind of sensor or function as a sensor.
  • the rear camera 153 or the sensor may be a means for detecting a user's eyelid movement or a device that performs a function corresponding to the means.
  • The smart glasses 100 may further include an actuator or driver 194 that vibrates according to a signal from a camera, a sensor, or the control and power supply device, or a speaker that outputs an acoustic signal in response to such a signal.
  • Driver 194 may include a vibration motor.
  • The smart glasses 100 may further include a heat dissipation structure or a heat sink for dissipating heat generated by the first display 131, the second display 132, the display interface 160, and the like.
  • At least one of the first display 131 and the second display 132 may be referred to as a display unit.
  • Here, 'display' refers briefly to a display device; when the display is a liquid crystal display, it may include a backlight unit.
  • The main frame 110 is formed of a rigid material in an approximately U-shape and has the form of a spectacle frame.
  • The main frame 110 may be formed as a single structure, or may have a center frame portion 110c to both ends of which a left frame portion 110a and a right frame portion 110b are coupled to form the frame shape.
  • Here, left and right may be reversed from the user's point of view.
  • For ease of manufacture and light weight, the main frame 110 may have a coupling structure of a support frame disposed in the middle, a first side frame coupled to the left side of the support frame, and a second side frame coupled to its right side.
  • The support frame may be formed of a detachable first support frame 120a and second support frame 120b to facilitate the support, arrangement, and assembly of the display, the mirror, the auxiliary lens, and the main lens.
  • the support frame 120 may refer to a form in which the first support frame 120a and the second support frame 120b are combined.
  • the display interface 160 may be inserted into a central portion of the support frame 120 and may be electrically connected to the controller through a terminal of a flexible printed circuit board (FPCB) 164 disposed on the upper side of the center.
  • the display interface 160 is connected to the first display 131 and the second display 132.
  • The FPCB 164 may extend from the terminal at the center front portion along the upper surface of the main frame 110 or the support frame 120 to the first side frame 110a and the second side frame 110b.
  • the first display 131 and the second display 132 are disposed on both sides of the box-shaped central portion of the support frame 120 to output the first image and the second image in opposite directions.
  • The first display 131 or the second display 132 may include at least one selected from a thin film transistor liquid crystal display (TFT LCD), an organic light emitting diode (OLED) display, liquid crystal on silicon (LCoS), and a digital light processing (DLP) based display.
  • The first mirror 133, supported by the left inclined support structure of the support frame 120, is installed in front of the first display 131 and may reflect the first image of the first display 131, traveling in the left direction, into a first downward direction substantially orthogonal to it.
  • Likewise, a second mirror 134, supported by the right inclined support structure of the support frame 120, is installed in front of the second display 132 and can reflect the second image of the second display 132, traveling in the right direction, into a second downward direction substantially orthogonal to it.
  • The first downward direction and the second downward direction may be parallel to each other, separated by a distance approximately equal to the distance between the user's two eyes.
  • the first image reflected by the first mirror 133 is enlarged by the first auxiliary lens 135.
  • the first auxiliary lens 135 may be inserted into and supported in the support structure having the uneven structure of the support frame 120.
  • the first auxiliary lens 135 enlarges and transmits the first image to the first main lens 137.
  • the second image reflected by the second mirror 134 is magnified by the second auxiliary lens 136.
  • the second auxiliary lens 136 may be inserted into and supported in the support structure having the uneven structure of the support frame 120.
  • the second auxiliary lens 136 enlarges the second image and transfers the second image to the second main lens 138.
  • Viewed from the image input end, the first auxiliary lens 135 or the second auxiliary lens 136 may have a stacked or superposed arrangement of a one-side convex lens (concave on the other side), a double-sided convex lens, and another one-side convex lens.
  • the first and second auxiliary lenses 135 and 136 described above are used to enlarge the first image and the second image to a desired size and to remove spherical aberration.
  • the first main lens 137 is disposed under the first auxiliary lens 135.
  • the first main lens 137 may correspond to the first main lens or the second main lens indicated by reference numeral 50 in FIG. 1 or indicated by reference numeral 50a in FIG. 6.
  • The first main lens 137 may be a relatively thin plate-like lens member arranged inclined with respect to the central axis of the first auxiliary lens 135.
  • the first main lens 137 may be coupled to a lower portion of the support frame 120 or the center frame portion 110c (hereinafter, simply a center frame) on the lower side of the first auxiliary lens 135.
  • As a beam splitter, the first main lens 137 may reflect the first image passing through the first auxiliary lens 135 at approximately a right angle toward the user's eye.
  • The first image descending from the first mirror 133 in a substantially vertical direction forms, with respect to the reflective surface of the first main lens 137, an image directed toward the user's eye and a virtual image in front of the user's eye.
  • The distance from the reflective surface to the position of the formed image or the image plane may be about 20 meters.
  • the second main lens 138 is disposed under the second auxiliary lens 136.
  • The second main lens 138 may likewise be a relatively thin plate-like lens member arranged inclined with respect to the central axis of the second auxiliary lens 136.
  • the second main lens 138 may be coupled to the lower portion of the support frame 120 or the center frame 110c at the lower side of the second auxiliary lens 136.
  • As a beam splitter, the second main lens 138 may reflect the second image passing through the second auxiliary lens 136 at approximately a right angle toward the user's eye.
  • The second image descending from the second mirror 134 in a substantially vertical direction forms, with respect to the reflective surface of the second main lens 138, an image directed toward the user's eye and a virtual image in front of the user's eye.
  • The convex surfaces of the auxiliary lenses 135 and 136 described above may be implemented with three aspherical lenses in order to enlarge the image or content and eliminate spherical aberration.
  • the nose pad member 140 may be connected to the central lower side of the support frame 120.
  • the nose pad member 140 may be made of at least a portion of a relatively soft material.
  • the soft material may include synthetic resin.
  • An upper cover 113 may be coupled to a central upper side of the main frame 110, an upper front cover 114 may be coupled to an upper front side of the center, and a front auxiliary cover 115 may be coupled to an uppermost center of the center front.
  • The upper front cover 114 may cover the first and second support frames 120a and 120b and may be coupled to the center frame 120c together with the rear cover 112c by fastening means such as screws or bolts.
  • The front auxiliary cover 115 may cover the exposed lens portion of the front camera 151, in which case at least a portion of the front auxiliary cover 115 may be formed of a transparent or translucent material; alternatively, it may include an opening or a through hole exposing the lens of the front camera 151.
  • One end of the first side frame 110a may be connected to the left side of the support frame 120, and the other end may be connected to the first side connection frame 121.
  • the first side connection frame 121 may be connected to a first side end frame 122.
  • The first side end frame 122 has an inner space accommodating the first battery 172, and the opening of that space, facing the second side end frame 125 described later, may be detachably closed by the first flexible cover 123.
  • At least a portion of the surface of the first side connection frame 121 and the first flexible cover 123 that contacts the user's ear and the area behind it may be formed of rubber or a soft synthetic resin material to improve wearing comfort.
  • one end of the second side frame 110b may be connected to the right side of the support frame 120, and the other end thereof may be connected to the second side connection frame 124.
  • the second side connection frame 124 may be connected to a second side end frame 125.
  • The second side end frame 125 has an inner space accommodating the second battery 171, and the opening of that space, facing the first side end frame 122 or the first flexible cover 123 described above, may be detachably closed by the second flexible cover 126.
  • Likewise, at least a portion of the surface of the second side connection frame 124 and the second flexible cover 126 that contacts the user's left ear and the area behind it may be formed of rubber or a soft synthetic resin material to improve wearing comfort.
  • The first battery 172 and the second battery 171 may be referred to as a power supply device; depending on the implementation, only one battery may be mounted, or no battery may be mounted.
  • the first side cover 112a may be installed on an outer surface of the first side frame 110a to accommodate the first printed circuit board 161.
  • the first printed circuit board 161 may be connected to the first battery 172 and connected to one end of the FPCB 164.
  • a second side cover 112b may be coupled to an outer surface of the second side frame 110b to accommodate a second PCB 162.
  • the second printed circuit board 162 may be connected to the second battery 171 and may be connected to the other end of the FPCB 164.
  • The front sides of the first side frame 110a and the second side frame 110b may be provided with projections or uneven portions for securing coupling force and positioning when a shield is coupled (see 180 of FIG. 16).
  • A magnet, or a metal material to which a magnet attaches, may be provided near that surface, facilitating coupling with the shield and maintaining a stable coupled state.
  • the first printed circuit board 161 may include at least one port or a first connector for transmitting and receiving data or receiving power from the outside.
  • the second printed circuit board 162 may include at least one port or a second connector for transmitting and receiving data or receiving power from the outside.
  • the first or second printed circuit boards 161 and 162 may include earphone terminals.
  • The front camera 151 may include a lens exposed on the front surface of the main frame 110 or the center frame 120c. Two front cameras 151 may be installed, but the number is not limited thereto. The lens of the front camera 151, or its peripheral portion, may be protected by the front auxiliary cover 115. The front camera 151 may be electrically connected to another end or an extension of the FPCB 164.
  • the rear camera 153 may include a lens exposed at the rear of the main frame 110 or the support frame 120.
  • One rear camera 153 may be installed, but is not limited thereto.
  • the rear camera 153 may be electrically connected to another end or extension of the FPCB 164.
  • the processor (see 212 of FIG. 5) may be disposed on the first or second printed circuit boards 161 and 162.
  • the processor may control the components of the smart glasses 100 to operate and manage the operation or function of the smart glasses 100.
  • the processor may include an application processor (AP).
  • the processor may include a display interface.
  • the display interface may control timing of signals transmitted to the first display 131 and the second display 132 under the control of the processor.
  • the signal may include an image signal.
  • The display interface may be a dual & 3D Mobile Industry Processor Interface display serial interface (Dual & 3D MIPI DSI; hereinafter simply DSI) suitable for driving the first and second displays 131 and 132 and for 3D content (see 210a of FIG. 17), or a similar means or device.
  • the driver 194 may be built in the first or second side frames 110a and 110b.
  • The driver 194 may be electrically connected to the first or second printed circuit boards 161 and 162 and may operate in response to signals from the front camera 151, the rear camera 153, a sensor, or the processor to generate vibration.
  • For example, the driver may generate a vibration as a drowsiness-prevention alarm according to the signal of the rear camera 153.
  • a heat dissipation structure may be installed at an upper portion of the support frame 120 or a lower portion or a portion of the upper cover 113.
  • One end of the heat dissipation structure may be connected to the display interface 160, the first display 131, the second display 132, the LCD backlight, and the like to emit heat generated in at least one of the above components.
  • The heat dissipation structure may extend along the lengthwise outline of the upper end of the center frame 120c, and a plurality of fins may be provided on at least one surface thereof to increase the heat dissipation area.
  • The heat dissipation structure may be connected to a highly conductive material installed at a predetermined width along the length direction of the upper cover 113, or may be formed integrally with that material.
  • FIGS. 13 and 14 are diagrams for describing an optical system that may be employed in the smart glasses of FIG. 7.
  • Hereinafter, the optical system including the first mirror 133, the first auxiliary lenses 135a, 135b, and 1372, and the first main lens 137 will be described; the same description applies to the second mirror 134 and the second auxiliary lens.
  • The smart glasses according to the present embodiment may include a first (right) optical system including the first mirror 133, the first auxiliary lenses 135a, 135b, and 1372, and the first main lens 137, and a second (left) optical system (see 134, 136, and 138 of FIG. 9) including the second mirror, the second auxiliary lens, and the second main lens.
  • The first main lens and the second main lens are formed of plate-shaped reflective and transmissive members, each disposed inclined at a predetermined angle with respect to its central axis.
  • the first main lens and the second main lens may be beam splitters in the form of plate members having a thickness of several millimeters or less.
  • the smart glasses of the present embodiment may include at least one display device, preferably two display devices, and may include one mirror or two mirrors 133 and 134, depending on the implementation.
  • The display devices may be arranged to output the initial image in the front direction or toward the face of the user wearing the smart glasses, to output it vertically from the upper side of the face toward the lower side (or vice versa), or to output it toward the region between the two eyes (the glabella) or the nose. The distance between both eyes corresponds approximately to the distance between the centers of the first and second main lenses.
  • The first main lens 137 faces the first auxiliary lenses 135a, 135b, and 1372 and is disposed inclined at about 30 to 60 degrees with respect to the center line of the optical path of the first auxiliary lens.
  • The first auxiliary lens may include a first one-side convex lens 135a, a first double-sided convex lens 135b, and a first convex-surface lens 1372.
  • One surface of the first one-side convex lens 135a has a convex surface, and the other surface has a concave surface.
  • the curvature of the concave surface of the first one-sided convex lens 135a may be set to be substantially similar to or the same as the curvature of the opposite convex surface of the first two-sided convex lens 135b.
  • the first biconvex lens 135b has a convex surface on both one surface and the other surface thereof.
  • the radius of curvature of the convex surface of one surface and the convex surface of the other surface may be the same, but is not limited thereto.
  • The first convex-surface lens 1372 may have a convex surface facing the convex surface on the other side of the first biconvex lens 135b, while the opposite surface may be planar.
  • the first and second images output from the first and second displays of the smart glasses according to the present embodiment are reflected by the first mirror 133 and the second mirror, respectively, and then the first auxiliary image. After being enlarged by the combination of the convex surface of each of the lens and the second auxiliary lens, it is partially reflected and partially transmitted from the reflective surface of the first main lens 137 and the reflective surface of the second main lens.
  • the reflective surface may be referred to as the beam split surface.
  • the first image and the second image are enlarged by the combination of the convex surfaces of the first auxiliary lenses and the second auxiliary lenses, and spherical aberration can be eliminated by their aspherical convex surfaces.
  • the first image and the second image, focused at a predetermined position in front of the first and second main lenses, may form a predetermined single image plane or virtual image plane.
  • in this way, a stereoscopic (3D) image, a virtual reality image, an augmented reality image, and the like may be provided on a multilayer image plane or virtual image plane.
  • each of the main lenses is disposed in the empty space between the side frames (see 112b of FIG. 7).
  • the content output from the screen A1 of the display may be enlarged using the auxiliary lens.
  • the distance from the auxiliary lens to the enlarged image can be adjusted as necessary. That is, the size and resolution of the image enlarged by the auxiliary lens may be easily adjusted so that it reaches the user's eye C1 as a virtual image B1 located beyond the smart glasses, rather than as a real image.
  • the distance li to the virtual image perceived by the user through the main lens may be adjusted through the relationship between the focal length f of the main lens and the distance lo from the display to the reflective surface of the main lens.
  • the relationship between these distances satisfies Equation 1.
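  • As a reconstruction (Equation 1 is referenced but not reproduced in this rendering), the relationship is presumably the standard Gaussian thin-lens relation, under the usual sign convention in which a virtual image has a negative image distance:

$$\frac{1}{l_o} + \frac{1}{l_i} = \frac{1}{f} \tag{1}$$

  • Under this assumption, placing the display inside the focal length (lo < f) gives li = f·lo/(lo − f) < 0, i.e. an enlarged virtual image formed beyond the main lens, consistent with the virtual image B1 described above.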
  • the optics can thus be designed and manufactured so that the enlarged image is formed at a position desired by the user. For example, the system can be configured to present roughly 20 inches of content at about 2 to 2.4 meters from the smart glasses, so that the user can easily implement and use augmented reality or virtual reality as needed.
  • FIG. 15 is a block diagram of a control board that may be employed in the smart glasses of FIG. 7.
  • the smart glasses 100 may include a controller 210 connected to the first and second displays (D1, D2) 131 and 132.
  • the smart glasses 100 may include a power supply unit 170 and a communication unit 190.
  • the power supply unit 170 may be referred to as a power supply device or a power supply device, and may include a battery or a power supply terminal.
  • the communication unit 190 may include a subsystem for wired or wireless communication. At least a portion of the power supply unit 170 and the communication unit 190 may be mounted on a printed circuit board. In that case, such a printed circuit board may be referred to as a communication and power board.
  • the controller 210 may include a processor and a memory 220.
  • the controller 210 may control the operation of the power supply unit 170 and the communication unit 190 by a program stored in the memory 220 and control the operation of the smart glasses 100.
  • the controller 210 and the power supply unit 170 are integrally accommodated in the smart glasses 100, but the configuration is not limited thereto.
  • the display interface is described as a configuration separate from the controller 210, but the present invention is not limited thereto, and the display interface may be implemented integrally with the controller.
  • the smart glasses 100 may include one small or auxiliary battery as an auxiliary power supply, while a separate main power supply including a large-capacity or main battery is disposed outside the smart glasses and connected through data and power lines (see 192 of FIG. 26).
  • FIG. 16 is a perspective view of a smart glasses and a control system according to another embodiment of the present invention.
  • FIG. 17 is a block diagram illustrating a main configuration of the smart glasses of FIG. 16.
  • the smart glasses 100 may be implemented to include a display interface and to arrange a control device connected to the display interface outside the smart glasses 100.
  • the smart glasses 100 are implemented to place substantially all of the battery outside the smart glasses 100.
  • the display interface may be a dual or 3D MIPI DSI (Dual & 3D Mobile Industry Processor Interface Display Serial Interface) 210a (hereinafter simply referred to as DSI) or a means or device performing a similar function.
  • This display interface is suitable for implementing 3D content with the first and second displays 131, 132.
  • the smart glasses 100 include a first display 131, a second display 132, a display interface 160, a camera 150, and a communication and power board 164.
  • the camera 150 may include a front or rear camera, and the communication and power board 164 may be installed at the installation position of the first or second printed circuit board, while an external main power supply (MPS) 240 may be connected to an application processor (AP) 260.
  • the external device 300 may control and monitor the operation of the smart glasses 100 from the outside as a control and power supply device.
  • the first battery 172 and the second battery 171, together with most of the controller, the first printed circuit board 161, and the second printed circuit board 162, may be omitted. That is, a communication and power board 164 having a relatively simple structure is installed at the position of the second printed circuit board 162 to relay the communication and power connection between the smart glasses 100 and the external device 300, and most of the remaining electronic components can be omitted.
  • the weight of the smart glasses 100 can thus be considerably reduced, which increases ease of use.
  • FIG. 18 is a block diagram illustrating a control module that may be employed in the smart glasses of FIG. 16.
  • the control apparatus of the smart glasses may be referred to as an application processor and may include the video interface 261, the tracking module 262, the rendering module 263, and the measurement module 264.
  • the tracking module 262, the rendering module 263, and the measurement module 264 may be connected to a user interface U / I 265.
  • the video interface 261 may receive a video image stream input from the video cameras 151 and 152 and transfer an image of a desired section to the tracking module 262.
  • the tracking module 262 may receive image data from the video interface 261 and transmit the measured image and the pose estimation result to the rendering module. In this case, the tracking module 262 may receive adjustment information including adjustment parameters and the like through the user interface 265, and recognize the marker using the marker information stored in the marker information database 266.
  • the marker serves as a medium between the real image and the virtual object.
  • the marker information may include information about the size, pattern, etc. of the marker.
  • the marker information database 266 may be replaced with marker information stored in a predetermined storage unit.
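  • As one hedged illustration of this tracking step (a sketch, not the disclosed implementation), OpenCV's ArUco module can stand in for the marker information database: it detects a known marker pattern in a camera frame and estimates the camera-relative pose that the rendering module would consume. The intrinsics K, the distortion vector, and the marker side length below are assumed calibration values, and the classic functional cv2.aruco API is used (available in OpenCV contrib builds; newer releases wrap it in an ArucoDetector class).

```python
import cv2
import numpy as np

# Assumed camera intrinsics; in practice these come from calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
DIST = np.zeros(5)        # assume negligible lens distortion
MARKER_LENGTH = 0.05      # marker side length in meters (assumption)

DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def track_marker(frame):
    """Detect one marker and return its pose (rvec, tvec), or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return None
    poses = cv2.aruco.estimatePoseSingleMarkers(corners, MARKER_LENGTH, K, DIST)
    rvec, tvec = poses[0][0], poses[1][0]   # pose of the first detected marker
    return rvec, tvec
```

  • The returned rotation and translation correspond to the kind of pose estimation result that the tracking module 262 would hand to the rendering module 263.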
  • the rendering module 263 is responsible for creating and removing virtual objects.
  • the rendering module 263 may obtain adjustment information or the like through the user interface 265.
  • the adjustment information may include load or unload, rotation, movement (coordinate movement, etc.), scaling, and the like.
  • the rendering module 263 may interwork with an engine that supports three-dimensional (3D) modeling (in brief, the 3D modeling engine 268) or a means for performing such a function through the content resource database 267.
  • the 3D modeling engine 268 may generate a virtual object of the measured image based on a virtual reality modeling language (VRML).
  • the rendering module 263 may receive the virtual object from the content resource database 267 and render the measured image.
  • the measurement module 264 may measure and process the distance between the virtual objects, the distance and the direction between the generated coordinate systems, and the interference between the virtual objects.
  • the measurement module 264 may receive the augmented image from the rendering module 263 and provide the augmented reality image to the display device according to the object information input through the user interface 265.
  • the object information may include information about positive or negative matching, three-dimensional point, collision, and the like.
  • FIGS. 19 and 20 are flowcharts for describing the main operating principles of the smart glasses of FIG. 16.
  • the control apparatus of the smart glasses may acquire a stereoscopic image through a camera (S201).
  • the controller may generate a depth map image based on the stereoscopic image.
  • the control device may detect or extract a hand image (S203).
  • the controller may extract the hand image from the depth map image.
  • the control device may extract a command corresponding to the series of hand images from the storage unit or the database (S205).
  • the controller may recognize, and execute a preset command corresponding to, a vector component indicated by the series of hand images extracted over a predetermined time, an image region corresponding to the start and end points of the vector component, or an object located in that image region.
  • the object may take various forms, such as menus and icons, corresponding to preset commands.
  • the control device of the smart glasses may detect the hand motion through image processing and control the image seen by the smart glasses based on the same.
  • the image control may include forward or reverse image forwarding, fast forwarding the image, rewinding the image, and performing a preset command by recognizing a touch on a specific region of the image.
  • for hand gesture recognition, the controller recognizes a hand from an input image frame, detects the hand position in the image, or recognizes or predicts the direction of hand movement, and accordingly places content on the screen or moves the current image.
  • the current image may be overlapped with another image.
  • Prediction of hand movements may include recognizing a hand, a finger, or a combination thereof.
  • Such gesture recognition may be performed based on image processing and preset gesture information.
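  • A minimal sketch of this pipeline (S201 to S205), under assumed thresholds and an assumed command table rather than the disclosed implementation: the hand is segmented from the depth map, its centroid is tracked over a short window, and the net motion vector is matched against preset gestures.

```python
import numpy as np

# Hypothetical gesture-to-command table (assumed names).
COMMANDS = {"swipe_left": "previous_image", "swipe_right": "next_image"}
NEAR_MM, FAR_MM = 300, 700     # assumed depth band where the hand appears
MIN_PIXELS = 500               # assumed minimum blob size for a hand
SWIPE_PX = 80                  # assumed horizontal-motion threshold

def hand_centroid(depth_mm):
    """S203: segment pixels in the hand depth band and return the centroid."""
    mask = (depth_mm > NEAR_MM) & (depth_mm < FAR_MM)
    if mask.sum() < MIN_PIXELS:
        return None
    ys, xs = np.nonzero(mask)
    return np.array([xs.mean(), ys.mean()])

def command_for(depth_frames):
    """S205: map a series of hand images to a preset command, if any."""
    track = [c for c in (hand_centroid(d) for d in depth_frames)
             if c is not None]
    if len(track) < 2:
        return None
    v = track[-1] - track[0]               # net motion vector of the hand
    if abs(v[0]) > SWIPE_PX:
        gesture = "swipe_right" if v[0] > 0 else "swipe_left"
        return COMMANDS[gesture]
    return None
```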
  • a user interface using hand gestures can thus be provided not only in augmented reality but also in virtual reality, mixed reality, and the like, whereby the various contents of the smart glasses can be used effectively and user convenience can be greatly improved.
  • for example, a user may work at a desk without a monitor, wear the smart glasses when shopping and perform payment through augmented reality, use a navigation function through augmented reality, play augmented reality games, watch a virtual reality video, or make a video call in augmented reality or virtual reality; an operator may wear the smart glasses and work in the field with augmented reality; or the user may simply operate the functions of a smartphone on the screen of the smart glasses.
  • the control apparatus of the smart glasses acquires a stereoscopic image through a camera (S201) and detects or extracts a hand image based on it (S203). Screen mirroring of a mobile computing device, such as a mobile terminal or a notebook, may then be performed in response to the series of hand images (S206).
  • screen mirroring refers to a function that allows the screen of a mobile computing device to be viewed wirelessly on the screen of the smart glasses without a separate cable connection.
  • when the mobile computing device encodes its screen information and transmits it on a radio frequency, the control device or control board of the smart glasses receives and decodes the screen information and outputs it through the display. The wireless link includes, but is not limited to, Bluetooth; other short-range wireless communication schemes may be used.
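  • As a hedged sketch only (no codec or transport is specified in this description), a receiver of this kind could accept length-prefixed JPEG frames over a socket and decode them for the glasses' display; the port number, the 4-byte framing, and the JPEG codec are all assumptions.

```python
import socket
import struct
import numpy as np
import cv2

PORT = 5600  # assumed port used by the mirroring sender

def recv_exact(conn, n):
    """Read exactly n bytes, or return None if the peer closed."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf

def receive_frames():
    """Yield decoded screen frames sent by the mobile computing device."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    try:
        while True:
            header = recv_exact(conn, 4)        # big-endian length prefix
            if header is None:
                break
            (size,) = struct.unpack(">I", header)
            payload = recv_exact(conn, size)
            if payload is None:
                break
            frame = cv2.imdecode(np.frombuffer(payload, np.uint8),
                                 cv2.IMREAD_COLOR)
            if frame is not None:
                yield frame                     # hand off to the display path
    finally:
        conn.close()
        srv.close()
```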
  • FIG. 21 is a diagram illustrating a user interface that may be employed in smart glasses according to another embodiment of the present invention.
  • FIGS. 22 and 23 are exemplary views for explaining a modification of the user interface that can be employed in the smart glasses according to another embodiment of the present invention.
  • FIG. 24 is a view for explaining the operating principle of the user interface that can be employed in the smart glasses of this embodiment.
  • FIGS. 25 and 26 are flowcharts illustrating a signal processing process of a user interface employed in smart glasses according to another embodiment of the present invention.
  • the smart glasses according to the present embodiment may implement a user interface by floating a virtual touch panel on a surface such as a palm, the back of a hand, a forearm, a desk, or a door, and performing an interaction function.
  • the smart glasses may be equipped with a sensor or a camera (see 150 in FIG. 17 and 151 and 152 in FIG. 18) for recognizing a corresponding body part or an object.
  • the camera may be equipped with two infrared sensors (cameras) for stereoscopic analysis; when an infrared projector projects infrared light into the space, the depth of an object can be sensed by reading the infrared light reflected back from the space. Using an infrared projector makes it easy to determine the depth or distance of the space.
  • infrared projectors come in two device types: a so-called structured-light type, which projects infrared light into a specific space in a known pattern, and a so-called time-of-flight type, which determines distance from how the reflected infrared signal varies with the distance to objects.
  • alternatively, the present invention may implement a depth sensor using two cameras, without infrared projection.
  • a shape of a space and a coordinate of an object may be obtained.
  • this step can be omitted if implemented on a body surface.
  • the depth sensor can be used to find the best place to locate the virtual object.
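  • A minimal two-camera depth sketch, assuming a rectified stereo pair and assumed calibration values (focal length in pixels and baseline in meters), illustrating the projector-free stereo option mentioned above:

```python
import numpy as np
import cv2

F_PX = 700.0        # assumed focal length in pixels (from calibration)
BASELINE_M = 0.06   # assumed distance between the two cameras

stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=96, blockSize=7)

def depth_map(left_gray, right_gray):
    """Metric depth from a rectified stereo pair: depth = f * B / disparity."""
    disp = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan          # mask invalid matches
    return F_PX * BASELINE_M / disp   # depth in meters
```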
  • at least some components of the user interface control device applied to the smart glasses may be implemented to recognize a corresponding body part such as a hand, the back of a hand, or a forearm.
  • it may also be implemented to recognize an object in order to place a virtual object on a surface such as a refrigerator door or a room door.
  • the virtual object may be created to have attributes of size, direction, curved surface, and perspective.
  • when the virtual object is positioned, it can be drawn naturally, reflecting its attributes along the image surface of the user interface. Objects can be drawn to have the appropriate size, direction, and perspective along the same plane or curvature.
  • the finger of the other hand approaching to touch the object or the end of the corresponding pointer may be recognized.
  • a specific finger may be designated and recognized according to an implementation (see FIG. 34). In this case, an operation error such as an overlapping touch can be prevented.
  • the virtual contact may be determined by recognizing whether the specific region corresponding to the virtual object and the designated fingertip are substantially at the same position.
  • the processor may perform a preset command according to the gesture of the movement.
  • a processor connected to a sensor or camera can detect a body part or object, recognize its surface, size, floor, and the like, and measure the distance between feature points such as edges or points using functions such as motion tracking.
  • the processor may be implemented to generate a point cloud for texturing, meshing, or the like, or to retrieve the color of the point cloud with reference to the color image data.
  • sensors, cameras, processors, or a combination thereof may be designed to work best indoors at distances on the order of 0.5 to 4 m. This configuration can contribute to good performance while balancing the power required for infrared illumination and depth processing. However, scanning performance may deteriorate for objects that do not reflect infrared light well.
  • the additional information may be augmented so that the virtual objects are shown to the user as if they are naturally mixed with the real objects.
  • the size of the virtual object may be adjusted according to its position and to the position or size of the three-dimensional surface, and the color or brightness of the lighting may likewise be adjusted.
  • the above conditions can be adjusted to provide the user with a more realistic feeling.
  • a processor coupled to the sensor or camera may be implemented to handle stereopsis.
  • the fusion region of stereoscopic vision (Panum's fusional area) may serve as a user interface region for the binocular stereopsis used by the human eye.
  • that is, the user interface area may be an area of single vision.
  • a stereoscopic vision processing system such as software running on a processor, may determine the user's current focus area within the user's field of view.
  • the processor may perform eye tracking processing based on the data captured by the eye tracking camera for each eye. The eye tracking process may acquire a current focus area of the user and perform a processing operation based on the current focus area.
  • convergence between the eyes, together with data indicating the position of the user's face, can be used to triangulate to the focal curve, the horopter, and the point of focus on it.
  • the focus area adjustment module in the processor may adjust the variable focus lens disposed to focus the at least one eye in the focus area of the current user through the control circuit.
  • the surface of the user interface image on which the virtual object is displayed may be set as the default virtual image position.
  • a virtual object to be drawn in space may be drawn in front of or behind the default virtual image position.
  • the virtual object has a different position and size when viewed from the user's left eye and right eye, according to its position. That is, the two-dimensional image of the virtual object falling on each eye is transmitted to the user's brain, which perceives the three-dimensional virtual object. Conversely, if an appropriate image is drawn for the left eye and another for the right eye, the user's brain will perceive the object in three dimensions.
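  • A small sketch of this inverse rendering step, under an assumed viewing geometry (interpupillary distance and virtual image plane distance): the horizontal offset applied to the object in each eye's image places it at a chosen depth. The formula is the standard similar-triangles parallax relation, not one quoted from this description.

```python
IPD_M = 0.064       # assumed interpupillary distance (m)
PLANE_D_M = 2.0     # assumed distance of the virtual image plane (m)
PX_PER_M = 1000.0   # assumed pixel density of the image plane

def eye_image_x_px(z_m):
    """Horizontal position (px) of a centered object in the left/right eye
    images so that it is perceived at depth z_m rather than on the plane."""
    # Left eye at -IPD/2: its ray to a point at depth z crosses the plane
    # at x = (IPD/2) * (D/z - 1); the right eye is symmetric.
    x_left = 0.5 * IPD_M * (PLANE_D_M / z_m - 1.0) * PX_PER_M
    return x_left, -x_left

near = eye_image_x_px(1.0)   # crossed disparity: object pops out at 1 m
far = eye_image_x_px(4.0)    # uncrossed disparity: object recedes to 4 m
```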
  • the depth sensor may recognize the palm 510, the back of the hand 520, the forearm 530, a desk, or another image-output target such as a refrigerator door, a room door 550, or a wall; determine the location and size of the real space and the object; create a coordinate system; and draw the virtual objects 600 and 610 based on it.
  • the virtual object includes a user interface image or vice versa.
  • the stereoscopic vision processing system, the virtual object controller, or a combination thereof in the processor of the present embodiment may recognize a superposition of an instruction pointer for accessing the virtual object and perform a preset command or a series of operations.
  • the palm detection system in the processor may extract the outline 562 of the hand from the image acquired by the sensor or camera, or extract the joints 556 of the hand and the fingertips 566. That is, based on the point cloud obtained from the acquired image, the outline 562 of the hand may be extracted, or the joints 556 of the hand and the tips 566 of the fingers may be extracted.
  • joint extraction also allows the virtual object to be matched or hidden when the fist is clenched or the palm is turned over rather than open. Using joint extraction, the specific hand gesture can be determined.
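  • One hedged way to realize this extraction with classical image processing (a sketch under assumptions, not the disclosed method): take the largest contour of a binary hand mask as the outline, and treat convex-hull points separated by deep convexity defects as fingertip candidates.

```python
import cv2

DEFECT_DEPTH = 8000   # assumed threshold in 1/256-pixel units (~31 px)

def hand_features(mask):
    """Return (outline contour, fingertip candidates) from a uint8 hand mask."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, []
    outline = max(contours, key=cv2.contourArea)        # cf. outline 562
    hull_idx = cv2.convexHull(outline, returnPoints=False)
    defects = cv2.convexityDefects(outline, hull_idx)
    tips = []
    if defects is not None:
        for start, end, far, depth in defects[:, 0]:
            if depth > DEFECT_DEPTH:                    # deep valley between fingers
                tips.append(tuple(outline[start][0]))   # cf. fingertips 566
    return outline, tips
```

  • Joint positions (cf. 556) would require a richer model, e.g. a learned hand-skeleton estimator; the contour approach above only recovers the outline and fingertip candidates.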
  • a virtual object 600 is displayed on the recognized palm, and the virtual object 600 may function as a user interface for acquiring an input signal from the user through the user's other hand or a corresponding pointer.
  • the recognized palm itself may also be used as the other hand or as a pointer, in which case preset commands may be performed according to various gesture subsets defined in advance for palm recognition.
  • the above-described stereoscopic vision processing system, the virtual object controller, a combination thereof, or a processor including the same may perform a series of operations as shown in FIGS. 25 and 26.
  • the processor mounted in the smart glasses may recognize an object having a structural feature of the hand or the wrist (S301).
  • Objects may include palms, backs of hands, forearms, physical objects, walls, doors, and the like.
  • the processor may select one of the plurality of objects according to a preset processing policy in operation S303.
  • the selection criterion may be set to include a criterion such as a priority of the object, a center of the image, or occupying the largest area of the image.
  • a region for the user interface (UI) image may be searched for based on the structural features of the recognized or selected object (S304).
  • the UI image may correspond to or include a virtual object and vice versa.
  • the processor may determine or adjust an appropriate size of the area in which the UI image is to be drawn with respect to the searched area (S305).
  • the appropriate size, position, or arrangement direction of the UI image may be determined according to a predetermined rule or setting, based on the type of the recognized or selected object or on the corresponding UI image arrangement region.
  • a UI image of the adjusted size may be generated in the searched area.
  • the UI image may be generated together with the 3D plane or the curved surface as the background (S306).
  • the three-dimensional plane or curved surface may function to allow the user to see the virtual object or the UI image better by contrast with the virtual object.
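  • The steps S301 to S306 could be prototyped along the following hedged lines; the priority table, margin factor, and names are assumptions rather than the disclosed policy.

```python
import math
from dataclasses import dataclass

# Assumed priority order for selecting among recognized objects (S303).
PRIORITY = {"palm": 0, "back_of_hand": 1, "forearm": 2, "desk": 3, "door": 4}

@dataclass
class Candidate:
    kind: str        # e.g. "palm"
    area_px: int     # pixel area of the detected surface
    center: tuple    # image coordinates of the surface center

def select_object(candidates):
    """S303: prefer higher-priority kinds; break ties by larger area."""
    return min(candidates,
               key=lambda c: (PRIORITY.get(c.kind, 99), -c.area_px))

def place_ui(cand, aspect=1.6):
    """S304 to S306: fit a UI rectangle of the given aspect into the surface."""
    w = math.sqrt(cand.area_px * aspect) * 0.8   # 0.8 = assumed margin factor
    return {"center": cand.center, "size": (w, w / aspect),
            "background": "3d_plane"}            # contrast backdrop (S306)

ui = place_ui(select_object([Candidate("desk", 90000, (400, 300)),
                             Candidate("palm", 20000, (200, 260))]))
# The palm wins despite its smaller area, per the assumed priority rule.
```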
  • the processor mounted on the smart glasses may recognize an object having the structural features of another hand approaching the UI image projected on the object (S311).
  • Another hand or object may be included in or referred to as a pointer.
  • the processor may determine whether a plurality of pointers are found (S312). If there are a plurality of found pointers, the processor may select any one of the plurality of pointers according to a prestored processing policy (S313).
  • the processor may determine whether a fingertip is close to a specific object in the UI within a specific range (S314).
  • proximity may include overlap; in a recognition value indicating the degree of proximity based on the distance between the two objects, overlap beyond contact may be expressed as a negative value. If the fingertip is not within the specified range, the processor may remain at, or return to, the step of recognizing the other hand or object.
  • the processor may determine whether a plurality of objects corresponding to one fingertip or the end of the pointer corresponding thereto exist (S315).
  • the processor may determine the nearest point according to the preset proximity priority policy (S316).
  • the nearest determination may be basically determined based on the distance between the pointer end and the object or the overlapping distance.
  • the nearest determination may be determined based on a preset priority, a weight, a difference between proximity distances, or a combination thereof among the plurality of objects.
  • the processor may determine that the object corresponding to the fingertip or the end of the pointer is touched (touch determination), return the identifier (ID) of the UI object determined to be touched, and then execute a preset touch reaction (S317).
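  • A hedged sketch of S311 to S317, with the proximity threshold, object representation, and callback name all assumed:

```python
TOUCH_RANGE_MM = 15.0   # assumed proximity threshold

def proximity(tip, obj):
    """Signed distance from a fingertip to a UI object; negative = overlap."""
    d = sum((a - b) ** 2 for a, b in zip(tip, obj["center"])) ** 0.5
    return d - obj["radius"]

def touch_step(pointers, ui_objects, on_touch):
    if not pointers:
        return                       # S311: keep looking for a pointer
    tip = pointers[0]                # S313: assumed policy: first pointer wins
    near = [(proximity(tip, o), o) for o in ui_objects]
    near = [(d, o) for d, o in near if d < TOUCH_RANGE_MM]    # S314
    if not near:
        return
    _, obj = min(near, key=lambda t: t[0])   # S315-S316: nearest object wins
    on_touch(obj["id"])                      # S317: return ID, run reaction

touch_step([(10, 20, 400)],
           [{"id": "btn_play", "center": (12, 22, 405), "radius": 8}],
           on_touch=lambda oid: print("touched:", oid))
```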
  • a stereo RGB camera mounted in smart glasses may be used, or an infrared projector or a stereo infrared camera may be used.
  • augmented reality using binocular stereoscopic vision may be implemented, and a preset operation may be executed by detecting a touch signal or an event input through a user interface according to a gesture of a user. Therefore, it is possible to increase user convenience and provide various augmented reality services and smart glasses services incorporating the same.

Abstract

Disclosed are smart glasses comprising a display device and capable of processing a virtual object. The smart glasses comprise: a frame assembly comprising multiple frames forming the shape of glasses; an electronic equipment component mounted on the frame assembly; first and second display devices controlled by a controller of the electronic equipment component and disposed at the upper-center and lower-center sides of the frame assembly, respectively; and an optical system for transferring and projecting a first image output by the first display device and a second image output by the second display device onto first and second main lenses, respectively, while enlarging or reducing the size of the images or focusing them, such that a user sees the first image and the second image on the first main lens and the second main lens, respectively.
PCT/KR2018/001100 2017-03-28 2018-01-25 Lunettes intelligentes capables de traiter un objet virtuel Ceased WO2018182159A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20170039473 2017-03-28
KR10-2017-0039473 2017-03-28
KR10-2017-0118599 2017-09-15
KR1020170118599A KR20180109644A (ko) 2017-03-28 2017-09-15 스마트 안경 및 그 광학시스템

Publications (1)

Publication Number Publication Date
WO2018182159A1 true WO2018182159A1 (fr) 2018-10-04

Family

ID=63676672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/001100 Ceased WO2018182159A1 (fr) 2017-03-28 2018-01-25 Lunettes intelligentes capables de traiter un objet virtuel

Country Status (1)

Country Link
WO (1) WO2018182159A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06105256A (ja) * 1992-09-18 1994-04-15 Olympus Optical Co Ltd 頭部又は顔面装着式ディスプレイ装置
JP3413885B2 (ja) * 1993-07-29 2003-06-09 ソニー株式会社 ディスプレイ装置及び該装置を用いる眼鏡型映像表示装置
US20020181115A1 (en) * 2001-04-20 2002-12-05 John Hopkins University Head mounted display with full field of view and high resolution
JP2003270584A (ja) * 2003-03-07 2003-09-25 Nec Corp ヘッドマウントディスプレイおよびヘッドマウントディスプレイを備えた画像入出力装置
US20160070122A1 (en) * 2014-09-05 2016-03-10 Vision Service Plan Computerized replacement temple for standard eyewear

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020189866A1 (fr) * 2019-03-19 2020-09-24 엘지전자 주식회사 Dispositif électronique pouvant être porté sur la tête
CN110825225A (zh) * 2019-10-30 2020-02-21 深圳市掌众信息技术有限公司 一种广告展示方法及系统
CN110825225B (zh) * 2019-10-30 2023-11-28 深圳市掌众信息技术有限公司 一种广告展示方法及系统
CN113064274A (zh) * 2019-12-30 2021-07-02 Lg电子株式会社 电子装置
US11789275B2 (en) 2019-12-30 2023-10-17 Lg Electronics Inc. Electronic device
CN113064274B (zh) * 2019-12-30 2024-06-04 Lg电子株式会社 电子装置
CN114839783A (zh) * 2022-06-13 2022-08-02 中国银行股份有限公司 一种基于人工智能的ar眼镜
CN114755833A (zh) * 2022-06-14 2022-07-15 深圳市合川智能科技有限公司 虚拟现实设备
CN115343854A (zh) * 2022-08-17 2022-11-15 亮风台(上海)信息科技有限公司 光学显示组件及智能头戴设备

Similar Documents

Publication Publication Date Title
WO2018182159A1 (fr) Lunettes intelligentes capables de traiter un objet virtuel
US11378802B2 (en) Smart eyeglasses
WO2015053449A1 (fr) Dispositif d'affichage d'image de type lunettes et son procédé de commande
WO2019147021A1 (fr) Dispositif de fourniture de service de réalité augmentée et son procédé de fonctionnement
KR102051202B1 (ko) 가상 오브젝트의 처리가 가능한 스마트 안경
WO2020111594A1 (fr) Dispositif électronique, dispositif de réalité augmentée destiné à fournir un service de réalité augmentée, et son procédé de fonctionnement
WO2018066962A1 (fr) Lunettes intelligentes
WO2018080149A2 (fr) Système de rééducation cognitive à réalité virtuelle associé à la biométrique
KR102218210B1 (ko) 가상 오브젝트의 처리가 가능한 스마트 안경
KR102218207B1 (ko) 가상 오브젝트의 처리가 가능한 스마트 안경
WO2022255682A1 (fr) Dispositif électronique habitronique et procédé de commande de trajet d'alimentation de celui-ci
KR102191433B1 (ko) 스마트 안경
WO2020004941A1 (fr) Dispositif électronique comprenant un élément de réflexion et un élément de réflexion translucide qui peut transmettre, à une lentille, une lumière émise par un affichage
WO2024090844A1 (fr) Dispositif habitronique pour changer l'état d'un écran, et procédé associé
WO2025244279A1 (fr) Dispositif à porter sur soi et procédé de rendu d'image, et support de stockage non transitoire lisible par ordinateur
WO2024155076A1 (fr) Dispositif pouvant être porté pour fournir des informations, et procédé associé
WO2024117649A1 (fr) Dispositif vestimentaire pour afficher un contenu multimédia sur la base d'une forme de préhension par rapport à un objet externe, et procédé associé
WO2025041979A1 (fr) Dispositif portable, procédé et support de stockage lisible par ordinateur pour identifier le regard d'un utilisateur
WO2024101591A1 (fr) Dispositif électronique pour fournir au moins un contenu multimédia à des utilisateurs accédant à un objet, et procédé associé
WO2025164971A1 (fr) Dispositif portable et procédé de déplacement d'objet virtuel pour obtenir des informations concernant des positions du regard
WO2024122836A1 (fr) Dispositif porté par l'utilisateur et procédé d'affichage d'une interface utilisateur associée à la commande d'un dispositif électronique externe
WO2024195997A1 (fr) Dispositif électronique à porter sur soi prenant en charge un toucher à faible puissance, et son procédé de fonctionnement
WO2024101579A1 (fr) Dispositif électronique pour afficher un contenu multimédia, et procédé associé
WO2024025076A1 (fr) Dispositif électronique pour ajuster un volume à l'aide d'un signal sonore émis par un objet externe, et procédé associé
WO2024080579A1 (fr) Dispositif à porter sur soi pour guider la posture d'un utilisateur et procédé associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18774222

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18774222

Country of ref document: EP

Kind code of ref document: A1