
WO2019112114A1 - Glasses-type terminal and method for using same - Google Patents


Info

Publication number
WO2019112114A1
WO2019112114A1 (PCT/KR2018/000480)
Authority
WO
WIPO (PCT)
Prior art keywords
resolution
distance
image
augmented reality
change
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2018/000480
Other languages
English (en)
Korean (ko)
Inventor
강필구
권종훈
김은기
안정수
천지영
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of WO2019112114A1
Anticipated expiration: Critical
Current legal status: Ceased (Critical)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/128: Adjusting depth or disparity
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/0101: Head-up displays characterised by optical features
    • G02B 2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G02B 2027/0178: Eyeglass type

Definitions

  • the present invention relates to a glass-type terminal which can be worn on a head of a user and outputs an augmented reality image, and an operation method thereof.
  • A terminal can be divided into a mobile (portable) terminal and a stationary terminal according to whether it can be moved.
  • A mobile terminal can further be divided into a handheld terminal and a vehicle-mounted terminal according to whether the user can carry it directly.
  • The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and outputting images or video on a display unit. Some mobile terminals add electronic game play functions or perform multimedia player functions. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts and video or television programs.
  • As these functions diversify, the mobile terminal is implemented in the form of a multimedia player with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.
  • Another object of the present invention is to provide a glass-type terminal, and an operating method thereof, in which a user who wears glasses can adjust the screen resolution according to the eye relief adjustment, and a user who does not wear glasses can adjust the screen position according to his or her eye characteristics.
  • Another object of the present invention is to provide a glass-type terminal, and an operating method thereof, in which the wearer of the main body can view a calibrated, comfortable screen even while traveling by a moving means.
  • A glass-type terminal according to the present invention comprises: a main body worn on the head of a user; a first frame coupled to the body; a second frame movably connected to the first frame and having a display unit for outputting an augmented reality image; a distance adjusting unit disposed between the first and second frames to adjust the distance between the display unit and the first frame; a sensing unit for sensing a change in the distance between the display unit and the first frame according to the operation of the distance adjusting unit; and a controller for changing the resolution of the augmented reality image based on the sensed distance change when a resolution change mode is executed while the augmented reality image is output.
  • An operating method of the glass-type terminal includes: outputting an augmented reality image; when a resolution change mode is executed while the augmented reality image is being output, sensing a change in eye relief distance according to the adjustment of the separation distance between the display and the lens; and changing the resolution of the augmented reality image based on the sensed change in eye relief distance.
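The claimed sequence above (output an AR image, sense a change in the display-to-frame separation while a resolution change mode is active, then change the resolution) can be sketched roughly as follows. This is an illustrative sketch only: the class, method names, and the distance-to-resolution thresholds are hypothetical and do not come from the patent.

```python
# Hypothetical sketch of the claimed operating method: while the
# resolution change mode is active, a sensed change in the eye relief
# distance (display-to-first-frame separation) drives a resolution change.

class GlassTerminal:
    def __init__(self):
        self.resolution_mode = False
        self.resolution = (1920, 1080)   # current AR image resolution

    def start_resolution_mode(self):
        self.resolution_mode = True

    def on_distance_change(self, eye_relief_mm: float):
        """Called by the sensing unit when the distance adjuster moves."""
        if not self.resolution_mode:
            return self.resolution
        # Larger eye relief -> smaller usable field of view -> lower resolution
        # (thresholds below are invented for illustration).
        if eye_relief_mm < 20:
            self.resolution = (1920, 1080)
        elif eye_relief_mm < 30:
            self.resolution = (1280, 720)
        else:
            self.resolution = (800, 480)
        return self.resolution

terminal = GlassTerminal()
terminal.start_resolution_mode()
print(terminal.on_distance_change(35.0))  # -> (800, 480)
```

Outside the resolution change mode, a sensed distance change leaves the AR image untouched, mirroring the claim's condition that the mode must be executed first.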
  • According to the present invention, both users who wear glasses and users who do not can stably wear and use the terminal without additional cost or separate parts.
  • A user wearing glasses can easily adjust the screen resolution in correspondence with the adjustment of the eye relief distance, and the user can easily adjust the screen position according to his or her eyeball characteristics.
  • Even when the wearer is traveling by a moving means, it becomes possible to view a comfortable screen whose resolution or position has been corrected.
  • FIG. 1 is a block diagram illustrating a glass-type terminal according to the present invention.
  • FIGS. 2 and 3 are views for explaining a change in eye relief distance according to the distance adjustment between a frame and a display in a glass-type terminal according to an exemplary embodiment of the present invention.
  • FIGS. 4A, 4B, and 4C are diagrams for explaining a method of recognizing a change in eye relief distance based on detection of a polarity change level in a glass-type terminal according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a method of operating a glass-type terminal according to an exemplary embodiment of the present invention.
  • FIGS. 6A, 6B, and 6C are conceptual diagrams illustrating screen changes associated with resolution change for a glasses-wearing user of a glass-type terminal according to an exemplary embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating another operation method of the glass-type terminal according to an embodiment of the present invention.
  • FIGS. 8A, 8B, 8C, and 8D are conceptual diagrams for explaining screen changes related to image position correction for a user not wearing glasses, in a glass-type terminal according to an embodiment of the present invention.
  • FIG. 9 is a flowchart for explaining resolution change and image position correction according to shaking during travel by a moving means, in a glass-type terminal according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method of generating a plurality of captured images corresponding to a plurality of resolution levels and outputting a captured image when capturing an augmented reality image, in a glass-type terminal according to an exemplary embodiment of the present invention.
  • FIG. 1 is a block diagram of a glass-type terminal 100 according to an embodiment of the present invention.
  • The glass-type terminal 100 includes a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, and a power supply unit 190.
  • The components shown in FIG. 1 are not essential to implementing the glass-type terminal 100, so the terminal described herein may have more or fewer components than those listed above.
  • The wireless communication unit 110 includes at least one module enabling wireless communication between the glass-type terminal 100 and a wireless communication system, between the glass-type terminal 100 and another terminal, or between the glass-type terminal 100 and an external server.
  • the wireless communication unit 110 may include one or more modules that connect the glass-type terminal 100 to one or more networks.
  • The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
  • The input unit 120 includes a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, etc.) for receiving information from a user.
  • The voice data or image data collected by the input unit 120 may be analyzed and processed into a user's control command.
  • the camera 121 may include at least one of at least one image sensor (e.g., CCD, CMOS, etc.), a photosensor (or image sensor) and a laser sensor.
  • the camera 121 and the laser sensor may be combined with each other to sense a touch of the sensing object with respect to the three-dimensional stereoscopic image.
  • The photosensor can be laminated onto the display element and is adapted to scan the movement of an object approaching the touch screen. More specifically, the photosensor mounts photodiodes and transistors (TRs) in rows and columns, and scans the content placed on the photosensor using an electrical signal that varies according to the amount of light applied to each photodiode. That is, the photosensor calculates the coordinates of the sensed object according to the amount of change in light, and position information of the object can be obtained through this calculation.
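The row/column scan described above amounts to locating the photodiode cell whose light level changed the most between two scans. A toy version follows; the grid values and the difference criterion are invented purely for illustration:

```python
# Toy illustration of the photosensor scan described above: photodiodes are
# arranged in rows and columns, and the coordinates of the sensed object are
# taken as the cell with the largest change in light level between scans.

def locate_object(prev_scan, curr_scan):
    """Return (row, col) of the largest light-level change, or None."""
    best, best_pos = 0, None
    for r, (prev_row, curr_row) in enumerate(zip(prev_scan, curr_scan)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            change = abs(q - p)
            if change > best:
                best, best_pos = change, (r, c)
    return best_pos

prev = [[10, 10, 10],
        [10, 10, 10],
        [10, 10, 10]]
curr = [[10, 10, 10],
        [10,  3, 10],   # an object shadows the centre photodiode
        [10, 10, 10]]
print(locate_object(prev, curr))  # -> (1, 1)
```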
  • the sensing unit 140 may include at least one sensor for sensing at least one of information in the glass-type terminal 100, surrounding environment information surrounding the terminal 100, and user information.
  • The sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an infrared sensor, a magnetic sensor 143, an acceleration sensor 144, a G-sensor, a gyroscope sensor 145, a motion sensor, an RGB sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone 122, a battery gauge, an environmental sensor (e.g., a barometer, hygrometer, thermometer, radiation sensor, thermal sensor, gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.).
  • the glass-type terminal 100 disclosed in the present specification can combine and use information sensed by at least two of the sensors.
  • The output unit 150 includes a display unit 151, an audio output unit 152, a haptic module 153, a light output unit 154, and an infrared light emitting unit 155 (not shown).
  • the display unit 151 may have a mutual layer structure with the touch sensor or may be integrally formed to realize a touch screen.
  • the touch screen functions as a user input unit 123 that provides an input interface between the glass-type terminal 100 and a user and can provide an output interface between the terminal 100 and a user.
  • the infrared light emitting unit 155 generates infrared light and projects it to the outside.
  • the infrared ray emitting unit 155 may be disposed adjacent to the camera 121 in such a form that a plurality of infrared ray emitting devices are aggregated to support an angle of view range of the camera 121.
  • the infrared ray emitting unit 155 may operate independently of the camera 121 or may operate to generate light when the camera 121 is driven. The arrangement and the configuration of the infrared ray emitting unit 155 included in the glass-type terminal 100 according to the embodiment of the present invention will be described in detail below with reference to the drawings.
  • The interface unit 160 serves as a path to various types of external devices connected to the glass-type terminal 100.
  • The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port.
  • When an external device is connected to a corresponding port of the interface unit 160, appropriate control related to the connected external device can be performed.
  • the memory 170 stores data supporting various functions of the glass-type terminal 100.
  • The memory 170 may store a plurality of application programs (or applications) driven by the glass-type terminal 100, as well as data and commands for its operation. At least some of these applications may be downloaded from an external server via wireless communication. Also, at least some of these applications may be present on the glass-type terminal 100 from the time of shipment for its basic functions (e.g., receiving and placing calls, receiving and sending messages).
  • the application program may be stored in the memory 170 and installed on the glass-type terminal 100 and may be operated by the control unit 180 to perform the operation (or function) of the terminal 100.
  • In addition to operations associated with application programs, the control unit 180 typically controls the overall operation of the glass-type terminal 100.
  • The control unit 180 may process signals, data, information, and the like that are input or output through the above-mentioned components, or may run an application program stored in the memory 170, to provide or process information or functions appropriate to the user.
  • controller 180 may control at least some of the components shown in FIG. 1 in order to drive an application program stored in the memory 170.
  • the controller 180 can operate at least two of the components included in the glass-type terminal 100 in combination with each other for driving the application program.
  • the power supply unit 190 receives power from the external power source and supplies power to the components included in the glass type terminal 100.
  • the power supply unit 190 includes a battery and is chargeable.
  • At least some of the components may operate in cooperation with each other to implement a method of operating, controlling, or controlling a glass-type terminal according to various embodiments described below.
  • the operation, control, or control method of the glass-type terminal may be implemented on a glass-type terminal by driving at least one application program stored in the memory 170.
  • FIGS. 2 and 3 are views for explaining a change in eye relief distance according to the adjustment of the distance between the first frame and the display in a glass-type terminal according to an embodiment of the present invention; the lens is detachable depending on whether the user wears glasses.
  • The glass-type terminal 100 is configured to be worn on the head of a user like a pair of glasses and includes a frame portion 101 (or case, housing, etc.).
  • The frame portion 101 may be formed of a flexible material to facilitate wearing.
  • The frame portion 101 is supported on the head and provides a space in which various components are mounted. As shown, the control unit 180, the power supply unit 190, an audio output module (not shown), and the like may be mounted in the frame portion 101. In addition, a lens 103 covering at least one of the left eye and the right eye may be detachably mounted on the frame portion.
  • the control unit 180 controls various electronic components included in the glass type terminal 100.
  • The frame unit includes a first frame 101 and a second frame on which the display 151 and the lens 103 are arranged.
  • A distance adjusting unit 125 for physically adjusting the distance between the first frame 101 and the display unit 151 may be provided between the first frame 101 and the second frame. As shown in FIG. 2, the distance adjusting unit 125 is disposed below the first frame, in which the user input unit 123 and the control unit 180 are disposed.
  • the present invention is not limited thereto.
  • the first frame 101, the distance adjusting unit 125, and the display unit 151 may be arranged in parallel to each other in the longitudinal direction.
  • A sensor (e.g., a magnetic sensor, a Hall sensor, or the like) may be provided to sense the change in the separation distance, and the sensor values related to the sensed change are transmitted to the controller 180.
  • the control unit 180 can be understood as a configuration corresponding to the control unit 180 described above. This figure illustrates that the control unit 180 is provided in the frame portion on one side of the head. However, the position of the control unit 180 is not limited thereto.
  • the display unit 151 may be implemented as a head mounted display (HMD).
  • HMD type refers to a display method that is mounted on a head and displays an image directly in front of the user's eyes.
  • the display unit 151 may be disposed to correspond to at least one of the left and right eyes so that the user can directly provide an image in front of the user's eyes.
  • the display unit 151 is located at a portion corresponding to the right eye so that an image can be output toward the user's right eye.
  • the display unit 151 can project an image to the user's eye using a prism. Further, the prism may be formed to be transmissive so that the user can view the projected image and the general view of the front (the range that the user views through the eyes) together.
  • the glass-type terminal 100 can provide an Augmented Reality (AR) in which a virtual image is superimposed on a real image or a background and displayed as a single image by using the characteristics of the display. Since the image output through the display unit 151 directly projects the AR image in the user's eye (for example, the right eye), the illustrated lens 103 can be omitted according to the user's selection.
  • the camera 121 is disposed adjacent to at least one of the left eye and the right eye, and is configured to photograph a forward image. Since the camera 121 is positioned adjacent to the eyes, the camera 121 can acquire a scene viewed by the user as an image.
  • the camera 121 is provided in the control unit 180 in this figure, it is not limited thereto.
  • The camera 121 may be installed in the frame unit, and a plurality of cameras may be provided to acquire stereoscopic images.
  • the glass-type terminal 100 may include a user input unit 123 operated to receive a control command.
  • The user input unit 123 may employ any tactile manner of operation in which the user touches or pushes it with a tactile feel.
  • a touch input type user input unit 123 is provided in the frame portion.
  • the glass-type terminal 100 may further include a microphone (not shown) that receives sound and processes the sound into electrical voice data and vibration data, and an acoustic output module that outputs sound.
  • the sound output module may be configured to transmit sound by a general sound output method or a bone conduction method.
  • The microphone may be provided in the same insertion portion as the sound output module.
  • the glass-type terminal 100 may further include an infrared ray emitting unit 155 that generates infrared ray using at least one infrared ray emitting element (e.g., an IR LED).
  • the infrared ray emitting unit 155 is disposed adjacent to the camera 121, and projects the generated infrared ray to the outside.
  • The infrared ray emitting unit 155 may be disposed at a position adjacent to the camera 121, in the form of a plurality of infrared light emitting devices clustered together to cover the angle of view of the camera 121.
  • the infrared light projected to the outside through the infrared light emitting unit 155 can be scanned at an arbitrary position of the image output through the display unit 151.
  • the infrared ray emitting unit 155 can change the position and direction of the infrared ray projected to the outside based on a predetermined input signal.
  • the distance adjusting unit 125 may have a double rail structure.
  • This structure is merely an example, and may of course be modified into a screw-type variable structure or the like.
  • the distance adjusting unit 125 may have a structure in which the dual rail structure is slid by an external force or automatically slid according to a touch input.
  • the distance adjuster 125 may include an outer first rail and an inner second rail, and the first and second rails may vary lengthwise relative to each other.
  • the first and second rails may initially have a completely overlapping structure or may have a structure in which the majority of the first rail is drawn into the second rail.
  • the first and second rails may be connected to a well-known fastening means or hinge structure.
  • Each end of the overlapping portion of the first rail and the second rail may further include a separation-preventing projection and a corresponding hole, to prevent the rails from separating when adjusted to the maximum distance.
  • The length of the first and second rails of the distance adjusting unit 125 may be varied by the wearer applying a mechanical external force (e.g., rotation, pulling, pushing) to the main body 100, or may be varied automatically, with a touch input, in such a manner that the first and second rails gradually move relative to each other in a predetermined direction.
  • As the length of the rails increases, the second frame, on which the display unit 151 and the lens 103 are disposed, gradually moves away from the first frame 101 worn on the user's head. Accordingly, the eye relief required when the user wears glasses can be secured.
  • a sensor may be provided in the distance adjuster 125 to sense the distance adjustment by the distance adjuster 125.
  • A sensor H (e.g., a Hall sensor, magnetic sensor, or IR sensor) capable of detecting the overlap length or protrusion amount due to the relative movement of the inner first rail 125i and the outer second rail 125e may be provided.
  • Thereby, the distance from the display unit 151 to the wearer's eyeball, that is, the eye relief distance, can be extended to about 40 mm.
  • The change in eye relief distance is sensed by detecting a change in the length of the distance adjusting unit 125, in such a manner that one sensor H moves or a plurality of sensors are activated/deactivated.
  • At least one of the first rail 125i and the second rail 125e has a region having a different polarity at a predetermined length.
  • For example, S-pole, N-pole, and S-pole regions may be arranged alternately, possibly with non-polarized regions between them.
  • 4A shows the structure when the length of the distance adjusting portion 125 is a minimum distance, for example, when the first rail 125i and the second rail 125e are completely overlapped.
  • At this time, the sensor H remains fixed at the first position P1. In this case, the sensor H will sense no change in polarity, or only one polarity (e.g., S pole), corresponding to level 1.
  • When a spectacle lens (not shown) is added between the existing lens 103 and the user's pupil (distance adjustment is also necessary when the lens 103 is detached), the length of the distance adjusting unit 125 must be increased, and the eye relief distance accordingly increases.
  • FIG. 4B shows the structure when the length of the distance adjusting unit 125 is at its maximum, for example, when the first rail 125i and the second rail 125e have been moved relative to each other as far as possible.
  • the sensor H is in a state of being moved from the first position P1 to the second position P2.
  • The sensor H can be made movable in a non-contact manner, without mechanical wear, to ensure reliability.
  • the sensor H can detect a polarity change of a maximum level (e.g., level 5). For example, the change of the S pole, the N pole, the S pole, the N pole, and the S pole may be sequentially detected.
  • 4C is an example showing the output result of the Hall sensor when the polarity changes according to the length of the distance adjusting unit 125 as described above.
  • As shown, the sensor H outputs High in S-pole regions and Low in N-pole regions, and transmits the output values to the control unit 180 in order. Accordingly, the control unit 180 can recognize a change in the eye relief distance and prepare to adjust the resolution or magnification of the screen accordingly.
  • The control unit 180 recognizes the change in eye relief distance in order to handle a cropping phenomenon that occurs in at least a part of the screen viewed by the user when the eye relief is expanded or reduced.
  • Hereinafter, embodiments in which this recognition and the screen adjustment according to the change in eye relief are implemented smoothly and easily by the control unit 180 will be described.
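The level scheme of FIGS. 4A to 4C can be illustrated by counting polarity segments in the Hall sensor output. In this sketch, the 'H'/'L' sample encoding and the linear level-to-millimetre mapping are assumptions for illustration; the description only states that the sensor outputs High over S poles and Low over N poles, that there are up to five levels, and that the eye relief can extend to about 40 mm:

```python
# Sketch of the FIG. 4C idea: the Hall sensor outputs High over S-pole
# regions and Low over N-pole regions, so the number of distinct polarity
# segments seen while the rails slide gives a coarse distance "level".

def polarity_level(samples):
    """Count polarity segments in a stream of 'H'/'L' Hall-sensor samples."""
    level, last = 0, None
    for s in samples:
        if s != last:          # entering a new polarity region
            level += 1
            last = s
    return level

def level_to_eye_relief_mm(level, max_level=5, max_mm=40):
    # Hypothetical linear mapping: level 1 = fully overlapped rails (0 mm
    # extension), max_level = rails fully extended (~40 mm per the text).
    return (level - 1) * max_mm / (max_level - 1)

# Fully overlapped rails: only one polarity seen -> level 1.
print(polarity_level("HHHH"))        # -> 1
# Fully extended: S, N, S, N, S sequence -> level 5 -> ~40 mm.
print(polarity_level("HHLLHHLLHH"))  # -> 5
print(level_to_eye_relief_mm(5))     # -> 40.0
```

Counting segments rather than raw samples makes the level independent of how fast the rails slide, which matches the idea of detecting a polarity change level rather than a duration.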
  • As described above, the glass-type terminal 100 configured to be worn on the head of a user includes a first frame 101 and a second frame that is movably connected to the first frame 101 and on which the display unit 151 is disposed.
  • A distance adjusting unit 125 (see FIG. 1) for adjusting the separation distance between the display unit 151 and the first frame 101, and thus the eye relief distance, is provided between the first frame 101 and the second frame.
  • When the sensor 143 senses a change in the eye relief distance according to the operation of the distance adjusting unit 125, the resolution of the augmented reality image can be automatically changed.
  • FIG. 5 is a flowchart illustrating an operation method related to screen adjustment according to a change in eye relief distance in a glass-type terminal according to a first embodiment of the present invention.
  • First, with the user wearing the glass-type terminal 100, a step of outputting an augmented reality image (hereinafter, 'AR image') is started (10).
  • Wearing of the glass-type terminal 100 can be sensed through various sensors, and the magnetic sensor 143 for sensing the change in length of the distance adjusting unit 125 can be activated upon detecting that the terminal 100 is worn.
  • the control unit 180 may detect occurrence of a predetermined trigger event during output of the AR image (20).
  • The trigger event may be the detection of an operation that varies the length of the distance adjusting unit 125, or the user entering the resolution change mode in a settings application.
  • Alternatively, the trigger event may be the output of an initial screen after rebooting the main body.
  • the detection of the operation for varying the length of the distance adjusting unit 125 can be performed through the above-described sensor (e.g., Hall sensor / magnetic sensor, IR sensor).
  • the resolution change mode is executed (30). That is, an operation for changing the current screen resolution is executed in order to prevent a cropping phenomenon of the currently output AR image.
  • The resolution change mode may be executed in response to the detection of a change in eye relief distance according to the operation of the distance adjusting unit 125, for example when the change persists for a reference time (e.g., 1 minute or a set time).
  • Next, images having different profiled resolutions (hereinafter, 'test images') are output to the display unit 151 (40).
  • a plurality of profiled resolution levels and corresponding images may be stored in the memory 170 of the main body 100 in advance.
  • A test image corresponding to a plurality of profiled resolution levels means a predetermined test image whose resolution corresponds to a different resolution profile for each eye relief distance, managed by a table.
  • The test images may all be provided at the same time, or only the one specific test image matching the eye relief distance corresponding to the operation of the distance adjusting unit 125 may be provided. In the latter case, it may be changed to another test image according to an operation of the user input unit 123 or the like.
  • Next, the control unit 180 detects a change in eye relief distance through the above-described sensor (e.g., Hall sensor, magnetic sensor, or IR sensor) (50).
  • a sensor for detecting a change in eye relief distance may be provided in the distance adjusting unit 125 disposed between the first frame of the main body and the second frame in which the display unit 151 is disposed.
  • the sensor may be, for example, a hall sensor, a two-axis magnetic sensor, or an IR sensor.
  • the change in the eye relief distance can be achieved by detecting the change in polarity of the sensor according to the change in the length of the distance adjusting unit 125.
  • Next, the controller 180 selects one of the output test images (60). Specifically, since each of the test images has a different profiled resolution, a second test image corresponding to the sensed change in eye relief distance can be specified based on the first test image matching the current resolution.
  • In other words, the image to be selected depends on the eye relief distance resulting from the operation of the distance adjusting unit.
  • At this time, resolution information matched to the selected test image may be output to the display unit 151.
  • Finally, the controller 180 changes the resolution of the current AR image to the optimal resolution according to the change in eye relief distance.
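Steps (10) to (60) above, together with the profiled resolutions listed later in the description, might be tied together as in the sketch below. The distance bands are invented for illustration, and 640x480 is assumed where the text prints "680*480" (likely a typo for VGA); none of this is specified by the patent:

```python
# Rough sketch of the FIG. 5 flow: once the resolution change mode runs,
# pick the profiled test image matching the sensed eye relief distance and
# apply its resolution to the AR image.

# Profiled resolutions from the description, ordered from smallest to
# largest eye relief distance (the distance bands in mm are invented;
# 640x480 is assumed where the source text prints "680*480").
PROFILE = [
    ((0, 10),  (1920, 1080)),
    ((10, 20), (1280, 720)),
    ((20, 28), (1024, 768)),
    ((28, 35), (800, 480)),
    ((35, 99), (640, 480)),
]

def select_test_image(eye_relief_mm):
    """Step (60): pick the test image whose band contains the distance."""
    for (lo, hi), resolution in PROFILE:
        if lo <= eye_relief_mm < hi:
            return resolution
    return PROFILE[-1][1]

def run_resolution_mode(sensed_mm):
    # Steps (30)-(60): the mode executes, a test image is selected from the
    # profiled table, and its resolution is applied to the AR image.
    return select_test_image(sensed_mm)

print(run_resolution_mode(15))  # -> (1280, 720)
print(run_resolution_mode(30))  # -> (800, 480)
```

A lookup over monotonically ordered bands captures the description's rule that a larger eye relief distance selects a lower resolution.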
  • 6A, 6B and 6C show screen changes related to resolution change for a wearer of glasses.
  • FIG. 6A is the initial screen of the resolution change mode and shows an exemplary inquiry screen 601 for determining whether or not the user wears glasses.
  • The resolution change setting can be started through the inquiry screen 601.
  • The inquiry screen 601 may be output on at least a part of the current AR image.
  • the guide information 602 for guiding the distance adjustment of the distance adjuster 125 may be output as shown in FIG. 6B.
  • the user can intuitively understand the correlation between the distance adjustment of the distance adjuster 125 and the selection of the test image through the guide information 602.
  • the guide information 602 illustrates, by way of example, the choice of five profiled exemplary test images, but is not limited to such number and resolution values.
  • FIG. 6C shows examples of test images having a plurality of different profiled resolutions.
  • Images of (a) 1920*1080, (b) 1280*720, (c) 1024*768, (d) 800*480, and (e) 680*480 are profiled and stored in advance in the memory 170. As the detected change in eye relief distance increases, a lower-resolution test image is selected.
  • In this way, no problem is caused by the size of the margin area projected onto the actual user's pupil, while the reduction in resolution is kept to a minimum.
  • a test image (b) having a resolution of 1280x720 can be selected.
  • a test image (d) having a resolution of 800x480 may be automatically selected to prevent cropping.
  • That is, as the eye relief distance increases, the resolution is changed to a resolution lower than the current one, and the change is applied to the AR image.
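The selection rule described above (a lower-resolution test image as the detected eye relief change grows) can be pictured as a simple threshold lookup. The millimetre thresholds below are illustrative assumptions; only the five profiled resolutions come from the description:

```python
# Hypothetical mapping from eye relief change (mm) to a profiled
# test-image resolution; the resolution list follows the profiled
# test images, the thresholds are illustrative only.
PROFILE = [
    (2.0, (1920, 1080)),
    (4.0, (1280, 720)),
    (6.0, (1024, 768)),
    (8.0, (800, 480)),
    (float("inf"), (680, 480)),
]

def select_resolution(eye_relief_change_mm):
    """Return the first profiled resolution whose bucket covers the change."""
    for max_change, resolution in PROFILE:
        if eye_relief_change_mm <= max_change:
            return resolution

assert select_resolution(1.0) == (1920, 1080)
assert select_resolution(7.5) == (800, 480)
```

A larger sensed change thus always maps to the same or a lower profiled resolution, which matches the monotonic rule described in the text.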
  • To this end, the memory 170 stores, in the form of a table, the resolution levels, the test images, and the distance variations of the distance adjusting unit 125, so that one test image can be matched to each combination of the current AR image resolution and the distance variation of the distance adjusting unit 125.
  • the stored tables may be updated based on the user's selection / usage pattern.
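One way to picture the table held in the memory 170 is a mapping from the combination of current resolution and sensed distance variation to a test image, updated from the user's selections. The bucketing granularity and key layout below are assumptions, not the patent's data format:

```python
class ResolutionTable:
    """Sketch of the stored table: (current resolution, distance bucket)
    -> test image id, updatable from the user's selection/usage pattern."""

    def __init__(self):
        self.table = {}

    @staticmethod
    def _bucket(distance_mm):
        # Group continuous distance variations into 2 mm buckets (assumed).
        return int(distance_mm // 2)

    def lookup(self, current_resolution, distance_mm):
        """Return the matched test image, or None if no entry exists."""
        return self.table.get((current_resolution, self._bucket(distance_mm)))

    def update(self, current_resolution, distance_mm, chosen_image):
        # Reflect the user's selection back into the stored table.
        self.table[(current_resolution, self._bucket(distance_mm))] = chosen_image

t = ResolutionTable()
t.update((1920, 1080), 5.0, "test_image_c")
assert t.lookup((1920, 1080), 4.6) == "test_image_c"
```

Distance values falling in the same bucket resolve to the same stored test image, which is one plausible way the combination lookup could behave.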
  • Meanwhile, a reboot may be required when applying the resolution change according to the change in the eye relief distance, so that memory 170 and CPU load problems in the main body do not occur.
  • the controller 180 may output a rebooting request through the display unit 151.
  • In response to the rebooting request, the resolution value requested before rebooting can be controlled so as to actually be applied at the time of rebooting.
  • The resolution of the display unit 151 can be supported by switching to lower resolutions based on the maximum resolution (e.g., 1920*1080); in the kernel structure, this is reflected as code in the display driver IC. Otherwise, to change and apply the resolution on the current operating system, images at each changeable resolution level may be drawn in advance in the image rendering stage or rendered in real time.
  • a resolution change can be automatically performed to receive an appropriate AR image without cropping.
  • FIG. 7 is a flowchart illustrating a method of operating a glass-type terminal according to a second embodiment of the present invention.
  • FIGS. 8A, 8B, 8C, and 8D are conceptual diagrams for explaining screen changes related to image position correction for a spectacle wearer.
  • First, an AR image is output (701) through the display unit 151 of the glass-type terminal 100, and a change in the eye relief distance is detected (702) through the sensor provided in the distance adjusting unit 125.
  • Next, the controller 180 may recognize whether the user wears glasses (703).
  • Specifically, based on the detected change in the eye relief distance, the controller 180 can recognize whether the user wears glasses. For example, if the change in the eye relief distance is less than 10 mm, it is determined that glasses are not worn, and if it exceeds 10 mm, it is determined that glasses are worn.
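The 10 mm criterion above amounts to a one-line threshold check. A minimal sketch (the function name and unit handling are illustrative, not from the specification):

```python
def wears_glasses(eye_relief_change_mm):
    """Infer whether the user wears glasses from the sensed change in
    eye relief distance, using the 10 mm threshold from the description."""
    return eye_relief_change_mm > 10.0

# Small changes are attributed to normal fit; larger ones to glasses.
assert not wears_glasses(4.0)
assert wears_glasses(12.5)
```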
  • Alternatively, an inquiry message for recognizing whether or not the user wearing the main body wears glasses may be displayed on the display unit 151, and based on the response to the inquiry message, the image position correction mode or the resolution change mode can be executed.
  • Specifically, the control unit 180 can recognize whether or not the user wears glasses according to a 'Yes' or 'No' response to the inquiry message 801, which is output overlapping the current AR image or output separately.
  • There is no limitation on the response method; voice input, touch input, or a head gesture may be used.
  • Subsequently, the resolution change mode described above with reference to FIG. 5 may be executed (704). At this time, however, the first screen shown in FIG. 6A may be omitted.
  • the controller 180 maintains the current resolution of the AR image (705). Accordingly, as shown in FIG. 8B, the message 802 indicating that the current resolution is maintained can be output through the display unit 151.
  • a position adjustment guide for adjusting the position of the output AR image is output to the display unit 151 (706). Adjustment of the position of the AR image can be performed through a test image.
  • As shown in FIG. 8C, a guidance screen 803 allows fine adjustment of the image to a comfortable position through position adjustment in four directions, and a margin screen area (dashed rectangle area) is displayed.
  • The position adjustment may be performed by operating a four-directional interface output on the guidance screen 803, which interacts with a touch key or the like provided on one side of the glass-type terminal.
  • the controller 180 stores the position-adjusted coordinate values in the storage associated with the image output in the memory 170 (707).
  • When a confirmation input is received from the user, the position-determined test image 804 is output, and the x and y coordinate values of that position are stored as the changed coordinate values of the AR image.
  • Thereafter, an AR image positioned according to the user's eyeball characteristics (e.g., the user's pupil distance and the eye with the larger usage ratio between the left and right eyes) is output (708).
  • the image position adjusting function can be provided so that a comfortable position screen can be viewed.
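The four-direction adjustment and coordinate storage in steps 706-707 can be sketched as follows; the pixel step size and the storage layout are illustrative assumptions:

```python
class PositionAdjuster:
    """Sketch of the four-direction AR image position adjustment
    (706) and coordinate storage (707)."""

    STEP = 5  # pixels moved per directional input (illustrative)

    def __init__(self, x=0, y=0):
        self.x, self.y = x, y
        self.saved = None

    def move(self, direction):
        """Shift the image one step in the given direction."""
        dx, dy = {"left": (-self.STEP, 0), "right": (self.STEP, 0),
                  "up": (0, -self.STEP), "down": (0, self.STEP)}[direction]
        self.x += dx
        self.y += dy

    def confirm(self):
        # Store the adjusted coordinates as the AR image's new position.
        self.saved = (self.x, self.y)
        return self.saved

adj = PositionAdjuster()
adj.move("right"); adj.move("down"); adj.move("down")
assert adj.confirm() == (5, 10)
```

Subsequent AR images would then be drawn at the stored `saved` coordinates, corresponding to step 708.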
  • FIG. 9 is a flowchart illustrating an operation method according to a third embodiment of the glass-type terminal of the present invention, specifically, resolution change and image position correction according to shaking during movement by a moving means.
  • the wearing of the glass-type terminal 100 is detected and the step of outputting the AR image to the display unit 151 is started (901).
  • the terminal 100 may detect movement by the moving means (902).
  • Here, the moving means may include all kinds of vehicles, such as a car, a taxi, a truck, a bicycle, a ship, a train, an airplane, a cable car, and the like.
  • The movement by the moving means may be detected through the sensors provided in the terminal 100 (e.g., the acceleration sensor 144, the gyro sensor 145, and the magnetic sensor 143), or through a user input (for example, a 'movement mode' execution command).
  • A method of excluding unintentional motion detection while the AR image is output through the display unit 151 is well known, so it need not be specifically described, and a detailed description thereof is omitted here.
  • When the movement by the moving means is detected, the controller 180 recognizes the current operation mode (903). For example, if the current operation mode is 'navigation for finding restaurant A', the following operations may be performed until the navigation application is terminated or until restaurant A is found. That is, the control unit 180 may set the continuation of the current operation mode as the continuation condition of the image correction mode, which will be described in detail below.
  • the image correction mode is executed, and the process of initializing the reference position is performed (904).
  • the coordinate values of the reference position are used as references for correcting the resolution and position of the image thereafter.
  • Changes from the coordinate values of the reference position can be detected by monitoring front/rear/left/right shaking.
  • Specifically, the controller 180 of the terminal measures the angular velocity values of the two axes horizontal to the ground (y and z axes) among the sensor values obtained through the three-axis gyro sensor, and corrects the rotational angular velocity value about the axis perpendicular to the ground (x axis) using the sensor value obtained through the magnetic sensor, so that front/rear/left/right shaking can be precisely monitored.
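As a rough sketch of this monitoring, assuming per-sample gyro integration and a complementary-style blend with the magnetometer heading (the blend weight, sample period, and data layout are assumptions, not the patent's method):

```python
def monitor_shake(gyro_yz, gyro_x_rate, mag_heading, prev_heading,
                  dt=0.01, mag_weight=0.02):
    """Estimate front/rear/left/right shake over one sample period.

    gyro_yz: (y, z) angular velocities, the axes horizontal to the ground.
    gyro_x_rate: angular velocity about the axis perpendicular to the
    ground, drift-corrected toward the magnetometer heading.
    """
    pitch_delta = gyro_yz[0] * dt   # front/back shake component
    roll_delta = gyro_yz[1] * dt    # left/right shake component
    # Blend the gyro-integrated heading with the magnetic heading to
    # cancel gyro drift about the vertical axis (weight is illustrative).
    heading = (1 - mag_weight) * (prev_heading + gyro_x_rate * dt) \
              + mag_weight * mag_heading
    return pitch_delta, roll_delta, heading

p, r, h = monitor_shake((0.5, -0.2), 0.0, 90.0, 90.0)
assert abs(p - 0.005) < 1e-9 and abs(r + 0.002) < 1e-9
assert abs(h - 90.0) < 1e-9
```

The blended heading stays anchored to the magnetometer while still responding quickly to gyro motion, which is the usual reason for combining the two sensors.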
  • Next, the controller 180 performs resolution correction of the AR image (905). For example, when the terminal body 100 itself or the rail of the distance adjusting unit 125 moves forward, the resolution of the AR image is corrected in real time to a value lower than the current value. Conversely, when the terminal body 100 itself or the rail of the distance adjusting unit 125 moves backward, the resolution of the AR image is corrected in real time to a value higher than the current value and output.
  • the controller 180 performs position correction of the AR image (906). For example, when the terminal body 100 itself moves to the left with reference to the reference position, the position of the AR image is corrected in real time so as to have a coordinate value shifted to the right by a shaking distance from the stored reference position. When the terminal body 100 moves to the right with reference to the reference position, the position of the AR image is corrected in real time so as to have a coordinate value shifted to the left by a shaking distance from the stored reference position.
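The per-direction corrections in steps 905 and 906 can be pictured as a small dispatcher. The resolution ladder, direction names, and pixel step are illustrative assumptions:

```python
# Hypothetical resolution ladder, from highest to lowest.
RES_LEVELS = [(1920, 1080), (1280, 720), (1024, 768), (800, 480)]

def correct_image(res_index, x_offset, shake, shake_px=0):
    """Return (new_res_index, new_x_offset) after a sensed shake.

    Forward/backward shake lowers/raises the AR image resolution (905);
    left/right shake shifts the image opposite to the shake so it stays
    aligned with the stored reference position (906).
    """
    if shake == "forward":
        res_index = min(res_index + 1, len(RES_LEVELS) - 1)  # lower resolution
    elif shake == "backward":
        res_index = max(res_index - 1, 0)                    # higher resolution
    elif shake == "left":
        x_offset += shake_px    # move image right by the shake distance
    elif shake == "right":
        x_offset -= shake_px    # move image left by the shake distance
    return res_index, x_offset

assert correct_image(1, 0, "forward") == (2, 0)
assert correct_image(0, 10, "right", shake_px=4) == (0, 6)
```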
  • Only one of the resolution correction (905) and the position correction (906) of the AR image described above may be performed, or both may be performed; when both are performed, they may be carried out sequentially or simultaneously, regardless of a predetermined order.
  • In addition, the controller 180 can control the image correction operations to be performed when the deviation from the reference position is not restored within a predetermined time.
  • When the movement by the moving means ends, the image correction mode is terminated together with it (907). When the image correction mode ends, the reference position and the resolution stored at the time the reference position was set are applied to subsequently output AR images. That is, the resolution and position of the AR image can be restored to their original state.
  • FIG. 10 is a flowchart illustrating an operation method according to a fourth embodiment of the glass-type terminal of the present invention. Specifically, FIG. 10 illustrates a method of generating a plurality of capture images corresponding to a plurality of resolution levels when capturing an augmented reality image, and a method of outputting a captured image.
  • the wearing of the glass-type terminal 100 is detected and the step of outputting an AR image to the display unit 151 is started (1001).
  • When a capture command for the AR image is received, a plurality of AR capture images having the profiled first resolution, second resolution, third resolution, fourth resolution, and so on are automatically generated.
  • Then, one AR capture image corresponding to the current eye relief distance among the plurality of stored AR capture images corresponding to the plurality of resolution levels is automatically output (1005).
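A sketch of this capture flow, rendering one capture per profiled resolution and later picking the one matching the current eye relief change (the render stand-in and the 2 mm-per-level bucketing are assumptions):

```python
# The five profiled resolution levels from the description.
PROFILED = [(1920, 1080), (1280, 720), (1024, 768), (800, 480), (680, 480)]

def capture_all(render):
    """Generate one AR capture per profiled resolution level.

    render: callable (width, height) -> image; a stand-in for the
    real capture pipeline.
    """
    return {res: render(*res) for res in PROFILED}

def pick_capture(captures, eye_relief_change_mm):
    """Select the stored capture matching the current eye relief change
    (2 mm per level is an illustrative bucketing, not specified)."""
    index = min(int(eye_relief_change_mm // 2), len(PROFILED) - 1)
    return captures[PROFILED[index]]

caps = capture_all(lambda w, h: f"{w}x{h} capture")
assert pick_capture(caps, 1.0) == "1920x1080 capture"
assert pick_capture(caps, 9.0) == "680x480 capture"
```

The same lookup could also serve the transmission case, keyed instead by the eye relief information received from the external terminal.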
  • In addition, when a stored AR capture image is to be transmitted to an external terminal (not shown) through the communication unit 110 included in the main body 100, information related to the eye relief distance of the external terminal may first be received, and then the one AR capture image matching the resolution corresponding to the received information may be selectively transmitted.
  • As described above, both users who wear glasses and users who do not can stably wear and use the terminal without additional cost or separate parts.
  • In particular, a spectacle wearer can easily adjust the screen resolution in correspondence with the adjustment of the eye relief distance, and can easily adjust the screen position according to his or her eyeball characteristics.
  • Furthermore, even while the wearer of the main body is moving by a moving means, a comfortable screen can be viewed with the resolution or position corrected.
  • the present invention described above can be embodied as computer-readable codes on a medium on which a program is recorded.
  • The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects but should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed are a glasses-type terminal and a method for operating the same. A glasses-type terminal according to an embodiment of the present invention comprises: a main body configured to be worn on a user's head; a first frame coupled to the main body; a second frame movably connected to the first frame, on which a display unit for outputting an augmented reality image is disposed; a distance adjusting unit disposed between the first and second frames, capable of adjusting the separation distance between the display unit and the first frame; a sensing unit that detects a change in eye relief distance resulting from the adjustment of the separation distance between the display unit and the first frame according to an operation of the distance adjusting unit; and a control unit. When a resolution change mode is executed while the augmented reality image is output, the control unit changes the resolution of the augmented reality image according to the detected change in eye relief distance.
PCT/KR2018/000480 2017-12-07 2018-01-10 Glasses-type terminal and method for using same Ceased WO2019112114A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0167544 2017-12-07
KR1020170167544A KR20190067523A (ko) Glass-type terminal and operating method thereof

Publications (1)

Publication Number Publication Date
WO2019112114A1 true WO2019112114A1 (fr) 2019-06-13

Family

ID=66751458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2018/000480 2017-12-07 2018-01-10 Glasses-type terminal and method for using same Ceased WO2019112114A1 (fr)

Country Status (2)

Country Link
KR (1) KR20190067523A (fr)
WO (1) WO2019112114A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102812420B1 (ko) 2019-08-28 2025-05-23 LG Electronics Inc. Electronic device
JP7387545B2 (ja) * 2020-07-01 2023-11-28 NTT Communications Corporation Information setting control device, method, and program
WO2024090896A1 (fr) * 2022-10-28 2024-05-02 Samsung Electronics Co., Ltd. Wearable electronic device comprising a wheel

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150075764A (ko) * 2013-12-26 2015-07-06 LG Electronics Inc. Mobile device for capturing images and control method thereof
KR20160136674A (ko) * 2015-05-20 2016-11-30 LG Electronics Inc. Head mounted display
KR20170006058A (ko) * 2015-07-07 2017-01-17 Samsung Electronics Co., Ltd. Head mounted display apparatus
KR20170031978A (ko) * 2015-09-14 2017-03-22 LG Electronics Inc. Head mounted display
US20170200427A1 (en) * 2015-08-06 2017-07-13 Boe Technology Group Co., Ltd. Display adjusting system and display adjusting method


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111965781A (zh) * 2020-08-28 2020-11-20 Guangdong Jiulian Technology Co., Ltd. Electric adjustment control method, system, and device for VR lens barrel focal length
CN111965781B (zh) * 2020-08-28 2022-02-25 Guangdong Jiulian Technology Co., Ltd. Electric adjustment control method, system, and device for VR lens barrel focal length
US20220197029A1 (en) * 2020-12-23 2022-06-23 Tobii Ab Head-mounted display and method of optimisation
CN112925413A (zh) * 2021-02-08 2021-06-08 Vivo Mobile Communication Co., Ltd. Augmented reality glasses and touch control method thereof
CN112925413B (zh) * 2021-02-08 2023-06-23 Vivo Mobile Communication Co., Ltd. Augmented reality glasses and touch control method thereof

Also Published As

Publication number Publication date
KR20190067523A (ko) 2019-06-17

Similar Documents

Publication Publication Date Title
WO2019112114A1 Glasses-type terminal and method for using same
WO2020185029A1 Electronic device and method for displaying sharing information on the basis of augmented reality
WO2016021747A1 Head-mounted display and method for controlling same
WO2016003078A1 Glasses-type mobile terminal
WO2018038281A1 Wearable device
KR20160001178A Glass-type terminal and control method thereof
WO2015122566A1 Head-mounted display device for displaying an augmented-reality image capture guide, and control method therefor
JP2012203128A Head-mounted display device and control method of head-mounted display device
WO2019231090A1 Electronic device and method for displaying an object associated with an external electronic device according to the position and movement of the external electronic device
EP3167610A1 Display device having a credential range linked to the depth of a virtual object, and corresponding control method
WO2018088730A1 Display apparatus and control method thereof
WO2019164092A1 Electronic device for providing second content for first content displayed on a display according to movement of an external object, and operating method therefor
WO2019139404A1 Electronic device and corresponding image processing method
WO2022196869A1 Head-mounted display device, operating method for the device, and storage medium
WO2017171157A1 Wearable device
WO2016182090A1 Glasses-type terminal and control method therefor
WO2018093075A1 Electronic device and control method thereof
WO2019066323A1 Electronic device and content execution method using line-of-sight information thereof
WO2018080202A1 Head-mounted display device and method for controlling same
WO2017164545A1 Display device and method for controlling a display device
JP5273323B1 Head-mounted display device
WO2021230568A1 Electronic device for providing an augmented reality service, and operating method thereof
WO2016047824A1 Image information projection device and projection device control method
WO2018097483A1 Motion information generating method and electronic device supporting same
US11275547B2 (en) Display system, display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18885143

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18885143

Country of ref document: EP

Kind code of ref document: A1