WO2020204594A1 - Virtual reality device and control method thereof - Google Patents
Virtual reality device and control method thereof
- Publication number
- WO2020204594A1 (PCT/KR2020/004447)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- virtual reality
- virtual
- reality device
- gaze direction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Definitions
- the present invention relates to a virtual reality device and a control method thereof, and more particularly, to a virtual reality device for providing virtual reality content through a virtual reality (VR) device and a control method thereof.
- the environment or situation provided through virtual reality content can stimulate the user's five senses so that the user can have a spatial and temporal experience similar to the real one.
- the user is not simply immersed in the virtual reality content, but can interact with objects embodied in the content by performing manipulations or commands using an existing device.
- Virtual reality content can be distinguished from a one-sided simulation in that it allows interaction with the user and creates the user's own experience.
- Virtual reality content may be provided to a user through a virtual reality device.
- One of the virtual reality devices is a head mounted display (HMD) that is mounted on a user's head and has a display screen positioned in front of the user's eyes, and displays virtual reality content through the display screen.
- the user may move his/her head in various directions while using the virtual reality device, and the direction the user looks at may change according to such movement.
- In order to provide realistic virtual reality content, a virtual reality device must be able to provide virtual reality content that reflects a change in the user's viewing direction.
- Accordingly, the present invention provides a method and apparatus for providing virtual reality content that reflects a change in viewing direction, such as a rotation or movement of the head, by a user viewing the content.
- An embodiment of the present invention may provide a virtual reality device including a sensor that detects the direction of a user's gaze and a display unit that displays a virtual screen to the user.
- a method of controlling a virtual reality device may include: setting a first reference for the user's gaze direction based on the sensor; displaying to the user a virtual screen projected in a virtual space based on the first reference; determining whether the rotation angle of the user's gaze direction in the pitch direction, relative to the first reference, is greater than a first threshold value and smaller than a second threshold value; determining whether the movement distance of the user's gaze direction along the Y-axis direction is greater than a third threshold value; and determining that the user's posture is a lying posture when the rotation angle of the user's gaze direction in the pitch direction, relative to the first reference, is greater than the first threshold value and smaller than the second threshold value, and the movement distance of the user's gaze direction along the Y-axis direction is greater than the third threshold value.
- a method of controlling a virtual reality device may further include transmitting, to an external electronic device paired with the virtual reality device, a signal requesting activation of a centering button of the external electronic device, through which a command to set a new second reference for the user's gaze direction can be received.
- a method of controlling a virtual reality device may further include: receiving a command for setting a new second reference for the user's gaze direction from the external electronic device; setting the new second reference for the user's gaze direction; and displaying a virtual screen projected in the virtual space based on the new second reference for the user's gaze direction.
- a method of controlling a virtual reality device may further include turning off the virtual screen displayed by the display unit in order to reduce fatigue caused by the user's cognitive dissonance.
- in a method of controlling a virtual reality device, when the virtual screen projected in the virtual space is displayed based on the new second reference for the user's gaze direction, the rotation of the virtual screen may be displayed based on a value obtained by multiplying the rotation angle of the user's gaze direction in the pitch direction by a predetermined weight.
- in a method of controlling a virtual reality device, when the virtual screen projected in the virtual space is displayed based on the new second reference for the user's gaze direction, the rotation of the virtual screen may instead be displayed based on a signal received from the external electronic device. The overall flow is sketched below.
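- as an illustration only, the following Python sketch strings the summarized steps together; the sensor/controller API, the threshold values, and the message names are hypothetical and are not part of the disclosure.

```python
import math

# Illustrative values only; the disclosure does not fix concrete thresholds.
THETA_1 = math.radians(60)   # first threshold for pitch rotation
THETA_2 = math.radians(120)  # second threshold for pitch rotation
DELTA_Y = 0.20               # third threshold for downward Y movement (meters)

def control_loop(device, controller):
    """One rendering loop following the summarized steps (S100-S900)."""
    reference = device.sensor.read_pose()                 # S100: first reference
    while device.running:
        pose = device.sensor.read_pose()
        device.render(pose, reference)                    # S200: display virtual screen
        pitch = pose.pitch - reference.pitch              # S300: pitch change vs. reference
        dy = reference.y - pose.y                         # S400: downward Y movement
        if THETA_1 < pitch < THETA_2 and dy > DELTA_Y:    # lying posture detected
            controller.send("activate_centering_button")  # S600: ask paired device
            if controller.wait_for("set_second_reference"):
                device.display_off()                      # S700: hide the axis change
                reference = device.sensor.read_pose()     # S800: new second reference
                device.display_on()
```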
- as described above, when the user's gaze direction changes, the virtual reality device may provide virtual reality content reflecting the change, thereby providing a realistic virtual reality experience.
- FIG. 1 is a diagram illustrating that a virtual reality device according to an embodiment of the present invention displays a virtual screen in a virtual space.
- FIG. 2 is a diagram illustrating controlling an object in a virtual screen using an external electronic device paired with a virtual reality device according to an exemplary embodiment of the present invention.
- FIG. 3 is a block diagram of a virtual reality device according to an embodiment of the present invention.
- FIG. 4(a) is a diagram illustrating an external electronic device according to an embodiment of the present invention
- FIG. 4(b) is a block diagram of an external electronic device according to an embodiment of the present invention.
- FIG. 5 is a block diagram illustrating a configuration of a virtual reality device according to an embodiment of the present invention.
- FIG. 6 is a diagram showing a projection structure applied by a virtual reality device according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating a user's gaze direction on a coordinate axis in a virtual reality device according to an embodiment of the present invention.
- FIG. 8 is a diagram illustrating changing a user's gaze direction to the left and right directions in a virtual reality device according to an embodiment of the present invention.
- FIGS. 9(a) and 9(b) are diagrams illustrating a change in the user's gaze direction from the front to the upper side in the virtual reality device according to an embodiment of the present invention.
- FIG. 10 is a diagram illustrating changing a virtual space in a virtual reality device according to an embodiment of the present invention.
- FIG. 11 is a flowchart illustrating a method of controlling a virtual reality device according to an embodiment of the present invention.
- FIG. 12 is a diagram illustrating an example of changing a virtual space of a virtual reality device according to an embodiment of the present invention.
- FIG. 13 is a diagram illustrating a method of controlling a virtual reality device according to a change in a virtual space of the virtual reality device according to an embodiment of the present invention.
- each component, functional block, or means may be composed of one or more sub-components, and the electrical, electronic, and mechanical functions performed by each component may be implemented with various known devices or mechanical elements such as electronic circuits, integrated circuits, and application-specific integrated circuits (ASICs); the components may be implemented separately, or two or more may be integrated into one.
- combinations of each block of the attached block diagram and each step of the flowchart may be performed by computer program instructions.
- these computer program instructions can be loaded onto the processor of a general-purpose computer, special-purpose computer, portable notebook computer, network computer, mobile device such as a smartphone, an online game service providing server, or other programmable data processing equipment, so that the instructions, executed by the processor of such data processing equipment, create means for performing the functions described in each block of the block diagram or each step of the flowchart.
- these computer program instructions can also be stored in a memory usable by, or readable by, a computer device or other programmable data processing equipment in order to implement a function in a particular way, so it is also possible to produce an article of manufacture containing instruction means that perform the functions described in each block of the block diagram or each step of the flowchart.
- the computer program instructions can also be loaded onto a computer device or other programmable data processing equipment to create a process in which a series of operational steps are performed on the computer device or other programmable data processing equipment, so it is also possible to provide steps for performing the functions described in each block of the block diagram and each step of the flowchart.
- each block or each step may represent a module, segment, or part of code comprising one or more executable instructions for executing the specified logical function(s).
- functions mentioned in blocks or steps may occur out of order; for example, two blocks or steps shown in succession may in fact be performed substantially simultaneously, or the blocks or steps may sometimes be performed in the reverse order, depending on the corresponding function.
- the term user device means any computing means capable of collecting, reading, processing, storing, and displaying data, such as a desktop computer, a notebook computer, a smartphone, a PDA, a mobile phone, and a game machine.
- the user device according to the embodiment of the present invention is a device having a function of executing software written in readable code and displaying and delivering it to a user.
- the software can be stored on the device itself or can be read together with data from the outside.
- the terminal in the embodiment of the present invention includes not only the above data processing functions but also functions such as input, output, and storage, and for this purpose may include various elements such as a CPU, main board, graphics card, hard disk, sound card, speaker, keyboard, mouse, monitor, USB port, and communication modem, as well as a CPU, main board, graphics chip, memory chip, sound engine, speaker, touch pad, an external connection terminal such as USB, a communication antenna, and a communication modem capable of implementing communication such as 3G, LTE, LTE-A, 5G, WiFi, and Bluetooth.
- FIG. 1 is a diagram illustrating that a virtual reality device according to an embodiment of the present invention displays a virtual screen in a virtual space.
- FIG. 2 is a diagram illustrating controlling an object in a virtual screen using an external electronic device paired with a virtual reality device according to an exemplary embodiment of the present invention.
- the display unit 120 of the virtual reality device 100 outputs a virtual screen 300 in the virtual space 200.
- the virtual reality device 100 may display at least one object 310 in the virtual screen 300.
- the user may select the at least one object 310 by using the external electronic device 400 paired with the virtual reality device 100.
- FIG. 3 is a block diagram of a virtual reality device according to an embodiment of the present invention.
- the virtual reality device 100 may include a processor 110, a display unit 120, a communication module 130, a memory 140, and a position sensor 150. Furthermore, the virtual reality device 100 may further include an eye tracking module 160 or a focus adjustment module 170.
- the processor 110 may control the operation of the virtual reality device 100.
- the processor 110 may be electrically connected to, and control, at least one of the display unit 120, the communication module 130, the memory 140, and the sensor 150.
- the processor 110 may receive an input from the external electronic device 400 connected to the virtual reality device 100 and control the object 310 displayed through the virtual reality device 100.
- the processor 110 may control the selected object to move based on an input received from the external electronic device 400, or may control a function of an application designated for the object to be executed when the selected object is located in a specific object area.
- the communication module 130 may be connected to the communication module 430 of the external electronic device 400 to receive input information of the button 410 of the external electronic device 400.
- the memory 140 may store virtual reality content provided to a user.
- Virtual reality content may be temporarily or semi-permanently stored in whole or in part in the memory 140.
- Algorithms for all techniques or methods performed by the virtual reality device 100 may be temporarily or semi-permanently stored in whole or in part in the memory 140.
- the sensor 150 may include a position sensor, and the position sensor may include an acceleration sensor, a touch sensor, or a gyroscope, and may detect the 3D position, speed, acceleration, rotation direction, rotation speed, rotation angular velocity, and the like of the virtual reality device 100.
- the eye tracking module 160 may use at least one of, for example, an electrical oculography (EOG) sensor, coil systems, dual Purkinje systems, bright pupil systems, or dark pupil systems to track the user's gaze.
- the gaze tracking module 160 may further include a micro camera for tracking gaze.
- the focus adjustment module 170 may measure the user's inter-pupillary distance (IPD) so that the user can enjoy an image suited to his or her eyesight.
- the virtual reality device 100 may adjust the distance of the lens according to the distance between both eyes of the user measured through the focus adjustment module 170.
- FIG. 4(a) is a diagram illustrating an external electronic device according to an embodiment of the present invention
- FIG. 4(b) is a block diagram illustrating an external electronic device according to an embodiment of the present invention.
- the external electronic device 400 may include a button 410, a processor 420, a communication module 430, a memory 440, and a sensor 450.
- the processor 420 may perform overall operations of the external electronic device 400.
- the processor 420 may check the signal input through the button 410 and control the input signal to be transmitted to the virtual reality device.
- the button 410 may include a touch screen that senses a touch input from a user.
- the button 410 may detect the touch speed, touch direction, touch amount, rotation speed, rotation direction, and rotation amount of a touch input received through the touch screen.
- the external electronic device 400 may detect an input signal through the button 410 as an input for controlling an object displayed on the display unit 120 of the virtual reality device 100.
- the communication module 430 may be connected to the virtual reality device 100 through wireless communication (e.g., short-range wireless communication such as Bluetooth or Wi-Fi) to transmit a signal input through the button 410 to the virtual reality device 100.
- the sensor 450 may include a position sensor, and the position sensor may include an acceleration sensor, a touch sensor, a gyroscope, and the like, and may detect the position, movement speed, direction, and angle of the external electronic device 400 and transmit them to the virtual reality device.
- FIG. 5 is a block diagram illustrating a configuration of a virtual reality device according to an embodiment of the present invention.
- the virtual reality device 100 or the external electronic device 400 may include all or part of the electronic device 1401 shown in FIG. 5.
- the electronic device 1401 may include one or more processors (e.g., an AP) 1410, a communication module 1420, a subscriber identification module 1424, a memory 1430, a sensor module 1440, an input device 1450, a display 1460, an interface 1470, an audio module 1480, a camera module 1491, a power management module 1495, a battery 1496, an indicator 1497, and a motor 1498.
- the processor 1410 may control a plurality of hardware or software components connected to the processor 1410 by driving an operating system or an application program, for example, and may perform various data processing and operations.
- the processor 1410 may be implemented with, for example, a system on chip (SoC).
- the processor 1410 may further include a graphic processing unit (GPU) and/or an image signal processor.
- the processor 1410 may include at least some of the components shown in FIG. 5 (for example, the cellular module 1421).
- the processor 1410 may load a command or data received from at least one of other components (eg, nonvolatile memory) into a volatile memory, process it, and store result data in the nonvolatile memory.
- the communication module 1420 may have a configuration the same as or similar to that of the communication module 130 or 430 described above.
- the communication module 1420 may include, for example, a cellular module 1421, a WiFi module 1423, a Bluetooth module 1425, a GNSS module 1427, an NFC module 1428, and an RF module 1429.
- the cellular module 1421 may provide, for example, a voice call, a video call, a text service, or an Internet service through a communication network.
- the cellular module 1421 may distinguish and authenticate the electronic device 1401 in a communication network using a subscriber identification module (eg, a SIM card) 1424.
- the cellular module 1421 may perform at least some of the functions that the processor 1410 may provide.
- the cellular module 1421 may include a communication processor (CP).
- at least some (for example, two or more) of the cellular module 1421, the WiFi module 1423, the Bluetooth module 1425, the GNSS module 1427, and the NFC module 1428 may be included in one integrated chip (IC) or IC package.
- the RF module 1429 may transmit and receive, for example, a communication signal (eg, an RF signal).
- the RF module 1429 may include, for example, a transceiver, a power amp module (PAM), a frequency filter, a low noise amplifier (LNA), or an antenna.
- according to another embodiment, at least one of the cellular module 1421, the WiFi module 1423, the Bluetooth module 1425, the GNSS module 1427, and the NFC module 1428 may transmit and receive RF signals through a separate RF module.
- the subscriber identification module 1424 may include, for example, a card including a subscriber identification module or an embedded SIM, and may contain unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).
- the memory 1430 may include, for example, an internal memory 1432 or an external memory 1434.
- the internal memory 1432 may include, for example, at least one of a volatile memory (e.g., DRAM, SRAM, or SDRAM), a nonvolatile memory (e.g., one-time programmable ROM (OTPROM), PROM, EPROM, EEPROM, mask ROM, or flash ROM), a flash memory, a hard drive, or a solid state drive (SSD).
- the external memory 1434 may include a flash drive, for example, a compact flash (CF), secure digital (SD), micro-SD, mini-SD, extreme digital (xD), multi-media card (MMC), memory stick, or the like.
- the external memory 1434 may be functionally or physically connected to the electronic device 1401 through various interfaces.
- the sensor module 1440 may measure a physical quantity or detect an operating state of the electronic device 1401, for example, and convert the measured or detected information into an electric signal.
- the sensor module 1440 may include, for example, at least one of a gesture sensor 1440A, a gyro sensor 1440B, an atmospheric pressure sensor 1440C, a magnetic sensor 1440D, an acceleration sensor 1440E, a grip sensor 1440F, a proximity sensor 1440G, a color sensor 1440H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 1440I, a temperature/humidity sensor 1440J, an illuminance sensor 1440K, or an ultraviolet (UV) sensor 1440M.
- additionally or alternatively, the sensor module 1440 may include, for example, an e-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, and/or a fingerprint sensor.
- the sensor module 1440 may further include a control circuit for controlling at least one or more sensors included therein.
- in an embodiment, the electronic device 1401 may further include a processor configured to control the sensor module 1440, as part of or separately from the processor 1410, so that the sensor module 1440 can be controlled while the processor 1410 is in a sleep state.
- the input device 1450 may include, for example, a rotation input device 1451, a touch panel 1452, a (digital) pen sensor 1454, a key 1456, or an ultrasonic input device 1458.
- the touch panel 1452 may use at least one of, for example, a capacitive type, a pressure sensitive type, an infrared type, or an ultrasonic type. Also, the touch panel 1452 may further include a control circuit.
- the touch panel 1452 may further include a tactile layer to provide a tactile reaction to a user.
- the (digital) pen sensor 1454 may be, for example, a part of a touch panel or may include a separate recognition sheet.
- the key 1456 may include, for example, a physical button, an optical key, or a keypad.
- the ultrasonic input device 1458 may detect ultrasonic waves generated from an input tool through a microphone (eg, the microphone 1488), and check data corresponding to the sensed ultrasonic waves.
- the rotation input device 1451 may be functionally connected to a wheel dial member, a wheel scroll button, or the like to receive a wheel input from the outside. For example, when a selected object is displayed as moving through the display 1460, the electronic device 1401 may display an interface that controls the execution of an application or content based on the position to which the selected object has moved, and may control the execution of the application or content based on a wheel input received while the interface is displayed.
- the display 1460 may include a panel 1462, a hologram device 1464, a projector 1466, and/or a control circuit for controlling them.
- the panel 1462 may be implemented to be flexible, transparent, or wearable, for example.
- the panel 1462 may be configured as one or more modules together with the touch panel 1452.
- the panel 1462 may include a pressure sensor (or force sensor) capable of measuring the intensity of pressure for a user's touch.
- the pressure sensor may be implemented integrally with the touch panel 1452, or may be implemented as one or more sensors separate from the touch panel 1452.
- the hologram device 1464 may show a 3D image in the air using interference of light.
- the projector 1466 may project light onto a screen to display an image.
- the screen may be located inside or outside the electronic device 1401, for example.
- the interface 1470 may include, for example, an HDMI 1472, a USB 1474, an optical interface 1476, or a D-subminiature (D-sub) 1478. Additionally or alternatively, the interface 1470 may include, for example, a mobile high-definition link (MHL) interface, an SD card/multi-media card (MMC) interface, or an infrared data association (IrDA) standard interface.
- the audio module 1480 may bidirectionally convert sound and electrical signals.
- the audio module 1480 may process sound information input or output through, for example, a speaker 1482, a receiver 1484, earphones 1486, or a microphone 1488.
- the camera module 1491 is, for example, a device capable of photographing still images and moving pictures, and according to an embodiment may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or xenon lamp).
- the power management module 1495 may manage power of the electronic device 1401, for example.
- the power management module 1495 may include a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge.
- the PMIC may have a wired and/or wireless charging method.
- the wireless charging method includes, for example, a magnetic resonance method, a magnetic induction method, or an electromagnetic wave method, and may further include additional circuits for wireless charging, such as a coil loop, a resonance circuit, or a rectifier.
- the battery gauge may measure, for example, the remaining amount of the battery 1496, voltage, current, or temperature during charging.
- the battery 1496 may include, for example, a rechargeable cell and/or a solar cell.
- the indicator 1497 may display a specific state of the electronic device 1401 or a part thereof (for example, the processor 1410), for example, a booting state, a message state, or a charging state.
- the motor 1498 may convert an electrical signal into mechanical vibration, and may generate vibration or a haptic effect.
- the electronic device 1401 may include, for example, a mobile TV supporting device (e.g., a GPU) capable of processing media data according to standards such as digital multimedia broadcasting (DMB), digital video broadcasting (DVB), or mediaFlo™.
- Each of the components described in the present invention may be composed of one or more components, and the name of the component may vary according to the type of electronic device.
- in various embodiments, the electronic device (for example, the electronic device 1401) may omit some components, further include additional components, or combine some of the components into a single entity that performs the functions of the corresponding components before the combination in the same way.
- FIG. 6 is a diagram showing a projection structure applied by a virtual reality device according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating a user's gaze direction on a coordinate axis in a virtual reality device according to an embodiment of the present invention.
- a projection structure, a user's viewing perspective, and a coordinate system of a virtual reality device according to an embodiment of the present invention will be described with reference to FIGS. 6 to 7.
- the virtual reality device 100 may be a head mounted display (HMD) that is mounted on the user's head, has the display unit 120 positioned in front of the user's eyes, and displays virtual reality content through the display screen.
- the virtual reality device 100 provides the user with a projection structure that projects the virtual screen 300 onto a virtual space 200 centered on the user and covering a certain area around the user.
- the virtual space 200 may be formed as a virtual spherical space having a predetermined radius.
- the axis of the virtual space 200 is fixed, and the virtual space 200 does not change according to the user's gaze direction; instead, virtual reality content is provided to the user by changing the virtual screen 300 projected on the inner surface of the axis-fixed virtual space 200.
- the user 10 wearing the virtual reality device 100 may move his or her head in various directions while using the virtual reality device 100, and the direction in which the user 10 looks may be changed according to this movement.
- the direction in which the user 10 looks is defined as a user's viewing perspective 210.
- the user's gaze direction can also be viewed as the viewing perspective of the virtual reality device; hereinafter, the term "user's gaze direction" is used throughout.
- the direction in which the user 10 looks outward from the center of the virtual space 200 toward its inner surface may be assumed to be the user's gaze direction 210.
- the virtual screen 300 can be projected onto the inner surface of the virtual space 200.
- the virtual screen 300 may provide 3D virtual reality content. Virtual reality content is assumed to be the content provided to the user through the virtual screen 300 projected on the entire or a partial area of the sphere of the virtual space centered on the user of the virtual reality device.
- the virtual screen 300 may be projected on the entire area of the virtual space 200 to provide virtual reality content to the user. However, since a person's viewing angle is limited, the virtual screen 300 may instead be projected onto a partial area of the virtual space 200, slightly exceeding the range and angle of the person's vision, to provide virtual reality content to the user.
- when the virtual screen 300 is projected onto a partial area of the virtual space 200 to provide virtual reality content to the user, the virtual screen 300 provides content that changes according to the gaze direction of the user 10.
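- for illustration, the partial projection area can be derived from the gaze direction and a field of view chosen to slightly exceed human vision; the angles below are assumptions, not values from the text.

```python
import math

def projected_region(gaze_yaw, gaze_pitch,
                     fov_h=math.radians(110), fov_v=math.radians(90)):
    """Yaw/pitch bounds of the partial sphere area onto which the virtual
    screen is projected, centered on the user's gaze direction."""
    return {
        "yaw": (gaze_yaw - fov_h / 2, gaze_yaw + fov_h / 2),
        "pitch": (gaze_pitch - fov_v / 2, gaze_pitch + fov_v / 2),
    }
```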
- as an example, assume the 3D virtual reality content is static content showing the inside of a house to an observer sitting on a sofa in the living room: a television is located in front of the observer, a kitchen on the left, a bookcase on the right, the floor of the house at the bottom, the ceiling of the house at the top, and a wall of the house at the rear.
- the virtual reality device 100 may provide a user with a sense of space in a three-dimensional virtual real world.
- the user's gaze direction on the virtual reality device according to an embodiment of the present invention, the user's movement, and the virtual reality content may be expressed in spherical coordinates or Cartesian coordinates on the same coordinate axes.
- the position sensor 150 of the virtual reality device 100 sets the user's initial gaze direction A as the first reference.
- the virtual reality device 100 determines that the center point of the user's head or the center point of the virtual reality device is at the center of the coordinate axis based on the first criterion.
- the virtual reality device 100 displays the virtual screen 300 using the display 120 so that the center of the virtual reality space implemented by the virtual reality content is located at the center of the coordinate axis.
- the user's initial gaze direction A may be set as a reference of the virtual space 200.
- in FIG. 8, the user's initial gaze direction A is shown set to the Z axis of the virtual space 200.
- the axis of the virtual space 200 set based on the user's initial gaze direction A does not change even if the user's gaze direction changes; if the user's gaze direction changes thereafter, the partial area of the virtual space onto which the virtual screen is projected changes, and the virtual reality content displayed on the virtual screen changes accordingly.
- the rotational movement in the user's gaze direction may be expressed as a yaw angle, a pitch angle, and a roll angle.
- the yaw angle ⁇ may mean an angle rotated around the y-axis.
- the yaw angle ⁇ may be defined as a clockwise rotation when viewed from the center in the positive y-axis direction.
- the pitch angle ⁇ may mean an angle rotated around the x-axis.
- the pitch angle ⁇ may be defined as a clockwise rotation when viewing the positive x-axis direction from the center.
- the roll angle ⁇ may mean an angle rotated around the z-axis.
- the roll angle ⁇ may be defined as a clockwise rotation when viewed from the center in the positive z-axis direction.
- the translational movement of the user's gaze direction is a movement of the user's center point along at least one of the x-axis, y-axis, and z-axis.
- moving along the x-axis means that the user's head moves to the left or right.
- moving along the y-axis means that the user's head moves upward or downward.
- moving along the z-axis means that the user's head moves forward or backward.
- the position sensor 150 may provide a yaw angle ⁇ , a pitch angle ⁇ , and a roll angle ⁇ in the user's gaze direction.
- the position sensor 150 may provide x, y, and z positions of the center point in the user's gaze direction.
- the position sensor may include at least one or more of various sensors such as an acceleration sensor measuring acceleration, a gyroscope measuring angular velocity, and a magnetometer that is a geomagnetic sensor.
- a roll angle or a pitch angle may be measured through an acceleration sensor or a gyroscope, and a yaw angle may be measured through a gyroscope or a magnetometer.
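- as one common way to realize such measurements, the sketch below estimates pitch and roll from the gravity vector reported by an accelerometer and integrates a gyroscope rate for yaw; it follows this document's axis convention (y up, pitch about x, roll about z), and the function names are assumptions.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll from gravity as seen by the accelerometer.
    At rest the accelerometer reads gravity, so the direction of
    (ax, ay, az) encodes the head tilt about the x and z axes."""
    pitch = math.atan2(az, math.sqrt(ax * ax + ay * ay))  # rotation about x
    roll = math.atan2(ax, math.sqrt(ay * ay + az * az))   # rotation about z
    return pitch, roll

def yaw_step(prev_yaw, gyro_y, dt):
    """Integrate the y-axis angular rate between magnetometer fixes."""
    return prev_yaw + gyro_y * dt

# Device upright and still: gravity along -y, so zero pitch and roll.
print(tilt_from_accelerometer(0.0, -9.81, 0.0))  # (0.0, 0.0)
```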
- the first user's viewing perspective (A) in the virtual space 200 is assumed to be a reference point.
- the user's initial gaze direction A may be expressed as a spherical coordinate (r, ⁇ , ⁇ ).
- setting the axis of the virtual space 200 based on the first reference, which is the user's initial gaze direction A, may be initiated through an input device of the virtual reality device 100 or of the external electronic device 400.
- the input device may be a centering button.
- the centering adjustment process may be performed by pressing the centering button.
- upon receiving the input signal of the centering button, the virtual reality device 100 measures the user's initial gaze direction A through the sensor 150. In this case, the virtual reality device may visually provide the user, through the display unit, with the changed angles in the yaw and roll directions measured through the sensor 150, to indicate the difference from the previous user gaze direction A.
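- a minimal sketch of this centering step, assuming a simple pose type; the yaw and roll components returned by relative() are what the device could visualize to show the difference from the previous reference.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float    # rotation about the y (vertical) axis
    pitch: float  # rotation about the x axis
    roll: float   # rotation about the z axis
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

class Centering:
    """Keeps the first reference A and reports gaze relative to it."""

    def __init__(self):
        self.reference = None

    def on_centering_button(self, current: Pose):
        # The gaze direction at the moment the button is pressed becomes
        # the reference (the Z axis of the virtual space).
        self.reference = current

    def relative(self, current: Pose) -> Pose:
        if self.reference is None:  # centering has not been performed yet
            return current
        r = self.reference
        return Pose(current.yaw - r.yaw, current.pitch - r.pitch,
                    current.roll - r.roll,
                    current.x - r.x, current.y - r.y, current.z - r.z)
```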
- FIGS. 9(a) and 9(b) are diagrams illustrating a change in the user's gaze direction from the front to the upper side in the virtual reality device according to an embodiment of the present invention.
- FIG. 10 is a diagram illustrating changing a virtual space in a virtual reality device according to an embodiment of the present invention.
- the user 10 wearing the virtual reality device 100 may move his or her head in various directions while using the virtual reality device 100, and the user's gaze direction may be changed according to this movement.
- the virtual reality device 100 provides a virtual screen on a virtual space in which the changed user's gaze direction is directed, and the virtual screen may provide virtual reality content changed according to the user's gaze direction.
- the user's first gaze direction 210 may be predetermined as a reference point of the user and a starting point of virtual reality content and stored in the virtual reality device.
- the user's first gaze direction 210 may be measured and obtained by the position sensor 150.
- the user's gaze direction may be changed to look at the ceiling 220 of the virtual space 200 as the user's second gaze direction 240 by the user's movement. For example, the user may turn the user's head upward to look up in the virtual reality content.
- the virtual screen 300 is projected onto the ceiling 220 of the virtual space 200 facing the user's second gaze direction 240, and the virtual screen 300 shows the user the virtual reality content corresponding to that direction. That is, when the user changes the gaze to the second gaze direction 240, the virtual reality device 100 must show the upper portion of the virtual reality content through the display unit.
- the user's movement is measured by the position sensor 150 of the virtual reality device 100.
- the measurement of the user's movement is a measurement of the movement of the user's gaze direction from the first gaze direction 210 to the second gaze direction 240.
- the movement of the user's gaze direction may be measured by the position sensor included in the virtual reality device; the method by which the position sensor measures this movement has been described above and is therefore omitted here.
- however, when the user is lying down, the conventional virtual reality device still displays to the lying user the virtual reality content of the virtual screen 300 projected on the ceiling 220 of the spherical space 200.
- when the user 10 wearing the virtual reality device 100 is lying down, it is instead necessary to provide virtual reality content rotated or moved according to the gaze direction 250 of the lying user. This allows the user to comfortably lie down and view virtual reality content that requires little movement, such as a movie, a book, or a simple game.
- when the virtual reality device 100 detects the user's posture and determines that the user is lying down, it may change the axis of the virtual space 200 and provide the virtual space to the user so that the ceiling 220 of the virtual space 200 is placed in the direction above the lying user's head.
- a method of controlling the virtual reality device 100 in which the virtual reality device 100 detects a user's posture and provides a virtual space with a changed axis will be described.
- FIG. 11 is a flowchart illustrating a method of controlling a virtual reality device according to an embodiment of the present invention.
- FIG. 12 is a diagram illustrating an example of changing a virtual space of a virtual reality device according to an embodiment of the present invention.
- a method of controlling a virtual reality device may include a process of setting a first reference for the user's gaze direction A (S100) and a process of displaying to the user the virtual screen projected in the virtual space based on the first reference (S200).
- the user may turn on the virtual reality device or press a button to set a new reference.
- the button is a centering button and may be an input device included in the virtual reality device 100 or the external electronic device 400.
- the sensor 150 of the virtual reality device 100 measures the gaze direction A of the user wearing the virtual reality device.
- the user's initial gaze direction A measured by the sensor may be set as the first reference of the virtual space.
- the center point of the user's gaze direction A measured by the sensor may be set as the center point of the virtual space 200.
- the user's initial gaze direction A may be set in the Z-axis direction of the virtual space. However, this is for convenience of explanation, and the center point of the user's initial gaze direction A and the direction thereof may be set to any point in the virtual space and any direction starting from that point.
- the processor 110 of the virtual reality device may display a virtual screen projected on the virtual space to the user based on the first criterion.
- the processor 110 controls the display 120 to project a virtual screen onto the virtual space to the user according to the user's gaze direction A set as the first reference.
- when the user looks up, the processor 110 of the virtual reality device 100 displays to the user the upper portion of the virtual reality content on the virtual screen 300 projected onto the ceiling 220 of the virtual space. However, even when the user merely wants to view the virtual reality content while lying down, the virtual reality device displays the upper portion of the virtual reality content to the user, and thus the virtual space 200 needs to be changed according to the user's body posture.
- a method of controlling a virtual reality device may further include a process of detecting a change in the user's gaze direction with respect to the pitch direction (S300) and a process of detecting a change in the user's gaze direction with respect to the Y-axis direction (S400).
- when the user's gaze direction changes, the sensor 150 of the virtual reality device detects the change.
- the sensor 150 of the virtual reality device 100 detects whether the user's new gaze direction (A1 or A2) has rotated in the yaw, roll, or pitch direction compared to the first user gaze direction A, which is the first reference.
- the sensor 150 of the virtual reality device measures the rotation angle of the user's new gaze direction (A1 or A2) with respect to the pitch direction in comparison with the first reference.
- the processor 110 of the virtual reality device determines whether the rotation angle of the user's new gaze direction with respect to the pitch direction, measured by the sensor 150, is greater than the first threshold value θ1 and smaller than the second threshold value θ2.
- the first threshold value ⁇ 1 or the second threshold value ⁇ 2 may be a value set by a user or preset and stored in the memory 140.
- the first threshold value ⁇ 1 or the second threshold value ⁇ 2 may be defined as a positive value.
- the sensor 150 of the virtual reality device detects and measures whether the center point of the user's gaze direction A3 has moved in the x-axis, y-axis, or z-axis direction compared to the first user gaze direction A, which is the first reference.
- the processor 110 of the virtual reality device determines whether the moving distance in the y direction with respect to the center point of the new gaze direction A3 measured by the sensor 150 is greater than the third threshold value ⁇ y.
- the third threshold value ⁇ y may be set by a user or a value previously set and stored in the memory 140.
- the third threshold value ⁇ y may be defined as a positive value.
- the y direction with respect to the center point of the new gaze direction A3 may be defined as a negative direction of the y axis.
- the method of controlling a virtual reality device may further include a process of determining the user's posture (S500).
- when the rotation angle of the user's gaze direction with respect to the pitch direction, compared to the first reference, is greater than the first threshold value θ1 and smaller than the second threshold value θ2, and the movement distance of the user's gaze direction with respect to the Y-axis direction is greater than the third threshold value Δy, the processor 110 of the virtual reality device may determine that the user's posture is a lying posture.
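- this determination reduces to a simple predicate; the sketch below is an illustration, with parameter names chosen here rather than taken from the disclosure.

```python
def is_lying_posture(pitch_rotation: float, y_movement: float,
                     theta_1: float, theta_2: float, delta_y: float) -> bool:
    """Return True if the gaze change indicates a lying posture.

    pitch_rotation: rotation angle of the gaze direction in the pitch
        direction relative to the first reference.
    y_movement: movement of the gaze-direction center point in the
        negative y-axis direction (positive = downward) relative to
        the first reference.
    theta_1, theta_2, delta_y: positive thresholds, as defined above.
    """
    return theta_1 < pitch_rotation < theta_2 and y_movement > delta_y
```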
- a process of displaying, on the virtual screen, information asking whether to set a new second reference for the user's gaze direction may be further included.
- the processor 110 of the virtual reality device 100 may display, through the virtual screen, to the user looking at the ceiling 220 of the virtual space, a query asking whether to set the user's new gaze direction as the new second reference. In this way, the virtual reality device 100 may ask the user's intention as to whether the axis of the virtual space 200 needs to be changed according to the user's posture change.
- FIG. 13 is a diagram illustrating a method of controlling a virtual reality device according to a change in a virtual space of the virtual reality device according to an embodiment of the present invention.
- a method of controlling a virtual reality device may include a process of transmitting a signal to, or receiving a signal from, the external electronic device 400 paired with the virtual reality device 100 (S600).
- when determining that the user is lying down, the processor 110 of the virtual reality device 100 may transmit, to the external electronic device 400 paired with the virtual reality device 100, a signal instructing activation of the centering button 410 of the external electronic device.
- the external electronic device 400 may receive a signal commanding activation of the centering button from the virtual reality device 100 and activate the centering button 410 based thereon. Before the centering button 410 is activated, it may be in a deactivated state to prevent unnecessary input by a user.
- the external electronic device 400 may transmit a signal including a command for setting a new second reference for the user's new gaze direction to the virtual reality device 100.
- the centering button 410 may be linked with the information displayed on the virtual screen asking whether to set a new reference for the new gaze direction, so that content corresponding to the centering button is displayed on the virtual screen projected in the virtual space.
- alternatively, the centering button 410 may function independently of the virtual screen. A sketch of this signaling exchange follows.
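- a sketch of the S600 exchange; the message names and JSON framing are assumptions, since the text only requires that the pairing link (e.g., Bluetooth) carry an activation request and a set-reference command.

```python
import json

ACTIVATE_CENTERING = {"type": "activate_centering_button"}
SET_SECOND_REFERENCE = {"type": "set_second_reference"}

def on_lying_posture_detected(link):
    """VR device side (S600): ask the controller to enable its centering button."""
    link.send(json.dumps(ACTIVATE_CENTERING).encode())

def on_message(link, raw, device):
    """VR device side: handle the command sent back by the external device."""
    msg = json.loads(raw.decode())
    if msg["type"] == "set_second_reference":
        device.set_second_reference()

def on_button_pressed(link, button_active):
    """External device side: the button only works after activation, which
    prevents unnecessary input while it is deactivated."""
    if button_active:
        link.send(json.dumps(SET_SECOND_REFERENCE).encode())
```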
- the method of controlling a virtual reality device may further include a process of changing the virtual space based on the user's new gaze direction and displaying the virtual screen projected on the changed virtual space (S800).
- the processor 110 of the virtual reality device 100 may receive a command from the external electronic device 400 to set a new second reference for the new gaze direction of the user.
- the processor 110 sets the new second reference based on the rotation angle and the positional movement distance of the user's gaze direction measured by the sensor 150. That is, the processor 110 changes the virtual space onto which the virtual screen is projected based on the user's new gaze direction, using the rotation angle and the positional movement distance of the user's gaze direction, and sets the changed virtual space as the new second reference.
- the processor 110 projects the virtual screen onto the changed virtual space based on the new second reference for the user's gaze direction and displays it to the user. Accordingly, even while lying down, the user can see the same screen as when using the virtual reality device while standing.
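- one way to realize the axis change is to re-express every subsequent sensor pose relative to the pose captured as the second reference; a hypothetical numpy sketch, assuming rotation matrices for orientation.

```python
import numpy as np

def rebase_pose(ref_rotation, ref_translation, rotation, translation):
    """Express a world-space pose relative to the new second reference.

    ref_rotation: 3x3 orientation captured at re-centering.
    ref_translation: 3-vector gaze center captured at re-centering.
    rotation, translation: current pose from the position sensor.
    The returned pose lives in the axis-changed virtual space, so a lying
    user sees the same screen as a standing user.
    """
    rel_rotation = ref_rotation.T @ rotation
    rel_translation = ref_rotation.T @ (translation - ref_translation)
    return rel_rotation, rel_translation
```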
- the method for controlling a virtual reality device may further include a process of turning off the displayed virtual screen (S700).
- if the processor 110 of the virtual reality device 100 shows the user the moment at which the axis of the virtual space is changed, dizziness or vomiting symptoms due to the user's cognitive dissonance may occur.
- the processor 110 may turn off the virtual screen displayed by the display unit.
- in a method of controlling a virtual reality device, when the virtual screen projected in the virtual space is displayed based on the new second reference, the change of the virtual screen may be displayed based on a value obtained by multiplying the movement of the user's gaze direction by a predetermined weight (S900).
- when the virtual screen is displayed in the virtual space whose axis has been changed, the virtual screen may be changed according to changes in the user's gaze direction, as before the axis was changed. However, unlike the standing or sitting state, the user's gaze direction in the lying state is limited: relative to the axis-changed virtual space, rotation of the gaze direction in the yaw and pitch directions is restricted, and rotation in the pitch direction in particular is limited because the head resting on the ground cannot be tilted back. The movement of the center point of the user's gaze direction along the X, Y, and Z axes is also restricted.
- therefore, the rotation of the virtual screen projected in the virtual space with respect to the pitch direction is displayed based on a value obtained by multiplying the rotation angle of the user's gaze direction with respect to the pitch direction by a predetermined weight.
- for example, if a weight greater than 1 is applied, even when the user's new gaze direction rotates by only a small angle in the pitch direction, the virtual screen projected in the virtual space can be rotated by a large angle and displayed to the user.
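- a sketch of this weighting; the weight value and the optional clamp are illustrative and are not specified in the text.

```python
def weighted_pitch(raw_pitch, weight=1.5, max_pitch=None):
    """Scale the lying user's limited pitch rotation before rendering.
    With weight > 1, a small physical pitch rotation turns the virtual
    screen by a larger angle, compensating for the head resting on the
    ground."""
    pitch = raw_pitch * weight
    if max_pitch is not None:
        pitch = max(-max_pitch, min(max_pitch, pitch))
    return pitch

# Example: a 10-degree physical rotation is rendered as 15 degrees.
print(weighted_pitch(10.0))  # 15.0
```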
- alternatively, the rotation of the virtual screen projected on the axis-changed virtual space may be displayed based on a signal received from the external electronic device.
- that is, instead of displaying the virtual screen based on the rotation and movement measured by the sensor of the virtual reality device, the movement of the virtual screen can be controlled using the external electronic device.
- for example, the rotation of the virtual screen with respect to the pitch direction, the direction in which the head resting on the ground cannot be tilted back, may be displayed based on a signal received from the external electronic device, while the virtual screen may still be changed in the yaw or roll direction using the sensor of the virtual reality device.
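- a per-axis mix along these lines could look like the following sketch; the dict-based pose and the controller-supplied pitch are assumptions for illustration.

```python
def compose_orientation(sensor_pose, controller_pitch):
    """Yaw and roll follow the headset sensor, while pitch (hard to perform
    while lying down) follows the signal received from the external
    electronic device, e.g., accumulated from touch-wheel input."""
    return {
        "yaw": sensor_pose["yaw"],
        "roll": sensor_pose["roll"],
        "pitch": controller_pitch,
    }
```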
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
According to an embodiment of the present invention, a method of controlling a virtual reality device comprises the steps of: setting a first reference for a user's gaze direction on the basis of sensors; and displaying, to the user, a virtual screen projected in a virtual space on the basis of the first reference.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020190039561A KR102444924B1 (ko) | 2019-04-04 | 2019-04-04 | Virtual reality device and control method thereof |
| KR10-2019-0039561 | 2019-04-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020204594A1 (fr) | 2020-10-08 |
Family
ID=72667236
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2020/004447 Ceased WO2020204594A1 (fr) | 2019-04-04 | 2020-04-01 | Dispositif de réalité virtuelle et son procédé de commande |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR102444924B1 (fr) |
| WO (1) | WO2020204594A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024060959A1 (fr) * | 2022-09-20 | 2024-03-28 | Beijing Zitiao Network Technology Co., Ltd. | Method and apparatus for adjusting a viewing image in a virtual environment, storage medium, and device |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120283217A (zh) * | 2022-12-20 | 2025-07-08 | Samsung Electronics Co., Ltd. | Wearable device for displaying visual objects and method therefor |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20140144510A (ko) * | 2013-06-11 | 2014-12-19 | Samsung Electronics Co., Ltd. | Method for improving visibility using gaze tracking, storage medium, and electronic device |
| US20150130695A1 (en) * | 2012-06-29 | 2015-05-14 | Jianjun Gu | Camera based auto screen rotation |
| KR20170035958A (ko) * | 2014-07-25 | 2017-03-31 | Microsoft Technology Licensing, LLC | Gaze-based object placement within a virtual reality environment |
| US20170139474A1 (en) * | 2015-11-17 | 2017-05-18 | Samsung Electronics Co., Ltd. | Body position sensitive virtual reality |
| KR20180028358A (ko) * | 2016-09-08 | 2018-03-16 | LG Electronics Inc. | Terminal and control method thereof |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9219901B2 (en) * | 2012-06-19 | 2015-12-22 | Qualcomm Incorporated | Reactive user interface for head-mounted display |
| KR102524641B1 (ko) * | 2016-01-22 | 2023-04-21 | Samsung Electronics Co., Ltd. | HMD device and control method thereof |
- 2019-04-04: KR application KR1020190039561A filed; patent KR102444924B1 (ko), active
- 2020-04-01: PCT application PCT/KR2020/004447 filed; publication WO2020204594A1 (fr), ceased
Also Published As
| Publication number | Publication date |
|---|---|
| KR20200117444A (ko) | 2020-10-14 |
| KR102444924B1 (ko) | 2022-09-19 |
Similar Documents
| Publication | Title |
|---|---|
| WO2016032124A1 (fr) | Rotary device and electronic device having the same |
| WO2015183033A1 (fr) | Data processing method and electronic device thereof |
| CN108803896B (zh) | Method, apparatus, terminal, and storage medium for controlling a screen |
| WO2015046686A1 (fr) | Wearable display device and method for controlling a layer therein |
| WO2017039191A1 (fr) | Cradle for wireless charging and electronic device using same |
| CN110266930A (zh) | Fill-light method for image capture, mobile terminal, and light adjustment method |
| US20240353922A1 (en) | Devices, Methods, and Graphical User Interfaces for User Enrollment and Authentication |
| WO2019117566A1 (fr) | Electronic device and input control method thereof |
| CN111897429A (zh) | Image display method and apparatus, computer device, and storage medium |
| WO2018105955A2 (fr) | Object display method and electronic device therefor |
| WO2016144095A1 (fr) | Method and apparatus for controlling an electronic device in a communication system |
| WO2017150815A1 (fr) | Display brightness control method, electronic device, and computer-readable recording medium |
| WO2018117533A1 (fr) | Electronic device and control method thereof |
| WO2016048050A1 (fr) | Sensor data acquisition method and electronic device therefor |
| WO2018008888A1 (fr) | Electronic device and screen content method thereof |
| WO2022050638A1 (fr) | Method for changing display settings, and electronic device |
| WO2018084649A1 (fr) | Method and apparatus for acquiring information by capturing an eye |
| WO2021230568A1 (fr) | Electronic device for providing augmented reality service, and operating method thereof |
| CN109379539A (zh) | Screen fill-light method and terminal |
| WO2016080818A1 (fr) | Method for controlling an image display device, and apparatus supporting the same |
| WO2017164545A1 (fr) | Display device and method for controlling a display device |
| WO2020204594A1 (fr) | Virtual reality device and control method thereof |
| WO2019135550A1 (fr) | Electronic device for controlling image display based on scroll input, and method therefor |
| WO2018139786A1 (fr) | Electronic device and method for controlling an electronic device |
| WO2018124774A1 (fr) | Electronic device and control method therefor |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20782865; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20782865; Country of ref document: EP; Kind code of ref document: A1 |