WO2024071632A1 - Image display device for displaying a metaverse image and display method therefor - Google Patents
Image display device for displaying a metaverse image and display method therefor
- Publication number
- WO2024071632A1 (PCT/KR2023/011154)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- data
- display device
- metaverse
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
Definitions
- Embodiments of the present disclosure relate to an image display device and method for displaying a metaverse image.
- Metaverse can refer to a three-dimensional virtual world where social, economic, and cultural activities similar to reality occur. Users using the Metaverse can use their online alter ego, an avatar, to engage in social, economic, and cultural activities similar to those in reality.
- a metaverse image may be an image that displays a three-dimensional virtual world on the metaverse using a video display device.
- a user can use a controller to perform specific functions on the metaverse. For example, a user can move a controller to move an avatar included in a metaverse image displayed on a video display device.
- while the user is manipulating the controller, performing other functions simultaneously may be limited. For example, while moving the controller to move an avatar included in a metaverse image, simultaneously performing functions related to adjusting the metaverse image or setting the video display device may be limited.
- An image display device according to an embodiment may include a communication unit, a display, and at least one processor.
- the at least one processor may control the communication unit to transmit a notification signal to the server when it detects that a user is wearing a wearable device while connected to the server.
- the at least one processor may control the communication unit to receive control data generated in response to the notification signal from the server.
- the at least one processor may control the communication unit to transmit input data received from the wearable device to the server.
- the at least one processor may control the communication unit to receive output data obtained by processing the input data from the server.
- the at least one processor may control the display to display a metaverse image based on the control data and the output data.
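- The flow described above can be summarized, purely for illustration, in the following sketch; the class and method names (MetaverseDisplayController, send_to_server, and so on) are assumptions and do not appear in the disclosure:

```python
# Hypothetical sketch of the control flow above; none of these class or
# method names appear in the disclosure.

class MetaverseDisplayController:
    def __init__(self, comm, display):
        self.comm = comm          # communication unit
        self.display = display    # display

    def on_wearable_worn(self):
        """Runs when the user is detected wearing the wearable while the
        device is connected to the server."""
        # 1. Transmit a notification signal to the server.
        self.comm.send_to_server({"type": "notification", "event": "wearable_worn"})
        # 2. Receive control data generated in response to the notification.
        control_data = self.comm.receive_from_server()
        # 3. Transmit input data received from the wearable to the server.
        input_data = self.comm.receive_from_wearable()
        self.comm.send_to_server({"type": "input", "payload": input_data})
        # 4. Receive output data obtained by processing the input data.
        output_data = self.comm.receive_from_server()
        # 5. Display the metaverse image based on control and output data.
        self.display.render(control_data, output_data)
```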
- a display method of a video display device may include transmitting a notification signal to the server when detecting that a user is wearing a wearable device while connected to the server.
- the display method may include receiving control data generated in response to the notification signal from the server.
- the display method may include transmitting input data received from the wearable device to the server.
- the display method may include receiving output data obtained by processing the input data from the server.
- the display method may include displaying a metaverse image based on the control data and the output data.
- FIG. 1 is a diagram showing a system according to an embodiment of the present disclosure.
- Figure 2 is a block diagram showing an image display device according to an embodiment of the present disclosure.
- Figure 3 is a flowchart showing the operation of a video display device, a server, and a wearable device according to an embodiment of the present disclosure.
- Figure 4 is a flowchart showing a display method of an image display device according to an embodiment of the present disclosure.
- Figure 5 is a block diagram showing the transfer of data and signals between an image display device, a server, and a wearable device according to an embodiment of the present disclosure.
- FIG. 6 is a block diagram illustrating input transmission between a video display device, a server, a wearable device, and a controller according to an embodiment of the present disclosure.
- Figure 7 is a diagram showing a video display device according to an embodiment of the present disclosure displaying a menu related to a metaverse video.
- FIG. 8 is a diagram illustrating an image display device according to an embodiment of the present disclosure displaying a menu related to operation of the image display device.
- FIG. 9 is a diagram illustrating a video display device displaying content related to an object according to an embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating a video display device displaying an icon according to an embodiment of the present disclosure.
- Figure 11 is a block diagram showing the transfer of context information between an image display device, a server, and a wearable device according to an embodiment of the present disclosure.
- Figure 12 is a block diagram showing the transfer of a user's gaze direction and avatar movement input between a video display device, a server, a wearable device, and a controller according to an embodiment of the present disclosure.
- FIG. 13 is a block diagram showing separation of audio between a video display device, a server, a wearable device, and an external output device according to an embodiment of the present disclosure.
- Figure 14 is a block diagram showing priority assignment between a video display device, a server, a wearable device, and a controller according to an embodiment of the present disclosure.
- Figure 15 is a block diagram showing priority assignment between a video display device, a server, a wearable device, and a controller according to an embodiment of the present disclosure.
- the system may be a metaverse system that provides a metaverse.
- Metaverse can refer to a three-dimensional virtual world where social, economic, and cultural activities similar to reality occur. Users using the Metaverse can use their online alter ego, an avatar, to engage in social, economic, and cultural activities similar to those in reality.
- the system may include an image display device 110, a server 120, at least one wearable device 130, and at least one controller 140.
- the image display device 110 may display a metaverse image.
- a metaverse video may be an image displaying a three-dimensional virtual world on the metaverse.
- the video display device 110 may be a TV capable of displaying a metaverse video.
- the video display device 110 may be any of various types of video output devices capable of displaying metaverse images, such as a monitor, laptop computer, smartphone, PDA, tablet, e-reader, or digital broadcast terminal.
- the video display device 110 may receive data related to at least one of the structure and form of the metaverse video from the server 120.
- the video display device 110 may display a metaverse video based on data provided from the server 120.
- the image display device 110 may be connected to at least one wearable device 130.
- the image display device 110 may be connected to at least one wearable device 130 through short-distance communication or long-distance communication.
- the video display device 110 may receive a wearing notification signal related to whether the user is wearing at least one wearable device 130.
- the video display device 110 may transmit a notification signal notifying that the user is wearing at least one wearable device 130 to the server 120.
- the video display device 110 may receive input data based on a user's input to at least one wearable device 130.
- the video display device 110 may transmit input data based on the user's input to at least one wearable device 130 to the server 120.
- the video display device 110 may be connected to at least one controller 140.
- the video display device 110 may be connected to at least one controller 140 through short-distance communication or long-distance communication.
- the video display device 110 may receive input data based on the user's input to at least one controller 140.
- the video display device 110 may transmit the user's input for at least one controller 140 to the server 120.
- the server 120 may receive input data based on a user's input to at least one wearable device 130 from the video display device 110.
- the server 120 may receive input data based on the user's input to at least one controller 140 from the video display device 110.
- the server 120 may generate data related to at least one of the structure and form of the metaverse image based on input data received from the image display device 110. In one embodiment, the server 120 may generate data related to at least one of the structure and form of the metaverse image based on input data received from at least one wearable device 130 or input data received from the controller 140. The server 120 may transmit data related to at least one of the structure and form of the metaverse image to the video display device 110.
- At least one wearable device 130 may be a portable electronic device that a user can wear.
- the at least one wearable device 130 may be at least one of an earbud 131 and a smart watch 132.
- the present invention is not limited thereto, and the at least one wearable device 130 may be at least one of smart glass, smart band, smart belt, and smart anklet.
- At least one wearable device 130 may be connected to the image display device 110 through short-distance communication or long-distance communication. At least one wearable device 130 may transmit a wearing notification signal related to whether the user is wearing the at least one wearable device 130 to the video display device 110. At least one wearable device 130 may transmit input data based on a user's input to the at least one wearable device 130 to the image display device 110.
- At least one controller 140 may be a metaverse-dedicated input device that can be operated by a user.
- the at least one controller 140 may be at least one of a VR controller 141 and a remote controller 142.
- the present invention is not limited thereto, and the at least one controller 140 may be at least one of a head mounted device (HMD), a smart joystick, and a motion recognition input device.
- At least one controller 140 may be connected to the video display device 110 through short-distance communication or long-distance communication. At least one controller 140 may transmit input data based on a user's input to the at least one controller 140 to the video display device 110.
- FIG. 2 is a block diagram showing an image display device 110 according to an embodiment of the present disclosure.
- the image display device 110 may include a communication unit 210, a display 220, a memory 230, a sensor unit 240, an interface 250, a power unit 260, and a processor 270.
- the communication unit 210 may establish a wireless communication connection with the server 120.
- the communication unit 210 may establish a wireless communication connection with at least one wearable device 130.
- the communication unit 210 may support short-range wireless communication such as Bluetooth (BT) communication and NFC communication.
- the communication unit 210 may support long-distance wireless communication such as Wi-Fi communication and cellular communication.
- the communication unit 210 may transmit and receive signals and data with the wearable device 130.
- the communication unit 210 may receive a wearing notification signal from the wearable device 130.
- the communication unit 210 can transmit and receive signals and data with the server 120.
- the communication unit 210 may transmit signals and data received from the wearable device 130 to the server 120.
- the communication unit 210 may receive video data for constructing a metaverse video from the server 120.
- the display 220 may display a metaverse image.
- the display 220 may display a metaverse image corresponding to image data transmitted from at least one processor 270.
- the memory 230 may store an operating system (OS).
- the OS may include a program for displaying metaverse images.
- memory 230 may store sensing data.
- Sensing data may include data obtained from the sensor unit 240.
- the memory 230 may transmit the stored sensing data to at least one processor 270. Accordingly, the processor 270 can generate image data representing the metaverse image by reflecting the sensing data.
- the sensor unit 240 may include at least one of an image sensor, a distance sensor, a direction sensor, and an illumination sensor.
- the image sensor can detect objects surrounding the video display device 110.
- the image sensor may be a camera.
- the distance sensor can detect the distance between the video display device 110 and an object.
- the distance sensor may be a Time of Flight (ToF) sensor.
- the direction sensor can detect the direction in which the video display device 110 is facing.
- the direction sensor may include at least one of an acceleration sensor and a gyro sensor.
- the illuminance sensor can detect the ambient illuminance of the video display device 110.
- the interface 250 may include an input unit that receives user input and an output unit that provides feedback to the user.
- the input unit may include at least one of: a physical button included in the video display device 110; a physical key on a separate remote control device, such as a remote control, remotely connected to the video display device 110; a microphone that receives the user's voice input; a touch screen included in the display 220 of the video display device 110; and a graphic user interface (GUI) displayed on the metaverse image shown on the display 220.
- the output unit may include at least one of a speaker providing a voice output corresponding to the metaverse image to the user and a haptic feedback output device providing a tactile effect corresponding to the metaverse image to the user.
- the power unit 260 may supply power to the communication unit 210, display 220, memory 230, sensor unit 240, interface 250, and processor 270.
- the power supply unit 260 may be a battery included in the video display device 110.
- the power unit 260 may be a plug circuit disposed on the rear of the video display device 110 and connected to an external power source.
- the processor 270 may be electrically connected to the communication unit 210, the display 220, the memory 230, the sensor unit 240, the interface 250, and the power supply unit 260.
- the processor 270 may overall control the operations of the communication unit 210, display 220, memory 230, sensor unit 240, interface 250, and power unit 260.
- the processor 270 may be a control circuit that performs calculation and processing functions to control the overall operation of the communication unit 210, display 220, memory 230, sensor unit 240, interface 250, and power unit 260.
- the processor 270 may be an application processor (AP) or a microcontroller unit (MCU) included in the video display device 110.
- the processor 270 may control the communication unit 210 to transmit a notification signal to the server 120 when it detects that the user is wearing the wearable device 130 while connected to the server 120.
- the processor 270 may control the communication unit 210 to establish a wireless communication connection with the server 120.
- the processor 270 may control the communication unit 210 to transmit a notification signal to the server 120 in response to receiving a wearing notification signal.
- the processor 270 may control the communication unit 210 to receive control data generated in response to a notification signal from the server 120.
- the server 120 may generate control data for manipulating the metaverse based on the notification signal.
- the processor 270 may control the communication unit 210 to receive control data for manipulating the metaverse from the server 120.
- the processor 270 may control the communication unit 210 to transmit input data received from the wearable device 130 to the server 120.
- Wearable device 130 may receive user input.
- the wearable device 130 may transmit input data corresponding to the user input to the communication unit 210.
- the processor 270 may determine that input data corresponding to the user input is input related to the metaverse image.
- the processor 270 may control the communication unit 210 to transmit input data to the server 120.
- the processor 270 may control the communication unit 210 to receive output data obtained by processing input data from the server 120.
- the server 120 may generate output data to be reflected in the metaverse image based on the input data.
- the processor 270 may control the communication unit 210 to receive output data to be reflected in the metaverse image from the server 120.
- the processor 270 may control the display 220 to display a metaverse image based on control data and output data.
- FIG. 3 is a flowchart showing the operations of the video display device 110, the server 120, and the wearable device 130 according to an embodiment of the present disclosure.
- the video display device 110 may perform server connection in operation 310.
- the video display device 110 may transmit a connection request to the server 120.
- the video display device 110 may receive a connection response from the server 120.
- the server 120 may include a database (DB) that stores metaverse support data that can provide metaverse technology.
- the video display device 110 may transmit a signal confirming whether the server 120 can support the metaverse.
- the video display device 110 may receive a response from the server 120 indicating that it can support the metaverse.
- the video display device 110 can run an OS that displays metaverse images.
- the video display device 110 may detect wearing of the wearable in operation 320.
- the video display device 110 may establish a wireless communication connection with the wearable device 130.
- the wearable device 130 can detect whether the user is wearing it.
- the wearable device 130 may transmit a wearing notification signal to the video display device 110.
- the video display device 110 may transmit a notification signal in operation 330.
- the video display device 110 may receive a wearing notification signal from the wearable device 130.
- the video display device 110 may transmit a notification signal to the server 120 in response to receiving the wearing notification signal.
- the video display device 110 may transmit dual control mode entry information to the server 120 when transmitting a notification signal.
- the dual control mode may be a mode in which the video display device 110 is controlled using two input devices.
- the video display device 110 can use a controller such as a remote control and the wearable device 130 as two input devices.
- the video display device 110 may transmit information indicating that it has entered a dual control mode using the controller and the wearable device 130 to the server 120.
- the video display device 110 may receive control data in operation 340.
- Control data may include data for implementing dual control mode.
- the dual control mode may be a mode in which the video display device 110 can display a metaverse image based on the user input received from the wearable device 130 and the user input received from the controller 140.
- Data for implementing the dual control mode may be data required for the video display device 110 to independently transmit the input data received from the wearable device 130 and the input data received from the controller 140 to the server 120.
- the image display device 110 that has received the control data may receive input data related to the structure and form of the metaverse image from each of the wearable device 130 and the controller 140 and transmit it to the server 120.
- control data may include data for processing input data generated based on user input from each of the controller and the wearable device 130 in the dual control mode.
- the server 120 may receive a wearing notification signal and confirm that the user is wearing the wearable device 130.
- the server 120 may receive dual control mode entry information and confirm that the video display device 110 has entered a mode capable of dually receiving input data from the controller and the wearable device 130.
- the server 120 may transmit control data to the video display device 110.
- the video display device 110 may receive control data from the server 120.
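- Illustrative only: one possible shape for the notification signal and the dual control mode entry information of operations 330 and 340 is sketched below; the field names are assumptions, not part of the disclosure:

```python
# Illustrative only: one possible shape for the notification signal and the
# dual control mode entry information of operations 330 and 340. The field
# names are assumptions, not part of the disclosure.

def build_notification(wearable_id: str, controller_id: str) -> dict:
    return {
        "type": "wearing_notification",
        "wearable": wearable_id,
        # Dual control mode: two input devices (a controller and a
        # wearable) will independently feed input data to the server.
        "dual_control_mode": True,
        "input_devices": [controller_id, wearable_id],
    }
```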
- the video display device 110 may receive input data in operation 350.
- Wearable device 130 may receive user input.
- the wearable device 130 may receive a user's touch input.
- the wearable device 130 may receive at least one of a user's press input, a user's tap input, and a user's double tap input.
- the wearable device 130 may transmit input data corresponding to the user input to the video display device 110.
- the video display device 110 may transmit input data to the server 120 in operation 360.
- the video display device 110 may receive input data transmitted by the wearable device 130.
- the video display device 110 may analyze the received input data based on the control data.
- the video display device 110 may confirm that the input data received from the wearable device 130 corresponds to the user input input to the wearable device 130 in the dual control mode.
- the image display device 110 may confirm that input data received from the wearable device 130 corresponds to user input for manipulating the metaverse image.
- the video display device 110 may transmit the received input data to the server 120.
- the video display device 110 may receive output data from the server 120 in operation 370.
- Output data may be data obtained by processing the input data to perform functions related to the metaverse image.
- output data may be data that is processed to visually express user input corresponding to the input data in a metaverse image.
- output data may be data that is processed to execute additional functions corresponding to input data.
- The additional function may be a function for manipulating the image display device 110. The additional function may be a function independent of the metaverse image.
- the server 120 may generate output data corresponding to the received input data.
- the server 120 may transmit the generated output data to the video display device 110.
- the video display device 110 may receive output data from the server 120.
- the image display device 110 may display a metaverse image in operation 380.
- the video display device 110 may display a metaverse video based on output data.
- the image display device 110 may display a metaverse image that visually represents the user input corresponding to the input data.
- the image display device 110 may display a metaverse image and simultaneously execute additional functions corresponding to input data.
- the video display device 110 may transmit output data in operation 390.
- the video display device 110 may transmit output data to the wearable device 130.
- the wearable device 130 may receive output data.
- the wearable device 130 may perform a function corresponding to the received output data.
- the wearable device 130 may provide feedback corresponding to user input to the user.
- Operation 390 may be optional.
- the image display device 110 may transmit output data to the wearable device 130 when the wearable device 130 needs to provide feedback.
- FIG. 4 is a flowchart showing a display method of the image display device 110 according to an embodiment of the present disclosure.
- When the video display device 110 according to one embodiment detects in operation 410 that the user is wearing the wearable device 130 while connected to the server 120, it may transmit a notification signal to the server 120.
- the processor 270 of the video display device 110 may control the communication unit 210 to establish a wireless communication connection with the server 120 that provides the metaverse.
- the processor 270 may control the communication unit 210 to establish a wireless communication connection with the wearable device 130.
- the communication unit 210 may receive a wearing notification signal from the wearable device 130 indicating that the user is wearing the wearable device 130 .
- the processor 270 may control the communication unit 210 to transmit the notification signal to the server 120 in response to receiving the wearing notification signal.
- the video display device 110 may receive control data generated in response to a notification signal from the server 120 in operation 420.
- the server 120 may generate control data in response to the notification signal.
- Control data may be data for providing a metaverse.
- the control data may include data for implementing a dual control mode capable of displaying a metaverse image based on the user input received from the wearable device 130 and the user input received from the controller 140.
- the server 120 may transmit the generated control data to the video display device 110.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive control data transmitted from the server 120.
- the video display device 110 may transmit input data received from the wearable device 130 to the server 120 in operation 430.
- Wearable device 130 may receive user input.
- the wearable device 130 may transmit input data corresponding to the user input to the video display device 110.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive input data transmitted from the wearable device 130.
- the processor 270 may determine that the received input data is input data related to the metaverse.
- the processor 270 may determine that the received input data is input data for controlling the display of a metaverse image.
- the processor 270 may determine that the received input data is input data for displaying a metaverse image and executing an additional function on the video display device 110 at the same time.
- the processor 270 may control the communication unit 210 to transmit the received input data to the server 120.
- the video display device 110 may receive output data obtained by processing the input data from the server 120 in operation 440.
- Server 120 may generate output data based on received input data.
- Output data may be data obtained by processing the input data to perform functions related to the metaverse image.
- the output data may include image data for displaying a metaverse image corresponding to the input data and function data for performing a function corresponding to the input data.
- the image data may be data for displaying at least one object included in the metaverse image and an avatar, the user's online alter ego, included in the metaverse image.
- function data may be data that executes a function that corresponds to input data and changes the configuration or form of the metaverse image.
- the function data may be data that corresponds to input data and executes a function related to the operation of the video display device 110.
- the server 120 may transmit the generated output data to the video display device 110.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data.
- the image display device 110 may display a metaverse image based on control data and output data in operation 450.
- a metaverse video may be an image displaying a three-dimensional virtual world on the metaverse.
- the metaverse image may include at least one object and an avatar, which is the user's alter ego.
- a metaverse image may include a background constituting the virtual world of the metaverse, a plurality of objects existing in the virtual world, and an avatar approaching one of the plurality of objects.
- the processor 270 of the video display device 110 may generate image data to display a metaverse image based on control data and output data.
- the processor 270 may transmit the input data received from each of the wearable device 130 and the controller 140 to the server 120 based on the control data, so that the server 120 generates the output data.
- the processor 270 may generate image data that displays a metaverse image in which an avatar performs an action according to the input data based on the output data.
- the processor 270 may perform a function corresponding to the input data while the image display device 110 displays a metaverse image based on the output data.
- the processor 270 may generate image data that displays a metaverse image and an additional menu based on control data and output data.
- the processor 270 may control the display 220 to display a metaverse image corresponding to image data.
- the display 220 may display a metaverse image corresponding to image data.
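- As a hedged sketch of operations 440 and 450, the output data described above could be split into image data and function data and applied as follows; the field and method names are illustrative assumptions:

```python
# A sketch splitting the output data of operation 440 into image data and
# function data, as described above; field and method names are assumptions.

def apply_output_data(output: dict, display, device) -> None:
    image_data = output.get("image_data")
    if image_data is not None:
        # Objects and the avatar that the metaverse image should display.
        display.render(image_data)
    function_data = output.get("function_data")
    if function_data is not None:
        # A function that changes the configuration or form of the image,
        # or one related to the operation of the video display device.
        device.execute(function_data)
```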
- FIG. 5 is a block diagram illustrating the transfer of data and signals between the video display device 110, the server 120, and the wearable device 130 according to an embodiment of the present disclosure.
- the wearable device 130 may transmit information indicating the wearing state to the video display device 110.
- the wearable device 130 may transmit a wearing notification signal to the video display device 110.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive a wearing notification signal.
- the video display device 110 may transmit mode change information to the server 120.
- the mode change information may be information indicating that a dual control mode using the controller and the wearable device 130 has been entered.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit mode change information to the server 120.
- the video display device 110 may receive output data related to the composition of the metaverse video from the server 120.
- Output data related to the composition of the metaverse image may be data for displaying the metaverse image.
- output data related to the composition of the metaverse image may be data related to the display of a background, at least one object, and an avatar included in the metaverse image.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data related to the configuration of the metaverse video.
- the video display device 110 may receive output data related to infrastructure management from the server 120.
- Output data related to infrastructure management may be data for managing input data based on user input in dual control mode.
- output data related to infrastructure management may be data defining how to process user input input to each of the controller and the wearable device 130 in dual control mode.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data related to infrastructure management.
- the video display device 110 may receive output data related to function performance from the server 120.
- Output data related to function performance may be data for performing additional functions corresponding to user input.
- output data related to function performance may be data for displaying an additional menu corresponding to user input in a pop-up format together with the metaverse image.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data related to function performance.
- the image display device 110 may transmit feedback data to the wearable device 130.
- Feedback data may be data that allows the wearable device 130 to provide feedback to the user.
- the feedback data may be voice data that allows the wearable device 130 to provide voice output corresponding to the metaverse image to the user.
- the feedback data may be a signal that allows the wearable device 130 to provide haptic feedback corresponding to the metaverse image to the user.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit feedback data.
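- The categories of output data exchanged in FIG. 5 could be dispatched as in the following sketch; the category tags and method names are assumptions made for illustration:

```python
# A sketch dispatching the FIG. 5 output data categories (composition,
# infrastructure management, function performance) and the feedback data
# sent back to the wearable; the tags and method names are assumptions.

def handle_server_message(message: dict, display) -> None:
    kind = message["kind"]
    if kind == "composition":
        # Background, objects, and avatar composing the metaverse image.
        display.render_scene(message["payload"])
    elif kind == "infrastructure":
        # How to process input from the controller and the wearable
        # while in dual control mode.
        display.update_input_rules(message["payload"])
    elif kind == "function":
        # Additional menu shown in pop-up form with the metaverse image.
        display.show_popup_menu(message["payload"])

def send_feedback(wearable, feedback: dict) -> None:
    # Voice or haptic feedback corresponding to the metaverse image.
    wearable.send(feedback)
```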
- The process by which the video display device 110 receives user input from each of the controller 140 and the wearable device 130 and classifies its attributes according to the subject transmitting the user input will be described with reference to FIG. 6.
- FIG. 6 is a block diagram illustrating input transmission between the video display device 110, the server 120, the wearable device 130, and the controller 140 according to an embodiment of the present disclosure.
- the video display device 110 may receive a first input from the controller 140.
- the controller 140 may be a metaverse-dedicated input device that can be operated by the user.
- the at least one controller 140 may be at least one of a VR controller 141 and a remote controller 142.
- the controller 140 may receive a first input from the user.
- the first input may include at least one of a button press input pressing an operation button included in the controller 140, a manipulation input manipulating the direction keys included in the controller 140, and a swing input moving the controller 140 in space.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive the first input.
- the image display device 110 may receive input data from the wearable device 130.
- the wearable device 130 may receive user input from the user.
- Input data may be generated based on user input to the wearable device 130.
- the user input may be a press input to the wearable device 130, a tap input to the wearable device 130, or a double tap input to the wearable device 130.
- when the wearable device 130 is a smart watch 132, the user input may be a touch input to the wearable device 130, a drag input to the wearable device 130, or a bezel rotation input to the wearable device 130.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive the input data.
- the video display device 110 may process the first input as having the first attribute and transmit it to the server 120.
- the processor 270 of the video display device 110 may determine that the first input has the first attribute.
- the first attribute may include control attributes related to the avatar of the metaverse image.
- the first attribute may include a control attribute that moves the position of the avatar.
- the processor 270 may control the communication unit 210 to transmit the first input having the first attribute to the server 120.
- the video display device 110 may process input data as having a second attribute that is distinct from the first attribute and transmit it to the server 120.
- the processor 270 of the video display device 110 may determine that the input data has the second attribute.
- the second attribute may include control attributes related to additional functions of the video display device 110.
- the second property may include a property that controls display of a menu related to the display of the metaverse image.
- the second attribute may include an attribute that controls a menu related to an additional function of the video display device 110 to be displayed simultaneously with the metaverse video.
- the processor 270 may control the communication unit 210 to transmit input data having the second attribute to the server 120.
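- The attribute classification of FIG. 6 can be illustrated with the following sketch, assuming illustrative tag values for the first and second attributes:

```python
# A sketch of the FIG. 6 attribute classification: controller input carries
# a first attribute (avatar control), wearable input a second attribute
# (device/menu control). The tag values are illustrative assumptions.

FIRST_ATTRIBUTE = "avatar_control"    # e.g. moving the avatar's position
SECOND_ATTRIBUTE = "device_control"   # e.g. menus and additional functions

def classify_and_forward(source: str, payload: dict, comm) -> None:
    if source == "controller":
        attribute = FIRST_ATTRIBUTE
    elif source == "wearable":
        attribute = SECOND_ATTRIBUTE
    else:
        raise ValueError(f"unknown input source: {source}")
    comm.send_to_server({"attribute": attribute, "payload": payload})
```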
- FIG. 7 is a diagram showing the video display device 110 according to an embodiment of the present disclosure displaying a menu related to a metaverse video.
- the image display device 110 may display a metaverse image.
- a metaverse video may be a video expressing a metaverse in which an avatar, the alter ego of a user, exists in a 3D virtual reality.
- the metaverse image may include a background of virtual reality, at least one object in virtual reality, and an avatar.
- a metaverse video may be an image that displays a historical site or exhibition hall in 3D virtual reality.
- a metaverse video may be a virtual reality video with avatars of multiple users surrounding a statue.
- users can be provided with the experience of actually being inside a historic site or exhibition hall.
- a user can check information related to a statue at a historic site or exhibition hall.
- a user can use an external input device such as a microphone or keyboard to chat about the statue in the historic site or exhibition hall with other users, who can be identified through their avatars.
- the image display device 110 may display a first menu 710 related to a metaverse image.
- the processor 270 of the image display device 110 may control the display 220 to display the first menu 710 related to the metaverse image.
- the processor 270 may control the display 220 to simultaneously display the metaverse image and the first menu 710.
- the processor 270 may control the display 220 to display the first menu 710 in a pop-up form on one edge of the metaverse image.
- when the wearable device 130 is an earbud 131, the processor 270 may control the display 220 to display the first menu 710 related to the metaverse image upon receiving a tap input from the earbud 131.
- when the wearable device 130 is a smart watch 132, the processor 270 may control the display 220 to display the first menu 710 related to the metaverse image upon receiving a touch input from the smart watch 132.
- the first menu 710 may be related to controlling which metaverse image to display, how to display the metaverse image, and what to display with the image.
- the first menu 710 may include one or more first sub-menus 711, 712, 713, and 714.
- the first submenus 711, 712, 713, and 714 may include a video management submenu 711, a chat management submenu 712, a previous exhibition hall submenu 713, and a next exhibition hall submenu 714.
- the user can adjust the brightness of the metaverse video, the resolution of the metaverse video, and the color of the metaverse video through the video management submenu 711.
- the user can select whether to display the chat visually together with the metaverse image through the chat management submenu 712.
- the user can change the metaverse image so that it represents the previous exhibition hall through the previous exhibition hall submenu 713.
- the user can change the metaverse image so that it represents the next exhibition hall through the next exhibition hall submenu 714.
- the video display device 110 may display a selection screen of one or more first sub-menus 711, 712, 713, and 714 included in the first menu 710.
- the user can select one of the first submenus 711, 712, 713, and 714.
- when receiving a press input from the earbud 131, the processor 270 may display a screen in which one of the first submenus 711, 712, 713, and 714, for example the video management submenu 711, is selected.
- when receiving a touch input from the smart watch 132, the processor 270 may display a screen in which one of the first submenus 711, 712, 713, and 714, for example the video management submenu 711, is selected.
- the screen on which the video management sub-menu 711 is selected may be a screen that displays the video management sub-menu 711 differently from the other sub-menus 712, 713, and 714.
- for example, the screen on which the video management sub-menu 711 is selected may be a screen in which the video management sub-menu 711 is displayed in a different color from the other sub-menus 712, 713, and 714.
- the image display device 110 may display to the user the currently selected submenu among the one or more first sub-menus 711, 712, 713, and 714 included in the first menu 710, in response to the user input to the wearable device 130.
- the video display device 110 may change the selected submenu among the first submenus 711, 712, 713, and 714.
- the video management submenu 711 may be selected among the first submenus 711, 712, 713, and 714.
- the processor 270 may receive a tap input from the earbud 131 and change the selected sub-menu to one lower sub-menu.
- the processor 270 may receive a double tap input from the earbud 131 and change the selected sub-menu to the sub-menu one above.
- for example, when the video management submenu 711 is selected and the processor 270 receives a tap input from the earbud 131, the processor 270 may change the selected submenu to the chat management submenu 712, which is one submenu below.
- for example, when the chat management submenu 712 is selected and the processor 270 receives a double tap input from the earbud 131, the processor 270 may change the selected submenu to the video management submenu 711, which is one submenu above.
- the processor 270 may receive at least one of a drag input and a bezel rotation input from the smart watch 132 to change the selected submenu. Accordingly, the image display device 110 may change the selected sub-menu in response to the user input to the wearable device 130.
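- The gesture-based submenu navigation described above can be illustrated with the following sketch, assuming the mapping given in the text (a tap moves the selection one submenu down, a double tap one submenu up):

```python
# A sketch of the submenu navigation above, assuming the mapping in the
# text: an earbud tap moves the selection one submenu down, a double tap
# one submenu up. Menu labels mirror the first menu 710.

SUBMENUS = ["video management", "chat management",
            "previous exhibition hall", "next exhibition hall"]

def navigate(selected: int, gesture: str) -> int:
    """Return the new selected submenu index for a wearable gesture."""
    if gesture == "tap":            # one submenu below
        return min(selected + 1, len(SUBMENUS) - 1)
    if gesture == "double_tap":     # one submenu above
        return max(selected - 1, 0)
    return selected                 # other gestures keep the selection
```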
- the image display device 110 may display a change in the metaverse image corresponding to a first submenu selected from one or more first submenus 711, 712, 713, and 714.
- the processor 270 of the video display device 110 may control the display 220 to display a changed metaverse image corresponding to the selected first submenu.
- the processor 270 may control the display 220 to display the metaverse image with the brightness, resolution, and color of the metaverse image adjusted through the video management submenu 711.
- the processor 270 may control the display 220 to visually display the chat together with the metaverse image through the chat management submenu 712.
- the processor 270 may control the display 220 so that the metaverse image represents the previous exhibition hall through the previous exhibition hall submenu 713.
- the processor 270 may control the display 220 so that the metaverse image displays the next exhibition hall through the next exhibition hall sub-menu 714.
- the processor 270 may control the display 220 to display changes in the metaverse image corresponding to the selected submenu in response to input data based on the user's input to the wearable device 130.
- FIG. 8 is a diagram illustrating the image display device 110 according to an embodiment of the present disclosure displaying a menu related to operation of the image display device 110.
- the image display device 110 may display a second menu 810 related to operating the image display device 110.
- the processor 270 of the video display device 110 may control the display 220 to display a second menu 810 related to operation of the video display device 110.
- the processor 270 may control the display 220 to simultaneously display the metaverse image and the second menu 810.
- the processor 270 may control the display 220 to display the second menu 810 in a pop-up form on one edge of the metaverse image.
- the processor 270 may control the display 220 to display the second menu 810 related to the operation of the video display device 110.
- when the wearable device 130 is an earbud 131, the processor 270 may control the display 220 to display the second menu 810 related to the operation of the video display device 110 upon receiving a press input from the earbud 131.
- when the wearable device 130 is a smart watch 132, the processor 270 may control the display 220 to display the second menu 810 related to the operation of the video display device 110 upon receiving a button press input from the smart watch 132.
- the second menu 810 may be related to controlling how to set the output of the video display device 110, how to set up external devices connected to the video display device 110, and how to provide functions matched to the user on the video display device 110.
- the second menu 810 may include one or more second sub-menus 811, 812, 813, and 814.
- the second submenus 811, 812, 813, and 814 may include a volume control submenu 811, a device management submenu 812, a user settings submenu 813, and a convenience function submenu 814.
- the user can adjust the volume of the voice output from the video display device 110 and whether to mute the sound through the volume control submenu 811.
- the user can add, manage, and delete the wearable device 130 and controller 140 connected to the video display device 110 through the device management submenu 812.
- the user can check and change information about the user set on the video display device 110 through the user settings submenu 813.
- the user can set whether to apply the convenience functions provided by the video display device 110 through the convenience function submenu 814.
- the video display device 110 may display a selection screen of one or more second sub-menus 811, 812, 813, and 814 included in the second menu 810.
- the user can select one of the second submenus 811, 812, 813, and 814.
- when receiving a press input from the earbud 131, the processor 270 may display a screen in which one of the second submenus 811, 812, 813, and 814, for example the volume control submenu 811, is selected.
- when receiving a touch input from the smart watch 132, the processor 270 may display a screen in which one of the second submenus 811, 812, 813, and 814, for example the volume control submenu 811, is selected.
- the screen on which the volume control sub-menu 811 is selected may be a screen that displays the volume control sub-menu 811 differently from the other sub-menus 812, 813, and 814.
- for example, the screen on which the volume control sub-menu 811 is selected may be a screen in which the volume control sub-menu 811 is displayed in a different color from the other sub-menus 812, 813, and 814.
- the image display device 110 may display to the user the currently selected submenu among the one or more second sub-menus 811, 812, 813, and 814 included in the second menu 810, in response to the user input to the wearable device 130.
- the video display device 110 may change the selected submenu among the second submenus 811, 812, 813, and 814.
- the volume control submenu 811 may be selected from the second submenus 811, 812, 813, and 814.
- the processor 270 may receive a tap input from the earbud 131 and change the selected sub-menu to one lower sub-menu.
- the processor 270 may receive a double tap input from the earbud 131 and change the selected sub-menu to a sub-menu one higher.
- for example, when the volume control submenu 811 is selected and the processor 270 receives a tap input from the earbud 131, the processor 270 may change the selected submenu to the device management submenu 812, which is one submenu below.
- for example, when the device management submenu 812 is selected and the processor 270 receives a double tap input from the earbud 131, the processor 270 may change the selected submenu to the volume control submenu 811, which is one submenu above.
- the processor 270 may receive at least one of a drag input and a bezel rotation input from the smart watch 132 to change the selected submenu. Accordingly, the image display device 110 may change the selected sub-menu in response to the user input to the wearable device 130.
- the video display device 110 may display a function corresponding to a second submenu selected from one or more second submenus 811, 812, 813, and 814.
- the processor 270 of the video display device 110 may control the display 220 to display a function corresponding to the selected second sub-menu.
- the processor 270 may control the display 220 to display the volume of the voice output from the video display device 110 and whether it is muted, using numbers, bars, circular icons, and the like, through the volume control submenu 811.
- the processor 270 may control the display 220 to display the results of adding, managing, and deleting the wearable device 130 and the controller 140 connected to the video display device 110 through the device management submenu 812.
- the processor 270 may control the display 220 to display information about a user set on the video display device 110 through the user setting submenu 813.
- the processor 270 may control the display 220 to display the application status of the convenience function provided by the video display device 110 through the convenience function submenu 814.
- the processor 270 may control the display 220 to display a function corresponding to the submenu selected in response to input data based on the user's input to the wearable device 130.
- FIG. 9 is a diagram showing the image display device 110 displaying content related to an object according to an embodiment of the present disclosure.
- the processor 270 of the image display device 110 may check whether at least one object included in the metaverse image is focused.
- a metaverse image may include at least one object and at least one avatar.
- at least one object included in the metaverse image may be a statue displayed at the historic site or exhibition hall.
- the processor 270 may determine that the statue is focused.
- for example, the processor 270 may determine that the statue is focused in response to a pointer or cursor being positioned on the displayed statue.
- the processor 270 of the video display device 110 may determine whether the input data is related to at least one object based on the verification result. The processor 270 may determine whether an input received while at least one object is focused is an input unrelated to the at least one object or an input related to the at least one object.
- an input unrelated to the at least one object may be an input that manipulates the avatar.
- inputs for manipulating the avatar may include inputs for moving the avatar, inputs for rotating the avatar's gaze direction, and inputs for setting the avatar's facial expression.
- the processor 270 may control the display to perform a function corresponding to the input.
- the processor 270 of the image display device 110 may control the display 220 to display content related to at least one object in the metaverse image based on the determination result, the control data, and the output data.
- the processor 270 may load content data related to at least one object.
- content data may be image data corresponding to at least one of a front view, a perspective view, and an enlarged view of at least one object.
- content data may be image data that displays at least one of text, icons, and videos related to at least one object.
- the processor 270 may control the display 220 to display a metaverse image corresponding to content data.
- the processor 270 displays a metaverse image including a pop-up window located at one edge showing an enlarged view of the front of the statue.
- the display 220 can be controlled.
- the processor 270 may control the display 220 to display the name of the statue in a pop-up window.
- FIG. 10 is a diagram showing the image display device 110 displaying an icon according to an embodiment of the present disclosure.
- the image display device 110 may display a third menu 1010 related to a metaverse image.
- the processor 270 of the image display device 110 may control the display 220 to display the third menu 1010 related to the metaverse image.
- the processor 270 may control the display 220 to simultaneously display the metaverse image and the third menu 1010.
- the third menu 1010 may include one or more third sub-menus 1011, 1012, and 1013.
- the third menu 1010 includes an object photography submenu 1011 for photographing the front view of at least one object included in the metaverse image, a user call submenu 1012 for calling another user, and a current display ( 220) may include a screen capture submenu 1013 that captures the entire screen of the metaverse video displayed.
- the video display device 110 may display at least one icon 1021, 1022, and 1023. At least one icon 1021, 1022, and 1023 may be a graphic user interface (GUI) for executing at least one function related to a metaverse image.
- GUI graphic user interface
- the processor 270 of the video display device 110 may control the display 220 to display at least one icon 1021, 1022, and 1023.
- the processor 270 may control the display 220 to display at least one icon 1021, 1022, and 1023 that executes a function corresponding to the third menu 1010.
- the processor 270 may generate at least one icon 1021, 1022, and 1023 to correspond to each of the functions performed by the third submenus 1011, 1012, and 1013 included in the third menu 1010. there is.
- the processor 270 may display a first icon 1021 corresponding to the object photographing function of the object photographing submenu 1011 and a second icon 1022 corresponding to the user call function of the user call submenu 1012.
- a third icon 1023 corresponding to the screen capture function of the screen capture submenu 1013 can be created.
- the processor 270 may control the display 220 to display the first icon 1021, the second icon 1022, and the third icon 1023.
- the processor 270 may control the display 220 to display at least one icon 1021, 1022, and 1023 excluding the third menu 1010.
- the processor 270 controls the display 220 to display at least one icon 1021, 1022, and 1023 that executes a function corresponding to the third menu 1010 to provide the user with information included in the third menu 1010. Functions can be displayed visually.
- the processor 270 may execute a function corresponding to the icon in which the user input is input in response to a user input for one of the at least one icons 1021, 1022, and 1023.
- the processor 270 may execute an object photographing function in response to a user input for the first icon 1021 among the at least one icons 1021, 1022, and 1023.
- the processor 270 may execute a user call function in response to a user input for the second icon 1022 among the at least one icons 1021, 1022, and 1023.
- the processor 270 may execute a screen capture function in response to a user input for the third icon 1023 among the at least one icons 1021, 1022, and 1023.
- FIG. 11 is a block diagram illustrating the transfer of context information between the video display device 110, the server 120, and the wearable device 130 according to an embodiment of the present disclosure.
- the video display device 110 may receive context information from the wearable device 130.
- the wearable device 130 may obtain context information.
- Context information may be surrounding environment information detected by the wearable device 130.
- Context information may be information related to the user's physical action detected by the wearable device 130.
- the context information includes at least one of the user's location detected by the wearable device 130, the user's movement direction detected by the wearable device 130, and the user's gaze direction detected by the wearable device 130. can do.
- the wearable device 130 is an earbud 131
- the earbud 131 can detect the user's gaze direction using sensing data received from a sensor module.
- the wearable device 130 may transmit the acquired context information.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive context information.
- the video display device 110 may check surrounding environment information based on the received context information.
- the processor 270 of the video display device 110 may check the user's physical action based on the received context information.
- the processor 270 of the video display device 110 may check the user's location, the user's movement direction, and the user's gaze direction based on the received context information.
- the processor 270 may check the user's gaze direction based on context information received from the earbud 131.
- the video display device 110 may transmit the confirmed user's location, the user's movement direction, and the user's gaze direction to the server 120.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit the user's location, the user's movement direction, and the user's gaze direction to the server 120.
- the server 120 may generate output data for constructing a metaverse image based on the received user's location, the user's movement direction, and the user's gaze direction. For example, the server 120 may set the direction the avatar is facing in the avatar configuration data included in the output data based on the user's gaze direction.
- the server 120 may transmit output data for constructing a metaverse image to the video display device 110.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data.
- the image display device 110 may display a metaverse image based on output data.
- the output data may be data generated by the server 120 based on context information received from the wearable device 130.
- the processor 270 of the image display device 110 may control the display 220 to display the metaverse image based on the context information received from the wearable device 130.
- FIG. 12 is a block diagram illustrating the transmission of a user's gaze direction and avatar movement input between the video display device 110, the server 120, the wearable device 130, and the controller 140 according to an embodiment of the present disclosure. am.
- the controller 140 may receive an input from the user to move the avatar. For example, if the controller 140 is the VR controller 141, the user can move the VR controller 141 in space, and the VR controller 141 detects movement in space and uses input to move the avatar. You can receive it. For example, if the controller 140 is a TV remote control type remote controller 142, the user can operate the direction buttons of the remote controller 142, and the remote controller 142 operates the direction buttons through the avatar. It can be received as a moving input. The controller 140 may transmit the received input to move the avatar to the video display device 110.
- the wearable device 130 may receive an input that moves the user's gaze direction.
- the wearable device 130 is an earbud 131
- the user can turn his head or change the direction of gaze while wearing the earbud 131.
- the earbud 131 may include a gyro sensor that detects the rotation direction, rotation angle, and tilt degree.
- the earbud 131 may include an acceleration sensor that detects the number of rotational movements and the acceleration of the rotational movement.
- the earbud 131 may receive the user's gaze direction using a gyro sensor and an acceleration sensor.
- the earbud 131 may transmit the received user's gaze direction to the video display device 110.
- the video display device 110 may receive an avatar movement input from the controller 140.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive an avatar movement input.
- the image display device 110 may receive the user's gaze direction from the wearable device 130.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive the user's gaze direction.
- the video display device 110 may transmit avatar movement data and gaze direction data to the server 120.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit avatar movement data and gaze direction data.
- the processor 270 may separate avatar movement data and gaze direction data and transmit them as data with different properties.
- the video display device 110 may receive output data related to the avatar's movement and output data related to the avatar's gaze direction.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive output data related to the avatar's movement and output data related to the avatar's gaze direction.
- the image display device 110 may display a metaverse image based on output data.
- the processor 270 of the video display device 110 may control the display 220 to display an avatar included in the metaverse video based on output data.
- the video display device 110 may set the avatar's movement and the direction the avatar is facing in response to different inputs.
- the processor 270 of the video display device 110 may set the movement of the avatar to correspond to the input from the controller 140.
- the processor 270 may set the avatar's gaze direction to correspond to the input from the wearable device 130. Accordingly, the processor 270 can accurately set the avatar's movement and the avatar's gaze direction using two different input means.
- the image display device 110 may set the direction in which the avatar included in the metaverse image faces based on the gaze direction.
- the processor 270 of the video display device 110 may control the display 220 to set the direction in which the avatar faces based on the user's gaze direction.
- the direction the avatar is facing can change to correspond to a change in the direction the user is facing.
- the user can change the direction the avatar is facing by intuitively changing the direction of gaze while wearing the wearable device 130. Accordingly, the user can easily set the direction the avatar is facing.
- FIG. 13 is a block diagram illustrating the separation of audio between the video display device 110, the server 120, the wearable device 130, and the external output device 150 according to an embodiment of the present disclosure.
- the video display device 110 may receive audio data from the server 120.
- Voice data may be data corresponding to voice related to the metaverse video.
- Voice data may be data corresponding to voice to be output through an external output device 150 such as the wearable device 130 and a speaker.
- voice data includes background voice data corresponding to background music (BGM) played like a metaverse video, overall conversation data corresponding to a user talking to all other users, and a user talking only to a specific user. It may contain individual voice data corresponding to what is being said.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive background voice data, overall conversation data, and individual voice data from the server 120.
- the video display device 110 may transmit background audio and audio data related to the entire conversation to the external output device 150.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit background voice and voice data related to the entire conversation to the external output device 150.
- the processor 270 may control the communication unit 210 to transmit background music played like a metaverse video and voice data related to the user's conversation with all other users to the external output device 150.
- the external output device 150 can output background music played along with a metaverse video and the user's conversation with all other users. Accordingly, many people can easily check the background music played along with the metaverse video and the user's conversations with other users.
- the video display device 110 may transmit audio excluding background audio and individual audio data related to the conversation excluding the entire conversation to the wearable device 130.
- the processor 270 of the video display device 110 may control the communication unit 210 to transmit individual voice data to the wearable device 130.
- the processor 270 may control the communication unit 210 to transmit individual voice data corresponding to the user speaking only to a specific user to the wearable device 130.
- the wearable device 130 can output an individual voice that the user speaks only to a specific user. Accordingly, users can check conversations between specific users through the wearable device 130 without exposing them to other people.
- FIG. 14 is a block diagram showing priority assignment between the video display device 110, the server 120, the wearable device 130, and the controller 140 according to an embodiment of the present disclosure.
- FIG. 15 is a block diagram illustrating priority assignment between the video display device 110, the server 120, the wearable device 130, and the controller 140 according to an embodiment of the present disclosure.
- FIG. 14 illustrates a case where priority is given to the wearable device 130.
- Figure 15 shows a case where priority is given to the controller 140.
- the video display device 110 may receive an execution input for a common function that can be selectively input from the wearable device 130 and the controller 140 from at least one of the wearable device 130 and the controller 140.
- the common function may be a function that can receive input from the wearable device 130 or the controller 140.
- a common function may be the start function of a metaverse video.
- a common function may be a function to end a metaverse video.
- the processor 270 of the video display device 110 may control the communication unit 210 to receive an execution input for a common function from at least one of the wearable device 130 and the controller 140.
- the processor 270 may control the communication unit 210 to receive an execution input for a common function from the wearable device 130 as shown in FIG. 14 .
- the processor 270 may control the communication unit 210 to receive execution input for a common function from the controller 140 as shown in FIG. 15 .
- the video display device 110 may transmit data related to common functions to the server 120.
- the processor of the video display device 110 controls the communication unit 210 to transmit data related to a common function to the server 120 in response to an execution input received from at least one of the wearable device 130 and the controller 140. You can.
- the video display device 110 determines the priority related to the common function based on the result of receiving the execution input for the common function, which of the wearable device 130 and the controller 140 receives the execution input first. It can be assigned to a preferred device.
- the processor 270 of the video display device 110 may control the communication unit 210 to grant priority related to common functions to the preferred device among the wearable device 130 and the controller 140 according to the result of receiving the execution input. .
- the processor 270 may control the communication unit 210 to give priority to the common function to the wearable device 130.
- the processor 270 may control the communication unit 210 to give priority to the common function to the controller 140.
- the video display device 110 may control common functions using a preferred device. For example, when the processor 270 of the video display device 110 grants common function priority to the wearable device 130 as shown in FIG. 14, the processor 270 determines the wearable device 130 as the preferred device and selects the wearable device 130 as the preferred device. You can use it to control common functions. For example, when the processor 270 of the video display device 110 grants common function priority to the controller 140 as shown in FIG. 15, the processor 270 determines the controller 140 as the preferred device and uses the controller 140 to Functions can be controlled.
- the video display device 110 may discard execution input from devices other than the preferred device. For example, when the processor 270 of the video display device 110 grants common function priority to the wearable device 130 as shown in FIG. 14, the processor 270 may ignore execution input related to the common function input from the controller 140. . For example, when the processor 270 of the video display device 110 grants common function priority to the controller 140 as shown in FIG. 15, the processor 270 may ignore execution input related to the common function input from the wearable device 130. . Accordingly, the video display device 110 can prevent duplication and conflict of execution inputs related to common functions when the wearable device 130 and the controller 140 are used simultaneously.
- An image display device and a display method display a metaverse image corresponding to a user input received from a wearable device separate from the controller, thereby creating a metaverse environment with improved user input convenience and efficiency. can be provided.
- a video display device includes a communication unit; display; and at least one processor, wherein the at least one processor controls the communication unit to transmit a notification signal to the server when detecting that the user is wearing the wearable device while connected to the server, and responds to the notification signal.
- Control the communication unit to receive control data generated in response from the server, control the communication unit to transmit input data received from the wearable device to the server, and output data processed by the input data from the server.
- the communication unit may be controlled to receive information, and the display may be controlled to display a metaverse image based on the control data and the output data.
- the notification signal may notify that the video display device has entered a dual control mode.
- control data may include data for implementing a dual control mode capable of displaying a metaverse image based on user input received from each of the wearable device and the controller.
- the output data may include image data for displaying the metaverse image corresponding to the input data and function data for performing a function corresponding to the input data.
- the at least one processor controls the communication unit to receive a first input from a controller, controls the communication unit to process the first input as having a first attribute and transmit it to the server, And the communication unit may be controlled to process the input data received from the wearable device as having a second attribute distinct from the first attribute and transmit the input data to the server.
- the at least one processor displays a first menu related to the metaverse image based on the control data and the output data, a selection screen of one or more first submenus included in the first menu, and The display may be controlled to display changes in the metaverse image corresponding to a selected first submenu among the one or more first submenus.
- the at least one processor displays a second menu related to operation of the video display device based on the control data and the output data, and a selection screen of one or more second submenus included in the second menu. , and the display can be controlled to display a function corresponding to a selected second submenu among the one or more second submenus.
- the input data is generated based on user input to the wearable device, and the user input includes a press input to the wearable device, a tap input to the wearable device, and a double press input to the wearable device.
- the user input includes a press input to the wearable device, a tap input to the wearable device, and a double press input to the wearable device.
- the input data is generated based on a user input to the wearable device, and the user input includes a touch input to the wearable device, a drag input to the wearable device, and a bezel rotation input to the wearable device. , and may include at least one of a button press input of the wearable device.
- the at least one processor determines whether the at least one object included in the metaverse image is focused, and based on the confirmation result, the input data corresponds to the at least one object. and control the display to display content related to the at least one object in the metaverse image based on the determination result, the control data, and the output data.
- the at least one processor may control the display to display at least one icon for executing at least one function related to the metaverse image.
- the at least one processor controls the display to display the metaverse image based on context information received from the wearable device, and the context information is the user's information detected by the wearable device. It may include at least one of location, direction of movement of the user, and direction of gaze of the user.
- the at least one processor may control the display to set the direction in which the avatar included in the metaverse image faces based on the gaze direction.
- the at least one processor controls the communication unit to transmit voice data related to the background voice and the entire conversation to an external output device, and to transmit voice data other than the background voice and voice data related to the entire conversation to an external output device.
- the communication unit can be controlled to transmit individual voice data to the wearable device.
- the at least one processor controls the communication unit to receive an execution input for a common function that can be selectively input from the wearable device and the controller from at least one of the wearable device and the controller, and executes the common function. Based on the result of receiving the execution input for, priority related to the common function is given to the preferred device that received the execution input first among the wearable device and the controller, and the common function is performed using the preferred device. control, and can discard the execution input from devices other than the preferred device.
- a display method of a video display device includes transmitting a notification signal to the server when detecting that a user is wearing a wearable device while connected to a server; Receiving control data generated in response to the notification signal from the server; Transmitting input data received from the wearable device to the server; Receiving output data obtained by processing the input data from the server; and displaying a metaverse image based on the control data and the output data.
- receiving a first input from a controller Processing the first input as having a first attribute and transmitting it to the server; and processing the input data as having a second attribute that is distinct from the first attribute and transmitting the input data to the server.
- the operation of displaying the metaverse image includes selecting a first menu related to the metaverse image and a plurality of first submenus included in the first menu based on the control data and the output data. It may include an operation of displaying a screen and a change in the metaverse image corresponding to a first submenu selected from among the plurality of first submenus.
- the operation of displaying the metaverse image includes a second menu related to operation of the image display device based on the control data and the output data, and a plurality of second submenus included in the second menu.
- the method may include controlling the display to display a selection screen and a function corresponding to a second submenu selected from among the plurality of second submenus.
- Computer-readable media may include program instructions, data files, data structures, etc., singly or in combination.
- Program instructions recorded on the medium may be specially designed and configured for this disclosure or may be known and available to those skilled in the art of computer software.
- Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical media such as CD-ROMs and DVDs, and magnetic media such as floptical disks.
- Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter, etc.
- Computer-readable media can be any available media that can be accessed by a computer and includes both volatile and non-volatile media, removable and non-removable media. Additionally, computer-readable media may include both computer storage media and communication media.
- Computer storage media includes both volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
- Communication media typically includes computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave, or other transmission mechanism, and includes any information delivery medium. Additionally, some embodiments of the present disclosure may be implemented as a computer program or computer program product that includes instructions executable by a computer, such as a computer program executed by a computer.
- a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
- 'non-transitory storage medium' simply means that it is a tangible device and does not contain signals (e.g. electromagnetic waves). This term is used to refer to cases where data is semi-permanently stored in a storage medium and temporary storage media. It does not distinguish between cases where it is stored as .
- a 'non-transitory storage medium' may include a buffer where data is temporarily stored.
- Computer program products are commodities and can be traded between sellers and buyers.
- a computer program product may be distributed in the form of a machine-readable storage medium (e.g. compact disc read only memory (CD-ROM)) or through an application store or between two user devices (e.g. smartphones). It may be distributed in person or online (e.g., downloaded or uploaded). In the case of online distribution, at least a portion of the computer program product (e.g., a downloadable app) is stored on a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server. It can be temporarily stored or created temporarily.
- a machine-readable storage medium such as the memory of a manufacturer's server, an application store's server, or a relay server. It can be temporarily stored or created temporarily.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Un dispositif d'affichage d'image pour afficher une image de métavers et un procédé d'affichage associé peuvent être fournis. Plus particulièrement, le dispositif d'affichage d'image peut comprendre : une unité de communication; un dispositif d'affichage; et au moins un processeur, le ou les processeurs commandant l'unité de communication pour transmettre un signal de notification à un serveur lors de la détection d'un utilisateur portant un dispositif habitronique tout en étant connecté au serveur, commandant l'unité de communication pour recevoir des données de commande générées en réponse au signal de notification provenant du serveur, commandant l'unité de communication pour transmettre des données d'entrée reçues du dispositif habitronique au serveur, commandant l'unité de communication pour recevoir des données de sortie obtenues en traitant les données d'entrée provenant du serveur, et commandant le dispositif d'affichage pour afficher une image de métavers sur la base des données de commande et des données de sortie.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR20220125411 | 2022-09-30 | ||
| KR10-2022-0125411 | 2022-09-30 | ||
| KR1020220168099A KR20240045947A (ko) | 2022-09-30 | 2022-12-05 | Ca메타버스 영상을 표시하는 영상 표시 장치 및 그 표시 방법 |
| KR10-2022-0168099 | 2022-12-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024071632A1 true WO2024071632A1 (fr) | 2024-04-04 |
Family
ID=90478288
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2023/011154 Ceased WO2024071632A1 (fr) | 2022-09-30 | 2023-07-31 | Dispositif d'affichage d'image pour afficher une image de métavers et procédé d'affichage associé |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024071632A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20180028165A (ko) * | 2016-09-08 | 2018-03-16 | 삼성전자주식회사 | 컨텐츠 재생 방법 및 이를 지원하는 전자 장치 |
| KR20190004806A (ko) * | 2016-06-03 | 2019-01-14 | 페이스북 테크놀로지스, 엘엘씨 | 헤드 마운트 디스플레이의 얼굴 센서를 사용한 얼굴과 안구 추적 및 얼굴 애니메이션 |
| KR20190023636A (ko) * | 2017-08-29 | 2019-03-08 | 삼성전자주식회사 | 복수의 컨트롤러를 이용한 전자 장치의 디스플레이 제어 방법 및 그 장치 |
| JP2020503574A (ja) * | 2016-09-30 | 2020-01-30 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイによって提供される仮想現実環境への見物人フィードバックコンテンツの配信 |
| KR20200023858A (ko) * | 2018-08-27 | 2020-03-06 | 삼성전자주식회사 | 가상 현실에서 정보를 제공하는 전자 장치 및 방법 |
-
2023
- 2023-07-31 WO PCT/KR2023/011154 patent/WO2024071632A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20190004806A (ko) * | 2016-06-03 | 2019-01-14 | 페이스북 테크놀로지스, 엘엘씨 | 헤드 마운트 디스플레이의 얼굴 센서를 사용한 얼굴과 안구 추적 및 얼굴 애니메이션 |
| KR20180028165A (ko) * | 2016-09-08 | 2018-03-16 | 삼성전자주식회사 | 컨텐츠 재생 방법 및 이를 지원하는 전자 장치 |
| JP2020503574A (ja) * | 2016-09-30 | 2020-01-30 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイによって提供される仮想現実環境への見物人フィードバックコンテンツの配信 |
| KR20190023636A (ko) * | 2017-08-29 | 2019-03-08 | 삼성전자주식회사 | 복수의 컨트롤러를 이용한 전자 장치의 디스플레이 제어 방법 및 그 장치 |
| KR20200023858A (ko) * | 2018-08-27 | 2020-03-06 | 삼성전자주식회사 | 가상 현실에서 정보를 제공하는 전자 장치 및 방법 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014088310A1 (fr) | Dispositif d'affichage et son procédé de commande | |
| WO2021206415A1 (fr) | Dispositif électronique pour communiquer en réalité augmentée et procédé associé | |
| WO2014157846A1 (fr) | Terminal portatif, prothèse auditive et procédé d'indication de positions de sources sonores dans le terminal portatif | |
| WO2014157897A1 (fr) | Procédé et dispositif permettant de commuter des tâches | |
| WO2014193101A1 (fr) | Procédé et appareil permettant de commander un écran d'affichage à l'aide d'informations environnementales | |
| WO2014025108A1 (fr) | Visiocasque pour ajuster une sortie audio et une sortie vidéo l'une par rapport à l'autre et son procédé de commande | |
| WO2015030488A1 (fr) | Procédé d'affichage multiple, support de stockage et dispositif électronique | |
| WO2015002380A1 (fr) | Dispositif électronique et procédé de commande de fenêtres multiples dans le dispositif électronique | |
| WO2017171204A1 (fr) | Dispositif terminal et procédé de commande correspondant | |
| WO2014204239A1 (fr) | Dispositif électronique destiné à l'affichage d'un écran de verrouillage et procédé de commande associé | |
| WO2016093506A1 (fr) | Terminal mobile et procédé de commande associé | |
| WO2010143843A2 (fr) | Procédé et dispositif de radiodiffusion d'un contenu | |
| EP2923232A1 (fr) | Visiocasque et son procédé de commande | |
| EP3105666A1 (fr) | Terminal utilisateur et procédé d'affichage associé | |
| WO2018093005A1 (fr) | Terminal mobile et procédé de commande associé | |
| WO2020159302A1 (fr) | Dispositif électronique permettant d'assurer diverses fonctions dans un environnement de réalité augmentée et procédé de fonctionnement associé | |
| WO2014126283A1 (fr) | Procédé d'exploitation de terminal portatif | |
| WO2018030567A1 (fr) | Hmd et son procédé de commande | |
| WO2016093633A1 (fr) | Procédé et dispositif d'affichage de contenu | |
| WO2016076561A2 (fr) | Procédé de commande de dispositif et dispositif pour la mise en oeuvre de ce procédé | |
| WO2020153766A1 (fr) | Procédé d'affichage d'informations visuelles associées à une entrée vocale et dispositif électronique prenant en charge ledit procédé | |
| WO2020013651A1 (fr) | Dispositif électronique, et procédé pour la transmission d'un contenu du dispositif électronique | |
| WO2016122153A1 (fr) | Appareil d'affichage et son procédé de commande | |
| WO2020075926A1 (fr) | Dispositif mobile et procédé de commande de dispositif mobile | |
| WO2017126709A1 (fr) | Terminal mobile et procédé de commande associé |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23872756 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 23872756 Country of ref document: EP Kind code of ref document: A1 |