
WO2024205075A1 - Electronic device, method, and non-transitory computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device - Google Patents

Electronic device, method, and non-transitory computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device

Info

Publication number
WO2024205075A1
WO2024205075A1 (PCT/KR2024/003104)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
wearable device
rendering
electronic devices
manager
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/KR2024/003104
Other languages
English (en)
Korean (ko)
Inventor
정준식
김보성
이진석
김성오
김주영
김지현
김지호
염동현
우현택
임동건
최성수
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020230121072A (published as KR20240147402A)
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2024205075A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/368Image reproducers using viewer tracking for two or more viewers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof

Definitions

  • Embodiments of the present disclosure relate to an electronic device, a method, and a non-transitory computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device.
  • electronic devices are being developed that provide extended reality (XR) services that display computer-generated information in conjunction with external objects in the real world.
  • the electronic devices can provide extended reality services to users by using virtual objects corresponding to the users.
  • an electronic device may include: a communication circuit; at least one processor including a processing circuit; and a memory storing instructions and including one or more storage media.
  • the instructions when individually or collectively executed by the at least one processor, may be configured to cause the electronic device to receive a request for displaying at least a portion of a three-dimensional virtual space from a plurality of external electronic devices connected through the communication circuit.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device to identify, through a tracking identifier, viewpoints of the plurality of external electronic devices using input information received from the plurality of external electronic devices through the communication circuit.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device to identify, through a display manager, screens representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device, via a rendering manager, to determine resource information for obtaining at least a portion of each of the screens.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device, based on the determined resource information, to transmit, using the communication circuit, rendering data representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, to each of the plurality of external electronic devices.
  • a non-transitory computer-readable storage medium may store one or more programs including instructions.
  • the instructions when individually or collectively executed by at least one processor of an electronic device, may cause the electronic device to receive a request from a plurality of external electronic devices connected via a communication circuit to display at least a portion of a three-dimensional virtual space.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device to identify, via a tracking identifier, viewpoints of the plurality of external electronic devices using input information received from the plurality of external electronic devices via the communication circuit.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device to identify, via a display manager, screens representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device, via a rendering manager, to determine resource information for obtaining at least a portion of each of the screens.
  • the instructions when individually or collectively executed by the at least one processor, may cause the electronic device, based on the determined resource information, to transmit, using the communication circuit, rendering data representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, to each of the plurality of external electronic devices.
  • the method may include an operation of receiving a request for displaying at least a portion of a three-dimensional virtual space from a plurality of external electronic devices connected through a communication circuit.
  • the method may include an operation of identifying, through a tracking identifier, viewpoints of the plurality of external electronic devices using input information received from the plurality of external electronic devices through the communication circuit.
  • the method may include an operation of identifying, through a display manager, screens representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints.
  • the method may include an operation of determining, through a rendering manager, resource information for obtaining at least a portion of each of the screens.
  • the method may include an operation of transmitting, to each of the plurality of external electronic devices, rendering data representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices based on the determined resource information, using the communication circuit.
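  • As an illustration only, the following Python sketch models the request-to-transmission flow described above (request reception, viewpoint identification through a tracking identifier, screen identification through a display manager, resource-information determination through a rendering manager, and transmission of rendering data). All class names, field names, and the placeholder resource policy are assumptions made for this sketch and are not part of the disclosure.

```python
from dataclasses import dataclass

# Hypothetical data carriers; field names are illustrative only.
@dataclass
class InputInfo:
    device_id: str
    position: tuple       # location information
    orientation: tuple    # pose information (e.g., a quaternion)

@dataclass
class RenderingData:
    device_id: str
    payload: dict         # pre-rendered parts plus parts the device renders itself

class TrackingIdentifier:
    def identify_viewpoint(self, info: InputInfo) -> dict:
        # Derive the viewpoint (pose in the virtual space) from the input information.
        return {"position": info.position, "orientation": info.orientation}

class DisplayManager:
    def identify_screen(self, viewpoint: dict, virtual_space: dict) -> dict:
        # Select the portion of the 3D virtual space visible from the viewpoint.
        return {"viewpoint": viewpoint, "objects": virtual_space.get("objects", [])}

class RenderingManager:
    def determine_resources(self, screen: dict) -> dict:
        # Placeholder policy: render half of the visible objects on the server and
        # leave the other half to the external device.
        objects = screen["objects"]
        half = len(objects) // 2
        return {"server_rendered": objects[:half], "device_rendered": objects[half:]}

def handle_display_requests(requests, virtual_space, send):
    """Process display requests received from multiple external electronic devices."""
    tracker, display_mgr, render_mgr = TrackingIdentifier(), DisplayManager(), RenderingManager()
    for info in requests:                                     # one request per external device
        viewpoint = tracker.identify_viewpoint(info)          # tracking identifier
        screen = display_mgr.identify_screen(viewpoint, virtual_space)  # display manager
        resources = render_mgr.determine_resources(screen)    # rendering manager
        send(RenderingData(info.device_id, resources))        # transmit via the communication circuit

# Example usage with two hypothetical wearable devices.
if __name__ == "__main__":
    space = {"objects": ["avatar", "table", "window", "lamp"]}
    requests = [InputInfo("hmd-1", (0, 0, 0), (0, 0, 0, 1)),
                InputInfo("hmd-2", (1, 0, 0), (0, 0, 0, 1))]
    handle_display_requests(requests, space, send=print)
```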
  • FIG. 1 illustrates an example of a screen corresponding to a viewpoint of a wearable device that an electronic device transmits to the wearable device according to one embodiment.
  • FIG. 2A illustrates an example of a perspective view of a wearable device, according to one embodiment.
  • FIG. 2B illustrates an example of one or more hardware elements positioned within a wearable device, according to one embodiment.
  • FIGS. 3A and 3B illustrate an example of an appearance of a wearable device according to one embodiment.
  • FIG. 4A illustrates an example of a block diagram of a wearable device according to one embodiment.
  • FIG. 4B illustrates an example of a block diagram of an electronic device according to one embodiment.
  • FIG. 5 illustrates an example of a block diagram showing programs included in an electronic device and a wearable device according to one embodiment.
  • FIG. 6 illustrates an example of a flowchart illustrating the operation of an electronic device according to one embodiment.
  • FIG. 7 illustrates an example of an operation for a wearable device to establish a communication link with an electronic device according to one embodiment.
  • FIG. 8 illustrates an example of rendering data transmitted by an electronic device to a wearable device according to one embodiment.
  • FIG. 9 illustrates an example of layer information transmitted by an electronic device to a wearable device according to one embodiment.
  • FIG. 10 illustrates an example of an operation in which an electronic device and an external electronic device transmit data to a wearable device according to one embodiment.
  • FIG. 11 illustrates an example of a flowchart showing the operation of a wearable device according to one embodiment.
  • the electronic devices according to various embodiments disclosed in this document may be devices of various forms.
  • the electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliance devices.
  • the electronic devices according to embodiments of this document are not limited to the above-described devices.
  • first, second, or first or second may be used merely to distinguish one component from another, and do not limit the components in any other respect (e.g., importance or order).
  • when a component (e.g., a first component) is referred to, with or without the term "functionally" or "communicatively", as coupled or connected to another component (e.g., a second component), the component may be coupled with the other component directly or via a third component.
  • module used in various embodiments of this document may include a unit implemented in hardware, software or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit, for example.
  • a module may be an integrally configured component or a minimum unit of the component or a part thereof that performs one or more functions.
  • a module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of the present document may be implemented as software (e.g., a program) including one or more instructions stored in a storage medium (e.g., an internal memory or an external memory) that can be read by a machine.
  • a processor of the machine may call at least one instruction among the one or more instructions stored in the storage medium and execute it. This enables the machine to operate to perform at least one function according to the at least one instruction called.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • the machine-readable storage medium may be provided in the form of a non-transitory storage medium.
  • non-transitory only means that the storage medium is a tangible device and does not include a signal (e.g., an electromagnetic wave), and this term does not distinguish between cases where data is stored semi-permanently and cases where it is stored temporarily in the storage medium.
  • the method according to various embodiments disclosed in the present document may be provided as included in a computer program product.
  • the computer program product may be traded between a seller and a buyer as a commodity.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed online (e.g., downloaded or uploaded) via an application store (e.g., Play Store™) or directly between two user devices (e.g., smart phones).
  • at least a part of the computer program product may be at least temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or an intermediary server.
  • each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately arranged in other components.
  • one or more of the components or operations of the above-described components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, the multiple components (e.g., modules or programs) may be integrated into a single component.
  • the integrated component may perform one or more functions of each of the multiple components identically or similarly to those performed by the corresponding component of the multiple components before the integration.
  • the operations performed by the module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order, omitted, or one or more other operations may be added.
  • FIG. 1 illustrates an example of a screen corresponding to a viewpoint of a wearable device that an electronic device transmits to a wearable device according to one embodiment.
  • the electronic device (101) may be referred to as a terminal (or user terminal).
  • the terminal may include, for example, a personal computer (PC) such as a laptop and a desktop.
  • the terminal may include, for example, a smartphone, a smartpad, and/or a tablet PC.
  • the terminal may include a smart accessory such as a smartwatch and/or a head-mounted device (HMD).
  • An electronic device (101) may, while establishing a communication link with an external electronic device (e.g., a wearable device (103)), log in to an XR service based on the execution of an XR application by linking with the external electronic device.
  • the electronic device (101) may receive input information for using an XR service from the external electronic device and/or the electronic device (101), and transmit a screen representing at least a portion of a virtual space (100) corresponding to the input information to the external electronic device. From the perspective of the electronic device (101) transmitting rendering data to be used for rendering by the external electronic device (e.g., a wearable device (103)), the electronic device (101) may be referred to as a server.
  • the wearable device (103) of FIG. 1 may include a head-mounted display (HMD) that can be worn on the head of a user (105).
  • there may be one or more (i.e., multiple) wearable devices (103).
  • the wearable device (103) may include a camera (e.g., the camera (425) of FIG. 4A to be described later) positioned toward the front of the user (105) when the device is worn by the user (105).
  • the front of the user (105) may include the head of the user (105) and/or the direction in which the gaze of the user (105) is directed.
  • the wearable device (103) according to one embodiment may include a sensor for identifying the motion of the head of the user (105) and/or the wearable device (103) when the device is worn by the user (105).
  • the wearable device (103) can identify the posture of the wearable device (103) based on the data of the sensor.
  • the wearable device (103) can control a camera and/or a sensor.
  • a UI displayed by the wearable device (103) can be related to a metaverse service provided by the wearable device (103) and/or a server connected to the wearable device (103), and/or to a notification service.
  • the wearable device (103) may perform functions related to augmented reality (AR) and/or mixed reality (MR). While the user (105) wears the wearable device (103), the wearable device (103) may include at least one lens positioned adjacent to the eyes of the user (105). Ambient light passing through the lens of the wearable device (103) may be combined (or mixed) with light emitted from a display of the wearable device (103) (e.g., display (420) of FIG. 4A). A display area of the display may be formed within the lens through which the ambient light passes. Since the wearable device (103) combines the ambient light and the light emitted from the display, the user (105) can see a mixed image of a real object recognized by the ambient light and a virtual object formed by the light emitted from the display.
  • the wearable device (103) may perform functions related to video see-through (VST) and/or virtual reality (VR).
  • the wearable device (103) may include a housing that covers the eyes of the user (105).
  • the wearable device (103) may include a display disposed on a first side facing the eye (e.g., the first side (310) of FIG. 3A).
  • the wearable device (103) may include a camera (e.g., cameras 260-7, 260-8, 260-9, 260-10, 260-11, 260-12 of FIG. 3B) disposed on a second surface (e.g., the second surface (320) of FIG. 3A) opposite to the first surface.
  • the wearable device (103) may obtain frame images including ambient light.
  • the wearable device (103) may output the frame images to a display disposed on the first surface, thereby allowing a user (105) to recognize the ambient light through the display.
  • a display area of the display disposed on the first surface may be formed by one or more pixels included in the display.
  • the wearable device (103) can synthesize a virtual object within frame images output through the display to allow the user (105) to recognize the virtual object together with a real object recognized by ambient light.
  • the wearable device (103) may provide a user experience based on mixed reality (MR) by using a virtual space.
  • the wearable device (103) may recognize an external space (e.g., an actual space) in which the wearable device (103) is included, and generate a virtual space mapped to the external space.
  • the spatial recognition performed by the wearable device (103) may include simultaneous localization and mapping (SLAM) and/or spatial mapping (e.g., scene understanding (SU)).
  • the wearable device (103) may be referred to as an external electronic device from the perspective of being located outside the electronic device (101).
  • the electronic device (101) may receive input information from the wearable device (103) while being connected to the wearable device (103) through a communication circuit.
  • the wearable device (103) may identify a 6 degrees of freedom pose (6 DOF pose) of the wearable device (103) using a sensor of the wearable device (103).
  • the wearable device (103) may identify input information based on the 6 degrees of freedom pose.
  • the input information may include at least one of location information, viewpoint information, or pose information of the wearable device (103) identified by one or more programs included in the wearable device (103) (e.g., software to be described later in FIG. 4A).
  • the input information may include an input for controlling an avatar (105-1) representing the user (105) within the virtual space (100).
  • the wearable device (103) may receive input information corresponding to a voice signal for controlling the avatar (105-1) by using a microphone of the wearable device (103).
  • the wearable device (103) may recognize a user's gesture through a camera (e.g., cameras (260-7, 260-8, 260-9, 260-10, 260-11, 260-12) of FIG. 3B) and receive input information for controlling the avatar (105-1) based on the gesture.
  • the wearable device (103) may receive the input information by using an external controller (not shown) for controlling a function of the wearable device (103).
  • the wearable device (103) may transmit the input information to the electronic device (101) through a communication circuit.
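  • As a hedged illustration of the input information described above, the sketch below assembles a message that a wearable device could send over the communication circuit. The field names and the JSON encoding are assumptions; the disclosure only states that the input information may include location, viewpoint, and pose information derived from a 6 DoF pose, as well as inputs (e.g., voice or gestures) for controlling the avatar.

```python
import json
import time

def build_input_information(position, orientation, gaze_direction=None, gesture=None):
    """Assemble an illustrative input-information message.

    The field names are assumptions for this sketch; they only mirror the kinds of
    information named in the description (location, viewpoint, pose, avatar control).
    """
    return {
        "timestamp": time.time(),
        "pose_6dof": {
            "position": position,          # location information (x, y, z)
            "orientation": orientation,    # pose information (quaternion x, y, z, w)
        },
        "viewpoint": {"gaze_direction": gaze_direction},   # viewpoint information
        "avatar_control": {"gesture": gesture},            # e.g., a gesture controlling the avatar
    }

# The wearable device would serialize this and send it through the communication circuit.
message = build_input_information((0.1, 1.6, -0.3), (0.0, 0.0, 0.0, 1.0),
                                  gaze_direction=(0.0, 0.0, -1.0), gesture="pinch")
print(json.dumps(message, indent=2))
```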
  • An electronic device (101) can identify a viewpoint (110) of the wearable device (103) using input information received from the wearable device (103).
  • the viewpoint (110) of the wearable device (103) can correspond to a field of view (FoV) of a user (105) (or a camera of the wearable device (103)).
  • the electronic device (101) can identify the viewpoint (110) of the wearable device (103) from the input information through a tracking identifier (e.g., the tracking identifier (570) of FIG. 5).
  • the electronic device (101) can map the viewpoint (110-1) of an avatar (105-1) in a virtual space and the viewpoint (110) of the wearable device (103) based on identifying the viewpoint (110) of the wearable device (103).
  • the electronic device (101) can identify at least a portion (115-1) of the virtual space (100) corresponding to the viewpoint (110-1) of the avatar (105-1).
  • the electronic device (101) can transmit a screen (115) representing at least a portion (115-1) of the virtual space (100) to the wearable device (103).
  • Referring to FIG. 1, the wearable device (103) may obtain one or more screens by using one or more rendering data sets to display the screen (115) on the display. Since the wearable device (103) includes one or more displays corresponding to both eyes of the user (105), the screen (115) may be provided to the user (105) by displaying one or more screens on each of the one or more displays.
  • the one or more rendering data sets for generating the one or more screens may include similar information. For example, a first screen of the one or more screens may be displayed on a display corresponding to the left eye of the user (105), and a second screen of the one or more screens may be displayed on another display corresponding to the right eye of the user (105).
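  • The per-eye screens mentioned above come from rendering data sets that contain similar information and differ mainly by a small horizontal eye offset. The sketch below illustrates this with an assumed interpupillary distance and an assumed head orientation; both values are illustrative only.

```python
def per_eye_view_positions(head_position, ipd_m=0.063):
    """Derive illustrative left/right eye positions from a single head position.

    ipd_m is an assumed interpupillary distance in meters, and the head's right
    axis is assumed to be +x for simplicity; real systems use the tracked pose.
    """
    x, y, z = head_position
    half = ipd_m / 2.0
    left_eye = (x - half, y, z)    # viewpoint for the display facing the left eye
    right_eye = (x + half, y, z)   # viewpoint for the display facing the right eye
    return left_eye, right_eye

left, right = per_eye_view_positions((0.0, 1.6, 0.0))
print("left eye:", left, "right eye:", right)
```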
  • the electronic device (101) can identify a screen (115) corresponding to the viewpoint (110) and representing at least a portion (115-1) of the virtual space (100) through the display manager based on identifying the viewpoint (110) of the wearable device (103).
  • the electronic device (101) may identify layers for forming the screen (115), virtual objects in a virtual space, and/or the number of screens to be transmitted to the wearable device (103) in order to transmit the screen (115) to the wearable device (103).
  • the electronic device (101) may identify elements constituting the screen (115) through a rendering manager (e.g., the rendering manager (510-1) of FIG. 5).
  • the elements may include one or more layers included in the screen (115) and/or one or more virtual objects included in the screen (115).
  • the electronic device (101) may obtain resource information representing one of the elements through the rendering manager.
  • the electronic device (101) may transmit rendering data representing at least a portion of the screen (115) to be used for rendering by the wearable device (103) to the wearable device (103).
  • the electronic device (101) can render some of the layers included in the screen (115) based on the resource information.
  • the electronic device (101) can obtain rendering data for rendering the other part.
  • the electronic device (101) can transmit, to the wearable device (103), a part of the screen (115) on which some of the layers are rendered, together with the rendering data.
  • the electronic device (101) can perform rendering on some of the one or more virtual objects included in the screen (115) based on the resource information.
  • the electronic device (101) can transmit, to the wearable device (103), rendering data for rendering some of the one or more virtual objects on which rendering has been performed, and other parts of the one or more virtual objects.
  • the electronic device (101) can render a screen corresponding to some of the multiple displays of the wearable device (103), and transmit rendering data for rendering a screen corresponding to the other part, to the wearable device (103).
  • the electronic device (101), while connected to the wearable device (103), can process at least a portion of the rendering data to be used for rendering by the wearable device (103). For example, since the processing speed of the electronic device (101) for obtaining the rendering data is faster than that of the wearable device (103), the electronic device (101) can process a portion of the task of the wearable device (103).
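  • As a minimal sketch of how the resource-information decision could split work between the faster electronic device (101) and the wearable device (103), the example below assigns layers to the wearable device until an assumed per-frame budget is exhausted and leaves the rest to the electronic device. The cost model, budget, and layer names are assumptions, not the disclosed policy.

```python
def split_rendering_work(layers, device_budget_ms):
    """Partition screen layers between the electronic device (server) and the wearable device.

    Each layer is (name, estimated_render_cost_ms). Layers that fit within the
    wearable device's assumed per-frame budget are left for it to render; the
    rest are rendered by the electronic device and sent as already-rendered data.
    """
    device_layers, server_layers, used = [], [], 0.0
    # Cheapest layers first, so the wearable device keeps the lightest work.
    for name, cost in sorted(layers, key=lambda item: item[1]):
        if used + cost <= device_budget_ms:
            device_layers.append(name)
            used += cost
        else:
            server_layers.append(name)
    return {"rendered_by_electronic_device": server_layers,
            "rendered_by_wearable_device": device_layers}

example_layers = [("background", 6.0), ("avatar", 4.0), ("ui_overlay", 1.0), ("effects", 5.0)]
print(split_rendering_work(example_layers, device_budget_ms=5.0))
```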
  • FIG. 2A illustrates an example of a perspective view of a wearable device according to one embodiment.
  • the wearable device (103) may have a form of glasses that can be worn on a body part (e.g., head) of a user.
  • the wearable device (103) of FIGS. 2A and 2B may be an example of the wearable device (103) of FIG. 1.
  • the wearable device (103) may include a head-mounted display (HMD).
  • the housing of the wearable device (103) may include a flexible material, such as rubber and/or silicone, that is configured to fit closely to a portion of the user's head (e.g., a portion of the face surrounding both eyes).
  • the housing of the wearable device (103) may include one or more straps that can be wrapped around the user's head, and/or one or more temples that can be detachably attached to ears of the head.
  • a wearable device (103) may include at least one display (250) and a frame (200) supporting at least one display (250).
  • the wearable device (103) may be worn on a part of a user's body.
  • the wearable device (103) may provide augmented reality (AR), virtual reality (VR), or mixed reality (MR) that combines augmented reality and virtual reality to a user wearing the wearable device (103).
  • AR augmented reality
  • VR virtual reality
  • MR mixed reality
  • the wearable device (103) may display a virtual reality image provided from at least one optical device (282, 284) of FIG. 2B on at least one display (250) in response to a designated gesture of the user obtained through a motion recognition camera (or motion tracking camera) (260-2, 260-3) of FIG. 2B.
  • At least one display (250) can provide visual information to a user.
  • at least one display (250) can include a transparent or translucent lens.
  • At least one display (250) can include a first display (250-1) and/or a second display (250-2) spaced apart from the first display (250-1).
  • the first display (250-1) and the second display (250-2) can be positioned at positions corresponding to the left and right eyes of the user, respectively.
  • At least one display (250) can provide a user with visual information transmitted from external light and other visual information that is distinct from the visual information through a lens included in the at least one display (250).
  • the lens can be formed based on at least one of a Fresnel lens, a pancake lens, or a multi-channel lens.
  • the at least one display (250) can include a first surface (231) and a second surface (232) opposite to the first surface (231).
  • a display area can be formed on the second surface (232) of the at least one display (250).
  • external light can be transmitted to the user by being incident on the first surface (231) and transmitted through the second surface (232).
  • at least one display (250) can display an augmented reality image combined with a virtual reality image provided from at least one optical device (282, 284) on a real screen transmitted through external light, on a display area formed on the second surface (232).
  • At least one display (250) can include at least one waveguide (233, 234) that diffracts light emitted from at least one optical device (282, 284) and transmits the diffracted light to a user.
  • the at least one waveguide (233, 234) can be formed based on at least one of glass, plastic, or polymer.
  • a nano-pattern can be formed on at least a portion of an exterior or interior of the at least one waveguide (233, 234).
  • the nano-pattern can be formed based on a grating structure having a polygonal and/or curved shape. Light incident on one end of the at least one waveguide (233, 234) can be propagated to the other end of the at least one waveguide (233, 234) by the nano-pattern.
  • At least one waveguide (233, 234) may include at least one of a diffractive element (e.g., a diffractive optical element (DOE) or a holographic optical element (HOE)) or a reflective element (e.g., a reflective mirror).
  • at least one waveguide (233, 234) may be arranged within the wearable device (103) to guide a screen displayed by at least one display (250) to the user's eyes.
  • the screen may be transmitted to the user's eyes based on total internal reflection (TIR) occurring within the at least one waveguide (233, 234).
  • the wearable device (103) can analyze an object included in a real image collected through a shooting camera (260-4), combine a virtual object corresponding to an object to be provided with augmented reality among the analyzed objects, and display the virtual object on at least one display (250).
  • the virtual object can include at least one of text and image for various information related to the object included in the real image.
  • the wearable device (103) can analyze the object based on a multi-camera such as a stereo camera.
  • the wearable device (103) can execute spatial recognition (e.g., simultaneous localization and mapping (SLAM)) using a multi-camera and/or ToF (time-of-flight).
  • the frame (200) may be formed as a physical structure that allows the wearable device (103) to be worn on the user's body.
  • the frame (200) may be configured so that, when the user wears the wearable device (103), the first display (250-1) and the second display (250-2) may be positioned corresponding to the user's left and right eyes.
  • the frame (200) may support at least one display (250).
  • the frame (200) may support the first display (250-1) and the second display (250-2) to be positioned corresponding to the user's left and right eyes.
  • the frame (200) may include a region (220) that at least partially contacts a part of the user's body when the user wears the wearable device (103).
  • the region (220) of the frame (200) that contacts a part of the user's body may include a region that contacts a part of the user's nose, a part of the user's ear, and a part of the side of the user's face that the wearable device (103) makes contact with.
  • the frame (200) may include a nose pad (210) that contacts a part of the user's body. When the wearable device (103) is worn by the user, the nose pad (210) may contact a part of the user's nose.
  • the frame (200) may include a first temple (204) and a second temple (205) that contact another part of the user's body that is distinct from the part of the user's body.
  • the frame (200) may include a first rim (201) that surrounds at least a portion of the first display (250-1), a second rim (202) that surrounds at least a portion of the second display (250-2), a bridge (203) that is arranged between the first rim (201) and the second rim (202), a first pad (211) that is arranged along a portion of an edge of the first rim (201) from one end of the bridge (203), a second pad (212) that is arranged along a portion of an edge of the second rim (202) from the other end of the bridge (203), a first temple (204) that extends from the first rim (201) and is fixed to a portion of an ear of the wearer, and a second temple (205) that extends from the second rim (202) and is fixed to a portion of an ear opposite the ear.
  • the first pad (211) and the second pad (212) can be in contact with a part of the user's nose, and the first temple (204) and the second temple (205) can be in contact with a part of the user's face and a part of the user's ear.
  • the temples (204, 205) can be rotatably connected to the rim through the hinge units (206, 207) of FIG. 2B.
  • the first temple (204) can be rotatably connected to the first rim (201) through the first hinge unit (206) disposed between the first rim (201) and the first temple (204).
  • the second temple (205) can be rotatably connected to the second rim (202) through the second hinge unit (207) disposed between the second rim (202) and the second temple (205).
  • the wearable device (103) can identify an external object (e.g., a user's fingertip) touching the frame (200) and/or a gesture performed by the external object by using a touch sensor, a grip sensor, and/or a proximity sensor formed on at least a portion of a surface of the frame (200).
  • the wearable device (103) may include hardware components (e.g., the hardware to be described later with reference to the block diagrams of FIGS. 4A and/or 4B) that perform various functions.
  • the hardware components may include a battery module (270), an antenna module (275), at least one optical device (282, 284), speakers (e.g., speakers (255-1, 255-2)), microphones (e.g., microphones (265-1, 265-2, 265-3)), a light-emitting module (not shown), and/or a printed circuit board (PCB) (290).
  • the various hardware components may be arranged within the frame (200).
  • microphones (e.g., microphones (265-1, 265-2, 265-3)) of the wearable device (103) may be disposed on at least a portion of the frame (200) to acquire sound signals.
  • a first microphone (265-1) disposed on the bridge (203), a second microphone (265-2) disposed on the second rim (202), and a third microphone (265-3) disposed on the first rim (201) are illustrated in FIG. 2B, but the number and arrangement of the microphones (265) are not limited to the embodiment of FIG. 2B.
  • the wearable device (103) may identify the direction of the sound signal by using a plurality of microphones disposed on different portions of the frame (200).
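  • As an illustration of identifying the direction of a sound signal with multiple microphones, the sketch below uses a simple two-microphone time-difference-of-arrival model. The microphone spacing, speed of sound, and far-field assumption are illustrative and not taken from the disclosure.

```python
import math

def sound_direction_deg(delta_t_s, mic_spacing_m=0.14, speed_of_sound_ms=343.0):
    """Estimate the direction of a sound source from the time difference of arrival.

    Assumes a far-field source and two microphones a fixed distance apart (values
    are illustrative). Returns the angle, in degrees, from the axis perpendicular
    to the line connecting the two microphones.
    """
    ratio = max(-1.0, min(1.0, speed_of_sound_ms * delta_t_s / mic_spacing_m))
    return math.degrees(math.asin(ratio))

# A sound arriving 0.2 ms earlier at one microphone than at the other.
print(f"estimated direction: {sound_direction_deg(0.0002):.1f} degrees")
```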
  • At least one optical device (282, 284) may project a virtual object onto at least one display (250) to provide various image information to a user.
  • at least one optical device (282, 284) may be a projector.
  • At least one optical device (282, 284) may be positioned adjacent to at least one display (250) or may be included within at least one display (250) as a part of the at least one display (250).
  • the wearable device (103) may include a first optical device (282) corresponding to a first display (250-1) and a second optical device (284) corresponding to a second display (250-2).
  • At least one optical device (282, 284) may include a first optical device (282) disposed at an edge of the first display (250-1) and a second optical device (284) disposed at an edge of the second display (250-2).
  • the first optical device (282) may transmit light to a first waveguide (233) disposed on the first display (250-1), and the second optical device (284) may transmit light to a second waveguide (234) disposed on the second display (250-2).
  • the camera (260) may include a recording camera (260-4), an eye tracking camera (ET CAM) (260-1), and/or a motion recognition camera (260-2, 260-3).
  • the recording camera (260-4), the eye tracking camera (260-1), and the motion recognition camera (260-2, 260-3) may be positioned at different locations on the frame (200) and may perform different functions.
  • the eye tracking camera (260-1) may output data indicating a location or gaze of an eye of a user wearing the wearable device (103).
  • the wearable device (103) may detect the gaze from an image including the user's pupil obtained through the eye tracking camera (260-1).
  • the wearable device (103) can identify an object (e.g., a real object and/or a virtual object) focused on by the user by using the user's gaze acquired through the gaze tracking camera (260-1).
  • the wearable device (103) that has identified the focused object can execute a function (e.g., gaze interaction) for interaction between the user and the focused object.
  • the wearable device (103) can express a part corresponding to the eye of an avatar representing the user in a virtual space by using the user's gaze acquired through the gaze tracking camera (260-1).
  • the wearable device (103) can render an image (or screen) displayed on at least one display (250) based on the position of the user's eyes.
  • the visual quality of a first area related to the gaze in the image and the visual quality (e.g., resolution, brightness, saturation, grayscale, or PPI (pixels per inch)) of a second area distinguished from the first area may be different from each other.
  • the wearable device (103) can obtain an image (or screen) having visual quality of a first area matching the user's gaze and visual quality of a second area by using foveated rendering.
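  • The following sketch illustrates the foveated-rendering idea described above: tiles near the gaze point keep full visual quality while peripheral tiles are reduced. The tile size, radii, and quality factors are assumptions for illustration only.

```python
import math

def foveated_quality_map(gaze_xy, width, height, tile=64, inner_r=200, outer_r=400):
    """Assign an illustrative quality level to each screen tile based on gaze distance.

    1.0 = full quality near the gaze point, 0.5 = mid ring, 0.25 = periphery.
    The radii and quality factors are assumed values for this sketch.
    """
    gx, gy = gaze_xy
    quality = {}
    for ty in range(0, height, tile):
        for tx in range(0, width, tile):
            cx, cy = tx + tile / 2, ty + tile / 2
            dist = math.hypot(cx - gx, cy - gy)     # distance of the tile center from the gaze point
            if dist <= inner_r:
                quality[(tx, ty)] = 1.0
            elif dist <= outer_r:
                quality[(tx, ty)] = 0.5
            else:
                quality[(tx, ty)] = 0.25
    return quality

qmap = foveated_quality_map(gaze_xy=(960, 540), width=1920, height=1080)
print("full-quality tiles:", sum(1 for q in qmap.values() if q == 1.0))
```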
  • when the wearable device (103) supports an iris recognition function, user authentication can be performed based on iris information obtained by using the gaze tracking camera (260-1).
  • An example in which the gaze tracking camera (260-1) is positioned toward both eyes of the user is illustrated in FIG. 2B, but the embodiment is not limited thereto, and the gaze tracking camera (260-1) can be positioned solely toward the user's left eye or right eye.
  • the capturing camera (260-4) can capture an actual image or background to be aligned with a virtual image in order to implement augmented reality or mixed reality content.
  • the capturing camera (260-4) can be used to obtain an image having a high resolution based on HR (high resolution) or PV (photo video).
  • the capturing camera (260-4) can capture an image of a specific object existing at a location where a user is looking and provide the image to at least one display (250).
  • the at least one display (250) can display one image in which information about a real image or background including an image of the specific object obtained using the capturing camera (260-4) and a virtual image provided through at least one optical device (282, 284) are superimposed.
  • the wearable device (103) can compensate for depth information (e.g., the distance between the wearable device (103) and an external object obtained through a depth sensor) by using the image obtained through the capturing camera (260-4).
  • the wearable device (103) can perform object recognition through an image acquired using the shooting camera (260-4).
  • the wearable device (103) can perform a function (e.g., auto focus) for focusing on an object (or subject) in an image and/or an optical image stabilization (OIS) function (e.g., anti-shake function) using the shooting camera (260-4).
  • the wearable device (103) can perform a pass-through function for displaying an image acquired through the shooting camera (260-4) by overlapping at least a part of a screen representing a virtual space on at least one display (250).
  • the shooting camera (260-4) can be referred to as an HR (high resolution) camera or a PV (photo video) camera.
  • the shooting camera (260-4) can provide an AF (auto focus) function and an optical image stabilization (OIS) function.
  • the capturing camera (260-4) may include a GS (global shutter) camera and/or an RS (rolling shutter) camera.
  • the capturing camera (260-4) may be positioned on a bridge (203) positioned between the first rim (201) and the second rim (202).
  • the gaze tracking camera (260-1) can implement more realistic augmented reality by tracking the gaze of a user wearing the wearable device (103) and thereby matching the user's gaze with visual information provided to at least one display (250). For example, when the user looks straight ahead, the wearable device (103) can naturally display environmental information related to the area in front of the user on at least one display (250) at the location where the user is positioned.
  • the gaze tracking camera (260-1) can be configured to capture an image of the user's pupil in order to determine the user's gaze. For example, the gaze tracking camera (260-1) can receive gaze detection light reflected from the user's pupil and track the user's gaze based on the position and movement of the received gaze detection light.
  • the gaze tracking camera (260-1) can be placed at positions corresponding to the user's left and right eyes.
  • the gaze tracking camera (260-1) may be positioned within the first rim (201) and/or the second rim (202) to face the direction in which the user wearing the wearable device (103) is positioned.
  • the motion recognition camera (260-2, 260-3) can recognize the movement of the user's entire body, such as the user's torso, hand, or face, or a part of the body, and thereby provide a specific event on a screen provided on at least one display (250).
  • the motion recognition camera (260-2, 260-3) can recognize the user's motion (or recognize a gesture), obtain a signal corresponding to the motion, and provide a display corresponding to the signal on at least one display (250).
  • the processor can identify the signal corresponding to the motion, and perform a designated function based on the identification.
  • the motion recognition camera (260-2, 260-3) can be used to perform a spatial recognition function using SLAM and/or a depth map for 6 degrees of freedom pose (6 DOF pose).
  • the processor can perform a gesture recognition function and/or an object tracking function using the motion recognition camera (260-2, 260-3).
  • the motion recognition cameras (260-2, 260-3) may be positioned on the first rim (201) and/or the second rim (202).
  • the motion recognition cameras (260-2, 260-3) may include a global shutter (GS) camera used for head tracking, hand tracking, and/or spatial recognition based on one of a three-degree-of-freedom pose or a six-degree-of-freedom pose.
  • the GS camera may include two or more stereo cameras to track fine movements.
  • the GS camera may be included in a gaze tracking camera (260-1) for tracking a user's gaze.
  • the camera (260) included in the wearable device (103) is not limited to the above-described gaze tracking camera (260-1) and motion recognition cameras (260-2, 260-3).
  • the wearable device (103) can identify an external object included in the FoV by using a camera positioned toward the user's FoV.
  • the identification of the external object by the wearable device (103) can be performed based on a sensor for identifying the distance between the wearable device (103) and the external object, such as a depth sensor and/or a time-of-flight (ToF) sensor.
  • the camera (260) positioned toward the FoV can support an autofocus function and/or an OIS (optical image stabilization) function.
  • the wearable device (103) may include a camera (260) (e.g., a face tracking (FT) camera) positioned toward the face to obtain an image including the face of a user wearing the wearable device (103).
  • the wearable device (103) may further include a light source (e.g., an LED) that emits light toward a subject (e.g., a user's eye, face, and/or an external object within the FoV) being captured using the camera (260).
  • the light source may include an LED having an infrared wavelength.
  • the light source may be disposed on at least one of the frame (200) and the hinge units (206, 207).
  • the battery module (270) may supply power to electronic components of the wearable device (103).
  • the battery module (270) may be disposed within the first temple (204) and/or the second temple (205).
  • the battery module (270) may be a plurality of battery modules (270).
  • the plurality of battery modules (270) may be disposed within each of the first temple (204) and the second temple (205).
  • the battery module (270) may be disposed at an end of the first temple (204) and/or the second temple (205).
  • the antenna module (275) can transmit signals or power to the outside of the wearable device (103), or receive signals or power from the outside.
  • the antenna module (275) can be positioned within the first temple (204) and/or the second temple (205).
  • the antenna module (275) can be positioned close to one surface of the first temple (204) and/or the second temple (205).
  • the speaker (255) can output an acoustic signal to the outside of the wearable device (103).
  • the acoustic output module may be referred to as a speaker.
  • the speaker (255) may be positioned within the first temple (204) and/or the second temple (205) so as to be positioned adjacent to an ear of a user wearing the wearable device (103).
  • the speaker (255) may include a second speaker (255-2) positioned within the first temple (204) and thus positioned adjacent to the user's left ear, and a first speaker (255-1) positioned within the second temple (205) and thus positioned adjacent to the user's right ear.
  • the light-emitting module may include at least one light-emitting element.
  • the light-emitting module may emit light of a color corresponding to a specific state or emit light with an action corresponding to a specific state in order to visually provide information about a specific state of the wearable device (103) to a user. For example, when the wearable device (103) needs to be charged, it may emit red light at a regular cycle.
  • the light-emitting module may be disposed on the first rim (201) and/or the second rim (202).
  • the wearable device (103) may include a printed circuit board (PCB) (290).
  • the PCB (290) may be included in at least one of the first temple (204) or the second temple (205).
  • the PCB (290) may include an interposer positioned between at least two sub-PCBs.
  • one or more hardware components included in the wearable device (103) (e.g., the hardware illustrated by different blocks in FIG. 4A) may be disposed on the PCB (290).
  • the wearable device (103) may include a flexible PCB (FPCB) for interconnecting the hardware components.
  • the wearable device (103) may include at least one of a gyro sensor, a gravity sensor, and/or an acceleration sensor for detecting a posture of the wearable device (103) and/or a posture of a body part (e.g., a head) of a user wearing the wearable device (103).
  • Each of the gravity sensor and the acceleration sensor may measure gravitational acceleration and/or acceleration based on designated three-dimensional axes (e.g., x-axis, y-axis, and z-axis) that are perpendicular to each other.
  • the gyro sensor may measure an angular velocity about each of the designated three-dimensional axes (e.g., x-axis, y-axis, and z-axis). At least one of the gravity sensor, the acceleration sensor, and the gyro sensor may be referred to as an inertial measurement unit (IMU).
  • the wearable device (103) may identify a user's motion and/or gesture performed to execute or terminate a specific function of the wearable device (103) based on the IMU.
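  • As a minimal sketch of using the IMU to identify a motion or gesture, the example below integrates single-axis angular velocity from the gyro sensor into a yaw angle and applies a naive head-shake check. The sampling interval, thresholds, and single-axis simplification are assumptions; a real implementation would fuse all IMU axes.

```python
def integrate_yaw(angular_velocity_dps, dt_s=0.01):
    """Integrate angular velocity samples (degrees/second) about one axis into a yaw trace."""
    yaw, trace = 0.0, []
    for w in angular_velocity_dps:
        yaw += w * dt_s           # simple rectangular integration of the gyro reading
        trace.append(yaw)
    return trace

def looks_like_head_shake(yaw_trace, swing_deg=15.0):
    """Naive gesture check: the head turned well to one side and then past center to the other."""
    return max(yaw_trace) > swing_deg and min(yaw_trace) < -swing_deg

# Synthetic gyro data: turn right, then left past center, then return to center.
samples = [60.0] * 40 + [-60.0] * 70 + [60.0] * 30
trace = integrate_yaw(samples)
print("head shake detected:", looks_like_head_shake(trace))
```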
  • FIGS. 3A and 3B illustrate an example of an exterior appearance of a wearable device according to one embodiment.
  • the wearable device (103) of FIGS. 3A and 3B may be an example of the wearable device (103) of FIG. 1.
  • An example of an exterior appearance of a first side (310) of a housing of the wearable device (103) according to one embodiment may be illustrated in FIG. 3A, and an example of an exterior appearance of a second side (320) opposite to the first side (310) may be illustrated in FIG. 3B.
  • a first surface (310) of a wearable device (103) may have a form attachable to a body part of a user (e.g., the face of the user).
  • the wearable device (103) may further include a strap for being fixed to a body part of a user, and/or one or more temples (e.g., the first temple (204) and/or the second temple (205) of FIGS. 2A and 2B).
  • a first display (250-1) for outputting an image to a left eye among the user's two eyes, and a second display (250-2) for outputting an image to a right eye among the user's two eyes may be disposed on the first surface (310).
  • the wearable device (103) may further include a rubber or silicone packing formed on the first surface (310) to prevent interference by light (e.g., external light) different from the light radiated from the first display (250-1) and the second display (250-2).
  • the wearable device (103) may include cameras (260-1) for capturing and/or tracking both eyes of the user adjacent to each of the first display (250-1) and the second display (250-2).
  • the cameras (260-1) may be referred to as the gaze tracking camera (260-1) of FIG. 2B.
  • the wearable device (103) may include cameras (260-5, 260-6) for capturing and/or recognizing the face of the user.
  • the cameras (260-5, 260-6) may be referred to as FT cameras.
  • the wearable device (103) may control an avatar representing the user in a virtual space based on the movement of the user's face identified using the cameras (260-5, 260-6).
  • the wearable device (103) may change the texture and/or shape of a portion of an avatar (e.g., a portion of an avatar expressing a human face) using information obtained by cameras (260-5, 260-6) (e.g., FT cameras) and representing facial expressions of a user wearing the wearable device (103).
  • a camera (e.g., cameras (260-7, 260-8, 260-9, 260-10, 260-11, 260-12)) and/or a sensor (e.g., the depth sensor (330)) for obtaining information related to the external environment may be disposed on the second surface (320) of the wearable device (103).
  • the cameras (260-7, 260-8, 260-9, 260-10) may be disposed on the second surface (320) to recognize an external object.
  • the cameras (260-7, 260-8, 260-9, 260-10) may correspond to the motion recognition cameras (260-2, 260-3) of FIG. 2B.
  • the wearable device (103) can obtain images and/or videos to be transmitted to each of the user's two eyes.
  • the camera (260-11) can be placed on the second face (320) of the wearable device (103) to obtain an image to be displayed through the second display (250-2) corresponding to the right eye among the two eyes.
  • the camera (260-12) can be placed on the second face (320) of the wearable device (103) to obtain an image to be displayed through the first display (250-1) corresponding to the left eye among the two eyes.
  • the wearable device (103) can obtain one screen by using a plurality of images obtained through the cameras (260-11, 260-12).
  • the cameras (260-11, 260-12) may correspond to the shooting camera (260-4) of FIG. 2B.
  • the wearable device (103) may include a depth sensor (330) disposed on the second face (320) to identify a distance between the wearable device (103) and an external object. Using the depth sensor (330), the wearable device (103) may obtain spatial information (e.g., depth) for at least a portion of the FoV of a user wearing the wearable device (103).
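  • As an illustration of turning a depth-sensor reading into spatial information, the sketch below unprojects one pixel with a depth value into a 3D point using a pinhole camera model. The intrinsic parameters are assumed values for a 640x480 depth image and are not taken from the disclosure.

```python
def unproject_depth(u, v, depth_m, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Convert a pixel (u, v) with a depth reading into a 3D point in the sensor frame.

    fx, fy, cx, cy are assumed pinhole intrinsics for a 640x480 depth image;
    real values would come from the depth sensor's calibration.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A point 1.2 m in front of the device, slightly right of the image center.
print(unproject_depth(400, 240, 1.2))
```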
  • a microphone may be disposed on the second face (320) of the wearable device (103) to obtain a sound output from an external object. The number of microphones may be one or more, depending on the embodiment.
  • FIG. 4A illustrates an example of a block diagram of a wearable device according to one embodiment.
  • a wearable device (103) may include at least one of a processor (410), a memory (415), a display (420), a camera (425), a sensor (430), or a communication circuit (435).
  • the processor (410), the memory (415), the display (420), the camera (425), the sensor (430), and the communication circuit (435) may be electrically and/or operatively connected to each other by an electronic component such as a communication bus (402).
  • the type and/or number of hardware components included in the wearable device (103) are not limited to those illustrated in FIG. 4A.
  • the wearable device (103) may include only some of the hardware components illustrated in FIG. 4A.
  • the elements (e.g., layers and/or modules) within the memory described below may be logically separated.
  • the elements within the memory (415) may be included within a hardware component that is separated from the memory (415).
  • the operation performed by the processor (410) using each of the elements within the memory (415) is one embodiment, and the processor (410) may perform a different operation from the above operation through at least one of the elements within the memory (415).
  • the processor (410) of the wearable device (103) may include a hardware component for processing data based on one or more instructions.
  • the hardware component for processing data may include, for example, an arithmetic and logic unit (ALU), a field programmable gate array (FPGA), and/or a central processing unit (CPU).
  • the number of processors (410) may be one or more.
  • the processor (410) may have a multi-core processor structure such as a dual core, a quad core, or a hexa core.
  • the memory (415) of the wearable device (103) may include a hardware component for storing data and/or instructions input and/or output to the processor (410).
  • the memory (415) may include, for example, a volatile memory such as a random-access memory (RAM) and/or a non-volatile memory such as a read-only memory (ROM).
  • the volatile memory may include, for example, at least one of a dynamic RAM (DRAM), a static RAM (SRAM), a Cache RAM, and a pseudo SRAM (PSRAM).
  • the non-volatile memory may include, for example, at least one of a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), a flash memory, a hard disk, a compact disc, and an embedded multi media card (eMMC).
  • the display (420) of the wearable device (103) can output visualized information to a user of the wearable device (103).
  • the display (420) can be controlled by a processor (410) including a circuit such as a graphic processing unit (GPU) to output visualized information to the user.
  • the display (420) can include a flat panel display (FPD) and/or electronic paper.
  • the FPD can include a liquid crystal display (LCD), a plasma display panel (PDP), and/or one or more light emitting diodes (LEDs).
  • the LEDs can include organic LEDs (OLEDs).
  • the camera (425) of the wearable device (103) may include one or more optical sensors (e.g., a charge-coupled device (CCD) sensor, a complementary metal oxide semiconductor (CMOS) sensor) that generate electrical signals representing the color and/or brightness of light.
  • the plurality of optical sensors included in the camera (425) may be arranged in the form of a two-dimensional array.
  • the camera (425) may acquire electrical signals of each of the plurality of optical sensors substantially simultaneously to generate two-dimensional frame data corresponding to light reaching the optical sensors of the two-dimensional array.
  • photographic data captured using the camera (425) may mean a single item of two-dimensional frame data acquired from the camera (425).
  • video data captured using the camera (425) may mean a sequence of a plurality of two-dimensional frame data acquired from the camera (425) according to a frame rate.
  • the camera (425) may further include a flash light that is positioned in the direction in which the camera (425) receives light and that outputs light in that direction.
  • the wearable device (103) may include a plurality of cameras, for example, cameras (425), arranged in different directions.
  • a first camera among the plurality of cameras may be referred to as a motion recognition camera (e.g., motion recognition cameras (260-2, 260-3) of FIG. 2B), and a second camera may be referred to as a gaze tracking camera (e.g., gaze tracking camera (260-1) of FIG. 2B).
  • the wearable device (103) may identify a position, shape, and/or gesture of a hand using an image acquired using the first camera.
  • the wearable device (103) may identify a direction of a gaze of a user wearing the wearable device (103) using an image acquired using the second camera. For example, the direction in which the first camera faces may be opposite to the direction in which the second camera faces.
  • a sensor (430) of a wearable device (103) may generate electrical information from non-electrical information related to the wearable device (103), which may be processed by a processor (410) and/or a memory (415) of the wearable device (103).
  • the information may be referred to as sensor data.
  • the sensor (430) may include a global positioning system (GPS) sensor for detecting a geographic location of the wearable device (103), an image sensor, an ambient light sensor, and/or a time-of-flight (ToF) sensor, and an inertial measurement unit (IMU) for detecting a physical motion of the wearable device (103).
  • the communication circuit (435) of the wearable device (103) may include hardware components for supporting transmission and/or reception of electrical signals between the wearable device (103) and an external electronic device.
  • the communication circuit (435) may include, for example, at least one of a modem (MODEM), an antenna, and an optical/electronic (O/E) converter.
  • the communication circuit (435) may support transmission and/or reception of electrical signals based on various types of protocols, such as Ethernet, a local area network (LAN), a wide area network (WAN), wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), ZigBee, long term evolution (LTE), 5G NR (new radio), and/or 6G.
  • one or more instructions (or commands) representing operations and/or actions to be performed on data by a processor (410) of the wearable device (103) may be stored in the memory (415) of the wearable device (103).
  • a set of one or more instructions may be referred to as firmware, an operating system, a process, a routine, a sub-routine, and/or an application.
  • the wearable device (103) and/or the processor (410) may perform at least one of the operations of FIG. 6 or FIG. 11 when a set of multiple instructions distributed in the form of an operating system, firmware, driver, and/or application is executed.
  • programs designed to target at least one of the hardware abstraction layer (480) and/or the application layer (440) may be classified into the framework layer (450).
  • Programs classified into the framework layer (450) may provide an executable API (application programming interface) based on other programs.
  • a program designed to target a user controlling the wearable device (103) may be classified into the application layer (440).
  • as programs classified into the application layer (440), an extended reality (XR) system UI (user interface) (441) and/or an XR application (442) are exemplified, but the embodiment is not limited thereto.
  • programs (e.g., software applications) classified into the application layer (440) may call an application programming interface (API) to cause execution of a function supported by programs classified into the framework layer (450).
  • the wearable device (103) may display one or more visual objects on the display (420) for performing interaction with a user for using a virtual space based on the execution of the XR system UI (441).
  • a visual object may mean an object that can be placed on a screen for transmitting and/or interacting with information, such as text, an image, an icon, a video, a button, a check box, a radio button, a text box, a slider, and/or a table.
  • a visual object may be referred to as a visual guide, a virtual object, a visual element, a UI element, a view object, and/or a view element.
  • the wearable device (103) may provide a service for controlling functions available in the virtual space to the user based on the execution of the XR system UI (441).
  • a lightweight renderer (443) and/or an XR plug-in (444) are illustrated as being included within the XR system UI (441), but are not limited thereto.
  • the XR system UI (441) may cause execution of a function supported by the lightweight renderer (443) and/or the XR plug-in (444) included within the application layer (440).
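  • As an illustrative, non-limiting aid, a minimal Kotlin sketch of the layering described above is given below: an application-layer program calls an API exposed by a framework-layer program. All identifiers (RuntimeService, ScreenCompositionManager, XrApplication) are hypothetical and are not actual OpenXR or platform APIs:

```kotlin
// Framework-layer program: exposes an API that application-layer programs can call.
interface RuntimeService {
    fun predictPose(frameTimeMs: Long): String
}

class ScreenCompositionManager : RuntimeService {
    override fun predictPose(frameTimeMs: Long): String =
        "pose predicted for display time $frameTimeMs ms"
}

// Application-layer program (e.g., an XR application) calling the framework-layer API.
class XrApplication(private val runtime: RuntimeService) {
    fun renderFrame(frameTimeMs: Long) {
        val pose = runtime.predictPose(frameTimeMs) // API call into the framework layer
        println("XR application renders a frame using: $pose")
    }
}

fun main() {
    XrApplication(ScreenCompositionManager()).renderFrame(frameTimeMs = 16)
}
```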
  • the wearable device (103) may display a screen representing at least a portion of a virtual space on the display (420) based on the execution of the XR application (442).
  • the XR plug-in (444-1) included in the XR application (442) may be referenced by the XR plug-in (444) of the XR system UI (441). Any description of the XR plug-in (444-1) that overlaps with the description of the XR plug-in (444) may be omitted.
  • the wearable device (103) may cause the execution of the screen composition manager (451) based on the execution of the XR application (442).
  • the screen composition manager (451) may include a runtime service (452).
  • the runtime service (452) may be referred to as an OpenXR runtime module.
  • based on the execution of the runtime service (452), the wearable device (103) may provide at least one of a pose prediction function, a frame timing function, and/or a spatial input function to a user through the wearable device (103).
  • based on the execution of the runtime service (452), the wearable device (103) may perform rendering for a virtual space service provided to the user by an application (e.g., a Unity application or an OpenXR native application).
  • the screen composition manager (451) may include a pass-through library (453). Based on the execution of the pass-through library (453), the wearable device (103) may display a screen representing a virtual space on the display (420), while another screen representing an actual space acquired through the camera (425) may be superimposed on at least a portion of the screen.
  • the screen composition manager (451) may include a renderer (e.g., the renderer (540-1) of FIG. 5).
  • the wearable device (103) may render a screen to be displayed on the display by compositing, through the screen composition manager (451), virtual layers (or virtual nodes) rendered based on sensor data (e.g., sensing data acquired via the camera (425) or the sensor (430)) and pass-through layers (or pass-through nodes) acquired via the pass-through library (453).
  • the virtual layers may be referred to as virtual nodes and/or virtual surfaces.
  • the wearable device (103) may render each of the virtual layers or all of the virtual layers through the screen composition manager (451).
  • the screen composition manager (451) may include an input manager (454).
  • the wearable device (103) may identify acquired data (e.g., sensor data) by executing one or more programs included in the recognition service layer (470) based on the execution of the input manager (454).
  • the wearable device (103) may initiate execution of at least one of the functions of the wearable device (103) using the acquired data.
  • the perception abstraction layer (460) can be used for data exchange between the screen composition manager (451) and the recognition service layer (470). From the perspective of being used for data exchange between the screen composition manager (451) and the recognition service layer (470), the perception abstraction layer (460) can be referred to as an interface. As an example, the perception abstraction layer (460) can be referred to as OpenPX and/or a PPAL (perception platform abstraction layer). The perception abstraction layer (460) can be used for a perception client and a perception service.
  • the recognition service layer (470) may include one or more programs for processing data acquired from a sensor (430) (or a camera (425)).
  • the one or more programs may include at least one of a position tracker (471), a space recognizer (472), a gesture tracker (473), a gaze tracker (474), and/or a face tracker (475).
  • the type and/or number of the one or more programs included in the recognition service layer (470) are not limited to those illustrated in FIG. 4A.
  • the wearable device (103) can identify the pose of the wearable device (103) using the sensor (430) based on the execution of the position tracker (471).
  • the wearable device (103) can identify the 6 degrees of freedom pose (6 DOF pose) of the wearable device (103) using data acquired using the camera (425) and the IMU based on the execution of the position tracker (471).
  • the position tracker (471) can be referred to as a head tracking (HeT) module.
  • the wearable device (103) may be used to construct a three-dimensional virtual space surrounding the wearable device (103) (or a user of the wearable device (103)) based on the execution of the space recognizer (472).
  • the wearable device (103) may reconstruct the three-dimensional surroundings of the wearable device (103) using data acquired using the camera (425) based on the execution of the space recognizer (472).
  • the wearable device (103) may identify at least one of a plane, a slope, or stairs based on the three-dimensionally reconstructed surroundings of the wearable device (103) based on the execution of the space recognizer (472).
  • the space recognizer (472) may be referred to as a scene understanding (SU) module.
  • the wearable device (103) may be used to identify (or recognize) a pose and/or gesture of a hand of a user of the wearable device (103) based on the execution of the gesture tracker (473).
  • the wearable device (103) may identify a pose and/or gesture of a hand of a user using data acquired from a sensor (430) based on the execution of the gesture tracker (473).
  • the wearable device (103) may identify a pose and/or gesture of a hand of a user based on data (or images) acquired using a camera (425) based on the execution of the gesture tracker (473).
  • the gesture tracker (473) may be referred to as a hand tracking (HaT) module and/or a gesture tracking module.
  • the wearable device (103) may identify (or track) eye movements of a user of the wearable device (103) based on the execution of the gaze tracker (474).
  • the wearable device (103) may identify eye movements of the user using data acquired from at least one sensor based on the execution of the gaze tracker (474).
  • the wearable device (103) may identify eye movements of the user based on data acquired using a camera (425) (e.g., the gaze tracking camera (260-1) of FIGS. 2A and 2B) and/or an infrared light emitting diode (IR LED) based on the execution of the gaze tracker (474).
  • the gaze tracker (474) may be referred to as an eye tracking (ET) module and/or a gaze tracking module.
  • the recognition service layer (470) of the wearable device (103) may further include a face tracker (475) for tracking the user's face.
  • the wearable device (103) may identify (or track) the movement of the user's face and/or the user's expression based on the execution of the face tracker (475).
  • the wearable device (103) may estimate the user's expression based on the movement of the user's face based on the execution of the face tracker (475).
  • the wearable device (103) may identify the movement of the user's face and/or the user's expression based on data (e.g., an image) acquired using a camera based on the execution of the face tracker (475).
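  • As an illustrative, non-limiting aid, the following minimal Kotlin sketch shows how trackers of the recognition service layer (470) could be exposed through a single perception abstraction layer (460) interface; all identifiers and returned values are hypothetical placeholders:

```kotlin
// Hypothetical sketch: each tracker turns raw sensor data into higher-level state,
// and the abstraction layer is the single entry point used by the framework layer.
data class SensorData(val imu: List<Float>)

interface Tracker<T> { fun update(data: SensorData): T }

data class Pose6Dof(val position: Triple<Float, Float, Float>, val rotation: Triple<Float, Float, Float>)

class PositionTracker : Tracker<Pose6Dof> {          // head tracking (HeT)
    override fun update(data: SensorData) = Pose6Dof(Triple(0f, 1.6f, 0f), Triple(0f, 0f, 0f))
}
class GestureTracker : Tracker<String> {             // hand tracking (HaT)
    override fun update(data: SensorData) = "pinch"
}
class GazeTracker : Tracker<Pair<Float, Float>> {    // eye tracking (ET)
    override fun update(data: SensorData) = 0.1f to -0.05f
}

// Perception abstraction layer: one interface for querying all trackers at once.
class PerceptionAbstractionLayer(
    private val position: PositionTracker = PositionTracker(),
    private val gesture: GestureTracker = GestureTracker(),
    private val gaze: GazeTracker = GazeTracker(),
) {
    fun snapshot(data: SensorData) = mapOf(
        "pose" to position.update(data),
        "gesture" to gesture.update(data),
        "gaze" to gaze.update(data),
    )
}

fun main() {
    println(PerceptionAbstractionLayer().snapshot(SensorData(imu = listOf(0.0f, 9.8f, 0.0f))))
}
```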
  • FIG. 4B illustrates an example of a block diagram of an electronic device according to one embodiment.
  • the electronic device (101) may include at least one of a processor (410-1), a memory (415-1), a display (420-1), or a communication circuit (435-1).
  • the processor (410-1), the memory (415-1), the display (420-1), and the communication circuit (435-1) may be electrically and/or operatively connected to each other by an electronic component such as a communication bus (402-1).
  • the type and/or number of hardware components included in the electronic device (101) are not limited to those illustrated in FIG. 4B.
  • the electronic device (101) may include only some of the hardware components illustrated in FIG. 4B.
  • the processor (410-1), the memory (415-1), the display (420-1), and the communication circuit (435-1) included in the electronic device (101) may include hardware components and/or circuits corresponding to the processor (410), the memory (415), the display (420), and the communication circuit (435) of the wearable device (103).
  • the description of the hardware and/or software included in the electronic device (101) may be omitted to the extent that it overlaps with the wearable device (103).
  • FIG. 5 illustrates an example of a block diagram showing programs included in an electronic device and a wearable device according to one embodiment.
  • the electronic device (101) of FIG. 5 may include the electronic device (101) of FIGS. 1 to 4B.
  • the wearable device (103) of FIG. 5 may include the wearable device (103) of FIGS. 1 to 4A.
  • one or more programs included in a memory (415-1) of the electronic device (101) and a memory (415) of the wearable device (103) according to one embodiment are illustrated. Based on the execution of the XR application (442) of FIG. 4A or the XR application (442-1) of FIG. 4B, the execution of one or more programs may be caused.
  • An electronic device (101) may determine whether to perform rendering on a screen representing at least a portion of a virtual space based on the execution of the rendering manager (510). Using the rendering manager (510), the electronic device (101) may determine whether to execute programs included in the memory (415-1). For example, the electronic device (101) may determine whether to perform rendering on at least one element (e.g., a layer or a virtual object) included in a screen based on the execution of the rendering manager (510). The electronic device (101) may determine the number of screens on which to perform rendering based on the execution of the rendering manager (510).
  • the electronic device (101) may determine whether to perform rendering on the at least one element based on the performance of a wearable device (103) connected via a communication circuit and/or a specified condition, based on the execution of the rendering manager (510). For example, based on the execution of the rendering manager (510), the electronic device (101) may decide to render some of the elements composing the screen in combination.
  • An electronic device (101) may identify one or more layers included in a screen representing at least a portion of a virtual space based on the execution of a layer manager (520).
  • the electronic device (101) may obtain rendering data for each of the one or more layers using the layer manager (520) before performing rendering on the screen (or during a preprocessing time).
  • An operation of the electronic device (101) processing one or more layers using the layer manager (520) may be referred to as a preprocessing operation for performing rendering.
  • the electronic device (101) may identify resource information (e.g., a layer or a virtual object) for obtaining at least a portion of a screen through the rendering manager.
  • the electronic device (101) may identify one or more layers corresponding to at least a portion of the screen through the layer manager.
  • the electronic device (101) may render at least a portion of the screen through the renderer (540) using the identified one or more layers.
  • the layer manager (520) may be referred to as a layer identifier, a layer division module, a layer identification module, and/or a layer control module.
  • An electronic device (101) may identify one or more virtual objects included in at least a portion of a virtual space based on the execution of the object manager (530).
  • the one or more virtual objects may be included in another layer located on a layer representing the virtual space.
  • the one or more virtual objects included in the another layer may be an example of a virtual object based on two dimensions.
  • the another layer may be an example of a layer related to the XR system UI (441) of FIG. 4A.
  • the layer related to the XR system UI (441) may mean a HUD (heads-up display).
  • the object manager (530) may be referred to as an object identifier, an object management module, and/or an object control module.
  • the electronic device (101) may obtain rendering data for each of one or more virtual objects before performing rendering on the screen by using the object manager (530).
  • An operation of the electronic device (101) processing one or more virtual objects by using the object manager (530) may be referred to as a preprocessing operation for performing rendering.
  • the electronic device (101) may identify location (or depth) information of one or more virtual objects within at least a portion of a virtual space in order to process one or more virtual objects.
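  • As an illustrative, non-limiting aid, a minimal Kotlin sketch of the preprocessing step is given below: the layer manager (520) and the object manager (530) turn a screen description into per-element rendering data (including depth information) before rendering. All names and fields are hypothetical:

```kotlin
// Hypothetical preprocessing sketch: screen elements become per-element rendering data.
data class VirtualObject(val id: String, val depth: Float)
data class Layer(val id: String, val objects: List<VirtualObject>)
data class RenderingData(val elementId: String, val payload: String)

class LayerManager {
    // One rendering-data entry per layer of the screen.
    fun preprocess(layers: List<Layer>): List<RenderingData> =
        layers.map { RenderingData(it.id, "mesh+material for layer ${it.id}") }
}

class ObjectManager {
    // Depth (position) information is kept so the renderer can composite correctly.
    fun preprocess(layers: List<Layer>): List<RenderingData> =
        layers.flatMap { layer ->
            layer.objects.sortedBy { it.depth }
                .map { RenderingData(it.id, "object ${it.id} at depth ${it.depth}") }
        }
}

fun main() {
    val screen = listOf(
        Layer("background", emptyList()),
        Layer("hud", listOf(VirtualObject("button", 0.5f), VirtualObject("avatar", 2.0f))),
    )
    println(LayerManager().preprocess(screen))
    println(ObjectManager().preprocess(screen))
}
```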
  • An electronic device (101) may perform rendering for a screen to be displayed on a display based on the execution of a renderer (540).
  • the electronic device (101) may obtain the screen using rendering data based on the execution of the renderer (540).
  • the electronic device (101) may obtain rendering data for each of one or more objects through an object manager (530), and then perform rendering for each of the one or more objects through the renderer (540) using the rendering data.
  • the renderer (540) may be referred to as a rendering module and/or a remote rendering module.
  • the electronic device (101) can identify at least a portion of a virtual space corresponding to a viewpoint (e.g., viewpoint (110) of FIG. 1) of a wearable device (103) based on the execution of the display manager (550).
  • the electronic device (101) can identify a screen to be transmitted to the wearable device (103) through the display manager (550).
  • the screen can correspond to at least a portion of the virtual space.
  • the electronic device (101) can infer (or identify) the rendering processing speed of the wearable device (103) through the division manager (560).
  • the electronic device (101) can determine, through the division manager (560), whether to perform rendering on a portion of a screen to be transmitted to the wearable device (103).
  • the electronic device (101) can determine, through the division manager (560), the amount of rendering data to be used for rendering of the wearable device (103) based on the rendering processing speed of the wearable device (103).
  • the electronic device (101) can identify, through the division manager (560), a portion of one or more layers and/or a portion of one or more virtual objects on which the electronic device (101) is to perform rendering.
  • the electronic device (101) can perform rendering on a portion of a screen to obtain a rendering image, and transmit rendering data on another portion of the screen, together with the rendering image, to the wearable device (103).
  • the electronic device (101) can identify, through a division manager, at least a portion of the screen to be rendered by the wearable device (103) based on the performance of the wearable device (103).
  • the electronic device (101) can render another portion, which is distinct from the at least portion, through a renderer, to obtain a partial image corresponding to the screen.
  • the electronic device (101) can transmit the partial image and the rendering data on the at least portion to the wearable device (103).
  • the wearable device (103) can obtain a screen (e.g., screen (115) of FIG. 1) to be displayed on a display (e.g., display (420) of FIG. 4a) by rendering a rendering image and rendering data through a renderer (540-1).
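  • As an illustrative, non-limiting aid, the following minimal Kotlin sketch shows one way a division manager could split screen elements between the electronic device (101) and the wearable device (103) based on an assumed rendering budget of the wearable device; the cost model and identifiers are hypothetical:

```kotlin
// Hypothetical split decision: the device keeps the heavier elements and ships the
// rest as rendering data that fits within the wearable's per-frame rendering budget.
data class Element(val id: String, val cost: Int)
data class Split(val renderLocally: List<Element>, val sendAsRenderingData: List<Element>)

class DivisionManager {
    fun split(screenElements: List<Element>, wearableBudget: Int): Split {
        val toWearable = mutableListOf<Element>()
        var used = 0
        for (e in screenElements.sortedBy { it.cost }) {
            if (used + e.cost <= wearableBudget) { toWearable += e; used += e.cost }
        }
        return Split(
            renderLocally = screenElements - toWearable.toSet(),
            sendAsRenderingData = toWearable,
        )
    }
}

fun main() {
    val elements = listOf(Element("background", 8), Element("avatar", 3), Element("hud", 1))
    val split = DivisionManager().split(elements, wearableBudget = 4)
    println("device renders: ${split.renderLocally.map { it.id }}")         // becomes the partial image
    println("wearable renders: ${split.sendAsRenderingData.map { it.id }}") // sent as rendering data
}
```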
  • the electronic device (101) may identify a viewpoint (e.g., viewpoint (110) of FIG. 1) of the wearable device (103) by using input information received from the wearable device (103) based on execution of the tracking identifier (570).
  • the electronic device (101) may infer (or predict) the viewpoint of the wearable device (103) by using the input information through the tracking identifier (570).
  • the electronic device (101) may identify the viewpoint based on input information related to sensing data acquired by using programs (e.g., position tracker (471), space recognizer (472), gesture tracker (473), and/or gaze tracker (474), face tracker (475)) included in a recognition service layer (e.g., recognition service layer (470) of FIG. 4A) through the tracking identifier (570).
  • the electronic device (101) can synchronize a first data signal transmitted to a wearable device (103) connected through a communication circuit and a second data signal received from the wearable device (103) based on the execution of the synchronization manager (580).
  • the electronic device (101) can synchronize timing for transmitting and receiving data between the electronic device and the wearable device (103) through the synchronization manager (580). For example, when the electronic device (101) establishes a communication link with a plurality of external electronic devices through the communication circuit, the electronic device (101) can synchronize data signals transmitted and received between the plurality of external electronic devices through the synchronization manager (580).
  • An electronic device (101) may establish a communication link with a plurality of external electronic devices for using a virtual space service based on the execution of a communication service (590).
  • the communication service (590) may be related to the XR application (442-1) of FIG. 4B.
  • the communication service (590) may be related to the XR system UI (441-1) of FIG. 4B.
  • the communication service (590) may be related to an operating system installed in the electronic device (101).
  • the rendering manager (510-1), the layer manager (520-1), the object manager (530-1), the renderer (540-1), the display manager (550-1), the division manager (560-1), the tracking identifier (570-1), the synchronization manager (580-1), and the communication service (590-1) included in the wearable device (103) may correspond to the rendering manager (510), the layer manager (520), the object manager (530), the renderer (540), the display manager (550), the division manager (560), the tracking identifier (570), the synchronization manager (580), and the communication service (590) included in the electronic device (101).
  • the description of the software included in the wearable device (103) may be omitted within the scope overlapping with the electronic device (101).
  • a wearable device (103) may receive rendering data from an electronic device (101) using a communication circuit (435).
  • the rendering data may include a partial image representing at least a portion of a screen acquired by the electronic device (101).
  • the wearable device (103) may obtain a screen to be displayed on the display (420) using the partial image and the rendering data through a renderer (540-1).
  • the wearable device (103) may identify rendering data to be rendered based on receiving resource information from the electronic device (101).
  • the rendering data may include data corresponding to a virtual object and/or a layer.
  • the resource information may include at least one element in a screen identified through a rendering manager (510) of the electronic device (101).
  • the wearable device (103) may determine whether to render a virtual object through the renderer based on receiving the resource information.
  • the wearable device (103) may, based on receiving the resource information, determine whether to perform rendering for one or more layers through a renderer, but is not limited thereto.
  • the wearable device (103) can display a screen obtained by using a partial image and rendering data on the display (420) through the screen composition circuit (595).
  • the screen composition circuit (595) can be an example of a configuration for combining rendering data for a split screen.
  • the wearable device (103) can cause the execution of the screen composition manager (451) of FIG. 4A through the screen composition circuit (595).
  • the wearable device (103) can input, to the screen composition circuit (595), the partial image and another partial image obtained by using the rendering data, and can display a screen (e.g., screen (115) of FIG. 1) on the display (420).
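  • As an illustrative, non-limiting aid, a minimal Kotlin sketch of the wearable-side path is given below: the renderer turns received rendering data into another partial image, and the screen composition step combines it with the partial image received from the electronic device (101). All identifiers are hypothetical:

```kotlin
// Hypothetical wearable-side composition of a split screen.
data class Image(val label: String)
data class RenderingData(val elementIds: List<String>)

class Renderer {
    // Turns received rendering data into a locally rendered partial image.
    fun render(data: RenderingData) = Image("rendered(${data.elementIds.joinToString()})")
}

class ScreenCompositionCircuit {
    // Combines the partial image from the electronic device with the locally rendered one.
    fun compose(remotePart: Image, localPart: Image) =
        Image("composed[${remotePart.label} + ${localPart.label}]")
}

fun main() {
    val remotePartialImage = Image("background+avatar")             // received over the link
    val receivedRenderingData = RenderingData(listOf("hud", "hand"))
    val localPartialImage = Renderer().render(receivedRenderingData)
    val screen = ScreenCompositionCircuit().compose(remotePartialImage, localPartialImage)
    println("display shows: ${screen.label}")
}
```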
  • the electronic device (101) can reduce the load of the wearable device (103) by processing at least a part of the rendering data to be processed by the wearable device (103).
  • an example of an operation for obtaining rendering data to be used for rendering of a plurality of wearable devices when the electronic device (101) establishes a communication link with a plurality of wearable devices is described below.
  • FIG. 6 illustrates an example of a flowchart illustrating an operation of an electronic device according to an embodiment.
  • the electronic device of FIG. 6 may include the electronic device (101) of FIG. 1. At least one of the operations of FIG. 6 may be performed by the electronic device (101) of FIG. 1. At least one of the operations of FIG. 6 may be controlled by the processor (410-1) of FIG. 4B.
  • Each of the operations of FIG. 6 may be performed sequentially, but is not necessarily performed sequentially. For example, the order of the operations may be changed, and at least two operations may be performed in parallel.
  • an electronic device may receive a request to display at least a portion of a three-dimensional virtual space (e.g., the virtual space (100) of FIG. 1) from a plurality of external electronic devices connected through a communication circuit (e.g., the communication circuit (435-1) of FIG. 4b).
  • the plurality of external electronic devices may include a wearable device (103) of FIG. 1.
  • an electronic device may identify viewpoints of a plurality of external electronic devices through a tracking identifier (e.g., the tracking identifier (570) of FIG. 5) by using input information received from the plurality of external electronic devices through a communication circuit.
  • the electronic device may identify viewpoints of each of the plurality of external electronic devices based on execution of the tracking identifier, based on positions and/or postures of each of the plurality of external electronic devices included in the input information.
  • an electronic device may identify screens representing at least a portion of a three-dimensional virtual space corresponding to viewpoints through a display manager (e.g., the display manager (550) of FIG. 5).
  • the electronic device may identify at least a portion of a virtual space corresponding to each of the viewpoints based on identifying the viewpoints of each of a plurality of external electronic devices.
  • the electronic device may identify a screen representing at least a portion of the virtual space.
  • an electronic device may determine resource information for obtaining at least a portion of each of the screens through a rendering manager (e.g., the rendering manager (510) of FIG. 5).
  • the electronic device can identify the rendering processing speed of each of the plurality of external electronic devices through the division manager (e.g., the division manager (560) of FIG. 5). Based on the identification of the rendering processing speed of each of the plurality of external electronic devices, the electronic device can identify some of the elements (e.g., layers or virtual objects) that the electronic device will render among the elements constituting the screen. The other elements may be examples of rendering data to be transmitted to each of the plurality of external electronic devices.
  • the electronic device may identify, through the object manager, one or more first virtual objects included in each of the screens.
  • the electronic device may change the one or more first virtual objects into rendering data through a renderer (e.g., the renderer (540) of FIG. 5).
  • the electronic device may render one or more second virtual objects, distinct from the one or more first virtual objects, and transmit the rendered one or more second virtual objects together with the rendering data to the plurality of external electronic devices.
  • an electronic device may transmit rendering data representing at least a portion of each of the screens to be used for rendering each of the plurality of external electronic devices, using a communication circuit, to each of the plurality of external electronic devices, based on the resource information determined.
  • the electronic device can identify a common portion overlapped and included in the screens corresponding to the viewpoints of each of the plurality of external electronic devices through the display manager.
  • the electronic device can perform rendering of the common portion through the renderer and transmit the obtained common image together with the rendering data using the communication circuit.
  • the plurality of external electronic devices can render the common image and the rendering data.
  • the plurality of external electronic devices can obtain a screen to be displayed on the display based on the rendering of the common image and the rendering data.
  • the electronic device can transmit resource information together with rendering data to the plurality of external electronic devices using the communication circuit.
  • the plurality of external electronic devices can identify elements constituting the rendering data based on receiving the resource information.
  • the elements can include layers for constituting a virtual object or screen included in at least a portion of the virtual space.
  • the plurality of external electronic devices can display a screen corresponding to a viewpoint on a display.
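  • As an illustrative, non-limiting aid, the following minimal Kotlin sketch summarizes the flow of FIG. 6 under simplified assumptions: a common portion is rendered once and reused for all external electronic devices, while viewpoint-specific rendering data is prepared per device. The data shapes are hypothetical:

```kotlin
// Hypothetical end-to-end sketch: one common image shared by all devices,
// plus per-device rendering data derived from each device's viewpoint.
data class Viewpoint(val yawDeg: Float)
data class ScreenPlan(val deviceId: String, val commonImage: String, val renderingData: String)

fun identifyViewpoint(inputInfo: Map<String, Float>) = Viewpoint(inputInfo.getValue("yaw"))

fun planScreens(inputs: Map<String, Map<String, Float>>): List<ScreenPlan> {
    // Render the portion shared by all viewpoints once (e.g., a background layer).
    val commonImage = "background rendered once"
    return inputs.map { (deviceId, inputInfo) ->
        val vp = identifyViewpoint(inputInfo)
        // The viewpoint-specific remainder is sent as rendering data for the device to render.
        ScreenPlan(deviceId, commonImage, renderingData = "objects visible at yaw=${vp.yawDeg}")
    }
}

fun main() {
    val inputs = mapOf(
        "wearable-1" to mapOf("yaw" to 10f),
        "wearable-2" to mapOf("yaw" to 95f),
    )
    planScreens(inputs).forEach { println(it) }
}
```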
  • FIG. 7 illustrates an example of an operation for a wearable device to establish a communication link with an electronic device according to one embodiment.
  • the electronic device (101) of FIG. 7 may include the electronic device (101) of FIG. 1.
  • the wearable device (103) of FIG. 7 may include the wearable device (103) of FIG. 1.
  • an electronic device (101) may receive an input indicating execution of an XR application (442-1) of FIG. 4b.
  • the electronic device (101) may receive an XR service from an external server providing an XR service based on the execution of the XR application.
  • the electronic device (101) may provide the XR service to a user by linking with a wearable device (103).
  • the electronic device (101) may display a screen (710) based on the execution of the XR application (442-1) on the display (420-1).
  • the wearable device (103) can identify a screen (710) displayed on the display (420-1) of the electronic device (101) by using a camera (e.g., the camera (425) of FIG. 4A).
  • the wearable device (103) can identify that the electronic device (101) has initiated access to the virtual space based on identifying the screen (710).
  • the present invention is not limited thereto.
  • the wearable device (103) can identify that the electronic device (101) has initiated access to the virtual space based on receiving an audio signal indicating execution of an XR application (442-1) by using a microphone.
  • the wearable device (103) can identify an input indicating initiation of execution of an XR application (e.g., the XR application (442) of FIG. 4A) included in the wearable device (103).
  • the wearable device (103) may initiate execution of the pass-through library (453) of FIG. 4A based on identifying the screen (710).
  • the wearable device (103) may display at least a portion of an image acquired through the camera by overlapping it on a portion (735) of a display area (730) of the display based on the execution of the pass-through library (453).
  • the portion (735) may be referred to as a pass-through area.
  • At least a portion of the image may include a visual object (101-1) corresponding to the electronic device (101).
  • the screen (710) may include a visual object (720) for establishing a connection with the wearable device (103).
  • the visual object (720) may include identification information of the electronic device (101) to be used for establishing a communication link with the electronic device (101).
  • the wearable device (103) may identify a visual object (720-1) corresponding to the visual object (720) included in at least a portion of the image.
  • the wearable device (103) may display an image acquired through a camera on a display area (730) of the display.
  • the wearable device (103) may identify the visual object (720-1) based on object recognition through the image.
  • the present invention is not limited thereto.
  • the wearable device (103) may transmit a signal to the electronic device (101) requesting establishment of a connection based on identifying a visual object (720-1).
  • the wearable device (103) may initiate execution of an XR application based on establishing a connection with the electronic device (101).
  • the wearable device (103) may request a screen representing at least a portion of a virtual space from the electronic device (101) based on the execution of the XR application.
  • FIG. 8 illustrates an example of rendering data transmitted by an electronic device to a wearable device according to one embodiment.
  • the electronic device (101) of FIG. 8 may include the electronic device (101) of FIG. 1.
  • the first wearable device (103-1) and the second wearable device (103-2) of FIG. 8 may be included in the wearable device (103) of FIG. 1.
  • An electronic device (101) can establish a connection with a plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) can provide the same virtual space service based on establishing a connection with the plurality of wearable devices (103-1, 103-2).
  • the plurality of wearable devices (103-1, 103-2) can access the same virtual space (e.g., the virtual space (100) of FIG. 1) based on establishing a connection with the electronic device (101).
  • An electronic device (101) may receive a request for a screen representing at least a portion of a virtual space from a plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) may receive input information of the plurality of wearable devices (103-1, 103-2) from each of the plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) may identify viewpoints of the plurality of wearable devices (103-1, 103-2).
  • the input information may include information acquired by each of the plurality of wearable devices (103-1, 103-2) based on identifying the user's head movement, hand gesture, and/or gaze using sensors (e.g., the sensor (430) of FIG. 4A).
  • the electronic device (101) may identify at least a portion of a virtual space corresponding to each of the viewpoints of the plurality of wearable devices (103-1, 103-2). A portion of the virtual space corresponding to the first wearable device (103-1) and another portion of the virtual space corresponding to the second wearable device (103-2) may be different.
  • the electronic device (101) can identify a part of the virtual space corresponding to the first wearable device (103-1) and another part of the virtual space corresponding to the second wearable device (103-2) using the display manager (550). Based on identifying the part of the virtual space and the other part of the virtual space, the electronic device (101) can identify screens to be transmitted to each of the plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) may identify the rendering processing speed of each of the plurality of wearable devices (103-1, 103-2) based on the execution of the division manager (560) of FIG. 5.
  • the electronic device (101) may identify a portion of a screen to be transmitted to each of the plurality of wearable devices (103-1, 103-2) based on the identification of the rendering processing speed of each of the plurality of wearable devices (103-1, 103-2).
  • the portion of the screen may include a portion of one or more layers for forming the screen and/or a portion of one or more virtual objects included in the screen.
  • the portion of the screen may include a portion of screens to be displayed on each of the plurality of displays of the wearable device.
  • the present invention is not limited thereto.
  • the electronic device (101) may obtain rendering data (810, 820) to be used for rendering the plurality of wearable devices (103-1, 103-2) based on the identification of screens (840, 850) to be transmitted to each of the plurality of wearable devices (103-1, 103-2) through the display manager (550).
  • the electronic device (101) may obtain first rendering data (810) to be transmitted to the first wearable device (103-1).
  • the electronic device (101) may obtain second rendering data (820) to be transmitted to the second wearable device (103-2).
  • the electronic device (101) may identify common portions (870-1, 870-2) that are overlapped and included in the screens (840, 850) based on identifying the screens (840, 850).
  • the electronic device (101) may perform rendering on the common portions (870-1, 870-2) through a renderer based on identifying the common portions (870-1, 870-2) and transmit the obtained common image (or common image layer) (830) together with the rendering data (810, 820) using a communication circuit.
  • the common portions (870-1, 870-2) may include a background image (or background image layer) (830-1, 830-2) included in each of the screens (840, 850).
  • Common portions (870-1, 870-2) may contain similar rendering data.
  • the electronic device (101) may identify the first rendering data (810) that the first wearable device (103-1) can process based on the identification of the rendering processing speed of the first wearable device (103-1).
  • the first rendering data (810) may correspond to the first rendering image (810-1).
  • the first rendering data (810) may correspond to another portion (860-1) of the screen (840).
  • the present invention is not limited thereto.
  • the electronic device (101) may render another portion (860-1) of the screen (840) through the renderer (540) of FIG. 5 to obtain a partial image.
  • the electronic device (101) may transmit the first rendering data (810) together with the partial image to the first wearable device (103-1).
  • the above partial image may correspond to an area of another part (860-1) of the screen (840).
  • the first rendering data (810) may correspond to another area of the other part (860-1).
  • the first rendering data (810) may include information about at least one layer for generating the other part (860-1).
  • the electronic device (101) may identify second rendering data (820) that the second wearable device (103-2) can process based on the identification of the rendering processing speed of the second wearable device (103-2).
  • the second rendering data (820) may be represented as a second rendering image (820-1).
  • the second rendering data (820) may correspond to another portion (860-2) of the screen (850).
  • the present invention is not limited thereto.
  • the electronic device (101) may render another portion (860-2) of the screen (850) through the renderer (540) to obtain a partial image.
  • the electronic device (101) may transmit the second rendering data (820) together with the partial image to the second wearable device (103-2).
  • the above partial image may correspond to an area of another portion (860-2) of the screen (850).
  • the second rendering data (820) may correspond to another area of the other portion (860-2).
  • the second rendering data (820) may include information about at least one layer for generating the other portion (860-2).
  • the first wearable device (103-1) may render the first rendering data (810) and the common image (830) through the renderer (540-1) based on receiving the first rendering data (810) and the common image (830) from the electronic device (101).
  • the first wearable device (103-1) may obtain a screen (840) based on rendering the first rendering data (810) and the common image (830).
  • the first wearable device (103-1) may display the obtained screen (840) on a display (e.g., the display (420) of FIG. 4A) through the screen configuration circuit (595) of FIG. 5.
  • the second wearable device (103-2) may render the second rendering data (820) and the common image (830) through the renderer (540-1) based on receiving the second rendering data (820) and the common image (830) from the electronic device (101).
  • the second wearable device (103-2) may obtain a screen (850) based on rendering the second rendering data (820) and the common image (830).
  • the second wearable device (103-2) may display the obtained screen (850) on a display (e.g., the display (420) of FIG. 4A) through the screen configuration circuit (595) of FIG. 5.
  • the electronic device (101) may identify a rendering processing speed of the first wearable device (103-1) that is relatively faster than the rendering processing speed of the second wearable device (103-2) based on the execution of the division manager (560) of FIG. 5.
  • the electronic device (101) may request the first wearable device (103-1) to obtain rendering data to be used for rendering the second wearable device (103-2).
  • the electronic device (101) may synchronize timing for data transmitted and received between the plurality of wearable devices (103-1, 103-2) through the synchronization manager. Based on the synchronized timing, the electronic device (101) may request the first wearable device (103-1) to obtain rendering data to be used for rendering the second wearable device (103-2).
  • the first wearable device (103-1) may, based on receiving the request, obtain rendering data to be used for rendering the second wearable device (103-2) through the renderer of the first wearable device (103-1).
  • the first wearable device (103-1) may transmit the rendering data to the second wearable device (103-2).
  • the present invention is not limited thereto.
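  • As an illustrative, non-limiting aid, a minimal Kotlin sketch of the delegation described above is given below: when one wearable device renders faster than another, the faster one may be asked to obtain rendering data on behalf of the slower one. The speed metric and identifiers are hypothetical:

```kotlin
// Hypothetical delegation rule: pick the fastest wearable to prepare rendering data
// for the slowest one; the request itself would be paced by the synchronization manager.
data class Wearable(val id: String, val renderSpeed: Int) // higher = faster

fun pickDelegate(devices: List<Wearable>): Pair<Wearable, Wearable>? {
    if (devices.size < 2) return null
    val fast = devices.maxByOrNull { it.renderSpeed }!!
    val slow = devices.minByOrNull { it.renderSpeed }!!
    return if (fast.renderSpeed > slow.renderSpeed) fast to slow else null
}

fun main() {
    val devices = listOf(Wearable("103-1", renderSpeed = 90), Wearable("103-2", renderSpeed = 45))
    pickDelegate(devices)?.let { (fast, slow) ->
        println("request ${fast.id} to obtain rendering data used for rendering of ${slow.id}")
    }
}
```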
  • the electronic device (101) can identify screens (840, 850) to be transmitted to the plurality of wearable devices (103-1, 103-2) while establishing a connection with the plurality of wearable devices.
  • the screens (840, 850) can correspond to viewpoints of each of the plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) can reduce the time for the plurality of wearable devices (103-1, 103-2) to render the screens.
  • FIG. 9 illustrates an example of layer information transmitted by an electronic device to a wearable device according to an embodiment.
  • the electronic device (101) of FIG. 9 may include the electronic device (101) of FIG. 1.
  • the plurality of wearable devices (103-1, 103-2) of FIG. 9 may be included in the wearable device (103) of FIG. 1.
  • FIG. 9 an example of an electronic device (101) and/or a plurality of wearable devices (103-1, 103-2) logged into a virtual space (100) according to an embodiment is illustrated.
  • An electronic device (101) can establish a communication link with a plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) can provide substantially similar virtual space services to the plurality of wearable devices (103-1, 103-2) based on establishing a communication link with the plurality of wearable devices (103-1, 103-2).
  • the plurality of wearable devices (103-1, 103-2) can log in to the same virtual space (100) based on establishing a connection with the electronic device (101).
  • the electronic device (101) connected to the plurality of wearable devices (103-1, 103-2) logged in to the virtual space (100) can share location information of each of the plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) can identify viewpoints (110, 120) of each of the plurality of wearable devices (103-1, 103-2) based on location information and input information received from the plurality of wearable devices (103-1, 103-2) and based on execution of the tracking identifier.
  • the second wearable device (103-2) may identify the six-degree-of-freedom posture of the second wearable device (103-2) from the movement of the second user (106) wearing the second wearable device (103-2) by using one or more programs included in the recognition service layer (470) of FIG. 4A.
  • the second wearable device (103-2) may transmit input information corresponding to the six-degree-of-freedom posture to the electronic device (101).
  • the input information may include an input for controlling at least a portion of a virtual space.
  • An electronic device (101) can transmit screens (115, 125) corresponding to viewpoints (110, 120) of each of a plurality of wearable devices (103-1, 103-2) to each of a plurality of wearable devices (103-1, 103-2).
  • the electronic device (101) can transmit the screen (115) to the first wearable device (103-1).
  • the first wearable device (103-1) can bypass rendering and display the screen (115) on the display after receiving the screen (115) from the electronic device (101).
  • the present invention is not limited thereto.
  • the electronic device (101) may identify rendering data to be used for rendering the second wearable device (103-2) based on the rendering processing speed of the second wearable device (103-2). In order to improve the rendering processing speed of the second wearable device (103-2), the electronic device (101) may render a part of the rendering data for the screen (125). The electronic device (101) may process one or more layers (920-2) through the layer manager to render a part of the rendering data for the screen (125). The electronic device (101) may render one or more layers (920-2) through the renderer. The electronic device (101) can transmit rendering data corresponding to another layer (910-1) that is distinct from one or more layers (920-2) among the layers constituting the screen (125) to the second wearable device (103-2).
  • FIG. 10 illustrates an example of an operation in which an electronic device and an external electronic device transmit data to a wearable device according to one embodiment.
  • the electronic device (101) of FIG. 10 may include the electronic device (101) of FIG. 1.
  • the wearable device (103) of FIG. 10 may include the wearable device (103) of FIG. 1.
  • an external electronic device (1001) may include a personal computer (PC) such as a laptop and a desktop, a smartphone, a smartpad, a tablet PC, a smartwatch, and/or a head-mounted device (HMD).
  • An electronic device (101) can establish a connection with a wearable device (103) and an external electronic device (1001).
  • the wearable device (103) can establish a connection with the electronic device (101) and the external electronic device (1001).
  • the electronic device (101) can identify rendering data for a screen (1040) to be transmitted to the wearable device (103).
  • the electronic device (101) can identify a rendering processing speed of the external electronic device (1001) and/or a rendering processing speed of the wearable device (103) through a display manager (e.g., the display manager (550) of FIG. 5).
  • the electronic device (101) can divide the rendering data for the screen (1040) based on identifying the rendering processing speed of the external electronic device (1001) and/or the rendering processing speed of the wearable device (103).
  • the electronic device (101) can perform rendering on a first portion of the rendering data.
  • the first portion can include at least one virtual object included in the screen (1040).
  • the first portion can include at least one layer for configuring the screen (1040).
  • the first portion can correspond to a background image (1010) included in the screen (1040).
  • the present invention is not limited thereto.
  • the electronic device (101) may request the external electronic device (1001) to perform rendering for a second portion of the rendering data.
  • the second portion may correspond to a first virtual object (1020) included in the screen (1040).
  • the external electronic device (1001) may render the first virtual object (1020) based on receiving the request.
  • the electronic device (101) may request the wearable device (103) to perform rendering for a third portion of the rendering data.
  • the third portion may correspond to a second virtual object (1030) included in the screen (1040).
  • the second virtual object (1030) may include a virtual object (e.g., a user's hand or an avatar) or a user interface (e.g., a system UI) for performing interaction with a user wearing the wearable device (103).
  • the wearable device (103) may render the second virtual object (1030) to be displayed on the screen (1040).
  • an electronic device (101) can transmit a background image (1010) to a wearable device (103).
  • the external electronic device (1001) can transmit the first virtual object (1020) to the wearable device (103).
  • the wearable device (103) can render the background image (1010), the first virtual object (1020), and the second virtual object (1030) through a renderer of the wearable device (103).
  • the wearable device (103) can synthesize the background image (1010), the first virtual object (1020), and the second virtual object (1030) through the screen composition circuit (595) of FIG. 5 based on the rendering of the background image (1010), the first virtual object (1020), and the second virtual object (1030).
  • the wearable device (103) can display a screen (1040) in which the background image (1010), the first virtual object (1020), and the second virtual object (1030) are synthesized on the display.
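  • As an illustrative, non-limiting aid, the following minimal Kotlin sketch shows one possible assignment rule for the three-way split of FIG. 10: the background to the electronic device (101), a non-interactive virtual object to the external electronic device (1001), and the interaction-related virtual object to the wearable device (103). The rule and identifiers are hypothetical:

```kotlin
// Hypothetical three-way assignment of rendering work for one screen.
enum class RenderTarget { ELECTRONIC_DEVICE, EXTERNAL_DEVICE, WEARABLE }

data class Portion(val name: String, val interactive: Boolean, val isBackground: Boolean)

fun assign(portion: Portion): RenderTarget = when {
    portion.isBackground -> RenderTarget.ELECTRONIC_DEVICE // first portion (background image)
    portion.interactive  -> RenderTarget.WEARABLE          // third portion (hand, system UI)
    else                 -> RenderTarget.EXTERNAL_DEVICE   // second portion (first virtual object)
}

fun main() {
    val portions = listOf(
        Portion("background image", interactive = false, isBackground = true),
        Portion("first virtual object", interactive = false, isBackground = false),
        Portion("second virtual object (hand/UI)", interactive = true, isBackground = false),
    )
    portions.forEach { println("${it.name} -> ${assign(it)}") }
    // The wearable then composites all three rendered parts into the final screen.
}
```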
  • the electronic device (101) may divide rendering data corresponding to the screen (1040) in order to render the screen (1040) to be displayed on the display of the wearable device (103).
  • the electronic device (101) may assign the divided rendering data to the external electronic device (1001) and/or the wearable device (103). Based on the division of the rendering data, the electronic device (101) may reduce the amount of rendering data to be processed by the wearable device (103).
  • the electronic device (101) may reduce the time to display the screen on the display of the wearable device (103) by reducing the resources for the wearable device (103) to perform rendering.
  • FIG. 11 illustrates an example of a flowchart illustrating operations of a wearable device according to one embodiment.
  • the wearable device of FIG. 11 may include the wearable device (103) of FIG. 1. At least one of the operations of FIG. 11 may be performed by the wearable device (103) of FIG. 1. At least one of the operations of FIG. 11 may be controlled by the processor (410) of FIG. 4A.
  • Each of the operations of FIG. 11 may be performed sequentially, but is not necessarily performed sequentially. For example, the order of each of the operations may be changed, and at least two operations may be performed in parallel.
  • a wearable device may transmit a request to display at least a portion of a three-dimensional virtual space (e.g., the virtual space (100) of FIG. 1) to an electronic device (e.g., the electronic device (101) of FIG. 1) by using a communication circuit (e.g., the communication circuit (435) of FIG. 4A) while being connected to the electronic device.
  • a wearable device may identify input information indicating a posture of the wearable device using a sensor (e.g., the sensor (430) of FIG. 4A).
  • the input information may include a user input for controlling at least a portion of a virtual space.
  • the wearable device may obtain the input information based on identifying a user's head movement, hand gesture, and/or gaze using the sensor.
  • a wearable device may transmit input information to an electronic device using a communication circuit.
  • the electronic device may identify a viewpoint of the wearable device (e.g., viewpoint (110) of FIG. 1) using the tracking identifier (570) of FIG. 5 based on receiving the input information.
  • the electronic device may identify at least a portion of a virtual space corresponding to the viewpoint of the wearable device. At least a portion of the virtual space may be included in a field of view (FoV) of the wearable device.
  • a wearable device may receive, from an electronic device, rendering data for a screen corresponding to a viewpoint of the wearable device and a partial image representing at least a portion of the screen.
  • the electronic device may render a partial image representing at least a portion of the screen based on identifying the screen.
  • the electronic device may obtain rendering data to be used for rendering the wearable device through a renderer (e.g., renderer (540) of FIG. 5).
  • the present invention is not limited thereto.
  • a wearable device may obtain a screen to be displayed on a display (e.g., the display (420) of FIG. 4A) by using rendering data and a partial image.
  • the wearable device may render the rendering data by using a renderer (e.g., renderer (540-1) of FIG. 5).
  • the wearable device may obtain a screen by synthesizing the rendered rendering data and the partial image.
  • the wearable device may display the screen on the display. By dividing the generation of the screen between them, the electronic device and the wearable device may provide a virtual space service to the user more quickly.
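  • The composition step on the wearable device could look like the following sketch, in which a screen is modeled as a flat array of ARGB pixels and the wearable device's locally rendered pixels are overlaid on the partial image received from the electronic device; the pixel model and names are assumptions made only for illustration.

```kotlin
// Hypothetical compositing on the wearable device: keep locally rendered
// pixels where they exist, otherwise use the received partial image.
data class Frame(val width: Int, val height: Int, val pixels: IntArray)

const val TRANSPARENT = 0x00000000

fun composite(receivedPartialImage: Frame, locallyRendered: Frame): Frame {
    require(receivedPartialImage.width == locallyRendered.width &&
            receivedPartialImage.height == locallyRendered.height)
    val out = IntArray(receivedPartialImage.pixels.size) { i ->
        val local = locallyRendered.pixels[i]
        if (local != TRANSPARENT) local else receivedPartialImage.pixels[i]
    }
    return Frame(receivedPartialImage.width, receivedPartialImage.height, out)
}

fun main() {
    val received = Frame(2, 1, intArrayOf(0xFF0000FF.toInt(), 0xFF00FF00.toInt()))
    val local = Frame(2, 1, intArrayOf(TRANSPARENT, 0xFFFF0000.toInt()))
    val screen = composite(received, local)
    println(screen.pixels.joinToString { Integer.toHexString(it) })  // ff0000ff, ffff0000
}
```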
  • the electronic device may include a communication circuit (435-1), a memory (415-1) including instructions, and a processor (410-1).
  • the processor may be configured to receive, when the instructions are executed, a request for displaying at least a portion of a three-dimensional virtual space (100) from a plurality of external electronic devices (103-1, 103-2) connected through the communication circuit.
  • the processor may be configured to identify, by a tracking identifier (570), viewpoints (110, 120) of the plurality of external electronic devices using input information received from the plurality of external electronic devices through the communication circuit when the instructions are executed.
  • the processor may be configured to identify, through the display manager (550), screens (115, 125) representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints when the instructions are executed.
  • the processor may be configured to determine, through the rendering manager (510), resource information for obtaining at least a portion of each of the screens when the instructions are executed.
  • the processor may be configured to, when the instructions are executed, transmit, through the communication circuit, rendering data (810, 820) representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, based on the determined resource information, to each of the plurality of external electronic devices.
  • the processor may be configured to identify, when the instructions are executed, a common portion (870-1, 870-2) that is overlapped and included in the screens corresponding to the viewpoints of each of the plurality of external electronic devices through the display manager.
  • the processor may be configured to, when the instructions are executed, perform rendering on the common portion through the renderer (540) and transmit, using the communication circuit, a common image (830) obtained by the rendering, together with the rendering data.
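  • As a simplified illustration, the common portion shared by two screens could be approximated as the intersection of the two screen regions; the rectangle model below is an assumption made for the sketch and does not describe the disclosed display manager.

```kotlin
// Hypothetical sketch: the common portion of two screens approximated as
// the intersection of two rectangular regions in a shared coordinate space.
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    val isEmpty get() = left >= right || top >= bottom
}

fun commonPortion(a: Region, b: Region): Region? {
    val overlap = Region(
        maxOf(a.left, b.left), maxOf(a.top, b.top),
        minOf(a.right, b.right), minOf(a.bottom, b.bottom)
    )
    return if (overlap.isEmpty) null else overlap
}

fun main() {
    val screen1 = Region(0, 0, 1920, 1080)       // region seen by one external electronic device
    val screen2 = Region(1200, 200, 3000, 1300)  // region seen by another external electronic device
    // The overlap would be rendered once as a common image and sent to both devices.
    println(commonPortion(screen1, screen2))     // Region(left=1200, top=200, right=1920, bottom=1080)
}
```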
  • the processor may be configured to identify, through the division manager (560), at least a portion of each of the screens based on the performance of each of the plurality of external electronic devices when the instructions are executed.
  • the processor may be configured to, when the instructions are executed, render, through the renderer, another portion distinct from the at least a portion, and transmit a partial image obtained by rendering the other portion, together with the rendering data, to each of the plurality of external electronic devices.
  • the processor may be configured to identify, through the layer manager (520), one or more layers (910, 920) corresponding to at least a portion of each of the screens, when the instructions are executed, if the resource information is determined as first resource information related to a layer forming each of the screens through the rendering manager.
  • the processor may be configured to change, through the renderer, the identified one or more layers into the rendering data when the instructions are executed.
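  • A sketch of how the determined resource information could steer what becomes rendering data is given below: layers for the first resource information and virtual objects for the second resource information. The types and the mapping are illustrative assumptions only and do not reproduce the disclosed managers.

```kotlin
// Hypothetical sketch of branching on the rendering manager's resource information.
sealed interface ResourceInfo
object LayerBased : ResourceInfo   // "first resource information" (layer-related)
object ObjectBased : ResourceInfo  // "second resource information" (virtual-object-related)

data class Layer(val id: Int, val content: String)
data class VirtualObject(val id: Int, val mesh: String)
data class RenderingData(val description: String)

fun toRenderingData(
    info: ResourceInfo,
    layers: List<Layer>,
    objects: List<VirtualObject>
): List<RenderingData> = when (info) {
    is LayerBased -> layers.map { RenderingData("layer ${it.id}: ${it.content}") }
    is ObjectBased -> objects.map { RenderingData("object ${it.id}: ${it.mesh}") }
}

fun main() {
    val layers = listOf(Layer(910, "UI overlay"), Layer(920, "background"))
    val objects = listOf(VirtualObject(1, "avatar"), VirtualObject(2, "table"))
    println(toRenderingData(LayerBased, layers, objects))
    println(toRenderingData(ObjectBased, layers, objects))
}
```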
  • the processor may be configured to transmit the resource information, together with the rendering data, to the plurality of external electronic devices using the communication circuit when the instructions are executed.
  • the processor may be configured to synchronize timing for transmitting and receiving data between the plurality of external electronic devices and the electronic device, through the synchronization manager (580), when the instructions are executed.
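  • One simple way to picture the synchronization is sketched below: each external electronic device reports when its last frame was ready, and the next transmission is aligned to a common frame boundary after the slowest device. The timing model is an assumption for illustration, not the disclosed synchronization manager.

```kotlin
// Hypothetical sketch: align the next transmission to a frame boundary
// that every external electronic device can meet.
data class DeviceTiming(val deviceId: String, val lastFrameReadyMs: Long)

fun nextTransmissionDeadline(timings: List<DeviceTiming>, framePeriodMs: Long): Long {
    // Wait for the slowest device, then move to the next frame boundary.
    val slowest = timings.maxOf { it.lastFrameReadyMs }
    return slowest + framePeriodMs - (slowest % framePeriodMs)
}

fun main() {
    val timings = listOf(
        DeviceTiming("wearable-103-1", lastFrameReadyMs = 1_000_013),
        DeviceTiming("wearable-103-2", lastFrameReadyMs = 1_000_021)
    )
    // With an ~11 ms period (about 90 Hz), both devices receive data at the same deadline.
    println(nextTransmissionDeadline(timings, framePeriodMs = 11))
}
```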
  • the input information may include at least one of the positions or the postures of the plurality of external electronic devices.
  • the one or more programs may be configured to include instructions that, when executed by the processor (410-1) of the electronic device (101), cause the electronic device to receive a request for displaying at least a portion of a three-dimensional virtual space (100) from a plurality of external electronic devices (103-1, 103-2) connected through the communication circuit (435-1).
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to identify, through a tracking identifier (570), viewpoints (110, 120) of the plurality of external electronic devices by using input information received from the plurality of external electronic devices through the communication circuit.
  • the one or more programs may be configured to include instructions that cause the electronic device to identify, through the display manager (550), screens (115, 125) representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints.
  • the one or more programs may be configured to include instructions that cause the electronic device, when executed by the processor of the electronic device, to determine, through the rendering manager (510), resource information for obtaining at least a portion of each of the screens.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to transmit, using the communication circuit, rendering data (810, 820) representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, to each of the plurality of external electronic devices, based on the determined resource information.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to identify, through the display manager, a common portion (870-1, 870-2) that is overlapped and included within the screens corresponding to the viewpoints of each of the plurality of external electronic devices.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to transmit, through the communication circuit, a common image (830) obtained by performing rendering on the common portion through the renderer (540), together with the rendering data.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device, through the division manager (560), to identify at least a portion of each of the screens based on a capability of each of the plurality of external electronic devices.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device, through the renderer, to render another portion distinct from the at least a portion, and to transmit a partial image obtained by rendering the other portion, together with the rendering data, to each of the plurality of external electronic devices.
  • the one or more programs may be configured to include instructions that cause the electronic device, when executed by the processor of the electronic device, to identify, through the layer manager (520), one or more layers (910, 920) corresponding to at least a portion of each of the screens, if the resource information is determined through the rendering manager as first resource information related to a layer forming each of the screens.
  • the one or more programs may be configured to include instructions that cause the electronic device, when executed by the processor of the electronic device, to change the identified one or more layers into the rendering data, through the renderer.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to identify, through the object manager (530), one or more virtual objects included in each of the screens, if the resource information is determined as second resource information related to a virtual object included in each of the screens through the rendering manager.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device to change, through the renderer, the identified one or more virtual objects into rendering data.
  • the one or more programs may be configured to include instructions that, when executed by the processor of the electronic device, cause the electronic device, through the synchronization manager (580), to synchronize timing for transmitting and receiving data between the plurality of external electronic devices and the electronic device.
  • the method may include an operation of receiving a request for displaying at least a portion of a three-dimensional virtual space (100) from a plurality of external electronic devices (103-1, 103-2) connected through a communication circuit (435-1).
  • the method may include an operation of identifying, through a tracking identifier (570), viewpoints (110, 120) of the plurality of external electronic devices by using input information received from the plurality of external electronic devices through the communication circuit.
  • the method may include an operation of identifying, through a display manager (550), screens (115, 125) representing at least a portion of the three-dimensional virtual space corresponding to the viewpoints.
  • the method may include an operation of determining, through a rendering manager (510), resource information for obtaining at least a portion of each of the screens.
  • the method may include an operation of transmitting, to each of the plurality of external electronic devices, rendering data (810, 820) representing at least a portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, using the communication circuit, based on the determined resource information.
  • the method may include an operation of identifying, through the division manager (560), at least a portion of each of the screens based on the performance of each of the plurality of external electronic devices.
  • the method may include an operation of transmitting, to each of the plurality of external electronic devices, a partial image obtained by rendering, through the renderer, another portion that is distinct from the at least a portion, together with the rendering data.
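  • The division based on device performance could be pictured as splitting a screen region into the portion the external electronic device renders itself and the remaining portion that is rendered into a partial image, as in the sketch below; the proportional split and all names are assumptions made for illustration only.

```kotlin
// Hypothetical sketch: split one screen into a portion rendered by the
// external electronic device and a portion rendered by the host into a partial image.
data class ScreenRegion(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun splitScreen(
    screen: ScreenRegion,
    devicePerformance: Double  // 0.0..1.0 share of the total rendering capability
): Pair<ScreenRegion, ScreenRegion> {
    val splitX = screen.left +
        ((screen.right - screen.left) * devicePerformance.coerceIn(0.0, 1.0)).toInt()
    val devicePortion = ScreenRegion(screen.left, screen.top, splitX, screen.bottom)
    val hostPortion = ScreenRegion(splitX, screen.top, screen.right, screen.bottom)
    return devicePortion to hostPortion
}

fun main() {
    val screen = ScreenRegion(0, 0, 2000, 1000)
    val (devicePortion, hostPortion) = splitScreen(screen, devicePerformance = 0.25)
    println("external electronic device renders $devicePortion")  // left quarter of the screen
    println("host renders $hostPortion and sends it as a partial image")
}
```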
  • the method may include an operation of identifying one or more virtual objects included in each of the screens through the object manager (530) when the resource information is determined as second resource information related to virtual objects included in each of the screens through the rendering manager.
  • the method may include an operation of changing the identified one or more virtual objects into rendering data through the renderer.
  • although the processing device is sometimes described as being used singly, those skilled in the art will appreciate that the processing device may include multiple processing elements and/or multiple types of processing elements.
  • the processing device may include multiple processors, or a processor and a controller.
  • Other processing configurations, such as parallel processors, are also possible.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, which may configure a processing device to perform a desired operation or may independently or collectively command the processing device.
  • the software and/or data may be embodied in any type of machine, component, physical device, computer storage medium, or device for interpretation by the processing device or for providing instructions or data to the processing device.
  • the software may be distributed over network-connected computer systems and stored or executed in a distributed manner.
  • the software and data may be stored on one or more computer-readable recording media.
  • the method according to the embodiment may be implemented in the form of program commands that can be executed through various computer means and recorded on a computer-readable medium.
  • the medium may be one that continuously stores a program executable by a computer, or one that temporarily stores it for execution or downloading.
  • the medium may be various recording means or storage means in the form of a single or multiple hardware combinations, and is not limited to a medium directly connected to a computer system, and may also be distributed on a network. Examples of the medium may include magnetic media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as CD-ROMs and DVDs, magneto-optical media such as floptical disks, and ROMs, RAMs, flash memories, etc., configured to store program commands.
  • examples of other media may include recording media or storage media managed by app stores that distribute applications, sites that supply or distribute various software, servers, etc.

Abstract

An electronic device may identify viewpoints of a plurality of external electronic devices through a tracking identifier by using input information received from the plurality of external electronic devices through a communication circuit. The electronic device may identify, through a display manager, screens representing at least a portion of a three-dimensional virtual space corresponding to the viewpoints. The electronic device may determine, through a rendering manager, resource information for obtaining at least a portion of each of the screens. The electronic device may use the communication circuit to transmit rendering data, representing at least the portion of each of the screens to be used for rendering by each of the plurality of external electronic devices, to each of the plurality of external electronic devices based on the resource information that is determined.
PCT/KR2024/003104 2023-03-31 2024-03-11 Electronic device, method, and non-transitory computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device Pending WO2024205075A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR10-2023-0043189 2023-03-31
KR20230043189 2023-03-31
KR20230084756 2023-06-30
KR10-2023-0084756 2023-06-30
KR10-2023-0121072 2023-09-12
KR1020230121072A KR20240147402A (ko) 2023-03-31 2023-09-12 Electronic device, method, and computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device

Publications (1)

Publication Number Publication Date
WO2024205075A1 true WO2024205075A1 (fr) 2024-10-03

Family

ID=92907091

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2024/003104 Pending WO2024205075A1 (fr) Electronic device, method, and non-transitory computer-readable storage medium for transmitting rendering data for generating a screen to an external electronic device

Country Status (1)

Country Link
WO (1) WO2024205075A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210124500A (ko) * 2014-07-25 2021-10-14 Microsoft Technology Licensing, LLC Three-dimensional mixed-reality viewport
KR101756792B1 (ko) * 2015-07-28 2017-07-11 Daegu Gyeongbuk Institute of Science and Technology HMD-type virtual reality content monitoring and control system
JP2018113616A (ja) * 2017-01-12 2018-07-19 Sony Corporation Information processing device, information processing method, and program
JP2020524450A (ja) * 2017-06-29 2020-08-13 4DReplay Korea, Inc. Transmission system for multi-channel video, control method therefor, and multi-channel video playback method and device therefor
JP2022525412A (ja) * 2019-03-15 2022-05-13 Sony Interactive Entertainment Inc. Crossover of virtual characters between realities

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24781066

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE