WO2025153161A1 - Sharing virtual representation of a user by an augmented reality display - Google Patents

Sharing virtual representation of a user by an augmented reality display

Info

Publication number
WO2025153161A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
overlay
virtual object
rendering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2024/050769
Other languages
French (fr)
Inventor
Peter ÖKVIST
Tommy Arngren
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget LM Ericsson AB filed Critical Telefonaktiebolaget LM Ericsson AB
Priority to PCT/EP2024/050769 priority Critical patent/WO2025153161A1/en
Publication of WO2025153161A1 publication Critical patent/WO2025153161A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Definitions

  • the restrictions are any of: mandatory, or voluntary (such as preferred rendering attributes), or any level therebetween.
  • the rendering attributes could pertain to the preferred exclusion of the audio and/or visual representation of the first user being rendered adjacent to, nearby, or associated with, e.g., "hot words" exclamations, cultural symbols, specific individuals, animals, itemized objects (e.g., chair, loo, bathtubs, certain furniture, etc.), certain body postures (lying down (belly down/up), kneeling, bowing, etc.), spatial orientation ("upside down"), etc.
  • the rendering attributes could pertain to contextual information, such as combinations of the above in respect of, e.g., AR communication sessions with friends, family, corporate, or business contacts, where the contextual information might be derived, e.g., from calendar entries, schedules, social network entries, contact entries, device, photo streams, meeting e-mail address, voice, face recognition, etc.
  • the rendering attributes could pertain to combinations of preferred spatial locations with respect to the rendered environment, preferred spatial location relative to at least one other participant, preferred exclusion of rendering relative to at least one other participant, contextual information, etc.
  • the rendering considers both the context of the first user (i.e., the user whose audio and/or visual representation is to be rendered) and the second user (i.e., the user at which the audio and/or visual representation of the first user is rendered). Both the first user and the second user might therefore have pre-defined policies, in terms of rendering attributes, that may be used for rendering adaptations.
  • contextual data (e.g., inertial measurement unit (IMU) data) provided by the second user can be used to select appropriate rendering attributes that are forwarded to the second user. Examples of such contextual data and the resulting rendering attributes are disclosed in Table 1, below.
  • Figure 6 illustrates a system that provides an AR environment for a first user wearing HMD 200a, a second user wearing HMD 200b, and a third user wearing HMD 200c.
  • the example HMDs 200a-c have see-through display screens 240a-c through which the respective users can view the other users in the AR environment with various computer-generated virtual objects that are displayed at locations which are determined (controlled) to overlay the virtual objects on the viewed users.
  • the users are informed how they are being viewed in the AR environment by other users.
  • the first user at a first time Ta gazes toward the second user who is illustrated as object 604 viewed by the first user through the display 240a of HMD 200a.
  • the viewed second user is augmented in the AR environment by the HMD 200a (worn by first user) displaying a virtual object (e.g., hat) 606 positioned on the second user's head.
  • the HMDs 200a-c may operate to provide a picture-in-picture display region 602a-c, respectively, in the displays 240a-c where a notification to the user can be displayed to indicate a characteristic of a virtual object which is being viewed by another user as a virtual overlay on the user through another one of the HMDs.
  • the HMD 200a includes a camera that captures video of the second user, which is used to register the location (orientation) of the HMD 200a relative to the second user, and then to determine where a virtual object 606 (e.g., the illustrated example hat) is to be displayed on display 240a of HMD 200a so that it is viewed by the first user as being overlaid on the second user's head.
  • the second user wearing HMD 200b is correspondingly notified of the overlaid virtual object through an indication 608 displayed in the area 602b of HMD 200b worn by the second user.
  • the notification can indicate a characteristic of the virtual object 606 (e.g., hat) which is being overlaid on the second user's head through the display 240a of HMD 200a worn by the first user.
  • operations can receive an image 604 (e.g., a single image or frame of a video stream) of the second user from the camera of HMD 200a, generate a composite image 608 showing the virtual object 606 (e.g., hat) overlaid on the image of the second user, and send the composite image to the HMD 200b worn by the second user for display in the area 602b of display 240b (a minimal sketch of this flow follows this list).
  • the second user can view how the second user is being virtually augmented in the view of the first user by the overlaid virtual object 606 (e.g., hat).
  • the first user at a later time Tb rotates to gaze toward the third user who is illustrated as object 704 viewed by the first user through the display 240a of HMD 200a.
  • the viewed third user is augmented in the AR environment by the HMD 200a (worn by the first user) displaying a virtual object 706 (e.g., antlers) positioned on the third user's head.
  • the third user wearing HMD 200c is correspondingly notified of the overlaid virtual object 706 (e.g., antlers) through an indication 708 displayed in the area 602c of HMD 200c worn by the third user.
  • the notification can indicate a characteristic of the virtual object 706 (e.g., antlers) which is being overlaid on the third user's head through the display 240a of HMD 200a worn by the first user.
  • operations can receive an image 704 (e.g., a single image or frame of a video stream) of the third user from the camera of HMD 200a, generate a composite image 708 showing the virtual object 706 (e.g., antlers) overlaid on the image of the third user, and send the composite image to the HMD 200c worn by the third user for display in the area 602c of display 240c.
  • the third user can view how the third user is being virtually augmented in the view of the first user by the overlaid virtual object 706 (e.g., antlers).
  • what characteristics of the virtual object 706 are shared with the third user and when sharing is initiated or ceased can be controlled by an overlay sharing policy in accordance with various optional embodiments. Different overlay sharing policies can be applied to notifications for the second user versus the third user and versus the first user.
  • AR displays have been illustrated in a non-limiting example manner as HMDs, but may alternatively be any other type of display, such as smartphones, tablet computers, laptop computers, etc. Accordingly, the term AR display is used in the discussions of further embodiments below.
  • Figure 8 illustrates a flowchart of operations by a first AR display 200a, AR rendering device 800, and a second AR display 200b according to some embodiments.
  • the AR rendering device 800 may be separate from the first and second AR displays (e.g., a computing server networked to the AR displays) or may be integrated into one or both of the first and second AR displays 200a, 200b.
  • the AR rendering device 800 includes at least one processor circuit 1250 (“processor”) and at least one memory circuit 1252 (“memory”) storing instructions executable by the processor 1250 to perform operations according to one or more embodiments disclosed herein.
  • the AR rendering device 800 can include a network interface 1256 operable to communicate through the one or more networks 1270 with the one or more AR displays 200a-c.
  • the AR rendering device 800 may further include a display 1254.
  • the AR displays 200a-c each include at least one processor circuit 1200 (“processor”) and at least one memory circuit 1210 (“memory”) storing instructions executable by the processor 1200 to perform operations.
  • Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
  • These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
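By way of non-limiting illustration only, the composite-image notification flow sketched above for Figures 6 and 7 (camera image 604, composite 608 shown in area 602b) could be realized roughly as follows. The sketch assumes the Pillow imaging library; the helper names and coordinates are invented and do not appear in the disclosure:

```python
# Non-limiting sketch of the Figure 6/7 notification flow; names invented.
from PIL import Image

def make_composite(captured_frame: Image.Image, overlay: Image.Image,
                   anchor_xy: tuple) -> Image.Image:
    # Overlay the rendered virtual object (e.g., hat 606, as an RGBA image)
    # on the captured image of the viewed user, at the registered position.
    composite = captured_frame.copy()
    composite.paste(overlay, anchor_xy, overlay)  # third argument: alpha mask
    return composite

# Hypothetical usage, assuming frames and a transport to the viewed user's HMD:
# frame = <image 604 of the second user, from HMD 200a's camera>
# hat = <rendered virtual object 606, RGBA>
# composite = make_composite(frame, hat, anchor_xy=(120, 30))
# send_to_display("HMD 200b", region="602b", image=composite)  # invented helper
```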

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method is performed by an augmented reality (AR) rendering device for rendering virtual objects displayed by a first AR display as overlays viewed by a first user. The method includes rendering a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display. The method obtains information associated with the second user, where the second user is transporting a second AR display, and determines an overlay sharing policy based on the information associated with the second user. The method determines based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user. The method sends for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.

Description

SHARING VIRTUAL REPRESENTATION OF A USER BY AN AUGMENTED REALITY DISPLAY
TECHNICAL FIELD
[0001] The present disclosure relates to rendering computer-generated virtual objects as overlays on users viewed through augmented reality (AR) displays of an AR system.
BACKGROUND
[0002] In general terms, augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are augmented (enhanced) by computer-generated virtual objects and other perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. An AR visual experience can be created by a system augmenting a user's view of real-world objects with computer-generated virtual objects that are registered in three-dimensional (3D) space to the real-world objects and can be displayed overlaid on them. The overlaid objects can be constructive (i.e., additive to the real-world objects) or destructive (i.e., masking the real-world objects). AR displays, such as smart glasses or wearable computer glasses, can enable users to control how virtual objects are overlaid on real-world objects.
[0003] In multi-person AR communication sessions and other AR environments, there is a need for enabling users to control how they are viewed.
SUMMARY
[0004] Some embodiments disclosed herein are directed to a method performed by an AR rendering device for rendering virtual objects displayed by a first AR display as overlays viewed by a first user. The method includes rendering a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display. The method obtains information associated with the second user, where the second user is transporting a second AR display, and determines an overlay sharing policy based on the information associated with the second user. The method determines based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user. The method sends for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
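As a purely illustrative, non-limiting sketch of this method flow (not an implementation defined by the disclosure), the operations could be arranged as follows; every class, function, and field name here is invented for readability:

```python
# Illustrative sketch only: hypothetical names, not defined by the disclosure.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    kind: str    # e.g., "hat"
    anchor: str  # where the overlay is registered on the viewed user, e.g., "head"

@dataclass
class OverlaySharingPolicy:
    share_kind: bool = True  # may the object's type be disclosed to the viewed user?

    def select_characteristics(self, obj: VirtualObject) -> dict:
        # Determine which characteristic(s) of the virtual object to notify.
        return {"kind": obj.kind, "anchor": obj.anchor} if self.share_kind else {}

class ARRenderingDevice:
    def render_overlay(self, first_display: str, obj: VirtualObject, viewed_user: str):
        # Render the object for display by the first AR display as an overlay.
        print(f"{first_display}: rendering {obj.kind} on {viewed_user}'s {obj.anchor}")

    def determine_policy(self, viewed_user_info: dict) -> OverlaySharingPolicy:
        # The overlay sharing policy is based on information about the viewed user.
        return OverlaySharingPolicy(share_kind=viewed_user_info.get("allow_sharing", True))

    def send_notification(self, second_display: str, characteristics: dict):
        # Send the notification for display by the second AR display.
        print(f"{second_display}: overlay notification {characteristics}")

device = ARRenderingDevice()
hat = VirtualObject(kind="hat", anchor="head")
device.render_overlay("HMD 200a", hat, "second user 110b")
policy = device.determine_policy({"allow_sharing": True})
device.send_notification("HMD 200b", policy.select_characteristics(hat))
```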
[0005] The term overlay can mean that the virtual object is viewed by the first user with a defined spatial relationship to the second user seen in the real-world. The overlaid virtual object may be displayed overlapping a part of the second user, e.g., head, or may be displayed with a defined offset relative to the second user.
[0006] Some other related embodiments are directed to an AR rendering device for rendering virtual objects displayed by a first AR display as overlays viewed by a first user. The AR rendering device includes at least one processor, and at least one memory storing instructions executable by the at least one processor to perform operations. The operations include to render a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display. The operations obtain information associated with the second user, where the second user is transporting a second AR display, and determine an overlay sharing policy based on the information associated with the second user. The operations determine based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user. The operations send for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
[0007] Some other related embodiments are directed to a computer program product that includes a non-transitory computer readable medium storing instructions executable by at least one processor of an AR rendering device to perform operations for rendering virtual objects displayed by a first AR display as overlays viewed by a first user. The operations include to render a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display. The operations obtain information associated with the second user, where the second user is transporting a second AR display, and determine an overlay sharing policy based on the information associated with the second user. The operations determine based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user. The operations send for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
[0008] Potential advantages that may be provided by these and further embodiments disclosed herein include enabling a user to be informed how the user is being viewed by another user in an AR environment. For example, the user may find it objectionable that any virtual object is viewed as an overlay on the user or may find it objectionable to have perceived unpleasant or embarrassing virtual objects overlaid on the user. These embodiments can provide notice to the user that a virtual object is being overlaid on the user through the AR display operated by another user. The embodiments may also enable the user to control what types of virtual objects can be displayed through AR displays as an overlay on the user, and to modify how the virtual objects appear.
[0009] Other methods and related methods, AR rendering devices, and computer program products will be or become apparent to one with skill in the art upon review of the following drawings and detailed description. It is intended that all such additional methods, AR rendering devices, and computer program products be included within this description and protected by the accompanying claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Aspects of the present disclosure are illustrated by way of example and are not limited by the accompanying drawings. In the drawings:
[0011] Figure 1 is a schematic diagram illustrating an AR communication system including components operative according to some embodiments;
[0012] Figure 2 schematically illustrates an AR display configured as a head-mountable display (HMD) to be worn as glasses according to some embodiments;
[0013] Figures 3, 4, and 5 illustrate flowcharts of operations that can be performed by the AR rendering device for restricting rendering of an audio and/or visual augmentation of a user in an AR environment according to embodiments;
[0014] Figures 6 and 7 each illustrate a system that provides an AR environment for three users wearing HMDs and related operations by an AR rendering device to provide notifications to the users for how they are being viewed in the AR environment, in accordance with some embodiments;
[0015] Figure 8 illustrates a flowchart of operations by a first AR display, AR rendering device, and a second AR display according to some embodiments; and
[0016] Figures 9, 10, and 11 are flowcharts of operations that can be performed by the AR rendering device for providing notifications to users about how they are being viewed in the AR environment, in accordance with some embodiments.
DETAILED DESCRIPTION
[0017] Inventive concepts will now be described more fully hereinafter with reference to the accompanying drawings, in which examples of embodiments of inventive concepts are shown. Inventive concepts may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of various present inventive concepts to those skilled in the art. It should also be noted that these embodiments are not mutually exclusive. Components from one embodiment may be tacitly assumed to be present/used in another embodiment.
[0018] In an AR communication session between two or more users, users can select from a myriad of different types of virtual objects that can be displayed on AR displays, such as head mountable displays (HMDs), registered in 3D space to be viewed as virtual objects overlaid on other users. For example, a first user may wear an HMD while viewing a second user who is also wearing an HMD. The first user's HMD may operate to display a computer-generated virtual costume that is viewed by the first user as an overlay positioned on the second user. In this scenario, there has heretofore not been a way for the second user (the viewed user) to be aware that a virtual object (e.g., virtual costume) is being viewed as an overlay on the second user (the viewed user) and, much less, made aware how the second user (the viewed user) then looks to the first user (the viewing user). The second user (the viewed user) may find it objectionable that any virtual object is viewed as an overlay on the second user (the viewed user), or may more particularly find it objectionable to have perceived unpleasant or embarrassing virtual objects overlaid on the second user (the viewed user).
[0019] Various embodiments of the present disclosure are directed, for example in this scenario, to providing notice to the second user (the viewed user) that a virtual object is being overlaid on the second user (the viewed user) through the AR display operated by the first user (the viewing user). The embodiments can also enable the second user (the viewed user) to control what types of virtual objects can be displayed through AR displays as an overlay on the second user (the viewed user), and to modify how the virtual objects appear.
[0020] Operations according to some embodiments are initially described with reference to Figures 1 and 2, which are directed to controlling what types of virtual objects are allowed to be displayed. Then, operations according to some other embodiments are described with reference to Figures 3 and 8, which are directed to informing users how they are being viewed through AR displays by other users.
[0021] Although some embodiments are described as using HMDs as a type of AR display, other types of AR displays can be used, such as smartphones, tablet computers, laptop computers, etc.
[0022] Figure 1 is a schematic diagram illustrating an AR communication system 100 including components operative according to some embodiments. The AR communication system 100 includes two AR display devices ("AR displays" for brevity) 200a, 200b and an AR rendering device 800. Although the AR rendering device 800 is shown as being separate from the AR displays 200a, 200b, such as within a computing server networked to the AR displays 200a, 200b, the AR rendering device 800 may be at least partially integrated into one or both of the AR displays 200a, 200b.
[0023] Each AR display 200a, 200b is illustrated as being worn by a respective user 110a, 110b. The AR displays 200a, 200b may alternatively be held by the users 110a, 110b, or transported in other ways and configured for viewing by the users 110a, 110b. Further, although referred to as "AR" displays 200a, 200b, they may additionally or alternatively implement virtual reality (VR) functionality or mixed reality (MR) functionality. AR, VR, and MR can collectively be referred to as extended reality (XR). As used herein, the term "AR" is used broadly to also refer to VR and MR. Without loss of generality, the term AR display is used throughout this disclosure to refer to any such communication device.
[0024] As schematically illustrated at 130a, 130b, the AR rendering device 800 is configured to communicate with the AR displays 200a, 200b. The AR rendering device 800 may be configured to handle communication sessions between the AR displays 200a, 200b, and may provide communication session media control, or be configured to operate in conjunction with a meeting server, etc. The AR rendering device can select, generate, and/or configure a virtual object which is provided to one or both of the AR displays 200a, 200b for display at a defined location(s) where it will be viewed as an overlay on the other user. The AR communication system 100 can include a plurality of AR displays, each having its own user interface, and each being configured for communication with the other AR displays.
[0025] For purposes of notation and without imposing any hierarchical relationship among the AR displays 200a, 200b, AR display 200a is hereinafter denoted a first AR display 200a, whereas AR display 200b is hereinafter denoted a second AR display 200b. The first AR display 200a comprises a first user interface for displaying a computer-generated (e.g., by the AR rendering device 800) virtual representation 120b of the second user 110b, e.g., when viewed in VR or AR, or at least a virtual object which is overlaid on the second user when viewed in AR through a see-through display. The second AR display 200b comprises a second user interface for displaying a computer-generated (e.g., by the AR rendering device 800) virtual representation 120a of the first user 110a, e.g., when viewed in VR or AR, or at least a virtual object which is overlaid on the first user when viewed in AR through a see-through display. In some other embodiments, the virtual representation may be provided as an audio playout representation through respective speakers associated with the AR displays 200a, 200b or as a combined visual and audio representation. The audio playout representation may be generated by adding an audio overlay or other modification to the other user's voice, adding additional sounds (e.g., background sounds, voices, etc.) to the other user's voice, etc.
[0026] Figure 2 schematically illustrates an example configuration of the AR displays 200a, 200b as a head-mountable display (HMD) configured to be worn as glasses according to some embodiments. The AR displays 200a, 200b include a display device 240a, 240b which may be see-through so that virtual objects are displayed on the user's view of the real world. The AR displays 200a, 200b can include speakers which can play out audio which may be overlaid on, or a modification of, a microphone signal which captures the other user's voice. The AR displays 200a, 200b may therefore render an audio and/or visual representation of a user.
[0027] In some examples, the audio and/or visual representation 120b of the second user 110b is rendered through the first display device 240a as an avatar, such as a three-dimensional avatar, or as a virtual object that is overlaid on the real-world view of the second user 110b through a see-through display. The AR displays 200a, 200b can include a camera 280a, 280b operative to capture video that can be processed by the AR rendering device 800 to determine the location of viewed user(s) relative to the AR displays 200a, 200b, and therefore determine where the visual representations 120b, 120a are to be displayed by the AR displays 200a, 200b.
[0028] Each AR display 200a, 200b further includes a communication interface 270a, 270b for communicating with the other AR display 200a, 200b and with the AR rendering device 800. The AR rendering device 800 may relay communications between the AR displays 200a, 200b and the respective users. Although illustrated as a raised antenna to represent a wireless (e.g., RF) communication interface, the communication interface 270a, 270b may alternatively be a wired communication interface, an infrared communication interface, or some other kind of communication interface. In this respect, the AR displays 200a, 200b may be configured for direct communication with each other or for communicating with each other via at least one other device, such as a mobile phone, personal computer, gaming machine, or the like.
[0029] Audio and/or video recorded at one AR display 200a, 200b can be transferred, via the communication interface 270a, 270b, to be played out at another AR display 200a, 200b. Each AR display 200a, 200b further includes processing circuitry 260a, 260b for locally controlling operation of the AR display 200a, 200b. As explained above, although illustrated as a separate device in Figure 1, operational functionality described herein for an AR rendering device 800 may be implemented partly or fully by the processing circuitry 260a, 260b in at least one of the AR displays 200a, 200b.
[0030] As noted above, there is a need for improved rendering control in AR communication sessions.
[0031] According to at least some of the herein disclosed embodiments, this issue is addressed by the provision and application of rendering attributes pertaining to restrictions in terms of how the representation of the first user (e.g., virtual objects rendered on the first user) is to be rendered at the user interface of the second user. The rendering attributes may be regarded as representing how-to-be-rendered preferences. As will be disclosed in further detail below, these rendering attributes are therefore provided from the first AR display 200a to the second AR display 200b, either directly or via the AR rendering device 800. The rendering attributes are thus specified by the first AR display 200a and applied at the second AR display 200b when a virtual representation of the first user is rendered. Computer operations for enforcing that the restrictions imposed by the rendering attributes are followed will also be presented.
[0032] At least some of the following issues may be addressed by these embodiments. Traditionally, the first user lacks an ability to provide the second user with preferences for how and/or where the virtual representation of the first user is to be rendered in the local AR environment of the second user, e.g., what types of virtual objects are allowed to be viewed by the second user as an AR overlay on the first user and where such virtual object(s) are allowed to be viewed as an overlay on the first user. Similarly, the second user lacks an ability to provide the first user with preferences for how and/or where the virtual representation of the second user is to be rendered in the local AR environment of the first user, e.g., what types of virtual objects are allowed to be viewed by the first user as an AR overlay on the second user and where such virtual object(s) are allowed to be viewed as an overlay on the second user. Traditionally, the first and second users also lack the ability to determine (e.g., through object notifications) how and/or where the virtual representation of that user is rendered in the local environment of the other user.
[0033] The first AR display 200a is configured to be worn by the first user 110a. The first AR display 200a includes a camera 280a operative to capture video of the second user and an AR display 240a operative to display virtual objects overlaid on the second user, and may further include a microphone operative to capture digitized voice of the second user. The second AR display 200b is configured to be worn by the second user 110b. The second AR display 200b includes a camera 280b operative to capture video of the first user and an AR display 240b operative to display virtual objects overlaid on the first user, and may further include a microphone operative to capture digitized voice of the first user.
[0034] Reference is now made to Figure 3 which illustrates a flowchart of operations which can be performed by the AR rendering device 800 for restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the first AR display 200a according to an embodiment. The first AR display 200a is in communication with the second AR display 200b.
[0035] S102: The first AR display 200a provides, towards the second AR display 200b, rendering attributes pertaining to restrictions in terms of how the audio and/or visual representation 120a of the first user 110a is to be rendered at the second user interface 240b, 250b of the second AR display 200b.
[0036] S106: The first AR display 200a provides, towards the second AR display 200b, the audio and/or visual data of the first user 110a as captured by the data capturing unit 280a.
[0037] Embodiments relating to further details of restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the first AR display 200a will now be disclosed.
[0038] In some examples, the rendering attributes are defined as a function of a physical environment of the first user 110a in which the audio and/or visual data is captured. In further examples, the rendering attributes are defined as a function of a physical environment of the second user 110b in which the audio and/or visual representation is rendered. In yet further examples, the rendering attributes are defined as a function of an identifier of the second user 110b.
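As one non-limiting way to picture such rendering attributes and their dependence on the capture environment, rendering environment, and viewer identity, consider the following sketch; the field names and the example rule are assumptions made for illustration only:

```python
# Hypothetical shape for rendering attributes per [0031] and [0038].
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class RenderingAttributes:
    level: str = "mandatory"  # restriction level: "mandatory", "preferred", etc.
    allowed_overlay_types: Set[str] = field(default_factory=lambda: {"hat", "glasses"})
    forbidden_anchor_regions: Set[str] = field(default_factory=lambda: {"face"})
    capture_environment: Optional[str] = None  # first user's physical environment
    render_environment: Optional[str] = None   # second user's physical environment
    viewer_id: Optional[str] = None            # identifier of the second user

def attributes_for(viewer_id: str, capture_env: str) -> RenderingAttributes:
    # Example rule (invented): forbid all overlays when the data is captured
    # at the office and the viewer is not a known colleague.
    if capture_env == "office" and not viewer_id.startswith("colleague:"):
        return RenderingAttributes(allowed_overlay_types=set(),
                                   capture_environment=capture_env,
                                   viewer_id=viewer_id)
    return RenderingAttributes(capture_environment=capture_env, viewer_id=viewer_id)

print(attributes_for("colleague:bob", "office").allowed_overlay_types)
```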
[0039] In some embodiments, a handshake procedure is performed between the first AR display 200a and the second AR display 200b to agree on approved poses, states (e.g., types of virtual objects which are approved for overlay on the user, characteristics of virtual object(s) approved for overlay on the user, etc.), etc., before the actual rendering of the audio and/or visual representation 120a of the first user 110a is displayed for viewing by the second user. In particular, in some embodiments, the first AR display 200a is configured to perform (optional) step S104.
[0040] S104: The first AR display 200a performs, with the second AR display 200b, a handshake procedure pertaining to properties and/or rendering capabilities of the second AR display 200b to follow the restrictions before providing the audio and/or visual data towards the second AR display 200b.
[0041] There can be different types of information exchanged between the first AR display 200a and the second AR display 200b during the handshake procedure. In some embodiments, the second AR display 200b sends its capabilities indicating how it can render the audio and/or visual representation 120a of the first user 110a. This enables the first AR display 200a to adapt the rendering attributes accordingly. Therefore, in some embodiments, the handshake procedure involves the first AR display 200a providing, towards the second AR display 200b, rendering attributes adapted to fit the properties and/or rendering capabilities; a non-limiting sketch of this adaptation is given below.
[0042] In some embodiments, the first AR display 200a obtains feedback from the second AR display 200b, either directly or via the AR rendering device 800, indicating how the audio and/or visual representation 120a of a first user 110a is rendered at the second communication device 200b. Therefore, in some embodiments, the first AR display 200a is configured to perform (optional) step S108.
[0043] S108: The first AR display 200a obtains feedback pertaining to how the audio and/or visual representation 120a of the first user 110a is rendered at the second user interface 240b, 250b of the second AR display 200b. For example, the first AR display 200a can obtain a notification indicating a characteristic of a virtual object which is overlaid through the second AR display 200b on the first user.
[0044] There could be different examples of such feedback. In some examples, the feedback information is a confirmation that the rendering attributes are followed. In other examples, more detailed information is provided regarding how the audio and/or visual representation 120a of the first user 110a is rendered.
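A non-limiting toy sketch of the capability exchange and attribute adaptation of step S104 and paragraph [0041] might look as follows; the capability fields and attribute keys are invented for illustration and are not defined by the disclosure:

```python
# Toy sketch of the handshake of S104/[0041]; message fields are invented.
def adapt_attributes(rendering_attributes: dict, capabilities: dict) -> dict:
    # The first AR display adapts its rendering attributes to fit the
    # properties and rendering capabilities advertised by the second display.
    adapted = dict(rendering_attributes)
    if not capabilities.get("supports_audio_overlay", True):
        adapted.pop("audio_restrictions", None)
    if capabilities.get("max_overlays", 99) < 1:
        adapted["allowed_overlay_types"] = []
    return adapted

# Second display advertises capabilities; first display replies with adapted
# attributes; only then is the audio and/or visual data provided (S106).
capabilities = {"supports_audio_overlay": False, "max_overlays": 1}
attributes = {"allowed_overlay_types": ["hat"], "audio_restrictions": ["no_pitch_shift"]}
print(adapt_attributes(attributes, capabilities))
# -> {'allowed_overlay_types': ['hat']}
```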
[0045] Reference is now made to Figure 4 which illustrates a flowchart of operations which can be performed by the AR rendering device 800 for restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the second AR display 200b according to an embodiment. The second AR display 200b is in communication with the first AR display 200a.
[0046] S202: The second AR display 200b obtains rendering attributes pertaining to restrictions in terms of how the audio and/or visual representation 120a of the first user 110a is to be rendered at the second user interface 240b, 250b of the second AR display 200b.
[0047] S206: The second AR display 200b obtains the audio and/or visual data of the first user 110a as captured by the data capturing unit 280a.
[0048] S208: The second AR display 200b renders the audio and/or visual representation 120a of the first user 110a at the second user interface 240b, 250b in accordance with the rendering attributes.
[0049] Embodiments relating to further details of restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the second AR display 200b will now be disclosed.
[0050] As disclosed above, in some embodiments, a handshake procedure is performed between the first AR display 200a and the second AR display 200b to agree on approved poses, states (e.g., types of virtual objects which are approved for overlay on the user, characteristics of virtual object(s) approved for overlay on the user, etc.), etc., before the actual rendering of the audio and/or visual representation 120a of the first user 110a is set up. In particular, in some embodiments, the second AR display 200b is configured to perform (optional) step S204.
[0051] S204: The second AR display 200b performs, with the first AR display 200a, a handshake procedure pertaining to properties and/or rendering capabilities of the second AR display 200b to follow the restrictions before the audio and/or visual data is provided towards the second AR display 200b.
[0052] As further disclosed above, in some embodiments, the handshake procedure involves the second AR display 200b obtaining rendering attributes as adapted by the first AR display 200a to fit the properties and/or rendering capabilities.
[0053] In some embodiments, the first AR display 200a obtains feedback from the second AR display 200b indicating how the audio and/or visual representation 120a of a first user 110a is rendered at the second communication device 200b. Therefore, in some embodiments, the second AR display 200b is configured to perform (optional) step S210.
[0054] S210: The second AR display 200b provides, towards the first AR display 200a, feedback pertaining to how the audio and/or visual representation 120a of the first user 110a is rendered at the second user interface 240b, 250b of the second AR display 200b.
[0055] If any of the audio and/or visual data of the first user 110a as obtained in step S206 lacks an association with the rendering attributes obtained in step S202, or if further audio and/or visual data of the first user 110a is obtained without any defined rendering attributes, the audio and/or visual data of the first user 110a might be rendered according to default settings. Such default settings might be hardcoded in the second AR display 200b, obtained from the AR rendering device 800, or have been obtained from the first AR display 200a during an earlier AR communication session. A minimal sketch of this fallback follows.
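The precedence order shown in this sketch (session attributes, then device-provided, then cached, then hardcoded defaults) is an assumption for illustration; the disclosure lists the sources without ordering them:

```python
# Fallback per [0055]: data without associated rendering attributes is
# rendered under defaults. Precedence order here is assumed for illustration.
HARDCODED_DEFAULTS = {"allowed_overlay_types": [], "level": "mandatory"}

def resolve_attributes(session_attrs=None, device_attrs=None, cached_attrs=None):
    # session_attrs: obtained in step S202 for this session
    # device_attrs:  obtained from the AR rendering device 800
    # cached_attrs:  remembered from an earlier AR communication session
    for candidate in (session_attrs, device_attrs, cached_attrs):
        if candidate is not None:
            return candidate
    return HARDCODED_DEFAULTS

print(resolve_attributes())  # no attributes anywhere -> hardcoded defaults
```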
[0056] Reference is now made to Figure 5 which illustrates a flowchart of operations which can be performed by the AR rendering device 800 for restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the AR rendering device 800 according to an embodiment. The AR rendering device 800 is in communication with the first AR display 200a and the second AR display 200b.
[0057] As disclosed above, the rendering attributes may be sent from the first AR display 200a either directly to the second AR display 200b or to the second AR display 200b via the AR rendering device 800.
[0058] S302: The AR rendering device 800 obtains, from the first AR display 200a, rendering attributes pertaining to restrictions in terms of how the audio and/or visual representation 120a of the first user 110a is to be rendered at the second user interface 240b, 250b of the second AR display 200b, and the audio and/or visual data of the first user 110a as captured by the data capturing unit 280a.
[0059] S306: The AR rendering device 800 provides, to the second AR display 200b, the rendering attributes and the audio and/or visual data of the first user 110a.
[0060] Embodiments relating to further details of restricting rendering of an audio and/or visual representation 120a of a first user 110a at a second AR display 200b as performed by the AR rendering device 800 will now be disclosed.
[0061] As disclosed above, the AR rendering device 800, or at least its functionality, might be implemented by one of the AR displays 200a, 200b. If the functionality of the AR rendering device 800 is implemented by the second AR display 200b, then the AR rendering device 800 could be responsible for executing the rendering of the audio and/or visual representation 120a of the first user 110a (and also enforcing the rendering according to the rendering attributes, as will be disclosed below).
[0062] In some embodiments, the AR rendering device 800 verifies that a trust score of the second AR display 200b is higher than a threshold value before the AR rendering device 800 forwards the audio and/or visual data to the second AR display 200b. Hence, in some embodiments, the second AR display 200b is associated with a trust score for rendering audio and/or visual representations, and the AR rendering device 800 is configured to perform (optional) step S304.
[0063] S304: The AR rendering device 800 verifies that the trust score is higher than a threshold value before providing, to the second AR display 200b, the rendering attributes and the audio and/or visual data of the first user 110a.
[0064] There could be different ways in which the threshold value is set. In some examples, the threshold value is received from the first AR display 200a, e.g., as part of the rendering attributes. In other examples, the threshold value is set by the AR rendering device 800 itself. In further detail, compliance of the second AR display 200b may be associated with a trust-level function in that repeatedly good compliance may provide the second AR display 200b with a higher trust score (e.g., a step size Δ > 0 per compliance event), whereas poor compliance behaviour may be associated with a lower score (e.g., a step size Δ < 0 per non-compliance event). In a rendering scenario with a user A experiencing (e.g., using picture-in-picture, screen sharing, etc.) another user B rendering an audio and/or visual representation of yet another user C outside the defined rendering attributes, user A could downgrade the trust score of user B and share this information with user C, who could then decide to also downgrade the trust score of user B. Multiple users downgrading another user during a shared AR communication session could be interpreted as a strong sign of non-compliance and could cause the AR rendering device 800 to take an action, as will be disclosed in further detail below.
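A minimal sketch of such a trust-level function, assuming concrete step sizes, bounds, and an initial score that the disclosure leaves open, could read:

# Illustrative trust-score bookkeeping per AR display; the step sizes,
# bounds, and initial value are assumptions made for this example.

class TrustScore:
    def __init__(self, initial=0.5, step_up=0.05, step_down=-0.2):
        self.value = initial
        self.step_up = step_up      # Δ > 0 per compliance event
        self.step_down = step_down  # Δ < 0 per non-compliance event

    def record(self, compliant):
        delta = self.step_up if compliant else self.step_down
        self.value = min(1.0, max(0.0, self.value + delta))

def may_forward(score, threshold):
    """Step S304: forward data only if the trust score exceeds the threshold."""
    return score.value > threshold

Multiple downgrade reports for the same display could, for example, be accumulated by calling record(compliant=False) once per reporting user before re-evaluating may_forward.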
[0065] As disclosed above, in some embodiments, the second AR display 200b provides feedback about how the audio and/or visual representation 120a of the first user 110a is rendered at the second AR display 200b. This feedback may be obtained by the AR rendering device 800. Hence, in some embodiments, the AR rendering device 800 is configured to perform (optional) step S308.
[0066] S308: The AR rendering device 800 obtains, from the second AR display 200b, feedback pertaining to how the audio and/or visual representation 120a of the first user 110a is rendered at the second user interface 240b, 250b of the second AR display 200b.

[0067] Based on the feedback and/or on other information, the AR rendering device 800 can thus detect whether or not the audio and/or visual representation 120a of the first user 110a has been rendered at the second user interface 240b, 250b in accordance with the rendering attributes. Non-compliance might also be detected by the first user, e.g., via a self-monitor feedback channel from the second AR display 200b. Hence, in some embodiments, the AR rendering device 800 is configured to perform (optional) step S310.
[0068] S310: The AR rendering device 800 obtains an indication that the audio and/or visual representation 120a of the first user 110a has, at the second user interface 240b, 250b, not been rendered in accordance with the rendering attributes.
[0069] An action might then be taken in case such a policy breach has been detected. Therefore, in some embodiments, the AR rendering device 800 is configured to perform (optional) step S312.
[0070] S312: The AR rendering device 800, in response thereto (i.e., in response to having obtained the indication in step S310), performs an action.
[0071] There could be different types of actions taken by the AR rendering device 800 in step S312. In some examples, performing the action comprises any, or any combination, of: altering any further audio and/or visual data of the first user 110a that is obtained from the first AR display 200a before it is provided to the second AR display 200b; terminating provisioning of the audio and/or visual data of the first user 110a to the second AR display 200b; disabling rendering of the audio and/or visual representation 120a of the first user 110a at the second user interface 240b, 250b; issuing a non-compliance notification to the first AR display 200a; or decrementing a trust score for the second AR display 200b and/or the second user 110b.

[0072] In some examples, the rendering attributes pertain to restricting the rendering of the audio and/or visual representation 120a of the first user 110a in terms of any, or any combination, of: visual appearance of the audio and/or visual representation; virtual objects surrounding the audio and/or visual representation; the virtual environment in which the audio and/or visual representation is rendered; visual effects and/or animations associated with the audio and/or visual representation; or audio effects associated with the audio and/or visual representation. In some examples, the virtual objects represent any, or any combination, of: at least one third user, at least one animal, pieces of furniture, etc.
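The following sketch illustrates how the AR rendering device 800 might dispatch these actions in step S312; the Session class and its fields are hypothetical stand-ins for the real state and signalling:

# Hypothetical dispatch of the S312 policy-breach actions listed above.

class Session:
    def __init__(self):
        self.alter_further_data = False
        self.forwarding_enabled = True
        self.remote_rendering_enabled = True
        self.notifications = []
        self.trust_score = 0.5

    def apply_breach_actions(self, actions):
        for action in actions:
            if action == "alter_data":
                self.alter_further_data = True   # e.g., blur data before forwarding
            elif action == "terminate_provisioning":
                self.forwarding_enabled = False
            elif action == "disable_rendering":
                self.remote_rendering_enabled = False
            elif action == "notify_first_display":
                self.notifications.append("non-compliance notification to 200a")
            elif action == "decrement_trust":
                self.trust_score = max(0.0, self.trust_score - 0.2)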
[0073] In some examples, according to the rendering attributes, the restrictions are any of: mandatory, or voluntary (such as preferred rendering attributes), or any level there between.
[0074] In some examples, the rendering attributes could pertain to the preferred location of the audio and/or visual representation of the first user in the rendered environment (e.g., position and/or altitude), or the preferred location relative to the audio and/or visual representation of at least one other user (e.g., in exemplifying order of priority: 1) to the right of, 2) in front of, 3) in front of and to the right of, etc.).
[0075] In some examples, the rendering attributes could pertain to the preferred exclusion of the audio and/or visual representation of the first user from being rendered adjacent to, nearby, or associated with, e.g., "hot word" exclamations, cultural symbols, specific individuals, animals, itemized objects (e.g., chairs, loos, bathtubs, certain furniture, etc.), certain body postures (lying down (belly down/up), kneeling, bowing, etc.), spatial orientation ("upside down"), etc.
[0076] In some examples, the rendering attributes could pertain to contextual information, such as combinations of the above in respect of, e.g., AR communication sessions with friends, family, corporate, or business contacts, where the contextual information might be derived, e.g., from calendar entries, schedules, social network entries, contact entries, device, photo streams, meeting e-mail addresses, voice recognition, face recognition, etc.

[0077] In some examples, the rendering attributes could pertain to combinations of preferred spatial locations with respect to the rendered environment, preferred spatial location relative to at least one other participant, preferred exclusion of rendering relative to at least one other participant, contextual information, etc.
[0078] With audio and/or visual representations of a multitude of users to be rendered at one AR display 200a, 200b, multiple rendering attributes for multiple users may have to be evaluated and complied with in parallel, and prioritizations may apply. It could also be that multiple rendering attributes applied in one AR communication session require the renderings of some participants to adhere to limitations put forth by the rendering attributes of other participants.
[0079] In yet further aspects, the rendering considers both the context of the first user (i.e., the user whose audio and/or visual representation is to be rendered) and the second user (i.e., the user at which the audio and/or visual representation of the first user is rendered). Both the first user and the second user might therefore have pre-defined policies, in terms of rendering attributes, that may be used for rendering adaptations. Assume that contextual data (e.g., inertial measurement unit (IMU) data, context data, etc.) of the second user is provided to the first user. The contextual data can be used to select appropriate rendering attributes that are forwarded to the second user. Examples of such contextual data as provided by the second user, and of the resulting rendering attributes as provided by the first user, are given in Table 1 below.
Table 1: Examples of provided contextual data from second user and resulting rendering attributes from first user.
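A context-to-attributes selection in the spirit of Table 1 might be sketched as follows; the context keys and the resulting attributes are invented for illustration and are not taken from the table:

# Hypothetical mapping from the second user's contextual data (e.g., derived
# from IMU data) to rendering attributes selected on the first user's side.

CONTEXT_TO_ATTRIBUTES = {
    "walking": {"placement": "in_front_right", "animations": "disabled"},
    "seated": {"placement": "across_table", "animations": "allowed"},
    "business_meeting": {"exclude_near": ["hot_words", "informal_objects"]},
}

def select_rendering_attributes(contextual_data):
    attributes = {}
    for key in ("activity", "session_type"):
        value = contextual_data.get(key)
        if value in CONTEXT_TO_ATTRIBUTES:
            attributes.update(CONTEXT_TO_ATTRIBUTES[value])
    return attributes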
[0080] Further embodiments are now explained which are directed to enabling a user to be notified if the user is being viewed in an AR environment with a virtual object overlaid on the user. These embodiments may be used in combination with the other embodiments disclosed herein to enable the user to influence and control how users are viewed in AR environments. As explained above in some examples, a first user may wear an HMD while viewing a second user. The HMD may be operated to display a computer-generated virtual costume that is viewed by the first user as an overlay positioned on the second user. In this scenario, there has heretofore not been a way for the second user (the viewed user) to be aware that a virtual object is being viewed as an overlay on the second user and, much less, to be made aware of how the second user then looks to the first user (the viewing user). The second user may find it objectionable that any virtual object is viewed as an overlay on the second user, or may more particularly find it objectionable to have perceived unpleasant or embarrassing virtual objects overlaid on the second user.

[0081] Various embodiments of the present disclosure are directed, e.g. in this scenario, to providing notice to the second user (the viewed user) that a virtual object is being overlaid on the second user through the AR display operated by the first user (the viewing user). The embodiments can also enable the second user to control what types of virtual objects can be displayed through AR displays as an overlay on the second user, and to modify how the virtual objects appear.

[0082] Figure 6 illustrates a system that provides an AR environment for a first user wearing HMD 200a, a second user wearing HMD 200b, and a third user wearing HMD 200c. The example HMDs 200a-c have see-through display screens 240a-c through which the respective users can view the other users in the AR environment with various computer-generated virtual objects that are displayed at locations which are determined (controlled) to overlay the virtual objects on the viewed users. In accordance with some operational embodiments, the users are informed how they are being viewed in the AR environment by other users.
[0083] Referring to Figure 6, in a first scenario, the first user at a first time Ta gazes toward the second user, who is illustrated as object 604 viewed by the first user through the display 240a of HMD 200a. The viewed second user is augmented in the AR environment by the HMD 200a (worn by the first user) displaying a virtual object (e.g., hat) 606 positioned on the second user's head.
[0084] The HMDs 200a-c may operate to provide a picture-in-picture display region 602a-c, respectively, in the displays 240a-c where a notification to the user can be displayed to indicate a characteristic of a virtual object which is being viewed by another user as a virtual overlay on the user through another one of the HMDs. In the illustrated example, the HMD 200a includes a camera that captures video of the second user, which is used to register the location (orientation) of the HMD 200a relative to the second user, and then used to determine where a virtual object 606 (e.g., the illustrated example hat) is to be displayed on display 240a of HMD 200a so that it is viewed by the first user as being overlaid on the second user's head.

[0085] The second user wearing HMD 200b is correspondingly notified of the overlaid virtual object through an indication 608 displayed in the area 602b of HMD 200b worn by the second user. The notification can indicate a characteristic of the virtual object 606 (e.g., hat) which is being overlaid on the second user's head through the display 240a of HMD 200a worn by the first user. In the illustrated example, operations can receive an image 604 (e.g., a single image or a frame of a video stream) of the second user from the camera of HMD 200a, generate a composite image 608 showing the virtual object 606 (e.g., hat) overlaid on the image of the second user, and send to the HMD 200b worn by the second user the composite image for display in the area 602b of display 240b. In this manner, the second user can view how the second user is being virtually augmented in the view of the first user by the overlaid virtual object 606 (e.g., hat).
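By way of non-limiting illustration, the composite image 608 could be produced as in the following sketch (using the Pillow imaging library; the head-region coordinates and file handling are assumptions made for the example):

# Hypothetical sketch: paste the rendered virtual object (an RGBA image)
# over the detected head region of a captured frame of the viewed user.

from PIL import Image

def build_composite(frame_path, overlay_path, head_box):
    """head_box is an assumed (x0, y0, x1, y1) region within the frame."""
    frame = Image.open(frame_path).convert("RGBA")
    overlay = Image.open(overlay_path).convert("RGBA")
    x0, y0, x1, y1 = head_box
    overlay = overlay.resize((x1 - x0, y1 - y0))
    frame.alpha_composite(overlay, dest=(x0, y0))
    return frame  # then sent to HMD 200b for display in area 602b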
[0086] As will be explained below, what characteristics of the virtual object 606 are shared with the second user and when sharing is initiated or ceased can be controlled by an overlay sharing policy in accordance with various embodiments.
[0087] Referring to Figure 7, the first user at a later time Tb rotates to gaze toward the third user, who is illustrated as object 704 viewed by the first user through the display 240a of HMD 200a. The viewed third user is augmented in the AR environment by the HMD 200a (worn by the first user) displaying a virtual object 706 (e.g., antlers) positioned on the third user's head.
[0088] The third user wearing HMD 200c is correspondingly notified of the overlaid virtual object 706 (e.g., antlers) through an indication 708 displayed in the area 602c of HMD 200c worn by the third user. The notification can indicate a characteristic of the virtual object 706 (e.g., antlers) which is being overlaid on the third user's head through the display 240a of HMD 200a worn by the first user. In the illustrated example, operations can receive an image 704 (e.g., a single image or a frame of a video stream) of the third user from the camera of HMD 200a, generate a composite image 708 showing the virtual object 706 (e.g., antlers) overlaid on the image of the third user, and send to the HMD 200c worn by the third user the composite image for display in the area 602c of display 240c. In this manner, the third user can view how the third user is being virtually augmented in the view of the first user by the overlaid virtual object 706 (e.g., antlers).
[0089] Again, what characteristics of the virtual object 706 are shared with the third user and when sharing is initiated or ceased can be controlled by an overlay sharing policy in accordance with various optional embodiments. Different overlay sharing policies can be applied to notifications for the second user versus the third user and versus the first user.
[0090] In Figures 6 and 7 the AR displays have been illustrated, in a non-limiting example manner, as HMDs, but may alternatively be any type of display, such as smartphones, tablet computers, laptop computers, etc. Accordingly, the term AR display is used in the discussions of further embodiments below.
[0091] Figure 8 illustrates a flowchart of operations by a first AR display 200a, AR rendering device 800, and a second AR display 200b according to some embodiments. The AR rendering device 800 may be separate from the first and second AR displays (e.g., a computing server networked to the AR displays) or may be integrated into one or both of the first and second AR displays 200a, 200b.
[0092] Referring to Figure 8, a camera of the first AR display captures 810 video of the second user. The first AR display provides 812 the video or other information characterizing the second user, e.g., the location of the head, to the AR rendering device 800. The AR rendering device 800 renders 814 a virtual object for display as an overlay on the second user. The first AR display 200a displays 816 the virtual object at a location on its display so that the first user views the virtual object as an overlay on (e.g., overlapping or adjacent to) the first user's view of the second user.
[0093] The AR rendering device 800 determines 818 an overlay sharing policy which controls whether the second user is notified that the virtual object is being displayed as an overlay on the second user and/or controls what characteristic of the virtual object is indicated through the notification. Example overlay sharing policies and characteristics of the virtual objects that can be indicated through such notifications are described below, e.g., with reference to Figure 9.

[0094] The second user may define (e.g., through a user interface of the first AR display 200a) or otherwise provide to the AR rendering device 800 the overlay sharing policy that is to be applied to notifications to the second user regarding virtual overlays being rendered on the second user by other users. For example, the AR rendering device 800 may receive from the second AR display an AR overlay-rendering compliance policy responsive to the notification sent to the second AR display, and then, responsive to a determination that the virtual object rendered by the first AR display does not satisfy the AR overlay-rendering compliance policy, perform one of: modifying the rendering of the virtual object, modifying where the virtual object is rendered through the first AR display as an overlay, or communicating with the first AR display to terminate display of the virtual object as an overlay on the second user.
[0095] The AR rendering device 800 may operate to determine from the AR overlay-rendering compliance policy at least one of: the type of virtual objects allowed or not allowed to be displayed as an overlay on the second user; the texture of virtual objects allowed or not allowed to be displayed as an overlay on the second user; the colour of virtual objects allowed or not allowed to be displayed as an overlay on the second user; the size of virtual objects allowed or not allowed to be displayed as an overlay on the second user; or the location where virtual objects are allowed or not allowed to be displayed as an overlay on the second user. The AR rendering device 800 can then control rendering of the virtual object being rendered through the first AR display 200a on the view of the second user responsive to the determination.
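Such a compliance check might, purely as an illustration, compare the rendered overlay against allow-lists read from the policy; the field names below are assumptions:

# Hypothetical check of an overlay against an AR overlay-rendering
# compliance policy; None means the policy does not constrain that field.

def complies(overlay, policy):
    checks = (
        ("type", policy.get("allowed_types")),
        ("texture", policy.get("allowed_textures")),
        ("colour", policy.get("allowed_colours")),
        ("size", policy.get("allowed_sizes")),
        ("location", policy.get("allowed_locations")),
    )
    for field, allowed in checks:
        if allowed is not None and overlay.get(field) not in allowed:
            return False
    return True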
[0096] The AR rendering device 800 determines 820 a characteristic of the virtual object to be notified to the second user based on the overlay sharing policy. The AR rendering device 800 sends 822 the notification to the second AR display 200b for display 824 by the second user.
[0097] The second AR display 200b may optionally obtain 826 an instruction from the second user which is then relayed to the AR rendering device 800. The AR rendering device 800 responsively instructs 828 a modification of the virtual object (e.g., to change which virtual object is overlaid; where the virtual object is displayed as an overlay relative to the second user; or the size, color, transparency, texture, etc. of the virtual object) or instructs ceasing rendering of the virtual object. The first AR display 200a responsively operates 830 to modify or cease rendering of the virtual object.
[0098] More general operations that can be performed by an AR rendering device for rendering virtual objects displayed by a first AR display as overlays viewed by a first user are now explained with reference to the flowchart of Figure 9.
[0099] Referring to Figure 9, the operations render 900 a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display. The operation to render 900 the virtual object may be performed by a device separate from the first AR display, to identify and/or generate a graphical rendering of the virtual object to be displayed by the first AR display. Alternatively, the rendering 900 may be performed by processing circuitry of the first AR display to display the virtual object on a display device.
[00100] Operations obtain 902 information associated with the second user who is transporting a second AR display. Further operations determine 904 an overlay sharing policy based on the information associated with the second user.
[00101] In one embodiment, the operations to obtain 902 information associated with the second user include to receive, from a camera connected to the first AR display, video of a face of the second user, and to identify characteristics of the face of the second user. The operation to determine 904 the overlay sharing policy includes to select 906 the overlay sharing policy from among a repository of overlay sharing policies based on the identified characteristics of the face of the second user (e.g., evaluation of facial attributes such as face recognition).
[00102] In another embodiment, the operations to obtain 902 information associated with the second user include to receive, from a microphone connected to the first AR display, digitized voice of the second user, and to identify characteristics of the digitized voice of the second user. The operation to determine 904 the overlay sharing policy includes to select 908 the overlay sharing policy from among a repository of overlay sharing policies based on the identified characteristics of the digitized voice of the second user.
[00103] In another embodiment, the operations to obtain 902 information associated with the second user include to receive an identifier transmitted by a transmitter connected to the second AR display. The operation to determine 904 the overlay sharing policy includes to select 910 the overlay sharing policy from among a repository of overlay sharing policies based on the identifier transmitted by the transmitter connected to the second AR display.
[00104] In some other embodiments, the operations to obtain 902 the second user information can include performing voice recognition on digitized voice of the second user, or identifying attributes associated with the second user's AR device, e.g., signalling from the radio, Bluetooth, or WiFi interface of the AR device, flickering of an LED of the AR device, scanning of a machine-readable (e.g., QR) code on or displayed by the AR device, etc.
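The selection steps 906, 908, and 910 share a common pattern that can be sketched as a repository lookup; the recognition pipelines producing the keys are assumed to exist elsewhere:

# Hypothetical selection of an overlay sharing policy from a repository,
# keyed by whichever identification of the second user is available.

def determine_policy(repository, transmitted_id=None, face_id=None,
                     voice_id=None, default_policy=None):
    for key in (transmitted_id, face_id, voice_id):
        if key is not None and key in repository:
            return repository[key]
    return default_policy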
[00105] Figure 10 illustrates a flowchart of operations that can be performed by the AR rendering device 800 to obtain and use user information in some embodiments. Referring to Figure 10, the operations to obtain 902 (Fig. 9) the second user information can include to determine 1000 gaze direction of the first user wearing the first AR display, determine 1002 angular orientation of the second user relative to the first user, and identify 1004 that the first user is looking at the second user based on correlating the gaze direction to the angular orientation of the second user relative to the first user. The operation to determine 904 (Fig. 9) the overlay sharing policy, can include to select 1006 the overlay sharing policy from among a repository of overlay sharing policies based on identifying that the first user is looking at the second user.
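Assuming gaze direction and the bearing to the second user are both expressed as angles in a common horizontal frame, the correlation of operations 1000-1004 might be sketched as:

# Hypothetical gaze test: the first user is deemed to be looking at the
# second user if the angular difference is within an assumed tolerance.

def is_looking_at(gaze_deg, bearing_to_user_deg, tolerance_deg=10.0):
    diff = abs((gaze_deg - bearing_to_user_deg + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg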
[00106] Figure 11 illustrates a flowchart of operations that can be performed by the AR rendering device 800 to obtain and use user information in some embodiments. Referring to Figure 11, the operations to obtain 902 (Fig. 9) the second user information can include to determine 1100 that the second user is an active speaker among a plurality of users who are operating AR displays which are communicating with the AR rendering device, where each of the plurality of users has an overlay sharing policy defined in a repository with an association to the user. The operation to determine 904 (Fig. 9) the overlay sharing policy can include to select 1102 the overlay sharing policy from among the repository based on the selected overlay sharing policy being associated with the second user identified as being the active speaker.
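A minimal sketch of operations 1100 and 1102, with voice-activity detection assumed to be provided by a separate is_speaking function, could read:

# Hypothetical selection of the active speaker's overlay sharing policy.

def policy_for_active_speaker(users, repository, is_speaking):
    for user in users:
        if is_speaking(user):
            return repository.get(user)
    return None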
[00107] In some other embodiments, the operations to obtain 902 (Fig. 9) the second user information can include to identify a priority associated with the second user based on identifying the second user. The operation to determine 904 (Fig. 9) the overlay sharing policy, can include to select the overlay sharing policy based on the priority associated with the second user.
[00108] Further operations determine 912, based on the overlay sharing policy, a characteristic of the virtual object to be notified to the second user. Further operations send 914 for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
[00109] The overlay sharing policy can control if, when, and/or how notifications are shared with the second user. For example, the overlay sharing policy may cause basic general information to be shared, such as a general indication that a virtual object is being overlaid on you in the view of another user. The overlay sharing policy may cause other information to be shared, such as an indication of the relative spatial location of (e.g., the direction to) the other user who is viewing a virtual object overlaid on you, and/or may identify the user who is viewing the virtual object as the overlay on you. Other examples of notifications that are provided pursuant to overlay sharing policies are described below.
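Purely as an illustration, the notification content could be derived from an assumed policy level as follows; the level names and fields are invented for the example:

# Hypothetical derivation of notification content from the overlay sharing
# policy level: basic notice, relative direction, or identified viewer.

def build_notification(policy_level, overlay, viewer_name):
    if policy_level == "basic":
        return {"message": "A virtual object is overlaid on you"}
    if policy_level == "direction":
        return {"message": "A nearby user views an overlay on you",
                "direction_deg": overlay.get("viewer_bearing_deg")}
    return {"message": viewer_name + " is viewing an overlay on you",
            "object_type": overlay.get("type")}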
[00110] In one embodiment, the operation to send 914 the notification includes to receive an image of the second user from a camera connected to the first AR display, generate a composite image showing the virtual object overlaid on the image of the second user, and send 916 toward the second AR display the notification containing the composite image for viewing by the second user. The composite image may be generated to have a lower image resolution than the image of the second user received from the camera to, for example, reduce the data bandwidth used to send the composite image for display, to reduce the size of the displayed composite image, etc.
[00111] In another embodiment, the operation to send 914 the notification includes to send 918 toward the second AR display the notification containing the virtual object for viewing by the second user. For example, the notification may contain a graphical rendering of the virtual object, an identification of the virtual object which the second AR display can reference to operationally look up and display the virtual object, a schematic generalization of the virtual object (e.g., an edge-detected image of the virtual object), graphically emphasized (e.g., highlighted) parts of the virtual object, etc.
[00112] In another embodiment, the operation to send 914 the notification includes to send 920 toward the second AR display the notification containing a textual indication or a graphical indication that a virtual object is being displayed by the first AR display as an overlay on the second user.
[00113] As explained above, the AR rendering device may be a computing server that is separate from, but communicatively networked to, the first AR display and the second AR display. The AR rendering device may therefore receive from the first AR display, through at least one network, an indication of how the virtual object appears when viewed by the first user through the first AR display as an overlay on the second user, generate the notification based on the received indication, and send 914 the notification toward the second AR display through at least one network.
[00114] Alternatively, the AR rendering device and the first AR display may be part of a portable AR device (e.g., HMD) transported by the first user. The AR rendering device may generate an indication of how the virtual object appears when viewed by the first user through the first AR display as an overlay on the second user, generate the notification based on the indication, and send 914 the notification toward the second AR display through at least one network.
[00115] The second user's voice may be augmented in the AR environment experienced by the first user by modifying the second user's digitized voice and/or by applying an audio overlay on the second user's digitized voice played out through a speaker listened to by the first user.
[00116] In one embodiment, operations of the AR rendering device determine an audio modification that is applied to a voice stream from a microphone connected to the second AR display capturing voice of the second user for playout through a speaker connected to the first AR display to the first user. The operations determine based on the overlay sharing policy a characteristic of the audio modification to be notified to the second user. The operations send for display by the second AR display or playout through a speaker connected to the second AR display to the second user, a notification indicating the characteristic of the audio modification.
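The audio counterpart of the overlay notification might be sketched as follows; the policy fields and modification descriptor are assumptions made for the example:

# Hypothetical filtering of which characteristics of an applied voice
# modification are reported to the second user under the sharing policy.

def audio_modification_notice(modification, policy):
    if not policy.get("share_audio_modifications", False):
        return None
    shared_fields = policy.get("audio_fields", ["effect_name"])
    return {field: modification.get(field) for field in shared_fields}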
[00117] Figure 12 illustrates components of an AR system that are configured to operate in accordance with some embodiments of the present disclosure.
[00118] Referring to Figure 12, the AR system 1260 includes an AR rendering device 800 and AR displays 200a-c. Although the AR system 1260 is illustrated with the AR rendering device 800 located separate from the AR displays 200a-c and communicatively connected thereto through one or more networks 1270, e.g., public (Internet) and/or private communication networks, the AR rendering device 800 may be incorporated into one or more of the AR displays 200a-c. The AR rendering device 800 may be configured as a networked computing server or may be a component (e.g., hardware and/or software) of an AR display.
[00119] The AR rendering device 800 includes at least one processor circuit 1250 ("processor") and at least one memory circuit 1252 ("memory") storing instructions executable by the processor 1250 to perform operations according to one or more embodiments disclosed herein. The AR rendering device 800 can include a network interface 1256 operable to communicate through the one or more networks 1270 with the one or more AR displays 200a-c. The AR rendering device 800 may further include a display 1254.

[00120] The AR displays 200a-c each include at least one processor circuit 1200 ("processor") and at least one memory circuit 1210 ("memory") storing instructions executable by the processor 1200 to perform operations. The AR displays 200a-c may include a network interface 1220 operable to communicate through the one or more networks 1270 with the AR rendering device 800, or may operate to communicate directly with other ones of the AR displays. The AR displays 200a-c include displays 240a-c, e.g., near-eye displays (NED) as present in HMDs, and may each include a forward-facing camera 1230 (with a field-of-view facing away from the AR display), a speaker 1232, and a microphone 1234.
[00121] Further Definitions and Embodiments:
[00122] In the above description of various embodiments of present inventive concepts, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of present inventive concepts. Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which present inventive concepts belong. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[00123] When an element is referred to as being "connected", "coupled", "responsive", or variants thereof to another element, it can be directly connected, coupled, or responsive to the other element or intervening elements may be present. In contrast, when an element is referred to as being "directly connected", "directly coupled", "directly responsive", or variants thereof to another element, there are no intervening elements present. Like numbers refer to like elements throughout. Furthermore, "coupled", "connected", "responsive", or variants thereof as used herein may include wirelessly coupled, connected, or responsive. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Well-known functions or constructions may not be described in detail for brevity and/or clarity. The term "and/or" includes any and all combinations of one or more of the associated listed items.
[00124] It will be understood that although the terms first, second, third, etc. may be used herein to describe various elements/operations, these elements/operations should not be limited by these terms. These terms are only used to distinguish one element/operation from another element/operation. Thus, a first element/operation in some embodiments could be termed a second element/operation in other embodiments without departing from the teachings of present inventive concepts. The same reference numerals or the same reference designators denote the same or similar elements throughout the specification.

[00125] As used herein, the terms "comprise", "comprising", "comprises", "include", "including", "includes", "have", "has", "having", or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof. Furthermore, as used herein, the common abbreviation "e.g.", which derives from the Latin phrase "exempli gratia", may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item. The common abbreviation "i.e.", which derives from the Latin phrase "id est", may be used to specify a particular item from a more general recitation.
[00126] Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits. These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
[00127] These computer program instructions may also be stored in a tangible computer- readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the functions/acts specified in the block diagrams and/or flowchart block or blocks. Accordingly, embodiments of present inventive concepts may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.) that runs on a processor such as a digital signal processor, which may collectively be referred to as "circuitry," "a module" or variants thereof.
[00128] It should also be noted that in some alternate implementations, the functions/acts noted in the blocks may occur out of the order noted in the flowcharts. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Moreover, the functionality of a given block of the flowcharts and/or block diagrams may be separated into multiple blocks, and/or the functionality of two or more blocks of the flowcharts and/or block diagrams may be at least partially integrated. Finally, other blocks may be added/inserted between the blocks that are illustrated, and/or blocks/operations may be omitted without departing from the scope of inventive concepts. Moreover, although some of the diagrams include arrows on communication paths to show a primary direction of communication, it is to be understood that communication may occur in the opposite direction to the depicted arrows.

[00129] Many variations and modifications can be made to the embodiments without substantially departing from the principles of the present inventive concepts. All such variations and modifications are intended to be included herein within the scope of present inventive concepts. Accordingly, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended examples of embodiments are intended to cover all such modifications, enhancements, and other embodiments, which fall within the spirit and scope of present inventive concepts. Thus, to the maximum extent allowed by law, the scope of present inventive concepts is to be determined by the broadest permissible interpretation of the present disclosure including the following examples of embodiments and their equivalents, and shall not be restricted or limited by the foregoing detailed description.

Claims
1. A method by an augmented reality, AR, rendering device for rendering virtual objects displayed by a first AR display as overlays viewed by a first user, the method comprising: rendering (900) a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display; obtaining (902) information associated with the second user, wherein the second user is transporting a second AR display; determining (904) an overlay sharing policy based on the information associated with the second user; determining (912) based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user; and sending (914) for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
2. The method of Claim 1, wherein: the AR rendering device comprises a computing server separate from the first AR display and the second AR display; and the sending (914) for display by the second AR display the notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user, comprises receiving from the first AR display through at least one network an indication of how the virtual object appears when viewed by the first user through the first AR display as an overlay on the second user, generating the notification based on the received indication, and sending the notification toward the second AR display through at least one network.
3. The method of Claim 1, wherein: the AR rendering device and the first AR display are part of a portable AR device transported by the first user; and the sending (914) for display by the second AR display the notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user, comprises generating an indication of how the virtual object appears when viewed by the first user through the first AR display as an overlay on the second user, generating the notification based on the indication, and sending the notification toward the second AR display through at least one network.
4. The method of any of Claims 1 to 3, wherein: the obtaining (902) information associated with the second user, comprises: receiving from a camera connected to the first AR display, video of a face of the second user, and identifying characteristics of the face of the second user; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy from among a repository of overlay sharing policies based on the identified characteristics of the face of the second user.
5. The method of any of Claims 1 to 4, wherein: the obtaining (902) of information associated with the second user, comprises: receiving from a microphone connected to the first AR display, digitized voice of the second user, and identifying characteristics of the digitized voice of the second user; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy from among a repository of overlay sharing policies based on the identified characteristics of the digitized voice of the second user.
6. The method of any of Claims 1 to 5, wherein: the obtaining (902) of information associated with the second user, comprises receiving an identifier transmitted by a transmitter connected to the second AR display; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy from among a repository of overlay sharing policies based on the identifier transmitted by the transmitter connected to the second AR display.
7. The method of any of Claims 1 to 6, wherein the sending (914) for display by the second AR display the notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user, comprises: receiving an image of the second user from a camera connected to the first AR display; generating a composite image showing the virtual object overlaid on the image of the second user; and sending toward the second AR display the notification containing the composite image for viewing by the second user.
8. The method of Claim 7, wherein the generating of the composite image showing the virtual object overlaid on the image of the second user, comprises: generating the composite image to have a lower image resolution than the image of the second user received from the camera.
9. The method of any of Claims 1 to 8, wherein the sending (914) for display by the second AR display the notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user, comprises: sending toward the second AR display the notification containing the virtual object for viewing by the second user.
10. The method of any of Claims 1 to 9, wherein the sending (914) for display by the second AR display the notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user, comprises: sending toward the second AR display the notification containing a textual indication or a graphical indication that a virtual object is being displayed by the first AR display as an overlay on the second user.
11. The method of any of Claims 1 to 10, further comprising: determining an audio modification that is applied to a voice stream from a microphone connected to the second AR display capturing voice of the second user for playout through a speaker connected to the first AR display to the first user; determining based on the overlay sharing policy a characteristic of the audio modification to be notified to the second user; and sending for display by the second AR display or playout through a speaker connected to the second AR display to the second user, a notification indicating the characteristic of the audio modification.
12. The method of any of Claims 1 to 11, further comprising: receiving (828) an instruction from the second user via the second AR display, wherein the instruction indicates a requested modification of the overlay; and modifying (828) the rendering of the virtual object or modifying where the virtual object is rendered through the first AR display as an overlay responsive to the instruction.
13. The method of any of Claims 1 to 12, further comprising: receiving (828) an instruction from the second user via the second AR display, wherein the instruction indicates a requested termination of the virtual object overlaid on the second user; and communicating (828) with the first AR display to terminate display of the virtual object as an overlay on the second user.
14. The method of any of Claims 1 to 13, further comprising: receiving from the second AR display an AR overlay-rendering compliance policy responsive to the notification sent to the second AR display; and responsive to determining the virtual object rendered by the first AR display does not satisfy the AR overlay-rendering compliance policy, performing one of: modifying the rendering of the virtual object, modifying where the virtual object is rendered through the first AR display as an overlay, or communicating with the first AR display to terminate display of the virtual object as an overlay on the second user.
15. The method of Claim 14, further comprising: determining from the AR overlay-rendering compliance policy at least one of: type of virtual objects allowed or not allowed to be displayed as overlay on the second user; texture of virtual objects allowed or not allowed to be displayed as overlay on the second user; colour of virtual objects allowed or not allowed to be displayed as overlay on the second user; size of virtual objects allowed or not allowed to be displayed as overlay on the second user; or location where virtual objects are allowed or not allowed to be displayed as overlay on the second user.
16. The method of any of Claims 1 to 15, wherein: the obtaining (902) of information associated with the second user, comprises determining gaze direction of the first user wearing the first AR display, determining angular orientation of the second user relative to the first user, and identifying that the first user is looking at the second user based on correlating the gaze direction to the angular orientation of the second user relative to the first user; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy from among a repository of overlay sharing policies based on identifying that the first user is looking at the second user.
17. The method of any of Claims 1 to 16, wherein: the obtaining (902) of information associated with the second user, further comprises determining the second user is an active speaker among a plurality of users who are operating AR displays which are communicating with the AR rendering device, wherein each of the plurality of users have an overlay sharing policy defined in a repository with an association to the user; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy from among the repository based on the selected overlay sharing policy being associated with the second user identified as being the active speaker.
18. The method of any of Claims 1 to 17, wherein: the obtaining (902) of information associated with the second user, further comprises identifying a priority associated with the second user based on identifying the second user; and the determining (904) of the overlay sharing policy, comprises selecting the overlay sharing policy based on the priority associated with the second user.
19. An augmented reality, AR, rendering device (800) for rendering virtual objects displayed by a first AR display as overlays viewed by a first user, the AR rendering device comprising: at least one processor (1250); and at least one memory (1252) storing instructions executable by the at least one processor (1250) to perform operations to: render a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display; obtain information associated with the second user, wherein the second user is transporting a second AR display; determine an overlay sharing policy based on the information associated with the second user; determine based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user; and send for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
20. The AR rendering device of Claim 19, wherein the operations further comprise to perform the method of any of Claims 2 to 18.
21. A computer program product comprising a non-transitory computer readable medium storing instructions executable by at least one processor of an augmented reality, AR, rendering device to perform operations for rendering virtual objects displayed by a first AR display as overlays viewed by a first user, the operations comprising to: render a virtual object for display by the first AR display as an overlay on a second user who is viewed by the first user through the first AR display; obtain information associated with the second user, wherein the second user is transporting a second AR display; determine an overlay sharing policy based on the information associated with the second user; determine based on the overlay sharing policy a characteristic of the virtual object to be notified to the second user; and send for display by the second AR display a notification indicating the characteristic of the virtual object which is overlaid through the second AR display on the second user.
22. The computer program product of Claim 21, wherein the operations further comprise to perform the method of any of Claims 2 to 18.