WO2016029224A1 - Apparatus, system and method for providing users with a shared media experience - Google Patents
Apparatus, system and method for providing users with a shared media experience
- Publication number
- WO2016029224A1 (PCT/US2015/046600)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- media
- head
- mounted display
- experience
- display apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/2866—Architectures; Arrangements
- H04L67/30—Profiles
- H04L67/306—User profiles
Definitions
- the invention relates generally to apparatuses, systems, and methods that play media content for users. More specifically, the invention is an apparatus, system, and method that allows users to engage in a shared media experience (collectively, the "system"). Such media experiences can be immersive or non-immersive. Such media experiences can be live, recorded, and/or virtual.
- the consumption of media is exploding on a world-wide basis. Movies, television programs, video games, music, video clips, and other types of media content are accessed by billions of people around the world. Consumers of media can access a universe of ever-increasing content through both traditional as well as new delivery channels. For example, movies are viewed in movie theaters, but they are also viewed through online streaming services, pay-per-view cable and satellite services, and optical discs such as DVDs, BLU-RAY® discs, and other similar media.
- the use of social media is also increasing on an exponential basis.
- Users of social networks such as Facebook, LinkedIn, MySpace, Google+, and other social networks are accustomed to sharing photographs, video clips, and other types of media with their family, friends, and co-workers.
- attending a concert, a theatrical production, a professional sporting event, a birthday party for a five year old, a picnic on the beach, or other similar activities involves capturing media relating to the event, and sharing such pictures and videos with accompanying commentary on social media. Sharing what we do and see is now an important part of the daily lives of many people.
- the invention relates generally to apparatuses, systems, and methods that play media content for users. More specifically, the invention is an apparatus, system, and method that allows users to engage in a shared media experience (collectively, the "system"). Such media experiences can be immersive or non-immersive. Such media experiences can be live, recorded, and/or virtual.
- the system can build upon the relationships between users, the expansion in the different types of media experiences, and even the use of immersive media experiences in which users navigate "virtual" spaces while being visually represented to other users in the forms of avatars.
- the system can incorporate a wide variety of prior art communication technologies that can be used to establish a relationship between users or to otherwise direct content to other users.
- Social media accounts, e-mail accounts, text message accounts, online messaging technologies, and other forms of communication can be used by a user of the system to direct a point-of-view media experience to other users.
- the system can utilize an existing information technology infrastructure to implement the functionality of the system.
- the system can benefit from future advances in social media and communication technologies that give impact to relationships between people and affiliations with groups.
- the system can build upon relationships and associations that exist between different users and groups of users. The system can further expand and deepen such relationships and associations.
- the system can be used to share a wide range of different media experiences.
- Media experiences can be embodied as acoustic content (sound), visual content (sight), or both. Some media experiences may even include a vibration component (kinetic) as well.
- Some media experiences can be live events, such as concerts, theatrical productions, sporting events, political speeches, family reunions, religious services, and other types of live events.
- Other media experiences may be recordings of previously live events, with such recordings being made by ordinary persons participating in the event.
- Still other types of media experiences never existed as live experiences, such as a movie, television show, video game, or a song recorded in a music studio.
- Media experiences can also include immersive media experiences such as virtual realities that users can navigate while interacting with each other.
- III. Avatars
- the system can use avatars to represent the users within the virtual and navigable space.
- Avatars can range from actual images of the user to fully arbitrary symbols, graphics, and/or video.
- the system can provide users with the functionality of sharing immersive media experiences as well as non-immersive media experiences. In addition to that functionality, some embodiments of the system can also provide the ability to interact with other users in their same group during the shared media experience. Such interactions can occur by allowing users to interact through avatars. Use of avatars is not a requirement for the system, and as such, the system can function without avatars.
- an avatar can include an acoustic component (sound), a visual component (sight), or both. Some avatars may even include a vibration component (kinetic).
- Avatars can in some embodiments be recorded or live. For example, in the context of a live event such as a professional baseball game or a concert, viewers in their respective homes could speak to each other, and potentially even interact as avatars that are part of the media experience.
- where the media experience is a recorded event or even a purely artificial creation such as a movie or music video, individuals sharing a viewing experience can interact with each other through avatars while engaging in the shared media experience.
- the avatar of a previous viewer can be recorded and enjoyed as part of the media experience of a later viewing user.
- the system can utilize a wide variety of different devices and a wide variety of different device configurations and architectures.
- Access devices can themselves be split up into different components. Sometimes an access device and a media player are separate devices, while in other instances they may be integrated. For example, a user may use a head-mounted apparatus to access a media experience that is routed through a general purpose computer, an internet streaming device, a television set, a cable box, a satellite dish, or some other form of media player.
- An access device could be a conventional pair of prior art headphones accessing media through a conventional stereo system.
- the access device and media player are integrated into a single device.
- Some embodiments of the system can utilize a head-mounted display apparatus that is capable of delivering both visual and audio content to the user.
- Media content can be shared between users through a wide variety of different distribution devices and configurations of distribution devices.
- the media experience is itself created by a user initiating the sharing with the other users.
- the media experience is created by a third party such as record label, Hollywood studio, or video game producer.
- the function of capturing a media experience through a capture device and the function of distributing the media experience through a distribution device can be combined into the same device. Virtually any type of prior art device for capturing media experiences and/or distributing media experiences can be utilized by the system.
- Examples of potential capture devices and/or distribution devices include but are not limited to head-mounted access apparatuses such as head-mounted display apparatuses, video cameras, 360 degree video cameras, still-frame cameras, microphones, servers, general purpose computers, and other similar devices.
- head-mounted apparatuses can utilize cameras to share the user's point-of-view with respect to a media experience.
- a group of friends could use the system to watch a football game remotely from their own homes, while enjoying aspects of the game that are not normally accessible from home.
- a 360 degree video camera can be positioned in an appropriate seat at the football stadium.
- the users of the system can utilize head-mounted display apparatuses that allow them to navigate the field of vision as if they were actually sitting in the seats occupied by the 360 degree video camera.
- the group of friends could actually interact with each other's avatars, making comments, giving each other high-fives when a favorite player scores a touchdown, etc.
- a specific user could be part of two different groups watching the same football game from the same position.
- the system could allow that user to participate as an avatar with respect to both groups simultaneously.
- in some embodiments of the system, a user can use the system to almost appear in two places at once.
- a parent of two kids could potentially alternate between watching a son's soccer game, with a 360 degree video camera allowing virtual navigation of the space and avatar interaction, and watching a daughter's dance recital in a similarly equipped physical space.
- the avatar of the parent could interact with both kids in the physical world through the appropriate communications equipment.
- the avatar of the parent could also interact with other avatars of other viewers of either event.
- the examples above relate to either live events or recordings of live events.
- the examples above could also be used with respect to purely fictitious media experiences such as movies, television programs, videos, books on tape, choose your own adventure books, etc.
- Such interactive media approaches require the cooperation of the media content provider.
- a movie that is designed to allow viewers to navigate the point of view on their screens will require different types of filming approaches than a conventional movie.
- the distinction between movies and virtual reality could be blurred by the existence of truly interactive movies, with different media experiences providing for different levels of interactivity.
- the system can be used to facilitate point-of-view (“POV”) sharing, without giving other users any ability or any substantial ability to independently impact what is seen.
- POV: point-of-view
- the remote participants would at most be able to navigate their particular focus within the range of the images captured and transmitted by the physically present user.
- Such point of view sharing can be done regardless of the type of media experience, be it live, recorded, or fictitious, and whether or not the media experience is comprised exclusively of sound, exclusively of video, or of combinations of both.
- Point of view sharing can be done without the use of 360 degree video cameras.
- Users wearing head-mounted display apparatuses can be equipped with cameras that continuously record and/or transmit the point of view of the user wearing the head-mounted display apparatus.
- Point of view sharing can be done with avatars or without avatars, with avatars in that context being limited to the function of selecting an applicable point of view when a user has many points of view from which to choose.
- Point of view sharing can be done with audio-only media experiences as well as with video-only and audio-video media experiences.
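- The point-of-view sharing described above implies that a recipient's navigation is bounded by whatever the physically present user actually captured. The sketch below is purely illustrative and not taken from the patent: it assumes the captured imagery can be described by a center direction plus a horizontal and vertical field of view, and it clamps a recipient's requested focus direction to that range.

```python
# Illustrative only: clamp a recipient's chosen focus direction to the range of
# imagery captured by the physically present user. The field-of-view numbers
# below are assumptions, not values taken from the patent.
from dataclasses import dataclass


@dataclass
class CapturedView:
    yaw_deg: float      # direction the capturing camera is facing
    pitch_deg: float
    h_fov_deg: float    # horizontal extent of the captured imagery
    v_fov_deg: float    # vertical extent of the captured imagery


def clamp_focus(captured: CapturedView, want_yaw: float, want_pitch: float):
    """Return the closest focus direction that stays inside the captured imagery."""
    half_h = captured.h_fov_deg / 2
    half_v = captured.v_fov_deg / 2
    yaw = min(max(want_yaw, captured.yaw_deg - half_h), captured.yaw_deg + half_h)
    pitch = min(max(want_pitch, captured.pitch_deg - half_v), captured.pitch_deg + half_v)
    return yaw, pitch


view = CapturedView(yaw_deg=0.0, pitch_deg=0.0, h_fov_deg=120.0, v_fov_deg=90.0)
print(clamp_focus(view, want_yaw=80.0, want_pitch=-10.0))  # (60.0, -10.0): yaw is clamped
```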
- Figure 1a is a hierarchy diagram illustrating the different categories and subcategories of media experiences in terms of the senses that are involved in experiencing such content.
- a media experience can include audio, visual, and/or kinetic content.
- Figure 1b is a hierarchy diagram illustrating the different categories of media experiences in terms of whether the media experience is limited to a point of view or conversely, provides users with a space that can be independently navigated.
- Figure 1c is a hierarchy diagram illustrating the different categories of media experiences in terms of whether a media experience is live, recorded, or created.
- Figure 1d is a hierarchy diagram illustrating the different categories of media experiences based on the distinctions of whether avatars and the capability of avatar interactions are included in the media experience.
- Figure 2a is a block diagram illustrating an example of a media experience being captured by a capture device, distributed through a distribution device, and accessed across different networks by different access devices.
- Figure 2b is a block diagram similar to Figure 2a, except that the concept of a group of connected users is displayed along with the concept of interacting avatars.
- Figure 2c is a block diagram similar to Figure 2b, except that multiple groups are illustrated. Interactions between avatars are limited to within group boundaries.
- Figure 3a is a block diagram illustrating an example of a system in which point-of-view media experiences are shared using wearable devices such as head-mounted display apparatuses.
- the head-mounted display apparatus of the sharing user can be used to either (1) capture the media experience or (2) play a pre-recorded media experience. In either case, the media experience is then shared with other users. While the recipient users are illustrated using head-mounted display apparatuses, other types of access devices can be used.
- Figure 3b is similar to Figure 3a except that the illustrated embodiment supports the capability of sharing a media experience that is not limited to a specific point of view, i.e. the shared media experience is capable of being navigated by the recipient users.
- Figure 3c is similar to Figure 3a except that the alternative access devices are disclosed.
- Figure 3d is similar to Figure 3c, except that a generic distribution component is indicated. As discussed above and below, the system can be implemented using a wide variety of different access, distribution, and capture devices.
- Figure 4 is a flow chart diagram illustrating an example of a process flow for sharing a media experience.
- Figure 5a is a block diagram illustrating an example of how a distribution list and social media on a network can be used to share media experiences between different users with head-mounted video displays across a network.
- Figure 5b is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and a media experience.
- Figure 5c is a block diagram illustrating an example of a sensor as an interface between a user and a sensor feedback.
- Figure 5d is a block diagram illustrating an example of a head-mounted display apparatus as an interface between a user and an instruction.
- Figure 6a is a block diagram illustrating an example of components that can be included in a computer.
- Figure 6b is a block diagram illustrating an example of different types of computers.
- Figure 6c is a block diagram illustrating an example of different components of the system that can include computers.
- Figure 6d is a block diagram illustrating an example of a variation of Figure 6c that does not include a remote server.
- Figure 6e is a block diagram illustrating an example of a variation of Figures 6c and 6d that does not include a media player that is separate from the head-mounted display apparatus.
- Figure 6f is a block diagram illustrating an example of different types of data that can be stored on a database used by the system.
- the invention relates generally to apparatuses, systems, and methods that play media content for users. More specifically, the invention is an apparatus, system, and method that allows users to engage in a shared media experience (collectively, the "system"). Such media experiences can be immersive or non-immersive. Such media experiences can be live, recorded, and/or virtual. Table 1 below provides a glossary/index of claim element numbers, claim element names, and claim element descriptions.
- the system provides a way for users ("sharing users” or “sharers”) to share a wide range of different media experiences with their family, friends, co-workers, employees, customers, vendors, neighbors, and other recipients (collectively, “recipient users”, “recipients” or “participants”).
- Media experiences can be defined and categorized with respect to a variety of different dimensions or attribute types including but not limited to: (1) the senses used to perceive the media experience, such as sound, sight, and/or touch; (2) whether the shared media experience is embodied in a "space" that can be navigated by the recipient or whether the recipient is limited to a particular point-of-view; (3) whether the media experience is the live, recorded, or created; and (4) whether the media experience includes the ability of recipients to interact with each other through avatars.
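- As one illustrative way to picture these four dimensions, the short sketch below models a media experience 600 as a record tagged with its senses, navigation style, origin, and avatar support. The class and field names are assumptions introduced for this example and are not terms defined by the patent.

```python
# A minimal, assumed data model for the four classification dimensions above.
from dataclasses import dataclass
from enum import Enum, auto


class Sense(Enum):
    AUDIO = auto()
    VISUAL = auto()
    KINETIC = auto()


class Navigation(Enum):
    POINT_OF_VIEW = auto()     # recipient is limited to a fixed viewpoint
    NAVIGABLE_SPACE = auto()   # recipient can move within a virtual space


class Origin(Enum):
    LIVE = auto()
    RECORDED = auto()
    CREATED = auto()


@dataclass
class MediaExperience:
    """One shared media experience, tagged along the four dimensions."""
    title: str
    senses: set
    navigation: Navigation
    origin: Origin
    supports_avatars: bool = False


# Example: a live football game captured by a 360 degree camera.
game = MediaExperience(
    title="Football game",
    senses={Sense.AUDIO, Sense.VISUAL},
    navigation=Navigation.NAVIGABLE_SPACE,
    origin=Origin.LIVE,
    supports_avatars=True,
)
print(game)
```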
- Figure 1a is a hierarchy diagram illustrating the different categories and subcategories of media experiences 600 in terms of the senses that are involved in experiencing such content.
- Examples of different types of media experiences 600 can include: (1) an audio-only media experience 601 where the media experience 600 consists solely of acoustic content, i.e. sounds; (2) a visual-only media experience 602 where the media experience 600 consists solely of visual content, i.e. images; (3) an audio-visual media experience 603 where both sounds and images are included; (4) an audio-visual-kinetic media experience 605 where vibration/touch content is added to audio-visual content; (5) an audio-kinetic media experience 607 where sound and vibration are used, but visual content is not included; and (6) a visual-kinetic media experience 608 that involves sight and vibration/touch, but no acoustic content.
- Video content is a subset of visual content. All video is visual, but not all visual content is comprised of video. Examples of non-video visual content can include text, photographs, drawings, and symbols. Examples of subcategories of visual content that include video content include an audio-video media experience 604, an audio-video-kinetic media experience 606, and a video-kinetic media experience 609.
- Figure lb is a hierarchy diagram illustrating the different categories of media experiences 600 in terms of whether the media experience is limited to a point of view (a point of view media experience 612) or conversely, provides users with a space that can be independently navigated (a navigable space media experience 610).
- a point of view media experience 612 is by far the most common type of media experience 600 in the prior art.
- when moviegoers visit the theater to see the latest theatrical release, their experience of the film is limited to the point of view presented on the screen.
- the viewer can decide to ignore portions of the screen or focus more attentively to certain portions of the screen, but the user does not have the ability to see anything outside the parameters of what is displayed on the screen.
- Figure 3a illustrates an example of a media experience 600 that is likely to be a point of view media experience 612 if it is a live event being recorded through a sensor 300 such as a video camera located within a head-mounted display apparatus 200.
- the media experience 600 conveyed to a recipient user 80 in Figure 3a would be whatever is in the current view of the sensor 300.
- the media experience 600 in that context would be limited to a particular point of view, and constitute a point of view media experience 612.
- a navigable space media experience 610 is a media experience 600 where different users 80 are free to independently navigate the media experience 600.
- Figure 3b illustrates an example of a 360 degree camera 210 used to capture the media experience 600 that is then distributed to the various users 80.
- each user 80 can navigate that "space" independently even though they are each engaging in the same shared media experience 600.
- Figure lc is a hierarchy diagram illustrating the different categories of media experiences 600 in terms of whether a media experience is live, recorded, or created.
- a live event media experience 620 is a media experience 600 that is being shared with users 80 simultaneously or at least substantially simultaneously as the event occurs. Examples of live media experiences 620 include but are not limited to concerts, professional football games, dance recitals, graduation ceremonies, little league baseball games, family reunions, and church picnics. Live event media experiences 620 can be captured by a participating user 80 or by a third party such as a television network, stadium personnel, or professional videographer. A live event media experience 620 that is captured and shared by a participating user 80 is a user captured live event media experience 621. A live event media experience 620 captured by a third party is a third party captured live event media experience 622.
- a recorded event media experience 624 is similar to a live event media experience 620 except for the fact that the recorded event media experience 624 is being accessed by recipient users 80 after the event has occurred.
- the system can be used to automatically record live event media experiences 620 as recorded event media experiences 624 so that they can be enjoyed over and over again by participating users 80, as well as by those users 80 unable to engage with the media experience 600 when provided live.
- live event media experiences 620 can be differentiated on the basis of who is capturing the event for dissemination.
- recorded event media experiences 624 can be similarly divided into the subcategories of a user recorded media experience 625 and a third party recorded media experience 626.
- a created media experience 627 is something that is beyond a live streaming or recording of an event.
- a created media experience 627 is often fictional, and is not an event or activity that has actually occurred.
- Created media experiences 627 almost always involve editors, directors, actors, authors, and other people involved in the creative process. Examples of created media experiences 627 include most movies, television programs, video games, virtual reality experiences, books, studio music recordings, and other media that is not exclusively limited to an actual event that happened in the real non-virtual world.
- Figure 1d is a hierarchy diagram illustrating the different categories of media experiences 600 with delineations being made on the basis of whether avatars and avatar interactions are included with those media experiences.
- Media experiences 600 can include interactions between the participating users 80.
- a group of users 80 enjoying a baseball game captured with the 360 degree video camera 210 of Figure 3b could be allowed to interact with each other through avatars. While sitting in their chairs or sofas at home, their avatars could be sitting at virtual seats in the stadium with the participating users 80 being engaged in their traditional banter.
- Avatars, like other aspects of media experiences 600, can consist exclusively of audio elements, exclusively of visual elements, or combinations of both. Visual elements could be video or still-frame images. Some avatars can include a kinetic component. It is anticipated that many users 80 will include a still-frame photograph of themselves as their avatar, but the universe of potential avatars is as unlimited as the universe of potential media. Given a sufficiently robust telecommunications network, actual video of the user 80 can constitute their avatar. The use of avatars can take the functionality of sharing a media experience to the next level, sharing not only the scene that would be witnessed by the group of individuals, but facilitating the ability of that group of users to interact with each other.
- Such comradery can be enabled for live events like concerts or football games, as well as for recorded events like watching a video of a wedding.
- Created media experiences 627 such as movies, television programs, video games and other similar experiences can also benefit from the capability of allowing avatars.
- Avatars can themselves be live, recorded, or both. Past commentary to a horror film that was made while the viewer was watching the film can be recorded and shared with future viewers of the same film.
- an avatar 640 is a mechanism for interactions between users 80 within a common group 90.
- Avatars 640 do not interact with avatars 640 from outside groups 90, although an individual user 80 can belong to multiple groups 90.
- the behavior of the avatar 640 is directed by their user 80 through an access device 160.
- the avatar actions of others are distributed and shared with participating users 80 as part of the media experience 600 that is shared with the users 80.
- a media experience 600 need not include any type of avatar 640.
- Media experiences 600 can be a media experience with avatars 632 or a media experience without avatars 630.
- a media experience with avatars 632 can be a media experience with live avatars 634, a media experience with recorded avatars 636, and/or a media experience with both recorded and live avatars 638.
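- One way to picture a media experience with recorded avatars 636 is as a timeline of avatar events stored alongside the media and replayed for later viewers. The sketch below is a minimal illustration under assumed names (AvatarTrack, AvatarEvent); the patent does not prescribe this particular data layout.

```python
# Assumed structure for recording avatar commentary against a media timeline
# and replaying it for a later viewer.
from dataclasses import dataclass, field


@dataclass
class AvatarEvent:
    timestamp_s: float  # offset from the start of the media experience
    user: str
    payload: str        # e.g. a spoken remark or a gesture identifier


@dataclass
class AvatarTrack:
    events: list = field(default_factory=list)

    def record(self, timestamp_s: float, user: str, payload: str) -> None:
        self.events.append(AvatarEvent(timestamp_s, user, payload))

    def replay(self, start_s: float, end_s: float):
        """Yield the events a later viewer should see between two playback times."""
        for event in sorted(self.events, key=lambda e: e.timestamp_s):
            if start_s <= event.timestamp_s < end_s:
                yield event


track = AvatarTrack()
track.record(62.0, "alice", "Here comes the scary part!")
track.record(65.5, "bob", "*covers eyes*")
for event in track.replay(60.0, 70.0):
    print(f"{event.timestamp_s:>6.1f}s {event.user}: {event.payload}")
```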
- the system can be used by users 80 to share media experiences with family, friends, co-workers, teammates, and other individuals.
- shared media experiences 600 can be immersive, allowing users to interact within a navigable space, or non-immersive, where a point-of-view is shared with other users 80.
- the system 100 can utilize the relationship-rich information of social networks to establish distribution lists for shared media content referred to as media experiences.
- Figure 2a is a block diagram illustrating an example of a media experience 600 being captured by a capture device 120 (which can also be referred to as a creation device), distributed through a distribution device 140, and accessed across different networks 540 by a variety of different access devices 160.
- the different types of media experiences 600 are described in detail above.
- a capture/creation device 120 is a device and/or configuration of devices used to create or capture a media experience 600 such that it can then be distributed and accessed by users 80.
- the capture/creation device 120 will likely include one or more sensors 300 for translating an event or activity occurring in the real world into a media experience 600 that is cognizable to the system 100 and is capable of being shared.
- sensors 300 may or may not be needed.
- computer generated music, animation, graphics, etc. can be created without reference to sensors 300 or indicia from the physical world.
- a capture/creation device 120 will often include a computer 510 and many computer components as illustrated in Figure 6a. As illustrated in Figure 6b, the computer 510 can be implemented in a wide variety of different configurations. As illustrated in Figure 6f, such computers 510 can capture, store, process, and update a wide variety of different types of data 560.
- Examples of capture/creation devices 120 include computers 510 of all types and sensors 300.
- the collective production equipment on a movie set can constitute a capture/creation device 120. So can a head-mounted display apparatus 200 with a sensor 300 as illustrated in Figure 3a.
- Virtually any prior art or future media capture/creation technology can serve as a capture/creation device 120 for the system 100.
- a distribution device 140 is a device and/or configuration of devices used to distribute the media experience 600 as captured or generated from the capture/creation device 120.
- the distribution device 140 will include a server 512 as illustrated in Figure 5a that is connected to a network 540 such as the World Wide Web, Internet, intranets, extranets, or other types of networks 540.
- the distribution device 140 will also be the capture/creation device 120, such as head-mounted display apparatus 200 with the capability to communicate over a network 540.
- the distribution device 140 can include a distribution list 570 and access to social media 568 to facilitate the sharing of the applicable media experience 600.
- Virtually any prior art or future media distribution technology can serve as a distribution device 140 for the system 100.
- An access device 160 is any device and/or configuration of devices that provides a user 80 with the ability to access the shared media experience 600.
- Examples of access devices 160 can include conventional headphones, all sorts of computers 510 such as an embedded computer 511, a client device 513, a general purpose computer 514, a mobile computer such as a smart phone or tablet, or a head-mounted display apparatus 200.
- in some embodiments, an access device 160 will also serve as a potential capture/creation device 120 and/or a distribution device 140.
- While most access devices 160 are likely to include a computer 510, some access devices 160, such as a set of conventional headphones, need not include such technology.
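- The capture/creation, distribution, and access roles described above can be pictured as three cooperating interfaces, as in the illustrative sketch below. The class and method names are assumptions made for this example; as noted above, two or all three roles may in practice be combined in a single device such as a head-mounted display apparatus 200.

```python
# Assumed interfaces for the capture -> distribute -> access pipeline.
class CaptureDevice:
    def capture(self) -> bytes:
        # Stand-in for a sensor reading, e.g. a frame from a head-mounted camera.
        return b"frame-0001"


class AccessDevice:
    def __init__(self, user: str):
        self.user = user

    def play(self, media: bytes) -> None:
        print(f"{self.user} receives {len(media)} bytes of shared media")


class DistributionDevice:
    def __init__(self):
        self.subscribers = []  # access devices registered for this experience

    def register(self, access_device: AccessDevice) -> None:
        self.subscribers.append(access_device)

    def distribute(self, media: bytes) -> None:
        for device in self.subscribers:
            device.play(media)


capture = CaptureDevice()
distributor = DistributionDevice()
for name in ("alice", "bob"):
    distributor.register(AccessDevice(name))
distributor.distribute(capture.capture())
```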
- Figure 2b is a block diagram similar to Figure 2a, except that the concept of a group of connected access devices 160 (which are utilized by individual users 80) is displayed along with the concept of interacting avatars 640.
- Some embodiments of the system 100 can include the functional capability of allowing users 80 of a shared media experience 600 to interact with each other through the use of avatars 640. It is anticipated that avatars 640 will most commonly be used with respect to navigable space media experiences 610. However, as interactions between remote users 80 through avatars 640 become more popular, it is anticipated that even conventional point of view media experiences 612 such as movies, television programs, etc. may include a "space capability" for visually based avatars 640 and a
- Avatar 640 functionality can be built into a media experience 600, or into the devices used to capture/create, distribute, and access media experiences 600.
- a group 90 is a collection of one or more users 80. Sharing a media experience 600 inherently involves setting a parameter of who is to receive access to a shared media experience 600. In some instances, a media experience 600 could be shared with the general public, a group 90 comprising potentially billions of users 80. In other circumstances, a group 90 could be limited to as few as one user 80.
- the system 100 can allow a group 90 with no members to exist, but in such an embodiment, there is no individual to receive the shared media experience 600.
- Figure 2c is a block diagram similar to Figure 2b, except that multiple groups 90 are illustrated. Interactions between avatars 640 are limited to within group boundaries. However, it is possible for an individual user 80 to belong to more than one group 90.
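- The group-boundary rule described above (avatars 640 interact only within a shared group 90, while a single user 80 may belong to several groups 90) can be illustrated with the small sketch below; the registry structure and names are assumptions for illustration only.

```python
# Assumed membership registry enforcing that avatar interaction requires a shared group.
from collections import defaultdict


class GroupRegistry:
    def __init__(self):
        self._members = defaultdict(set)  # group name -> set of user names

    def join(self, group: str, user: str) -> None:
        self._members[group].add(user)

    def can_interact(self, user_a: str, user_b: str) -> bool:
        """Avatars may interact only if the two users share at least one group."""
        return any(user_a in members and user_b in members
                   for members in self._members.values())


groups = GroupRegistry()
groups.join("family", "alice")
groups.join("family", "bob")
groups.join("coworkers", "alice")
groups.join("coworkers", "carol")

print(groups.can_interact("alice", "bob"))   # True  (both in "family")
print(groups.can_interact("bob", "carol"))   # False (no shared group)
```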
- Figure 3a is a block diagram illustrating an example of media experiences 600 being shared between users 80 with head-mounted display apparatuses 200 along shared networks 540.
- a user 80 is typically a human being, although it is possible to conceptualize instances where the user 80 could be another species of animal or even a form of robot or artificial intelligence, expert system, or other similar form of automated intelligence.
- Users 80 can often interact with each other within the shared media experience 600 through their avatars 640.
- Users 80 can organize groups 90 to which they belong. Such groups 90 can be used to manage the sharing of media experiences 600.
- the system 100 can be implemented in a wide variety of different configurations using a wide variety of different networks 540.
- Networks 540 provide the infrastructure for sharing data 560 such as media experiences 600, but the system 100 can be implemented in such a fashion as to be agnostic as to which networks 540 are used.
- Networks 540 allow the different components and configurations of components that make up the system 100 to communicate with each other.
- a system 100 for sharing media experiences 600 between users 80 of different head-mounted display apparatuses 200 can be implemented in a wide variety of different ways, using a wide variety of different components, and a wide variety of different component configurations.
- the type of media experience 600 will likely be a point of view media experience 612 if the media experience 600 is captured using the head-mounted display apparatus 200 (with an embedded sensor 300) as the capture/creation device 120.
- the access devices 160 are head-mounted display apparatuses 200 that are either directly or indirectly in communication with the head-mounted display apparatus 200 serving as the capture/create device 120.
- if the media experience 600 illustrated in Figure 3a is embodied on a BLU-RAY® disc being played by a media player 500, with the head-mounted display apparatus 200 accessing the media from the media player 500, then the range of different media experiences 600 could include navigable space media experiences 610 as well as point of view media experiences 612.
- Figure 3b is similar to Figure 3a except that the illustrated embodiment supports the capability of creating a media experience that is not limited to a specific point of view, i.e. the shared media experience is capable of being navigated by the recipient users.
- Use of a 360 degree camera 210 can support the capture of a navigable space media experience 610, such as a media experience of attending a live event on location in a virtualized way from your home or other similar remote location.
- Figure 3c is similar to Figure 3a except that the alternative access devices are disclosed.
- the system 100 can utilize a wide variety of different access devices 160 to deliver shared media experiences 600 to users 80.
- One common category of such access devices 160 are media players 500, such as television sets, headphones, computers 510, and other devices in addition to head-mounted display apparatuses 200.
- Figure 3d is similar to Figure 3c, except that a generic distribution component 140 is indicated. As discussed above and below, the system 100 can be implemented using a wide variety of different access, distribution, and capture devices.
- Figure 4 is a flow chart diagram illustrating an example of how a user 80 with a head-mounted display apparatus 200 can share media experiences 600 with other users 80 of the system 100.
- the apparatus 200 is used to access some type of media experience 600.
- using a distribution list 570, one or more other users 80 are identified for the purposes of sharing a point-of-view media experience 600.
- the system 100 initiates the sharing of the media experience 600 with the users 80 as set forth in the distribution list 570 for the applicable period of time as defined and set forth by the sharing user 80.
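- As a rough sketch of the three-step flow of Figure 4, with function names and the time handling assumed for illustration: access a media experience 600, look up the recipients on a distribution list 570, and share with each recipient for the period defined by the sharing user 80.

```python
# Assumed sharing flow: distribute one media experience to a list of recipients
# for a sharer-defined window of time.
from datetime import datetime, timedelta


def share_media_experience(media_id: str, distribution_list: list,
                           duration: timedelta, send) -> None:
    """Share one media experience with every user on the distribution list."""
    expires_at = datetime.utcnow() + duration  # sharing window set by the sharer
    for recipient in distribution_list:
        send(recipient, media_id, expires_at)


def demo_send(recipient: str, media_id: str, expires_at: datetime) -> None:
    print(f"sharing {media_id} with {recipient} until {expires_at:%H:%M:%S}")


share_media_experience(
    media_id="soccer-game-pov",
    distribution_list=["bob", "carol"],
    duration=timedelta(hours=2),
    send=demo_send,
)
```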
- Figure 5a is a block diagram illustrating an example of how a distribution list 570 and a social media system 568 on a network 540 can be used to share media experiences 600 between different users 80 with head-mounted video display apparatuses 200 as well as other types of access devices 160 across a network 540.
- a distribution list 570 is potentially any technological mechanism for identifying destinations for media experiences 600, communications, and any other form of shared content.
- the apparatus 200 does provide the possibility for authenticating the identity of users 80 utilizing the apparatus 200 because the sensor 300 of the apparatus 200 can capture biometric metrics such as retina scans and other potential identifiers.
- the system 100 may include distribution lists 570 based on such identifiers.
- Distribution lists 570 are a form of data 560.
- the system 100 can utilize data 560 from relevant social media 568 utilized by the users 80.
- Social media 568 includes information on the relationships between users 80, and may make it easier for users 80 to maintain groups 90 as well as to specify recipients for shared media experiences 600.
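- As a hedged illustration of how social media 568 data could seed a distribution list 570, the sketch below intersects a sharer's friend list with the membership of a chosen group 90. The shape of the social-graph data is an assumption made for this example; real social networks expose such relationships through their own APIs.

```python
# Assumed, hard-coded social data used only to illustrate the idea.
social_graph = {
    "alice": {"friends": ["bob", "carol"]},
    "bob":   {"friends": ["alice"]},
    "carol": {"friends": ["alice"]},
}

group_members = {
    "family":    ["alice", "bob"],
    "book club": ["alice", "carol"],
}


def build_distribution_list(sharer: str, group: str) -> list:
    """Recipients are the sharer's friends who also belong to the chosen group."""
    friends = set(social_graph[sharer]["friends"])
    members = set(group_members.get(group, []))
    return sorted((friends & members) - {sharer})


print(build_distribution_list("alice", "family"))     # ['bob']
print(build_distribution_list("alice", "book club"))  # ['carol']
```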
- a server 512 is used to house data 560, such as the distribution lists 570, social media 568, and other forms of data 560 as illustrated in Figure 6f. As illustrated in Figure 6b, a server 512 is a type of computer 510, and as such it will typically include many of the components illustrated in Figure 6a.
- Servers 512 are computers 510 that can enable a wide variety of client devices
- Servers 512 can provide users 80 with access to many other users 80 for the purposes of sharing media content 600.
- the system 100 can utilize a wide variety of different devices, device configurations, and information technology infrastructures. Many embodiments of the system 100 will involve some type of highly individualized access device 160, such as headphones (a head-mounted apparatus without a visual display) or a head-mounted display apparatus 200 (a set of headphones that includes a display for media experiences 600 with a visual component). In many embodiments, a head-mounted apparatus will also include a sensor 300 to facilitate the ability of a user 80 to capture media experiences 600 as well as share them.
- Figure 5b is a block diagram illustrating an example of a head-mounted display apparatus 200 as an interface between a user 80 and a media experience 600. If the apparatus 200 includes a sensor 300, the apparatus 200 can be used to capture media experiences 600 as well as "play” or access them.
- Figure 5c is a block diagram illustrating an example of a sensor 300 as an interface between a user 80 and a sensor feedback 310 or sensor reading 310. Anything captured by a sensor 300 can be referred to as sensor feedback 310 or a sensor reading 310. Such information can be used as a media experience 600 or used to create a media experience 600.
- Figure 5d is a block diagram illustrating an example of a head-mounted display apparatus 200 as an interface between a user 80 and an instruction 566.
- An instruction 566 is a communication from a user 80 to the system 100 for the purposes of invoking action by the system 100, configuring how the system 100 operates, or otherwise influencing the processing performed by the system 100.
- Access devices 160 such as a head-mounted display apparatus 200 can provide for the submitting of instructions 566 by the user 80 to the system 100 or a device/assembly utilized by the system 100.
- Figure 6a is a block diagram illustrating an example of components that can be included in virtually any computer 510 positioned within the apparatus 200 or anywhere else within the system 100.
- a computer 510 is an assembly that includes a processor 521.
- a computer 510 can also include one or more of the following components: (a) a memory component 522; (b) a storage component 523; (c) a computer program/software application 524 that provides for implementing one or more heuristics 525; (d) a database 526; and (e) a network adapter 529.
- a processor 521 is a microprocessor or central processing unit that provides for the running of computer programs/software applications 524.
- the memory component 522 is often referred to as RAM, random access memory.
- a storage component 523 is a component that provides for the permanent storage of data 560.
- an application 524 can also be referred to as a computer program 524 or a software application 524.
- An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521.
- a wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
- Heuristics 525 can include algorithms but are not limited to algorithms.
- databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.
- a GUI (graphical user interface) will typically include virtual menus, buttons, editable fields, drop-down list boxes, etc. that users 80 can use to interact with the computer 510.
- the system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
- Network adapters 529 can operate wirelessly or be hardwired.
- a network 540 is a telecommunications network that allows computers 510 to communicate with each other and exchange data 560.
- a wide variety of networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate in different geographic scales.
- a local network 542 is a network 540 that is limited in scope to a particular geographic location such as a home or office. Common examples of local networks 542 include personal area networks, local area networks, home area networks, and storage area networks.
- an external network 544 is a network 540 that is not a local network 542. Common examples of external networks 544 include the Internet, wide area networks, metropolitan area networks, and enterprise networks.
- computers 510 can possess a wide variety of different components. Computers 510 can also be embodied in a variety of different categories of computers 510.
- Figure 6b is a block diagram illustrating an example of different types of computers 510.
- An embedded computer 511 is not typically a general purpose computer 514.
- the head-mounted display apparatus 200 can be implemented as an embedded computer 511.
- a server 512 is a computer 510 that primarily exists for the purpose of making functionality available to other computers 510, such as client devices 513.
- a client device 513 is a computer 510 that functions at least in part by accessing at least some data 560 or an application 524 on a server 512.
- some client devices 513 are general purpose computers 514.
- some client devices 513 are mobile computers 515.
- a general purpose computer 514 is a computer 510 that provides a user 80 with substantial flexibility in configuring the computer 510.
- for example, one user 80 may use their desktop computer primarily to play games online while another user 80 may use that same model of desktop computer to perform legal research.
- Examples of general purpose computers 514 include desktop computers, laptop computers, smart phones, tablet computers, notepad computers, mainframe computers, minicomputers, work stations, and other devices that allow users 80 to decide what is run on the computer 510.
- Some general purpose computers 514 are mobile computers 515.
- the head- mounted display apparatus 200 can be implemented as a general purpose computer 514.
- a mobile computer 515 is a computer 510 that is easily moved from one location to another, such as a smart phone, a tablet, or even a notepad or laptop computer.
- the head-mounted display apparatus 200 is a mobile computer 515.
- Figure 6c is a block diagram illustrating an example of different components of the system 100 that can include computers 510.
- Virtually any component of the system 100 can include a computer 510.
- the server 512 is a type of computer 510.
- the media player 500 can include one or more computers 510.
- the apparatus 200 can include a computer 510 and the sensor 300 of the apparatus 200 can include its own separate computer 510.
- Alternative configurations of various computerized components of the system 100 are disclosed and discussed above.
- Figure 6f is a block diagram illustrating an example of different types of data 560 that can be stored on a database 526 used by the system 100.
- Data 560 can represent facts, statistics, and other forms of information.
- Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
- a user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 562.
- the profile 562 is associated with one or more users 80.
- a single user 80 can have more than one profile 562.
- Profiles 562 can include virtually any information pertaining to the user 80 including the preferences of the user 80 and the totality of a user's history with the system 100. The more data 560 that exists for a particular user 80 and that can be associated with a particular profile 562, the more it is possible for the heuristics 525 of the system 100 to make assessments on what a user 80 wants.
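- A minimal sketch of that idea appears below: a profile 562 holds stated preferences and a history 564 of past experiences, and a trivial heuristic 525 scores a candidate media experience from both. The field names and the scoring rule are assumptions for illustration, not the system's actual heuristics.

```python
# Assumed profile structure and a toy scoring heuristic built on preferences and history.
from dataclasses import dataclass, field


@dataclass
class Profile:
    user: str
    preferred_genres: set = field(default_factory=set)
    history: list = field(default_factory=list)  # genres of past shared experiences

    def record_experience(self, genre: str) -> None:
        self.history.append(genre)


def interest_score(profile: Profile, genre: str) -> float:
    """More stated preference and more history in a genre yield a higher score."""
    preference = 1.0 if genre in profile.preferred_genres else 0.0
    familiarity = profile.history.count(genre) / max(len(profile.history), 1)
    return preference + familiarity


alice = Profile("alice", preferred_genres={"sports"})
for genre in ("sports", "sports", "concert"):
    alice.record_experience(genre)
print(interest_score(alice, "sports"))   # 1.0 + 2/3 ≈ 1.67
print(interest_score(alice, "concert"))  # 0.0 + 1/3 ≈ 0.33
```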
- the system 100 can also be used to affirmatively trigger instructions 566.
- the system 100 can be used to initiate automated or semi-automated activities.
- Social media 568 is a collection of websites and other online means of communication and interaction. Social media 568 are a substantial avenue for creating and sharing data 560. The system 100 can be integrated to work with social media 568.
- a media experience is any type of media content that can be accessed or captured using the apparatus 200.
- sensor feedback 310 is any type of data 560 captured from a sensor 300.
- the system 100 can allow multiple users 80 to engage in a shared media experience 600 in a simultaneous or at least substantially simultaneous manner. Different users 80 can utilize different types of access devices 160 to engage in the media content 600 of the system 100. Head-mounted display apparatuses 200 can be a particularly desirable category of access device 160, but a wide variety of different computers 510 can function as access devices 160. To facilitate the functionality of the system 100, one or more servers 512 will often be used to deliver media content 600, either from a shared source such as a sensor 300, a media player 500, or a database 526 that is not associated with a particular user 80, or from a source where one of the users 80 is responsible for capturing a media experience 600 through a sensor 300 that is associated with a particular user 80, such as a user 80 wearing a head-mounted display apparatus 200 that includes a camera 310 or a user 80 who has a 360 degree camera 210 in their control or custody.
- the system 100 can build upon relationships and associations from social media, such as affiliations with groups 90 and other users 80.
- the server 512 used to support the system 100 can include a wide variety of different profiles 562 that can be used to facilitate sharing when desired, and to prevent sharing when it is not desired.
- the system 100 can be used to share a wide variety of different media experiences 600, including immersive media experiences 610 as well as non-immersive/point of view media experiences 612.
- Table 1 below provides a chart of element numbers, element names, and element descriptions.
- the system 100 can be implemented in ways that could be useful to other types of mammals and potentially even robots.
- the system 100 can perform the sharing of media experiences 600.
- a sensor 300 is a common example of a capture device 120.
- Some distribution devices 140 are media players 500.
- a device used by a user 80 to access media content 600.
- a head-mounted display apparatus 200, which can be worn on the head of the user 80, is also an example of an access device 160.
- a head-mounted display apparatus capable of playing video and sound.
- a 360 degree video camera can be used.
- sensor feedback 310 can also be referred to as sensor readings 310, and sensor readings 310 can become media experiences 600 or be used to create media experiences 600.
- Examples of potentially important types of sensor feedback 310 include biometric data such as iris scans and retina scans for identification purposes as well as a variety of metrics relating to eye movement, timing, and media content.
- the media player 500 is integrated with the apparatus 200.
- a computer 510 can also include one or more of the following components: (a) a memory component 522; (b) a storage component 523; (c) a computer program/software application 524 that provides for implementing one or more heuristics 525; (d) a database 526; and (e) a network access device 517.
- An embedded computer 511 is not typically a general purpose computer 514.
- Server: A computer 510 that primarily exists for the purpose of making functionality available to other computers 510, such as client devices 513.
- Client Device: A computer 510 that functions at least in part by accessing at least some data 560 or an application 524 on a server 512.
- Some examples of client devices 513 are general purpose computers 514. Some examples of client devices 513 are mobile computers 515.
- General Purpose Computer: A computer 510 that provides a user 80 with substantial flexibility in configuring the computer 510. For example, one user 80 may use their desktop computer primarily to play games online while another user 80 may use that same model of desktop computer to perform legal research.
- Examples of general purpose computers 514 include desktop computers, laptop computers, smart phones, tablet computers, notepad computers, mainframe computers, minicomputers, work stations, and other devices that allow users 80 to decide what is run on the computer 510.
- Some general purpose computers 514 are mobile computers 515.
- Mobile Computer: A computer 510 that is easily moved from one location to another, such as a smart phone, a tablet, or even a notepad or laptop computer.
- Processor: A microprocessor or central processing unit that provides for the running of computer programs/software applications 524.
- the memory component 522 is often referred to as RAM, random access memory.
- Storage Component: A component that provides for the permanent storage of data 560.
- Application: An application 524 can also be referred to as a computer program 524 or a software application 524.
- An application 524 is a unit of computer code or a set of instructions that are capable of being implemented or run by the processor 521.
- a wide variety of different programming languages can be used to create the application(s) 524 used by the system 100. There are a voluminous number of different types of programming languages including but not limited to: object-oriented languages, assembly languages, fourth-generation languages, visual languages, XML-based languages, and hardware description languages.
- Heuristic 525: A problem-solving, analytical process, or similar approach that can be implemented by a processor 521 running an application 524. Heuristics 525 can include algorithms but are not limited to algorithms.
- Database: A comprehensive collection of data 560 that is organized for convenient access.
- the system 100 can incorporate the capabilities of any type of database 526.
- databases 526 include but are not limited to: relational databases, object-oriented databases, navigational databases, and NoSQL databases.
- a GUI (graphical user interface) will typically include virtual menus, buttons, editable fields, drop-down list boxes, etc. that users 80 can use to interact with the computer 510.
- the system 100 is not limited to the use of GUIs or other typical computer-oriented interfaces as user interfaces 528.
- Network adapters 529 can operate wirelessly or be hardwired.
- a wide variety of networks 540 can be incorporated into the system 100. Examples of network configurations include but are not limited to bus networks, star networks, ring networks, mesh networks, and tree networks. Different networks 540 can utilize different communication protocols and operate in different geographic scales.
- Local Network: A network 540 that is limited in scope to a particular geographic location such as a home or office. Common examples of local networks 542 include personal area networks, local area networks, home area networks, and storage area networks.
- Data 560: Any form of information that is capable of being accessed and/or stored by a computer 510.
- Data 560 can represent facts, statistics, and other forms of information.
- Data 560 can include profiles 562, history 564, instructions 566, as well as social media 568.
- Profile: A user 80 can interact with the system 100 and other users 80 of the system 100 through a profile 562.
- the profile 562 is associated with one or more users 80.
- a single user 80 can have more than one profile 562.
- To identify examples of profiles 562 one only has to look at popular websites such as Amazon.com, Facebook.com, eBay.com, Google.com, etc. to understand the wide diversity of profiles 562 that can exist.
- Profiles 562 can include virtually any information pertaining to the user 80 including the preferences of the user 80 and the totality of a user's history with the system 100.
- History 564: A subset of data 560 that relates to previous interactions with the system 100.
- The historical data 564 of a user 80 can be valuable information for the processing of the system 100, as can the historical data 564 for the aggregated community of users 80 (an illustrative sketch follows below).
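As a purely illustrative sketch (not taken from the specification), the following Python function shows one way a heuristic 525 run by an application 524 might combine a profile 562 with history 564 to rank candidate media experiences 600; the function name and every field name here are assumptions.

```python
from collections import Counter

def recommend_experience(profile, history, candidates):
    """Hypothetical heuristic 525: rank candidate media experiences 600 by how
    often their category appears in the user's history 564, breaking ties with
    the preferences stored in the profile 562. All field names are assumptions."""
    seen = Counter(entry["category"] for entry in history)
    preferred = set(profile.get("preferred_categories", []))
    return max(
        candidates,
        key=lambda c: (seen[c["category"]], c["category"] in preferred),
    )

# Example usage with made-up data 560:
profile = {"preferred_categories": ["immersive"]}        # profile 562
history = [{"category": "immersive"},                    # history 564
           {"category": "immersive"},
           {"category": "point_of_view"}]
candidates = [{"title": "Concert stream", "category": "point_of_view"},
              {"title": "Museum walk", "category": "immersive"}]
print(recommend_experience(profile, history, candidates))  # picks "Museum walk"
```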
- The system 100 can also be used to affirmatively trigger instructions 566.
- The system 100 can be used to initiate automated or semi-automated activities.
- Social media 568 are a substantial avenue for creating and sharing data 560.
- The system 100 can be integrated to work with social media 568.
- Media Experience 600: A unit of storable and transmittable audio, visual, and/or kinetic content. A media experience 600 can include a navigable space media experience (i.e. an immersive media experience) 610 and/or a point of view media experience (i.e. a non-immersive media experience) 612. It can be live, recorded, or created. It can be with or without avatars 640 (see the illustrative sketch following these definitions).
- Avatar 640: A manifestation of a user 80 within a media experience 600.
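To make the media experience 600 terminology above concrete, here is a hypothetical data-model sketch; the class names, field names, and example values are assumptions for illustration and do not come from the specification.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class ExperienceType(Enum):
    NAVIGABLE_SPACE = "navigable space (immersive) 610"
    POINT_OF_VIEW = "point of view (non-immersive) 612"

class Origin(Enum):
    LIVE = "live"
    RECORDED = "recorded"
    CREATED = "created"

@dataclass
class MediaExperience:
    """Hypothetical representation of a media experience 600; field names are
    assumptions for illustration only."""
    title: str
    experience_type: ExperienceType
    origin: Origin
    avatars: List[str] = field(default_factory=list)  # avatar 640 identifiers, empty if none

# Example: a recorded, immersive experience shared by two users 80 via avatars 640.
concert = MediaExperience(
    title="Shared concert",
    experience_type=ExperienceType.NAVIGABLE_SPACE,
    origin=Origin.RECORDED,
    avatars=["user_80_a", "user_80_b"],
)
print(concert.experience_type.value, "with", len(concert.avatars), "avatars")
```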
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Resources & Organizations (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Computer Hardware Design (AREA)
- Tourism & Hospitality (AREA)
- Human Computer Interaction (AREA)
- Marketing (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Data Mining & Analysis (AREA)
- Operations Research (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Quality & Reliability (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Information Transfer Between Computers (AREA)
Abstract
The invention relates to an apparatus (200), a system (100), and a method (700) that allow two or more users (80) to share a media experience (600).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462041013P | 2014-08-22 | 2014-08-22 | |
| US62/041,013 | 2014-08-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016029224A1 (fr) | 2016-02-25 |
Family
ID=55351337
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2015/046600 Ceased WO2016029224A1 (fr) | 2014-08-22 | 2015-08-24 | Appareil, système et procédé pour fournir aux utilisateurs une expérience multimédia partagée |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016029224A1 (fr) |
2015
- 2015-08-24 WO PCT/US2015/046600 patent/WO2016029224A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120274750A1 (en) * | 2011-04-26 | 2012-11-01 | Echostar Technologies L.L.C. | Apparatus, systems and methods for shared viewing experience using head mounted displays |
| US20130103814A1 (en) * | 2011-10-25 | 2013-04-25 | Cbs Interactive Inc. | System and Method for a Shared Media Experience |
| US20130222369A1 (en) * | 2012-02-23 | 2013-08-29 | Charles D. Huston | System and Method for Creating an Environment and for Sharing a Location Based Experience in an Environment |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10762722B2 (en) | 2016-06-28 | 2020-09-01 | Nokia Technologies Oy | Apparatus for sharing objects of interest and associated methods |
| US10701433B2 (en) | 2016-06-29 | 2020-06-30 | Nokia Technologies Oy | Rendering of user-defined message having 3D motion information |
| KR101806477B1 (ko) | 2016-07-06 | 2017-12-07 | 주식회사 케이티 | 하드웨어 장치의 데이터를 복수의 어플리케이션으로 중계하는 방법, 단말 및 컴퓨터 프로그램 |
| US11593872B2 (en) | 2017-06-21 | 2023-02-28 | At&T Intellectual Property I, L.P. | Immersive virtual entertainment system |
| US11094001B2 (en) | 2017-06-21 | 2021-08-17 | At&T Intellectual Property I, L.P. | Immersive virtual entertainment system |
| US11012675B2 (en) | 2019-04-16 | 2021-05-18 | At&T Intellectual Property I, L.P. | Automatic selection of viewpoint characteristics and trajectories in volumetric video presentations |
| US11074697B2 (en) | 2019-04-16 | 2021-07-27 | At&T Intellectual Property I, L.P. | Selecting viewpoints for rendering in volumetric video presentations |
| US11153492B2 (en) | 2019-04-16 | 2021-10-19 | At&T Intellectual Property I, L.P. | Selecting spectator viewpoints in volumetric video presentations of live events |
| US11470297B2 (en) | 2019-04-16 | 2022-10-11 | At&T Intellectual Property I, L.P. | Automatic selection of viewpoint characteristics and trajectories in volumetric video presentations |
| US10970519B2 (en) | 2019-04-16 | 2021-04-06 | At&T Intellectual Property I, L.P. | Validating objects in volumetric video presentations |
| US11663725B2 (en) | 2019-04-16 | 2023-05-30 | At&T Intellectual Property I, L.P. | Selecting viewpoints for rendering in volumetric video presentations |
| US11670099B2 (en) | 2019-04-16 | 2023-06-06 | At&T Intellectual Property I, L.P. | Validating objects in volumetric video presentations |
| US11956546B2 (en) | 2019-04-16 | 2024-04-09 | At&T Intellectual Property I, L.P. | Selecting spectator viewpoints in volumetric video presentations of live events |
| EP4460056A1 (fr) * | 2023-05-03 | 2024-11-06 | Panasonic Avionics Corporation | Environnement virtuel orienté sur la sécurité pour interactions de passagers en vol |
| US12428154B2 (en) | 2023-05-03 | 2025-09-30 | Panasonic Avionics Corporation | In-flight virtual experiences corresponding to real-life experiences encountered along passenger journeys |
Similar Documents
| Publication | Title |
|---|---|
| WO2016029224A1 (fr) | Apparatus, system and method for providing users with a shared media experience |
| US11593872B2 (en) | Immersive virtual entertainment system |
| RU2527199C2 (ru) | Shared media selection with integrated video images |
| US10644894B2 (en) | Systems and methods for virtual interactions |
| US9826277B2 (en) | Method and system for collaborative and scalable information presentation |
| US9292163B2 (en) | Personalized 3D avatars in a virtual social venue |
| US9189818B2 (en) | Association of comments with screen locations during media content playback |
| US9384587B2 (en) | Virtual event viewing |
| US9111285B2 (en) | System and method for representing content, user presence and interaction within virtual world advertising environments |
| JP2023551476A (ja) | Graphic interchange format file identification for inclusion in video game content |
| Lelyveld | Virtual reality primer with an emphasis on camera-captured VR |
| US8667402B2 (en) | Visualizing communications within a social setting |
| US20110239136A1 (en) | Instantiating widgets into a virtual social venue |
| US20130144727A1 (en) | Comprehensive method and apparatus to enable viewers to immediately purchase or reserve for future purchase goods and services which appear on a public broadcast |
| KR20120096065A (ko) | System and method for determining the proximity of media objects in a three-dimensional media environment |
| US20130242064A1 (en) | Apparatus, system, and method for providing social content |
| US20130326352A1 (en) | System For Creating And Viewing Augmented Video Experiences |
| KR20210049197A (ко) | System and method for interactive remote movie viewing, scheduling, and social connection |
| WO2013163083A2 (fr) | Systems and methods for automatically addressing a message to a contact in a social network |
| US20110225516A1 (en) | Instantiating browser media into a virtual social venue |
| US20150294633A1 (en) | Life Experience Enhancement Illuminated by Interlinked Communal Connections |
| US20110225518A1 (en) | Friends toolbar for a virtual social venue |
| Philpot et al. | User experience of panoramic video in CAVE-like and head mounted display viewing conditions |
| US11845011B2 (en) | Individualized stream customizations with social networking and interactions |
| JP2010219849A (ja) | Image generation system, program, information storage medium, server system, and content distribution system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15833313 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.06.2017) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15833313 Country of ref document: EP Kind code of ref document: A1 |