
EP2896210A2 - Media content distribution - Google Patents

Media content distribution

Info

Publication number
EP2896210A2
Authority
EP
European Patent Office
Prior art keywords
media
user
data
event
media data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13771566.0A
Other languages
German (de)
English (en)
Inventor
Tupac Martir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of EP2896210A2
Current legal status: Withdrawn

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
          • H04L 12/00 Data switching networks
            • H04L 12/02 Details
              • H04L 12/16 Arrangements for providing special services to substations
                • H04L 12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
                  • H04L 12/1845 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast broadcast or multicast in a specific location, e.g. geocast
          • H04L 51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
            • H04L 51/21 Monitoring or handling of messages
              • H04L 51/222 Monitoring or handling of messages using geographical location information, e.g. messages transmitted or received in proximity of a certain spot or area
            • H04L 51/52 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 7/00 Television systems
            • H04N 7/14 Systems for two-way working
              • H04N 7/15 Conference systems
            • H04N 7/16 Analogue secrecy systems; Analogue subscription systems
              • H04N 7/173 Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
          • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N 21/21 Server components or server architectures
                • H04N 21/218 Source of audio or video content, e.g. local disk arrays
                  • H04N 21/21805 Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
                  • H04N 21/2187 Live feed
            • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N 21/41 Structure of client; Structure of client peripherals
                • H04N 21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
                  • H04N 21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
                  • H04N 21/41415 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk

Definitions

  • the present invention relates to media content distribution.
  • Embodiments of the present invention relate to a system and method for improving a supporter's experience of an event at an event venue, and in particular to a system and method that links a supporter's device to the devices of other users, enabling the supporter to receive audio/video data captured by the other users.
  • supporters have a limited viewpoint for viewing the event, usually being their own direct view of the event supplemented with such additional views on large screens at the event venue as are selected by organisers of the event. After an event has finished, a supporter's experience of the event is often even more limited - to their own captured video footage (if they were present themselves) or commercial footage if available. It would be desirable to provide supporters with access to additional viewpoints, and to provide supporters with facilities to increase their involvement in the event, before, during and after the event actually takes place.
  • a media content distribution system comprising:
  • a media handling device for receiving and distributing media data; and a plurality of user devices, each user device having a camera function and being operable in a media transmit mode to transmit media data of an event occurring at an event venue to the media handling device, each user device being operable in a media receive mode to receive media data from the media handling device;
  • a first user device from amongst the plurality of user devices is operable to select a media source from a second user device among the user devices which are currently operating in the media transmit mode and the media handling device is operable to stream the media data of the event being received from the second user device to the first user device.
  • the media content is preferably a video stream, optionally with audio, but may instead be still images. It will be appreciated that media content transmitted from one device may at any given time be streamed to one device, no devices or many devices.
  • the media handling device may only receive media content from devices actually present at (for example within or in some cases in the immediate vicinity outside) the event venue. This could be achieved using location information indicating the location of the image capture devices, or by limiting to devices registered to the event as part of the ticket buying process for example.
  • the selection of the media source by the user may be achieved by selecting a specific device which is currently transmitting, while in other cases the selection of the media source by the user may be achieved by selecting a geographical area or zone of the event venue, with the specific source then being allocated by the media handling device - for example by allocating the highest quality device in that area or zone.
  • a user may be able to access footage from a media source in a number of ways, for example by selecting from thumbnail versions of each media source from a drop down list, for example ordered by media source location.
  • the first user device is operable to select the media source using an event venue representation indicating the location within the event venue of the user devices operating in the media transmit mode.
  • the event venue representation may for example be a plan or isometric view of the event venue. This enables the user to visually identify media sources in an area from which the user would like to view the event, and to select appropriate media sources for view based on location.
  • the user may select either a specific transmitting device, or an area or zone from which a specific transmitting device is to be allocated by the media handling device.
  • the location within the event venue of each user device operating in the media transmit mode may be determined from one of a GPS position or other positioning facility determined by the user device and a seat or area identifier entered by the user or otherwise allocated to the user.
  • the event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue (top level representation, or overview). Selection of an area within the event venue representation may cause the display of a second representation (close up of selected area) indicating the location within the selected area of user devices operating in the media transmit mode.
  • The top level representation permits the user to navigate to a location of interest within the event venue, and the second representation permits the user to actually select a media source from which to receive footage. This is useful because it may be difficult to distinguish between and select individual media content sources on a top level representation.
  • the event venue representation may have a deeper hierarchy than this and comprise a multi-tiered hierarchy of representations.
  • This may include a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area of user devices operating in the media transmit mode or a further intermediate level representation.
  • This arrangement is particularly suitable for very large event venues, such as a racing circuit, where two levels of resolution may be insufficient to properly navigate the geography of the event venue and select individual media sources.
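  • As a rough illustration of how such a multi-tiered representation might be held in software, the sketch below models the venue as a tree of named areas; the class name VenueArea and its fields are assumptions made here for illustration, not taken from the patent.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VenueArea:
        # One tier of the event venue representation: a named area that either
        # contains further sub-areas or directly lists transmitting devices.
        name: str
        sub_areas: List["VenueArea"] = field(default_factory=list)
        live_sources: List[str] = field(default_factory=list)  # device IDs

        def has_sources(self) -> bool:
            # An area is highlighted/selectable if it, or any sub-area,
            # currently contains a transmitting media source.
            return bool(self.live_sources) or any(
                a.has_sources() for a in self.sub_areas)

    # Example: a stadium whose seating area D1 holds one live source.
    stadium = VenueArea("Stadium", sub_areas=[
        VenueArea("D1", live_sources=["device-17"]),
        VenueArea("D2"),
    ])
    assert stadium.has_sources()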
  • a user database of device users may be provided, each user having a user profile associated with one or more user devices, the user profile comprising one or more of a current location of a user device associated with the user, a specification for the one or more user devices and quality information indicating the image capture capability of the one or more user devices.
  • the user database may comprise device information regarding at least some of the user devices, the device information indicating the media capture capabilities of the user device, wherein an indicator of media quality is displayed on the event venue representation in relation to at least some of the media sources, the indicator of media quality being based on the device information stored in the user database.
  • the event venue representation may be automatically presented on the first (viewing) user device to enable the selection of an alternative media source.
  • an alternative media source is automatically selected by the media handling device (or the viewing user's device) based on its position relative to that of the originally selected media source.
  • a media content distribution method comprising the steps of:
  • a media handling device for receiving and distributing media data, the media handling device being arranged
  • a user device having a camera function, the user device being operable
  • the media receive mode to receive the media data of the event being received from the selected media source and streamed via the media handling device to the user device.
  • a media content distribution system comprising:
  • a media handler for receiving, storing and distributing media data; and a plurality of camera devices, each camera device having a camera function and being operable to upload media data of an event occurring at an event venue to the media handler, the media handler being operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device within the event venue at the time of capture of the uploaded media data; and
  • When the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
  • This arrangement permits footage capturing users to upload footage they captured at an event, and users other than the uploading user to add supplemental content, for example text describing their own memories of the event or the scene captured by the footage, to be associated with the uploaded content, and to be viewable along with the uploaded content by subsequent viewers.
  • the playback device may be operable to display a user interface comprising an event venue representation indicating the locations within the event venue at which uploaded media data was captured;
  • the playback device may be operable to receive a user selection of uploaded media data via the user interface, and to display the selected media data along with any additional data stored in association with the selected media data.
  • the playback device may be operable to select stored media data to which description data is to be associated using the event venue representation. In this way, a user may be able to find a media source close to the area from which they watched the event, which should provide the closest resemblance to the experience they themselves had at the event.
  • the playback device may be a camera device which itself captured footage at the event. In other words, users present at the event can access footage of other users present at the event and swap memories and supplemental content. However, in some embodiments supplemental content may be added by users who were not at the event, but nonetheless have an interest in the event and the footage.
  • the camera devices may be operable to store media data locally during the event at the time of image capture, and to upload the media data to the media handler after the event. This prevents the media handler and related storage device from being overburdened with media content during the match and encourages users to be more selective about the material they upload. However, in the alternative users can selectively stream with or without storing.
  • the event venue representation may comprise a first representation indicating the location of plural selectable areas within the event venue, selection of an area within the event venue representation on the playback device causing the display of a second representation indicating the location within the selected area at which uploaded media data was captured.
  • the event venue representation may comprise a hierarchy of representations, a top level representation in the hierarchy representing the event venue as a whole and comprising a plurality of user selectable areas within the event venue, one or more intermediate level representations in the hierarchy each representing one of said user selectable areas and comprising either an indication of the location within said user selectable area at which uploaded media data was captured or a further intermediate level representation.
  • the location within the event venue of each media source may be determined from one of a GPS position or other positioning service determined by the user device which generated the media source, and a seat or area identifier entered by the user of the device which generated the media source, or otherwise allocated to the user of the device.
  • An indicator of the media quality of at least some of the uploaded media data may be displayed on the event venue representation.
  • a user database is provided which stores device information regarding at least some of the camera devices used to capture the media data stored at the media handler, the device information indicating the media capture capabilities of the user device, wherein the indicator of media quality displayed on the event venue representation is based on the device information stored in the user database.
  • quality related information may be stored at the image capture device and transmitted to the media handling device along with the media content.
  • the event venue representation may indicate the availability of different uploaded media data for different time periods or occurrences during the event, the event venue representation being navigable at the playback device with respect to time or occurrences to present the user with access to the different uploaded media data.
  • the playback device may be operable to select a particular time period or key occurrence during the event, and the indication of media data shown on the event venue representation is updated to reflect uploaded media data corresponding to the selected time or key occurrence.
  • an event play mode in which the indications of media data are continuously updated with respect to a progression in time through the event.
  • a media content distribution method comprising the steps of:
  • the playback device submitting, from the playback device to the media handler, additional data for association with an item of the stored media data, the playback device being a different device from the device used to capture and/or upload the media data; and storing the additional data in association with the media data;
  • a media handler for receiving, storing and distributing media data, the media handler being operable to receive, from a plurality of camera devices each having a camera function, uploaded media data of an event occurring at an event venue;
  • a playback device for accessing media data stored by a media handler; wherein the playback device is operable
  • the media handler being operable to store the additional data in association with the media data
  • When the media data to which the additional data has been added is accessed subsequently by a playback device, it is provided to the playback device along with the associated additional data.
  • a media content distribution system comprising:
  • a media handler for receiving and distributing media data
  • each camera device having a camera function and being operable to capture and transmit media data of an event occurring at an event venue to the media handler, the media handler being operable to stream the captured media data to device users or store the captured media data for later access; and a playback device operable to access streamed or stored media data of an event via the media handler and display the accessed media data to a user;
  • the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • the technique of providing restricted access to content may be applied equally to real-time streamed content, or the subsequent access to stored content.
  • the media handler may be operable to store uploaded media data in association with date and time data indicating the time of capture of the uploaded media data and location data indicating the location of the camera device at the time of capture of the uploaded media data.
  • the number of media data sources made available to the playback device may be dependent on how close the playback device is to the event venue.
  • the media handler may be operable to provide access to more media data sources for playback devices relatively closer to the event venue than for playback devices relatively further from the event venue.
  • the subset of media data made available to the playback device may be dependent on a current orientation of the playback device with respect to the event venue.
  • the playback device may be operable: to display a user interface comprising an event venue representation indicating the locations within the event venue at which media data was captured;
  • a playback device further than a predetermined distance from the event venue may be provided with access only to one or more exterior views of the event venue.
  • an exterior view of a particular event venue may be displayed on the display device when the display device is aimed towards the geographical location of that event venue.
  • the exterior views presented may correspond to the side of the event venue closest to the playback device, resulting in a telescope effect.
  • a playback device within or relatively nearby the event venue may be provided with access to media data captured at locations within the event venue closest to the current location of the playback device.
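  • A minimal sketch of such a location-dependent restriction policy follows; the distance thresholds and source limits are invented for illustration (the patent does not specify them), and positions are assumed to be in a local planar coordinate frame.

    import math

    def accessible_sources(sources, player_pos, venue_pos):
        # Each source is assumed to be a dict with "pos" (x, y) and a boolean
        # "exterior" flag; all thresholds below are illustrative only.
        d = math.hypot(player_pos[0] - venue_pos[0],
                       player_pos[1] - venue_pos[1])
        if d > 2000.0:
            # Far from the venue: exterior views only.
            return [s for s in sources if s["exterior"]]
        # Nearer playback devices are granted more sources; inside the
        # venue, all of them.
        limit = 5 if d > 500.0 else len(sources)
        # Prefer sources captured closest to the playback device itself.
        ranked = sorted(sources, key=lambda s: math.hypot(
            s["pos"][0] - player_pos[0], s["pos"][1] - player_pos[1]))
        return ranked[:limit]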
  • a media content distribution method comprising:
  • a playback device accessing streamed or stored media data of an event via the media handler and displaying the accessed media data to a user; wherein the playback device is restricted to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • a media handler for receiving and distributing media data, operable
  • the media handler is operable to restrict the playback device to accessing only a subset of the media data being generated and/or stored in relation to the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • a playback device for accessing media data from a plurality of camera devices via a media handler, the camera device capturing media data of an event occurring at an event venue, the playback device being operable to restrict access to only a subset of the media data being generated of the event, the subset being selected in dependence on the location of the playback device with respect to the event venue.
  • Other aspects of the present invention are also envisaged, and include a computer program, a media handler, a content distribution device and a user device.
  • Figure 1 schematically illustrates a media content distribution system
  • Figure 2 schematically illustrates an event venue representation of a sports stadium
  • Figure 3 schematically illustrates a seating area display within the event venue representation of Figure 2;
  • Figure 4 schematically illustrates a selectable timeline for navigating footage available at different times within the event venue representation
  • Figure 5 schematically illustrates a set of selectable buttons for accessing footage at different times or in relation to different occurrences at the event
  • Figure 6 schematically illustrates an event venue representation of a theatre hall
  • Figure 7 schematically illustrates a simplified event venue representation of the theatre hall
  • Figure 8 schematically illustrates a seating area within the simplified event venue representation of Figure 7;
  • Figure 9 schematically illustrates an event venue representation of a racing circuit
  • Figure 10 schematically illustrates a trackside area within the event venue representation of Figure 9;
  • Figure 11 schematically illustrates a process by which viewers at an event are able to access live footage of the event captured by other viewers
  • Figure 12 schematically illustrates a process by which viewers at an event are able to upload footage of the event along with related information, and other users are able to access the uploaded footage and add their own related information;
  • Figure 13 schematically illustrates an augmented reality system in which a user is able to see a representation of an event venue by directing a portable device in the direction of the geographical location of the event venue;
  • Figure 14 schematically illustrates a view of the event venue as provided for by the system of Figure 13;
  • Figure 15 schematically illustrates a process by which a user of the augmented reality system of Figure 13 is able to access live or stored footage of an event taking place at the event venue;
  • Figures 16A and 16B schematically illustrate how a different level of access to footage is available when the user is at different distances from the event venue.
  • Figures 17A and 17B schematically illustrate how a different subset of cameras/views are available when the user is at different positions around the event venue.
  • Connect-a-fan is, at its core, about connecting supporters/fans in the entertainment industry by sharing the view of the show, match, concert, etc. as they experience it from their own point of view, and sharing it with supporters/fans who are not at the location - the fan-zone spectators.
  • the app will be compatible with a number of electronic devices and platforms for example an iPad, an iPod, an iPhone, an Android phone, an Android Tablet, a Windows phone, a Blackberry device and computers.
  • the app will run in real time, so that someone in the venue, using the camera in their device, becomes a window through which other fan-zone spectators around the world can connect and see, listen to and experience the event on their own device.
  • the fan that is not in the venue gets to experience from a specific seat or area the way that the crowd experiences the show.
  • the fan-zone spectator can choose from any of the cameras available in the venue; each fan inside the venue will agree to the terms and conditions to become a relay of this image.
  • Because of streaming speed requirements, the venue/client will have to make fast broadband available, so that the fans inside the venue are able to stream the images and audio to the fan-zone spectators. It will be the fan-zone spectators' responsibility to have fast broadband in order to receive the feed. This will be part of the terms and conditions.
  • a "live switching" module may be available, in which the fan-zone spectator can decide on more than one camera at the time. There will be a protection, so that this is not able to record, but only switch in between cameras.
  • screens may be installed at the venue so that the faces of the fans who are watching through the fans in the venue may be seen, making them part of the global audience; organisers can then see how fans are reacting, adding to the experience of the show.
  • the first part of it means that, by understanding the GPS location of the fan in the world, it is possible to establish where the fan is relative to the stadium or venue.
  • the GPS will tell the fan where the stadium/venue is in relation to him/her, and instead of seeing the environment around him/her, the stadium/venue will appear on the screen, as if the fan were transported to the outside of the stadium/venue.
  • the fan will see the facade of the stadium/venue and some historic moments of the team/artist.
  • By clicking on the stadium/venue, the fan will have a clean view from that side, as if it were a window into the stadium/venue.
  • the fan will have the opportunity to select the camera and see what is currently happening inside the stadium/venue. This can connect to any of the cameras that have been placed inside the stadium, as long as they have the same angle of view. For instance, if approaching from the southwest (SW), only the SW cameras will show.
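  • One way to realise this "same angle of view" rule is to map the bearing from the venue to the fan onto a compass sector and keep only cameras tagged with that facade side. The sketch below is a hypothetical implementation on a local flat map (x east, y north), not the patent's own method; the "side" tag on each camera is likewise an assumption.

    import math

    SECTORS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

    def bearing_deg(frm, to):
        # Compass bearing from `frm` to `to` (0 = north, 90 = east).
        return math.degrees(math.atan2(to[0] - frm[0], to[1] - frm[1])) % 360.0

    def visible_cameras(cameras, venue_pos, fan_pos):
        # The facade side facing the fan lies along the venue -> fan bearing;
        # a fan approaching from the SW therefore sees only "SW" cameras.
        side = SECTORS[int(((bearing_deg(venue_pos, fan_pos) + 22.5) % 360.0) // 45)]
        return [c for c in cameras if c["side"] == side]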
  • As an example, fans using the system are recording a football match.
  • the goals can then be seen from all the different angles of the cameras, so a fan-zone spectator can see the goal from the SW or east (E) and see and hear how the fans experienced the moment.
  • the system may have its own social media account, which will allow fans around the world to be closer to other fans and their clubs, bands, etc. It will be compatible with already existing social media platforms (Twitter (RTM), Facebook (RTM), tumblr (RTM), etc.), with the difference that only members of the website will be allowed to trade points, images, videos and related uploads.
  • the facade is designed to be a reflection of the passion and history of the club and its fans. The way it communicates and evolves enables the stadium, the heart of Real Madrid (RM), to become a living fan that transcends time and distance.
  • the facade is 43 million pixels, making it by far the largest screen in the world. The idea of the lighting is to produce an ever-evolving sculpture, where the east and west facades run content that shows the history and passion of the club, while the north and south work more as atmospheric pieces.
  • Merging all of the screens brings together all of the elements of Real Madrid, its fans, history and future; creating one screen, one community and one Real Madrid, of which the stadium is the heart.
  • the east and west screens will show content of former players, goals and historic moments. They will also be able to show sponsorships and brands that want to be associated with RM. All the content will be displayed in white so as to create a cohesive enthusiasm for the team colours (Los Blancos).
  • the facade will be interactive with fans both close to and far away from the stadium, giving people the feeling that they are close to what's happening in the stadium. This allows the fans and the stadium to have a mutually influential bond: regardless of how close they are to one another, fans can still connect and send messages, and the stadium can send exclusive content and generate excitement and atmosphere.
  • the opacity of the facade will vary depending on whether it's a day after a match or 3 days away, etc.; this will be part of the impact the stadium will have.
  • louvers containing 24mm LED screens will have three positions: closed, mid and open. This will provide different angles to the light. In the open position the entire screen becomes available; in the closed position only the middle two sections will be available; in the mid position they will act as a wash. Behind each of these louvers there will be two lamps that will help to light the internal space. These louvers will have independent motion/movement, affected by the sun, temperature and various other sensors or factors. The old stadium will be lit with LED fixtures that will help accentuate the space while adding depth to the building.
  • the facade could potentially be utilised to run advertisements, games and movies, making it the building that people go to visit when a special moment happens in the country, city or club.
  • App: The facade in combination with the app will allow for an ever-evolving building that connects to the fans and becomes the heart of the team and the city, making it a living being.
  • the app is the tool that allows fans in all different locations to become part of the action. Instead of viewing distance as potentially negative, this app embraces the distance, interacting with RM fans all over the world in a way that evolves to show different parts of the history and passion that the fans have for the club, depending on location and proximity to the stadium.
  • the app gives fans from all over the world the opportunity to interact with the stadium and the content of the facade, other fans and connect to the match in an inspiring new way; through the eyes of the fans at the stadium.
  • the app delivers an augmented reality (AR) 'fan finder' that allows fans to see the global audience simply by holding up their device. They can also hold up their device to view an exclusive AR player intro with videos, statistics, goals scored, etc. The app can also be used as a guide to find their seat, the bar, a restaurant or a friend.
  • fans can create a fan pixel, much like the devices in the London 2012 Olympics. This is where fan devices are all held in the air to create a giant screen where messages and effects can be broadcast.
  • a media content distribution system 1 is schematically illustrated.
  • the top portion of Figure 1 shows elements of the system 1 which are disposed inside an event venue 5, while the bottom portion of Figure 1 shows elements of the system 1 which are disposed outside the event venue 5.
  • the event venue in the present case is a football stadium, but it will be appreciated that it could instead be a cricket ground, a race track, a theatre or any other event venue, sporting or otherwise.
  • visitors are using personal electronic devices 62a, 62b, which may be mobile telephones, iPods, iPads or camera devices, to capture footage of the event, in this case the football match.
  • the footage captured by the devices 62a, 62b is transmitted via a wireless hub 20 to a media content distribution device 10.
  • the footage is provided by the transmitting personal electronic device in association with the user ID of the user, a device ID indicating the type or identity of the transmitting device, and in some embodiments the location of the user (e.g. seat identifier or GPS location). Some or all of the footage may optionally be provided to an external content store 70, for access by users at a later time (for example subsequent to the event).
  • the media content distribution device 10 is operable to stream the received footage to another device on request.
  • the requesting device may be a device 62c, also present within the event venue, one of devices 64a, 64b immediately outside the event venue, or one of devices 66a, 66b some distance away from the event venue.
  • The devices 62c, 64a and 64b are all within range of the wireless hub 20, and therefore the footage may be streamed from the media content distribution device 10 to these devices via the wireless hub 20.
  • the devices 66a and 66b are outside of the range of the wireless hub, and are therefore required to access the footage via a telecommunications network 50 and the Internet 30.
  • Each of the devices 62a, 62b, 62c, 64a, 64b, 66a, 66b has installed thereon the app described above, and is subscribed to the service.
  • the app may cause the personal electronic device to register itself with the media handling device when the personal electronic device enters the event venue, or the vicinity thereof. This registration may take place automatically (based for example on GPS location, or by detection of the presence of the wireless hub 20), or when the app is launched. Alternatively, there may be a registration function selectable by the user, which causes the device to register itself and the user as present at the event venue.
  • the requesting device may actively select a particular media source to watch. While this selection could be made from a list for example, preferably the selection is made using an event venue representation which permits the user of the requesting device to see where within the event venue currently transmitting media sources are present. This may be in the form of a plan or isometric view of the event venue, in which selectable media sources are visibly presented (for example by flashing icons or highlighted portions). The user of the requesting device is able to select one of these media sources using the event venue representation, for example by clicking or tapping on an icon/highlighted portion. As a result, if the user of the requesting device would like to watch a football game from a location near the goal for example, a media source from this area can be selected.
  • a media source may be chosen either by the user specifically selecting a particular transmitting media source, or by the user selecting an area or zone within the event venue, from which a specific media source is allocated by the media handling device. In this way, the user is able to choose from where they would like to be able to watch the event, while the media handling device is able to take care of choosing a specific device - for example taking into account image capture quality, or favoured image capturing users.
  • the event venue representation is populated with media sources based on the currently received footage from transmitting devices, the location (seat identifier or GPS location) of the users/transmitting devices, and optionally a quality indicator providing some indication of the quality of the footage in terms of resolution/frame rate etc.
  • the quality of the footage may be determined from a subscriber database, as will be explained below.
  • a subscriber database 40 which stores details of users of the service, as well as details of any devices which those users have registered to the service.
  • the database may use the following fields:
  • the User ID is a unique identifier for the user.
  • the User name is also stored.
  • Access to the service may be password protected, and accordingly the database stores a password to be checked against a password entered by the user (or automatically entered by an app on the user's device) when the user accesses the service using their device.
  • the current GPS location may optionally be stored in the database, but might instead be provided directly to the media content distribution device 10 along with the footage.
  • the current seat location may optionally be stored in the database, but again might instead be stored on the user's device and provided directly to the media content distribution device along with the footage.
  • Each device which a user registers to the service is also identified (by a Device ID) in the database in association with the user.
  • Each device may have an associated device specification entry indicating relevant video image capture characteristics of the device. For example, the frame rate, resolution or optical characteristics of the device could be stored, or alternatively a "quality rating" could be tagged against the device specification entry. Finally, the user's accrued credit, for example a star rating, is stored. This credit may be spent or used to access certain features of the service.
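  • The fields listed above might be held in records along the following lines; this is a hypothetical sketch of a subscriber database entry, with type and field names chosen here rather than taken from the patent.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class Device:
        device_id: str
        specification: str = ""      # e.g. frame rate, resolution, optics
        quality_rating: int = 0      # coarse alternative to a full spec

    @dataclass
    class Subscriber:
        user_id: str                 # unique identifier for the user
        user_name: str
        password_hash: str           # for password-protected access
        gps_location: Optional[Tuple[float, float]] = None  # optional
        seat_location: Optional[str] = None                 # optional
        devices: List[Device] = field(default_factory=list)
        credit: int = 0              # accrued credit, e.g. star rating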
  • a user ID and device ID may also be provided, permitting the media content distribution device 10 to obtain the quality rating for the device from the subscriber database, or to determine the probable quality of the footage by referring to the subscriber database using the user ID and device ID and accessing the associated device specification.
  • the location of the user providing the footage may be based on a GPS position - either provided directly to the media content distribution device 10, or provided first to the database 40 and then accessed from there. Alternatively, the location of the user providing the footage may be based on a seat position associated with the user.
  • the seat position may be determined by pre-allocating the seat to the user at the time of purchase, or the time of entry into the event venue, and recording the seat allocation in association with the user on the database 40.
  • the user may enter his seat number into the app on his portable electronic device (on purchase, or while at the event venue for example), which will in turn provide this information to the database via the Internet.
  • the seat position may instead be stored on the user's device and provided directly to the media content distribution device along with the footage. In any of these cases, the position information is used to set the position of the media source on the event venue representation.
  • In Figure 2, an example event venue representation for a football stadium is schematically illustrated.
  • the football pitch is provided in the centre of the figure, while the various seating areas are shown around the outside of the pitch, designated by alphanumeric codes.
  • the position of the viewing user could be represented on the event venue representation, for example by an icon (for example the user's avatar) or a flashing dot, permitting the user to identify their own position at the event, and the position of other media sources relative to themselves.
  • the view of Figure 2 is the top level event venue representation showing the whole event venue.
  • Each seating area shown in the top level representation of Figure 2 corresponds to an area of seating, such as that shown schematically in Figure 3.
  • Any seating area shown in the top level representation within which a currently transmitting media content source (that is, a user/device currently transmitting footage to the media content distribution device 10) is present may be highlighted to indicate the presence of a transmitting source to a viewer of the event venue representation. Any seating area within which a currently transmitting media content source is present can be selected, resulting in the display switching to show a close up (zoomed) view of that seating area.
  • a close up of the seating area D1 of Figure 2 is shown in Figure 3.
  • seat rows A to G form part of the seating area D1, each row having either 15 or 16 seats.
  • the event venue representation is a real time representation of where footage is available from at any given time. Accordingly, seating areas may become highlighted and switch off throughout the event, as users turn on and off the capture and transmit functions of their devices.
  • streamed footage from a given device may suddenly terminate when the user of that device decides to stop capturing and streaming the footage.
  • the media content distribution device may intelligently and smoothly switch to streaming footage from a different image capture device nearby to the image capture device which has ceased to transmit footage. In this way the viewing user should continue to experience a similar view of the event.
  • termination of a footage stream may cause the event venue representation to be displayed, with the viewing user being prompted to select another source.
  • the media content distribution device may switch streaming from a first device to a second device even if the first device does not discontinue its transmission.
  • the second device is in a similar location to the first device but has higher quality image capture characteristics.
  • a "favourite" source of the viewing user e.g. a friend
  • the viewing user's preferences might dictate that transmissions from certain users always take precedence, or take precedence over other nearby users and devices.
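  • A plausible switching policy combining these rules - favourites first, then proximity to the lost or current source, then quality - is sketched below. It is illustrative only; the patent does not specify an ordering, and the dict fields are assumptions.

    import math

    def choose_replacement(sources, reference_pos, favourites):
        # Pick a live source near `reference_pos` (e.g. the position of the
        # device that stopped transmitting). Favourite users take precedence,
        # then the nearest device, then the highest quality rating.
        live = [s for s in sources if s["live"]]
        if not live:
            return None
        return min(live, key=lambda s: (
            s["user_id"] not in favourites,                 # favourites first
            math.hypot(s["pos"][0] - reference_pos[0],
                       s["pos"][1] - reference_pos[1]),     # then nearest
            -s["quality"],                                  # then best quality
        ))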
  • the quality of the devices providing the footage may be indicated for example by having different coloured highlighting for seats associated with footage of different quality. It will be appreciated that other visual indicators could instead be provided.
  • “Favourite” sources may be distinguished in the event venue representation from other sources, either by a different colour, or an annotation or any other visual indication. This enables a viewing user to preferentially select footage from their friends, from participating celebrities at the event, or from other users whom they have found to capture event footage in a manner which they like.
  • media sources could be tagged (in the database for example) as being associated with a particular team. An indication of the team could then be visually identified on the event venue representation, permitting the user to select media sources associated with a particular team. Either as an alternative or an adjunct to providing visual representations of e.g. favourite sources, the quality of sources or a team association of sources, filtering could be provided, permitting the user to view only favourite sources, high quality sources or sources associated with a particular team, for example.
  • the event venue representation may indicate a direction in which the transmitting image capture devices are facing, giving the user an idea of what the associated footage will contain.
  • This indication could be provided by a directional arrow originating at the location of the media source and pointing in the direction which the image capturing device is pointing. This feature would require the image capture device to transmit an indication of its current facing to the media handling device along with the footage.
  • Hardware facilitating the self- determination of the orientation of personal devices such as smartphones is readily available, and can be used to provide this function.
  • the direction of facing of image capture devices may also be used in combination with techniques such as goal line tracking or ball location tracking to permit the auto-selection or filtering of camera devices directed at a specific area of interest at the event.
  • the media handling device may compare a direction of facing of each camera device at a particular time with the location of (for example) the ball at that same time to determine whether the ball is within the field of view of the camera device, or preferably in the centre of the field of view of the camera device.
  • a similar principle could be applied to other objects of interest in other contexts, for example the location of racing cars on a racing circuit.
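  • The facing comparison described above amounts to a field-of-view test. The sketch below assumes a horizontal field of view of 60 degrees (an invented figure), planar coordinates and camera facings expressed as compass bearings; none of these specifics come from the patent.

    import math

    def angular_offset(cam, target_pos):
        # Angle between the camera's facing and the direction to the target
        # (e.g. the ball), both as compass bearings on a local flat map.
        dx = target_pos[0] - cam["pos"][0]
        dy = target_pos[1] - cam["pos"][1]
        bearing = math.degrees(math.atan2(dx, dy)) % 360.0
        return abs((bearing - cam["facing"] + 180.0) % 360.0 - 180.0)

    def cameras_on_target(cameras, target_pos, fov_deg=60.0):
        # Keep cameras whose (assumed) field of view contains the target,
        # ranked so the target is nearest the centre of view first.
        in_view = [c for c in cameras
                   if angular_offset(c, target_pos) <= fov_deg / 2.0]
        return sorted(in_view, key=lambda c: angular_offset(c, target_pos))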
  • the event venue representation may also indicate the location of fixed cameras installed within the event venue, or mobile cameras utilised by professional cameramen at the event. These media sources may also be selected by users for playback on their personal device via the event venue representation.
  • When footage is captured, it may simply be streamed (only stored temporarily) from the image capture device to the viewing device via the media content distribution device, or it may be stored at the content store 70 in addition to being streamed to requesting users. If it is stored in the content store 70 then it can also be accessed at a later time. In one example, most footage is streamed, but some footage is marked by the user capturing that footage as being for permanent storage. This footage, when transmitted to the media content distribution device, is both streamed to any requesting users and also stored in the content store 70.
  • At least some of the footage being transmitted to the media content distribution device is stored at the image capture device (but not at or by the media distribution device) at the time of capture, and is then uploaded to the content store 70 at a later time, for example after the event has finished.
  • Each user may only be permitted to store a limited amount of footage (in terms of one or both of data capacity and duration), in order that the storage facility of the content store is not overwhelmed.
  • a user may be encouraged or prompted to store footage, for example if there are too few media sources in their area - with this being rewarded by extra points.
  • Footage stored into the content store 70 is stored in association with the location (e.g. GPS location or seat location, or derived location) from which the footage was captured, the identity of the user who captured the footage, the quality of the footage (e.g. characteristics of the image capture device), and the time of capture of the footage.
  • This information may be provided from the image capture device to the media content distribution device or from the subscriber database, or a combination of the two.
  • the time of capture, location and footage may be provided by the image capture device along with the user ID and device ID, and the user ID and device ID may be used to obtain the quality information from the subscriber database.
  • the quality information may be stored at the image capture device and provided to the media content distribution device along with the footage and other data.
  • the GPS location may be converted by the media content distribution device into a location within the event venue.
  • This derived location might be a seat position, or alternatively a seating area (e.g. D1) or a zone within a seating area. This will be sufficient for the event venue representation to highlight an appropriate position for the footage.
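  • Deriving such a location could be done by testing the reported GPS position (projected into the venue's local coordinates) against zone boundary polygons; a standard ray-casting sketch follows, with the zone table assumed rather than taken from the patent.

    def point_in_polygon(pt, polygon):
        # Classic ray-casting test; polygon is a list of (x, y) vertices.
        x, y = pt
        inside = False
        n = len(polygon)
        for i in range(n):
            x1, y1 = polygon[i]
            x2, y2 = polygon[(i + 1) % n]
            if ((y1 > y) != (y2 > y) and
                    x < (x2 - x1) * (y - y1) / (y2 - y1) + x1):
                inside = not inside
        return inside

    def derive_zone(position, zones):
        # zones: mapping of zone name (e.g. "D1") to its boundary polygon,
        # in the same local coordinate frame as `position`.
        for name, polygon in zones.items():
            if point_in_polygon(position, polygon):
                return name
        return None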
  • the uploading user may also store additional content, for example text describing their personal memories of the event, or photographs or sound clips captured during the event.
  • In Figure 4, a timeline 100 for a football match is shown. The timeline 100 is broken down into periods: pre-match, first half, half time, second half, extra time and post-match. It will be appreciated that this is only one example breakdown of a football match.
  • a slider button 110 is provided which can be dragged along the timeline to access content associated with any given moment in time. When the slider button 110 is dragged to a particular position on the timeline 100, the event venue representation is updated to highlight those media sources for which stored content captured at the time indicated by the slider button 110 is available.
  • a series of buttons are provided for selecting different periods during the event. When a button is selected, the event venue representation is updated to highlight those media sources for which stored content captured during the period indicated by the selected button is available.
  • buttons could be selected, resulting in media sources which captured footage during some or all of the selected periods being highlighted. It will further be appreciated that the interactive timeline of Figure 4 and the period selection buttons of Figure 5 could be provided in combination, permitting a user to access content either from a particular moment in time, or during a particular period.
  • buttons are also shown, these being a "Goals!" button, a "Saves!" button and a "Fouls!" button. These buttons can be used to access footage relating to each of these themes.
  • the footage associated with each theme may be identified in the database with tags - manually entered by the uploading user for example, or by associating with a theme footage captured at a time during the event at which an occurrence corresponding to that theme was known to have taken place. This could be determined automatically by the system based on event related information entered by the system operator.
  • the system operator may associate with an event a dataset indicating times during the event of various key occurrences, such as goals, saves or fouls for example, resulting in a list of time instants or periods at which each given event type occurs.
  • An example is shown in the table below. Based on this information, the selection of "goals" by the viewing user will cause the event venue representation to be populated with identifiers of the availability of media sources which were being captured at the times 17m32s, 27m02s and 91m11s.

    Occurrence type | Times during event
    Goals           | 17m32s, 27m02s, 91m11s
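  • Resolving a selected occurrence to available footage is then a matter of finding clips whose capture interval covers the listed times; a minimal sketch follows, with the clip fields assumed for illustration.

    def sources_at(clips, t_seconds):
        # Clips whose capture interval covers the requested instant,
        # e.g. a goal at 17m32s corresponds to t_seconds = 17 * 60 + 32.
        return [c for c in clips if c["start_s"] <= t_seconds <= c["end_s"]]

    def sources_for_occurrences(clips, occurrence_times):
        # Union of clips covering any listed occurrence (e.g. all goals),
        # de-duplicated by clip ID.
        seen, out = set(), []
        for t in occurrence_times:
            for c in sources_at(clips, t):
                if c["clip_id"] not in seen:
                    seen.add(c["clip_id"])
                    out.append(c)
        return out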
  • the footage may commence from the start of the clip, or may start from the point in time selected by the user or derived from the user's input.
  • the viewing user is also able to attach text, sound clips or images to the footage, which are then accessible to other users subsequently accessing that item of footage.
  • the attached text, sound or image data is then stored at the content store 70 in association with the footage itself.
  • a viewing user is able to associate their own memories of the event with footage captured by someone else.
  • Other viewers accessing the same footage at a subsequent time will have access not only to the footage and related content uploaded by the user who captured the footage, but also to any additional materials uploaded in association with the footage by previous viewers.
  • Figure 6 shows a seating plan for a theatre hall, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above). It can be seen from Figure 6 that certain seats are shaded - these seats represent the location of available media sources within the theatre hall. In the context of real-time (during-event) viewing, these shaded seats represent the location of currently transmitting image capture devices, while in the context of viewing a historical (ended) event, the shaded seats represent the location of media sources which captured data at a particular selected time, period or occurrence within the event. It will be appreciated that the detail present in Figure 6 may be inappropriate for a small handheld device. In this regard, a simplified top level event venue representation such as that shown in Figure 7 may be used instead.
  • In Figure 7, a simplified view of the theatre hall is provided, broken down into multiple selectable seating areas 210, 220, 230, 240, 250, 260 and 270.
  • a user is able to select one of these seating areas which is of interest, resulting in a close up view of that seating area being shown.
  • selecting the seating area 260 in Figure 7 may result in the close up of Figure 8 being presented to the user.
  • three blocks of seating are shown, with some seats within each block being shaded to represent that footage is available in relation to each of those seats. The user is able to select these shaded seats to gain access to the footage associated with that location.
  • the selection of a shaded seat would trigger the streaming of footage from an image capture device in the case of real-time operation (connect-a-fan) or the playback of stored footage in the case of viewing after the event.
  • Figure 9 shows a plan view of a racing track, and this may constitute an event venue representation provided to a user wishing to access footage of the event (either in real-time or in relation to a past event, in each case as described above).
  • the event venue representation is marked with 20 zones, each of which is selectable to gain access to media content associated with that zone. In some cases, selection of a zone may trigger the playback of a random or best available (based on quality information, for example) item of footage associated geographically with that zone.
  • Figure 10 is an example of a close up view within the event venue representation of Figure 9.
  • an indication of the location of the track is provided, along with an indication of the direction in which cars are travelling (in this case provided by the car representations and related arrows).
  • Various seating and standing areas are identified in Figure 10, identified by numbering.
  • These areas may be selectable to obtain access to more detailed seating plans similar to that shown in Figure 8, or the location of available media sources may be indicated by icons or highlighting directly on the representation of Figure 10. Again, such media sources are selectable in order to initiate playback of footage relating to these.
  • users capturing footage of the race may be positioned all around the track at many or all of the 20 identified locations.
  • a particular viewer will be located only at a single location at any given time. However, that viewer may wish to track the action at pole position throughout a lap of the race.
  • the user can achieve this by selecting appropriate media sources around the track as the car in pole position progresses.
  • the system is operable to sequentially select media sources at different positions around the track to follow the progress of a particular car, for example the car in pole position or a car selected by the viewing user.
  • the system in this case is able to monitor the location of the selected car (for example based on GPS trackers affixed to each car), and repeatedly switch the media source presented to the viewing user such that the selected car is always likely to be in view.
  • the media sources selected could be those closest to the current position of the selected car, and/or those oriented towards the current position of the selected car (based on internal sensors of the image capture devices); a sketch of this selection logic is given below, after the remaining racing-track features.
  • the event venue representation identifies where key events are happening, for example the current position of the car in pole position, the location of a crash or other incident, or a pit stop.
  • on-car cameras may also be provided, and may again be accessible to a user using the event-venue representation.
  • the event venue representations can be navigated after the event both with respect to location and with respect to time/occurrence in like manner to the football stadium embodiment described above.
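As referenced above, the car-following camera selection can be illustrated with a short sketch. The following Python fragment is a minimal illustration only: the Camera record, its field names and the 60-degree facing threshold are assumptions made for this example, not terms taken from this document.

```python
import math
from dataclasses import dataclass

@dataclass
class Camera:
    camera_id: str
    lat: float      # GPS latitude of the capture device (degrees)
    lon: float      # GPS longitude of the capture device (degrees)
    bearing: float  # compass heading the lens faces (degrees, 0-360)

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres (equirectangular, fine at track scale)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(dx, dy)

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees 0-360."""
    dlon = math.radians(lon2 - lon1)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def select_camera(cameras, car_lat, car_lon, max_angle_off=60.0):
    """Pick the nearest camera that is roughly oriented towards the tracked car."""
    def facing(cam):
        off = abs(cam.bearing - bearing_to(cam.lat, cam.lon, car_lat, car_lon))
        return min(off, 360 - off) <= max_angle_off
    # Prefer cameras facing the car; fall back to all cameras if none qualify.
    candidates = [c for c in cameras if facing(c)] or list(cameras)
    return min(candidates, key=lambda c: distance_m(c.lat, c.lon, car_lat, car_lon))
```

Re-running select_camera as the tracked car's GPS position updates, and switching the presented stream whenever the result changes, would reproduce the behaviour of keeping the selected car always likely to be in view.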
  • an image capture device transmits footage to the media content distribution device 10 via the hub 20 (not shown).
  • the footage received also includes metadata such as the user ID and device ID of the user of the image capture device and the device itself respectively, and optionally location information (for example GPS coordinates) and information regarding the current orientation of the image capture device, if this is not to be retrieved from the database 40.
  • the media content distribution device 10 obtains information about the user of the image capture device and the device itself from the database 40.
  • the obtained information may include quality information about the image capture device 62a (based on the device ID) and, in some cases, the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app, or seat numbers entered into the app and uploaded to the database 40. Based on this information (and on similar information obtained in relation to other footage received from other image capture devices), a data structure representing the location and optionally the quality of available footage streams is formed (a sketch of such a structure is given after step S7 below). Optionally, at a step S3, at least some of the received footage is stored in the content store 70 along with the identity of the user which provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage.
  • a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access.
  • an event venue representation is generated based on the data structure and passed back to the viewer device 62c at a step S5.
  • the user of the viewer device 62c selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device.
  • the requested footage is provided (streamed) from the media content distribution device 10 to the viewer device 62c.
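Steps S1 to S7 amount to maintaining a lookup of available streams keyed by venue location. The following Python sketch of the data structure formed at step S2 is illustrative only; every class and field name here is an assumption rather than a term from this document.

```python
from dataclasses import dataclass, field

@dataclass
class StreamEntry:
    user_id: str
    device_id: str
    seat_or_zone: str  # venue location, e.g. a seat number or a track zone
    quality: int       # device quality score looked up from the database
    stream_url: str    # where the viewer device is directed for streaming

@dataclass
class VenueStreams:
    event_id: str
    entries: dict = field(default_factory=dict)  # keyed by seat_or_zone

    def register(self, entry: StreamEntry) -> None:
        """Step S2: record an incoming stream against its venue location,
        keeping only the best-quality source per location."""
        current = self.entries.get(entry.seat_or_zone)
        if current is None or entry.quality > current.quality:
            self.entries[entry.seat_or_zone] = entry

    def representation(self) -> dict:
        """Steps S4/S5: the location (and quality) data sent to a viewer
        device to populate the shaded seats or zones."""
        return {loc: e.quality for loc, e in self.entries.items()}

    def lookup(self, seat_or_zone: str):
        """Step S6: resolve a selected seat or zone to a stream, if any."""
        return self.entries.get(seat_or_zone)
```

In this sketch, representation() supplies the shaded-seat data of Figures 6 to 8, and lookup() resolves a selected seat or zone to the stream delivered at step S7.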
  • a schematic flow diagram of the collective memory method is shown.
  • captured content is uploaded from an image capture device 62a to the content store 70.
  • the captured content is uploaded in association with the time of capture, the user ID of the user and the device ID of the image capture device.
  • the location of the user may also be uploaded. This step may take place at the time of capture during an event, or subsequently.
  • the content store 70 obtains information about the user of the image capture device and the device itself from the database 40.
  • the obtained information may include quality information about the image capture device 62a (based on the device ID) and in some cases the location of the user where this has been stored in advance - as might be the case for allocated seating associated with tickets booked via the app or seat numbers entered into the app and uploaded to the database 40.
  • the captured content and the other uploaded and acquired information referred to above is stored in association in the content store.
  • the upload step Ul may include the association of additional text, sound or image content with the captured content and the upload of this additional content to the content store for storage.
  • a user requests an event venue representation from the content store 70. This step may be triggered by the user opening the app, tapping an icon or an external view representing the event venue and/or event, or similar.
  • the content store 70 generates an event venue representation based on the uploaded content corresponding to a particular event at a particular venue.
  • the event venue representation is populated with visual indications of content which is available at particular times during the event and at particular locations within the venue. The locations and times for each item of content are derivable from the time of capture information and location information stored at the content store 70 in association with the footage.
  • the event venue representation may also indicate whether a particular item of content has additional information ("memories") associated with it, for example in the form of text-based descriptions of recollections of the event, or audio and/or images associated with the uploading user's or another user's memory of the event. This indication may be given visually, for example by colour coding, icon shaping or any other technique.
  • the viewer uses the event venue representation to identify and select a desired item of content, whereupon this item of content is requested from the content store 70.
  • the content store 70 provides the requested item of content to the viewer, along with any additional information ("memories") associated with that item of content.
  • the viewer is then able to watch the content, and examine the additional information.
  • the user is able to upload their own additional information ("memories"), in the form of text, audio or image, to be associated with the item of content. The viewer's additional information will then be accessible to subsequent viewers accessing the item of content.
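The collective memory flow above reduces to content records that accumulate viewer annotations over time. A minimal Python sketch follows; all class and field names are invented for illustration and do not come from this document.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Memory:
    user_id: str
    kind: str     # "text", "audio" or "image"
    payload: str  # the text itself, or a reference to an audio/image file

@dataclass
class ContentItem:
    content_id: str
    uploader_id: str
    device_id: str
    captured_at: datetime
    location: str  # seat or zone within the venue
    memories: list = field(default_factory=list)

    def add_memory(self, memory: Memory) -> None:
        """Attach a viewer's recollection; it becomes visible to every
        subsequent viewer of the same item."""
        self.memories.append(memory)

def venue_representation(items, at_time: datetime, window_s: int = 60) -> dict:
    """Which venue locations have content near a chosen moment, and whether
    that content already carries associated memories."""
    return {
        item.location: {"content_id": item.content_id,
                        "has_memories": bool(item.memories)}
        for item in items
        if abs((item.captured_at - at_time).total_seconds()) <= window_s
    }
```

Here venue_representation() yields, for a chosen moment in the event, the per-location indications of available content and attached memories described above.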
  • an augmented reality system for accessing media content is schematically illustrated.
  • a user 310 possessing an electronic device 315 is able to view an augmented reality image of various event venues 320, 325, 330, 335 by aiming his electronic device 315 at the geographical location of one of the venues, wherever in the world those event venues are.
  • the event venues 320 and 335 are sports stadiums
  • the event venue 325 is a theatre hall
  • the event venue 330 is a racing circuit.
  • Each of the event venues in the example of Figure 13 is a different distance from the user 310.
  • the event venue 320 is 4700km from the user
  • the event venue 325 is 179km from the user
  • the event venue 330 is 23km from the user
  • the event venue 335 is 1km from the user.
  • Figure 14 schematically illustrates the electronic device 315 of Figure 13 showing an augmented reality image of the event venue 320, at which the electronic device 315 is aimed. For event venues beyond a certain predetermined range of the viewing user, this may be the only content available to the user. For closer event venues, access to additional content may be provided, for example by clicking or tapping on the external view.
  • a schematic flow diagram of the augmented reality method is shown.
  • footage is captured and transmitted from image capture devices to a media distribution device.
  • the image capture devices may be portable electronic devices of users at the event venue, or fixed or portable camera devices associated with the event venue itself.
  • the footage received also includes metadata such as the user ID and device ID of the user of the image capture device and the device itself respectively, and optionally location information (for example GPS coordinates).
  • a database may store a predetermined location for each of these devices.
  • data associated with the captured footage is obtained from the database.
  • the obtained information may include quality information about the image capture device (based on the device ID) and in some cases the location of the transmitting user where this has been stored in advance. Based on this information (and based on similar information obtained in relation to other footage received from other image capture devices) a data structure representing the location and optionally quality of available footage streams is formed.
  • at least some of the received footage is stored in the content store 70 along with the identity of the user which provided the footage, the time of capture, the location from which the footage was taken, and quality information about the device which generated the footage.
  • a viewer device 62c requests access to an event venue representation displaying the location (and optionally quality) of footage streams which it is able to currently access.
  • an event venue representation is generated based on the data structure and passed back to the viewer device at a step W5. It will be appreciated that the data structure, and thus the event venue representation, includes only the media sources which the viewer device is permitted to access based on its location relative to the event venue.
  • At a step W6, the user of the viewer device selects desired footage using the event venue representation, consequently sending a request for this footage to the media content distribution device. If the desired footage is stored (non-real-time) footage, then at a step W7 a request for the footage is sent to the content store, and the footage is returned to the media distribution device at a step W8. The retrieved footage is then provided to the viewer device at a step W9. If on the other hand the footage is real-time footage, then the steps W7 and W8 are not required and instead footage from the selected source is streamed from the capture device to the viewer device at the step W9.
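The branch at steps W6 to W9 is a simple dispatch between live relay and retrieval from storage. A hedged Python sketch, in which source, viewer, content_store and distributor are assumed interfaces used purely for illustration:

```python
def handle_footage_request(source, viewer, content_store, distributor):
    """Steps W6 to W9: serve a viewer's selection either live or from storage.

    `source`, `viewer`, `content_store` and `distributor` are assumed
    interfaces for illustration only; none are named in this document.
    """
    if source.is_live:
        # Real-time case: relay the capture device's stream directly (step W9).
        distributor.relay(source.stream(), viewer)
    else:
        # Stored case: fetch from the content store (steps W7/W8),
        # then deliver the retrieved footage (step W9).
        footage = content_store.fetch(source.content_id)
        distributor.send(footage, viewer)
```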
  • In Figures 16A and 16B, an access restriction based on distance from the event venue, referred to above, is schematically illustrated.
  • a viewer device 620 is shown at a relatively large distance from an event venue 610. At this distance, the viewer device 620 is only able to access a single internal camera 630 at the event venue 610. It will be appreciated that at an even greater distance the viewer device 620 may not be able to access any internal cameras at all, limiting the display to an exterior view of the event venue as per Figure 14.
  • In Figure 16B, the viewer device 620 is much closer to the event venue 610. In this case, the viewer device 620 is not only able to access the internal camera 630, but also additional cameras 640a, 640b, 640c, 640d. In other words, as the viewer device 620 approaches the event venue 610, more internal cameras are available to view.
  • the cameras 630, 640 may be user devices used by fans within the event venue, or fixed or mobile camera devices installed within the event venue and operated by the event organisers.
  • the camera devices available to the user are selected from cameras closest to the viewer device, providing the experience of a "window" into the event venue from the user's real world location.
  • In Figure 17A, an access restriction based on position around the event venue, referred to above, is schematically illustrated.
  • the viewer device 620 is located just outside the event venue 610.
  • the viewer device 620 has access to the internal cameras 650a, 650b, 650c closest to his location outside the event venue.
  • the experience of a "window" into the event venue from the user's real world location is provided.
  • when the viewer device 620 is located at a different position around the event venue, a different set of internal cameras 660a, 660b, 660c is available to the user.
  • for viewing after the event, the references to internal cameras in relation to Figures 16 and 17 can be replaced with stored media sources and the positions within the event venue from which they were captured.
  • a user can move around an event venue after the event and "look in" using their viewer device as if the event was currently occurring.
  • the user could turn up at an event venue after an event has finished (potentially weeks, months or years later) and trigger playback of footage associated with the event.
  • the selection of cameras may be based in part on the direction in which the viewer device is facing - determined for example by existing functionality of the device itself (electronic compass technology).
  • An internal camera may be available for access if it is facing in the same or a similar direction as the viewer device. It will be appreciated that the cameras available for access may be determined based on a combination of two or more of the distance, position, and orientation of the viewer device with respect to the position and/or orientation of the camera devices.
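One possible combination of the distance, position and orientation tests of Figures 16 and 17 is sketched below in Python. The distance tiers and the 45-degree heading tolerance are invented for illustration; this document does not specify particular thresholds.

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres (equirectangular approximation)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(dx, dy)

def accessible_cameras(cameras, viewer, venue, max_heading_off=45.0):
    """Filter venue cameras by the viewer's distance, position and heading.

    `cameras`, `viewer` and `venue` are assumed records carrying `lat`/`lon`
    attributes (cameras and the viewer also carry a compass `heading`).
    """
    dist = distance_m(viewer.lat, viewer.lon, venue.lat, venue.lon)
    if dist > 5000:
        return []  # too far away: exterior augmented reality view only

    def heading_ok(cam):
        # Orientation test: camera facing roughly the same way as the viewer.
        off = abs(cam.heading - viewer.heading)
        return min(off, 360 - off) <= max_heading_off

    candidates = [c for c in cameras if heading_ok(c)]
    # The cameras nearest the viewer give the "window into the venue" effect;
    # nearer viewers unlock progressively more cameras.
    candidates.sort(key=lambda c: distance_m(viewer.lat, viewer.lon, c.lat, c.lon))
    limit = 1 if dist > 1000 else len(candidates)
    return candidates[:limit]
```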
  • the higher quality audio data may be audio data professionally captured at the event venue and provided to the media handling device, or audio data from another image capture device having a better audio capture capability.
  • the user may be able to selectively switch between the actual audio associated with a media source, or higher quality substituted audio.
  • the audio substitution could be achieved at the media handling device by stripping the audio from audio/video data received from an image capture device, and inserting a higher quality audio stream captured in parallel. It will be appreciated that appropriate synchronisation between the audio/video stream and the substitute audio stream would need to be provided.
  • the video data is stored in association with date/time information indicating the time at which the video data was captured.
  • a substitute audio stream could be stored separately from the stored video data, with its own date/time of capture information.
  • the date/time of capture information of each of the video data and substitute audio data can be used to correlate the two streams together, for combination either for playback only, or for storage as a higher audio quality version of the video file.
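Correlating the two streams by their capture timestamps, as described above, reduces to computing a playback offset. A minimal Python sketch, assuming each stream records an absolute capture start time:

```python
from datetime import datetime

def audio_offset_seconds(video_start: datetime, audio_start: datetime) -> float:
    """Offset at which the substitute audio should begin so that it lines up
    with the video's capture timeline. A positive value means the audio
    track started earlier and is trimmed from its front."""
    return (video_start - audio_start).total_seconds()

# Illustrative use: a clip whose capture started 2.5 s after the substitute
# audio feed began, so playback skips the first 2.5 s of that audio.
video_start = datetime(2013, 9, 12, 20, 15, 2, 500000)
audio_start = datetime(2013, 9, 12, 20, 15, 0)
offset = audio_offset_seconds(video_start, audio_start)  # -> 2.5
```

A negative offset would instead mean delaying the start of the substitute audio relative to the video.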

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention concerns a media handling device for receiving and distributing media data, and a plurality of user devices. Each user device has a camera function and is operable in a media transmission mode to transmit media data of an event taking place at an event venue to the media handling device. Each user device is operable in a media reception mode to receive media data from the media handling device. When operating in the media reception mode, a first user device of the plurality of user devices is able to select a media source from a second user device of the user devices currently operating in the media transmission mode, and the media handling device is operable to stream the media data of the event received from the second user device to the first user device. In this way, device users at an event are able to share media content which they have captured, for example video footage, with other users outside or remote from the event. This allows users to experience the event from different viewpoints around the event venue.
EP13771566.0A 2012-09-13 2013-09-12 Media content distribution Withdrawn EP2896210A2 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GBGB1216360.6A GB201216360D0 (en) 2012-09-13 2012-09-13 System and method for improving a supporter's experience
GB1306151.0A GB2505978A (en) 2012-09-13 2013-04-05 Media content distribution system
PCT/GB2013/052384 WO2014041353A2 (fr) 2012-09-13 2013-09-12 Media content distribution

Publications (1)

Publication Number Publication Date
EP2896210A2 true EP2896210A2 (fr) 2015-07-22

Family

ID=47144227

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13771566.0A Withdrawn EP2896210A2 (fr) 2012-09-13 2013-09-12 Distribution de contenu multimédia

Country Status (3)

Country Link
EP (1) EP2896210A2 (fr)
GB (2) GB201216360D0 (fr)
WO (1) WO2014041353A2 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9888296B2 (en) * 2015-03-27 2018-02-06 Bygge Technologies Inc. Real-time wireless synchronization of live event audio stream with a video recording
CN106999788A (zh) * 2014-11-30 2017-08-01 杜比实验室特许公司 Social media linked large format theater design
US11271648B2 (en) 2017-07-11 2022-03-08 Supreme Architecture Ltd. Spatial optical wireless communication system
US11126846B2 (en) 2018-01-18 2021-09-21 Ebay Inc. Augmented reality, computer vision, and digital ticketing systems
CN110213596B (zh) * 2018-03-28 2021-08-24 腾讯科技(深圳)有限公司 Live broadcast switching method and apparatus, computer device, and storage medium
WO2020211077A1 (fr) 2019-04-19 2020-10-22 Orange Method for assisting the acquisition of multimedia content at a scene
US11395049B2 (en) * 2020-03-31 2022-07-19 Northwest Instrument Inc. Method and device for content recording and streaming
US11812084B2 (en) 2020-03-31 2023-11-07 Northwest Instrument Inc. Method and device for content recording and streaming
CN113094011B (zh) * 2021-03-26 2023-12-26 联想(北京)有限公司 Screen sharing method, apparatus and device, and computer-readable storage medium
EP4322534A1 (fr) * 2022-08-12 2024-02-14 Rami Khatib Methods for viewing recorded events
JP2025156861A (ja) * 2024-04-02 2025-10-15 株式会社Jvcケンウッド Terminal device and display video determination system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2852769B1 (fr) * 2003-03-20 2005-09-16 Eastman Kodak Co Method for sharing multimedia data
US8659636B2 (en) * 2003-10-08 2014-02-25 Cisco Technology, Inc. System and method for performing distributed video conferencing
US20070127508A1 (en) * 2003-10-24 2007-06-07 Terry Bahr System and method for managing the transmission of video data
US8171516B2 (en) * 2004-02-24 2012-05-01 At&T Intellectual Property I, L.P. Methods, systems, and storage mediums for providing multi-viewpoint media sharing of proximity-centric content
US8522289B2 (en) * 2007-09-28 2013-08-27 Yahoo! Inc. Distributed automatic recording of live event
US8635645B2 (en) * 2008-09-30 2014-01-21 Qualcomm Incorporated Apparatus and methods of providing and receiving venue level transmissions and services
FR2959372A1 (fr) * 2010-04-23 2011-10-28 Orange Vallee Method and system for managing a streaming session of a live displayed video stream
EP2403236B1 (fr) * 2010-06-29 2013-12-11 Stockholms Universitet Holding AB Système de mixage vidéo mobile

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014041353A3 *

Also Published As

Publication number Publication date
WO2014041353A3 (fr) 2014-05-22
WO2014041353A2 (fr) 2014-03-20
GB2505978A (en) 2014-03-19
GB201216360D0 (en) 2012-10-31
GB201306151D0 (en) 2013-05-22

Similar Documents

Publication Publication Date Title
EP2896210A2 (fr) Media content distribution
US11546566B2 (en) System and method for presenting and viewing a spherical video segment
KR102194222B1 (ko) Event enhancement using augmented reality effects
US10701448B2 (en) Video delivery method for delivering videos captured from a plurality of viewpoints, video reception method, server, and terminal device
ES2974683T3 (es) Systems and methods for multimedia swarms
US10020025B2 (en) Methods and systems for customizing immersive media content
US9026596B2 (en) Sharing of event media streams
US10795557B2 (en) Customizing immersive media content with embedded discoverable elements
ES2905535T3 (es) Live video broadcast services
US9743060B1 (en) System and method for presenting and viewing a spherical video segment
CN109416931A (zh) Apparatus and method for gaze tracking
US10770113B2 (en) Methods and system for customizing immersive media content
US20170142486A1 (en) Information processing device, display device, information processing method, program, and information processing system
KR20150100795A (ko) Video capture, processing and distribution at group events
US9968853B2 (en) Media system and method
US12301781B1 (en) System and method for messaging channels, story challenges, and augmented reality
US9973746B2 (en) System and method for presenting and viewing a spherical video segment
Mase et al. Socially assisted multi-view video viewer
US20190012834A1 (en) Augmented Content System and Method
NELSON Enhancing fan connection to the Seattle Seahawks through digital media

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150313

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160401