
WO2009047572A1 - Integrated system, method and application for the synchronized interactive play-back of multiple spherical video content and autonomous product for the interactive play-back of prerecorded events. - Google Patents

Integrated system, method and application for the synchronized interactive play-back of multiple spherical video content and autonomous product for the interactive play-back of prerecorded events. Download PDF

Info

Publication number
WO2009047572A1
Authority
WO
WIPO (PCT)
Prior art keywords
spherical
play
event
interactive
viewer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/GR2008/000060
Other languages
French (fr)
Inventor
Antonios Karydis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANALYSIS SYSTEMS RESEARCH HIGH-TECH SA
Original Assignee
ANALYSIS SYSTEMS RESEARCH HIGH-TECH SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GR20070100613A external-priority patent/GR1006496B/en
Priority claimed from GR20080100618A external-priority patent/GR1006709B/en
Application filed by ANALYSIS SYSTEMS RESEARCH HIGH-TECH SA filed Critical ANALYSIS SYSTEMS RESEARCH HIGH-TECH SA
Publication of WO2009047572A1 publication Critical patent/WO2009047572A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • The integrated system, method and application for the synchronized interactive play-back of multiple spherical video content on a computer or through the internet involves the synchronous recording of events in multiple spherical videos from different positions in the area where the event takes place, and the interactive play-back of the recorded event on a computer or through the internet by means of an interactive application for the synchronized play-back of multiple spherical videos, which offers to the viewer, during the play-back and without interrupting or disrupting it, the capability to interactively choose whichever spherical video they wish to watch from the available alternatives.
  • This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position as each spherical video corresponds to a different spherical camera which had been positioned on a predetermined position in the area in which the event took place.
  • In addition to choosing the position from which the event is viewed, the viewer, during the play-back of the event and without interrupting or disrupting it, can interactively alter the angle and direction of viewing as if they were turning their head to whichever direction they wished, as well as approach and move away from the point of interest (zoom-in and zoom-out).
  • the autonomous product for the interactive play-back of prerecorded events refers to a product in the form of a DVD which operates autonomously in a computer and allows the interactive play-back of an event that has been recorded and produced with the above integrated system. It utilizes an interactive application for the synchronized play-back of multiple spherical videos, in association with a graphical interface. The viewer, during the playback and without interrupting or disrupting it, is offered the capability to interactively choose whichever spherical video they wish to watch from the available alternatives.
  • This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position as each spherical video corresponds to a different spherical camera which had been positioned on a predetermined position in the area in which the event took place.
  • the viewer can interactively alter the angle and direction of viewing as if they were turning their head to whichever direction they wished, as well as approach and move away from the point of interest (zoom-in and zoom-out).
  • a number of video cameras are required for the complete recording of an event using conventional video technologies.
  • Those cameras are either stationary, positioned in preselected positions of perceived increased interest or movable. They can be moved and rotated around the vertical and lateral horizontal axis either by trained personnel (cameramen) or by mechanical means, in order to focus and approach selected points of interest. In all circumstances, the width of view that the video cameras can record is limited despite the availability of a wide range of different lenses, from extreme zooms to extreme wide angles.
  • When an event is transmitted live, the responsible event-director watches, while the event takes place, all the available images corresponding to all the available cameras on the event and chooses on the spot which image is transmitted at any time, usually the view from one camera at any given time. This is the view that the viewer watches in a passive way on their TV screen or on their computer screen streamed through the internet.
  • With the method and application for the synchronized interactive play-back of multiple spherical videos on a computer or through the internet and the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, every activity and individual happening which takes place during an event is recorded in very high resolution digital video format by means of two or more digital spherical video recording devices covering an angle of view of 360°x360° (spherical cameras). These devices record digitally on a single image a full sphere surrounding their position. Each such device produces a different spherical representation of the event.
  • Spherical video consists of a sequence of single spherical images (frames), each capturing an angle of view of 360°x360° (Figure 1).
  • each frame of the spherical video constitutes a panoramic image which represents 360° on the horizontal and 360° on the vertical axis.
  • two or more spherical cameras are used, positioned on pre-determined and preselected positions aiming to cover every aspect of the event area and as a result, capture totally every happening, activity and instance that takes place during the event.
  • With two or more spherical cameras (typically four) placed in strategically chosen positions of the event area, 100% of the activities happening on any point of the event area at any time during the event are recorded.
  • the sound from the event is recorded in digital format.
  • the spherical videos from every spherical camera and the corresponding sound are played-back synchronized with full interactive user control on a computer or through the internet and are projected in a normalized projection with a rate of 25 or 30 frames per second (depending on the local video system standard, PAL or NTSC) on a computer screen.
  • The viewer has the capability and freedom, at any point in time during the play-back, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed, by choosing which spherical video he/she wishes to watch at any given time, selecting among the several spherical videos available, each one corresponding to one of the several different spherical cameras which had been placed on preselected positions in the area where the event took place. In this way, the viewer selects to virtually move between the places where the cameras were positioned and watch the event from these points.
  • the viewer has total freedom to choose the direction and width (angle) of viewing from the available full angle of view of 360°x360° and also total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.
  • the viewer has the ability, whenever he/she chooses, to interactively select any one among the available spherical videos, each one of which has captured the full event duration and watch the event from the position corresponding to the particular place where the particular spherical camera had been positioned.
  • The viewer has full freedom to repeat the play-back of an event while choosing to watch it from a completely different view point (choosing a spherical video from a different spherical camera), but also from a different viewing angle, as he/she has the freedom to control interactively the direction and angle of view presented to him/her. In this way, the viewer in every repetition of the play-back of the event has the option to focus on a different point of interest of the same event.
  • the viewer has the ability, whenever he/she chooses, to interactively select a different spherical video to be projected to him/her. In this way he/she is moving his/her position in space and is transferred virtually to the place where the chosen spherical camera was positioned during the event, while the time-continuity of the play-back is not disrupted.
  • In a conventional video play-back, the viewer would have no ability to alter his/her position, as this would have been fixed by the event director who decides which image from which camera is included in the final linear sequence that is presented to the viewer.
  • the viewer has total freedom to choose what he/she will watch, from which point in space and towards which direction to focus his/her attention as he/she has total freedom to choose, interactively, at any time, to move to a different position in space (by selecting a different spherical video), to turn his/her viewpoint around at any direction, to approach or retreat, while maintaining control of the temporal play-back.
  • The experience offered to the viewer by watching an event using an application based on the method for the synchronized interactive play-back of multiple spherical video content of 360°x360° angle of view on a computer or through the internet, or by using the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, approximates the experience of him/her actually being in the area where the event took place, free to move around while turning his/her view to focus on points of interest at will.
  • The viewer is no longer a passive viewer of an event but interacts with and controls the re-enactment of the event through the use of an interactive application.
  • The viewer is no longer dependent on the choices made by somebody else (the director) regarding what to watch and what not to watch, as in conventional video play-back; instead, he/she actively assumes the role of the director.
  • the integrated system for the synchronized interactive play-back of multiple spherical videos on a computer or through the internet is based on the method that utilizes multiple digital video sensors arranged in a predefined pattern for the capture of individual conventional videos that are then combined (stitched) together to produce a spherical video.
  • Such a device is the LadyBug 2 spherical camera from Point Grey Research Inc. (Figure 2).
  • Each spherical camera integrates six digital video sensors (the LadyBug 2 integrates six Sony ICX204AQ 1/3" 1024x768 progressive scan CCDs). Each individual sensor captures images with a resolution of 1024 x 768 pixels at a rate of up to 30 frames per second (fps), with sensitivity (gain) between 0dB and 25dB and with shutter speeds between 0.06ms and 34ms.
  • Five sensors are arranged around a horizontal circular pattern in a portrait orientation. In this configuration each sensor covers an angle of view wider than 72° on the horizontal axis and produces images of 1024 pixels vertical by 768 pixels horizontal resolution. The five sensors together produce a composite image with a horizontal resolution of 3840 pixels (768 pixels x 5 images).
  • the sixth sensor is positioned to aim vertically and produces the top portion (dome) of the spherical image.
  • the bottom part of the image does not present imagery of any interest as it would record the support of the camera and therefore is not captured.
  • The stitched video images, at a rate of up to 30 fps, are sent from the spherical video camera via optical fiber with a transmission speed of 1.2Gbps to a compressor unit, where they are compressed in real time in the JPEG standard with a variable compression rate; the compressed spherical images are then transmitted via a FireWire 1394b connection with a transmission speed of 800Mbps to a computer, where they are either stored in a native RAW format or converted to digital video format, stored and transmitted.
  • When spherical video is captured for PAL countries, 25fps are captured, while for NTSC the rate is set to 30fps.
  • the individual spherical RAW images are combined into 5-second long AVI video sequences with an image aspect of 2:1 and 3500 x 1750 pixels horizontal to vertical resolution. These images represent the full surrounding sphere on an orthogonal projection.
  • The sound of the event is recorded using digital recording technologies in two or more discrete channels and stored in WAV format.
  • The sound is synchronized with the images and combined with them to produce spherical videos with embedded audio, which are stored in QuickTime format encoded with the Sorenson 3 algorithm.
  • The audio is also stored separately in MP3 format at 128 kbps, 44.1 kHz stereo.
  • In order to use the spherical video sequences in the method for synchronized interactive play-back of multiple spherical videos, they are converted into FLV format and their resolution is reduced to 1024x512 pixels, with the quality setting at 1000 kbps and a key frame every 30 (or 25) frames.
  • The resulting spherical videos (one from each spherical camera), in order to be played back, have to be stored on a computer or a removable medium (DVD) or be made available for streaming through the internet.
  • These spherical videos are played back in a normalized projection, in a similar way as conventional videos, projecting part of the sphere in a viewing window and offering the viewer interactive control over which part of the sphere to view at any point in time.
  • The application for the synchronized interactive play-back of multiple spherical videos offers the viewer the capability to select interactively which spherical video he/she wishes to be projected and to switch to alternative spherical videos while play-back is taking place, while throughout the play-back the sound remains synchronized to the projected spherical video images.
  • The application for synchronized interactive play-back of multiple spherical videos presents to the viewer a navigational/control interface (Figure 8), with a window in which the selected spherical video is played back in a normalized projection, in a similar fashion as a conventional video, at a frame rate of 30 or 25 frames per second, depending on the territorial video standard (NTSC or PAL).
  • the viewer may control interactively what is projected. He/she may turn around, altering the direction of viewing towards any point in any direction in a 360 degree range. He/she may alter the angle of viewing using zoom-in and zoom-out controls. This is similar to having the capability to move a virtual camera around its position within a 3D real space. Adjacent to the actual spherical video projection window the application interface presents the viewer with a graphical representation of the actual area where the event took place. This graphical representation of the actual scene of the event may be from very simple to very elaborate, without affecting the operation and performance of the application.
  • a graphical representation of the actual scene of the event may be from very simple to very elaborate, without affecting the operation and performance of the application.
  • When the angle of viewing changes, the radar cone becomes wider or narrower, depending on the alteration effected.
  • The viewer, using his/her pointing device on this representation, may interactively and directly alter the direction of viewing by dragging this graphic towards the desired point on the scene graphical representation.
  • the image projected in the main projection window follows every change in real time without any disruption or interruption in the play-back.
  • The graphical representation offers an additional, very useful and viewer-friendly navigational and control tool.
  • the application has the capability to highlight the positions of the spherical cameras also on the projection window, superimposing graphical elements on the actual projected spherical video that is played back.
  • As the viewer rotates his/her view, when the position of another spherical camera comes within his/her viewport, a graphic element will make this apparent to the viewer by highlighting its relative position.
  • This graphic element will also offer a control link to the viewer who may, by clicking on this graphic, move his/her position to the point where the corresponding selected camera was placed, altering the projected spherical video accordingly.
  • the viewer from within the application for synchronized interactive play-back of multiple spherical videos, may continuously alter his/her viewing position, direction and angle of view, in a direct and interactive fashion not offered by any other technology currently in use for the play-back of recorded events.
  • the application for synchronized interactive play-back of multiple spherical videos undertakes and controls the play-back of spherical video on a computer by feeding the projection system with the corresponding spherical video images.
  • As spherical videos have a much higher resolution compared to conventional video content, the play-back system places an increased load on the computer compared to conventional video play-back applications.
  • Spherical videos of reduced resolution (e.g. 1024x512 pixels) can, however, be played back on medium-powered computers of current technology.
  • the application based on the method for synchronized interactive play-back of multiple spherical videos utilizes the Adobe FLASH libraries to implement the play-back of spherical videos locally on a computer or through the internet using Adobe Flash Media Server or similar.
  • the application for synchronized interactive play-back of multiple spherical videos receives as input all the available spherical videos in FLV format as produced by the available spherical cameras on an event and digital audio in MP3 format.
  • the spherical videos may be stored locally on the computer, in a removable magnetic or optical medium (Hard Disk or DVD) or in a Web Server which makes them available for viewing through Adobe Flash Media Server.
  • The application for synchronized interactive play-back of multiple spherical videos implements the necessary graphics engine using the Sandy3D "open source" 3D graphics library for Flash objects, in association with the Flash video play-back libraries of Adobe Flash.
  • the encoding of the complete application for synchronized interactive play-back of multiple spherical videos is based on the ActionScript 3.0 programming language through Adobe Flash.
  • The graphics engine generates a sphere and projects on its internal surface, in real time, at 30 or 25 fps, as texture, the spherical image that is supplied by the video play-back system.
  • This command generates a sphere with a given radius and polygonal count (Figure 3), which constitute the sphere geometry (segments).
  • the sphere radius is estimated based on the width of the available spherical video image.
  • the polygonal count of the sphere geometry is usually set to 30.
  • the value chosen constitutes an acceptable compromise for the given parameters of computer power, and spherical image resolution.
  • the image projected on the projection window becomes thereafter the image as viewed by the camera based on the above settings, all of which may be altered interactively by the viewer during play-back of the spherical video.
  • the viewer as a result is given the view of the selected virtual camera and is therefore virtually placed on the same position as the position from which the virtual camera looks towards the spherical image.
  • the viewer may rotate the virtual camera interactively to point to any direction inside the sphere, using his/her pointing device and in so doing his/her field of view is altered to reflect the movement of the virtual camera.
  • The co-ordinate system used by the Sandy 3D graphics engine assigns x to the axis going through the camera lens, y to the axis vertical with respect to the camera and z to the lateral axis, sideways from the camera. This distinction is necessary because there is a discrepancy between the projection window and the camera/sphere co-ordinate system (Figure 4).
  • the x-axis of the projection window corresponds to the z-axis of the camera/sphere co-ordinate system, while the y-axis in both systems refers to the vertical axis.
  • As the projection window is two-dimensional, there is no need to reference the third axis, which is the x-axis of the camera/sphere co-ordinate system and refers to forward/backward movement of the camera. Such movements are not allowed in the current application.
  • the viewer watches always the spherical video in a normalized projection, which in effect is the projection of the part of the spherical surface that corresponds to the virtual camera field of view, on a rectangle with the dimensions given above.
  • the viewer may interactively stop or fast forward the play-back of the spherical video, rewind or even start play-back from the beginning using the controls available on the navigational interface. These controls execute the corresponding Flash commands.
  • The PLAY control executes the Flash "play()" command, which commences play-back of whichever spherical video is currently selected.
  • The PAUSE control executes the Flash "pause()" command, which pauses play-back of whichever spherical video is currently selected at the point in time where the control was pressed.
  • The STOP control executes the Flash "stop()" command, which stops play-back of whichever spherical video is currently selected.
  • There is also a SEEK BAR control, which executes the Flash "seek()" command and moves whichever spherical video is currently selected forward or backward in time, in relation to where the pointing device is placed (see the sketch following this list).
  • the control/navigation interface of the application for synchronized interactive play-back of multiple spherical videos is shown in Figure 7.
  • the graphics engine allows the viewer to move his/her view towards any direction, up, down, left or right and to widen or narrow his/her angle of view.
  • These movements are implemented by the co-ordination of two distinct and independent movements/rotations, one of the virtual camera, the other of the sphere itself. Which rotation is activated depends on the movements of the pointing device by the viewer.
  • The sphere and/or the virtual camera are rotated by means of the "rotate" commands of the Flash library, depending on the kind of movement of the pointing device by the viewer. If the pointing device is moved upwards or downwards, along the y-axis of the projection window co-ordinates, then the projection system rotates the virtual camera accordingly around the Z-axis (the lateral axis) of the sphere 3D space co-ordinates, with the "camera.rotateZ" command (Figure 5). If the pointing device is moved to the left or right, along the x-axis of the projection window co-ordinates, then the projection system rotates the sphere accordingly around the Y-axis of the sphere 3D space co-ordinates, with the "sphere.rotateY" command (Figure 6).
  • The projection system alters the field of view of the virtual camera with the "camera.fov" command of the "scenegraph" library of the Sandy 3D graphics engine.
  • When the value of "camera.fov" is increased, the angle of view becomes wider and the viewer appears to move away from the point of interest; when it is decreased, the angle of view becomes narrower and the viewer appears to move closer to the point of interest.
  • the graphical representation of the area where the event took place (scene graphical representation), incorporating the positions of the available spherical cameras with the associated links to the corresponding spherical videos, and the radar-like graphical representation of the angle of view and direction of the selected camera, is initially designed using a graphics package such as Adobe Photoshop.
  • the interactive controls are generated in Flash and are connected to the corresponding spherical video files. Therefore, whenever the viewer selects a different camera by clicking on its image on the scene graphical representation, the corresponding spherical video is assigned as texture for the inside surface of the sphere and the portion of it that corresponds to the selected direction and angle of view of the virtual camera is projected on the projection window.
  • the selected camera image on the scene graphical representation is highlighted and the radar-like display depicts the direction and angle of viewing.
  • Every pixel on the scene graphical representation diagram constitutes an interactive point. Whenever the pointing device is positioned within the scene graphical representation screen area, its relative position is recorded with the "MouseEvent.MOUSE_UP" event of the Flash "events" library and its colour is altered to indicate that it is an active point.
  • This radar graphic follows and recognizes any movement of the pointing device within the spherical video projection window and is rotated around the corresponding camera image to reflect any changes in the direction and viewing angle.
  • The viewer can alter the viewing direction by clicking and dragging the pointing device inside the scene graphical representation window to indicate the direction towards which he/she wishes the camera to be turned.
  • Such movements interactively result in rotations of either the sphere or the virtual camera or both at the same time, in real time and are based on the calculation in real time of the x,y co-ordinates of the position of the pointing device and the differences between initial and final values, in a way similar to the approach described above.
  • the radar graphic in the scene graphical representation window rotates accordingly to reflect these changes.
  • the radar graphic follows his/her movements while at the same time, the projected view in the spherical video projection window is altered by directing the sphere or the virtual camera or both at the same time, to rotate in real time, according to these movements.
  • the viewer during play-back and without interrupting or disrupting it, can interactively choose among different cameras, alter viewing direction and angle of viewing.
  • the viewer using the application based on the method for synchronized interactive play-back of multiple spherical videos or the autonomous product for the interactive play-back of one or more prerecorded events in a computer acts more like a director and not as a passive viewer, having the freedom to control every aspect of his/her viewing experience.
  • the application based on the method for synchronized interactive play-back of multiple spherical videos controls play- back of several spherical videos and synchronized audio at the same time.
  • The application commences play-back of the first assigned spherical video with the command "play(video1)", synchronized with the corresponding sound, which is played back using the command "sound.play()".
  • Play-back of the spherical video corresponding to the newly selected camera commences from this point in time using the "ns.seek(time/1000)" command where ns is the NetStream to which the video is connected.
  • The key to the method for establishing and maintaining synchronization in the play-back of multiple spherical videos and the corresponding sound is to commence play-back of the new spherical video not from the point in time where the previous spherical video was paused, but from the point in time which the sound has reached when play-back of the newly selected spherical video is ready to commence, as illustrated in the sketch after this list.
  • The performance of the application based on the method for synchronized interactive play-back of multiple spherical videos depends on the capabilities of the computer which runs the application, the characteristics and performance of the graphics card, the amount of available memory and several other factors which influence the play-back of the spherical videos and sound and, to varying degrees, introduce delays in the play-back process, which would inevitably lead to synchronization loss. For this reason, the application at predetermined time intervals re-synchronizes the spherical video projected at these particular times with the sound, by jumping to the spherical video frame corresponding to the sound.
  • the autonomous product for the interactive play-back of prerecorded events offers to the viewer the ability to repeatedly play-back an event produced using the method for synchronized interactive play-back of multiple spherical videos, on any suitable computer without the need to be connected to the internet.
  • Each product is produced for one or more specific events and includes all the spherical videos and the digital audio corresponding to the particular event(s), the application for synchronized interactive play-back of multiple spherical videos and a graphical interface for navigation and control of the application, designed especially for the particular event(s).
  • the viewer through this interface interactively controls the play-back of the event which is projected on his/her computer screen.
  • the viewer watches the play-back while at the same time he/she is given a graphical overview of the event area, by means of the scene graphical representations described above.
  • the viewer has the capability to interactively choose whichever spherical video they wish to watch from the available alternatives.
  • This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position as each spherical video corresponds to a different spherical camera which had been positioned on a predetermined position in the area in which the event took place.
  • In addition to choosing the position from which the event is viewed, the viewer, during the play-back of the event and without interrupting or disrupting it, can interactively alter the angle and direction of viewing as if they were turning their head to whichever direction they wished, as well as approach and move away from the point of interest (zoom-in and zoom-out).
  • the viewer using the autonomous product for the interactive play-back of one or more prerecorded events in a computer acts more like a director and not as a passive viewer, having the freedom to control every aspect of his/her viewing experience.
  • An example of the use of the integrated system, method and application for the synchronized interactive play-back of multiple spherical video content of 360°x360° angle of view and the autonomous product for the interactive playback of prerecorded events is the recording and reproduction of a concert.
  • Several spherical cameras, typically four, are placed in carefully chosen positions on and around the concert stage. These cameras record full spheres around them, thereby capturing 100% of the event.
  • a digital audio recording system is used to record the sound of the event in uncompressed high quality WAV format.
  • the spherical video sequences are stored in large capacity and high reliability hard disks in the native RAW spherical video format.
  • the spherical videos are converted to AVI format, while the audio is converted to MP3 format.
  • The AVI files are converted to FLV format and their resolution reduced to 1024 x 512 pixels.
  • the FLV files are uploaded to a WEB server where Flash Media Server is installed. The viewer connects to the web site where the event is uploaded and is presented with the application for the synchronized interactive play-back of multiple spherical video content, using which he/she may watch the event play-back with full interactive control as described earlier, from within his/her web browser application.
  • the viewer has total freedom to decide on which aspect of the concert he/she wishes to focus on. He/she may move his/her view point continuously in order to follow one particular performer, or even choose to assess the reactions of the crowd. He/she may focus on one musician in order to study and analyze his/her technique for as long as he/she wishes. Moreover, the viewer may choose to watch the event from different positions by virtually moving to different places in the area. While he/she applies such control, play-back remains continuous and synchronized to the sound and the flow of activity is not disrupted nor interrupted in any sense.
  • the viewer is no longer passive but interacts with the play-back through an interactive application that gives the impression that the viewer's experience of the event is not restricted by the decisions of others. If the viewer decides to repeat the play-back of any part or even the full event, he/she may choose to watch it in a completely different way every time.
  • the corresponding control and navigational interface is designed to suit the particular event.
  • The scene graphical representation is also designed to reflect the scene of the event, and the full application is encoded in HTML to create a fully integrated interactive application.
  • the application for the synchronized interactive play-back of multiple spherical video content and all the corresponding spherical video and digital audio files are also included and a Master of the DVD is produced which then may be reproduced (replicated) in whatever quantity is desired by standard industrial processes.
  • the resulting product could be distributed on its own or in combination with other products (e.g combined with a CD or conventional DVD of the event).
  • the owner of the product may play it on any computer with full interactive control and freedom as described above.
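
To illustrate the transport controls and the synchronization method described in the list above, the following ActionScript 3.0 sketch shows how a camera switch can be locked to the audio clock. It is a minimal sketch, not the actual application code: the variable names, the helper structure and the five-second re-synchronization interval are illustrative assumptions; only the underlying Flash calls (NetStream play-back control, Sound/SoundChannel play-back, and seeking the new video to "time/1000" seconds where "time" is the sound position in milliseconds) follow the text.

    // Minimal sketch (ActionScript 3.0): transport controls and audio-clock
    // synchronization when switching between spherical videos. Names and the
    // re-sync interval are illustrative assumptions.
    import flash.events.TimerEvent;
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.net.NetStream;
    import flash.utils.Timer;

    var streams:Vector.<NetStream>;   // one NetStream per spherical camera, each assumed
                                      // already opened with play() and paused
    var activeIndex:int = 0;          // index of the currently projected spherical video
    var soundChannel:SoundChannel;    // the event audio acts as the master clock

    function startPlayback(sound:Sound):void {
        streams[activeIndex].resume();   // PLAY: commence play-back of the selected video
        soundChannel = sound.play();     // "sound.play()": audio starts in parallel
    }

    function pausePlayback():void {
        streams[activeIndex].pause();    // PAUSE control
    }

    // Camera switch: do not resume the new video from where the old one was
    // paused, but seek it to the point the sound has reached (the key idea above).
    function switchCamera(newIndex:int):void {
        streams[activeIndex].pause();
        activeIndex = newIndex;
        var time:Number = soundChannel.position;   // sound position in milliseconds
        streams[activeIndex].seek(time / 1000);    // "ns.seek(time/1000)": seconds
        streams[activeIndex].resume();
    }

    // Periodic re-synchronization: at fixed intervals, jump the projected video
    // to the frame corresponding to the current sound position.
    var resyncTimer:Timer = new Timer(5000);       // the 5 s interval is an assumption
    resyncTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        streams[activeIndex].seek(soundChannel.position / 1000);
    });
    resyncTimer.start();

Since seeking in an FLV stream lands on the nearest key frame, encoding the spherical videos with a key frame every 25 or 30 frames (roughly one per second, as stated above) keeps such jumps close to the sound position.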

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The integrated system, method and application for the synchronized interactive play-back of multiple spherical videos captures events in spherical video format of 360°x360° angle of view from different positions in space and allows their interactive play-back on a computer or through the internet. The autonomous product allows the interactive play-back of prerecorded events from a DVD. The viewer, during play-back and without interrupting or disrupting it, can interactively alter the position from which the event is viewed by choosing which spherical video he/she wishes to watch among the several available, corresponding to different places in the area where the event took place; alter the direction and angle of viewing within the full 360°x360° angle of view; rotate (turn around) his/her viewing window; and approach (zoom-in) and retreat (zoom-out), while the corresponding sound remains synchronized with the projected images.

Description

DESCRIPTION
Integrated system, method and application for the synchronized interactive play-back of multiple spherical video content and autonomous product for the interactive play-back of prerecorded events.
The integrated system, method and application for the synchronized interactive play-back of multiple spherical video content on a computer or through the internet involves the synchronous recording of events in multiple spherical videos from different positions in the area where the event takes place, and the interactive play-back of the recorded event on a computer or through the internet by means of an interactive application for the synchronized play-back of multiple spherical videos, which offers to the viewer, during the play-back and without interrupting or disrupting it, the capability to interactively choose whichever spherical video they wish to watch from the available alternatives. This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position, as each spherical video corresponds to a different spherical camera which had been positioned on a predetermined position in the area in which the event took place. In addition to choosing the position from which the event is viewed, the viewer, during the play-back of the event and without interrupting or disrupting it, can interactively alter the angle and direction of viewing as if they were turning their head to whichever direction they wished, as well as approach and move away from the point of interest (zoom-in and zoom-out).
The autonomous product for the interactive play-back of prerecorded events refers to a product in the form of a DVD which operates autonomously on a computer and allows the interactive play-back of an event that has been recorded and produced with the above integrated system. It utilizes an interactive application for the synchronized play-back of multiple spherical videos, in association with a graphical interface. The viewer, during the play-back and without interrupting or disrupting it, is offered the capability to interactively choose whichever spherical video they wish to watch from the available alternatives. This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position, as each spherical video corresponds to a different spherical camera which had been positioned on a predetermined position in the area in which the event took place. In addition to choosing the position from which the event is viewed, the viewer, during the play-back of the event and without interrupting or disrupting it, can interactively alter the angle and direction of viewing as if they were turning their head to whichever direction they wished, as well as approach and move away from the point of interest (zoom-in and zoom-out).
For the complete recording of an event using conventional video technologies, a number of video cameras are usually required. Those cameras are either stationary, positioned in preselected positions of perceived increased interest, or movable. They can be moved and rotated around the vertical and lateral horizontal axes either by trained personnel (cameramen) or by mechanical means, in order to focus on and approach selected points of interest. In all circumstances, the width of view that the video cameras can record is limited, despite the availability of a wide range of different lenses, from extreme zooms to extreme wide angles.
Although a great number of conventional video cameras is usually used to record an event, it is practically not feasible to record everything that takes place. After the recording of the event, specific sections (scenes) considered to be the most interesting or characteristic from each of the individual videos from each camera are selected and edited to produce a linear video sequence which finally represents the recorded event. The finished content can be replicated on a variety of media in order to be distributed and reproduced on specific players, transmitted or streamed through the internet and viewed on a suitable screen, e.g. a TV.
When an event is transmitted live, the responsible event-director watches, while the event takes place, all the available images corresponding to all the available cameras on the event and chooses on the spot which image is transmitted at any time, usually the view from one camera at any given time. This is the view that the viewer watches in a passive way on their TV screen or on their computer screen streamed through the internet.
In both above scenarios, due both to the limitation in the angle of view that conventional video cameras transmit or record and to the manner in which the subject to be transmitted or edited at any time is chosen, many aspects of the event are lost, even though viewers might wish to view them. As an example, if it is assumed that the event recorded or transmitted is a concert, the director has to select what to show at any time, from all the different activities taking place at the same time, excluding the rest. As a result, for as long as the director chooses to focus on the singer of the group, a viewer who might have wished to watch the guitarist or drummer has no option to do so. Further still, as often happens, especially in sports, when something of special interest happens, if none of the available cameras is focused on the specific location, the incident is not recorded at all. In car racing, where instances of interest are unpredictable, such incidents are often missed. In conclusion, in the viewing of conventional video content, either live or prerecorded, the viewer is presented with a predetermined sequence of images which they can only view passively, without any capability to influence the play-back. The view point is predetermined by the positioning of the cameras, and the content and subject presented at any time is preselected by the director or the editor.
With the integrated system, the method and application for the synchronized interactive play-back of multiple spherical videos on a computer or through the internet and the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, every activity and individual happening which takes place during an event is recorded in very high resolution digital video format by means of two or more digital spherical video recording devices covering an angle of view of 360°x360° (spherical cameras). These devices record digitally on a single image a full sphere surrounding their position. Each such device produces a different spherical representation of the event. Spherical video consists of a sequence of single spherical images (frames), each capturing an angle of view of 360°x360° (Figure 1). Therefore, each frame of the spherical video constitutes a panoramic image which represents 360° on the horizontal and 360° on the vertical axis. Depending on the physical dimensions of the area where an event takes place, two or more spherical cameras may be used, positioned on pre-determined and preselected positions, aiming to cover every aspect of the event area and, as a result, capture totally every happening, activity and instance that takes place during the event. With two or more spherical cameras (typically four) placed in strategically chosen positions of the event area, 100% of the activities happening on any point of the event area at any time during the event are recorded. At the same time, the sound from the event is recorded in digital format. The spherical videos from every spherical camera and the corresponding sound are played back synchronized, with full interactive user control, on a computer or through the internet and are projected in a normalized projection at a rate of 25 or 30 frames per second (depending on the local video system standard, PAL or NTSC) on a computer screen.
The viewer has the capability and freedom, at any point in time during the play-back, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed, by choosing which spherical video he/she wishes to watch at any given time, selecting among the several spherical videos available, each one corresponding to one of the several different spherical cameras which had been placed on preselected positions in the area where the event took place. In this way, the viewer selects to virtually move between the places where the cameras were positioned and watch the event from these points. At the same time, the viewer has total freedom to choose the direction and width (angle) of viewing from the available full angle of view of 360°x360°, and also total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.
Comparing the integrated system, the method and application for the synchronized interactive play-back of multiple spherical videos of 360°x360° angle of view on a computer or through the internet and the autonomous product for the interactive play-back of prerecorded events in the form of a DVD to conventional event recording, transmission and play-back technologies: in the former, the viewer has at his/her disposal, locally or through the internet, multiple alternative full spherical video sequences from an event. Each of these represents a full surrounding sphere, so that every action, instance or happening in all the area where the event takes place is captured or transmitted during the full duration of the event, each sequence corresponding to one of the several spherical cameras that had been used to capture the event. In contrast, in a conventional recording or transmission of the same event, the viewer would have at his/her disposal only the linear sequence of individual and, by necessity, partial scenes-instances which the event director would have chosen to include in the final presentation of the event.
During the synchronized play-back of an event and without interrupting or disrupting this play-back, the viewer has the ability, whenever he/she chooses, to interactively select any one among the available spherical videos, each one of which has captured the full event duration, and watch the event from the position corresponding to the particular place where the particular spherical camera had been positioned. He/she may watch the event from the selected point for as long as he/she wishes, having the capability to focus his/her attention on any particular point or subject in space, by interactively altering his/her direction of view (looking around), turning left or right, up or down, in a range of 360° in all directions (both y- and z-axes) using the interactive rotation controls, and approaching or retreating using the interactive zoom-in and zoom-out controls (moving along the x-axis). In this way, he/she alters interactively the direction and angle of view of the viewing window. In contrast, in a conventional play-back of video material, the viewer would have had no choice with regard to the image presented to him/her, apart from moving forward or backward in time by means of controls such as rewind, fast forward, previous and next. Even in the restricted cases in which the viewer is offered control of alternative angles (e.g. multi-angle DVDs), he/she still has no choice on the actual camera movements which dictate the viewing direction and angle of view, as all these would have been fixed during the recording of the event.
With the method and the application for the synchronized interactive play-back of multiple spherical video content of 360°x360° angle of view on a computer or through the internet and the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, the viewer has full freedom to repeat the play-back of an event while choosing to watch it from a completely different view point (choosing a spherical video from a different spherical camera), but also from a different viewing angle, as he/she has the freedom to control interactively the direction and angle of view presented to him/her. In this way, the viewer in every repetition of the play-back of the event has the option to focus on a different point of interest of the same event. In contrast, in a conventional video play-back, no matter how many times the viewer watched the video, he/she would have watched the same material, as the content is fixed and offers no capability of interactive viewpoint control. All camera movements, no matter how elaborate, would have been fixed, and the image content and focus would also have been predetermined with no option to alter them. If a particular action or happening was not within the viewing range of one of the available conventional cameras, then it would not have been recorded and there would have been no way to recreate it.
During the synchronized play-back of an event and without interrupting or disrupting this play-back, the viewer has the ability, whenever he/she chooses, to interactively select a different spherical video to be projected to him/her. In this way he/she is moving his/her position in space and is transferred virtually to the place where the chosen spherical camera was positioned during the event, while the time-continuity of the play-back is not disrupted. In contrast, in a conventional video play-back, the viewer would have no ability to alter his/her position, as this would have been fixed by the event director who decides which image from which camera is included in the final linear sequence that is presented to the viewer.
Therefore, with the integrated system, the method and the application for the synchronized interactive play-back of multiple spherical videos of 360°x360° angle of view on a computer or through the internet and the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, the viewer has total freedom to choose what he/she will watch, from which point in space and towards which direction to focus his/her attention, as he/she has total freedom to choose, interactively, at any time, to move to a different position in space (by selecting a different spherical video), to turn his/her viewpoint around in any direction, and to approach or retreat, while maintaining control of the temporal play-back. The experience offered to the viewer by watching an event using an application based on the method for the synchronized interactive play-back of multiple spherical video content of 360°x360° angle of view on a computer or through the internet, or by using the autonomous product for the interactive play-back of prerecorded events in the form of a DVD, approximates the experience of him/her actually being in the area where the event took place, free to move around while turning his/her view to focus on points of interest at will. The viewer is no longer a passive viewer of an event but interacts with and controls the re-enactment of the event through the use of an interactive application. The viewer is no longer dependent on the choices made by somebody else (the director) regarding what to watch and what not to watch, as in conventional video play-back; instead, he/she actively assumes the role of the director.
The integrated system for the synchronized interactive play-back of multiple spherical videos on a computer or through the internet is based on the method that utilizes multiple digital video sensors arranged in a predefined pattern for the capture of individual conventional videos that are then combined (stitched) together to produce a spherical video. Such a device is the LadyBug 2 spherical camera from Point Grey Research Inc. (Figure 2). Each spherical camera integrates six digital video sensors (the LadyBug 2 integrates six Sony ICX204AQ 1/3" 1024x768 progressive scan CCDs). Each individual sensor captures images with a resolution of 1024 x 768 pixels at a rate of up to 30 frames per second (fps), with sensitivity (gain) between 0dB and 25dB and with shutter speeds between 0.06ms and 34ms. Five sensors are arranged around a horizontal circular pattern in a portrait orientation. In this configuration each sensor covers an angle of view wider than 72° on the horizontal axis and produces images of 1024 pixels vertical by 768 pixels horizontal resolution. The five sensors together produce a composite image with a horizontal resolution of 3840 pixels (768 pixels x 5 images). Due to the arrangement of the five sensors, parts of the neighbouring images along the vertical edges are common, and therefore some overlap is allowed, which results in better stitched video images. Stitching together the five individual video images from the five radial sensors, allowing for the necessary overlap, results in composite video images with an angle of view of 360° on the horizontal axis and a horizontal resolution of 3500 pixels. The sixth sensor is positioned to aim vertically and produces the top portion (dome) of the spherical image.
The bottom part of the image does not present imagery of any interest, as it would record the support of the camera, and is therefore not captured. The stitched video images, at a rate of up to 30 fps, are sent from the spherical video camera via optical fiber with a transmission speed of 1.2Gbps to a compressor unit, where they are compressed in real time in the JPEG standard with a variable compression rate; the compressed spherical images are then transmitted via a FireWire 1394b connection with a transmission speed of 800Mbps to a computer, where they are either stored in a native RAW format or converted to digital video format, stored and transmitted. When spherical video is captured for PAL countries, 25fps are captured, while for NTSC the rate is set to 30fps. The individual spherical RAW images are combined into 5-second long AVI video sequences with an image aspect of 2:1 and 3500 x 1750 pixels horizontal to vertical resolution. These images represent the full surrounding sphere on an orthogonal projection. In parallel to the capture of the spherical video images, the sound of the event is recorded using digital recording technologies in two or more discrete channels and stored in WAV format. The sound is synchronized with the images and combined with them to produce spherical videos with embedded audio, which are stored in QuickTime format encoded with the Sorenson 3 algorithm. The audio is also stored separately in MP3 format at 128 kbps, 44.1 kHz stereo.
In order to use the spherical video sequences in the method for synchronized interactive play-back of multiple spherical videos, they are converted into FLV format and their resolution is reduced to 1024x512 pixels, with the quality setting at 1000 kbps and a key frame every 30 (or 25) frames. The resulting spherical videos (one from each spherical camera), in order to be played back, have to be stored on a computer or a removable medium (DVD) or be made available for streaming through the internet. These spherical videos are played back in a normalized projection, in a similar way as conventional videos, projecting part of the sphere in a viewing window and offering the viewer interactive control over which part of the sphere to view at any point in time. The application for the synchronized interactive play-back of multiple spherical videos offers the viewer the capability to select interactively which spherical video he/she wishes to be projected and to switch to alternative spherical videos while play-back is taking place, while throughout the play-back the sound remains synchronized to the projected spherical video images.
The application for synchronized interactive play-back of multiple spherical videos presents to the viewer a navigational/control interface (Figure 8), with a window in which the selected spherical video is played back in a normalized projection, in a similar fashion as a conventional video, at a frame rate of 30 or 25 frames per second, depending on the territorial video standard (NTSC or PAL). By positioning his/her pointing device (e.g. computer mouse) within this projection window, the viewer may control interactively what is projected. He/she may turn around, altering the direction of viewing towards any point in any direction in a 360 degree range. He/she may alter the angle of viewing using zoom-in and zoom-out controls. This is similar to having the capability to move a virtual camera around its position within a 3D real space. Adjacent to the actual spherical video projection window, the application interface presents the viewer with a graphical representation of the actual area where the event took place. This graphical representation of the actual scene of the event may be anywhere from very simple to very elaborate, without affecting the operation and performance of the application. On this scene graphical representation the positions of the spherical cameras which were used to record the event are shown. By clicking on any of these cameras, the viewer selects it and play-back in the main projection window continues synchronized to the sound, but now displaying the spherical video corresponding to the selected camera. Video play-back always remains synchronized to the sound, and play-back following a change of cameras continues from the correct point in time. When a camera is selected, this is indicated also on the scene graphical representation by means of a colour change and by displaying an additional radar-like graphic that depicts the direction and angle of viewing of the selected camera. This graphic changes interactively with every change effected by the viewer. When the direction is changed, the radar cone moves to show the new direction. When the angle of viewing changes, the radar cone becomes wider or narrower, depending on the alteration effected. Moreover, the viewer, using his/her pointing device on this representation, may interactively and directly alter the direction of viewing by dragging this graphic towards the desired point on the scene graphical representation.
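
The pointer-driven rotations and zoom just described (and itemized in the Definitions list above: vertical pointer movement pitches the virtual camera via "camera.rotateZ", horizontal movement spins the sphere via "sphere.rotateY", and zooming adjusts "camera.fov") could be wired up along the following lines. This is a sketch only: the event wiring, the sensitivity factor and the field-of-view limits are illustrative assumptions, while the rotation and fov properties are those named in the text.

    // Sketch (ActionScript 3.0): mapping pointer movement in the projection
    // window to the two rotations and the zoom. Sensitivity and fov limits
    // are illustrative assumptions.
    import flash.display.Sprite;
    import flash.events.MouseEvent;
    import sandy.core.scenegraph.Camera3D;
    import sandy.core.scenegraph.Shape3D;

    var projectionWindow:Sprite = new Sprite();  // the spherical video projection window
    var camera:Camera3D;   // virtual camera at the sphere centre (from the set-up sketch below)
    var sphere:Shape3D;    // the textured sphere (from the set-up sketch below)

    var dragging:Boolean = false;
    var lastX:Number = 0, lastY:Number = 0;
    const SENSITIVITY:Number = 0.3;              // degrees of rotation per pixel, assumed

    projectionWindow.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
        dragging = true; lastX = e.stageX; lastY = e.stageY;
    });
    projectionWindow.addEventListener(MouseEvent.MOUSE_UP, function(e:MouseEvent):void {
        dragging = false;
    });
    projectionWindow.addEventListener(MouseEvent.MOUSE_MOVE, function(e:MouseEvent):void {
        if (!dragging) return;
        // Left/right movement (window x-axis) spins the sphere about its vertical
        // axis; up/down movement pitches the camera about its lateral axis.
        sphere.rotateY += (e.stageX - lastX) * SENSITIVITY;   // "sphere.rotateY"
        camera.rotateZ += (e.stageY - lastY) * SENSITIVITY;   // "camera.rotateZ"
        lastX = e.stageX; lastY = e.stageY;
    });

    // Zoom: narrowing the field of view approaches the point of interest,
    // widening it retreats ("camera.fov"); the limits are assumed.
    function zoomIn():void  { camera.fov = Math.max(10, camera.fov - 5); }
    function zoomOut():void { camera.fov = Math.min(90, camera.fov + 5); }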
The image projected in the main projection window follows every change in real time, without any disruption or interruption of the play-back. In this way, the graphical representation offers an additional, very useful and viewer-friendly navigational and control tool. The application also has the capability to highlight the positions of the spherical cameras within the projection window itself, superimposing graphical elements on the projected spherical video as it is played back. Therefore, as the viewer rotates his/her view, when the position of another spherical camera comes within the viewport, a graphic element makes this apparent to the viewer by highlighting its relative position. This graphic element also serves as a control link: by clicking on it, the viewer may move his/her position to the point where the corresponding camera was placed, altering the projected spherical video accordingly.
As a result, the viewer, from within the application for synchronized interactive play-back of multiple spherical videos, may continuously alter his/her viewing position, direction and angle of view, in a direct and interactive fashion not offered by any other technology currently in use for the play-back of recorded events. The application undertakes and controls the play-back of spherical video on a computer by feeding the projection system with the corresponding spherical video images. As spherical videos have a much higher resolution than conventional video content, the play-back system places an increased load on the computer compared to conventional video play-back applications. Nevertheless, spherical videos of reduced resolution (e.g. 1024x512 pixels) can be played back on medium-powered computers of current technology. As computer technology will undoubtedly continue to advance, the method and application for synchronized interactive play-back of multiple spherical videos is neither dependent on nor restricted by the resolution of the spherical videos involved.
The application based on the method for synchronized interactive play-back of multiple spherical videos utilizes the Adobe Flash libraries to implement the play-back of spherical videos locally on a computer or through the internet using Adobe Flash Media Server or similar. It receives as input all the available spherical videos in FLV format, as produced by the spherical cameras available at an event, together with the digital audio in MP3 format. The spherical videos may be stored locally on the computer, on a removable magnetic or optical medium (hard disk or DVD), or on a web server which makes them available for viewing through Adobe Flash Media Server. The application implements the necessary graphics engine using the Sandy 3D open-source 3D graphics library for Flash objects, in association with the Flash video play-back libraries of Adobe Flash. The complete application is coded in the ActionScript 3.0 programming language through Adobe Flash. These technologies enable the application for synchronized interactive play-back of multiple spherical videos to be executed locally, from a removable medium (HD or DVD), or, embedded in the web pages of a web site, on any computer that connects to that site.
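By way of illustration, the following minimal ActionScript 3.0 sketch shows how one spherical FLV stream could be opened for play-back with the standard Flash NetConnection, NetStream and Video classes; the file name "camera1.flv" is a hypothetical example, and the buffering and error handling of the full application are omitted.

    // Minimal sketch: open one spherical FLV stream for local/progressive play-back.
    // For streaming through Adobe Flash Media Server, nc.connect() would instead be
    // given an rtmp:// URL (an assumption; not specified at this level of detail).
    var nc:NetConnection = new NetConnection();
    nc.connect(null); // null = progressive download or local file play-back

    var ns:NetStream = new NetStream(nc);
    ns.client = { onMetaData: function(info:Object):void {} }; // absorb metadata callbacks

    var video:Video = new Video(1024, 512); // matches the reduced spherical resolution
    video.attachNetStream(ns);
    // The Video object is not displayed directly; it serves as the source for the
    // sphere's VideoMaterial texture described below.
    ns.play("camera1.flv");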
The graphics engine generates a sphere and projects onto its internal surface, in real time at 30 or 25 fps, as a texture, the spherical image that is supplied by the video play-back system. This is implemented with the "sphere = new Sphere('sphere', radius, segmentsW, segmentsH)" command of the "primitive" library of the Sandy 3D graphics engine. This command generates a sphere with a given radius and polygon count (Figure 3), which constitute the sphere geometry (segments). The sphere radius is estimated based on the width of the available spherical video image. The polygon count of the sphere geometry is usually set to 30. The bigger this number, the better the quality of the projected image, but at the same time the generation of the sphere becomes slower, which would inevitably lead to delays in the projection; the value chosen therefore constitutes an acceptable compromise for the given parameters of computer power and spherical image resolution. Using the "material = new VideoMaterial(video, updateMS)" command of the "materials" library of the Sandy 3D graphics engine, the spherical video to be projected is assigned as the material (VideoMaterial) for the sphere, and with the "app = new Appearance(material)" and "sphere.appearance = app" commands of the same library, the spherical image is assigned as the texture for the sphere. Next, the Sandy 3D graphics engine creates a virtual camera using the command "camera = new Camera3D(width, height, fov)", where "width" and "height" are the relative dimensions of the projection window and "fov" is the field of view (angle of view) of the virtual camera in degrees. The virtual camera is positioned at the centre of the sphere (Figure 3) using the command "camera.x = camera.y = camera.z = 0" of the Sandy 3D graphics engine (values of zero are assigned to all the virtual camera co-ordinates (x, y, z)).
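Collected into a single sketch, and assuming the Video object from the play-back sketch above, the sphere set-up described by these commands could look as follows; the radius formula and the projection window dimensions are illustrative assumptions, and the wiring of the sphere and camera into a Sandy 3D scene is omitted.

    // Sketch of the sphere set-up using the Sandy 3D commands quoted above.
    var radius:Number = videoWidth / (2 * Math.PI); // one plausible estimate from the image width
    var sphere:Sphere = new Sphere("sphere", radius, 30, 30); // 30 segments: quality/speed compromise

    var material:VideoMaterial = new VideoMaterial(video, 40); // refresh texture every 40 ms (~25 fps)
    var app:Appearance = new Appearance(material);
    sphere.appearance = app; // each spherical video frame becomes the inside texture

    var camera:Camera3D = new Camera3D(640, 480, 45); // 640x480 window, 45 degree field of view
    camera.x = camera.y = camera.z = 0; // virtual camera at the centre of the sphere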
The image shown in the projection window thereafter becomes the image as viewed by the virtual camera under the above settings, all of which may be altered interactively by the viewer during play-back of the spherical video. The viewer is thus given the view of the virtual camera and is therefore virtually placed at the position from which the virtual camera looks towards the spherical image. Using his/her pointing device, the viewer may rotate the virtual camera interactively to point in any direction inside the sphere, and in doing so his/her field of view is altered to reflect the movement of the virtual camera.
The co-ordinate system used by the Sandy 3D graphics engine assigns x to the axis going through the camera lens, y to the vertical axis with respect to the camera, and z to the lateral axis, sideways from the camera. This distinction is necessary because there is a discrepancy between the projection window and the camera/sphere co-ordinate system (Figure 4). As a result, the x-axis of the projection window corresponds to the z-axis of the camera/sphere co-ordinate system, while the y-axis refers to the vertical axis in both systems. As the projection window is two-dimensional, there is no need to reference the third axis, which is the x-axis of the camera/sphere co-ordinate system and refers to forward/backward movement of the camera. Such movements are not allowed in the current application.
The viewer always watches the spherical video in a normalized projection, which in effect is the projection of the part of the spherical surface that corresponds to the virtual camera field of view onto a rectangle with the dimensions given above. The viewer may interactively stop or fast-forward the play-back of the spherical video, rewind, or even restart play-back from the beginning, using the controls available on the navigational interface. These controls execute the corresponding Flash commands. The PLAY control executes the Flash "play()" command, which commences play-back of whichever spherical video is currently selected. The PAUSE control executes the Flash "pause()" command, which pauses play-back of the currently selected spherical video at the point in time where the control was pressed. The STOP control executes the Flash "stop()" command, which stops play-back of the currently selected spherical video. There is also a SEEK BAR control, which executes the Flash "seek()" command and moves the currently selected spherical video forward or backward in time, according to where the pointing device is placed. The control/navigation interface of the application for synchronized interactive play-back of multiple spherical videos is shown in Figure 7.
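A minimal sketch of how such controls could be wired to the underlying NetStream is given below; since NetStream itself exposes pause(), resume() and seek() rather than a stop() method, STOP is approximated here by pausing and returning to the start, and the button names and the duration variable (taken from the FLV metadata) are assumptions.

    // Transport controls wired to the NetStream feeding the spherical video texture.
    playBtn.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        ns.resume(); // continue play-back of the currently selected spherical video
    });
    pauseBtn.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        ns.pause(); // freeze at the current point in time
    });
    stopBtn.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        ns.pause();
        ns.seek(0); // approximation of STOP: pause and return to the beginning
    });
    seekBar.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        // map the pointer position along the bar to a time offset within the video
        ns.seek(duration * (seekBar.mouseX / seekBar.width));
    });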
At the same time, the graphics engine allows the viewer to move his/her view in any direction, up, down, left or right, and to widen or narrow his/her angle of view. These movements are implemented by the co-ordination of two distinct and independent movements/rotations: one of the virtual camera, the other of the sphere itself. Which rotation is activated depends on the movements of the pointing device by the viewer. Left-clicking the pointing device inside the projection window area stores the initial values of the x- and y-axis co-ordinates of the pointing device, using the "mousePosition.x = stage.mouseX" and "mousePosition.y = stage.mouseY" commands of the Flash library. As the pointing device moves on a two-dimensional plane (the viewing window on the computer screen), there is no movement on the z-axis. Every subsequent movement of the pointing device by the viewer, while the left click is maintained (click and drag), results in continuous calculation of the differences from the initial co-ordinate values, with the "(mousePosition.y - stage.mouseY)" and "(mousePosition.x - stage.mouseX)" expressions, where mousePosition.x and mousePosition.y are the initial co-ordinate values while stage.mouseX and stage.mouseY are the final (current) co-ordinate values. The resulting differences lead to the corresponding rotation of either the virtual camera or the sphere. The sphere and/or the virtual camera are rotated by means of the "rotate" commands of the Flash library, depending on the kind of movement of the pointing device by the viewer. If the pointing device is moved upwards or downwards, along the y-axis of the projection window co-ordinates, the projection system rotates the virtual camera accordingly around the x-axis of the sphere 3D space co-ordinates, with the "camera.rotateZ" command (Figure 5). If the pointing device is moved to the left or right, along the x-axis of the projection window co-ordinates, the projection system rotates the sphere accordingly around the y-axis of the sphere 3D space co-ordinates, with the "sphere.rotateY" command (Figure 6).
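The click-and-drag logic just described can be condensed into the following sketch; the sensitivity factor SPEED is an assumed tuning constant, and projectionWindow stands for the display object containing the projection window.

    // Click-and-drag: vertical drags tilt the virtual camera, horizontal drags spin the sphere.
    var mousePosition:Point = new Point();
    const SPEED:Number = 0.3; // assumed sensitivity (degrees of rotation per pixel of drag)

    projectionWindow.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
        mousePosition.x = stage.mouseX; // store the initial pointer co-ordinates
        mousePosition.y = stage.mouseY;
        stage.addEventListener(MouseEvent.MOUSE_MOVE, onDrag);
        stage.addEventListener(MouseEvent.MOUSE_UP, onUp);
    });

    function onDrag(e:MouseEvent):void {
        camera.rotateZ += (mousePosition.y - stage.mouseY) * SPEED; // up/down drag tilts the camera
        sphere.rotateY += (mousePosition.x - stage.mouseX) * SPEED; // left/right drag spins the sphere
        mousePosition.x = stage.mouseX; // the current position becomes the new reference
        mousePosition.y = stage.mouseY;
    }

    function onUp(e:MouseEvent):void {
        stage.removeEventListener(MouseEvent.MOUSE_MOVE, onDrag);
        stage.removeEventListener(MouseEvent.MOUSE_UP, onUp);
    }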
When the viewer activates the ZOOM-IN or ZOOM-OUT controls, aiming to narrow or widen his/her angle of view, the projection system alters the field of view of the virtual camera through the "camera.fov" property of the "scenegraph" library of the Sandy 3D graphics engine. Executing the "camera.fov++" command, the angle of view becomes wider and the viewer appears to move away from the point of interest, while executing the "camera.fov--" command, the angle of view becomes narrower and the viewer appears to move closer to the point of interest.
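In sketch form, with assumed limits to keep the field of view within a usable range:

    // ZOOM controls adjust the virtual camera field of view (fov, in degrees).
    zoomOutBtn.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        if (camera.fov < 90) camera.fov++; // wider angle: viewer appears to move away
    });
    zoomInBtn.addEventListener(MouseEvent.CLICK, function(e:MouseEvent):void {
        if (camera.fov > 10) camera.fov--; // narrower angle: viewer appears to move closer
    });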
The graphical representation of the area where the event took place (the scene graphical representation), incorporating the positions of the available spherical cameras with the associated links to the corresponding spherical videos, together with the radar-like graphical representation of the angle of view and direction of the selected camera, is initially designed using a graphics package such as Adobe Photoshop. The interactive controls are generated in Flash and are connected to the corresponding spherical video files. Therefore, whenever the viewer selects a different camera by clicking on its image on the scene graphical representation, the corresponding spherical video is assigned as the texture for the inside surface of the sphere, and the portion of it that corresponds to the selected direction and angle of view of the virtual camera is projected in the projection window. At the same time, the selected camera image on the scene graphical representation is highlighted and the radar-like display depicts the direction and angle of viewing. Every pixel on the scene graphical representation diagram constitutes an interactive point; whenever the pointing device is positioned within the scene graphical representation screen area, its relative position is recorded with the "MouseEvent.MOUSE_UP" command of the Flash "event" library and its colour is altered to indicate that it is an active point. If the viewer clicks over the position of a camera other than the currently selected one, this action is recognized in real time with the "MouseEvent.MOUSE_DOWN" command of the Flash "event" library, and the projection system ceases the projection of the spherical video that was being played back until that moment and starts projecting, in the projection window, the spherical video corresponding to the selected camera, from the current position in time, thereby maintaining temporal continuity and synchronization with the sound. These actions are implemented using the "ns.stop()" and "ns.play(video2)" commands, while at the same time the "radar.visible = true" command is invoked to make the radar graphic visible and active for the newly selected camera on the scene graphical representation window. This radar graphic follows and recognizes any movement of the pointing device within the spherical video projection window and is rotated around the corresponding camera image to reflect any changes in the direction and viewing angle. In addition to moving the pointing device within the projection window area, the viewer can alter the viewing direction by clicking and dragging the pointing device inside the scene graphical representation window to indicate the direction towards which he/she wishes the camera to be turned. Such movements interactively result in rotations of the sphere, the virtual camera, or both at the same time, in real time, based on the real-time calculation of the x, y co-ordinates of the position of the pointing device and the differences between initial and final values, in a way similar to the approach described above. When the viewer moves the pointing device within the spherical video projection window, not only is the view altered according to his/her movements, but the radar graphic in the scene graphical representation window rotates accordingly to reflect these changes.
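A sketch of such a camera-switching handler follows; the icon, stream and channel names are illustrative, starting the new stream with play() implicitly replaces the stream that was playing, and the radar rotation mapping is an assumption.

    // Switching to another spherical camera from the scene graphical representation.
    camera2Icon.addEventListener(MouseEvent.MOUSE_DOWN, function(e:MouseEvent):void {
        var time:Number = channel.position; // audio playhead in milliseconds (the master clock)
        ns.play("camera2.flv");             // start the spherical video of the selected camera
        ns.seek(time / 1000);               // NetStream.seek() takes seconds
        radar.visible = true;               // show the radar cone on the newly selected camera
        radar.rotation = -sphere.rotateY;   // point the cone at the current viewing direction (assumed mapping)
    });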
When the viewer moves the pointing device within the scene graphical representation window, the radar graphic follows his/her movements while, at the same time, the projected view in the spherical video projection window is altered by directing the sphere, the virtual camera, or both at the same time to rotate in real time according to these movements.
With this capability, the viewer, during play-back and without interrupting or disrupting it, can interactively choose among different cameras and alter the viewing direction and angle of viewing. The viewer using the application based on the method for synchronized interactive play-back of multiple spherical videos, or the autonomous product for the interactive play-back of one or more prerecorded events on a computer, acts more like a director than a passive viewer, having the freedom to control every aspect of his/her viewing experience.
Using Flash commands, the application based on the method for synchronized interactive play-back of multiple spherical videos controls the play-back of several spherical videos and synchronized audio at the same time. Initially, the application commences play-back of the first assigned spherical video with the command "play(video1)", synchronized with the corresponding sound, which is played back using the command "sound.play()". When the viewer chooses to move to another camera position, the point in running time, based on the sound play-back, is recorded in fractions of a second using the "time = channel.position" Flash command, where channel is the corresponding audio channel. Play-back of the spherical video corresponding to the newly selected camera commences from this point in time using the "ns.seek(time/1000)" command, where ns is the NetStream to which the video is connected. The key to the method for establishing and maintaining synchronization in the play-back of multiple spherical videos and the corresponding sound is to commence play-back of the new spherical video not from the point in time where the previous spherical video was paused, but from the point in time which the sound has reached when play-back of the newly selected spherical video is ready to commence. The performance of the application depends on the capabilities of the computer which runs it, the characteristics and performance of the graphics card, the amount of available memory, and several other factors which influence the play-back of the spherical videos and sound and, to varying degrees, introduce delays in the play-back process which would inevitably lead to loss of synchronization. For this reason, the application, at predetermined time intervals, re-synchronizes the spherical video being projected with the sound by jumping to the spherical video frame corresponding to the current sound position.
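This periodic re-synchronization can be sketched as a timer that compares the video playhead with the audio playhead and seeks whenever the drift exceeds a tolerance; the five-second interval and the 200 ms threshold are assumed values, not figures taken from the method itself.

    // Periodic re-sync: the audio channel acts as the master clock.
    var syncTimer:Timer = new Timer(5000); // check every 5 s (assumed interval)
    syncTimer.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        var driftMs:Number = Math.abs(ns.time * 1000 - channel.position); // ns.time is in seconds
        if (driftMs > 200) {                  // assumed tolerance before correcting
            ns.seek(channel.position / 1000); // jump the video to the audio position
        }
    });
    syncTimer.start();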
The autonomous product for the interactive play-back of prerecorded events offers the viewer the ability to repeatedly play back an event produced using the method for synchronized interactive play-back of multiple spherical videos, on any suitable computer, without the need for an internet connection. Each product is produced for one or more specific events and includes all the spherical videos and the digital audio corresponding to the particular event(s), the application for synchronized interactive play-back of multiple spherical videos, and a graphical interface for navigation and control of the application, designed especially for the particular event(s).
Through this interface, the viewer interactively controls the play-back of the event, which is projected on his/her computer screen. The viewer watches the play-back while at the same time being given a graphical overview of the event area by means of the scene graphical representations described above. During the play-back, and without interrupting or disrupting it, the viewer has the capability to interactively choose whichever spherical video he/she wishes to watch from the available alternatives. This choice of a different spherical video corresponds to a virtual shift of the viewer to a different physical viewing position, as each spherical video corresponds to a different spherical camera which had been positioned at a predetermined position in the area in which the event took place. In addition to choosing the position from which the event is viewed, the viewer, during the play-back of the event and without interrupting or disrupting it, can interactively alter the angle and direction of viewing, as if turning his/her head in whichever direction he/she wished, as well as approach and move away from the point of interest (zoom-in and zoom-out). The viewer using the autonomous product for the interactive play-back of one or more prerecorded events on a computer acts more like a director than a passive viewer, having the freedom to control every aspect of his/her viewing experience.
An example of the use of the integrated system, method and application for the synchronized interactive play-back of multiple spherical video content with a 360°x360° angle of view, and of the autonomous product for the interactive play-back of prerecorded events, is the recording and reproduction of a concert. For the complete capture of such an event, several spherical cameras, typically four, are placed in carefully chosen positions on and around the concert stage. These cameras record full spheres around them, thereby capturing 100% of the event. In parallel, a digital audio recording system is used to record the sound of the event in uncompressed, high-quality WAV format. The spherical video sequences are stored on large-capacity, high-reliability hard disks in the native RAW spherical video format. After the event is over, the spherical videos are converted to AVI format, while the audio is converted to MP3 format. The AVI files are then converted to FLV format and their resolution reduced to 1024 x 512 pixels. The FLV files are uploaded to a web server where Flash Media Server is installed. The viewer connects to the web site where the event is hosted and is presented with the application for the synchronized interactive play-back of multiple spherical video content, with which he/she may watch the event play-back, with full interactive control as described earlier, from within his/her web browser.
During the event play-back, the viewer has total freedom to decide which aspect of the concert to focus on. He/she may move the view point continuously in order to follow one particular performer, or choose instead to observe the reactions of the crowd. He/she may focus on one musician in order to study and analyze that musician's technique for as long as he/she wishes. Moreover, the viewer may choose to watch the event from different positions by virtually moving to different places in the area. While he/she applies such control, play-back remains continuous and synchronized to the sound, and the flow of activity is neither disrupted nor interrupted in any sense. The viewer is no longer passive but interacts with the play-back through an interactive application, in a way that gives the viewer the sense that his/her experience of the event is not restricted by the decisions of others. If the viewer decides to repeat the play-back of any part of the event, or even the full event, he/she may choose to watch it in a completely different way every time.
If the event is to be published as an autonomous product for the interactive play-back of the concert in DVD format, the corresponding control and navigational interface is designed to suit the particular event. The scene graphical representation is also designed to reflect the scene of the event, and the full application is encoded in HTML to create a fully integrated interactive application. The application for the synchronized interactive play-back of multiple spherical video content and all the corresponding spherical video and digital audio files are also included, and a master of the DVD is produced, which may then be reproduced (replicated) in whatever quantity is desired by standard industrial processes. The resulting product can be distributed on its own or in combination with other products (e.g. combined with a CD or conventional DVD of the event). The owner of the product may play it on any computer with full interactive control and freedom, as described above.

Claims

1. Integrated system for the recording and interactive play-back of events by means of spherical video in which events are recorded with digital spherical video recording devices capable of capturing an angle of view of 360°x360°, the corresponding sound of the event is recorded and synchronized with the spherical videos and the unified sequences of digital spherical videos and sound are played-back under interactive user control on a computer or through the internet and are projected in a normalized projection at a rate of 25 or 30 frames per second on a computer screen, characterized by the synchronized and interactive play-back of multiple spherical videos and the corresponding sound, and the ability to offer to the viewer the capability, at any point in time during the play-back, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed by choosing which spherical video he/she wishes to watch at any given time, selecting among the several available spherical videos, each one corresponding to any of the several different spherical cameras which had been placed on preselected positions in the area where the event took place, choosing in this way to virtually move to the view-points where the spherical cameras had been placed and watch the event from these points, while at the same time he/she maintains total freedom to choose the direction and angle of viewing from the available full 360°x360° angle of view and total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while also maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.
2. Method for synchronized interactive play-back on a computer or through the internet of multiple spherical videos and the corresponding sound which are projected in a normalized projection at a rate of 25 or 30 frames per second on a computer screen characterized by the synchronized and interactive playback of multiple spherical videos and the corresponding sound, offers to the viewer the capability, at any point in time during the play-back, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed by choosing which spherical video he/she wishes to watch at any given time, selecting among the several available spherical videos, each one corresponding to any of the several different spherical cameras which had been placed on preselected positions in the area where the event took place, choosing in this way to virtually move to the view-points where the spherical cameras had been placed and watch the event from these points, while at the same time he/she maintains total freedom to choose the direction and angle of viewing from the available full 360°x360° angle of view and total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while also maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.
3. Application for synchronized interactive play-back on a computer or through the internet of multiple spherical videos and the corresponding sound which are stored locally on the play-back computer or are made available for streaming through the internet or are stored in an autonomous optical medium (DVD), are projected in a normalized projection at a rate of 25 or 30 frames per second on a computer screen, characterized by the synchronized and interactive play-back of multiple spherical videos and the corresponding sound, offers to the viewer the capability, at any point in time during the playback, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed by choosing which spherical video he/she wishes to watch at any given time, selecting among the several available spherical videos, each one corresponding to any of the several different spherical cameras which had been placed on preselected positions in the area where the event took place, choosing in this way to virtually move to the view-points where the spherical cameras had been placed and watch the event from these points, while at the same time he/she maintains total freedom to choose the direction and angle of viewing from the available full 360°x360° angle of view and total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while also maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.
4. Autonomous product for the interactive play-back of one or more prerecorded events in a computer, which is available in the form of one or more optical media (DVD), operates locally in the computer of the viewer, offers control and navigational interfaces which enable the viewer to choose freely which part of the event or events he/she wishes to watch in a normalized projection at a rate of 25 or 30 frames per second on the screen of the computer of the viewer, characterized by the synchronized and interactive play-back of multiple spherical videos and the corresponding sound, which are stored in the same optical media (DVD) that constitutes the product, are projected in a normalized projection at a rate of 25 or 30 frames per second on the screen of the computer of the viewer, offers to the viewer the capability, at any point in time during the play-back, without interrupting or disrupting the play-back, with full control using his/her mouse or any other pointing/navigation device, to alter interactively the position from which the event is viewed by choosing which spherical video he/she wishes to watch at any given time, selecting among the several available spherical videos, each one corresponding to any of the several different spherical cameras which had been placed on preselected positions in the area where the event took place, choosing in this way to virtually move to the view-points where the spherical cameras had been placed and watch the event from these points, while at the same time he/she maintains total freedom to choose the direction and angle of viewing from the available full 360°x360° angle of view and total freedom to rotate (turn around) his/her viewing window, approach (zoom-in) and retreat (zoom-out), while also maintaining full interactive control of the spherical video play-back. Irrespective of the interactive alterations in viewing position, angle, direction or virtual distance, the corresponding sound remains synchronized with the projected images.