WO2017029918A1 - System, method, and program for displaying a moving image with a specific field of view - Google Patents
- Publication number
- WO2017029918A1 (PCT/JP2016/071040)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- view
- field
- comment
- moving image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F13/00—Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/24—Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
Definitions
- the present invention relates to a system, method, and program for displaying a moving image with a specific field of view.
- in a video that can have different fields of view among users, such as a 360-degree video, the target being viewed may vary depending on the field of view. Therefore, for example, it is difficult for another user who is viewing the video with a field of view different from that of the user who provided a comment to sympathize with the content of the comment, because the target being viewed differs from that of the user who provided the comment. If the content of the comment cannot be sympathized with, the activation of communication is also limited. Therefore, it is desirable to appropriately display information such as a comment in a moving image that may have different fields of view among users.
- An embodiment of the present invention has an object to appropriately display information such as comments input in moving images that may have different fields of view among users. Other objects of the embodiments of the present invention will become apparent by referring to the entire specification.
- a system according to an embodiment is a system that displays a moving image with a specific field of view, and includes one or more computer processors that, by executing computer-readable instructions, execute: a step of displaying, on each terminal device of a plurality of users including a first user and a second user, a specific moving image that is configured as a moving image having a wide-angle field of view and whose entire field of view is associated with a virtual space, in the respective fields of view of the plurality of users; a step of, when first input information is received from the first user, specifying a first position in the virtual space included in the field of view of the first user and arranging the first input information at the first position; and a step of displaying, in accordance with the arrangement of the first input information, the first input information on the terminal device of the second user whose field of view includes the first position.
- the specific moving image may be configured as a moving image having a visual field of 360 degrees in at least the horizontal direction, and the virtual space may be configured as an inner surface of a virtual sphere.
- such a moving image may be referred to as a “360-degree moving image”; its vertical field of view is, for example, in the range of 180 to 360 degrees.
- a method according to an embodiment is a method for displaying a moving image with a specific field of view, executed by one or a plurality of computers, and includes: displaying, on each terminal device of a plurality of users including a first user and a second user, a specific moving image that has a wide-angle field of view and whose entire field of view is associated with a virtual space, in the respective fields of view of the plurality of users; identifying, when first input information is received from the first user, a first position in the virtual space included in the field of view of the first user and arranging the first input information at the first position; and displaying, in accordance with the arrangement of the first input information, the first input information on the terminal device of the second user whose field of view includes the first position.
- a program according to an embodiment of the present invention is a program for displaying a moving image with a specific field of view, and, when executed on one or more computers, causes the one or more computers to display, on each terminal device of a plurality of users including a first user and a second user, a specific moving image that is configured as a moving image having a wide-angle field of view and whose entire field of view is associated with a virtual space, in the respective fields of view of the plurality of users.
- Various embodiments of the present invention can appropriately display information such as comments input in moving images that can have different fields of view among users.
- FIG. 1 is a configuration diagram schematically showing a configuration of a network including a system 1 according to an embodiment of the present invention.
- a block diagram schematically showing the functions of the system 1 (the server 10 and the terminal device 30) in one embodiment.
- a diagram for explaining the user's field of view in one embodiment.
- a diagram showing an example of the first moving image playback screen 60 in one embodiment.
- a flowchart showing an example of the comment placement processing in one embodiment.
- FIG. 1 is a configuration diagram schematically showing the configuration of a network including a system 1 according to an embodiment of the present invention.
- the system 1 in one embodiment includes a server 10 and a plurality of terminal devices 30 that are communicably connected to the server 10 via a communication network 40 such as the Internet, as illustrated.
- the server 10 provides a moving image distribution service that distributes various moving images to the terminal device 30.
- the moving image distributed in the moving image distribution service in the embodiment includes a real-time moving image (live moving image) provided by the moving image providing apparatus 20.
- the server 10 in one embodiment is configured as a general computer and, as illustrated, includes a CPU (computer processor) 11, a main memory 12, a user I/F 13, a communication I/F 14, and a storage (storage device) 15, and these components are electrically connected to each other via a bus.
- the CPU 11 loads an operating system and various other programs from the storage 15 into the main memory 12 and executes instructions included in the loaded programs.
- the main memory 12 is used for storing a program executed by the CPU 11, and is configured by a DRAM or the like, for example.
- the server 10 in one embodiment may be configured using a plurality of computers each having a hardware configuration as described above.
- the user I / F 13 includes, for example, an information input device such as a keyboard and a mouse that accepts an operator's input, and an information output device such as a liquid crystal display that outputs a calculation result of the CPU 11.
- the communication I/F 14 is implemented as hardware, firmware, communication software such as a TCP/IP driver or a PPP driver, or a combination thereof, and is configured to be able to communicate with the video providing device 20 and the terminal device 30 via the communication network 40.
- the storage 15 is composed of, for example, a magnetic disk drive, and stores various programs such as a control program for providing a moving image distribution service.
- the storage 15 can also store various data for providing the moving image distribution service.
- the various data that can be stored in the storage 15 may instead be stored in a database server or the like that is physically separate from the server 10 and communicably connected to it.
- the server 10 may also function as a web server that manages a website composed of a plurality of web pages in a hierarchical structure, and may provide the video distribution service to the user of the terminal device 30 through such a website.
- the storage 15 can also store HTML data corresponding to these web pages. The HTML data can be associated with various image data, and various programs written in a script language such as JavaScript (registered trademark) can be embedded in it.
- the server 10 can provide a video distribution service via an application executed in an execution environment other than the web browser in the terminal device 30.
- Such applications can also be stored in the storage 15.
- This application is created using a programming language such as Objective-C or Java (registered trademark).
- the application stored in the storage 15 is distributed to the terminal device 30 in response to the distribution request.
- the terminal device 30 can also download such an application from a server other than the server 10 (such as a server providing an application market).
- the server 10 can manage the website for providing the video distribution service and distribute the web page (HTML data) constituting the website in response to a request from the terminal device 30.
- the server 10 can also provide the video distribution service based on communication with an application executed in the terminal device 30, in place of or in addition to providing the service using such web pages (a web browser). Regardless of which aspect of the service is provided, the server 10 can transmit and receive, to and from the terminal device 30, the various data necessary for providing the video distribution service (including data necessary for screen display).
- the server 10 can store various data for each identification information (for example, user ID) for identifying each user, and can manage the provision status of the video distribution service for each user.
- the server 10 may have a function of performing user authentication processing, billing processing, and the like.
- the moving image providing apparatus 20 is configured as a general computer and, as illustrated in FIG. 1, includes a CPU (computer processor) 21, a main memory 22, a user I/F 23, a communication I/F 24, a storage (storage device) 25, and an ultra-wide-angle camera 26, and these components are electrically connected to each other via a bus.
- the CPU 21 loads an operating system and various other programs from the storage 25 to the main memory 22 and executes instructions included in the loaded programs.
- the main memory 22 is used for storing a program executed by the CPU 21 and is configured by, for example, a DRAM or the like.
- the user I / F 23 includes, for example, an information input device that receives an operator input and an information output device that outputs a calculation result of the CPU 21.
- the communication I / F 24 is implemented as hardware, firmware, communication software such as a TCP / IP driver or a PPP driver, or a combination thereof, and is configured to be able to communicate with the server 10 and the terminal device 30 via the communication network 40.
- the ultra-wide-angle camera 26 has a built-in microphone and is configured to capture an ultra-wide-angle image via an ultra-wide-angle lens or a plurality of lenses.
- the ultra-wide-angle camera 26 is configured as a 360-degree camera having a 360-degree field of view in the horizontal direction and a field of view of 180 to 360 degrees in the vertical direction.
- the moving image providing apparatus 20 is configured to transmit a 360-degree moving image having a substantially omnidirectional field of view, photographed through the ultra-wide-angle camera 26, to the server 10 in real time.
- the terminal device 30 is an arbitrary information processing device that can display web pages of the website provided by the server 10 on a web browser and that implements an execution environment for executing applications, such as a wearable device (e.g., a head-mounted display), a tablet terminal (e.g., Samsung Galaxy Tab), or a personal computer (e.g., Apple MacBook Air).
- the terminal device 30 is configured as a general computer and, as shown in FIG. 1, includes a CPU (computer processor) 31, a main memory 32, a user I/F 33, a communication I/F 34, a storage (storage device) 35, and various sensors 36, and these components are electrically connected to each other via a bus.
- the CPU 31 loads an operating system and various other programs from the storage 35 to the main memory 32 and executes instructions included in the loaded programs.
- the main memory 32 is used for storing a program executed by the CPU 31, and is configured by, for example, a DRAM or the like.
- the user I / F 33 includes, for example, an information input device such as a touch panel that accepts user input, a keyboard, a button, and a mouse, and an information output device such as a liquid crystal display that outputs a calculation result of the CPU 31.
- the communication I/F 34 is implemented as hardware, firmware, communication software such as a TCP/IP driver or a PPP driver, or a combination thereof, and is configured to be able to communicate with the server 10 and the moving image providing apparatus 20 via the communication network 40.
- the storage 35 is composed of, for example, a magnetic disk drive, a flash memory, or the like, and stores various programs such as an operating system.
- the storage 35 can store various applications received from the server 10 or the like.
- the various sensors 36 include, for example, an acceleration sensor, a gyro sensor (angular velocity sensor), a geomagnetic sensor, and the like. Based on information detected by these sensors, the terminal device 30 can specify the posture, inclination, direction, and the like of the terminal device 30 itself.
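The posture estimation described above can be sketched as follows (a minimal illustration, assuming a Python environment and conventional axis definitions; the function name is hypothetical): gravity as measured by the acceleration sensor yields pitch and roll, while the gyro and geomagnetic sensors among the various sensors 36 would be needed to refine the yaw, which gravity alone cannot determine.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate device pitch and roll (degrees) from the gravity vector
    reported by a 3-axis acceleration sensor. Yaw (compass direction)
    cannot be recovered from gravity alone; the gyro/geomagnetic
    sensors would supply it in a full orientation filter."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```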
- the terminal device 30 includes, for example, a web browser for interpreting an HTML file (HTML data) and displaying a screen, and by the function of this web browser it can interpret the HTML data acquired from the server 10 and display the web page corresponding to the received HTML data.
- the web browser of the terminal device 30 can incorporate plug-in software that can execute various types of files associated with HTML data.
- when the user of the terminal device 30 uses the video distribution service provided by the server 10, a screen including, for example, an animation and operation icons instructed by the HTML data or the application is displayed on the terminal device 30.
- the user can input various instructions using the touch panel of the terminal device 30 or the like.
- an instruction input by the user is transmitted to the server 10 via the function of the web browser of the terminal device 30 or of an application execution environment such as NgCore (trademark).
- FIG. 2 is a block diagram schematically illustrating functions of the server 10 and the terminal device 30 according to an embodiment.
- the server 10 includes an information storage unit 51 that stores information, a moving image distribution control unit 52 that controls distribution of moving images, and a virtual space management unit 53 that manages the virtual space associated with the entire field of view of a moving image. These functions are realized by the cooperative operation of hardware such as the CPU 11 and the main memory 12 with the various programs and tables stored in the storage 15, for example, by the CPU 11 executing instructions included in a loaded program. Part or all of the functions illustrated in FIG. 2 may be realized by the cooperation of the server 10 and the terminal device 30, or by the terminal device 30.
- the information storage unit 51 is realized by the storage 15 or the like and, as illustrated in FIG. 2, includes a user management table 51a for managing information related to users of the video distribution service, and a comment management table 51b for managing information related to comments (input information) input by users.
- FIG. 3 shows an example of information managed in the user management table 51a in the embodiment.
- the user management table 51a manages, in association with a “user ID” that identifies an individual user, information such as this user's “nickname” and “avatar information”, which is information related to the user's avatar. These pieces of information can be provided by the user at a timing such as new user registration for the moving image distribution service, and can be updated as appropriate thereafter.
- FIG. 4 shows an example of information managed in the comment management table 51b in one embodiment.
- the comment management table 51b manages, in association with a combination of a “moving image ID” that identifies an individual moving image and a “comment ID” that identifies an individual comment, information such as the identity of the user who input the comment; it thus manages information about comments for each moving image to be distributed.
- since the entire field of view of the 360-degree moving image in one embodiment is associated with a virtual space configured as the inner surface of a virtual sphere, a value specifying the arrangement position (coordinate) on this virtual space is set in the comment management table 51b.
- the user may input support (a favorable emotion), as a “Like”, for a comment input by another user who is viewing (playing back) the same video.
- the number of Likes in the comment management table 51b is set to the number of “Like” inputs received for the comment.
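The items managed per comment can be sketched as a simple record (the field names are illustrative assumptions; the specification describes the table only by the items it manages):

```python
from dataclasses import dataclass

@dataclass
class CommentRecord:
    movie_id: str        # "moving image ID" identifying the video
    comment_id: str      # "comment ID" identifying the comment
    user_id: str         # user who input the comment
    text: str            # comment content
    position: tuple      # arrangement coordinate on the virtual space
    delete_time: float   # current time + arrangement duration (e.g. 30 s)
    like_count: int = 0  # number of "Like" inputs received

# the comment management table 51b, keyed by the described combination
comment_table = {}

def add_like(movie_id, comment_id):
    """Increment the Like count when a 'Like' input is received."""
    comment_table[(movie_id, comment_id)].like_count += 1
```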
- the moving image distribution control unit 52 in one embodiment executes various controls related to moving image distribution.
- the moving image distribution control unit 52 converts a real-time 360-degree moving image received from the moving image providing device 20 or the like into a streaming format and distributes it to the terminal device 30, or distributes to the terminal device 30 a real-time 360-degree moving image received from the moving image providing device 20 or the like already in a streaming format.
- the moving image distributed by the moving image distribution control unit 52 may include a three-dimensional moving image configured to be viewed stereoscopically by the user.
- the virtual space management unit 53 executes various processes related to management of a virtual space associated with the entire field of view of the moving image. For example, when the user inputs a comment, the virtual space management unit 53 specifies a position in the virtual space included in the user's field of view, and places the comment at the specified position.
- the virtual space and the user's field of view associated with the entire field of view of the 360-degree moving image in one embodiment will be described with reference to FIG.
- the 360-degree moving image is configured as a moving image having its entire field of view on the inner surface (all or part) of a virtual sphere S, and the user's visual field V is specified based on the direction of the line of sight of the user located at the center C of the sphere S and a predetermined viewing angle θ. That is, in the 360-degree moving image, when the direction of the line of sight of the user located at the center C of the virtual sphere S is specified, the moving image of the portion included in the field of view based on that direction is displayed. Although the visual field V is drawn as a curve in the drawing, the visual field V is actually a partial region of the inner surface of the sphere S, as illustrated in FIG. 6.
- the virtual space in one embodiment is thus configured as the inner surface of the sphere S, that is, it is associated with the entire field of view of the 360-degree moving image.
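The association between a line-of-sight direction and a point on this virtual space can be sketched as follows (a hedged illustration; the coordinate convention and function name are assumptions, not from the specification): a gaze direction from the center C intersects the inner surface of the sphere S at exactly one point.

```python
import math

def gaze_point_on_sphere(yaw_deg, pitch_deg, radius=1.0):
    """Intersection of the line of sight of a viewer located at the
    center C of the virtual sphere S with the sphere's inner surface."""
    yaw = math.radians(yaw_deg)      # horizontal direction (0-360 degrees)
    pitch = math.radians(pitch_deg)  # vertical direction
    x = radius * math.cos(pitch) * math.cos(yaw)
    y = radius * math.cos(pitch) * math.sin(yaw)
    z = radius * math.sin(pitch)
    return (x, y, z)
```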
- the virtual space management unit 53 in one embodiment transmits various pieces of information regarding the virtual space to the terminal device 30.
- the virtual space management unit 53 can transmit information related to the comment object based on the comments arranged in the virtual space to the terminal device 30.
- the virtual space management unit 53 receives input of a “Like” for a comment. For example, when the state in which a comment object is displayed at the user's point of gaze continues for a predetermined effective time (for example, 10 seconds), the virtual space management unit 53 receives the input of a “Like” for that comment via the terminal device 30.
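The effective-time check can be sketched as a small state machine (an illustrative assumption of how the check might be driven from the rendering loop; the class and method names are hypothetical): a “Like” fires once the comment object has stayed at the gaze point for the effective time.

```python
class GazeLikeDetector:
    """Detects a 'Like' input: the comment object must remain displayed
    at the user's point of gaze for the effective time (e.g. 10 s)."""

    def __init__(self, effective_time=10.0):
        self.effective_time = effective_time
        self.gaze_start = None  # when the comment entered the gaze point

    def update(self, comment_under_gaze, now):
        """Call once per frame; returns True when a 'Like' should fire."""
        if not comment_under_gaze:
            self.gaze_start = None  # gaze left the comment: reset
            return False
        if self.gaze_start is None:
            self.gaze_start = now
        return now - self.gaze_start >= self.effective_time
```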
- the terminal device 30 includes a reproduction control unit 55 that controls reproduction of a moving image, and an input management unit 56 that manages input by a user.
- These functions are realized by the cooperative operation of hardware such as the CPU 31 and the main memory 32 with the various programs and tables stored in the storage 35, for example, by the CPU 31 executing instructions included in a loaded program.
- part or all of the functions of the terminal device 30 illustrated in FIG. 2 can be realized by the cooperation of the server 10 and the terminal device 30, or can be realized by the server 10.
- the playback control unit 55 executes various controls related to playback of moving images.
- the reproduction control unit 55 displays the 360 degree moving image received from the server 10 on the terminal device 30 with a field of view specified by the user.
- the playback control unit 55 specifies the direction of the user's line of sight in accordance with the user's operation of changing the attitude, tilt, orientation, and the like of the terminal device 30, or a flick/drag operation on the screen, and displays, out of the entire field of view of the 360-degree moving image, the moving image of the portion included in the field of view determined based on the direction of the user's line of sight.
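For a 360-degree frame stored in an equirectangular layout, selecting the portion included in the field of view reduces to a column-range computation (a simplified, horizontal-only sketch; the helper name and frame layout are assumptions):

```python
def visible_columns(yaw_deg, view_angle_deg, frame_width):
    """Pixel-column range of an equirectangular frame covered by the
    field of view centered on the line-of-sight yaw; a returned right
    edge smaller than the left edge indicates 360-degree wraparound."""
    left = (yaw_deg - view_angle_deg / 2) % 360
    right = (yaw_deg + view_angle_deg / 2) % 360
    to_px = lambda deg: int(deg / 360 * frame_width)
    return to_px(left), to_px(right)
```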
- the reproduction control unit 55 displays information related to the virtual space on the terminal device 30 based on various information related to the virtual space received from the server 10. For example, the reproduction control unit 55 displays information related to the comment on the terminal device 30 based on information related to the comment arranged in the virtual space. In addition, the reproduction control unit 55 outputs a sound based on various information related to the virtual space, for example, outputs a sound corresponding to the comment being placed in the virtual space.
- the input management unit 56 in one embodiment executes various processes related to the management of input by the user. For example, when the input management unit 56 detects that the state in which a comment is displayed at the user's point of gaze has continued for a predetermined effective time, it recognizes an input of “Like” for that comment and transmits information indicating the “Like” input to the server 10.
- a user who uses the video distribution service in one embodiment can select a desired video from a plurality of videos provided in the video distribution service via the terminal device 30 and reproduce it on the terminal device 30.
- the server 10 that has received the video distribution request from the terminal device 30 sends the 360-degree video received in real time from the video providing device 20 or the like to the terminal device 30 in a streaming format.
- in one embodiment, two video playback screens having different screen configurations are provided as screens for playing back a 360-degree video, and the user can select one of the screens to play the video.
- FIG. 7 is an example of a first video playback screen 60 that is one of the video playback screens.
- the first moving image playback screen 60 in one embodiment includes a display area 61 that displays a 360-degree moving image with a specific field of view.
- the display area 61 displays the moving image of the portion included in the field of view specified by the user, out of the entire field of view of the 360-degree moving image. For example, when the user's field of view (the direction of the line of sight) changes in accordance with the user's operation of changing the attitude, tilt, orientation, and the like of the terminal device 30 or a flick/drag operation on the display area 61, the moving image of the portion included in the changed field of view is displayed in the display area 61.
- the display area 61 of the first moving image playback screen 60 is configured to display comments (comment objects) arranged in the portion of the virtual space included in the field of view, superimposed on the moving image of the portion included in the user's field of view. Details will be described later.
- the first moving image playback screen 60 is suitable for viewing the moving image included in a specific field of view of a 360-degree moving image using, for example, VR (Virtual Reality) glasses or a VR headset on which a smartphone or the like is mounted, and can be configured to be divided into a screen for the right eye and a screen for the left eye. Since a user using VR glasses or a VR headset is considered unable to perform flick/drag operations on the screen, such a user changes the field of view by changing the attitude, tilt, orientation, and the like of the terminal device 30 (the VR glasses, the smartphone, etc.).
- FIG. 8 is an example of a second video playback screen 70 that is one of a plurality of video playback screens.
- the second moving image playback screen 70 in one embodiment is configured as a space including a virtual stage, and a display area 71 that displays a 360-degree moving image with a specific field of view is placed on the virtual stage. Similar to the display area 61 of the first moving image playback screen 60, the display area 71 displays the portion of the moving image included in the field of view specified by the user, out of the entire field of view of the 360-degree moving image.
- when the user's field of view changes in response to an operation changing the attitude, tilt, orientation, etc. of the terminal device 30 or a flick/drag operation on the display area 71, the moving image of the portion included in the changed field of view is displayed in the display area 71.
- the avatars 110 of the users who are viewing (playing back) the same video are arranged in the avatar display area 76, which corresponds to the space in front of the virtual stage.
- the second moving image playback screen 70 has a comment input area 72 and a comment transmission button 74 displayed as “Send” at the bottom of the screen.
- when the user inputs a desired character string or the like as a comment in the comment input area 72 and selects the comment transmission button 74, the input comment is transmitted to the server 10.
- a comment placement process executed by the server 10 in response to receiving a comment will be described.
- FIG. 9 is a flowchart showing an example of comment placement processing in an embodiment.
- first, the position on the virtual space where the input comment is to be placed is specified (step S110). This position is specified as a position on the virtual space included in the field of view (at the time the comment was input) of the user who input the comment.
- Information regarding the visual field of the user who input the comment can be received from the terminal device 30 together with the input comment, for example.
- FIG. 10A is a diagram for describing a user's gaze point FP and a gaze area FR according to an embodiment.
- the center of the user's field of view V (the intersection of the direction of the user's line of sight with the virtual space) is defined as the gazing point FP at which the user is gazing, and a circular area of a predetermined radius centered on the gazing point FP is defined as the gaze area FR. The position where the comment is arranged is specified by selecting a direction of movement away from the gazing point FP and moving in that direction by a distance corresponding to the radius of the gaze area FR; the direction of movement away from the gazing point FP is specified, for example, at random.
- in FIG. 10B, for example, when the user is viewing a building that appears in the moving image displayed in the display area 71 (the field of view V) of the second moving image playback screen 70 (the building is displayed at the gazing point FP), a position outside the range of the gaze area FR is a position moved in a direction away from the building, and is therefore unlikely to overlap the building.
- when the position where the comment is to be arranged has been specified in this way, the comment is next arranged at the specified position in the virtual space (step S120), and this comment placement processing ends. Specifically, information related to the comment is registered in the comment management table 51b. At this time, a time obtained by adding a predetermined arrangement duration (for example, 30 seconds) to the current time is set as the deletion time.
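Steps S110-S120 can be sketched as follows (a hedged illustration; the angular representation of positions, the constant values, and the function name are assumptions): a random direction away from the gazing point FP is chosen, the position is moved by the radius of the gaze area FR, and the deletion time is recorded.

```python
import math
import random
import time

GAZE_RADIUS = 0.3        # radius of the gaze area FR (illustrative units)
ARRANGEMENT_TIME = 30.0  # arrangement duration before deletion (seconds)

def place_comment(gaze_yaw, gaze_pitch):
    """Step S110: specify the arrangement position by moving from the
    gazing point FP in a random direction by the radius of FR.
    Step S120: the returned position and deletion time would then be
    registered in the comment management table 51b."""
    theta = random.uniform(0, 2 * math.pi)  # random departure direction
    yaw = gaze_yaw + GAZE_RADIUS * math.cos(theta)
    pitch = gaze_pitch + GAZE_RADIUS * math.sin(theta)
    delete_time = time.time() + ARRANGEMENT_TIME
    return (yaw, pitch), delete_time
```

Because the departure direction is drawn independently per comment, several comments placed from the same gazing point spread out rather than stacking in one direction, matching the overlap-suppression behavior described below.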
- FIG. 11A illustrates the first moving image playback screen 60 displayed on the terminal device 30 of a user whose field of view includes the position in the virtual space where the comment is arranged.
- the comment object 114 is displayed so as to overlap the moving image at a position on the virtual space where the comment is arranged.
- FIG. 11B illustrates details of the comment object 114.
- the comment object 114 includes a user's avatar 110 that has input a comment, and a balloon object 112.
- in the balloon object 112, the content of the comment (“Excellent!” in the example of FIG. 11B), the nickname of the user who input the comment (“by XXX” in the example of FIG. 11B), and the number of “Likes” input for the comment are displayed.
- since the comment is arranged at a position away from the gazing point FP, the corresponding comment object 114 is displayed at a position away from the building being gazed at.
- FIG. 12 illustrates the first moving image playback screen 60 in which a plurality of comment objects 114, based on comments entered by a plurality of users, are displayed in the display area 61.
- Since the direction away from the gazing point FP is specified separately (for example, at random) for each comment when its position is determined, the plurality of comment objects 114 are prevented from being displayed overlapping one another at positions moved in the same direction from the gazing point FP.
- FIG. 13 illustrates the second moving image playback screen 70 displayed on a user terminal device 30 whose field of view includes the position at which the comment is placed.
- The avatars 110 of the users viewing the same video are displayed in the avatar area 76, and the avatars 110 displayed in the avatar area 76 are arranged in the virtual space.
- The balloon object 112 described above is added to the avatar 110 of the user who entered the comment being displayed.
- In this way, a comment placed in the virtual space is displayed on the terminal devices 30 of users whose fields of view include the position at which the comment is placed.
- On the first moving image playback screen 60, the comment object 114 is displayed in the display area 61 so as to be superimposed on the moving image at the position in the virtual space at which the comment is placed.
- On the second moving image playback screen 70, the balloon object 112 is added to the avatar 110 of the user who entered the comment in the avatar display area 76 and displayed.
- In one embodiment, each terminal device 30 of the plurality of users viewing the same video outputs a sound effect based on the relationship between the position at which the comment is placed and the position of the user's field of view (gazing point).
- The sound effect output from the terminal device 30 may be configured such that its volume increases as the user's field of view (gazing point) comes closer to the position at which the comment is placed.
- FIG. 14 illustrates the volume of the sound effect set based on the positional relationship between the position at which the comment is placed and the user's field of view.
- In FIG. 14, the sound effect S1 output from the user's terminal device 30 is louder than the sound effects S2 and S3 output when the comments C2 and C3, which are placed at positions not included in the user's field of view V, are placed.
- Likewise, the sound effect S2 output when the comment C2 is placed is louder than the sound effect S3 output when the comment C3, whose placement position is farther from the user's field of view V than that of C2, is placed.
- The distance between the position at which a comment is placed and the user's field of view corresponds to the angle formed, with respect to the center of the sphere S, between the direction of the comment and the direction of the user's line of sight (θ1 and θ2 in FIG. 14).
- By configuring the volume to increase as the user's field of view (gazing point) and the position of the comment come closer together, the user can feel a greater sense of realism.
- In one embodiment, the sound effect is output as sound coming from the direction (for example, the right or the left) in which the comment is placed relative to the user's field of view (gazing point).
- For example, the sound effect S2 corresponding to the placement of the comment C2, which is located to the left of the field of view, is output as sound audible from the left, and the sound effect S3 corresponding to the placement of the comment C3, which is located to the right of the user's field of view, is output as sound audible from the right.
- In this way, a sound effect is output from each terminal device 30 of the plurality of users viewing the same video, so a user can be made aware that a comment has been entered, including a comment placed at a position not included in his or her field of view.
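The volume and panning rules above can be sketched as follows, treating the line-of-sight direction and the comment direction as unit vectors from the center of the sphere S. This is a minimal illustrative sketch: the linear volume falloff, the function names, and the left/right convention (viewer looking along -z with y up) are assumptions, not details from the specification.

```python
import math

def angular_distance(gaze_dir, comment_dir):
    """Angle between the line-of-sight direction and the comment direction,
    both unit vectors from the center of the sphere S (θ1, θ2 in FIG. 14)."""
    dot = sum(g * c for g, c in zip(gaze_dir, comment_dir))
    return math.acos(max(-1.0, min(1.0, dot)))   # clamp against rounding error

def effect_volume(theta, max_volume=1.0):
    """Volume grows as the comment approaches the field of view: one possible
    mapping is a linear falloff from max_volume at θ=0 to zero at θ=π."""
    return max_volume * (1.0 - theta / math.pi)

def pan_direction(gaze_dir, comment_dir, up=(0.0, 1.0, 0.0)):
    """Return 'left' or 'right' relative to the gaze, from the sign of the
    vertical component of gaze × comment (assumed handedness convention)."""
    cx = gaze_dir[1] * comment_dir[2] - gaze_dir[2] * comment_dir[1]
    cy = gaze_dir[2] * comment_dir[0] - gaze_dir[0] * comment_dir[2]
    cz = gaze_dir[0] * comment_dir[1] - gaze_dir[1] * comment_dir[0]
    s = cx * up[0] + cy * up[1] + cz * up[2]     # project onto the up axis
    return "left" if s > 0 else "right"
```

Because the falloff is monotonic in the angle, a comment farther from the field of view (larger θ) always produces a quieter effect, matching the S1 > S2 > S3 ordering in FIG. 14.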
- A comment placed in the virtual space is deleted (removed from the virtual space) when the deletion time set for that comment is reached.
- When a comment is deleted, the comment object 114 displayed over the moving image in the display area 61 of the first moving image playback screen 60 is also erased (is no longer displayed).
- The balloon object 112 displayed in the avatar display area 76 of the second moving image playback screen 70 may be configured to be erased in response to the deletion of the comment, or it may be erased at a different timing. That is, in one embodiment, the erasure of the comment object 114 on the first moving image playback screen 60 and the erasure of the balloon object 112 on the second moving image playback screen 70 can be controlled independently.
- A comment that has received a larger number of Likes is configured to take longer to be deleted. Specifically, for example, every time the number of Likes reaches a predetermined number (for example, 10), a predetermined additional time (for example, 10 seconds) is added to the deletion time.
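The Like-based extension rule above can be sketched in a few lines. The function name and the representation of the deletion time as seconds are assumptions; the thresholds use the example values from the text (10 Likes, 10 seconds).

```python
LIKES_PER_EXTENSION = 10     # predetermined number of Likes (example from the text)
EXTENSION_SECONDS = 10.0     # predetermined additional time (example from the text)

def extended_deletion_time(base_deletion_time, like_count):
    """Extend the deletion time by a fixed amount each time the Like count
    reaches another multiple of LIKES_PER_EXTENSION."""
    extensions = like_count // LIKES_PER_EXTENSION
    return base_deletion_time + extensions * EXTENSION_SECONDS
```

With these values, a comment with 25 Likes survives 20 seconds longer than its original deletion time (two full multiples of 10 Likes reached).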
- FIG. 16 illustrates changes in the display of the comment object 114 up to the point where the comment object 114 has remained at the gazing point continuously for an effective time (for example, 3 seconds).
- When the comment object 114 is positioned at the gazing point, the background color of the balloon object 112 changes and a progress gauge 113 is added (i).
- The display of the progress gauge 113 changes with the passage of time; when the effective time is reached (ii), the progress gauge 113 is erased, the background color of the balloon object 112 is restored, input of a “Like” is accepted, and the display of the number of Likes is updated (iii).
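The dwell-to-Like behavior described for FIG. 16 can be sketched as a small gaze-dwell timer that resets when the comment object leaves the gazing point and enables Like input once the effective time has elapsed. The class and method names are hypothetical; only the 3-second effective time comes from the text.

```python
EFFECTIVE_TIME = 3.0  # effective time in seconds (example value from the text)

class GazeDwellGauge:
    """Tracks how long the comment object has stayed at the gazing point;
    once the dwell reaches the effective time, a Like can be accepted."""

    def __init__(self, effective_time=EFFECTIVE_TIME):
        self.effective_time = effective_time
        self.dwell = 0.0

    def update(self, at_gaze_point, dt):
        """Advance by dt seconds; reset if the object leaves the gazing point.
        Returns progress in [0, 1] for drawing the progress gauge 113."""
        self.dwell = self.dwell + dt if at_gaze_point else 0.0
        return min(1.0, self.dwell / self.effective_time)

    def like_enabled(self):
        return self.dwell >= self.effective_time
```

The returned progress value maps directly onto the gauge display: 0 when the gaze first lands on the object (state i) and 1 when the effective time is reached (state ii).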
- Upon accepting a “Like” via the terminal device 30, the server 10 updates the number of Likes in the comment management table 51b. Further, as described above, the deletion time can be updated (extended) as the number of Likes increases.
- On the second moving image playback screen 70 as well, input of a “Like” for the comment corresponding to the balloon object 112 is accepted.
- The operations of accepting the input of a “Like” and updating the display of the number of Likes are the same as those for the first moving image playback screen 60 described above.
- In one embodiment, a 360-degree moving image distributed in real time in a streaming format is stored in the information storage unit 51 of the server 10, and the system can be configured so that the stored moving image can be played back later in response to a request from the terminal device 30.
- During such later playback, the comments entered during the live streaming distribution are placed in the virtual space according to the information managed in the comment management table 51b. Specifically, each comment is placed in the virtual space according to its placement time and placement position in the comment management table 51b and is then deleted at its deletion time.
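The replay behavior above amounts to filtering the comment management table by playback time. A minimal sketch, assuming the table is a list of records with hypothetical `placement_time` and `deletion_time` fields (the actual fields of table 51b are not specified at this level of detail):

```python
def active_comments(comment_table, playback_time):
    """During later playback, select the comments whose placement time has
    passed but whose deletion time has not yet been reached."""
    return [c for c in comment_table
            if c["placement_time"] <= playback_time < c["deletion_time"]]
```

Running this filter each frame reproduces the live-streaming experience: comments appear and disappear at the same moments in the video as they did during the original distribution.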
- In the embodiments described above, comments are entered via the second moving image playback screen 70; however, the system can also be configured so that comments can be entered via the first moving image playback screen 60.
- The first moving image playback screen 160 in another embodiment, illustrated in FIG. 17, has a display area 61 similar to that of the first moving image playback screen 60 described above, and a comment input area 72 and a comment transmission button 74 similar to those of the second moving image playback screen 70 are arranged at the lower end of the display area 61.
- Comment input can also be realized by applying voice input techniques.
- In the embodiments described above, a comment is used as an example of the input information that is entered by a user and placed in the virtual space; however, the input information is not limited to comments. Various kinds of information that a user can enter, such as stamps and icons, can be included in the input information.
- In the embodiments described above, the sound effect output in response to the placement of a comment is configured such that its volume increases as the user's field of view and the position of the comment come closer together; alternatively, the timbre, pitch, or other properties of the sound effect can be varied instead.
- As described above, in one embodiment, a moving image that is configured as a moving image having a wide-angle field of view and whose entire field of view is associated with a virtual space is displayed on each terminal device 30 of a plurality of users, each with that user's own field of view.
- When a comment is received from a user, a position in the virtual space included in that user's field of view is specified and the comment is placed there; in response to the placement, the comment is displayed on the terminal devices 30 of users whose fields of view include the placed position.
- Therefore, a user on whose terminal device 30 the comment is displayed has a field of view close to that of the user who entered the comment.
- Thus, an embodiment of the present invention can appropriately display information, such as comments, entered in a moving image whose field of view may differ among users.
- The processes and procedures described in this specification can be realized by software, hardware, or any combination thereof, beyond what is explicitly described in the embodiments. More specifically, these processes and procedures are realized by implementing logic corresponding to them in a medium such as an integrated circuit, a volatile memory, a nonvolatile memory, a magnetic disk, or an optical storage. Further, the processes and procedures described in this specification can be implemented as a computer program and executed by various computers.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Information Transfer Between Computers (AREA)
- User Interface Of Digital Computer (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The objective of the present invention is to appropriately display information, such as an entered comment, in a moving image whose field of view may differ among users. A system according to one embodiment displays, on a terminal device of each of a plurality of users, a moving image that is constructed as a moving image having a wide-angle field of view and whose entire field of view is associated with a virtual space, with the respective fields of view of the plurality of users; upon accepting a comment from a user, it specifies a position in the virtual space included in that user's field of view, places the comment at that position, and, in accordance with the placement of the comment, displays the comment on the terminal device of a user whose field of view includes the position at which the comment is placed.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015-162720 | 2015-08-20 | ||
| JP2015162720A JP2017041780A (ja) | 2015-08-20 | 2015-08-20 | 動画を特定の視野で表示するシステム、方法、及びプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017029918A1 true WO2017029918A1 (fr) | 2017-02-23 |
Family
ID=58050792
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/071040 Ceased WO2017029918A1 (fr) | 2015-08-20 | 2016-07-15 | Système, procédé et programme pour afficher une image animée avec un champ de vision spécifique |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2017041780A (fr) |
| WO (1) | WO2017029918A1 (fr) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018198946A1 (fr) * | 2017-04-28 | 2018-11-01 | 株式会社コナミデジタルエンタテインメント | Dispositif serveur et programme informatique destiné à être utilisé avec celui-ci |
| WO2020129115A1 (fr) * | 2018-12-17 | 2020-06-25 | 株式会社ソニー・インタラクティブエンタテインメント | Système de traitement d'informations, procédé de traitement d'informations et programme informatique |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7139681B2 (ja) * | 2018-05-14 | 2022-09-21 | 富士通株式会社 | 制御プログラム、制御方法、制御装置および制御サーバ |
| JP7356827B2 (ja) * | 2019-06-26 | 2023-10-05 | 株式会社コロプラ | プログラム、情報処理方法、及び情報処理装置 |
| JP7346983B2 (ja) * | 2019-07-31 | 2023-09-20 | 株式会社リコー | 表示端末、遠隔制御システム、表示制御方法およびプログラム |
| US11228737B2 (en) | 2019-07-31 | 2022-01-18 | Ricoh Company, Ltd. | Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium |
| CN116260986A (zh) * | 2021-12-10 | 2023-06-13 | 华为技术有限公司 | 自由视角视频的弹幕的显示方法、装置及系统 |
| DE112023001654T5 (de) | 2022-03-30 | 2025-01-16 | Sony Group Corporation | Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren und aufzeichnungsmedium |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014183380A (ja) * | 2013-03-18 | 2014-09-29 | Nintendo Co Ltd | 情報処理プログラム、情報処理装置、情報処理システム、パノラマ動画表示方法、および、制御データのデータ構造 |
| JP2015018013A (ja) * | 2013-07-08 | 2015-01-29 | 株式会社リコー | 表示制御装置、プログラム及び記録媒体 |
- 2015-08-20: JP JP2015162720A patent/JP2017041780A/ja active Pending
- 2016-07-15: WO PCT/JP2016/071040 patent/WO2017029918A1/fr not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014183380A (ja) * | 2013-03-18 | 2014-09-29 | Nintendo Co Ltd | 情報処理プログラム、情報処理装置、情報処理システム、パノラマ動画表示方法、および、制御データのデータ構造 |
| JP2015018013A (ja) * | 2013-07-08 | 2015-01-29 | 株式会社リコー | 表示制御装置、プログラム及び記録媒体 |
Non-Patent Citations (1)
| Title |
|---|
| "Dark Souls 2' kara ''Yurui Tsunagari'' no Enshutsu Yoso o Matomete Shokai! Gen'ei ya Kekkon ga Egaku Dokutoku no Online Play ga Sarani Kyoka", DENGEKI ONLINE, 18 November 2013 (2013-11-18), XP055365531, Retrieved from the Internet <URL:http://dengekionline.com/elem/000/000/753/753375> [retrieved on 20151112] * |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018198946A1 (fr) * | 2017-04-28 | 2018-11-01 | 株式会社コナミデジタルエンタテインメント | Dispositif serveur et programme informatique destiné à être utilisé avec celui-ci |
| JP2018191064A (ja) * | 2017-04-28 | 2018-11-29 | 株式会社コナミデジタルエンタテインメント | サーバ装置、及びそれに用いられるコンピュータプログラム |
| CN110574382A (zh) * | 2017-04-28 | 2019-12-13 | 科乐美数码娱乐株式会社 | 服务器装置以及在该服务器装置中使用的计算机程序 |
| US11273372B2 (en) | 2017-04-28 | 2022-03-15 | Konami Digital Entertainment Co., Ltd. | Server device and storage medium for use therewith |
| WO2020129115A1 (fr) * | 2018-12-17 | 2020-06-25 | 株式会社ソニー・インタラクティブエンタテインメント | Système de traitement d'informations, procédé de traitement d'informations et programme informatique |
| US11831854B2 (en) | 2018-12-17 | 2023-11-28 | Sony Interactive Entertainment Inc. | Information processing system, information processing method, and computer program |
| US12278936B2 (en) | 2018-12-17 | 2025-04-15 | Sony Interactive Entertainment Inc. | Information processing system, information processing method, and computer program |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2017041780A (ja) | 2017-02-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7589374B2 (ja) | 情報処理装置、情報処理方法およびコンピュータプログラム | |
| US12073234B2 (en) | Management framework for mixed reality devices | |
| WO2017029918A1 (fr) | Système, procédé et programme pour afficher une image animée avec un champ de vision spécifique | |
| CN105358227B (zh) | 分享三维游戏过程 | |
| TWI571130B (zh) | 體積式視訊呈現 | |
| JP6470356B2 (ja) | 仮想空間を提供するコンピュータで実行されるプログラム、方法、および当該プログラムを実行する情報処理装置 | |
| EP3396511A1 (fr) | Dispositif de traitement d'informations et procédé de réception d'opération | |
| JP7503122B2 (ja) | 位置に基づくゲームプレイコンパニオンアプリケーションへユーザの注目を向ける方法及びシステム | |
| JP6392945B1 (ja) | 仮想空間を提供するコンピュータで実行されるプログラム、方法、および当該プログラムを実行する情報処理装置 | |
| US9294670B2 (en) | Lenticular image capture | |
| JP6277329B2 (ja) | 立体広告枠決定システム、ユーザ端末および立体広告枠決定コンピュータ | |
| US20230073750A1 (en) | Augmented reality (ar) imprinting methods and systems | |
| KR20190088545A (ko) | 상호작용적 증강 현실 프리젠테이션들을 표시하기 위한 시스템들, 방법들 및 매체들 | |
| JP6932206B2 (ja) | 空間オーディオの提示のための装置および関連する方法 | |
| CN108604175A (zh) | 装置和关联方法 | |
| US20150213784A1 (en) | Motion-based lenticular image display | |
| WO2019060985A1 (fr) | Système et procédé en nuage permettant de créer une visite guidée virtuelle | |
| JP2020520576A5 (fr) | ||
| CN116233513A (zh) | 虚拟现实直播间虚拟礼物特效播放处理方法、装置和设备 | |
| JP2017041872A (ja) | 動画を特定の視野で表示するシステム、方法、及びプログラム | |
| JP6952065B2 (ja) | 仮想空間を提供するコンピュータで実行されるプログラム、方法、および当該プログラムを実行する情報処理装置 | |
| JP6921789B2 (ja) | 仮想空間を提供するコンピュータで実行されるプログラム、方法、および当該プログラムを実行する情報処理装置 | |
| CN116132702A (zh) | 虚拟现实直播间虚拟礼物的赠送处理方法、装置和设备 | |
| JP6974253B2 (ja) | 仮想空間を提供するための方法、当該方法をコンピュータに実行させるためのプログラム、および当該プログラムを実行するための情報処理装置 | |
| US20250148700A1 (en) | Information processing apparatus capable of preventing unwanted objects from entering into photographing range of virtual camera in xr space, control method for information processing apparatus, and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16836909 Country of ref document: EP Kind code of ref document: A1 |
| NENP | Non-entry into the national phase |
Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 16836909 Country of ref document: EP Kind code of ref document: A1 |