WO2019111348A1 - Video processing device, video processing method, computer program, and video processing system - Google Patents
Video processing device, video processing method, computer program, and video processing system
- Publication number: WO2019111348A1 (application PCT/JP2017/043803)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image information
- frame data
- timing
- time interval
- scene
- Prior art date
- Legal status (assumed, not a legal conclusion): Ceased
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/18—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with elements, i.e. platforms, having a circulating, nutating or rotating movement, generated by oscillating movement of the user, e.g. platforms wobbling on a centrally arranged spherical support
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/06—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of ships, boats, or other waterborne vehicles
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/93—Regeneration of the television signal or of selected parts thereof
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- The present invention relates to a video processing apparatus, a video processing method, a computer program, and a video processing system.
- Patent Document 1 proposes a training apparatus incorporating a video processing system. In this training apparatus, shooting frame data captured at every prescribed travel distance is displayed in conjunction with the distance traveled on the treadmill.
- However, the training device of Patent Document 1 assumes exercise that continues in one direction, such as running, and does not assume reciprocating motion of a specific part of the user's body. The training device of Patent Document 1 therefore has room for improvement in this respect.
- The present invention has been made in view of such circumstances, and an object thereof is to realize video processing suitable for reciprocating motion of a specific part by a user.
- The present invention is a video processing apparatus that processes video based on operation information output from an operation information output unit, the operation information changing periodically with reciprocating motion of a specific part of the user between a first position and a second position. The apparatus includes an image storage unit that stores a plurality of captured image information captured in time series, an image selection unit that selects the captured image information stored in the image storage unit, and a time interval setting unit that sets a time interval between the captured image information. The captured image information includes first captured image information corresponding to the first position and second captured image information corresponding to the second position. The apparatus further includes a prediction unit that predicts, based on the operation information, the movement time of the specific part between the first position and the second position. The image selection unit selects the first captured image information at a first timing at which the specific part is located at the first position, and selects the second captured image information at a second timing at which the specific part reaches the second position after the movement time predicted by the prediction unit has elapsed from the first timing. The time interval setting unit sets the time interval between the captured image information, from the first captured image information to the second captured image information, to a time interval based on the movement time predicted by the prediction unit.
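- To make the claimed arrangement easier to follow, the following is a minimal Python sketch (not the patent's implementation; all names are illustrative) mapping the claimed units onto one class:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    image: bytes        # payload of one captured frame
    marker: str = ""    # e.g. "B->T" or "T->B" at an end position

@dataclass
class VideoProcessor:
    frames: List[Frame] = field(default_factory=list)  # image storage unit
    interval: float = 1 / 30.0   # set by the time interval setting unit

    def predict_travel_time(self, past_times: List[float]) -> float:
        # Prediction unit: movement time between the first and second
        # positions, estimated from earlier same-direction strokes.
        return sum(past_times) / len(past_times)

    def select_span(self, first_idx: int, second_idx: int) -> List[Frame]:
        # Image selection unit: frames from the first captured image
        # (first position) up to the second captured image (second position).
        return self.frames[first_idx:second_idx + 1]

    def set_interval(self, predicted: float, n_frames: int) -> None:
        # Time interval setting unit: spread the intervening frames evenly
        # over the predicted travel time.
        self.interval = predicted / max(n_frames, 1)
```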
- FIG. 1 is a diagram showing a schematic configuration of a video display system according to an embodiment of the present invention.
- FIG. 2 is a diagram showing the appearance of an operation information output device.
- FIG. 3A is a diagram showing the overall structure of a rowing machine. FIG. 3B is a diagram showing the user's wrist of the rowing machine at a position away from the user's chest. FIG. 3C is a diagram showing the user's wrist of the rowing machine at a position close to the user's chest.
- FIG. 4A is a diagram explaining the hardware configuration of a control device. FIG. 4B is a diagram showing the various data stored in its storage unit. FIG. 4C is a diagram showing the functional blocks of its control unit.
- FIG. 5 is a diagram showing the hardware configuration of the operation information output device.
- FIG. 6A is a diagram showing the axes of a gyro sensor. FIG. 6B is a diagram showing the axes of an acceleration sensor. FIG. 6C is a diagram explaining normalization of the sensor output from the gyro sensor.
- FIG. 7 is a diagram showing the relationship between the normalized gyro sensor output and the position of the wrist.
- FIG. 8 is a diagram showing the switching of the introduction scene, the interlocked scene, and the finish scene.
- FIG. 9A is a diagram showing the types of shooting frame data. FIG. 9B is a diagram showing the types of audio data. FIG. 9C is a diagram showing the metadata added to the shooting frame data and the audio data. FIG. 9D is a diagram showing the relationship between the interlocked scene of each cycle and the round-trip frequency of the wrist.
- FIG. 10 is a diagram showing the relationship between the marker data in each cycle of the interlocked scene and the shooting frame data.
- FIG. 11A is a diagram showing the types of audio data. FIG. 11B is a diagram showing the generation procedure of reproduction audio data.
- FIG. 12A is a diagram showing reproduction of the reproduction audio data at standard time intervals. FIG. 12B is a diagram showing reproduction at a narrower time interval. FIG. 12C is a diagram showing reproduction at a wider time interval.
- FIG. 13A is a diagram showing the calculation procedure of the predicted time. FIG. 13B is a diagram showing the variation of the predicted time.
- FIG. 14 is a diagram showing the time intervals of the shooting frame data and the interval adjustment curve SP1. FIG. 15 is a diagram showing the time intervals adjusted by the interval adjustment curve SP1. FIG. 16 is a diagram showing the time intervals adjusted by another interval adjustment curve SP2.
- FIG. 17 is a diagram showing the generation of combined frame data F(mix1) to F(mix4). FIGS. 18A to 18D are diagrams showing the generation of the combined frame data F(mix1), F(mix2), F(mix3), and F(mix4), respectively.
- FIG. 19A is a diagram showing the calculation procedure of the predicted time for the second marker data. FIG. 19B is a diagram showing the variation of that predicted time.
- FIG. 20 is a diagram showing the time intervals of the shooting frame data and the interval adjustment curve SP3. FIG. 21 is a diagram showing the generation of combined frame data F(mix11) to F(mix14). FIGS. 22A to 22D are diagrams showing the generation of the combined frame data F(mix11), F(mix12), F(mix13), and F(mix14), respectively.
- FIG. 23A is a flowchart showing the signal monitoring process performed by the CPU of the control device. FIG. 23B is a flowchart showing the main processing performed by the CPU of the control device. A further flowchart shows the interlocked scene reproduction process performed by the CPU of the control device, and a further diagram shows an example in which shooting frame data with adjusted time intervals are used as display frame data.
- FIG. 1 is a diagram showing a schematic configuration of a video display system 1 according to an embodiment of the present invention.
- The video display system 1 shown in FIG. 1 includes: a control device 2 that stores video content (a plurality of shooting frame data shot in time series, and audio data to be played back along with them) and controls playback of the video content; an operation information output device 3 (operation information output unit) that is attached to the user, is communicably connected to the control device 2, and transmits and receives various signals to and from the control device 2; a display device 4 that is communicably connected to the control device 2 and displays images of the video content; and a sound output device 5 that is communicably connected to the control device 2 and reproduces the audio of the video content.
- The control device 2 functions as a video processing device that processes the video content output on the display device 4 and the sound output device 5, based on the sensor output (motion information, described later) from the motion information output device 3.
- As the control device 2, for example, a general personal computer can be used. The control device 2 will be described in detail later.
- FIG. 2 is a view showing the appearance of the operation information output device 3.
- The operation information output device 3 includes a housing 3a that houses components such as the control unit 31 and the gyro sensor 32 (see FIG. 5), and a band 3b that is attached to the housing 3a and wound around the user's wrist (specific part).
- On the surface of the housing 3a are provided a first mode switch 34 and a second mode switch 35 operated when switching the playback scene of the video content, a playback related switch 36 operated for various operations related to playback of the video content, and a menu switch 37 operated when displaying a menu screen related to the video content.
- The motion information output device 3 transmits to the control device 2 a sensor output whose signal level changes periodically as the user's wrist reciprocates.
- Display frame data is wirelessly transmitted and received between the display device 4 and the control device 2.
- the display frame data is read from the frame buffer 23 (see FIG. 4) included in the control device 2 and transmitted.
- The display device 4 displays an image according to the display frame data, specifically an image that can be viewed stereoscopically.
- the display device 4 is not limited to a head mounted display, and may be a liquid crystal display or a projector. Communication between the display device 4 and the control device 2 is not limited to wireless communication but may be wired communication.
- As the sound output device 5, headphones are preferably used.
- An acoustic signal based on audio data (described later) is wirelessly transmitted and received between the sound output device 5 and the control device 2.
- When a sound signal is input to the sound output device 5, the sound output device 5 outputs sound according to the signal.
- the sound output device 5 is not limited to headphones but may be a speaker.
- Communication between the sound output device 5 and the control device 2 is not limited to wireless communication but may be wired communication.
- FIG. 3A is a diagram showing an entire configuration of the rowing machine 6.
- FIG. 3B is a view showing the user's wrist of the rowing machine 6 located at a position away from the user's chest.
- FIG. 3C is a view showing the user's wrist of the rowing machine 6 located at a position close to the user's chest.
- The video display system 1 reproduces the video content in conjunction with the user's motion, for example while the user exercises on the rowing machine 6 shown in FIGS. 3A to 3C.
- The illustrated rowing machine 6 has: a prismatic frame 61; a seat member 62 movable along the frame 61, on which the user's buttocks are placed; a substantially T-shaped handle 63 rotatable about its proximal end relative to the frame, whose distal end is gripped by the user's hands; a damper 64 whose one end is rotatably attached to the frame 61 and whose other end is rotatably attached to the handle 63; and a footrest member 65 on which the user's soles are placed.
- At the time of training, the user of the rowing machine 6 wears the motion information output device 3 on the wrist (specific part), the display device 4 (head mounted display) on the head, and the sound output device 5 (headphones).
- Using the rowing machine 6, the user performs bending and stretching exercises between the state in which the body is bent as shown in FIG. 3B and the state in which the body is extended as shown in FIG. 3C.
- The user's wrist is positioned away from the user's chest when the body is bent, and close to the user's chest when the body is extended. That is, the user's wrist (specific part) reciprocates along the trajectory of the distal end of the handle 63 between the close position and the separated position, and the motion information output device 3 mounted on the wrist reciprocates with it.
- FIG. 4A is a diagram for explaining the hardware configuration of the control device 2.
- FIG. 4B is a view showing various data stored in the storage unit of the control device 2.
- The control device 2 includes a control unit 21 serving as the main body of various controls, a storage unit 22 storing various data, a frame buffer 23 storing display frame data corresponding to the image to be displayed on the display device 4, and communication interfaces (I/F) 24 for transmitting and receiving information to and from other devices.
- The control unit 21 includes a central processing unit (CPU) 21a that loads a program stored in the storage unit 22 into a random access memory (RAM) 21b and executes it.
- The storage unit 22 is, for example, a hard disk drive (HDD) or a solid state drive (SSD) and, as illustrated in FIG. 4B, stores various data such as the shooting frame data, the audio data, and the metadata.
- the frame buffer 23 is a volatile memory that stores display frame data for one screen displayed by the display device 4.
- the display frame data is generated by the control unit 21 based on the shooting frame data stored in the storage unit 22.
- the control unit 21 writes the generated display frame data in the frame buffer 23.
- the generation of display frame data by the control unit 21 will be described later.
- The communication interfaces 24 include a communication interface 24a that transmits and receives information to and from the operation information output device 3, a communication interface 24b that transmits and receives information to and from the display device 4, and a communication interface 24c that transmits and receives information to and from the sound output device 5.
- As the communication interfaces 24, interfaces conforming to a communication method or protocol such as Bluetooth (registered trademark) or Wireless USB (Wireless Universal Serial Bus) can be used. In the case of wired communication, interfaces conforming to a method such as RS-232C or USB (Universal Serial Bus) can be used.
- FIG. 4C is a diagram showing functional blocks of the control unit 21.
- The functional blocks shown in FIG. 4C are realized by the CPU 21a loading a program stored in the storage unit 22 into the RAM 21b and executing it.
- The image selection unit 21c selects shooting frame data stored in the storage unit 22 (image storage unit).
- The time interval setting unit 21d sets the time interval between the shooting frame data.
- The prediction unit 21e predicts the movement time of the user's wrist (specific part) between one (first position) and the other (second position) of the proximity position and the separated position, based on the sensor output (operation information) output from the operation information output device 3 (operation information output unit).
- The display image information generation unit 21f generates display frame data by performing alpha blending processing on pairs of shooting frame data adjacent to each other, in accordance with the timing defined by the display device 4.
- The photographed image information selection unit 21g selects the shooting frame data of one cycle from among the shooting frame data of a plurality of cycles (a plurality of sets of photographed image information), based on the periodic change of the sensor output (operation information).
- The communication control unit 21h controls the communication interfaces 24a to 24c to communicate with the operation information output device 3, the display device 4, and the sound output device 5.
- FIG. 5 is a diagram showing a hardware configuration of the operation information output device 3.
- In addition to the first mode switch 34, the second mode switch 35, the reproduction related switch 36, and the menu switch 37 described above, the operation information output device 3 includes a control unit 31 serving as the main body of various controls, a communication interface (I/F) 38 for transmitting and receiving information to and from the control device 2, a gyro sensor 32 for detecting angular velocity about reference axes, and an acceleration sensor 33 for detecting acceleration.
- The control unit 31 includes a CPU (Central Processing Unit) 31a that executes a program stored in the non-volatile memory 31b, and a volatile memory 31c used by the CPU 31a.
- FIG. 6A is a view showing the axes of the gyro sensor 32.
- FIG. 6B is a view showing the axes of the acceleration sensor 33.
- FIG. 6C is a view explaining normalization of the sensor output from the gyro sensor 32.
- the gyro sensor 32 detects angular velocities centered on the X axis, the Y axis, and the Z axis orthogonal to one another.
- the acceleration sensor 33 detects an acceleration for each of the X axis, the Y axis, and the Z axis orthogonal to each other.
- While the user exercises on the rowing machine 6 (see FIGS. 3B and 3C), the control unit 31 of the motion information output device 3 acquires, based on the angular velocity detected by the gyro sensor 32 and the acceleration detected by the acceleration sensor 33, a sensor output that changes periodically as the user's wrist reciprocates. For example, based on the output of the acceleration sensor 33, the control unit 31 acquires the inclination of the housing 3a with respect to the direction of gravity. Then, as shown in FIG. 6C, the control unit 31 selects from the three axes of angular velocity output by the gyro sensor 32 the two axes with the largest amount of change, and acquires the sensor output by normalizing the selected two-axis angular velocities to a single axis according to the inclination of the housing 3a.
- the control unit 31 transmits the acquired sensor output to the control device 2 via the communication interface 38.
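- As a rough illustration of this normalization (the exact axis-selection and weighting rules are not spelled out here, so everything below is an assumption), the two most active gyro axes can be projected onto a single motion axis using the tilt estimated from the accelerometer:

```python
import math
from typing import Sequence, Tuple

def housing_tilt(accel_xyz: Sequence[float]) -> float:
    # Tilt of the housing 3a with respect to the gravity direction,
    # estimated from the static acceleration vector (assumed convention).
    ax, ay, az = accel_xyz
    return math.atan2(ay, az)

def normalized_sensor_output(gyro_xyz: Sequence[float],
                             accel_xyz: Sequence[float],
                             active_axes: Tuple[int, int] = (0, 1)) -> float:
    # Combine the two gyro axes with the largest change (indices assumed to
    # have been selected beforehand) into one signal NS, weighted by the
    # housing tilt so that NS tracks the wrist's to-and-fro motion.
    tilt = housing_tilt(accel_xyz)
    w1, w2 = gyro_xyz[active_axes[0]], gyro_xyz[active_axes[1]]
    return w1 * math.cos(tilt) + w2 * math.sin(tilt)
```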
- FIG. 7 is a diagram showing the relationship between the sensor output NS from the gyro sensor 32 after normalization and the position of the wrist.
- The sensor output NS indicated by the alternate long and short dash line in FIG. 7 takes a positive (+) value when the wrist moves away from the chest during exercise on the rowing machine 6, and a negative (-) value when the wrist moves toward the chest.
- The sensor output NS takes the value "0" when the movement of the wrist stops.
- Regarding the position PW of the wrist shown by the solid line in FIG. 7, when the sensor output NS changes from a negative value to "0", the wrist can be said to be at the proximity position B closest to the chest (see FIG. 3C); when NS changes from a positive value to "0", the wrist can be said to be at the separated position T farthest from the chest (see FIG. 3B).
- Accordingly, based on the sensor output NS, the control unit 21 of the control device 2 can obtain the position of the wrist (specific part) during exercise on the rowing machine 6 (during reciprocation of the wrist).
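- The sign convention above suggests a simple turnaround detector; the sketch below (an assumption, not the patent's code) reports an end position whenever NS crosses zero:

```python
from typing import Optional

def detect_turnaround(prev_ns: float, ns: float) -> Optional[str]:
    # NS > 0: wrist moving away from the chest; NS < 0: moving toward it.
    # A zero crossing therefore marks arrival at an end position.
    if prev_ns < 0 <= ns:
        return "B"  # proximity position: approaching motion just ended
    if prev_ns > 0 >= ns:
        return "T"  # separated position: receding motion just ended
    return None

# e.g. detect_turnaround(-0.4, 0.1) returns "B"
```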
- FIG. 8 is a diagram showing the switching of the introductory scene, the interlocked scene, and the finish scene included in the video content.
- Reproduction of the video content is started by the user's playback instruction. That is, when the reproduction related switch 36 is operated, the control unit 21 of the control device 2 starts the reproduction of the video content based on the output operation information.
- the video content includes an introduction scene, an interlocking scene, and a finish scene.
- the introduction scene is a prologue scene.
- the interlocked scene is a scene to be reproduced interlockingly with the user's action (reciprocal movement of the wrist).
- the finish scene is an ending scene.
- the introduction scene is a scene from before the race to just before the start.
- the interlocked scene is a scene from the start of the competition to the front of the goal, and the finish scene is a scene from the front of the goal to the rear of the goal.
- the reproduction of the video content is started from the introduction scene.
- The reproduction of the introduction scene is performed regardless of the sensor output of the operation information output device 3. That is, the control unit 21 of the control device 2 reproduces the shooting frame data at the predetermined interval dt1 (see FIG. 10) and reproduces the audio data, regardless of the sensor output of the operation information output device 3.
- the control device 2 receives the sensor output from the operation information output device 3 and controls the reproduction of the interlocked scene based on the sensor output (operation information).
- When the interlocked scene ends, the control unit 21 of the control device 2 starts reproduction of the finish scene.
- the reproduction of the finish scene is performed regardless of the sensor output of the motion information output device 3, and the reproduction of the video content ends with the completion of the reproduction of the finish scene.
- When reproducing the interlocked scene, the video display system 1 acquires the sensor output (the normalized sensor output of the gyro sensor 32) from the motion information output device 3 and controls the playback of the interlocked scene according to the acquired sensor output. In other words, the video display system 1 controls the reproduction of the video content in accordance with the cycle of the reciprocation of the user's wrist on the rowing machine 6.
- Five types of interlocked scenes, from the first cycle to the fifth cycle, are prepared. Immediately after switching to the interlocked scene, the interlocked scene of a specific cycle (for example, the first cycle) is reproduced. Thereafter, in accordance with the periodic change of the sensor output, playback is switched sequentially to the interlocked scene of the appropriate cycle among the interlocked scenes of the first to fifth cycles. Further, even during the reproduction of the interlocked scene of one cycle, the reproduction is controlled in accordance with the reciprocating motion of the user's wrist. The details are described below.
- FIG. 9A is a view showing the contents of shooting frame data.
- The shooting frame data include three types: the shooting frame data of the introduction scene, the shooting frame data of the interlocked scene, and the shooting frame data of the finish scene.
- In the introduction scene and the finish scene, the video content is reproduced regardless of the sensor output from the motion information output device 3. In the interlocked scene, the reproduction of the video content is controlled in conjunction with the sensor output from the operation information output device 3.
- the video display system 1 of the present embodiment includes five types from the interlocked scene of the first period to the interlocked scene of the fifth period, and switches the interlocked scene to be reproduced according to the cycle of the reciprocation of the wrist.
- the interlocking scene of the first cycle is selected when the cycle of the reciprocating movement of the wrist is the longest
- the interlocking scene of the fifth cycle is selected when the cycle of the reciprocating movement of the wrist is the shortest.
- For example, the interlocked scene of the first cycle is selected when the cycle of rowing the oar is the longest (the boat traveling speed is the slowest), and the interlocked scene of the fifth cycle is selected when the cycle of rowing the oar is the shortest (the boat traveling speed is the fastest).
- the interlocked scene in the second period to the interlocked scene in the fourth period are also selected according to the cycle length of the reciprocating movement of the wrist.
- The interlocked scene of the first cycle corresponds to the traveling scene of the boat when the section from the start to just before the goal is rowed at the longest stroke cycle. The interlocked scene of the second cycle corresponds to the traveling scene when the same section is rowed at the second longest stroke cycle, the interlocked scene of the third cycle to the traveling scene at the third longest stroke cycle, the interlocked scene of the fourth cycle to the traveling scene at the second shortest stroke cycle, and the interlocked scene of the fifth cycle to the traveling scene at the shortest stroke cycle.
- That is, the interlocked scene of each cycle is a scene captured with the traveling speed of the boat changed for each cycle over the section from the start to just before the goal.
- The interlocked scenes of the cycles have the same content but different traveling speeds. In the example shown in FIG. 9A, the code F(S) indicates the plurality of shooting frame data belonging to the introduction scene, the code F(E) indicates the plurality of shooting frame data belonging to the finish scene, and the codes F(F1) to F(F5) respectively indicate the plurality of shooting frame data belonging to the interlocked scenes of the first to fifth cycles.
- FIG. 9 (b) is a diagram showing the type of audio data.
- Like the shooting frame data, the audio data comprise three types of scenes: the introduction scene, the interlocked scene, and the finish scene. Each audio data is paired with the corresponding shooting frame data. Accordingly, the audio data of the interlocked scene include five types, from the interlocked scene of the first cycle to the interlocked scene of the fifth cycle.
- the audio data may be in any format as long as it can be handled by the control device 2. For example, it may be uncompressed PCM (Pulse Code Modulation) format or compressed MP3 (MPEG-1 Audio Layer-3) format.
- FIG. 9C is a diagram showing metadata added to shooting frame data and audio data.
- FIG. 9D is a diagram showing the relationship between the linked scene in each cycle and the round-trip frequency of the wrist.
- The metadata include: the range of the round-trip frequency of the wrist corresponding to each cycle (first to fifth cycle) of the interlocked scene; the marker data added to specific shooting frame data among the plurality of shooting frame data F(F1) to F(F5) belonging to the interlocked scene of each cycle; and the start time data indicating the start time of each of the plurality of reproduction audio data.
- the reciprocating frequency of the wrist is the number of times the reciprocating motion of the wrist is repeated per unit time (for example, one minute).
- the control unit 21 of the control device 2 acquires the sensor output from the operation information output device 3.
- The range of the round-trip frequency of the wrist is set as follows: lower than the frequency H2 for the interlocked scene of the first cycle; higher than the frequency H1 and at most the frequency H4 for the interlocked scene of the second cycle; higher than the frequency H3 and lower than the frequency H6 for the interlocked scene of the third cycle; higher than the frequency H5 and lower than the frequency H8 for the interlocked scene of the fourth cycle; and at least the frequency H7 for the interlocked scene of the fifth cycle.
- Here, the upper limit frequency H2 of the interlocked scene of the first cycle is higher than the lower limit frequency H1 of the interlocked scene of the second cycle, and the lower limit frequency H3 of the interlocked scene of the third cycle is lower than the upper limit frequency H4 of the interlocked scene of the second cycle. Likewise, the upper limit frequency H6 of the interlocked scene of the third cycle is higher than the lower limit frequency H5 of the interlocked scene of the fourth cycle, and the lower limit frequency H7 of the interlocked scene of the fifth cycle is lower than the upper limit frequency H8 of the interlocked scene of the fourth cycle. The ranges of adjacent cycles therefore overlap.
- the marker data shown in FIG. 9C is added to specific imaging frame data. Specifically, the marker data is added to the imaging frame data that is the basis of the image displayed on the display device 4 at the timing when the user's wrist reaches the proximity position B or the separation position T. Therefore, at the timing when the user's wrist reaches the proximity position B or the separation position T, the reproduction of the video content is controlled so that the image is displayed based on the imaging frame data to which the marker data is added.
- For example, marker data B→T is added to the shooting frame data F(F1M1), and marker data T→B is added to the shooting frame data F(F1M2).
- Here, the shooting frame data F(F1) are the plurality of shooting frame data belonging to the interlocked scene of the first cycle, and the code M1 indicates the first marker data. The shooting frame data F(F1M1) therefore denotes the shooting frame data to which the first marker data is added, among the plurality of shooting frame data belonging to the interlocked scene of the first cycle. Similarly, the shooting frame data F(F1M2) to F(F1M4) are the shooting frame data to which the second to fourth marker data are added.
- The shooting frame data F(F1M1) to F(F1M4), to which the respective marker data (B→T, T→B) are added, are the shooting frame data selected when the user's wrist reaches the proximity position B or the separated position T, among the plurality of shooting frame data belonging to the interlocked scene of the first cycle.
- FIG. 10 is a diagram showing the relationship between marker data and shooting frame data in each cycle (first to fifth cycles) of the interlocked scene.
- The plurality of shooting frame data belonging to the interlocked scene of the first cycle are shot at time intervals dt1, and marker data (B→T, T→B) are added to the shooting frame data F(F1M1) to F(F1M5).
- Similarly, the plurality of shooting frame data belonging to the interlocked scenes of the second to fifth cycles are shot at time intervals dt1, and marker data (B→T, T→B) are added to the shooting frame data F(F2M1) to F(F2M5), F(F3M1) to F(F3M5), F(F4M1) to F(F4M5), and F(F5M1) to F(F5M5), respectively.
- The time length TW of the interlocked scene is common to all cycles.
- the linked scenes in each cycle have the same content.
- The marker data (B→T, T→B) are added to the shooting frame data at the timings when the user's wrist reaches the proximity position B or the separated position T. Therefore, if the ordinal numbers appended after the code M are the same, the contents of the shooting frame data are common even if the cycles differ.
- For example, the shooting frame data F(F1M1) belonging to the interlocked scene of the first cycle, to which the first marker data (B→T) is added, has the same content as the shooting frame data F(F2M1) belonging to the interlocked scene of the second cycle, to which the first marker data (B→T) is added. Likewise, the content of the shooting frame data F(F2M1) is common to the shooting frame data F(F3M1) belonging to the interlocked scene of the third cycle.
- Similarly, the shooting frame data F(F1M2) belonging to the interlocked scene of the first cycle, to which the second marker data (T→B) is added, has content common to the shooting frame data F(F3M2) belonging to the interlocked scene of the third cycle and the shooting frame data F(F5M2) belonging to the interlocked scene of the fifth cycle.
- switching of the cycle of the interlocked scene is performed between imaging frame data having the same ordinal number appended after the code M.
- The interlocked scenes have the same content in each cycle, but the traveling speed differs for each cycle.
- The shooting-time interval between successive shooting frame data is the common interval dt1 in the interlocked scene of every cycle. Therefore, the number of shooting frame data included between the shooting frame data to which one marker data (B→T or T→B) is added and the shooting frame data to which the next marker data (T→B or B→T) is added differs for each cycle of the interlocked scene.
- For example, the number N11 of shooting frame data F included between the shooting frame data F(F1M1) and the shooting frame data F(F1M2) in the interlocked scene of the first cycle is larger than the number N21 of shooting frame data F included between the shooting frame data F(F2M1) and the shooting frame data F(F2M2) in the interlocked scene of the second cycle.
- Likewise, the number N32 of shooting frame data F included between the shooting frame data F(F3M2) and the shooting frame data F(F3M3) in the interlocked scene of the third cycle is larger than the number N42 of shooting frame data F included between the shooting frame data F(F4M2) and the shooting frame data F(F4M3) in the interlocked scene of the fourth cycle.
- In other words, the interlocked scenes of the respective cycles contain mutually different numbers of shooting frame data between the shooting frame data to which the marker data (B→T) is added (one of the first captured image information and the second captured image information) and the shooting frame data to which the marker data (T→B) is added (the other of the first captured image information and the second captured image information).
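- A quick worked example of why these counts differ, using assumed numbers (neither dt1 nor the stroke times are given in the text):

```python
dt1 = 1 / 30.0  # common shooting interval between successive frames (assumed)

# Half-stroke duration of the filmed rowing motion per cycle (assumed values).
for cycle, half_stroke_s in [("1st (slowest)", 2.0), ("5th (fastest)", 0.8)]:
    n = round(half_stroke_s / dt1)  # frames between two marker frames
    print(f"{cycle} cycle: about {n} frames per half stroke")
# -> about 60 frames for the 1st cycle, about 24 for the 5th
```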
- The control unit 21 of the control device 2 performs sound reproduction control based on the reproduction audio data.
- FIG. 11A shows the type of audio data.
- FIG. 11 (b) is a diagram showing a generation procedure of reproduction audio data.
- The storage unit 22 (see FIG. 4A) of the control device 2 stores the audio data MF1-ALL to MF5-ALL of the interlocked scenes of the respective cycles over the time length TW, as shown in FIG. 11A.
- The control unit 21 of the control device 2 refers to the combinations of audio file and start time in the metadata shown in FIG. 9C, and generates reproduction audio data on condition that the elapsed time from the timing of switching to the interlocked scene has reached the start time. For example, the control unit 21 generates the reproduction audio data MF1-001 on condition that the elapsed time from the switching timing has reached the start time Mt1-001, and generates the reproduction audio data MF1-002 on condition that the elapsed time has reached the start time Mt1-002.
- The case of generating the reproduction audio data MF1-XXX (XXX is a three-digit natural number; the same applies hereinafter) reproduced in the interlocked scene of the first cycle shown in FIG. 11B through the reproduction audio data MF5-XXX reproduced in the interlocked scene of the fifth cycle will be described as an example.
- On condition that the elapsed time from the timing of switching to the interlocked scene has reached the start time Mt1-XXX, the control unit 21 of the control device 2 acquires audio data MF1-XXX' by copying the portion of the prescribed time width Mtw starting at the start time Mt1-XXX from the audio data MF1-ALL.
- The prescribed time width Mtw may be, for example, 0.5 to 1.5 seconds, but is not limited to this range.
- The control unit 21 then applies fade-in processing and fade-out processing to the copied audio data MF1-XXX' to generate the reproduction audio data MF1-XXX.
- The generation of the reproduction audio data MF1-XXX has been described above; the reproduction audio data MF2-XXX to MF5-XXX of the other cycles can be generated by the same procedure.
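- A minimal sketch of this clip generation with NumPy, assuming mono float samples (the fade length is an illustrative parameter):

```python
import numpy as np

def make_reproduction_clip(track: np.ndarray, sr: int, start_s: float,
                           mtw_s: float = 1.0, fade_s: float = 0.1) -> np.ndarray:
    # Copy the prescribed window Mtw out of the scene-long track MFx-ALL,
    # then apply a linear fade-in and fade-out to the copy.
    start = int(start_s * sr)
    clip = track[start:start + int(mtw_s * sr)].astype(np.float32).copy()
    n = min(int(fade_s * sr), len(clip) // 2)
    ramp = np.linspace(0.0, 1.0, n, dtype=np.float32)
    clip[:n] *= ramp                     # fade-in
    clip[len(clip) - n:] *= ramp[::-1]   # fade-out
    return clip
```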
- FIG. 12A is a diagram showing the reproduction of reproduction audio data at the standard time interval. As shown in FIG. 12A, suppose that the interlocked scene of the first cycle is selected, that the reproduction audio data MF1-001 starts playing at the timing when display based on the shooting frame data F(F1L1) is performed, and that the reproduction audio data MF1-002 starts playing at the timing when display based on the shooting frame data F(F1L2) is performed. In this case, the fade-out of the reproduction audio data MF1-001 and the fade-in of the reproduction audio data MF1-002 overlap, so the reproduction audio data MF1-001 and MF1-002 are played at an appropriate time interval.
- FIG. 12B is a diagram showing the reproduction of the audio data for reproduction at a time interval narrower than the standard time interval.
- Even if the reproduction of the reproduction audio data MF1-002 starts earlier than the standard timing, the fade-in of the reproduction audio data MF1-002 suppresses any discomfort at its playback start. Likewise, the fade-out of the reproduction audio data MF1-001 suppresses any discomfort at the end of its reproduction.
- FIG. 12C is a diagram showing the reproduction of the reproduction audio data at a time interval wider than the standard time interval. In the case of FIG. 12C as well, the fade-out and fade-in suppress any discomfort between the clips.
- In order to interlock the reciprocating motion of the specific part with the captured images in the interlocked scene, the control unit 21 calculates, based on the sensor output, the movement time taken for the reciprocation of the specific part as a predicted time.
- This predicted time may be calculated for each one-way movement based on the sensor output for one direction of the reciprocation, or may be calculated for each moving direction based on the sensor output for both directions of the reciprocation. In the latter case, the predicted time is the average movement time of the reciprocating motion. In this embodiment, the case of calculating the predicted time for one direction is described.
- Specifically, in correspondence with the interlocked scene, the control unit 21 calculates the movement time (predicted time) taken for a one-way movement of the specific part in the reciprocation, based on the sensor output (operation information) output from the operation information output device 3 (operation information output unit) for the same-direction movements performed multiple times before this one-way movement (described in detail below).
- The control unit 21 (time interval setting unit 21d) sets the time interval from the shooting frame data (first captured image information) to which one marker data (one of B→T and T→B) is added to the shooting frame data (second captured image information) to which the next marker data (the other of B→T and T→B) is added, to a time interval based on the above-mentioned predicted time.
- The control unit 21 (display image information generation unit 21f) generates display image information by performing alpha blending processing on pairs of shooting frame data adjacent to each other, in accordance with the timing defined by the display device 4.
- FIGS. 13 to 18 show the procedure of generating display image information corresponding to the first marker data F(F1M1) in the interlocked scene of the first cycle. Although the first marker data F(F1M1) is described below, it is merely an example; the same procedure is performed for odd-numbered marker data such as the third marker data F(F1M3) and the fifth marker data F(F1M5).
- FIG. 13A is a diagram showing the calculation procedure of the predicted time.
- The predicted time TS5' shown in FIG. 13A is the predicted time from the latest sensor output acquisition timing ts5 until the sensor output corresponding to the shooting frame data F(F1M1) is acquired.
- That is, the predicted time TS5' is the movement time, predicted at the timing ts5, until the user's wrist moves from the separated position T to the proximity position B.
- The user's wrist is located at the separated position T at timings ts1, ts3, and ts5, and at the proximity position B at timings ts2 and ts4.
- The control unit 21 of the control device 2 recognizes the respective timings ts1 to ts5 based on the sensor output from the operation information output device 3.
- A preparation period is set immediately after switching to the interlocked scene. During the preparation period, reproduction of the video content interlocked with the user's motion is not performed, and the shooting frame data are displayed at the predetermined interval dt1. The preparation period continues, for example, for a predetermined time length or until predetermined shooting frame data is displayed. During this period, preparation for the interlocking control is performed.
- In the preparation period, the control unit 21 acquires the movement time TS1 required to move from the separated position T to the proximity position B at timing ts2, and the movement time TS2 required to move from the proximity position B to the separated position T at timing ts3. Similarly, the control unit 21 acquires the movement time TS3 required to move from the separated position T to the proximity position B at timing ts4, and the movement time TS4 required to move from the proximity position B to the separated position T at timing ts5.
- The control unit 21 of the control device 2 then obtains the predicted time TS5'.
- The predicted time TS5' is the predicted time for the one-way movement from the separated position T to the proximity position B, among the one-way movements in which the user's wrist moves from one of the proximity position B and the separated position T toward the other.
- The control unit 21 acquires the predicted time TS5' based on the one-way movements from the separated position T to the proximity position B performed before the timing ts5 at which the latest sensor output was acquired. Specifically, the control unit 21 obtains the average of the time TS1 taken from timing ts1 to timing ts2 and the time TS3 taken from timing ts3 to timing ts4 as the predicted time TS5'.
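- In code, this prediction is simply the mean of the stored same-direction stroke times; a sketch with assumed values for TS1 and TS3:

```python
from collections import deque

# Travel times of the earlier T-to-B strokes (TS1, TS3); values assumed.
t_to_b_history = deque([1.9, 2.1], maxlen=4)  # seconds

def predicted_time(history: deque) -> float:
    # TS5' = average of the previous same-direction one-way travel times.
    return sum(history) / len(history)

print(predicted_time(t_to_b_history))  # -> 2.0
```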
- FIG. 13B is a diagram showing the variation of the predicted time TS5'. Because the timings ts1 to ts5 are acquired along with the reciprocating motion of the user's wrist, the predicted time TS5' may vary. With this variation, the timing ts6' at which the predicted time TS5' has elapsed from the timing ts5 may also vary, as in the timings ts6a' to ts6c'.
- Accordingly, the time interval between the plurality of shooting frame data from the shooting frame data F(F1M0) displayed at the timing ts5 to the shooting frame data F(F1M1) may deviate from the time interval dt1 between the shooting frame data in the preparation period. Depending on the timing at which the wrist actually reaches the proximity position B, the time interval between the shooting frame data from F(F1M0) to F(F1M1) may be equal to the time interval dt1 of the preparation period, shorter than dt1, or longer than dt1.
- FIG. 14 is a diagram showing the time intervals of the shooting frame data and the interval adjustment curve SP1 over the predicted time TS5'.
- The plurality of shooting frame data from the shooting frame data F(F1M0) to the shooting frame data F(F1M1) are defined at the equal time interval dt2.
- The time interval between the shooting frame data is adjusted by the interval adjustment curve SP1 so that the image is displayed smoothly at the control switching timing (for example, timing ts5).
- The interval adjustment curve SP1 is acquired by obtaining the frame rate (the number of frame data per unit time) at the timing at which the latest sensor output was acquired and at the timing defined by the predicted time, and performing spline interpolation (for example, cubic spline interpolation) between them.
- Specifically, the control unit 21 of the control device 2 calculates the effective frame rate V0 based on the number of shooting frame data actually displayed by the timing ts5 and the elapsed time until the timing ts5. Further, the control unit 21 calculates the frame rate V1' based on the number of shooting frame data included from the shooting frame data F(F1M0) to the shooting frame data F(F1M1) and the predicted time from the timing ts5 to the timing ts6'. The control unit 21 then spline-interpolates between the effective frame rate V0 and the frame rate V1' to acquire the interval adjustment curve SP1.
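- As a simplified stand-in for this adjustment (a cubic ease between the two frame rates instead of a full spline fit; all values assumed), the blended rate can be converted back into per-frame display intervals:

```python
import numpy as np

def interval_adjustment_curve(v0: float, v1: float, n_frames: int) -> np.ndarray:
    # Blend smoothly from the effective frame rate V0 (up to timing ts5)
    # to the target rate V1' (frames remaining / predicted time) with a
    # cubic ease, then turn each instantaneous rate into a display interval.
    t = np.linspace(0.0, 1.0, n_frames)
    ease = 3 * t**2 - 2 * t**3        # zero slope at both ends
    rates = v0 + (v1 - v0) * ease     # frames per second along the span
    return 1.0 / rates                # stepwise time intervals (seconds)

# e.g. ease from 30 fps to 36 fps across the 60 frames up to F(F1M1)
intervals = interval_adjustment_curve(30.0, 36.0, 60)
```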
- FIG. 15 is a diagram showing the time intervals between shooting frame data adjusted by the interval adjustment curve SP1.
- For example, the interval between the shooting frame data F(F1M0+2) and F(F1M0+3) is adjusted from the time interval dt2 to the time interval dt2a. Likewise, the interval between the shooting frame data F(F1M0+n) and F(F1M0+m) is adjusted from dt2 to dt2b, and the interval between the shooting frame data F(F1M1-3) and F(F1M1-2) is adjusted from dt2 to dt2c.
- In this way, the interval adjustment curve SP1 adjusts the time interval between the shooting frame data stepwise toward the shooting frame data F(F1M1).
- FIG. 16 is a diagram showing the time intervals between shooting frame data adjusted by another interval adjustment curve SP2.
- For example, the interval between the shooting frame data F(F1M0+2) and F(F1M0+3) is adjusted from the time interval dt2 to the time interval dt2d. Likewise, the interval between the shooting frame data F(F1M0+n) and F(F1M0+m) is adjusted from dt2 to dt2e, and the interval between the shooting frame data F(F1M1-3) and F(F1M1-2) is adjusted from dt2 to dt2f.
- In this way, the interval adjustment curve SP2 adjusts the time interval between the shooting frame data stepwise toward the shooting frame data F(F1M1).
- FIG. 17 is a diagram showing the generation of combined frame data F(mix1) to F(mix4). FIGS. 18A to 18D are diagrams showing the generation of the combined frame data F(mix1), F(mix2), F(mix3), and F(mix4), respectively.
- the display device 4 performs refresh at refresh timings Rf1 to Rf4.
- the refresh timings Rf1 to Rf4 arrive at fixed time intervals dRf.
- the time interval dRf is defined by the refresh rate.
- As shown in FIG. 18A, the control unit 21 of the control device 2 combines the shooting frame data F(F1M0) before the refresh timing Rf1 and the shooting frame data F(F1M0+1) after the refresh timing Rf1 into the combined frame data F(mix1) by alpha blending processing, using as coefficients the division ratio of the time interval between the shooting frame data at the refresh timing Rf1. That is, with δ1 denoting the time from the display timing of the shooting frame data F(F1M0) to the refresh timing Rf1, the data are combined so that the ratio of the shooting frame data F(F1M0) is [(dt2 - δ1)/dt2] and the ratio of the shooting frame data F(F1M0+1) is [δ1/dt2].
- Similarly, as shown in FIG. 18B, the control unit 21 combines the shooting frame data F(F1M0) before the refresh timing Rf2 and the shooting frame data F(F1M0+1) after the refresh timing Rf2 into the combined frame data F(mix2) by alpha blending processing, using the division ratio at the refresh timing Rf2 as coefficients.
- As shown in FIG. 18C, the control unit 21 combines the shooting frame data F(F1M0+1) before the refresh timing Rf3 and the shooting frame data F(F1M0+2) after the refresh timing Rf3 into the combined frame data F(mix3) by alpha blending processing, using the division ratio at the refresh timing Rf3 as coefficients.
- As shown in FIG. 18D, the control unit 21 combines the shooting frame data F(F1M0+1) before the refresh timing Rf4 and the shooting frame data F(F1M0+2) after the refresh timing Rf4 into the combined frame data F(mix4) by alpha blending processing, using the division ratio at the refresh timing Rf4 as coefficients.
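- A sketch of one such blend (the coefficient form is reconstructed from the ratios above; array frames and the variable names are assumptions):

```python
import numpy as np

def blend_at_refresh(frame_a: np.ndarray, frame_b: np.ndarray,
                     delta: float, dt: float) -> np.ndarray:
    # frame_a precedes the refresh timing and frame_b follows it; delta is
    # the time from frame_a's display timing to the refresh timing. The
    # weights [(dt - delta)/dt] and [delta/dt] sum to 1, as alpha blending
    # requires.
    alpha = min(max(delta / dt, 0.0), 1.0)
    mixed = (1.0 - alpha) * frame_a.astype(np.float32) \
            + alpha * frame_b.astype(np.float32)
    return mixed.astype(frame_a.dtype)
```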
- FIGS. 19 to 22 show the procedure of generating display image information corresponding to the second marker data F(F1M2) in the interlocked scene of the first cycle. Although the second marker data F(F1M2) is described below, it is merely an example; the same procedure is performed for even-numbered marker data such as the fourth marker data F(F1M4) and the sixth marker data F(F1M6).
- FIG. 19A is a diagram showing a calculation procedure of the prediction time.
- The predicted time TS6' shown in FIG. 19A is the predicted time from the latest sensor output acquisition timing ts6 until the sensor output corresponding to the shooting frame data F(F1M2) is acquired.
- Following the example described in FIG. 13, the user's wrist is located at the proximity position B at the timing ts6.
- the control unit 21 of the control device 2 recognizes the timing ts6 based on the sensor output from the operation information output device 3.
- The control unit 21 of the control device 2 acquires the predicted time TS6'.
- The predicted time TS6' is the predicted time for the one-way movement from the proximity position B to the separated position T, among the one-way movements in which the user's wrist moves from one of the proximity position B and the separated position T toward the other.
- The control unit 21 acquires the predicted time TS6' based on the one-way movements from the proximity position B to the separated position T performed before the timing ts6 at which the latest sensor output was acquired. Specifically, the control unit 21 obtains the average of the time TS2 taken from timing ts2 to timing ts3 and the time TS4 taken from timing ts4 to timing ts5 as the predicted time TS6'.
- FIG. 19B is a diagram showing the variation of the predicted time. As shown in FIG. 19B, the predicted time TS6' may vary. With this variation, the timing ts7' at which the predicted time TS6' has elapsed from the timing ts6 may also vary, as in the timings ts7a' to ts7c'.
- FIG. 20 is a diagram showing the time intervals of the shooting frame data and the interval adjustment curve SP3 over the predicted time TS6'.
- The plurality of shooting frame data from the shooting frame data F(F1M1) to the shooting frame data F(F1M2) are defined at the equal time interval dt3.
- The time interval between the shooting frame data is adjusted by the interval adjustment curve SP3 so that the image is displayed smoothly at the control switching timing (for example, timing ts6).
- The interval adjustment curve SP3 is created by spline interpolation (for example, cubic spline interpolation) between the effective frame rate V1 and the frame rate V2', and is equivalent to the interval adjustment curve SP1 described above.
- The adjustment of the time intervals between the shooting frame data using the interval adjustment curve SP3 is performed in the same manner as the adjustment using the interval adjustment curve SP1 described above, so its description is omitted.
- FIG. 21 is a diagram showing the generation of combined frame data F(mix11) to F(mix14). FIGS. 22A to 22D are diagrams showing the generation of the combined frame data F(mix11), F(mix12), F(mix13), and F(mix14), respectively.
- the display device 4 performs refresh at refresh timings Rf11 to Rf14.
- the refresh timings Rf11 to Rf14 arrive at fixed time intervals dRf defined by the refresh rate.
- the control unit 21 of the control device 2 uses the photographed frame data F(F1M1) before the refresh timing Rf11 and the photographed frame data F(F1M1+1) after the refresh timing Rf11 to generate the combined frame data F(mix11).
- the combined frame data F(mix11) is combined by alpha blending processing using the division ratio of the time interval between the shooting frame data at the refresh timing Rf11 as the coefficient. As shown in FIG. 22A, with δ denoting the elapsed time from the shooting frame data F(F1M1) to the refresh timing Rf11, the frames are combined so that the ratio of the shooting frame data F(F1M1) is [1 − (δ/dt3)] and the ratio of the shooting frame data F(F1M1+1) is [δ/dt3].
- the control unit 21 of the control device 2 uses the imaging frame data F(F1M1) before the refresh timing Rf12 and the imaging frame data F(F1M1+1) after the refresh timing Rf12 to generate the combined frame data F(mix12).
- the combined frame data F(mix12) is combined by alpha blending processing using the division ratio of the time interval between the captured frame data at the refresh timing Rf12 as the coefficient. As shown in FIG. 22B, with δ denoting the elapsed time from the shooting frame data F(F1M1) to the refresh timing Rf12, the frames are combined so that the ratio of the shooting frame data F(F1M1) is [1 − (δ/dt3)] and the ratio of the shooting frame data F(F1M1+1) is [δ/dt3].
- the control unit 21 of the control device 2 uses the shooting frame data F(F1M1) before the refresh timing Rf13 and the shooting frame data F(F1M1+1) after the refresh timing Rf13 to generate the combined frame data F(mix13).
- the combined frame data F(mix13) is combined by alpha blending processing using the division ratio of the time interval between the shooting frame data at the refresh timing Rf13 as the coefficient. As shown in FIG. 22C, with δ denoting the elapsed time from the shooting frame data F(F1M1) to the refresh timing Rf13, the frames are combined so that the ratio of the shooting frame data F(F1M1) is [1 − (δ/dt3)] and the ratio of the shooting frame data F(F1M1+1) is [δ/dt3].
- the control unit 21 of the control device 2 uses the shooting frame data F(F1M1+1) before the refresh timing Rf14 and the shooting frame data F(F1M1+2) after the refresh timing Rf14 to generate the combined frame data F(mix14).
- the combined frame data F(mix14) is combined by alpha blending processing using the division ratio of the time interval between the shooting frame data at the refresh timing Rf14 as the coefficient. As shown in FIG. 22D, with δ denoting the elapsed time from the shooting frame data F(F1M1+1) to the refresh timing Rf14, the frames are combined so that the ratio of the shooting frame data F(F1M1+1) is [1 − (δ/dt3)] and the ratio of the shooting frame data F(F1M1+2) is [δ/dt3].
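- the blending arithmetic above can be sketched as follows (assuming 8-bit RGB frames held in NumPy arrays; `blend_frames`, `delta`, and the dummy frames are illustrative names, not taken from the patent):

```python
import numpy as np

def blend_frames(frame_a, frame_b, delta, dt):
    """Alpha-blend frame_a (earlier) and frame_b (later) using the division
    ratio of the inter-frame interval dt at the refresh timing: frame_a is
    weighted 1 - delta/dt and frame_b is weighted delta/dt."""
    alpha = delta / dt  # 0.0 at frame_a's timing, 1.0 at frame_b's timing
    mixed = (1.0 - alpha) * frame_a.astype(np.float32) \
            + alpha * frame_b.astype(np.float32)
    return mixed.round().astype(np.uint8)

a = np.zeros((4, 4, 3), dtype=np.uint8)      # dummy earlier frame (black)
b = np.full((4, 4, 3), 255, dtype=np.uint8)  # dummy later frame (white)
print(blend_frames(a, b, delta=0.25, dt=1.0)[0, 0])  # -> [64 64 64]
```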
- FIG. 23A is a flowchart showing a signal monitoring process performed by the CPU 21a.
- FIG. 23B is a flowchart showing main processing performed by the CPU 21a.
- ⁇ Signal monitoring process> In the signal monitoring process shown in FIG. 23A, the CPU 21a determines whether or not a predetermined data acquisition timing has come (step S1). If it is determined that the data acquisition timing has arrived, the CPU 21a acquires various data from the operation information output device 3 (step S2). For example, the sensor output from the operation information output device 3 is acquired, or the operation information for the reproduction related switch 36 is acquired. After acquiring various data in step S2, the CPU 21a returns to step S1 and determines whether the data acquisition timing has arrived.
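- a minimal sketch of this polling loop (steps S1 and S2), assuming a simple polling interface; `acquire_sensor_output`, `acquire_switch_state`, and the 10 ms period are hypothetical stand-ins for the reads from the operation information output device 3:

```python
import time

POLL_PERIOD = 0.01  # assumed data acquisition timing: every 10 ms

def signal_monitoring(acquire_sensor_output, acquire_switch_state, queue, n_samples):
    """Steps S1-S2: wait for each data acquisition timing, then read the device."""
    for _ in range(n_samples):
        t0 = time.monotonic()
        queue.append({                      # step S2: acquire various data
            "sensor": acquire_sensor_output(),
            "switches": acquire_switch_state(),
            "timestamp": t0,
        })
        # step S1: sleep until the next data acquisition timing
        time.sleep(max(0.0, POLL_PERIOD - (time.monotonic() - t0)))

samples = []
signal_monitoring(lambda: 0, lambda: {"play": False}, samples, n_samples=3)
print(len(samples))  # -> 3
```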
- <Main processing> In the main processing shown in FIG. 23B, the CPU 21a determines whether or not a reproduction operation of the video content has been performed on the reproduction related switch 36 (step S11). This determination is performed based on the data acquired in step S2 of the signal monitoring process, specifically the operation information for the reproduction related switch 36.
- when it is determined that the reproduction operation has been performed, the CPU 21a starts reproduction of the introduction scene in the video content (step S12).
- the CPU 21a reads the imaging frame data of the introduction scene described in FIG. 9A and stores it as display frame data in the frame buffer 23.
- the display frame data stored in the frame buffer 23 is transmitted to the display device 4 each time the refresh timing comes.
- the display device 4 displays an image according to the display frame data.
- the CPU 21a reads the audio data of the introduction scene described in FIG. 9B, generates an acoustic signal based on the audio data, and outputs the acoustic signal to the sound output device 5.
- the sound output device 5 outputs a sound according to the sound signal.
- when it is determined in step S11 that the reproduction operation of the video content has not been performed, or when the reproduction of the introduction scene has been started in step S12, the CPU 21a determines whether or not the introduction scene is being reproduced (step S13). If it is determined that the introduction scene is being reproduced, the CPU 21a determines whether or not the first mode switch 34 has been operated (step S14). This determination is performed based on the data acquired in step S2 of the signal monitoring process, specifically the operation information for the first mode switch 34. When the first mode switch 34 has been operated, the CPU 21a performs the interlocked scene reproduction processing (step S15). The interlocked scene reproduction processing will be described later.
- when the first mode switch 34 has not been operated, the CPU 21a determines whether or not the reproduction of the introduction scene has ended (step S16). If it is determined that the reproduction of the introduction scene has ended, the CPU 21a performs the interlocked scene reproduction processing (step S15). If it is determined in step S13 that the introduction scene is not being reproduced, the CPU 21a determines whether or not the interlocked scene is being reproduced (step S17). If it is determined that the interlocked scene is being reproduced, the CPU 21a performs the interlocked scene reproduction processing (step S15). If it is determined in step S16 that the reproduction of the introduction scene has not ended, or in step S17 that the interlocked scene is not being reproduced, the CPU 21a returns to the process of step S11.
- the CPU 21a determines whether or not the second mode switch 35 has been operated (step S18). This determination is performed based on the data acquired in step S2 of the signal monitoring process, specifically, the operation information for the second mode switch 35.
- when the second mode switch 35 has been operated, the CPU 21a performs a finish scene reproduction process (step S19).
- the CPU 21a returns to the process of step S11.
- in the finish scene reproduction process, the CPU 21a reads the photographing frame data of the finish scene described in FIG. 9A and stores it as display frame data in the frame buffer 23.
- the display device 4 displays an image according to the display frame data.
- the CPU 21a reads the sound data of the finish scene described in FIG. 9B, generates an acoustic signal based on the sound data, and outputs the acoustic signal to the sound output device 5.
- the sound output device 5 outputs a sound according to the sound signal. This finish scene reproduction process is performed until the reproduction of a series of finish scenes is completed.
- <Interlocked scene reproduction processing> In the interlocked scene reproduction processing of step S15, the CPU 21a first determines whether the cycle of the interlocked scene is not yet set (step S21). In other words, the CPU 21a determines whether it is immediately after the transition from the introduction scene to the interlocked scene.
- when determining that the cycle is not set, the CPU 21a (photographed image information selection unit 21g) sets an initial cycle (step S22). In the present embodiment, the CPU 21a sets the first cycle as the initial cycle, and reproduction of the interlocked scene of the first cycle is started accordingly. Next, the CPU 21a determines whether the sensor output from the operation information output device 3 has the value "0" (step S23).
- the CPU 21a determines whether the wrist of the user is at either the close position B or the separated position T. If the CPU 21a determines that the sensor output is the value "0", the process proceeds to step S24. If the CPU 21a determines that the sensor output is not the value "0", the process proceeds to step S38.
- in step S24, the CPU 21a determines whether the sensor output has changed from a negative value ("−") to the value "0". In other words, the CPU 21a determines whether the wrist of the user, which has been moving in the direction approaching the user's chest, has stopped. If the sensor output has changed from a negative value to the value "0", the CPU 21a determines that the user's wrist is located at the close position (step S25). In other words, the CPU 21a (image selection unit 21c) determines that the display timing of the imaging frame data to which the marker data (B→T) is added has come.
- when it is determined in step S24 that the sensor output has not changed from a negative value to the value "0", the CPU 21a determines whether the sensor output has changed from a positive value ("+") to the value "0" (step S26). In other words, the CPU 21a determines whether the wrist of the user, which has been moving in the direction away from the user's chest, has stopped. When the sensor output has changed from a positive value to the value "0", the CPU 21a determines that the user's wrist is located at the separated position (step S27). In other words, the CPU 21a (image selection unit 21c) determines that the display timing of the imaging frame data to which the marker data (T→B) is added has come.
- when it is determined in step S26 that the sensor output has not changed from a positive value to the value "0" (that is, when the sensor output was already the value "0" in the previous process), the CPU 21a determines whether or not the sensor output of the value "0" has continued for a prescribed number of times (step S28). If it is determined that the sensor output of the value "0" has continued for the prescribed number of times, the CPU 21a determines that the wrist of the user is at rest (step S29). If it is determined that the sensor output of the value "0" has not continued for the prescribed number of times, the CPU 21a proceeds to the process of step S38.
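- the branching of steps S23 to S29 can be sketched as a small classification over consecutive sensor samples (an illustrative sketch only; `REST_COUNT`, the function name, and the event labels are assumptions, not names from the patent):

```python
REST_COUNT = 5  # assumed "prescribed number" of consecutive zero samples

def classify(prev, curr, zero_run):
    """Return (event, zero_run) for a new sensor sample curr, given the
    previous sample prev and the current run length of zero samples."""
    if curr != 0:
        return None, 0                       # still moving: go on to step S38
    if prev < 0:
        return "at_close_position_B", 1      # steps S24-S25: B -> T frames due
    if prev > 0:
        return "at_separated_position_T", 1  # steps S26-S27: T -> B frames due
    zero_run += 1                            # step S28: zero output continued
    if zero_run >= REST_COUNT:
        return "wrist_at_rest", zero_run     # step S29
    return None, zero_run

print(classify(prev=-3, curr=0, zero_run=0))  # -> ('at_close_position_B', 1)
```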
- in step S30, the CPU 21a determines whether it is necessary to change the cycle of the interlocked scene. As described in FIG. 9D, the CPU 21a acquires the reciprocating frequency of the wrist based on the sensor signal from the operation information output device 3 and selects the interlocked scene of the cycle suited to the acquired reciprocating frequency. If the cycle of the selected interlocked scene differs from the cycle of the current interlocked scene, the CPU 21a determines that a cycle change is necessary; if the two cycles are the same, it determines that a cycle change is not necessary. The CPU 21a proceeds to the process of step S34 when determining that the cycle change is necessary, and proceeds to the process of step S31 when determining that it is not.
- when determining that the cycle change is not necessary, the CPU 21a predicts the next sensor output timing (step S31). For example, as described in FIG. 13A and FIG. 19A, the CPU 21a acquires the prediction time based on the one-way movements of the wrist performed before the latest sensor output timing, and predicts the next sensor output timing by adding the prediction time to the latest sensor output timing.
- next, the CPU 21a (time interval setting unit 21d) sets the time interval between the shooting frame data from the latest sensor output timing to the next sensor output timing to a time interval based on the predicted time (step S32). For example, as described in FIG. 14, the CPU 21a selects the imaging frame data F(F1M0) at timing ts5 and the imaging frame data F(F1M1) at timing ts6′, and sets a time interval dt2 between the plurality of shooting frame data belonging between the two. Similarly, as described in FIG. 20, the CPU 21a selects the shooting frame data F(F1M1) at timing ts6 and the shooting frame data F(F1M2) at timing ts7′, and sets a time interval dt3 between the plurality of shooting frame data belonging between the two.
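- a sketch of this interval setting, under the illustrative assumption that the frames between the two selected frames are spaced equally over the predicted time (the names below are not from the patent):

```python
def set_frame_interval(predicted_time, n_frames_between):
    """Return the equal time interval (dt2 or dt3 above) when
    n_frames_between frames lie between the two selected frames,
    giving n_frames_between + 1 frame-to-frame gaps."""
    n_gaps = n_frames_between + 1
    return predicted_time / n_gaps

# e.g. 0.5 s predicted and 9 intermediate frames -> 0.05 s per gap
print(set_frame_interval(0.5, 9))
```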
- the CPU 21a sets an interval adjustment curve (step S33).
- the interval adjustment curve SP1 described in FIG. 14, the interval adjustment curve SP2 described in FIG. 16, or the interval adjustment curve SP3 described in FIG. 20 is set.
- by setting the interval adjustment curve, the time interval between the imaging frame data can be changed gradually from the timing at which the user's wrist is located at one of the close position B and the distant position T toward the timing at which it reaches the other.
- when determining in step S30 that the cycle change is necessary, the CPU 21a (photographed image information selection unit 21g) switches to the interlocked scene of the cycle suited to the acquired reciprocating frequency (step S34). The CPU 21a then predicts the next sensor output timing (step S35). For example, as described in FIG. 13A and FIG. 19A, the CPU 21a acquires the prediction time based on the one-way movements of the wrist performed before the latest sensor output timing, and predicts the next sensor output timing by adding the prediction time to the latest sensor output timing.
- next, the CPU 21a (time interval setting unit 21d) sets the time interval between the shooting frame data from the latest sensor output timing to the next sensor output timing to a time interval based on the predicted time (step S36).
- the processes of step S35 and step S36 are the same as the processes of step S31 and step S32 described above, and thus the description thereof is omitted.
- after executing the process of step S36, the CPU 21a sets a composite curve (step S37).
- the composite curve is a curve that defines the composite ratio between the shooting frame data belonging to the interlocked scene of the current cycle and the shooting frame data belonging to the interlocked scene of the next cycle.
- the CPU 21a combines the two sets of shooting frame data at the composite ratio defined by the composite curve in the combined frame generation processing (step S39) described later, and generates combined frame data. As a result, the image can be changed smoothly when the cycle of the interlocked scene is switched.
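- the exact shape of the composite curve is not reproduced here; as one hedged possibility, a smoothstep-style 0-to-1 cross-fade over the switching frames could serve (the curve shape and the frame count are assumptions):

```python
def composite_curve(n_frames):
    """Per-frame weight of the next-cycle frame; the current-cycle frame
    is weighted 1 - w. Uses smoothstep easing so the cross-fade starts
    and ends with zero slope."""
    weights = []
    for i in range(n_frames):
        t = i / (n_frames - 1) if n_frames > 1 else 1.0
        weights.append(t * t * (3.0 - 2.0 * t))
    return weights

print(composite_curve(5))  # -> [0.0, 0.15625, 0.5, 0.84375, 1.0]
```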
- after the process of step S33 or step S37 has been executed, or when it is determined in the process of step S23 that the sensor output is not the value "0", the CPU 21a determines whether the refresh timing of the display device 4 has come (step S38).
- if it is determined that the refresh timing has come, the CPU 21a (display image information generation unit 21f) generates composite frame data by performing alpha blending processing on the pair of shooting frame data, and writes the composite frame data as display frame data into the frame buffer 23 (step S39).
- the display frame data (combined frame data) written in the frame buffer 23 is output to the display device 4 at the refresh timing, whereby the display device 4 displays an image based on the display frame data. If it is determined in step S38 that the refresh timing has not come, or after the process of step S39 has been executed, the CPU 21a returns to the main processing.
- by performing the above processing, the video display system 1 according to the present embodiment can reproduce the interlocked scene in interlock with the reciprocating movement of the user's wrist.
- accordingly, the types of exercise that can be targeted are expanded compared with conventional systems directed to exercises that continue traveling in one direction.
- in the video display system 1, the photographing frame data is selected at each of a first timing, at which the user's wrist is located at one of the close position B and the distant position T, and a second timing, at which the prediction time has elapsed from the first timing and the wrist reaches the other of the two positions. The time interval between the plurality of photographing frame data belonging between the selected photographing frame data is set based on the prediction time, so the reproduction of the video content can follow the wrist even if the time of each reciprocating movement varies. Further, in the image display system 1 of the present embodiment, by using the interval adjustment curves SP1 to SP3, the time interval between the imaging frame data is changed gradually from the first timing toward the second timing. Therefore, even when the selected set of shooting frame data is updated, for example from the set of the shooting frame data F(F1M0) and the shooting frame data F(F1M1) shown in FIG. 14 to the next set, the image displayed on the display device 4 can be changed smoothly.
- furthermore, alpha blending processing is successively performed on pairs of consecutive shooting frame data to generate composite frame data, and the generated composite frame data is displayed on the display device 4 as display frame data. In this respect as well, the image displayed on the display device 4 can be changed smoothly.
- for example, the shooting frame data F(F1M0) is stored in the frame buffer 23 at timing Rf20, the shooting frame data F(F1M0+1) is stored in the frame buffer 23 at timing Rf21, and the shooting frame data F(F1M0+2) is stored in the frame buffer 23 at timing Rf22. Even if the time interval from the timing Rf20 to the timing Rf21 differs from the time interval from the timing Rf21 to the timing Rf22, the image can be appropriately displayed because the display device 4 supports a variable refresh rate.
- note that the interlocked scene to be reproduced first is not limited to the interlocked scene of the first cycle; the interlocked scene of the second cycle or of the third cycle may be used instead.
- in the above-described embodiment, the interlocking control is performed on a specific one-way motion of the reciprocating motion, specifically based on the movement time of the wrist from the close position B to the separation position T and the movement time of the wrist from the separation position T to the close position B. However, the interlocking control is not limited to these. For example, the control may be performed based on the movement time obtained by averaging the movement time of the wrist from the close position B to the separation position T and the movement time of the wrist from the separation position T to the close position B.
- the interlocking control may also be performed on the basis of the reciprocating motion as a whole, for example based on the movement time of the wrist from the proximity position B through the separation position T back to the proximity position B, or, similarly, from the separated position T through the close position B back to the separated position T.
- although the above-described embodiment illustrated the control device 2 provided with the CPU 21a, the RAM 21b, and the frame buffer 23, the present invention is not limited to this configuration.
- hardware logic may execute all or part of the processing executed by the CPU 21a.
- a plurality of CPUs 21a may be mounted, or a plurality of processors of different architectures such as the CPU 21a and a GPU (Graphics Processing Unit) may be mounted.
- the process performed using the frame buffer 23 may be performed using the RAM 21b, and conversely, the process performed using the RAM 21b may be performed using the frame buffer 23.
- although the control device 2, the operation information output device 3, the display device 4, and the sound output device 5 are separately provided in the above embodiment, the present invention is not limited to this configuration.
- the control device 2 may be incorporated in the operation information output device 3 and integrated, or the control device 2 may be integrated in the display device 4.
- the control device 2, the display device 4, and the sound output device 5 may be integrated.
- further, although a configuration in which processing is performed by one control device 2 was illustrated, the present invention is not limited to this configuration.
- the configuration may be such that processing is performed by a plurality of control devices 2 communicably connected via a communication network.
- although the motion information output device 3 is illustrated as being worn on the wrist in the above-described embodiment, it is not limited to being worn on the wrist.
- the motion information output device 3 may be attached to a portion capable of reciprocating movement, such as the user's ankle, head or waist.
- although the operation information output device 3 is provided with the control unit 31 in the above embodiment, the present invention is not limited to this configuration.
- the processing performed by the control unit 31 may be performed by hardware logic.
- the control unit 31 of the operation information output device 3 may be omitted, and the operation information output device 3 may be controlled by the control unit 21 of the control device 2.
- in the above embodiment, the sensor output of the gyro sensor 32 included in the operation information output device 3 is normalized by the control unit 31 of the operation information output device 3. However, the present invention is not limited to this configuration; the normalization processing of the sensor output may be performed by the control unit 21 of the control device 2.
- a video camera for capturing the motion of the user may be used, and the motion of the user may be extracted as motion information from the video obtained by the video camera.
- the present invention is not limited to the configurations described above. For example, the present invention may be applied to a sperm collection support system.
- the sperm collection support system is a system that supports the collection of sperm when male sperm is collected for medical research and therapeutic needs. This sperm collection support system is used, for example, to collect sperm for investigating the causes of a couple's infertility, for treating sexual dysfunction, and for obtaining sperm for artificial insemination.
- the sperm collection support system can also meet various social needs, such as the prevention of sexual crimes by relieving individual sexual desire, the prevention of prostitution, and the reduction of sexually transmitted infections.
- in this application, adult content obtained by photographing the sexual activity of a man and a woman is used as the video content, and the motion information output device 3 is worn on the male user's wrist. The adult content is then reproduced in interlock with the reciprocating movement of the wrist accompanying the masturbation.
- thereby, the sense of reality can be enhanced and the efficiency of sperm collection can be improved, as compared with a general configuration in which the adult content is simply reproduced.
- A video processing apparatus according to one aspect processes video based on motion information (sensor output) that changes periodically along with the reciprocation of a specific part (wrist) of a person, and handles a plurality of pieces of captured image information captured in time series (photographing frame data F(F1M1), F(F1M2), and so on). The apparatus includes a prediction unit (prediction unit 21e) that predicts the moving time related to a one-way movement of the reciprocation.
- the prediction unit predicts the moving time (time interval TS5′) related to one one-way movement of the reciprocation based on the operation information output from the operation information output unit for other one-way movements performed a plurality of times before that one-way movement (for example, the movement of the wrist performed from timing ts1 to ts2 and the movement performed from timing ts3 to ts4). According to the video processing device of this aspect, even if the moving time related to the one-way motion of the reciprocating motion varies, the display of the video can be made to follow the movement of the specific part.
- the time interval setting unit changes the time interval between the photographed image information stepwise (dt2a to dt2c, dt2d to dt2f) from the first timing toward the second timing. According to the video processing device of this aspect, an image can be displayed smoothly at the switching timing of control.
- the apparatus may further include a display image information generation unit (display image information generation unit 21f) that generates display image information (display frame data) to be displayed on the display device (display device 4) based on the captured image information. According to the video processing device of this aspect, an image suitable for the display device can be generated based on the photographed image information.
- the display image information generation unit generates the display image information by performing alpha blend processing on a pair of successive pieces of captured image information, the alpha blend processing using as a coefficient the division ratio of the time interval between the captured image information at the refresh timing of the display device. According to the video processing device of this aspect, an image can be displayed smoothly.
- the display device may be a stereoscopic image display device that displays the photographed image information as a stereoscopically viewable image. According to the video processing apparatus of this aspect, an image can be displayed as a stereoscopic image.
- the photographed image information may include a plurality of sets (the photographing frame data of the interlocked scenes of the first to fifth cycles) that differ in the number of pieces of photographed image information included between the first photographed image information and the second photographed image information, and the apparatus may include a photographed image information selection unit (photographed image information selection unit 21g) that selects any one of the plurality of sets of photographed image information based on the operation information. According to the image processing apparatus of this aspect, it is possible to cope with wide-ranging fluctuations in the speed of the reciprocation of the specific part.
Abstract
The invention achieves video processing suited to a reciprocating movement of a specific part by a user. A prediction unit (21e) predicts the amount of time required for a user's wrist to move between a close position (B) and a distant position (T) based on a sensor output emitted from a movement information output device (3). An image selection unit (21c) selects first photographing frame data at a first timing at which the user's wrist is at the close position (B) during the reciprocating movement, and selects second photographing frame data at a second timing at which the predicted movement time has elapsed from the first timing. A time interval setting unit (21d) sets the time interval between the sets of photographing frame data, that is, the time interval from the first photographing frame data to the second photographing frame data, to a time interval based on the predicted movement time.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/043803 WO2019111348A1 (fr) | 2017-12-06 | 2017-12-06 | Dispositif de traitement vidéo, procédé de traitement vidéo, programme informatique et système de traitement vidéo |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/043803 WO2019111348A1 (fr) | 2017-12-06 | 2017-12-06 | Dispositif de traitement vidéo, procédé de traitement vidéo, programme informatique et système de traitement vidéo |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019111348A1 true WO2019111348A1 (fr) | 2019-06-13 |
Family
ID=66751403
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/043803 Ceased WO2019111348A1 (fr) | 2017-12-06 | 2017-12-06 | Dispositif de traitement vidéo, procédé de traitement vidéo, programme informatique et système de traitement vidéo |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019111348A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06154354A (ja) * | 1992-11-16 | 1994-06-03 | Kanji Murakami | 立体映像を用いたトレーニング装置 |
| JPH0780096A (ja) * | 1993-09-14 | 1995-03-28 | Sony Corp | 画像表示装置付き疑似体験装置 |
| JP2007144107A (ja) * | 2005-10-25 | 2007-06-14 | Vr Sports:Kk | 運動補助システム |
| JP2011097988A (ja) * | 2009-11-04 | 2011-05-19 | Pioneer Electronic Corp | トレーニング支援装置 |
| JP2014531142A (ja) * | 2011-08-16 | 2014-11-20 | デスティニーソフトウェアプロダクションズ インク | スクリプトをベースとするビデオ・レンダリング |
| JP2016062184A (ja) * | 2014-09-16 | 2016-04-25 | 学校法人立命館 | 動画像生成システム、動画像生成装置、動画像生成方法、及びコンピュータプログラム |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN100435728C (zh) | 再现内容数据的方法和装置 | |
| JP2018198444A (ja) | 触感記録および再生 | |
| JPH1132284A (ja) | 画像撮影再生方式及び方法並びに画像再生プログラムを記録した記録媒体 | |
| JP2015130169A (ja) | 触覚コンテンツを伴う視点動画を記録及び再生するシステム並びに方法 | |
| JP6688378B1 (ja) | コンテンツ配信システム、配信装置、受信装置及びプログラム | |
| US20200388190A1 (en) | Information processing apparatus, information processing method, and program | |
| CN104168436B (zh) | 运动图像再生装置以及运动图像再生方法 | |
| JP5751942B2 (ja) | 再生装置及び再生方法 | |
| ES2192482A1 (es) | Aparato gimnastico y deportivo con pantalla de proyeccion estereoscopica. | |
| US12440729B2 (en) | Timeline and media controller for exercise machine | |
| TW201509487A (zh) | 運動影片訓練方法及系統 | |
| US20160179206A1 (en) | Wearable interactive display system | |
| JP3694663B2 (ja) | 移動疑似体験装置及びその方法 | |
| US9448762B2 (en) | Precognitive interactive music system | |
| WO2019111348A1 (fr) | Dispositif de traitement vidéo, procédé de traitement vidéo, programme informatique et système de traitement vidéo | |
| JP2008295037A (ja) | 動画再生装置及びその方法 | |
| JPWO2015033446A1 (ja) | ランニング支援システムおよびこれに用いるヘッドマウントディスプレイ装置 | |
| JP5493362B2 (ja) | 動画再生装置及びプログラム | |
| JP6665273B1 (ja) | コンテンツ配信システム、受信装置及びプログラム | |
| US12354726B1 (en) | Remote group workouts | |
| WO2018168175A1 (fr) | Appareil et procédé de traitement d'informations, et programme | |
| JP2022191592A (ja) | 精子採取装置およびデータ提供システム | |
| KR101766892B1 (ko) | 연동장치, 이를 포함하는 전동장치 시스템 및 그 제어방법 | |
| JP7049515B1 (ja) | 情報処理装置、プログラム及び描画方法 | |
| JP2013066020A (ja) | 動画再生装置及び動画再生方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17933919; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17933919; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |