
WO2012018001A1 - Video processing device, display device, and video processing method - Google Patents


Info

Publication number
WO2012018001A1
WO2012018001A1 (application PCT/JP2011/067647)
Authority
WO
WIPO (PCT)
Prior art keywords
video frame
eye
video
frame
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2011/067647
Other languages
English (en)
Japanese (ja)
Inventor
阿部 貴志
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Publication of WO2012018001A1


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity

Definitions

  • the present invention relates to a video processing device, a display device, and a video processing method capable of changing the parallax of the right-eye and left-eye video frames.
  • a display device that alternately displays a left-eye video frame and a right-eye video frame having parallax, and allows a left-eye video and a right-eye video to be incident on a user's left eye and right eye, respectively, so as to visually recognize a stereoscopic video.
  • the left-eye video frame and the right-eye video frame are generated by, for example, photographing the subject by two cameras arranged to have a predetermined parallax with respect to the subject.
  • a left-eye video frame and a right-eye video frame having a predetermined parallax may be generated by processing a two-dimensional video frame (see, for example, Patent Document 1).
  • the conventional technique has a problem that it is impossible to change the parallax of the left-eye video frame and the right-eye video frame.
  • An object of the present invention is to provide a video processing device, a display device, and a video processing method capable of changing the parallax between the left-eye video frame and the right-eye video frame, by providing a synthesizing unit that synthesizes a central video frame in which one of a video pair displaced between the right-eye video frame and the left-eye video frame is arranged at a substantially central position of the video pair.
  • the video processing device disclosed in the present application is a video processing device that processes a right-eye video frame and a left-eye video frame, and includes: an extraction unit that extracts, from the right-eye video frame and the left-eye video frame, a video pair whose display position is deviated between the two frames, together with the display position in each frame; a position calculation unit that calculates a substantially central position between the display positions extracted by the extraction unit; a synthesizing unit that synthesizes, from the right-eye video frame and the left-eye video frame, a central video frame in which one of the video pair is arranged at the substantially central position calculated by the position calculation unit; and a generation unit that generates a right-eye display video frame and a left-eye display video frame from the central video frame synthesized by the synthesizing unit, using a predetermined parallax.
  • a video pair, consisting of the portions of the left-eye video and the right-eye video that have parallax, is extracted together with its display position in each frame. The approximately central position between the extracted display positions is calculated. A central video frame, in which one of the extracted video pair is arranged at the calculated central position, is synthesized from the right-eye video frame and the left-eye video frame as a two-dimensional video frame having no parallax. A right-eye display video frame and a left-eye display video frame having a predetermined parallax are then generated based on the central video frame.
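The pipeline summarized above hinges on the midpoint computation: the displaced pair is extracted and one member is re-drawn at the midpoint of its two display positions. A minimal sketch in Python (the function name and the coordinate representation are illustrative, not taken from the patent):

```python
def center_position(p_right, p_left):
    """Midpoint between a subject's display positions in the two eye frames.

    p_right, p_left: (x, y) display positions of the same subject in the
    right-eye and left-eye video frames (illustrative representation).
    """
    return ((p_right[0] + p_left[0]) / 2.0,
            (p_right[1] + p_left[1]) / 2.0)

# A subject at x=100 in the right-eye frame and x=120 in the left-eye
# frame is placed at x=110 in the parallax-free central frame.
pc = center_position((100, 50), (120, 50))
```

Because the central frame places every subject at this midpoint, it is the two-dimensional, zero-parallax view from which new left/right views can be regenerated with any desired parallax.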
  • the extraction unit further extracts, from the left-eye video frame and the right-eye video frame, a background video corresponding to the display position of the video pair in each frame, and
  • the synthesizing unit synthesizes a central video frame in which the background video extracted by the extraction unit is arranged at the display position of the video pair in each frame.
  • in the region other than the region where one of the extracted video pair is arranged, the background video extracted from the right-eye video frame and the left-eye video frame is arranged.
  • the video processing device disclosed in the present application further includes a parallax calculation unit that calculates the parallax included in the right-eye video frame and the left-eye video frame based on the display positions extracted by the extraction unit, and a change unit that changes the parallax calculated by the parallax calculation unit; the generation unit uses the parallax changed by the change unit.
  • the parallax is calculated based on the amount of displacement of the image that is deviated between the right-eye video frame and the left-eye video frame.
  • the calculated parallax is changed by being increased or decreased.
  • a right-eye display video frame and a left-eye display video frame having a changed parallax are each generated from the central video frame.
  • the video processing device disclosed in the present application includes a parallax calculation unit that calculates a parallax included in the right-eye video frame or the left-eye video frame based on the central video frame, and a change unit that changes the parallax calculated by the parallax calculation unit.
  • the generation unit uses the parallax changed by the changing unit.
  • the central video frame having no parallax is compared with the right-eye video frame and the left-eye video frame to calculate the parallax of the right-eye video frame and the left-eye video frame.
  • the calculated parallax is changed by being increased or decreased.
  • a right-eye display video frame and a left-eye display video frame having a changed parallax are each generated from the central video frame.
  • the video processing device disclosed in the present application further includes a receiving unit that receives a parallax change instruction, and the changing unit changes the parallax according to the changing instruction received by the receiving unit.
  • the video processing apparatus accepts a change instruction for increasing or decreasing the parallax from the user.
  • the parallax of the right-eye video frame and the left-eye video frame is changed according to the received change instruction, and the right-eye display video frame and the left-eye display video frame having the changed parallax are generated.
  • a display device disclosed in the present application is a display device that processes and displays a right-eye video frame and a left-eye video frame, and includes: an extraction unit that extracts, from the right-eye video frame and the left-eye video frame, a video pair whose display position is deviated between the two frames, together with the display position in each frame; a position calculation unit that calculates a substantially central position between the extracted display positions; a synthesizing unit that synthesizes, from the right-eye video frame and the left-eye video frame, a central video frame in which one of the video pair is arranged at the calculated central position; a generation unit that generates a right-eye display video frame and a left-eye display video frame from the central video frame using a predetermined parallax; and a display unit that displays the generated right-eye display video frame and left-eye display video frame.
  • in the display device, a video pair consisting of the portions of the left-eye video and the right-eye video that have parallax is extracted together with its display position in each frame, and the approximately central position between the extracted display positions is calculated.
  • a central video frame in which one of the extracted video pair is arranged at the calculated central position is synthesized from the right-eye video frame and the left-eye video frame as a two-dimensional video frame having no parallax.
  • a right-eye display video frame and a left-eye display video frame having a predetermined parallax are generated based on the central video frame. The generated frames are displayed, and the user visually recognizes a stereoscopic video based on the changed parallax.
  • the video processing method disclosed in the present application is a video processing method for processing a right-eye video frame and a left-eye video frame, in which: a video pair whose display position is deviated between the right-eye video frame and the left-eye video frame is extracted together with the display position in each frame; the approximately central position between the extracted display positions is calculated; a central video frame in which one of the video pair is arranged at the central position is synthesized from the right-eye video frame and the left-eye video frame; and a right-eye display video frame and a left-eye display video frame are generated from the central video frame using a predetermined parallax.
  • in the video processing method, a video pair consisting of the portions of the left-eye video and the right-eye video that have parallax is extracted together with its display position in each frame, the approximately central position between the extracted display positions is calculated, a central video frame in which one of the extracted video pair is arranged at the calculated central position is synthesized from the right-eye video frame and the left-eye video frame as a two-dimensional video frame having no parallax, and a right-eye display video frame and a left-eye display video frame having a predetermined parallax are generated based on the central video frame.
  • by providing a synthesizing unit that synthesizes a central video frame in which one of a video pair displaced between a right-eye video frame and a left-eye video frame is arranged at a substantially central position of the video pair, the parallax of the right-eye video frame and the left-eye video frame can be changed.
  • FIG. 6 is a block diagram illustrating an example of internal hardware of a display device according to Embodiment 2.
  • FIG. 7 is a block diagram illustrating an example of internal hardware of a display device according to Embodiment 3.
  • FIG. 8 is a block diagram illustrating an example of internal hardware of a display device according to Embodiment 4.
  • the display device receives the right-eye video frame and the left-eye video frame, and alternately displays the right-eye display video frame and the left-eye display video frame.
  • a user of the display device wears an eyeglass device having a liquid crystal shutter or the like that opens and closes in synchronization with the right-eye display video frame and the left-eye display video frame that are alternately displayed on the display device. Then, the user visually recognizes the stereoscopic video by the left-eye video and the right-eye video that enter the left and right eyes via the eyeglass device.
  • a display device for example, there is a receiver that receives and displays a television broadcast wave including a three-dimensional video signal.
  • a display device that reproduces and displays a three-dimensional video signal recorded on a recording medium such as a DVD (Digital Versatile Disk) or a BD (Blu-ray Disk, registered trademark).
  • the display device may be, for example, a game device that generates 3D video data according to a game application and displays a right-eye display video frame and a left-eye display video frame.
  • the video processing apparatus is a video processing unit that is provided in a display device and generates a right-eye display video frame and a left-eye display video frame from a right-eye video frame and a left-eye video frame.
  • the video processing device receives a right-eye video frame and a left-eye video frame, generates a left-eye display video frame and a right-eye display video frame with changed parallax, and outputs them to an external display device or recording device.
  • In this embodiment, a display device that has a video processing unit, receives a video signal including a right-eye video frame and a left-eye video frame, and displays a right-eye display video frame and a left-eye display video frame is described as an example.
  • FIG. 1 is a block diagram illustrating an example of internal hardware of a display device.
  • the display device includes a video signal input terminal 1 to which a video signal is input, a change receiving unit 2 that receives a parallax change instruction, a video processing unit 3 that processes the input video signal, and a display unit 4 that displays the video signal processed by the video processing unit 3.
  • a video signal including a right-eye video frame and a left-eye video frame is input to the video signal input terminal 1.
  • the video processing unit 3 changes the parallax of the right-eye video frame and the left-eye video frame included in the video signal in response to the parallax change instruction received by the change receiving unit 2, generates a right-eye display video frame and a left-eye display video frame, and gives them to the display unit 4.
  • the display unit 4 is, for example, a liquid crystal display or the like, and alternately displays the right-eye video and the left-eye video by alternately displaying the given right-eye display video frame and the left-eye display video frame.
  • the right-eye video and the left-eye video displayed on the display unit 4 enter the user's corresponding eyes through an eyeglass device (not shown) worn by the user of the display device, and the user visually recognizes a stereoscopic video.
  • the change receiving unit 2 is, for example, a slider or a dial connected to a variable resistor, a potentiometer, or the like, and receives a change instruction indicating an increase or decrease in parallax when operated by the user.
  • the video processing unit 3 includes a storage unit 30 that stores the input video signal, frame memories 31R and 31L that respectively hold the right-eye video frame and the left-eye video frame extracted from the video signal, and an extraction unit 32 that extracts a video pair that is deviated between the right-eye video frame and the left-eye video frame.
  • the video processing unit 3 further includes a position calculation unit 33 that calculates the approximately central position between the display positions of the extracted video pair in each frame, a synthesis unit 34 that synthesizes a central video frame, described later, from the right-eye video frame and the left-eye video frame, and a parallax setting unit 36 that sets the parallax according to the change instruction given from the change receiving unit 2.
  • the video processing unit 3 also includes a generation unit 35 that generates a right-eye display video frame and a left-eye display video frame from the central video frame, the right-eye video frame, and the left-eye video frame based on the set parallax.
  • the storage unit 30 is a memory that accumulates video signals sequentially input from the video signal input terminal 1, separates them into right-eye video frames and left-eye video frames, and sequentially supplies them to the frame memories 31R and 31L.
  • the frame memories 31R and 31L are memories that each store one frame, and give the extraction unit 32 a one-frame right-eye video frame and the corresponding one-frame left-eye video frame.
  • the extraction unit 32 compares the one-frame right-eye video frame and left-eye video frame given from the frame memories 31R and 31L, extracts each video that is shifted between the two frames as a video pair together with its display position in each frame, and supplies them to the position calculation unit 33 and the synthesis unit 34.
  • the position calculation unit 33 calculates the center position of the display position in each frame of the video pair given from the extraction unit 32 and gives it to the synthesis unit 34.
  • the synthesizing unit 34 extracts and synthesizes the background video from the right-eye video frame and the left-eye video frame given by the frame memories 31R and 31L, and obtains a central video frame by arranging one of the video pair given from the extraction unit 32 at the central position given by the position calculation unit 33. The synthesizing unit 34 then provides the central video frame to the generation unit 35 together with the right-eye video frame and the left-eye video frame.
  • the generation unit 35 generates a right-eye display video frame and a left-eye display video frame from the central video frame, the right-eye video frame, and the left-eye video frame given by the synthesis unit 34, using the parallax set by the parallax setting unit 36, and gives them to the display unit 4.
  • FIG. 2 is a flowchart showing a procedure for changing the parallax of the right-eye video frame and the left-eye video frame.
  • FIG. 3 is an explanatory diagram for explaining a method of extracting a video pair that is displaced between the right-eye video frame and the left-eye video frame.
  • FIG. 4 is an explanatory diagram for explaining a method of generating a central video frame.
  • the extraction unit 32 extracts a video pair that is displaced between the right-eye video frame and the left-eye video frame (step S11). To extract the video pair, the extraction unit 32 first calculates the amount of deviation between the display position of each video in the right-eye video frame and the display position of the corresponding video in the left-eye video frame.
  • the extraction unit 32 may extract, from the right-eye video frame and the left-eye video frame, videos whose calculated deviation amount exceeds a set threshold as a video pair.
  • a conventional motion detection device may be used for the extraction of the video pair by the extraction unit 32.
  • the motion detection device of the prior art compares a plurality of video frames included in a moving image in time series order, and detects the motion of the subject by detecting the video of the subject that is deviated between the frames.
  • the motion detection device may be used to compare the right-eye video frame and the left-eye video frame, detect a video deviated between the frames, and extract a video pair.
  • the images MR and ML of the same subject are shown at the display position PR in the frame of the right-eye video frame FR and at the display position PL in the frame of the left-eye video frame FL, respectively.
  • the distance between the display positions PR and PL is calculated as the displacement amount ΔX, and when the displacement amount ΔX exceeds the threshold, the images MR and ML are extracted as a video pair.
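The thresholded extraction of step S11 can be sketched as follows, assuming a prior matching step has already paired up each subject's display positions PR and PL across the two frames (the function name and the input format are hypothetical, not from the patent):

```python
def extract_video_pairs(positions_r, positions_l, threshold):
    """Keep the position pairs whose horizontal displacement dX exceeds
    the threshold, as candidate video pairs.

    positions_r, positions_l: lists of (x, y) display positions of
    matching subjects in the right-eye and left-eye frames.
    Returns a list of (PR, PL, dX) tuples.
    """
    pairs = []
    for pr, pl in zip(positions_r, positions_l):
        dx = abs(pr[0] - pl[0])  # displacement amount between PR and PL
        if dx > threshold:
            pairs.append((pr, pl, dx))
    return pairs

# Only the first subject (shifted 20 px) exceeds a 5-px threshold;
# the second (shifted 1 px) is treated as stationary background.
pairs = extract_video_pairs([(100, 50), (10, 10)], [(120, 50), (11, 10)], 5)
```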
  • the position calculation unit 33 calculates a central position that is substantially the center of the display position in each frame of the video pair extracted by the extraction unit 32 (step S12).
  • the synthesizing unit 34 extracts, as a background video, the video excluding the video pair extracted by the extraction unit 32 from the right-eye video frame and the left-eye video frame supplied from the frame memories 31R and 31L (step S13), and synthesizes the background videos (step S14).
  • the synthesizing unit 34 obtains the central video frame by arranging one of the video pairs extracted by the extracting unit 32 with respect to the synthesized background video at the central position calculated by the position calculating unit 33 (step S15).
  • one of the images MR and ML, the video pair extracted from the right-eye video frame FR and the left-eye video frame FL shown in FIG. 3, is arranged as the image MC at the central position PC between the display positions PR and PL.
  • the partial areas AR and AL of the central video frame, surrounded by broken lines in FIG. 4, correspond to the areas where the images MR and ML are displayed in the right-eye video frame FR and the left-eye video frame FL, respectively.
  • by the synthesis of the background videos, the background video corresponding to the partial area AR is extracted from the left-eye video frame FL and arranged in the partial area AR.
  • similarly, the background video corresponding to the partial area AL is extracted from the right-eye video frame FR and arranged in the partial area AL.
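The background synthesis for the partial areas AR and AL can be illustrated on a single pixel row: the hole left by MR is filled from the left-eye row, and the hole left by ML from the right-eye row. A sketch under those assumptions (row representation, spans, and names are all illustrative):

```python
def fill_occlusions(row_center, row_r, row_l, span_ar, span_al):
    """Fill the two hole regions of a central-frame pixel row.

    span_ar: (start, stop) pixel range where MR sat in the right-eye
    frame; its background is visible in the left-eye frame, so pixels
    come from row_l. span_al is the symmetric case, filled from row_r.
    """
    out = list(row_center)
    for x in range(*span_ar):
        out[x] = row_l[x]   # area AR: background taken from left-eye frame FL
    for x in range(*span_al):
        out[x] = row_r[x]   # area AL: background taken from right-eye frame FR
    return out
```

Each eye sees behind the subject exactly where the other eye's view is occluded, which is why the opposite frame always supplies the missing background.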
  • the generation unit 35 generates a right-eye display video frame and a left-eye display video frame from the central video frame, the right-eye video frame, and the left-eye video frame given from the synthesis unit 34, using the parallax set by the parallax setting unit 36 (step S16), completing the change of the parallax of the right-eye video frame and the left-eye video frame.
  • for this generation, the positions at which the video of the subject in the central video frame is to be displayed in the right-eye display video frame and in the left-eye display video frame are calculated.
  • the right-eye display frame and the left-eye display frame in which the video of the subject is arranged at each of the calculated positions are generated by synthesizing the right-eye video frame and the left-eye video frame.
  • the background video extracted from the right-eye video frame and the left-eye video frame is arranged in a region excluding the subject video arranged in the right-eye display video frame and the left-eye display video frame.
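The generation step amounts to re-drawing the subject shifted by half the set parallax in opposite directions from its central position. A one-row sketch, assuming the central row already holds background in the subject region and using an illustrative sign convention (all names are hypothetical):

```python
def generate_display_rows(row_center, subject_span, subject_pixels, parallax):
    """Generate right-eye and left-eye display rows from a central row.

    row_center: pixel row of the central frame with background in place.
    subject_span: (start, stop) of the subject's central position.
    The subject is re-drawn shifted by -parallax/2 in the right-eye view
    and +parallax/2 in the left-eye view (sign convention illustrative).
    """
    half = parallax // 2

    def place(shift):
        out = list(row_center)
        start, _stop = subject_span
        for i, px in enumerate(subject_pixels):
            x = start + i + shift
            if 0 <= x < len(out):
                out[x] = px
        return out

    row_right = place(-half)  # subject shifted one way in the right-eye view
    row_left = place(+half)   # and the other way in the left-eye view
    return row_right, row_left
```

Increasing `parallax` moves the two copies of the subject further apart, which is exactly the user-adjustable depth effect the embodiments describe.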
  • FIG. 5 is a schematic diagram showing a data layout example.
  • FIG. 5 shows a video signal given to the video processing unit 3, a central video frame output from the synthesis unit 34, and a display video frame including a right-eye display video frame and a left-eye display video frame output from the generation unit 35. They are shown side by side from the top to the bottom.
  • a video signal in which left-eye video frames L1, L2, L3, ... and right-eye video frames R1, R2, R3, ... are arranged alternately is input to the video processing unit 3.
  • the synthesizing unit 34 combines the left-eye video frames L1, L2, L3, ... with the right-eye video frames R1, R2, R3, ..., and sequentially outputs the central video frames C1, C2, C3, ....
  • the generation unit 35 sequentially outputs each set of left-eye display video frames L1′, L2′, L3′, ... and right-eye display video frames R1′, R2′, R3′, ... generated from the central video frames C1, C2, C3, ....
  • the display unit 4 alternately displays the left-eye display video frames L1′, L2′, L3′, ... and the right-eye display video frames R1′, R2′, R3′, ....
  • the present invention is not limited to this.
  • the right-eye video frame and the left-eye video frame may be divided into a plurality of blocks having a predetermined number of pixels, and the partial videos displayed in each block may be compared between the frames.
  • a block pair displaced between frames is extracted as a video pair.
  • each pixel included in the right-eye video frame and the left-eye video frame may be compared between the frames, and a pixel pair displaced between the frames may be extracted as a video pair.
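The per-pixel variant can be sketched as a simple frame difference: any pixel whose values differ between the two frames by more than a threshold is collected as a candidate video-pair pixel. This is illustrative only; a practical implementation would additionally match each displaced pixel to its counterpart in the other frame:

```python
def pixel_pairs(frame_r, frame_l, threshold):
    """Collect coordinates where the two frames differ by more than a
    threshold. Frames are 2-D lists of grayscale values (illustrative).
    Returns a list of (x, y) coordinates.
    """
    coords = []
    for y, (row_r, row_l) in enumerate(zip(frame_r, frame_l)):
        for x, (a, b) in enumerate(zip(row_r, row_l)):
            if abs(a - b) > threshold:
                coords.append((x, y))
    return coords
```

The block-based variant described above works the same way, except the comparison unit is a block of a predetermined number of pixels rather than a single pixel.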
  • the present invention is not limited to this; the display position of the video pair in each frame and only one of the video pair may be extracted. In this case, the extracted one of the video pair is arranged in the central video frame.
  • although the right-eye display video frame and the left-eye display video frame are generated from the right-eye video frame, the left-eye video frame, and the central video frame, the present invention is not limited thereto.
  • a right-eye display video frame and a left-eye display video frame having a predetermined parallax may be generated from a central video frame that is a two-dimensional video by using a conventional technique for generating a right-eye display video frame and a left-eye display video frame.
  • a video generated by interpolation processing based on a background video extracted from the central video frame may be arranged in a region excluding the subject video arranged in each of the right-eye display video frame and the left-eye display video frame.
  • a computer may execute a program read into a RAM (Random-Access Memory) from a storage medium such as a memory, an optical disk, or a magnetic disk, and realize the functions of each hardware unit of the video processing unit 3 according to the program.
  • the parallax included in the input right-eye video frame and left-eye video frame can be changed.
  • FIG. 6 is a block diagram illustrating an example of internal hardware of the display device according to the second embodiment.
  • the display device includes a video processing unit 5. Since the internal hardware parts of the display device excluding the video processing unit 5 are the same as those of the first embodiment, the same reference numerals are used and detailed description thereof is omitted.
  • the video processing unit 5 combines the right-eye video frame and the left-eye video frame included in the video signal input to the video signal input terminal 1 to obtain a central video frame.
  • the video processing unit 5 generates a right-eye display video frame and a left-eye display video frame from the central video frame, using the parallax changed according to the user's parallax change instruction received by the change receiving unit 2, and gives them to the display unit 4.
  • the video processing unit 5 includes a parallax calculation unit 57 that calculates the parallax included in the right-eye video frame and the left-eye video frame, and a change unit 58 that changes the parallax calculated by the parallax calculation unit 57 according to the change instruction received by the change receiving unit 2. Since the other internal hardware units of the video processing unit 5 are the same as those of the video processing unit 3 of the first embodiment, only the reference numerals differ.
  • the video processing unit 5 includes a storage unit 50, frame memories 51R and 51L, an extraction unit 52, a position calculation unit 53, a synthesis unit 54, and a generation unit 55.
  • the parallax calculation unit 57 calculates the parallax that the right-eye video frame and the left-eye video frame have based on the deviation amount of the video pair extracted by the extraction unit 52, and provides the parallax to the changing unit 58.
  • the changing unit 58 changes the parallax given from the parallax calculating unit 57 in accordance with the change instruction received by the change receiving unit 2, and gives it to the generating unit 55.
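The change unit's role can be sketched as a clamped adjustment of the calculated parallax by the number of increase/decrease steps received from the user. The step size and the clamping at zero are assumptions for the sketch, not specified in the patent:

```python
def change_parallax(calculated, steps, step_size=1):
    """Adjust the calculated parallax by a signed number of user steps.

    calculated: parallax (in pixels) from the parallax calculation unit.
    steps: positive to increase depth, negative to decrease it.
    The result is clamped at zero so views never cross over.
    """
    return max(0, calculated + steps * step_size)
```

With `steps == 0` the original parallax passes through unchanged, matching the case where the user issues no change instruction.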
  • thus, a change instruction for increasing or decreasing the parallax included in the right-eye video frame and the left-eye video frame of the video signal input to the video signal input terminal 1 can be received, and the parallax can be changed accordingly.
  • the second embodiment is as described above, and the others are the same as in the first embodiment. Therefore, the corresponding parts are denoted by the same reference numerals and processing names, and detailed description thereof is omitted.
  • Embodiment 3 differs from Embodiment 2 in how the parallax of the right-eye video frame and the left-eye video frame is calculated: whereas Embodiment 2 calculates it from the deviation amount of the extracted video pair, Embodiment 3 calculates it by comparing the central video frame with the right-eye video frame.
  • FIG. 7 is a block diagram illustrating an example of internal hardware of the display device according to the third embodiment.
  • the display device includes a video processing unit 6. Since the internal hardware parts of the display device excluding the video processing unit 6 are the same as those of the first embodiment, the same reference numerals are used and detailed description thereof is omitted.
  • the video processing unit 6 combines the right-eye video frame and the left-eye video frame included in the video signal input to the video signal input terminal 1 to obtain a central video frame.
  • the video processing unit 6 generates a right-eye display video frame and a left-eye display video frame from the central video frame, using the parallax changed according to the change instruction received from the user by the change receiving unit 2, and gives them to the display unit 4.
  • the video processing unit 6 includes a parallax calculation unit 67 that calculates the parallax included in the right-eye video frame and the left-eye video frame, and a change unit 68 that changes the parallax calculated by the parallax calculation unit 67 according to the change instruction received by the change receiving unit 2. Since the other internal hardware units of the video processing unit 6 are the same as those of the video processing unit 3 of the first embodiment, only the reference numerals differ.
  • the video processing unit 6 includes a storage unit 60, frame memories 61R and 61L, an extraction unit 62, a position calculation unit 63, a synthesis unit 64, and a generation unit 65.
  • the parallax calculation unit 67 detects the deviation amount of the subject image between the central video frame provided from the synthesis unit 64 and the right-eye video frame provided from the frame memory 61R, and calculates the parallax of the right-eye video frame and the left-eye video frame based on the detected deviation amount. For example, a value obtained by doubling the deviation amount of the subject image between the central video frame and the right-eye video frame may be used as the parallax. Although the third embodiment detects the deviation amount between the central video frame and the right-eye video frame, the present invention is not limited to this; the parallax may instead be calculated by detecting the deviation amount of the subject video between the central video frame and the left-eye video frame.
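The doubling rule follows directly from the geometry: the central frame sits midway between the two eye frames, so the subject shift measured against one eye frame is half the full left/right parallax. As a one-line sketch (name illustrative):

```python
def parallax_from_center_shift(shift):
    """Embodiment 3's rule: the full left/right parallax is roughly
    twice the subject shift measured between the central video frame
    and either one of the eye frames. abs() makes the result independent
    of which eye frame was compared.
    """
    return 2 * abs(shift)
```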
  • the third embodiment is as described above, and the other parts are the same as in the first embodiment. Therefore, the corresponding parts are denoted by the same reference numerals and processing names, and detailed description thereof is omitted.
  • Embodiment 4 relates to a mode of generating a right-eye display video frame and a left-eye display video frame by applying frame frequency conversion including motion detection (hereinafter referred to as FRC (frame rate control)).
  • FIG. 8 is a block diagram illustrating an example of internal hardware of the display device according to the fourth embodiment.
  • the display device includes a video processing unit 7. Since the internal hardware parts of the display device excluding the video processing unit 7 are the same as those of the first embodiment, the same reference numerals are used and detailed description thereof is omitted.
  • the video processing unit 7 combines the right-eye video frame and the left-eye video frame included in the video signal input to the video signal input terminal 1 by FRC to obtain a central video frame.
  • the video processing unit 7 uses the acquired central video frame to generate a right-eye display video frame and a left-eye display video frame by FRC.
  • the extraction unit 72 extracts a video pair that is displaced between the right-eye video frame and the left-eye video frame by motion detection.
  • the extraction unit 72 compares the right-eye video frame and the left-eye video frame, detects a video deviated between the frames, and extracts a video pair.
  • in ordinary motion detection, a moving video portion is detected regardless of whether it moves up, down, left, or right. Therefore, when other video portions that are not detection targets move vertically, an unintended video may be erroneously detected.
  • in contrast, the extraction unit 72, like the extraction unit 32 of the first embodiment, detects only video portions that move in the horizontal direction with little or no vertical movement. Therefore, the extraction unit 72 can extract the video pair while keeping the possibility of erroneous detection lower than in ordinary motion detection.
  • the position calculation unit 73 calculates a central position that is substantially the center of the display position in each frame of the video pair extracted by the extraction unit 72.
  • the position calculation unit 73 calculates the center position as follows, for example. Since the images MR and ML in FIG. 3 are always displaced only in the horizontal direction, the position calculation unit 73 sets a horizontal detection line that includes the images MR and ML, and detects the edges of the images MR and ML.
  • the position calculation unit 73 calculates the center position from the left and right edge positions of each of the images MR and ML.
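A minimal sketch of this edge-to-center computation on one detection line (the list-of-pixels representation and the function names are assumptions for illustration):

```python
def image_center_on_line(line, threshold=0):
    """Horizontal center of the image lying on one detection line.

    Scans for the leftmost and rightmost pixels above `threshold`
    (the left and right edges) and averages their positions.
    """
    left = next(x for x, v in enumerate(line) if v > threshold)
    right = next(x for x in range(len(line) - 1, -1, -1) if line[x] > threshold)
    return (left + right) / 2

def central_position(line_r, line_l):
    # Midpoint between the center of image MR (right-eye frame) and the
    # center of image ML (left-eye frame) on the same detection line.
    return (image_center_on_line(line_r) + image_center_on_line(line_l)) / 2
```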
  • the synthesizing unit 74 extracts a video excluding the video pair extracted by the extracting unit 72 from each of the right-eye video frame and the left-eye video frame given from the frame memories 71R and 71L as a background video, and synthesizes the background video.
  • any video obtained by removing the video pair from the right-eye video frame and the left-eye video frame can serve as the background video for the central video frame, the right-eye display video frame, and the left-eye display video frame; however, portions of the background may be missing where the video pair occluded it. For this reason, the background video for each video frame cannot always be synthesized simply, so the synthesis unit 74 estimates the portion that has no background video from the video determined to be the background video, and synthesizes the background video. Alternatively, when the area surrounding the missing background portion is a blank image, a video in which a simple shape pattern is repeated, or the like, the synthesis unit 74 may complement the missing portion with the surrounding video and synthesize the background video.
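The complementing step for the simplest case can be sketched as follows (a hypothetical 1-D strip model with a nearest-neighbour fill rule; the embodiment's actual method is not specified at this level of detail, and real hardware would inpaint 2-D regions):

```python
def fill_missing(background, missing_value=None):
    """Fill holes in a 1-D background strip with the nearest known pixel.

    Pixels occluded by the video pair in both source frames are marked
    `missing_value`; each hole is complemented from its left neighbour,
    falling back to the first known pixel at the strip's start.
    """
    filled = list(background)
    for i, v in enumerate(filled):
        if v is missing_value:
            filled[i] = filled[i - 1] if i > 0 else next(
                p for p in filled if p is not missing_value
            )
    return filled
```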
  • FRC is a technology that extracts a video portion that has changed between two videos at close points in time and interpolates an intermediate video between the two extracted video portions, thereby generating a video intermediate between the two videos.
  • the synthesizing unit 74 estimates and generates an intermediate video of the video pair by FRC interpolation, treating the right-eye video frame and the left-eye video frame as two images at close points in time.
  • the video MC generated by the synthesizing unit 74 is an intermediate video between the videos MR and ML in display position and form.
  • the synthesizing unit 74 obtains the central video frame by arranging the generated video MC on the synthesized background video at the central position calculated by the position calculating unit 73.
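Putting the preceding steps together, the central-frame synthesis can be sketched in 1-D (the pixel averaging merely stands in for FRC interpolation of MC, and all names and the flat-frame model are assumptions):

```python
def make_center_frame(background, subject_r, pos_r, subject_l, pos_l):
    """Sketch of synthesising the central video frame.

    The intermediate image MC is approximated by averaging the aligned
    subject images MR and ML, and is placed at the midpoint of their
    display positions over the already-synthesised background.
    """
    mc = [(a + b) / 2 for a, b in zip(subject_r, subject_l)]
    center_pos = (pos_r + pos_l) // 2
    frame = list(background)
    frame[center_pos:center_pos + len(mc)] = mc
    return frame
```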
  • the generation unit 75 generates the right-eye display video frame and the left-eye display video frame by FRC, using the parallax set by the parallax setting unit 76 together with the central video frame given from the synthesis unit 74, the right-eye video frame, and the left-eye video frame.
  • the parallax set by the parallax setting unit 76 may be larger or smaller than the parallax corresponding to the right-eye video frame and the left-eye video frame received by the video processing unit 7.
  • when the parallax set by the parallax setting unit 76 is smaller than the parallax corresponding to the right-eye video frame and the left-eye video frame, the generation unit 75 generates the right-eye display video frame and the left-eye display video frame by FRC interpolation. When the set parallax is larger than the parallax corresponding to the right-eye video frame and the left-eye video frame, the generation unit 75 generates the right-eye display video frame and the left-eye display video frame by FRC extrapolation. Extrapolation here means estimating and generating a video corresponding to a time before or after the times corresponding to the two videos handled by the FRC.
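The interpolation/extrapolation distinction can be sketched as a single linear formula (the 1-D position model and names are assumptions): scaling a subject's offset from its central-frame position by the ratio of the set parallax to the source parallax interpolates when the ratio is below 1 and extrapolates when it is above 1.

```python
def display_position(center_x, eye_x, set_parallax, source_parallax):
    """Position of a subject in the display frame for one eye.

    `eye_x` is the subject position in the source right- or left-eye
    frame, `center_x` its position in the central frame. A ratio < 1
    corresponds to FRC interpolation (reduced parallax); a ratio > 1
    corresponds to extrapolation (enlarged parallax).
    """
    ratio = set_parallax / source_parallax
    return center_x + ratio * (eye_x - center_x)
```

For example, with a subject at 100 in the central frame and 110 in the right-eye frame (source parallax 20), halving the parallax moves the display position inward to 105, while doubling it pushes the position outward to 120.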
  • the accuracy of the display position and form of a video interpolated or extrapolated by FRC varies with the degree of positional or morphological difference between the two source videos. That is, the smaller the positional or morphological difference between the two source videos, the higher the accuracy of the display position and form of the video newly generated by FRC; the larger the difference, the lower the accuracy.
  • the accuracy of the video newly generated by FRC also covers color, luminance, shading, size, brightness, the position and direction of the light source illuminating the subject, and the position and direction of the viewpoint.
  • similarly, the deviation amount includes differences in display position, form, and the like between two videos.
  • FIG. 9 is an explanatory diagram for explaining a method of directly generating a right-eye display video frame and a left-eye display video frame from a right-eye video frame and a left-eye video frame.
  • the upper part of FIG. 9 shows a left-eye video frame and a right-eye video frame.
  • the lower part of FIG. 9 shows the left-eye display video frame. A left-eye display video frame is newly generated by FRC interpolation or extrapolation based on the left-eye video frame, the right-eye video frame, the deviation amount between each subject in the left-eye video frame and the right-eye video frame, and the set parallax.
  • a right-eye display video frame can be newly generated in the same way.
  • FIG. 10 is an explanatory diagram for explaining a method of generating a right-eye display video frame and a left-eye display video frame from a right-eye video frame and a left-eye video frame via a center video frame.
  • the upper part of FIG. 10 shows a left-eye video frame and a right-eye video frame.
  • the middle part of FIG. 10 shows the left-eye video frame and the center video frame.
  • the lower part of FIG. 10 shows the left-eye display video frame.
  • the synthesizer 74 generates a central video frame by FRC interpolation from the right-eye video frame and the left-eye video frame. Therefore, the display position, form, and the like of the subject in the central video frame are substantially between those of the subjects in the right-eye video frame and the left-eye video frame.
  • the generation unit 75 newly generates a left-eye display video frame by FRC interpolation or extrapolation based on the left-eye video frame, the central video frame, the deviation amount between each subject in the left-eye video frame and the central video frame, and the set parallax.
  • the generation unit 75 newly generates a right-eye display video frame by FRC interpolation or extrapolation based on the right-eye video frame, the central video frame, the deviation amount between each subject in the right-eye video frame and the central video frame, and the set parallax.
  • the deviation amounts in the two video frames on which generation of the left-eye display video frame is based differ between the case of FIG. 9 and the case of FIG. 10; the deviation amount in FIG. 10 is approximately half of that in FIG. 9. That is, the deviation amount between the right-eye video frame or the left-eye video frame and the central video frame is approximately half of the deviation amount between the right-eye video frame and the left-eye video frame. Since the accuracy of a video interpolated or extrapolated by FRC is higher the smaller the deviation amount of the source videos or video frames, the display device can increase the accuracy of the generated video by generating the right-eye display video frame and the left-eye display video frame via the central video frame.
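The halving relied on here follows directly from the central frame's midpoint placement; a quick check under a hypothetical 1-D position model:

```python
def deviations(x_left, x_right):
    """Eye-to-eye and eye-to-center deviation amounts for one subject.

    The central frame places the subject midway between its left-eye
    and right-eye positions, so the eye-to-center deviation is exactly
    half the eye-to-eye deviation.
    """
    x_center = (x_left + x_right) / 2
    return abs(x_right - x_left), abs(x_right - x_center)
```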
  • the generation unit 75 generates, by FRC, the video of each subject to be displayed in each of the right-eye display video frame and the left-eye display video frame, using the parallax set by the parallax setting unit 76, the video of the subject in the central video frame, and the video of the subject in the right-eye video frame or the left-eye video frame.
  • the generation unit 75 calculates the position of each subject to be displayed in each of the right-eye display video frame and the left-eye display video frame, using the parallax set by the parallax setting unit 76, the position of the subject in the central video frame, and the position of the subject in the right-eye video frame or the left-eye video frame.
  • the generation unit 75 generates the right-eye display video frame and the left-eye display video frame in which the video of each subject is arranged at each calculated position with respect to the background video synthesized by the synthesis unit 74.
  • the background video extracted from the right-eye video frame and the left-eye video frame is arranged in a region excluding the subject video arranged in the right-eye display video frame and the left-eye display video frame.
  • the generation unit 75 may extract the background video of the central video frame synthesized by the synthesis unit 74, and set the extracted background video as the background video of the right-eye display video frame and the left-eye display video frame.
  • the fourth embodiment is as described above, and the other parts are the same as those of the first embodiment. Therefore, the corresponding parts are denoted by the same reference numerals and processing names, and detailed description thereof is omitted.
  • 1 Video signal input terminal; 2 Change receiving unit; 3, 5, 6, 7 Video processing unit; 30, 50, 60, 70 Storage unit; 31R, 31L, 51R, 51L, 61R, 61L, 71R, 71L Frame memory; 32, 52, 62, 72 Extraction unit; 33, 53, 63, 73 Position calculation unit; 34, 54, 64, 74 Synthesis unit; 35, 55, 65, 75 Generation unit; 36, 76 Parallax setting unit; 57, 67 Parallax calculation unit; 58, 68 Change unit; 4 Display unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

Provided are a video processing device, a display device, and a video processing method capable of changing the parallax of right-eye and left-eye video frames. A video processing unit (3) provided in a display device of the present invention includes: an extraction unit (32) for extracting, in each frame, the display position of a video pair that is displaced between the right-eye video frame and the left-eye video frame; a position calculation unit (33) for calculating a substantially central position between the display positions in each of the extracted frames; a synthesis unit (34) for synthesizing, from the right-eye video frame and the left-eye video frame, a central video frame in which one of the two videos of the pair is arranged at the substantially central position; and a generation unit (35) which, on the basis of the central video frame obtained by the synthesis, generates a right-eye display video frame and a left-eye display video frame using a predetermined parallax.
PCT/JP2011/067647 2010-08-02 2011-08-02 Video processing device, display device, and video processing method Ceased WO2012018001A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010-173950 2010-08-02
JP2010173950 2010-08-02
JP2011167590A JP2012054912A (ja) Video processing device, display device, and video processing method
JP2011-167590 2011-07-29

Publications (1)

Publication Number Publication Date
WO2012018001A1 true WO2012018001A1 (fr) Video processing device, display device, and video processing method

Family

ID=45559498

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/067647 Ceased WO2012018001A1 (fr) Video processing device, display device, and video processing method

Country Status (2)

Country Link
JP (1) JP2012054912A (fr)
WO (1) WO2012018001A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101340308B1 (ko) 2012-06-08 2013-12-11 주식회사 마이씨에프 Image processing device and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0927969A * 1995-05-08 1997-01-28 Matsushita Electric Ind Co Ltd Method for generating an intermediate image of a plurality of images, and parallax estimation method and apparatus
JP2000209614A * 1999-01-14 2000-07-28 Sony Corp Stereoscopic video system
JP2003209858A * 2002-01-17 2003-07-25 Canon Inc Stereoscopic image generation method and recording medium
WO2009139740A1 * 2008-05-12 2009-11-19 Thomson Licensing System and method for measuring potential eyestrain of stereoscopic motion pictures

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5556394B2 (ja) * 2010-06-07 2014-07-23 Sony Corporation Stereoscopic image display system, parallax conversion device, parallax conversion method, and program

Also Published As

Publication number Publication date
JP2012054912A (ja) 2012-03-15

Similar Documents

Publication Publication Date Title
EP2549762B1 (fr) Appareil d'adaptation de position d'image en stéréovision, procédé d'adaptation de position d'image en stéréovision et programme correspondant
JP5917017B2 (ja) Image processing apparatus, control method therefor, and program
US9215452B2 (en) Stereoscopic video display apparatus and stereoscopic video display method
US9007442B2 (en) Stereo image display system, stereo imaging apparatus and stereo display apparatus
US8704881B2 (en) Stereoscopic image display apparatus
US20140111627A1 (en) Multi-viewpoint image generation device and multi-viewpoint image generation method
US20120163701A1 (en) Image processing device, image processing method, and program
CN102326397B Image processing device and image processing method
WO2010084724A1 Image processing device, program, image processing method, recording method, and recording medium
US20090244258A1 (en) Stereoscopic display apparatus, stereoscopic display method, and program
CN102379127A Video processing device, video processing method, and computer program
US9167223B2 (en) Stereoscopic video processing device and method, and program
WO2013108285A1 Image recording device and method, and three-dimensional image reproduction device and method
JP5521608B2 (ja) Image processing apparatus, image processing method, and program
US20110242294A1 (en) Stereoscopic image display device and method of deriving motion vector
JP5127973B1 (ja) Video processing device, video processing method, and video display device
CN102547330A Image processing apparatus, image processing method, and program
JP5915158B2 (ja) Time code display device and time code display method
JP4892105B1 (ja) Video processing device, video processing method, and video display device
JP2014072809A (ja) Image generation device, image generation method, and program for image generation device
WO2012018001A1 (fr) Video processing device, display device, and video processing method
JP2013171539A (ja) Video processing device, video processing method, and computer program
US9113140B2 (en) Stereoscopic image processing device and method for generating interpolated frame with parallax and motion vector
KR101192121B1 (ko) Method and device for generating an anaglyph image using binocular parallax and depth information
JP5088973B2 (ja) Stereoscopic imaging device and imaging method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11814626

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11814626

Country of ref document: EP

Kind code of ref document: A1