WO2004014080A1 - Image processing apparatus and method, information processing apparatus and method, recording medium, and program - Google Patents
- Publication number
- WO2004014080A1 (PCT/JP2003/009062)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinates
- image
- small
- screen
- predetermined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2625—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect
- H04N5/2627—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of images from a temporal image sequence, e.g. for a stroboscopic effect for providing spin image effect, 3D stop motion effect or temporal freeze effect
Description
- Image processing apparatus and method, information processing apparatus and method, recording medium, and program
- The present invention relates to an image processing apparatus and method, an information processing apparatus and method, a recording medium, and a program, and more particularly to those capable of appropriately dividing and displaying a plurality of images.
Background Art
- FIG. 1 shows a configuration example of a conventional image processing system.
- N video cameras 2-1 to 2-N, arranged at predetermined positions with respect to the subject 1 (hereinafter referred to simply as the video camera 2 when it is not necessary to distinguish them individually; the same applies to other cases), supply the images obtained as a result of imaging to the image processing device 3.
- The image processing device 3 generates an image in which the N images supplied from the video cameras 2 are divided and displayed on the display 4A of the display device 4, and supplies that image to the display device 4.
- The display device 4 displays the image from the image processing device 3 on the display 4A.
- The processing content of the image processing device 3 differs depending on the number of video cameras 2 and the divided display mode.
- For example, the processing content of the image processing device 3 differs between the case where eight images obtained with the camera arrangement of FIG. 2 are divided and displayed as shown in FIG. 3 and the case where images obtained with the arrangement of FIG. 4 are divided and displayed as shown in FIG. 5.
- The number attached to each small screen W shown in FIGS. 3 and 5 corresponds to the number of the video camera 2 that provides the image displayed there. That is, for example, an image obtained as a result of imaging by the video camera 2-1 is displayed on the small screen W1, and an image obtained as a result of imaging by the video camera 2-2 is displayed on the small screen W2.
- The image processing device 3 is a dedicated device corresponding to these conditions. As a result, there has been a problem that conventional image processing systems cannot easily cope with changes in usage conditions.
Disclosure of the Invention
- The present invention has been made in view of such a situation, and is intended to make it possible to easily respond to a change in use conditions.
- An image processing apparatus of the present invention includes: detecting means for detecting coordinates on a small screen on which a small image is displayed, the small image including predetermined coordinates corresponding to predetermined coordinates on a large screen on which a large image composed of a plurality of small images arranged at predetermined positions is displayed; reading means for reading the pixel value of a pixel of a predetermined small image at a position corresponding to the coordinates on the small screen detected by the detecting means; and output means for outputting the pixel value read by the reading means as the pixel value of a pixel of the large image at the position corresponding to the predetermined coordinates on the large screen.
- The apparatus may further include storage means for storing a table in which coordinates on the large screen, coordinates on the small screen including those coordinates, and information for identifying the small image displayed on the small screen are associated with each other. The detecting means detects, from the table, the coordinates on the small screen corresponding to the predetermined coordinates on the large screen, and the reading means reads the pixel value of the pixel, at the position corresponding to the coordinates detected by the detecting means, of the small image identified by the identifying information associated with the predetermined coordinates in the table.
- The small image can be an image corresponding to a captured image obtained as a result of imaging by an imaging device.
- An image processing method of the present invention includes: a detection step of detecting coordinates on a small screen on which a small image is displayed, the small image including predetermined coordinates corresponding to predetermined coordinates on a large screen on which a large image composed of a plurality of small images arranged at predetermined positions is displayed; a reading step of reading the pixel value of a pixel of a predetermined small image at a position corresponding to the coordinates on the small screen detected in the detection step; and an output step of outputting the pixel value read in the reading step as the pixel value of a pixel of the large image at the position corresponding to the predetermined coordinates on the large screen.
- The program of the first recording medium includes: a detection control step of controlling detection of coordinates on a small screen on which a small image is displayed, the small image including predetermined coordinates corresponding to predetermined coordinates on a large screen on which a large image composed of a plurality of small images arranged at predetermined positions is displayed; a reading control step of controlling reading of the pixel value of a pixel of a predetermined small image at a position corresponding to the coordinates on the small screen detected in the detection control step; and an output control step of controlling output of the pixel value read in the reading control step as the pixel value of a pixel of the large image at the position corresponding to the predetermined coordinates on the large screen.
- A first program of the present invention includes: a detection control step of controlling detection of coordinates on a small screen on which a small image is displayed, the small image including predetermined coordinates corresponding to predetermined coordinates on a large screen on which a large image composed of a plurality of small images arranged at predetermined positions is displayed; a reading control step of controlling reading of the pixel value of a pixel of a predetermined small image at a position corresponding to the coordinates on the small screen detected in the detection control step; and an output control step of controlling output of the pixel value read in the reading control step as the pixel value of a pixel of the large image at the position corresponding to the predetermined coordinates on the large screen.
- That is, coordinates on a small screen on which a small image is displayed are detected, the small image including predetermined coordinates corresponding to predetermined coordinates on a large screen on which a large image composed of a plurality of small images arranged at predetermined positions is displayed; the pixel value of the pixel of the predetermined small image at the position corresponding to the detected coordinates on the small screen is read; and the read pixel value is output as the pixel value of the pixel of the large image at the position corresponding to the predetermined coordinates on the large screen.
- An information processing apparatus of the present invention includes: first detecting means for detecting coordinates on a small screen including coordinates on a large screen, corresponding to the coordinates on the large screen; second detecting means for detecting information for identifying the small image, associated with the coordinates on the large screen; and generating means for generating a first table by storing, in association with each other, the coordinates on the large screen, the coordinates on the small screen detected by the first detecting means, and the information for identifying the small image detected by the second detecting means.
- The small image is an image corresponding to a captured image obtained as a result of imaging by an imaging device, and the coordinates on the large screen can be associated, for each region corresponding to the imaging range of an imaging device, with the identification information of the imaging device having the corresponding imaging range as the information for identifying the small image.
- The apparatus may further include storage means for storing a second table in which the coordinates on the small screen of a small-screen captured image, cut out from the captured image corrected based on the conditions of the optical system of the imaging device, are associated with the coordinates on the large screen at which that small-screen captured image is positioned on the predetermined small screen.
- The first detecting means can detect the coordinates on the small screen from the second table, and then detect, from the detected coordinates, the coordinates on the small screen of the captured image before correction.
- An information processing method of the present invention includes: a first detection step of detecting coordinates on a small screen including coordinates on a large screen, corresponding to the coordinates on the large screen; a second detection step of detecting information for identifying the small image associated with the coordinates on the large screen; and a generation step of generating a first table by storing, in association with each other, the coordinates on the large screen, the coordinates on the small screen detected in the first detection step, and the information for identifying the small image detected in the second detection step.
- The program of the second recording medium of the present invention includes: a first detection control step of controlling detection of coordinates on a small screen including coordinates on a large screen, corresponding to the coordinates on the large screen; a second detection control step of controlling detection of information for identifying the small image associated with the coordinates on the large screen; and a generation control step of controlling generation of a first table by storing, in association with each other, the coordinates on the large screen, the coordinates on the small screen detected in the first detection control step, and the information for identifying the small image detected in the second detection control step.
- A second program of the present invention includes: a first detection control step of controlling detection of coordinates on a small screen including coordinates on a large screen, corresponding to the coordinates on the large screen; a second detection control step of controlling detection of information for identifying the small image associated with the coordinates on the large screen; and a generation control step of controlling generation of a first table by storing, in association with each other, the coordinates on the large screen, the coordinates on the small screen detected in the first detection control step, and the information for identifying the small image detected in the second detection control step.
- FIG. 1 is a block diagram showing a configuration example of a conventional image processing system.
- FIG. 2 is a diagram showing an example of the arrangement of the video cameras in FIG. 1.
- FIG. 3 is a diagram showing a display example of images obtained as a result of imaging by the video cameras of FIG. 2.
- FIG. 4 is a diagram showing another arrangement example of the video cameras in FIG. 1.
- FIG. 5 is a diagram showing another display example of images obtained as a result of imaging by the video cameras in FIG. 4.
- FIG. 6 is a block diagram illustrating a configuration example of an image processing system to which the present invention is applied.
- FIG. 7 is a diagram showing the size of the display of the display device of FIG. 6.
- FIG. 8 is a block diagram showing a configuration example of the image processing device of FIG. 6.
- FIG. 9 is a block diagram showing a configuration example of the coordinate conversion table generation device of FIG. 6.
- FIG. 10 is a diagram showing another display example of images obtained as a result of imaging by the video cameras of FIG. 6.
- FIG. 11 is a flowchart illustrating image processing of the image processing apparatus in FIG.
- FIG. 12 is a diagram illustrating an example of the coordinate conversion table.
- FIG. 13 is a flowchart illustrating the operation of the coordinate conversion table generation device of FIG. 6 when generating the coordinate conversion table.
- FIG. 14 is a diagram showing an example of a correspondence table between coordinates on the display and a camera number.
- FIG. 15 is a diagram showing an example of a correspondence table between the coordinates on the display and the coordinates on the small screen of the SD image after correction.
- FIG. 16 is a flowchart illustrating a process of generating the correspondence table of FIG. 15.
- FIG. 17A is a diagram illustrating the process of generating the correspondence table of FIG. 15.
- FIG. 17B is a diagram illustrating the process of generating the correspondence table of FIG. 15.
- FIG. 17C is a diagram illustrating the process of generating the correspondence table of FIG. 15.
- FIG. 18 is a diagram showing an example of the arrangement of the video cameras in FIG.
- FIG. 19 is a diagram illustrating a display example of an image obtained as a result of imaging by the video camera in FIG.
- FIG. 20 is a diagram illustrating the imaging range of the video camera in FIG.
- FIG. 21 is another diagram illustrating the imaging range of the video camera in FIG.
- FIG. 22 is another diagram illustrating the imaging range of the video camera in FIG.
- FIG. 23 is another diagram illustrating the imaging range of the video camera in FIG.
- FIG. 24 is another diagram illustrating the imaging range of the video camera in FIG.
- FIG. 25 is a diagram illustrating another example of the coordinate conversion table.
- FIG. 26 is a flowchart illustrating another operation of the coordinate conversion table generation device of FIG. 6 when generating the coordinate conversion table.
- FIG. 27 is a diagram showing another example of the correspondence table between the coordinates on the display and the camera number.
- FIG. 28 is a diagram showing an area on the display corresponding to the imaging range of the video camera in FIG.
- FIG. 29 is another diagram showing an area on the display corresponding to the imaging range of the video camera in FIG.
- FIG. 30 is another diagram showing an area on the display corresponding to the imaging range of the video camera in FIG.
- FIG. 31 is a diagram showing another example of a correspondence table between the coordinates on the display and the coordinates on the small screen of the SD image after correction.
- FIG. 32 is a flowchart illustrating the processing for generating the correspondence table in FIG. 31.
- FIG. 33 is an external view of the omnidirectional camera.
- FIG. 34 is a flowchart illustrating another operation of the coordinate conversion table generation device of FIG. 6 when generating the coordinate conversion table.
- FIG. 35 is a flowchart illustrating a process of generating a correspondence table between the coordinates on the display and the coordinates on the small screen of the SD image after correction.
- FIG. 6 shows a configuration example of an image processing system to which the present invention is applied.
- Parts corresponding to those in FIG. 1 are denoted by the same reference numerals.
- The N video cameras 2-1 to 2-N, arranged at predetermined positions with respect to the subject 1, supply the images obtained as a result of imaging (for example, SD (Standard Definition) images) to the image processing device 11.
- The image processing device 11 uses the coordinate conversion table Ta, generated by the coordinate conversion table generation device 13 and provided via the memory card 21, to generate from the SD images supplied from the video cameras 2-1 to 2-N an image (an HD (High Definition) image) in which the N SD images are divided and displayed in a predetermined form on the display 12A of the display device 12. The image processing device 11 supplies the generated HD image to the display device 12.
- The display device 12 displays the HD image from the image processing device 11 on the display 12A.
- The display 12A has a resolution of 1920 × 1080 pixels, as shown in FIG. 7.
- The coordinate conversion table generation device 13 generates a coordinate conversion table Ta corresponding to use conditions such as the number and arrangement positions of the video cameras 2, lens distortion, and the display form of the SD images, and provides it to the image processing device 11.
- FIG. 8 shows a configuration example of the image processing device 11.
- The decoders 31-1 to 31-N decode the SD images input from the video cameras 2 and supply them to the corresponding field memories 32-1 to 32-N.
- The field memories 32 store the image data supplied from the decoders 31 in field units.
- The conversion unit 33 appropriately reads the image data stored in the field memories 32-1 to 32-N and, based on the coordinate conversion table Ta stored in the storage unit 36, generates from the read image data an HD image in which the N images are divided and displayed in a predetermined form on the display 12A of the display device 12, and supplies it to the frame memory 34.
- The frame memory 34 stores the HD image supplied from the conversion unit 33.
- The encoder 35 appropriately reads and encodes the image data of the HD image stored in the frame memory 34, and supplies the resulting data to the display device 12.
- The storage unit 36 reads the coordinate conversion table Ta, via the interface 37, from the memory card 21 mounted on the image processing device 11, and stores it.
- FIG. 9 shows a configuration example of the coordinate conversion table generation device 13.
- An input/output interface 46 is connected to a CPU (Central Processing Unit) 41 via a bus 45. When a command is input to the CPU 41 via the input/output interface 46 from an input unit 48 including a keyboard, a mouse, and the like, the CPU 41 loads into a RAM (Random Access Memory) 43 and executes a program (for example, a program for generating the coordinate conversion table Ta) stored in a ROM (Read Only Memory) 42, on the hard disk 44, or on a recording medium mounted on a drive 50, such as a magnetic disk 61, an optical disk 62, a magneto-optical disk 63, or a semiconductor memory 64.
- The CPU 41 outputs the processing result to an output unit 47 such as an LCD (Liquid Crystal Display) via the input/output interface 46 as necessary.
- The program may be stored in advance on the hard disk 44 or in the ROM 42 provided integrally with the coordinate conversion table generation device 13, may be provided as package media such as the magnetic disk 61, the optical disk 62, the magneto-optical disk 63, or the semiconductor memory 64, or may be provided from a satellite or a network via the communication unit 49 and stored on the hard disk 44.
- The CPU 41 stores the generated coordinate conversion table Ta in the memory card 21 via the drive 50.
- The operation of the image processing device 11 in the case where the obtained images are divided and displayed as shown in FIG. 10 (hereinafter, such a use condition is appropriately referred to as the first use condition) will be described with reference to the flowchart of FIG. 11.
- In step S1, the conversion unit 33 of the image processing device 11 reads the coordinate conversion table Ta, shown in FIG. 12, stored in the storage unit 36.
- In the coordinate conversion table Ta, the coordinates on the display 12A, the coordinates on the small screen W (including the coordinates on the display 12A) corresponding to those coordinates, and the camera number assigned to the video camera 2 that provides the image displayed on that small screen W are associated with each other.
- Information indicating that black is to be output (indicated as "black output" in FIG. 12) is set for certain coordinates on the display 12A. Note that coordinates on the display 12A associated with this black-output information are not associated with coordinates on the small screen W or with a camera number.
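As a concrete illustration, the coordinate conversion table Ta described above might be represented as a mapping like the following. This is only a sketch; the patent does not specify a data structure, and all entries and names here are hypothetical:

```python
# Hypothetical in-memory layout of the coordinate conversion table Ta.
# Each display coordinate maps either to None (the "black output" marker)
# or to a (camera number, small-screen coordinates) pair.
BLACK_OUTPUT = None

coordinate_table = {
    # (x on display 12A, y on display 12A): (camera number, (x, y) on W)
    (0, 0): (1, (0, 0)),
    (100, 50): (1, (100, 50)),
    (1919, 1079): BLACK_OUTPUT,  # outside the small screens: output black
}

def lookup(table, display_coord):
    """Return (camera number, small-screen coord), or None to output black."""
    return table.get(display_coord, BLACK_OUTPUT)
```

Treating unknown coordinates as black mirrors the table's behavior of only associating a camera number and small-screen coordinates with coordinates that lie inside some small screen W.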
- In step S2, the conversion unit 33 selects one coordinate on the display 12A (the coordinate of one pixel forming the HD image).
- In step S3, the conversion unit 33 determines whether information indicating that black is to be output is associated in the coordinate conversion table Ta with the coordinates on the display 12A selected in step S2. If it is determined that the information is not associated, the process proceeds to step S4, and the conversion unit 33 detects from the coordinate conversion table Ta the camera number associated with the selected coordinates on the display 12A.
- In step S5, the conversion unit 33 selects the field memory 32 corresponding to the video camera 2 to which the camera number detected in step S4 is assigned.
- In step S6, the conversion unit 33 detects from the coordinate conversion table Ta the coordinates on the small screen W associated with the coordinates on the display 12A selected in step S2.
- In step S7, the conversion unit 33 reads, from the SD image stored in the field memory 32 selected in step S5, the pixel value of the pixel at the position corresponding to the coordinates on the small screen W detected in step S6, and stores it in the frame memory 34 as the pixel value to be output at the coordinates on the display 12A selected in step S2.
- If it is determined in step S3 that the information indicating that black is to be output is set, the process proceeds to step S8, where the conversion unit 33 stores in the frame memory 34 the pixel value for displaying black as the pixel value to be output at the coordinates on the display 12A selected in step S2.
- For example, when predetermined coordinates within the small screen W5 in FIG. 10 are selected in step S2, the camera number of the video camera 2-5 is associated with the selected coordinates in the coordinate conversion table Ta and the black-output information is not, so the camera number of the video camera 2-5 is detected (steps S3 and S4).
- Then, the field memory 32-5 corresponding to the video camera 2-5 is selected (step S5), and the coordinates on the small screen W associated with the selected coordinates on the display 12A are detected from the coordinate conversion table Ta (step S6).
- Then, the pixel value of the pixel of the SD image from the video camera 2-5 at the position corresponding to the detected coordinates on the small screen W is stored as the pixel value to be output at the selected coordinates on the display 12A (step S7).
- On the other hand, when coordinates outside the small screens W1 to W9 in FIG. 10 (the shaded portions in the figure) are selected in step S2, the information indicating that black is to be output is set for such coordinates in the coordinate conversion table Ta (step S3), so the pixel value for displaying black is stored as the pixel value to be output at the selected coordinates on the display 12A (step S8).
- In step S9, it is determined whether all the coordinates on the display 12A have been selected; if it is determined that unselected coordinates still remain, the process returns to step S2, and the next coordinate on the display 12A is selected.
- If it is determined in step S9 that all the coordinates have been selected, the processing ends.
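The loop of steps S1 through S9 amounts to a per-pixel table lookup. A minimal sketch, assuming the table and the field memories are plain Python mappings (the actual device operates on field and frame memories; all names here are hypothetical):

```python
BLACK = (0, 0, 0)  # pixel value displayed as black (step S8)

def build_hd_image(table, field_memories, width, height):
    """Steps S2-S9: fill the frame with one pixel per display coordinate.

    table:          {(x, y): None or (camera_number, (sx, sy))}
    field_memories: {camera_number: {(sx, sy): pixel_value}}
    """
    frame = {}
    for y in range(height):
        for x in range(width):               # step S2: select a coordinate
            entry = table.get((x, y))
            if entry is None:                # steps S3/S8: black output set
                frame[(x, y)] = BLACK
            else:                            # steps S4-S7: read the pixel
                camera, small_coord = entry
                frame[(x, y)] = field_memories[camera][small_coord]
    return frame                             # step S9: all coordinates done
```

Because all layout decisions live in the table, changing the number of cameras or the divided display mode only requires a new table, not new per-pixel logic, which is the flexibility the invention is after.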
- As described later, the coordinate conversion table Ta shown in FIG. 12 is generated by the coordinate conversion table generation device 13 based on the first use condition and the lens distortion of the video cameras 2. In the present system, the image processing device 11 therefore performs the above-described processing according to the coordinate conversion table Ta, so that the nine SD images supplied from the video cameras 2 can be divided and displayed as shown in FIG. 10.
- Next, the operation of the coordinate conversion table generation device 13 when generating the coordinate conversion table Ta will be described with reference to the flowchart of FIG. 13. In step S21, the CPU 41 of the coordinate conversion table generation device 13 selects one coordinate on the display 12A of the display device 12 (FIG. 10) (the coordinate of one pixel constituting the HD image).
- In step S22, the CPU 41 determines whether the coordinates on the display 12A selected in step S21 are within a small screen W (a screen on which an image corresponding to an SD image obtained as a result of imaging by the video camera 2 is displayed).
- In the correspondence table Tb1 shown in FIG. 14, the coordinates within the small screens W1 to W9 are associated with the camera number of the video camera 2 that provides the image displayed at those coordinates, and the coordinates in screens other than the small screens W1 to W9 are associated with information indicating that fact (hereinafter referred to as out-of-display-area information).
- The CPU 41 refers to the correspondence table Tb1 and determines whether a camera number of a video camera 2 is associated with the coordinates on the display 12A selected in step S21.
- If it is determined in step S22 that the coordinates on the display 12A selected in step S21 are within a small screen W (that is, in the correspondence table Tb1, a camera number of a video camera 2 is associated with the selected coordinates), the process proceeds to step S23, where the CPU 41 detects from the correspondence table Tb1 the camera number of the video camera 2 corresponding to the coordinates on the display 12A selected in step S21.
- In step S24, the CPU 41 detects the coordinates on the small screen W associated with the coordinates on the display 12A selected in step S21 from the correspondence table Tb2, shown in FIG. 15 and stored on the hard disk 44, which indicates the correspondence between the coordinates on the display 12A and the coordinates on the small screen W of the small-screen image cut out from the SD image that was obtained as a result of imaging by the video camera 2 and corrected, as described later, based on the lens distortion of the video camera 2.
- In step S25, the CPU 41 calculates, as described later, the coordinates on the small screen W of the SD image before correction (the original SD image) from the coordinates on the small screen W detected in step S24.
- In step S26, the CPU 41 stores the coordinates on the display 12A selected in step S21, the camera number detected in step S23, and the coordinates on the small screen W calculated in step S25 in the coordinate conversion table Ta (FIG. 12) in association with each other.
- If it is determined in step S22 that the coordinates on the display 12A selected in step S21 are not within a small screen W (that is, in the correspondence table Tb1, the out-of-display-area information is associated with the selected coordinates), the process proceeds to step S27.
- In step S27, the CPU 41 stores the coordinates on the display 12A selected in step S21 in the coordinate conversion table Ta (FIG. 12) in association with the information indicating that black is to be output.
- After step S26 or step S27, the process proceeds to step S28, where the CPU 41 determines whether all the coordinates on the display 12A have been selected. If it is determined that unselected coordinates still remain, the process returns to step S21, the next coordinate is selected, and the subsequent processing is executed.
- If it is determined in step S28 that all the coordinates have been selected, the process ends.
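Steps S21 through S28 can be sketched as follows. This is a hypothetical illustration only: the correspondence tables Tb1 and Tb2 are modeled as plain mappings, and the pre-correction coordinate calculation of step S25 is passed in as a function, since the patent describes it separately:

```python
BLACK_OUTPUT = "black"  # marker stored in Ta for out-of-area coordinates

def generate_table_ta(tb1, tb2, to_original, width, height):
    """Steps S21-S28: build the coordinate conversion table Ta.

    tb1:         {(x, y): camera_number or None}  display -> camera (FIG. 14)
    tb2:         {(x, y): (sx, sy)}  display -> corrected small-screen coords
    to_original: maps corrected small-screen coords back to the coordinates
                 of the SD image before correction (step S25)
    """
    ta = {}
    for y in range(height):
        for x in range(width):               # step S21: select a coordinate
            camera = tb1.get((x, y))         # step S22: inside a small screen?
            if camera is None:               # out-of-display-area information
                ta[(x, y)] = BLACK_OUTPUT    # step S27
            else:
                sx, sy = tb2[(x, y)]         # steps S23-S24
                ta[(x, y)] = (camera, to_original(sx, sy))  # steps S25-S26
    return ta                                # step S28: all coordinates done
```

Generating Ta offline from Tb1, Tb2, and the lens model is what lets the image processing device 11 itself remain a simple, condition-independent table-driven converter.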
- Next, the process of generating the correspondence table Tb2 will be described with reference to the flowchart of FIG. 16. In step S41, the CPU 41 of the coordinate conversion table generation device 13 selects one camera number from the camera numbers of the video cameras 2-1 to 2-9.
- In step S42, the CPU 41 assumes an SD image having the same size as the SD image obtained as a result of imaging by the video camera 2 (in this example, an image of the same size as the small screen W), and corrects the coordinates (Xa, Ya) on the small screen W of each pixel constituting that SD image according to equation (1) to calculate the coordinates (Xb, Yb) (the coordinates of the corrected SD image on the small screen W).
- Xb = Xac + (Xa - Xac)(1 + k1 x r^2 + k2 x r^4)
- Yb = Yac + (Ya - Yac)(1 + k1 x r^2 + k2 x r^4)  ... (1)
- Here, the coordinates (Xac, Yac) are the coordinates of the pixel located at the center of the SD image, k1 and k2 are coefficients determined based on the lens distortion of the video camera 2 and the like, and r is the distance between the pixel at the coordinates (Xa, Ya) and the pixel at the center of the distortion in the SD image.
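Equation (1) is a standard radial distortion model, and can be sketched in Python as follows. This is an illustration, not the patent's implementation; any coefficient values used in example calls are assumptions.

```python
# Forward correction of equation (1): map an uncorrected pixel (Xa, Ya) to the
# corrected coordinates (Xb, Yb), given the distortion center (Xac, Yac) and
# the lens-dependent coefficients k1 and k2.
def correct_point(xa, ya, xac, yac, k1, k2):
    r2 = (xa - xac) ** 2 + (ya - yac) ** 2   # r^2: squared distance from center
    factor = 1.0 + k1 * r2 + k2 * r2 ** 2    # 1 + k1*r^2 + k2*r^4
    return xac + (xa - xac) * factor, yac + (ya - yac) * factor
```

The center pixel maps to itself, and with k1 = k2 = 0 the correction is the identity.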
- In step S43, the CPU 41 determines the range of the SD image obtained as a result of the correction in step S42 corresponding to the size of the small screen W, and in step S44 selects the coordinates (Xb, Yb) on the small screen W of each pixel of the SD image within the determined range. In other words, the coordinates on the small screen W of the corrected SD image actually displayed on the small screen W are selected.
- The SD image (FIG. 17A) obtained as a result of imaging by the video camera 2 is enlarged and deformed into a pincushion shape, as shown by the solid line in FIG. 17B, by the correction according to equation (1). The coordinates within the range determined in step S43, shown by the dotted line in FIG. 17B, are therefore selected from among them (FIG. 17C).
- The coordinates (Xai, Yai) in FIG. 17A indicate arbitrary coordinates (Xa, Ya), and the coordinates (Xbi, Ybi) in FIGS. 17B and 17C represent the coordinates (Xb, Yb) obtained as a result of correcting the coordinates (Xai, Yai).
- In step S45, the CPU 41 selects one coordinate from the coordinates on the small screen W of the corrected SD image selected in step S44, and in step S46 converts it to coordinates on the display 12A.
- That is, the coordinates on the small screen W selected in step S45 are converted to the corresponding coordinates on the display 12A, on the assumption that the SD image in the range determined in step S43 (FIG. 17C) is mapped to the small screen W on which the image of the video camera 2 having the camera number selected in step S41 is displayed.
- For example, when the SD image in the range determined in step S43 is mapped to the small screen W5, the coordinates on the small screen W selected in step S45 are converted to the corresponding coordinates on the display 12A within the small screen W5.
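The conversion from a coordinate on a small screen W to a coordinate on the display 12A amounts to offsetting by the position of that small screen on the display. A minimal sketch, assuming a 3x3 tiling with camera 2-n shown on small screen Wn; the tile dimensions and layout are illustrative assumptions, not taken from the patent:

```python
# Assumed layout: small screens W1..W9 tile the display in a 3x3 grid, with
# camera 2-n shown on small screen Wn. Tile dimensions are illustrative only.
def small_screen_to_display(xb, yb, camera_number, tile_w=640, tile_h=480):
    row, col = divmod(camera_number - 1, 3)   # which tile this camera occupies
    return col * tile_w + xb, row * tile_h + yb
```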
- In step S47, the CPU 41 stores the coordinates on the small screen W of the corrected SD image selected in step S45 and the coordinates on the display 12A obtained in step S46 in the correspondence table Tb2 (FIG. 15) in correspondence with each other.
- In step S48, the CPU 41 determines whether all of the coordinates selected in step S44 have been selected. If it is determined that unselected coordinates remain, the process returns to step S45, the next coordinate is selected, and the subsequent processing is executed.
- When it is determined in step S48 that all the coordinates have been selected, the process proceeds to step S49, where the CPU 41 determines whether all the camera numbers have been selected. If it is determined that unselected camera numbers remain, the process returns to step S41 and the next camera number is selected.
- When it is determined in step S49 that all the camera numbers have been selected, the process ends.
- For example, the pixel value output to predetermined coordinates Ph (not shown) on the display 12A within the small screen W5 is the pixel value, in the SD image obtained as a result of imaging by the video camera 2-5, at the coordinates (Xbi, Ybi) on the small screen W (FIG. 17B) detected as corresponding to the coordinates Ph on the display 12A.
- In this way, the image processing device 11 does not need to calculate, for each input SD image obtained as a result of imaging by the video camera 2, coordinates corrected for the lens distortion of the video camera 2, nor the coordinates on the display 12A according to the divided display mode. As shown in FIG. 11, simply by setting the pixel values of the SD images into the HD image according to the coordinate conversion table Ta, it can generate an HD image in which the distortion-corrected SD images obtained as a result of imaging by the video cameras 2 are divided and displayed.
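The runtime advantage described here, one table lookup per HD pixel instead of per-frame distortion and layout computation, can be sketched as follows. The data layout (a dict-based table, NumPy arrays for images) is an assumption for illustration, not the patent's implementation.

```python
import numpy as np

# Fill each HD pixel from a precomputed table: an entry is either None
# (output black) or (camera number, Xa, Ya) giving the source SD pixel.
def compose_hd(sd_images, table, hd_shape):
    hd = np.zeros(hd_shape, dtype=np.uint8)   # black by default
    for (xh, yh), entry in table.items():
        if entry is not None:
            cam, xa, ya = entry
            hd[yh, xh] = sd_images[cam][ya, xa]
    return hd
```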
- In step S25, inverse correction corresponding to equation (1) is performed by optimization, and the coordinates on the small screen W of the SD image before correction are calculated.
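The patent does not spell out the optimization used here; one common way to invert a correction of the form of equation (1) is fixed-point iteration, shown below purely as an assumed sketch.

```python
# Invert equation (1) numerically: start from the corrected point and repeatedly
# divide the displacement by the distortion factor evaluated at the estimate.
def invert_correction(xb, yb, xac, yac, k1, k2, iters=20):
    xa, ya = xb, yb                            # initial guess
    for _ in range(iters):
        r2 = (xa - xac) ** 2 + (ya - yac) ** 2
        f = 1.0 + k1 * r2 + k2 * r2 ** 2
        xa = xac + (xb - xac) / f
        ya = yac + (yb - yac) / f
    return xa, ya
```

For the small k1 and k2 typical of lens distortion the iteration converges in a few steps, so that re-applying equation (1) to the result recovers (Xb, Yb).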
- Next, the operation of the image processing apparatus 11 will be described for a case where, as shown in FIG. 18, nine video cameras 2-1 to 2-9 are arranged so as to form a plane facing the distant subject, and the images obtained as a result of imaging by the video cameras 2 are displayed in a divided manner as shown in FIG. 19 (hereinafter, such a use condition is referred to as the second use condition, as appropriate).
- In this case, the video cameras 2 are arranged so that the imaging ranges of adjacent cameras partially overlap, as indicated by the dotted frames in FIG. 20 (the imaging range of the video camera 2-1 is the range indicated by the solid line in FIG. 21, that of the video camera 2-2 is the range indicated by the solid line in FIG. 22, that of the video camera 2-4 is the range indicated by the solid line in FIG. 23, and that of the video camera 2-5 is the range indicated by the solid line in FIG. 24).
- The image processing device 11 executes the processing shown in the flowchart of FIG. 11, as in the first use condition, using a coordinate conversion table Ta, as shown in FIG. 25, generated by the coordinate conversion table generation device 13 based on the second use condition, the lens distortion of the video camera 2, and the like. In other words, although the coordinate values handled change, the processing of the image processing apparatus 11 is substantially the same for the first use condition and the second use condition, so different use conditions can be handled easily.
- In the display form of FIG. 19, an HD image is displayed on the entire display 12A of the display device 12, so that, unlike the case of the first use condition (FIG. 12), information indicating that black is to be output is not set in the coordinate conversion table Ta. Therefore, in this example, the determination in step S3 of the flowchart of FIG. 11 never becomes YES, and a pixel value for outputting black is not stored in the processing of step S8.
- In step S61, the CPU 41 of the coordinate conversion table generation device 13 selects one set of coordinates on the display 12A of the display device 12 (the coordinates of one pixel constituting the HD image) (FIG. 19), and in step S62 selects one camera number.
- In step S63, the CPU 41 determines whether, in the correspondence table Tc1 shown in FIG. 27, which indicates the correspondence between the coordinates on the display 12A and the camera numbers, the coordinates on the display 12A selected in step S61 are associated with the camera number selected in step S62.
- In the correspondence table Tc1 shown in FIG. 27, the coordinates on the display 12A are associated, for each area shown in FIG. 28 corresponding to the imaging ranges of the video cameras 2 shown in FIG. 20, with the camera number of the video camera 2 having the corresponding imaging range. For example, the camera number of the video camera 2-1 is associated with the coordinates in the area Q1 (FIG. 29) on the display 12A corresponding to the imaging range (FIG. 21) of the video camera 2-1, and the camera number of the video camera 2-2 is associated with the coordinates in the area Q2 (FIG. 30) corresponding to the imaging range (FIG. 22) of the video camera 2-2. The coordinates on the display 12A belonging to both the area Q1 and the area Q2 are associated with both camera numbers of the video cameras 2-1 and 2-2.
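A table of this kind can be built by testing each display coordinate against each camera's display area; coordinates inside an overlap simply collect more than one camera number. An illustrative sketch, assuming the areas are axis-aligned rectangles (the patent does not constrain their shape):

```python
# Build a Tc1-like mapping: display coordinate -> list of camera numbers whose
# area contains it. areas maps camera number -> (x0, y0, x1, y1), half-open.
def build_tc1(areas, width, height):
    tc1 = {}
    for y in range(height):
        for x in range(width):
            tc1[(x, y)] = [cam for cam, (x0, y0, x1, y1) in areas.items()
                           if x0 <= x < x1 and y0 <= y < y1]
    return tc1
```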
- When it is determined in step S63 that the coordinates on the display 12A selected in step S61 are associated with the camera number selected in step S62, the process proceeds to step S64.
- In step S64, the CPU 41 detects, from the correspondence table Tc2 (FIG. 31) stored in the hard disk 44, which indicates the correspondence between the coordinates on the display 12A and the coordinates on the small screen W of the image cut out from the SD image obtained as a result of imaging by the video camera 2 and corrected based on the lens distortion of the video camera 2, the coordinates on the small screen W associated with the coordinates on the display 12A selected in step S61 and the camera number selected in step S62. The method of generating the correspondence table Tc2 will be described later.
- In step S65, the CPU 41 calculates, from the coordinates on the small screen W of the corrected SD image detected in step S64, the coordinates on the small screen W of the SD image before correction.
- When it is determined in step S63 that the coordinates on the display 12A selected in step S61 are not associated with the camera number selected in step S62, or after step S65, the process proceeds to step S66, where the CPU 41 determines whether all the camera numbers have been selected. If it is determined that unselected camera numbers remain, the process returns to step S62 and the next camera number is selected.
- When it is determined in step S66 that all the camera numbers have been selected, the process proceeds to step S67, where the CPU 41 determines whether the coordinates on the display 12A selected in step S61 are associated with a plurality of camera numbers in the correspondence table Tc1 (FIG. 27). If it is determined that they are associated with a plurality of camera numbers, the process proceeds to step S68.
- For example, when the coordinates (Xhi, Yhi) are selected in step S61, the process proceeds to step S68.
- In step S68, the CPU 41 determines one coordinate to be set in the coordinate conversion table Ta from the coordinates on the small screen W of the plurality of uncorrected SD images calculated by performing the processing of steps S62 to S65 a plurality of times (for example, the values calculated in step S65 when the camera number of the video camera 2-1 was selected in step S62 and when the camera number of the video camera 2-2 was selected).
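The passage leaves the selection rule open. Purely as an assumed illustration, one plausible rule keeps the candidate closest to the center of its own small screen, where lens distortion is smallest; neither the rule nor the assumed center (360, 240) comes from the patent.

```python
# Choose one (camera number, Xa, Ya) candidate among those computed for a
# display coordinate covered by several cameras. The tie-break rule and the
# assumed SD center (360, 240) are illustrative only.
def pick_candidate(candidates, center=(360.0, 240.0)):
    def dist2(c):
        _, xa, ya = c
        return (xa - center[0]) ** 2 + (ya - center[1]) ** 2
    return min(candidates, key=dist2)
```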
- When it is determined in step S67 that the coordinates on the display 12A selected in step S61 are not associated with a plurality of camera numbers in the correspondence table Tc1, or when the coordinates on one small screen W have been determined in step S68, the process proceeds to step S69.
- In step S69, the CPU 41 stores, in correspondence with one another in the coordinate conversion table Ta (FIG. 25), the coordinates on the display 12A selected in step S61, the coordinates on the small screen W calculated in step S65 (or, when the coordinates on the display 12A selected in step S61 are associated with a plurality of camera numbers, the coordinates on the small screen W determined in step S68), and the camera number selected in step S62 when those coordinates on the small screen W were calculated.
- In step S70, the CPU 41 determines whether all the coordinates on the display 12A have been selected. If it is determined that unselected coordinates remain, the process returns to step S61, the next coordinate is selected, and the subsequent processing is executed. When it is determined in step S70 that all the coordinates have been selected, the process ends.
- In step S81, the CPU 41 of the coordinate conversion table generation device 13 selects one camera number.
- In step S82, the CPU 41 assumes an SD image having the same size as the SD image obtained as a result of imaging by the video camera 2, and corrects the coordinates (Xa, Ya) on the small screen W of each pixel constituting the SD image (the coordinates on the small screen W of the SD image before correction) according to equation (1) to calculate the coordinates (Xb, Yb) (the coordinates on the small screen W of the SD image after correction).
- In step S83, the CPU 41 determines the range of the SD image obtained as a result of the correction in step S82 according to the size of the small screen W, and in step S84 selects the coordinates (Xb, Yb) on the small screen W of each pixel of the SD image within the determined range. That is, the coordinates on the small screen W of the corrected SD image displayed on the small screen W are selected.
- In step S85, the CPU 41 selects one coordinate from the coordinates on the small screen W of the corrected SD image selected in step S84, and in step S86 converts it to coordinates on the display 12A.
- That is, the coordinates on the small screen W selected in step S85 are converted to the corresponding coordinates on the display 12A, on the assumption that the SD image in the range determined in step S83 is mapped to the small screen W on which the image of the video camera 2 having the camera number selected in step S81 is displayed.
- For example, when the camera number of the video camera 2-5 is selected in step S81, the coordinates on the small screen W selected in step S85 are converted to the corresponding coordinates on the display 12A, on the assumption that the SD image in the range determined in step S83 is mapped to the small screen W5 (FIG. 19).
- In step S87, the CPU 41 stores the camera number selected in step S81, the coordinates on the small screen W of the corrected SD image selected in step S85, and the coordinates on the display 12A obtained in step S86 in the correspondence table Tc2 in correspondence with one another, as shown in FIG. 31.
- In step S88, the CPU 41 determines whether all of the coordinates selected in step S84 have been selected. If it is determined that unselected coordinates remain, the process returns to step S85 and the next coordinate is selected.
- When it is determined in step S88 that all the coordinates have been selected, the process proceeds to step S89, where the CPU 41 determines whether all the camera numbers have been selected. If it is determined that unselected camera numbers remain, the process returns to step S81 and the next camera number is selected.
- When it is determined in step S89 that all the camera numbers have been selected, the process ends.
- Next, a case will be described in which the eight video cameras 2 shown in FIG. 4 actually constitute an omnidirectional camera 101 as shown in FIG. The video cameras 2 are arranged at predetermined angular intervals so that their viewpoints coincide near the center, with the viewing directions of the respective video cameras 2 lying on one horizontal plane, and a plane mirror 110 is arranged in the viewing direction of each video camera 2. That is, since each video camera 2 captures the surrounding landscape reflected by the corresponding plane mirror 110, the omnidirectional camera 101 as a whole can capture a 360-degree horizontal landscape.
- In this case, the image processing apparatus 11 executes the processing shown in the flowchart of FIG. 11, as in the first use condition and the second use condition, using the coordinate conversion table Ta generated by the coordinate conversion table generation apparatus 13 based on the third use condition, the lens distortion of the video camera 2, and the like. As a result, the eight images obtained as a result of imaging by the video cameras 2 can be divided and displayed as shown in FIG. Next, the operation of the coordinate conversion table generation device 13 when generating the coordinate conversion table Ta used in this example will be described with reference to the flowchart of FIG.
- In steps S101 to S104, the same processing as in steps S61 to S64 of FIG. 26 is performed, and a description thereof is omitted.
- In step S105, the CPU 41 performs, on the coordinates on the small screen W of the corrected SD image detected in step S104, inverse correction by optimization and processing that is the reverse of the mirror inversion processing (reverse mirror inversion processing), and calculates the coordinates on the small screen W of the SD image before correction.
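Since a plane mirror flips the captured image left-to-right, the mirror inversion (and its reverse) can be modeled as reflecting the x coordinate about the image width. A sketch with an assumed SD width of 720 pixels; the width is an illustration, not a value from the patent.

```python
# Mirror inversion about the vertical axis of a width-pixel-wide image.
# Applying the function twice restores the original coordinate (an involution),
# which is why the same reflection also serves as the "reverse" processing.
def mirror_invert_x(x, y, width=720):
    return (width - 1) - x, y
```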
- In steps S106 to S110, the same processing as in steps S66 to S70 of FIG. 26 is performed, and a description thereof is omitted.
- In step S121, the CPU 41 of the coordinate conversion table generation device 13 selects one camera number.
- In step S122, the CPU 41 assumes an SD image having the same size as the SD image obtained as a result of imaging by the video camera 2, corrects the coordinates (Xa, Ya) of each pixel according to equation (1) to calculate the coordinates (Xb, Yb), and performs the mirror inversion processing.
- In steps S123 to S129, the same processing as in steps S83 to S89 of FIG. 32 is performed, and a description thereof is omitted.
- The series of processes described above can be realized by hardware, but can also be realized by software. In that case, a program constituting the software is installed in a computer, and by executing the program, the computer functionally realizes the image processing apparatus 11 and the coordinate conversion table generation apparatus 13 described above.
- The steps describing the program provided by the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually rather than necessarily in chronological order.
- A system refers to an entire apparatus composed of a plurality of devices.
- According to the present invention, divided display on small screens can be performed easily.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP03741438A EP1542471A4 (en) | 2002-07-31 | 2003-07-17 | IMAGE PROCESSING DEVICE, METHOD, INFORMATION PROCESSING DEVICE, METHOD, RECORDING MEDIUM AND PROGRAM |
| US10/523,078 US20060165309A1 (en) | 2002-07-31 | 2003-07-17 | Image processing apparatus and method, information processing apparatus and method, recording medium, and program |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2002224150A JP2004062103A (ja) | 2002-07-31 | 2002-07-31 | 画像処理装置および方法、情報処理装置および方法、記録媒体、並びにプログラム |
| JP2002-224150 | 2002-07-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2004014080A1 true WO2004014080A1 (ja) | 2004-02-12 |
Family
ID=31492124
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2003/009062 Ceased WO2004014080A1 (ja) | 2002-07-31 | 2003-07-17 | 画像処理装置および方法、情報処理装置および方法、記録媒体、並びにプログラム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20060165309A1 (ja) |
| EP (1) | EP1542471A4 (ja) |
| JP (1) | JP2004062103A (ja) |
| WO (1) | WO2004014080A1 (ja) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4297111B2 (ja) * | 2005-12-14 | 2009-07-15 | ソニー株式会社 | 撮像装置、画像処理方法及びそのプログラム |
| JP4345829B2 (ja) | 2007-03-09 | 2009-10-14 | ソニー株式会社 | 画像表示システム、画像表示装置、画像表示方法およびプログラム |
| TW201010429A (en) * | 2008-08-20 | 2010-03-01 | Av Tech Corp | Image processing system and method thereof |
| US8754941B1 (en) * | 2009-09-22 | 2014-06-17 | Altia Systems, Inc. | Multi-imager video camera with frame-by-frame view switching |
| JP5803184B2 (ja) * | 2010-11-19 | 2015-11-04 | 株式会社リコー | 画像投影装置、メモリアクセス方法 |
| KR101675804B1 (ko) * | 2015-05-27 | 2016-11-15 | 한화테크윈 주식회사 | 비디오 월(video wall)이 형성되는 감시 시스템 |
| WO2020042025A1 (zh) * | 2018-08-29 | 2020-03-05 | 深圳市大疆创新科技有限公司 | 视频的处理方法、装置、显示系统及存储介质 |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0777965A (ja) * | 1992-12-28 | 1995-03-20 | Sanyo Electric Co Ltd | 多分割画面表示装置 |
| JPH09270954A (ja) * | 1996-04-03 | 1997-10-14 | Hitachi Ltd | 画面合成回路 |
| WO2000064175A1 (en) * | 1999-04-16 | 2000-10-26 | Matsushita Electric Industrial Co., Ltd. | Image processing device and monitoring system |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4890314A (en) * | 1988-08-26 | 1989-12-26 | Bell Communications Research, Inc. | Teleconference facility with high resolution video display |
| US6088005A (en) * | 1996-01-11 | 2000-07-11 | Hewlett-Packard Company | Design and method for a large, virtual workspace |
| JP3339284B2 (ja) * | 1996-01-29 | 2002-10-28 | 三菱電機株式会社 | 大画面表示方式 |
| JP2962348B2 (ja) * | 1996-02-08 | 1999-10-12 | 日本電気株式会社 | 画像符号変換方式 |
| JPH11155142A (ja) * | 1997-11-19 | 1999-06-08 | Mitsubishi Electric Corp | 医療支援システム |
| JP2004500731A (ja) * | 1998-06-18 | 2004-01-08 | ソニー エレクトロニクス インク | 複数のディスプレイ装置に亘ってビデオ及び/又はグラフィック画像を分割し、スケーリングし、表示する方法及び装置 |
| US6657637B1 (en) * | 1998-07-30 | 2003-12-02 | Matsushita Electric Industrial Co., Ltd. | Moving image combining apparatus combining computer graphic image and at least one video sequence composed of a plurality of video frames |
| US6477314B1 (en) * | 1999-07-29 | 2002-11-05 | Geo Vision Inc. | Method of recording image data, and computer system capable of recording image data |
| JP3557168B2 (ja) * | 2000-11-27 | 2004-08-25 | 三洋電機株式会社 | レンズ歪み係数算出装置および算出方法、レンズ歪み係数算出プログラムを記録したコンピュータ読み取り可能な記録媒体 |
| US6912695B2 (en) * | 2001-09-13 | 2005-06-28 | Pixia Corp. | Data storage and retrieval system and method |
- 2002-07-31: JP JP2002224150A patent/JP2004062103A/ja not_active Abandoned
- 2003-07-17: EP EP03741438A patent/EP1542471A4/en not_active Withdrawn
- 2003-07-17: WO PCT/JP2003/009062 patent/WO2004014080A1/ja not_active Ceased
- 2003-07-17: US US10/523,078 patent/US20060165309A1/en not_active Abandoned
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP1542471A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1542471A4 (en) | 2005-12-07 |
| US20060165309A1 (en) | 2006-07-27 |
| JP2004062103A (ja) | 2004-02-26 |
| EP1542471A1 (en) | 2005-06-15 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): US |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2003741438 Country of ref document: EP |
|
| WWP | Wipo information: published in national office |
Ref document number: 2003741438 Country of ref document: EP |
|
| ENP | Entry into the national phase |
Ref document number: 2006165309 Country of ref document: US Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 10523078 Country of ref document: US |
|
| WWP | Wipo information: published in national office |
Ref document number: 10523078 Country of ref document: US |