
WO2012039043A1 - Stereo image generation device, stereo image generation method, and computer program for stereo image generation - Google Patents

Stereo image generation device, stereo image generation method, and computer program for stereo image generation

Info

Publication number
WO2012039043A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
point
coefficient
movement component
rotational movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2010/066450
Other languages
English (en)
Japanese (ja)
Inventor
智史 島田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to PCT/JP2010/066450 priority Critical patent/WO2012039043A1/fr
Priority to JP2012534860A priority patent/JP5392415B2/ja
Publication of WO2012039043A1 publication Critical patent/WO2012039043A1/fr
Priority to US13/849,013 priority patent/US9094672B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/211 Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras

Definitions

  • the present invention relates to, for example, a stereo image generation apparatus, a stereo image generation method, and a computer program for stereo image generation that generate a stereo image based on two images obtained by photographing the same object from different directions.
  • a first image is created by photographing a target object at a first point using one camera, and then the camera is moved to a second point.
  • a second image is created by photographing the target object.
  • the first image and the second image are created by photographing the target object with two cameras arranged at the first point and the second point.
  • the first image and the second image are a set of stereo images.
  • It is preferable that the two images be taken under the same conditions as those under which an observer normally views an object. Since the left and right eyes of a human are aligned in the horizontal direction, it is preferable that the first point and the second point also be aligned horizontally. In addition, since the lines of sight of the left and right eyes are substantially parallel, the optical axis of the camera at the first point and the optical axis of the camera at the second point are preferably parallel to each other.
  • In order to arrange the cameras accurately in this way at the first point and the second point, the photographer must perform strict alignment adjustment: the position and orientation of the camera at each point are measured, and the arrangement of the camera is adjusted according to the measurement result.
  • Such strict alignment adjustment requires, for example, that an operator perform surveying with a measuring instrument and then carry out various manual operations to adjust the camera position according to the result. Enormous cost and work time are therefore required to create a stereo image, and in practice it is often difficult to perform such exact alignment adjustment.
  • When the camera height at the first point differs from the camera height at the second point, or the camera optical axis direction at the first point differs from that at the second point, one image is rotated around the optical axis with respect to the other image.
  • To address this, a technique has been proposed that creates an image suitable for a stereo image by converting the positions of points on the image (see, for example, Patent Document 1 and Non-Patent Document 1).
  • In such a technique, the positions of points on the image are converted, taking into account camera characteristics such as focal length and distortion aberration, so as to correct the difference in camera height between the two shooting points, the difference in optical axis direction, and the rotational deviation around the optical axis.
  • For example, an image processing apparatus determines a rotation form or a rotation-translation form of the planar projective transformation so that corresponding feature points extracted from the two images match each other. The image processing apparatus then generates a stereo image by converting the position of each point on the image using that rotation or rotation-translation form.
  • another known image processing apparatus generates a stereo image by converting an image in consideration of epipolar geometry.
  • the epipolar geometry represents a geometric correspondence between a point of interest on the image and each shooting position when one point of interest on the object to be imaged is shot from two different shooting positions.
  • For example, the line connecting the camera arranged at the first position and the point of interest is projected as a straight line on the image taken by the camera arranged at the second position.
  • Similarly, the line connecting the camera arranged at the second position and the point of interest is projected as a straight line on the image taken by the camera arranged at the first position.
  • the straight line projected on this image is called an epipolar line.
  • An apparatus using epipolar geometry takes this correspondence into account and obtains planar projective transformation coefficients that parallelize the epipolar lines; by converting at least one image with these coefficients, it can create, in a pseudo manner, a pair of images as if they had been taken with the camera optical axes parallel.
  • In addition, this technique can obtain relatively accurate coefficients even when some of the feature point pairs are incorrectly associated.
  • However, due to parallax, the distance between the feature points extracted from the two images for a point located on the near side of the photographed object is larger than the distance between the feature points extracted for a point located on the far side.
  • To match such images exactly, a non-linear position conversion would be required that adjusts the amount of movement of each point on the image according to the distance to the object at that point.
  • In contrast, the position conversion performed by the rotation form or the rotation-translation form is a linear conversion that maps straight lines to straight lines. For this reason, when the photographed object has depth, the image of the object can be distorted in an image converted using planar projective transformation coefficients determined from feature points on that object.
  • an object of the present specification is to provide a stereo image generation apparatus, a stereo image generation method, and a stereo image generation computer program that can suppress distortion of an image reflected in an image generated by plane projective transformation.
  • According to one embodiment, a stereo image generation device is provided. The device includes: a feature point extraction unit that extracts a plurality of feature points from a first image obtained by photographing an object at a first point; a tracking unit that calculates, from a second image obtained by photographing the object at a second point different from the first point, the coordinates of points that correspond to the plurality of feature points extracted from the first image as the same points on the object; a parameter determination unit that determines a planar projective transformation coefficient, consisting of a coefficient representing a rotational movement component and a coefficient representing a parallel movement component, so as to minimize an evaluation value that includes the distances between the feature points of the first image after planar projective transformation and the corresponding points on the second image, together with a weight term whose value depends on the rotational movement component; and a conversion unit that applies the planar projective transformation to the first image and generates the pair of the converted image and the second image as a stereo image.
  • According to another embodiment, a stereo image generation method is provided.
  • In the method, a plurality of feature points are extracted from a first image obtained by photographing an object at a first point; the coordinates of points that correspond to the plurality of feature points extracted from the first image as the same points on the object are calculated from a second image obtained by photographing the object at a second point different from the first point; a planar projective transformation coefficient, consisting of coefficients representing a rotational movement component and a parallel movement component, is determined so as to minimize an evaluation value that includes the distances between the transformed feature points and the corresponding points, together with a weight term whose value depends on the rotational movement component; the first image is subjected to planar projective transformation using the determined coefficient; and the pair of the converted image and the second image is generated as a stereo image.
  • According to still another embodiment, a computer program for causing a computer to generate a stereo image is provided.
  • The computer program comprises instructions that cause the computer to: extract a plurality of feature points from a first image obtained by photographing an object at a first point; calculate, from a second image obtained by photographing the object at a second point different from the first point, the coordinates of points that correspond to the plurality of feature points extracted from the first image as the same points on the object; determine a planar projective transformation coefficient, consisting of coefficients representing a rotational movement component and a parallel movement component, so as to minimize an evaluation value that includes the distances between the transformed feature points and the corresponding points, together with a weight term whose value depends on the rotational movement component; subject the first image to planar projective transformation using the determined coefficient; and generate the pair of the converted image and the second image as a stereo image.
  • The stereo image generation device, the stereo image generation method, and the computer program for stereo image generation disclosed herein can suppress distortion of the object image in an image generated by planar projective transformation.
  • FIG. 1A is a diagram showing an example of two images taken at different points
  • FIG. 1B is a diagram showing an example of an image obtained by converting one of the two images shown in FIG. 1A using a planar projective transformation coefficient determined by a prior-art technique.
  • FIG. 2 is a schematic configuration diagram of a stereo image generating apparatus according to one embodiment.
  • FIG. 3 is a diagram illustrating the relationship between the parallax set for two images and the reproduction position in the depth direction of the images on the two images.
  • FIG. 4 is an operation flowchart of the stereo image generation process.
  • FIG. 5 is a schematic configuration diagram of a stereo image generating apparatus according to another embodiment.
  • FIG. 1A is a diagram showing an example of an image shown in two images taken using two cameras arranged so that the optical axes are parallel at two different points.
  • the same object 101 is shown in both the first image 100 and the second image 110.
  • the object 101 has a depth.
  • the four corners 102a to 105a and 102b to 105b of the object 101 are extracted as feature points in order to obtain a plane projective transformation coefficient.
  • a set of feature points (102a, 102b), (103a, 103b), (104a, 104b), and (105a, 105b) correspond to the same point of the object 101, respectively.
  • The corners of the object 101 corresponding to the feature point pairs (102a, 102b) and (103a, 103b) are located on the near side, while the corners corresponding to the feature point pairs (104a, 104b) and (105a, 105b) are located on the far side.
  • Therefore, relative to the positions of the respective feature points on the first image 100, the feature points 102b and 103b in the second image 110, which correspond to the near corners, are shifted to the left by a larger amount than the feature points 104b and 105b, which correspond to the far corners.
  • the feature point pairs (104a, 104b) and (105a, 105b) are located above the feature point pairs (102a, 102b) and (103a, 103b).
  • the planar projective transformation that minimizes the positional deviation between the feature points between the two images 100 and 110 includes a rotational movement component that rotates the image clockwise.
  • FIG. 1B shows a converted image 120 obtained by performing the planar projective transformation on the first image 100 shown in FIG. 1A using the planar projective transformation coefficient determined by the prior art.
  • When such a planar projective transformation coefficient, which includes this excess rotational movement component, is used for the conversion, there is a risk that distortion of the object image in the converted image 120 will be noticeable.
  • To generate a stereo image, the stereo image generation apparatus according to the present embodiment obtains a planar projective transformation coefficient for applying planar projective transformation to at least one of two images obtained by photographing an object at different points.
  • The planar projective transformation coefficient includes only a coefficient representing a rotational movement component and a coefficient representing a parallel movement component, and it is determined so that the rotational movement component does not exceed the amount of rotation between the optical axis of the camera at one shooting point and the optical axis of the camera at the other shooting point.
  • FIG. 2 is a schematic configuration diagram of a stereo image generating apparatus according to one embodiment.
  • The stereo image generation device 1 includes an image acquisition unit 11, a storage unit 12, a feature point extraction unit 13, a tracking unit 14, a parameter determination unit 15, a stereoscopic effect adjustment unit 16, a conversion unit 17, and an output unit 18.
  • Each part of the stereo image generating device 1 is formed as a separate circuit.
  • each unit of the stereo image generation device 1 may be mounted on the stereo image generation device 1 as one integrated circuit in which circuits corresponding to the respective units are integrated.
  • the stereo image generating apparatus 1 may include one or more processors and a memory circuit.
  • the feature point extraction unit 13, the tracking unit 14, the parameter determination unit 15, the stereoscopic effect adjustment unit 16, and the conversion unit 17 are functions realized by a computer program executed on the processor included in the stereo image generation device 1. It may be a module.
  • The image acquisition unit 11 acquires the images obtained by photographing the same object with the camera 10-1 placed at the first point and with the camera 10-2 placed at a second point different from the first point.
  • For this purpose, the image acquisition unit 11 includes, for example, an interface circuit conforming to a serial bus standard such as Universal Serial Bus (USB) for connecting the cameras 10-1 and 10-2 and the stereo image generation device 1 to each other.
  • Alternatively, the image acquisition unit 11 may include an interface circuit for connecting the stereo image generation device 1, via a communication network, to another device such as a server that stores the two images generated by the cameras 10-1 and 10-2.
  • the image acquisition unit 11 stores the acquired two images in the storage unit 12.
  • the image acquisition unit 11 outputs the image acquired from the camera 10-1 of the two acquired images to the feature point extraction unit 13. Further, the image acquisition unit 11 outputs the image acquired from the camera 10-2 to the tracking unit 14.
  • an image created by the camera 10-1 is referred to as a first image
  • an image created by the camera 10-2 is referred to as a second image.
  • the storage unit 12 includes, for example, a readable / writable volatile or nonvolatile semiconductor memory circuit, or a magnetic recording medium or an optical recording medium.
  • The storage unit 12 stores the first and second images received from the image acquisition unit 11. In addition, when at least one of the units of the stereo image generation device 1 is implemented as a functional module of a computer program executed on a processor, the storage unit 12 may store that computer program.
  • the feature point extraction unit 13 extracts a characteristic point of an image shown in each image as a feature point from the first image. For example, the feature point extraction unit 13 performs a filtering process using a corner detection filter such as a Harris operator or a Forstner operator on the first image, and extracts points having corner-like features as feature points.
  • When the first image is a color image, the feature point extraction unit 13 may, for example, convert the pixel values of the first image into HSV color system values and perform the filtering process on the pixel values representing brightness.
  • Alternatively, the feature point extraction unit 13 may select other characteristic points as feature points, for example pixels whose values are distinctly higher or lower than the values of the surrounding pixels in the first image.
  • the feature point extraction unit 13 generates information representing a plurality of feature points for each image.
  • the information representing a plurality of feature points includes, for example, the position of each feature point.
  • the information representing a plurality of feature points may include characteristics of each feature point, for example, pixel values of the feature points, corner directions, and the like.
  • the feature point extraction unit 13 then outputs the information to the tracking unit 14.
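  • As an illustrative sketch (not part of the original disclosure), the corner-based feature extraction described above might look as follows using OpenCV; the function name, parameter values, and the choice of cv2.goodFeaturesToTrack with the Harris detector are assumptions for illustration:

```python
import cv2
import numpy as np

def extract_feature_points(first_image_bgr, max_corners=500):
    """Sketch of the feature point extraction unit 13: convert the first
    image to a brightness channel, then detect corner-like points with a
    Harris-based detector. All parameter values are illustrative."""
    # The text mentions converting color images to the HSV color system
    # and filtering the brightness values; V is the brightness channel.
    hsv = cv2.cvtColor(first_image_bgr, cv2.COLOR_BGR2HSV)
    brightness = hsv[:, :, 2]
    # Harris-based corner detection; returns an (N, 1, 2) float32 array
    # of feature point coordinates.
    corners = cv2.goodFeaturesToTrack(
        brightness,
        maxCorners=max_corners,
        qualityLevel=0.01,
        minDistance=7,
        useHarrisDetector=True,
        k=0.04,
    )
    return corners
```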
  • The tracking unit 14 searches the second image, obtained by photographing the object at the second point different from the first point, for the points corresponding to the plurality of feature points extracted from the first image, and thereby calculates the coordinates of the points that correspond to those feature points as the same points on the object. The tracking unit 14 obtains at least one such pair of corresponding feature points. In the present embodiment, the tracking unit 14 identifies the point on the second image corresponding to a feature point of interest on the first image by using a tracking technique such as the Kanade-Lucas-Tomasi (KLT) method, thereby obtaining pairs of corresponding feature points.
  • Alternatively, the tracking unit 14 may obtain pairs of corresponding feature points using any of various known tracking methods, such as a gradient method or a block matching method. In that case, the tracking unit 14 may treat two feature points as a corresponding pair as long as they have similar characteristics. For each pair of corresponding feature points, the tracking unit 14 generates corresponding-feature-point information including an identifier and the positions of the two feature points included in the pair. The tracking unit 14 then outputs the corresponding-feature-point information to the parameter determination unit 15.
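  • The tracking step can be sketched with OpenCV's pyramidal Lucas-Kanade tracker (cv2.calcOpticalFlowPyrLK), a common KLT-style implementation; the function and parameter choices are again assumptions for illustration:

```python
def track_feature_points(first_gray, second_gray, feature_points):
    """Sketch of the tracking unit 14: find, for each feature point of the
    first image, the corresponding point on the second image."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        first_gray, second_gray, feature_points, None,
        winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1                 # keep successfully tracked pairs
    v = feature_points[ok].reshape(-1, 2)    # points v_i on the first image
    v_prime = next_pts[ok].reshape(-1, 2)    # corresponding points v'_i
    return v, v_prime
```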
  • the parameter determination unit 15 calculates a planar projection conversion coefficient for performing planar projection conversion on one of the two images so that the images of the same object to be photographed in each image match.
  • the parameter determination unit 15 includes an evaluation value calculation unit 151, a determination unit 152, and a conversion parameter correction unit 153.
  • In the present embodiment, the stereo image generation device 1 performs planar projective transformation of the first image so that the image of the photographed object shown in the first image matches the image of the photographed object shown in the second image.
  • the plane projective transformation generally includes a rotational movement component, a parallel movement component, and a shear component.
  • However, to make the image of the same photographed object on the first image approximately match that on the second image, only the rotational movement component and the parallel movement component of the planar projective transformation need to be considered. Therefore, in the present embodiment, the first image is subjected to planar projective transformation according to the following equation (1).
  • Here, p_1 = (x, y, 1)^t is a vector representing the coordinates of a pixel on the first image, where x is the horizontal coordinate and y is the vertical coordinate.
  • R_x(θ_x), R_y(θ_y), and R_z(θ_z) are matrices representing rotational movement components centered on the x-, y-, and z-axes, respectively, where the z direction is taken along the optical axis of the camera 10-1 and the x- and y-axes are the horizontal and vertical axes orthogonal to the optical axis; they are expressed by the equations below.
  • θ_x, θ_y, and θ_z are planar projective transformation coefficients representing the rotational movement components around the x-axis, y-axis, and z-axis, respectively.
  • T(t_x, t_y, t_z) is a translation matrix representing the parallel movement component, also given below.
  • t_x, t_y, and t_z are planar projective transformation coefficients representing the translation components along the x-, y-, and z-axes, respectively.
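  • The equation images themselves are not reproduced in this text; written out, the transformation and the standard rotation matrices that the definitions above describe take the following form (the 3 × 3 homogeneous form shown for T, including how t_z enters, is a reconstruction rather than a verbatim copy of the patent's equations):

```latex
p_2 \simeq N \, R_x(\theta_x) \, R_y(\theta_y) \, R_z(\theta_z) \, T(t_x, t_y, t_z) \, N^{-1} \, p_1 \qquad (1)

R_x(\theta_x) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_x & -\sin\theta_x \\ 0 & \sin\theta_x & \cos\theta_x \end{pmatrix}, \quad
R_y(\theta_y) = \begin{pmatrix} \cos\theta_y & 0 & \sin\theta_y \\ 0 & 1 & 0 \\ -\sin\theta_y & 0 & \cos\theta_y \end{pmatrix}, \quad
R_z(\theta_z) = \begin{pmatrix} \cos\theta_z & -\sin\theta_z & 0 \\ \sin\theta_z & \cos\theta_z & 0 \\ 0 & 0 & 1 \end{pmatrix}

T(t_x, t_y, t_z) = \begin{pmatrix} 1 & 0 & t_x \\ 0 & 1 & t_y \\ 0 & 0 & 1 + t_z \end{pmatrix} \quad \text{(reconstructed form)}
```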
  • N is an internal parameter matrix that represents the characteristics of the camera 10-1, and N^{-1} is the inverse matrix of N.
  • The internal parameter matrix N is represented, for example, by the following equation, where f is the focal length of the camera 10-1 and (c_x, c_y) are the horizontal and vertical coordinates on the first image of the point on the optical axis.
  • When the focal length of the camera 10-1 is not known, the focal length corresponding to a commonly used angle of view may be used as the focal length f, for example.
  • When the aspect ratio of the pixel height to the pixel width of the image sensor included in the camera 10-1 is not 1:1, the internal parameter matrix N may be expressed, for example, according to the following equation, with a separate vertical focal length f_y = f/H.
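  • Written out in the standard pinhole form that these definitions describe (a reconstruction, since the equation images are not reproduced here):

```latex
N = \begin{pmatrix} f & 0 & c_x \\ 0 & f & c_y \\ 0 & 0 & 1 \end{pmatrix},
\qquad
N = \begin{pmatrix} f & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
\quad \text{with } f_y = f/H \text{ for non-square pixels.}
```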
  • To suppress an excessive rotational movement component, the evaluation value includes a weight term that increases as the rotational movement component increases, while remaining smaller than the amount by which the rotational movement component moves the feature points.
  • the parameter determination unit 15 obtains a planar projective transformation coefficient that minimizes such an evaluation value.
  • the evaluation value C is expressed by the following equation.
  • Here, N is an integer of 1 or more representing the number of pairs of corresponding feature points, and the distance between a transformed feature point and its corresponding point is the Euclidean distance {(x′ − x)² + (y′ − y)²}^{1/2}.
  • W(v_i, v′_i, θ_x, θ_y, θ_z) is a weight term that increases according to the rotational movement component.
  • Note that the first term on the right side of equation (6) may instead be the sum of the absolute values of the differences between each point v′_i and the corresponding transformed point h(θ_x, θ_y, θ_z, t_x, t_y, t_z)v_i.
  • ε_i is an error caused by deviation of the extracted feature point positions or by distortion.
  • Equation (8) holds for every pair of corresponding feature points; therefore, the following equation (9) holds. If the error term ε_i is sufficiently small, the following condition (10)′ is derived from equation (9), where the coefficient α satisfies 0 < α < 1.
  • The left side of expression (10)′ is the evaluation value C when the rotational movement amount and the parallel movement amount are all 0 (that is, with h(0, 0, 0, 0, 0, 0)), and corresponds to the first term on the right side of equation (6). Expression (10)′ therefore shows that the sum of the squared distances between each feature point v_i, rotated by the angle θ about the z-axis, and its corresponding feature point, plus the rotational movement amount multiplied by the coefficient α, is smaller than that evaluation value.
  • In other words, when an appropriate value of α is chosen, the right side of expression (10)′ is smaller than the evaluation value C obtained when there is no rotational movement component and no parallel movement component.
  • Conversely, when the movement of the optical axis of the camera 10-2 with respect to the optical axis of the camera 10-1 does not include a rotational movement component centered on the z-axis, the relationship of expression (10)′ does not hold. In that case, no matter how the value of α is adjusted, the right side of expression (10)′ does not become smaller than the evaluation value C without rotational and parallel movement components.
  • Taking into consideration that all rotational movement components may be included, the weight term is expressed, for example, by the following equation (11), where r_i is the distance from the center of the first image to the feature point v_i.
  • The coefficients α_x, α_y, and α_z are each positive real numbers less than 1 (that is, 0 < α_x, α_y, α_z < 1), and they may be set to mutually different values. The larger the value of a coefficient, the larger the weight term due to the corresponding rotational movement component of the planar projective transformation, and as a result the larger the evaluation value C.
  • For example, the coefficient α_y may be set to a value smaller than the other coefficients α_x and α_z so that a larger rotational movement component about the y-axis is permitted in the planar projective transformation coefficient.
  • the weight term may be expressed by the following equation.
  • In some cases, the stereo image generation device 1 can obtain information about the direction of the optical axis of each camera, for example via the image acquisition unit 11.
  • the parameter determination unit 15 can calculate the difference between the optical axis directions of the respective cameras as an estimated value of the rotation amount of the optical axis of the camera 10-2 with respect to the optical axis of the camera 10-1. Therefore, in such a case, the weight term may be defined as the following equation using the estimated value of the rotation amount.
  • Here, θ_x0, θ_y0, and θ_z0 are the estimated values of the rotation amount of the optical axis of the camera 10-2 with respect to the optical axis of the camera 10-1 around the x-axis, the y-axis, and the z-axis, respectively.
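  • As a concrete sketch of the evaluation value, the following computes the sum of squared distances between each transformed feature point and its corresponding point plus a rotation-dependent weight term. Since the exact expressions of equations (6) and (11) to (13) are not reproduced in this text, the weight shown here, r_i-scaled absolute rotation angles with coefficients α_x, α_y, α_z, is an assumed stand-in with the properties described above:

```python
import numpy as np

def homography(theta, t, N_mat):
    """h(θx, θy, θz, tx, ty, tz): planar projective transformation built
    from rotational and parallel movement components only (cf. eq. (1)).
    The 3x3 translation matrix form is an assumption."""
    cx, sx = np.cos(theta[0]), np.sin(theta[0])
    cy, sy = np.cos(theta[1]), np.sin(theta[1])
    cz, sz = np.cos(theta[2]), np.sin(theta[2])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.array([[1, 0, t[0]], [0, 1, t[1]], [0, 0, 1 + t[2]]])
    return N_mat @ Rx @ Ry @ Rz @ T @ np.linalg.inv(N_mat)

def evaluation_value(theta, t, v, v_prime, N_mat, alpha, center):
    """Evaluation value C: squared reprojection distances plus a weight
    term that grows with the rotational movement components."""
    H = homography(theta, t, N_mat)
    pts = np.hstack([v, np.ones((v.shape[0], 1))]) @ H.T
    hv = pts[:, :2] / pts[:, 2:3]            # back to pixel coordinates
    dist2 = np.sum((v_prime - hv) ** 2, axis=1)
    r = np.linalg.norm(v - center, axis=1)   # r_i: distance from image center
    weight = np.sum(r) * (alpha[0] * abs(theta[0]) +
                          alpha[1] * abs(theta[1]) +
                          alpha[2] * abs(theta[2]))
    return np.sum(dist2) + weight
```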
  • the evaluation value calculation unit 151 calculates the evaluation value C according to the expression (6) using the planar projective transformation coefficient corrected by the conversion parameter correction unit 153 until the evaluation value is determined to be converged by the determination unit 152 described later. .
  • the evaluation value calculation unit 151 may calculate a differential value C ′ of the evaluation value C instead of the evaluation value C itself. In this case, the evaluation value calculation unit 151 calculates a primary partial differential value obtained by partial differentiation of the right side of the equation (6) with respect to each plane projective transformation coefficient. Then, the evaluation value calculation unit 151 sets the value having the maximum absolute value among the primary partial differential values for each plane projective transformation coefficient as the differential value C ′.
  • the evaluation value calculation unit 151 passes the calculated evaluation value C or the differential value C ′ of the evaluation value C and the plane projective transformation coefficient used for calculating the evaluation value C and the like to the determination unit 152.
  • the determination unit 152 determines whether or not the evaluation value C calculated by the evaluation value calculation unit 151 or its differential value C ′ satisfies the convergence condition. For example, when the evaluation value C is less than the predetermined threshold value Thc, the determination unit 152 determines that the evaluation value C satisfies the convergence condition and the evaluation value C is minimized.
  • the predetermined threshold Thc is set to 0.1, for example.
  • Similarly, when the absolute value of the differential value C′ is less than a predetermined threshold Thd, the determination unit 152 determines that the differential value C′ satisfies the convergence condition and that the evaluation value C is minimized.
  • the predetermined threshold Thd is set to 0.000001, for example.
  • When the convergence condition is satisfied, the determination unit 152 outputs the planar projective transformation coefficient to the stereoscopic effect adjustment unit 16.
  • On the other hand, when the convergence condition is not satisfied, the determination unit 152 passes the planar projective transformation coefficient used for calculating the evaluation value C to the conversion parameter correction unit 153, in order to continue the search for a planar projective transformation coefficient suitable for the planar projective transformation of the first image.
  • the conversion parameter correction unit 153 corrects the value of at least one plane projection conversion coefficient.
  • the conversion parameter correction unit 153 corrects all the planar projection conversion coefficients according to the following equation so that the evaluation value C decreases.
  • The partial derivative ∂C/∂q_j is calculated by partially differentiating the right side of equation (6) with respect to the planar projective transformation coefficient q_j. The step-size coefficient μ is set to 0.2, for example. The value of ∂C/∂q_j may be calculated numerically. Note that the conversion parameter correction unit 153 may correct only one coefficient in one correction step.
  • the conversion parameter correction unit 153 may correct only the planar projective transformation coefficient that maximizes the absolute value of the partial differential value of the evaluation value C in the equation (14). Alternatively, the conversion parameter correction unit 153 may correct each planar projection conversion coefficient one by one in a predetermined order. The conversion parameter correction unit 153 passes the corrected plane projection conversion coefficient to the evaluation value calculation unit 151.
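  • Building on the evaluation_value sketch above, the search for the coefficients can be written as numerical gradient descent over the six coefficients q = (θ_x, θ_y, θ_z, t_x, t_y, t_z), assuming the update of equation (14) takes the usual form q_j ← q_j − μ · ∂C/∂q_j; the step size μ = 0.2 and the threshold Thc = 0.1 follow the example values in the text:

```python
def fit_transform_coefficients(v, v_prime, N_mat, alpha, center,
                               mu=0.2, thc=0.1, max_iter=1000):
    """Sketch of units 151-153: minimize C by numerical gradient descent."""
    q = np.zeros(6)                  # step S104: all coefficients start at 0
    eps = 1e-6                       # finite-difference step
    for _ in range(max_iter):
        c = evaluation_value(q[:3], q[3:], v, v_prime, N_mat, alpha, center)
        if c < thc:                  # step S106: convergence condition
            break
        grad = np.empty(6)
        for j in range(6):           # numerical partial derivatives dC/dq_j
            dq = q.copy()
            dq[j] += eps
            grad[j] = (evaluation_value(dq[:3], dq[3:], v, v_prime,
                                        N_mat, alpha, center) - c) / eps
        q -= mu * grad               # step S107: correct the coefficients
    return q
```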
  • the parameter determination unit 15 corrects the planar projective transformation coefficient until the evaluation value C or its differential value C ′ satisfies the convergence condition. Therefore, the parameter determination unit 15 can obtain a plane projective transformation coefficient corresponding to the minimized evaluation value C. Further, as expressed by the equations (6) and (11) to (13), the calculation formula for the evaluation value C includes a weight term that becomes heavier as the rotational movement component increases. Therefore, the evaluation value C for the planar projective transformation coefficient including a coefficient representing a rotational movement component larger than the rotation amount between the optical axes of the two cameras is not minimized. On the other hand, the value of the weight term is smaller than the moving amount of the feature point due to the rotational movement component.
  • the parameter determination unit 15 can obtain an appropriate planar projective transformation coefficient by minimizing the evaluation value C.
  • the parameter determination unit 15 may obtain a planar projective transformation coefficient that minimizes the evaluation value using a wide-area optimization method such as simulated annealing or a genetic algorithm. Also in this case, the evaluation value is set so as to have a weight term expressed by the equations (11) to (13).
  • The stereoscopic effect adjustment unit 16 corrects the planar projective transformation coefficient so as to give the two images a parallax corresponding to the desired position in the depth direction at which the image on the stereo image, created based on the two images acquired via the image acquisition unit 11, is to be reproduced.
  • For example, in a three-dimensional display using binocular parallax, the image in the right-eye image and the image in the left-eye image are created so that their viewing directions differ only by the parallax obtained when the object is located at a predetermined distance from the observer. Since a person's left and right eyes are separated in the horizontal direction, the parallax need only have a horizontal component.
  • Therefore, the stereoscopic effect adjustment unit 16 modifies, according to the parallax, the coefficient t_x representing the horizontal translation amount among the planar projective transformation coefficients received from the parameter determination unit 15.
  • the parallax is set in advance and stored in the storage unit 12, for example.
  • the stereo image generation device 1 may acquire parallax via a user interface such as a keyboard (not shown).
  • FIG. 3 is a diagram illustrating the relationship between the parallax set for two images and the reproduction position in the depth direction of the images on the two images.
  • the image 300 is an image for the left eye, and is created, for example, by subjecting the first image created by the camera 10-1 to plane projective transformation.
  • the image 310 is an image for the right eye, for example, the second image itself created by the camera 10-2.
  • a point 301 on the image 300 and a point 311 on the image 310 represent the same point on the image of the same object.
  • the three-dimensional image by the point 301 and the point 311 is reproduced at a position where the line of sight 302 connecting the point 301 and the left eye of the observer intersects with the line of sight 312 connecting the point 311 and the right eye of the observer. Therefore, as the point 301 moves to the right with respect to the point 311, the three-dimensional image of the point 301 and the point 311 is reproduced closer to the observer. Conversely, as the distance between the point 311 and the point 301 is shorter, the three-dimensional image by the point 301 and the point 311 is reproduced near the display on which the images 300 and 310 are displayed. When the line of sight 302 and the line of sight 312 intersect at a position farther than the display on which the images 300 and 310 are displayed, the three-dimensional image of the point 301 and the point 311 is reproduced on the back side of the display.
  • For points corresponding to the same object on the two images, the stereoscopic effect adjustment unit 16 makes the ratio of the distance from the observer to the position where the three-dimensional image is reproduced, with respect to the distance between the left and right eyes of a typical person, equal to the ratio of the distance from the display to that reproduction position, with respect to the distance between the corresponding points.
  • The stereoscopic effect adjustment unit 16 sets a horizontal movement amount s_x corresponding to the parallax so as to satisfy this condition.
  • For example, the stereoscopic effect adjustment unit 16 calculates a 3 × 3 matrix H from the parameters obtained by the parameter determination unit 15, that is, from the rotation matrices R and the translation matrix T, using equation (1); the element in row i and column j of the matrix H is denoted h_ij.
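  • The exact way s_x enters the matrix is not recoverable from this text; a minimal sketch, assuming the horizontal movement amount is simply added to the horizontal translation element h_13 of the normalized matrix H, is:

```python
def adjust_stereoscopic_effect(H, s_x):
    """Sketch of the stereoscopic effect adjustment unit 16: shift the
    converted image horizontally by s_x pixels to set the parallax."""
    H_adj = H / H[2, 2]    # normalize so that h_33 = 1
    H_adj[0, 2] += s_x     # add the horizontal movement amount to h_13
    return H_adj
```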
  • the stereoscopic effect adjusting unit 16 outputs the planar projective transformation coefficient modified according to the parallax to the converting unit 17.
  • In addition, there can be a region whose image is projected to only one eye; the stereoscopic effect adjustment unit 16 may specify the range of such a region according to the parallax.
  • the stereoscopic effect adjusting unit 16 outputs information representing the range of the region to the converting unit 17.
  • the conversion unit 17 creates a stereo image by performing planar projective transformation of one image using the obtained planar projective transformation coefficient.
  • the planar projective transformation coefficient for the first image created by the camera 10-1 is obtained. Therefore, the conversion unit 17 reads the first image from the storage unit 12, and performs plane projective conversion on each pixel of the first image using the plane projection conversion coefficient.
  • For each pixel of the converted image, the conversion unit 17 obtains the pixel value after conversion by, for example, linear interpolation using the values of the converted pixels positioned around that pixel.
  • When the conversion unit 17 receives from the stereoscopic effect adjustment unit 16 information on the range of a region whose image is projected to only one eye, it may set the value of each pixel in that range to 0.
  • the conversion unit 17 outputs a set of the first image obtained by plane projection conversion and the second image read from the storage unit 12 to the output unit 18 as a stereo image.
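  • Assuming OpenCV, the conversion step corresponds to warping the first image with the final 3 × 3 matrix; bilinear interpolation matches the linear interpolation described above:

```python
def convert_first_image(first_image, H_adj):
    """Sketch of the conversion unit 17: apply the planar projective
    transformation to the first image. The warped image and the second
    image together form the stereo pair."""
    h, w = first_image.shape[:2]
    return cv2.warpPerspective(first_image, H_adj, (w, h),
                               flags=cv2.INTER_LINEAR)
```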
  • the output unit 18 includes, for example, an interface circuit for connecting the stereo image generation device 1 to another device. Note that the output unit 18 may add header information in a predetermined format to the stereo image.
  • the header information includes information such as the parallax and the sizes of the first and second images, for example. Then, the output unit 18 outputs the created stereo image to another device.
  • FIG. 4 is an operation flowchart of the stereo image generation process.
  • First, the stereo image generation device 1 acquires, via the image acquisition unit 11, a first image from the camera 10-1 placed at the first point and a second image from the camera 10-2 placed at a second point different from the first point (step S101).
  • the first and second images are stored in the storage unit 12.
  • the first image is passed to the feature point extraction unit 13.
  • the second image is passed to the tracking unit 14.
  • the feature point extraction unit 13 extracts feature points from the first image (step S102).
  • the feature point extraction unit 13 outputs information representing the feature points of the image to the tracking unit 14.
  • the tracking unit 14 determines a set of corresponding feature points between the first image and the second image using the tracking process (step S103).
  • the tracking unit 14 outputs information representing a set of corresponding feature points to the parameter determination unit 15.
  • Next, the parameter determination unit 15 initializes the planar projective transformation coefficient (step S104). With this initialization, for example, the coefficients θ_x, θ_y, θ_z, t_x, t_y, and t_z are each set to 0. The parameter determination unit 15 then passes the initialized planar projective transformation coefficient to its evaluation value calculation unit 151. The evaluation value calculation unit 151 calculates the evaluation value C, which is the sum of the distances between the position of each feature point of the first image after planar projective transformation with the coefficient and the position of the corresponding point on the second image, plus the weight term according to the rotational movement amount (step S105).
  • the evaluation value calculation unit 151 passes the evaluation value C and the plane projective transformation coefficient used for calculation of the evaluation value C to the determination unit 152 of the parameter determination unit 15.
  • the determination unit 152 determines whether or not the evaluation value C is less than a predetermined threshold value Thc (step S106). When the evaluation value C is equal to or greater than the predetermined threshold value Thc (No in step S106), the determination unit 152 passes the plane projection conversion coefficient used for calculating the evaluation value C to the conversion parameter correction unit 153 of the parameter determination unit 15.
  • the conversion parameter correction unit 153 corrects the planar projection conversion coefficient (step S107). Then, the conversion parameter correction unit 153 passes the corrected plane projection conversion coefficient to the evaluation value calculation unit 151. Then, the parameter determination unit 15 repeats the processes after step S105.
  • On the other hand, when the evaluation value C is less than the predetermined threshold Thc (Yes in step S106), the determination unit 152 outputs the planar projective transformation coefficient used for calculating the evaluation value C to the stereoscopic effect adjustment unit 16. The stereoscopic effect adjustment unit 16 then corrects the planar projective transformation coefficient so as to give the parallax corresponding to the target image reproduction position (step S108), and outputs the corrected planar projective transformation coefficient to the conversion unit 17.
  • the conversion unit 17 reads the first and second images from the storage unit 12.
  • the conversion unit 17 creates a stereo image by performing the planar projective transformation on the first image using the planar projective transformation coefficient received from the stereoscopic effect adjusting unit 16 (step S109). Then, the conversion unit 17 outputs the stereo image to another device via the output unit 18. Thereafter, the stereo image generating apparatus 1 ends the stereo image generating process.
  • Note that the evaluation value calculation unit 151 may calculate the differential value C′ of the evaluation value C, as described above. In this case, in step S106, when the absolute value of the differential value C′ is less than the threshold Thd for the differential value of the evaluation value, the determination unit 152 determines that the convergence condition is satisfied and that the evaluation value C has reached the minimum value.
  • As described above, this stereo image generation apparatus limits the planar projective transformation coefficients to a coefficient representing a rotational movement component and a coefficient representing a parallel movement component, which are the movements expected of an actual camera. Furthermore, this stereo image generation apparatus obtains each planar projective transformation coefficient so as to minimize an evaluation value that includes the amount of positional deviation between the feature points extracted from one image and subjected to planar projective transformation and the corresponding points extracted from the other image, together with a weight term that increases according to the rotational movement component. Thereby, if the planar projective transformation coefficient were to include a coefficient corresponding to a rotational movement component larger than the actual amount of rotation between the optical axes of the two cameras, the evaluation value would become relatively large and would not be minimized. The stereo image generation apparatus can therefore avoid adopting a planar projective transformation coefficient whose rotational movement component is larger than the actual amount of rotation between the optical axes, and can thus create a stereo image with little image distortion.
  • According to a modification, the stereo image generation device may instead subject the second image to planar projective transformation so that the image of the photographed object in the second image matches the image of the photographed object in the first image.
  • the stereo image generating apparatus obtains a planar projective transformation coefficient for the second image.
  • the feature point extraction unit may extract feature points from the second image.
  • the tracking unit detects a point on the first image corresponding to the feature point extracted from the second image.
  • the stereo image generation device may receive a moving image from each camera and create a stereo image for each frame.
  • the stereo image generating apparatus creates a planar projective transformation coefficient when the first frame is received from each camera and when at least one of the cameras moves.
  • FIG. 5 is a schematic configuration diagram of a stereo image generating apparatus according to this modification.
  • the stereo image generating apparatus 2 according to this modification includes an image acquisition unit 11, a storage unit 12, a feature point extraction unit 13, a tracking unit 14, a parameter determination unit 15, a stereoscopic effect adjustment unit 16, and a conversion unit 17. And an output unit 18 and an inter-frame difference unit 19.
  • The stereo image generation device 2 differs from the stereo image generation device 1 according to the above embodiment, shown in FIG. 2, in that it includes the inter-frame difference unit 19. In the following, therefore, the inter-frame difference unit 19 and related points are described; for the other components, refer to the description of the above embodiment.
  • When the stereo image generation device 2 receives the first frame from each camera, it calculates a planar projective transformation coefficient and stores it in the storage unit 12. In addition, every time a frame is received from each camera, the stereo image generation device 2 temporarily stores the frame in the storage unit 12. The storage unit 12 holds the latest several frames of each camera's moving image.
  • the inter-frame difference unit 19 determines, for each moving image of each camera, whether or not the camera has moved by examining changes in pixel values between the latest frame and the previous frame. For this purpose, the inter-frame difference unit 19 performs an inter-frame difference between the latest frame and the previous frame, and obtains a difference value representing a change in pixel value in the time direction for each pixel. The inter-frame difference unit 19 determines that the camera has moved when the number of pixels whose absolute value of the difference value is equal to or greater than a predetermined threshold is equal to or greater than a predetermined ratio with respect to the number of pixels of the entire image.
  • the threshold for the absolute value of the difference value is set to, for example, a value of about 1/4 to 1/2 of the range of values that the luminance value can take.
  • the predetermined ratio is set to a value of about 40% to 70%, for example.
  • When the inter-frame difference unit 19 determines that one of the cameras has moved, the stereo image generation device 2 newly calculates a planar projective transformation coefficient and replaces the coefficient stored in the storage unit 12 with the newly calculated one. The conversion unit 17 then generates a stereo image by applying the planar projective transformation with the newly calculated coefficient to the frame of one camera. On the other hand, when the inter-frame difference unit 19 determines that no camera has moved, the conversion unit 17 generates a stereo image by applying the planar projective transformation to the frame of one camera using the previously created coefficient stored in the storage unit 12. In this case, the processing of the feature point extraction unit 13, the tracking unit 14, the parameter determination unit 15, and the stereoscopic effect adjustment unit 16 is omitted.
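  • A sketch of the camera-movement test, following the thresholds described above (a difference threshold of roughly 1/4 to 1/2 of the luminance range and a changed-pixel ratio of roughly 40% to 70%; the concrete values below are illustrative):

```python
def camera_moved(prev_frame_gray, latest_frame_gray,
                 diff_threshold=64, ratio_threshold=0.5):
    """Sketch of the inter-frame difference unit 19: decide whether a
    camera moved by counting pixels with a large temporal change."""
    diff = cv2.absdiff(latest_frame_gray, prev_frame_gray)
    changed = np.count_nonzero(diff >= diff_threshold)
    return changed / diff.size >= ratio_threshold
```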
  • According to still another modification, the stereo image generation device may generate a stereo image based on two images received from a single camera.
  • In this case, the stereo image generation device receives, via the image acquisition unit, the first image taken at the first point and stores it in the storage unit. Then, after the camera has been moved to the second point and the object has been photographed again, the stereo image generation device receives the second image via the image acquisition unit.
  • When both the first and second images have been received, the stereo image generation device obtains a planar projective transformation coefficient according to the above embodiment and creates a stereo image by applying planar projective transformation to one of the first and second images using that coefficient.
  • In this case, the stereo image generation device may also receive a third image captured by the camera at at least one point between the first point and the second point. Then, the feature point extraction unit of the stereo image generation device extracts the feature points from the first image, and the tracking unit obtains the pairs of corresponding feature points between the first image and the third image and the pairs of corresponding feature points between the third image and the second image; thereby, the points on the second image corresponding to the feature points extracted from the first image may be identified.
  • Furthermore, a computer program that causes a computer to realize the functions of the feature point extraction unit, the tracking unit, the parameter determination unit, the stereoscopic effect adjustment unit, and the conversion unit of the stereo image generation device may be provided in a form recorded on a computer-readable medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A stereo image generation device includes: a feature point extraction unit that extracts a plurality of feature points from a first image taken at a first point; a tracking unit that calculates, from a second image taken at a second point, the coordinates of the points that correspond to the plurality of feature points of the first image as the same points on the imaged object; a parameter determination unit that determines a planar projective transformation coefficient, comprising a coefficient representing a rotational movement component and a coefficient representing a parallel movement component, so as to minimize an evaluation value that includes the distances between the feature points of the first image after planar projective transformation with that coefficient and the corresponding points in the second image, together with a weight term having a value corresponding to the rotational movement component; and a conversion unit that converts the first image by planar projective transformation using the planar projective transformation parameter and generates the combination of the converted image and the second image as a stereo image.
PCT/JP2010/066450 2010-09-22 2010-09-22 Stereo image generation device, stereo image generation method, and computer program for stereo image generation Ceased WO2012039043A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2010/066450 WO2012039043A1 (fr) 2010-09-22 2010-09-22 Stereo image generation device, stereo image generation method, and computer program for stereo image generation
JP2012534860A JP5392415B2 (ja) 2010-09-22 2010-09-22 Stereo image generation device, stereo image generation method, and computer program for stereo image generation
US13/849,013 US9094672B2 (en) 2010-09-22 2013-03-22 Stereo picture generating device, and stereo picture generating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/066450 WO2012039043A1 (fr) 2010-09-22 2010-09-22 Stereo image generation device, stereo image generation method, and computer program for stereo image generation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/849,013 Continuation US9094672B2 (en) 2010-09-22 2013-03-22 Stereo picture generating device, and stereo picture generating method

Publications (1)

Publication Number Publication Date
WO2012039043A1 (fr) 2012-03-29

Family

ID=45873557

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/066450 Ceased WO2012039043A1 (fr) Stereo image generation device, stereo image generation method, and computer program for stereo image generation

Country Status (3)

Country Link
US (1) US9094672B2 (fr)
JP (1) JP5392415B2 (fr)
WO (1) WO2012039043A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021135163A (ja) * 2020-02-27 2021-09-13 株式会社サキコーポレーション Inspection device

Families Citing this family (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US8866920B2 (en) 2008-05-20 2014-10-21 Pelican Imaging Corporation Capturing and processing of images using monolithic camera array with heterogeneous imagers
JP2011523538A (ja) 2008-05-20 2011-08-11 Pelican Imaging Corporation Capturing and processing of images using a monolithic camera array with different types of imaging devices
EP2502115A4 (fr) 2009-11-20 2013-11-06 Pelican Imaging Corp Capturing and processing of images using a monolithic camera array equipped with heterogeneous imagers
CN103004180A (zh) 2010-05-12 2013-03-27 Pelican Imaging Corporation Architectures for imager arrays and array cameras
US9485495B2 (en) 2010-08-09 2016-11-01 Qualcomm Incorporated Autofocus for stereo images
US8878950B2 (en) 2010-12-14 2014-11-04 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using super-resolution processes
TWI405030B (zh) * 2011-02-23 2013-08-11 Largan Precision Co Ltd Method and device for stereoscopic image capture by converting the imaging axis
KR101973822B1 (ko) 2011-05-11 2019-04-29 Fotonation Cayman Limited Systems and methods for transmitting and receiving array camera image data
KR20140045458A (ko) 2011-06-28 2014-04-16 Pelican Imaging Corporation Optical arrangements for use with an array camera
US20130265459A1 (en) 2011-06-28 2013-10-10 Pelican Imaging Corporation Optical arrangements for use with an array camera
WO2013043751A1 (fr) 2011-09-19 2013-03-28 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing using pixel apertures
US9438889B2 (en) 2011-09-21 2016-09-06 Qualcomm Incorporated System and method for improving methods of manufacturing stereoscopic image sensors
CN104081414B (zh) 2011-09-28 2017-08-01 Fotonation Cayman Limited Systems and methods for encoding and decoding light field image files
WO2013126578A1 (fr) 2012-02-21 2013-08-29 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Corporation Camera modules patterned with pi filter groups
JP2015534734A (ja) 2012-06-28 2015-12-03 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays, optic arrays, and sensors
US20140002674A1 (en) 2012-06-30 2014-01-02 Pelican Imaging Corporation Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors
DK4296963T3 (da) 2012-08-21 2025-03-03 Adeia Imaging Llc Method for depth detection in images captured with array cameras
CN104685513B (zh) 2012-08-23 2018-04-27 Pelican Imaging Corporation Feature-based high-resolution motion estimation from low-resolution images captured using an array source
WO2014043641A1 (fr) 2012-09-14 2014-03-20 Pelican Imaging Corporation Systems and methods for correcting user-identified artifacts in light field images
EP2901671A4 (fr) 2012-09-28 2016-08-24 Pelican Imaging Corp Generating images from light fields utilizing virtual viewpoints
US9398264B2 (en) 2012-10-19 2016-07-19 Qualcomm Incorporated Multi-camera system using folded optics
WO2014078443A1 (fr) 2012-11-13 2014-05-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
US9462164B2 (en) 2013-02-21 2016-10-04 Pelican Imaging Corporation Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
WO2014133974A1 (fr) 2013-02-24 2014-09-04 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9638883B1 (en) 2013-03-04 2017-05-02 Fotonation Cayman Limited Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process
US9917998B2 (en) 2013-03-08 2018-03-13 Fotonation Cayman Limited Systems and methods for measuring scene information while capturing images using array cameras
US8866912B2 (en) 2013-03-10 2014-10-21 Pelican Imaging Corporation System and methods for calibration of an array camera using a single captured image
US9106784B2 (en) 2013-03-13 2015-08-11 Pelican Imaging Corporation Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
US9124831B2 (en) 2013-03-13 2015-09-01 Pelican Imaging Corporation System and methods for calibration of an array camera
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
WO2014165244A1 (fr) 2013-03-13 2014-10-09 Pelican Imaging Corporation Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
WO2014159779A1 (fr) 2013-03-14 2014-10-02 Pelican Imaging Corporation Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US9100586B2 (en) 2013-03-14 2015-08-04 Pelican Imaging Corporation Systems and methods for photometric normalization in array cameras
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
DK2973476T3 (da) 2013-03-15 2025-05-19 Adeia Imaging Llc Systems and methods for stereo imaging with camera arrays
US9445003B1 (en) 2013-03-15 2016-09-13 Pelican Imaging Corporation Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
WO2014150856A1 (fr) 2013-03-15 2014-09-25 Pelican Imaging Corporation Array camera implementing quantum dot color filters
US9633442B2 (en) 2013-03-15 2017-04-25 Fotonation Cayman Limited Array cameras including an array camera module augmented with a separate camera
JP6092371B2 (ja) * 2013-04-10 2017-03-08 Toshiba Corporation Electronic device and image processing method
US10178373B2 (en) 2013-08-16 2019-01-08 Qualcomm Incorporated Stereo yaw correction using autofocus feedback
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
WO2015070105A1 (fr) 2013-11-07 2015-05-14 Pelican Imaging Corporation Methods of manufacturing array camera modules incorporating independently aligned lens stacks
WO2015074078A1 (fr) 2013-11-18 2015-05-21 Pelican Imaging Corporation Estimating depth from projected texture using camera arrays
US9426361B2 (en) 2013-11-26 2016-08-23 Pelican Imaging Corporation Array camera configurations incorporating multiple constituent array cameras
US10089740B2 (en) 2014-03-07 2018-10-02 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US9383550B2 (en) 2014-04-04 2016-07-05 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
US9374516B2 (en) 2014-04-04 2016-06-21 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
US9247117B2 (en) 2014-04-07 2016-01-26 Pelican Imaging Corporation Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array
US9521319B2 (en) 2014-06-18 2016-12-13 Pelican Imaging Corporation Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US10013764B2 (en) 2014-06-19 2018-07-03 Qualcomm Incorporated Local adaptive histogram equalization
US9386222B2 (en) * 2014-06-20 2016-07-05 Qualcomm Incorporated Multi-camera system using folded optics free from parallax artifacts
US9819863B2 (en) 2014-06-20 2017-11-14 Qualcomm Incorporated Wide field of view array camera for hemispheric and spherical imaging
US9549107B2 (en) 2014-06-20 2017-01-17 Qualcomm Incorporated Autofocus for folded optic array cameras
US9541740B2 (en) 2014-06-20 2017-01-10 Qualcomm Incorporated Folded optic array camera using refractive prisms
US9294672B2 (en) 2014-06-20 2016-03-22 Qualcomm Incorporated Multi-camera system using folded optics free from parallax and tilt artifacts
KR101558805B1 (ko) * 2014-09-03 2015-10-07 Hyundai Motor Company Interpolation coefficient correction apparatus for stereo matching
KR20170063827A (ko) 2014-09-29 2017-06-08 Fotonation Cayman Limited Systems and methods for dynamic calibration of array cameras
US9832381B2 (en) 2014-10-31 2017-11-28 Qualcomm Incorporated Optical image stabilization for thin cameras
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
JP6636963B2 (ja) * 2017-01-13 2020-01-29 Toshiba Corporation Image processing apparatus and image processing method
JP6780093B2 (ja) * 2017-03-30 2020-11-04 FUJIFILM Corporation Image processing apparatus and image processing method
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
CN110532840B (zh) * 2018-05-25 2022-05-10 Shenzhen UBTECH Technology Co., Ltd. Deformation recognition method, apparatus and device for square objects
WO2021055585A1 (fr) 2019-09-17 2021-03-25 Boston Polarimetrics, Inc. Systems and methods for surface modeling using polarization cues
CN114746717A (zh) 2019-10-07 2022-07-12 Boston Polarimetrics, Inc. Systems and methods for surface normal sensing using polarization
WO2021108002A1 (fr) 2019-11-30 2021-06-03 Boston Polarimetrics, Inc. Systems and methods for transparent object segmentation using polarization cues
US11195303B2 (en) 2020-01-29 2021-12-07 Boston Polarimetrics, Inc. Systems and methods for characterizing object pose detection and measurement systems
JP7542070B2 (ja) 2020-01-30 2024-08-29 Intrinsic Innovation LLC Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarization images
US11953700B2 (en) 2020-05-27 2024-04-09 Intrinsic Innovation Llc Multi-aperture polarization optical systems using beam splitters
US12020455B2 (en) 2021-03-10 2024-06-25 Intrinsic Innovation Llc Systems and methods for high dynamic range image reconstruction
US12069227B2 (en) 2021-03-10 2024-08-20 Intrinsic Innovation Llc Multi-modal and multi-spectral stereo camera arrays
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US12067746B2 (en) 2021-05-07 2024-08-20 Intrinsic Innovation Llc Systems and methods for using computer vision to pick up small objects
US12175741B2 (en) 2021-06-22 2024-12-24 Intrinsic Innovation Llc Systems and methods for a vision guided end effector
US12340538B2 (en) 2021-06-25 2025-06-24 Intrinsic Innovation Llc Systems and methods for generating and using visual datasets for training computer vision models
US12172310B2 (en) 2021-06-29 2024-12-24 Intrinsic Innovation Llc Systems and methods for picking objects using 3-D geometry and segmentation
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers
US12293535B2 (en) 2021-08-03 2025-05-06 Intrinsic Innovation Llc Systems and methods for training pose estimators in computer vision

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010128820A (ja) * 2008-11-27 2010-06-10 Fujifilm Corp Stereoscopic image processing device, method, and program, and stereoscopic imaging device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6473536B1 (en) * 1998-09-18 2002-10-29 Sanyo Electric Co., Ltd. Image synthesis method, image synthesizer, and recording medium on which image synthesis program is recorded
US6661927B1 (en) * 2000-07-27 2003-12-09 Motorola, Inc. System and method for efficiently encoding an image by prioritizing groups of spatially correlated coefficients based on an activity measure
AU2002343441A1 (en) * 2001-09-26 2003-04-07 Massachusetts Institute Of Technology Versatile cone-beam imaging apparatus and method
US6865246B2 (en) * 2001-09-26 2005-03-08 Massachusetts Institute Of Technology True 3D cone-beam imaging method and apparatus
JP4069855B2 (ja) * 2003-11-27 2008-04-02 Sony Corporation Image processing apparatus and method
JP4297501B2 (ja) * 2004-08-11 2009-07-15 Tokyo Institute of Technology Moving object periphery monitoring device
JP3937414B2 (ja) * 2004-08-11 2007-06-27 Honda Motor Co., Ltd. Plane detection device and detection method
JP4328692B2 (ja) * 2004-08-11 2009-09-09 Tokyo Institute of Technology Object detection device
US8593524B2 (en) * 2006-12-18 2013-11-26 Koninklijke Philips N.V. Calibrating a camera system
JP2008186145A (ja) 2007-01-29 2008-08-14 Mitsubishi Electric Corp Aerial image processing device and aerial image processing method
US8270770B1 (en) * 2008-08-15 2012-09-18 Adobe Systems Incorporated Region-based dense feature correspondence
JP5960595B2 (ja) * 2010-11-11 2016-08-02 Panasonic Intellectual Property Corporation of America Image processing device, image processing method, program, and imaging device
JP5800494B2 (ja) * 2010-11-19 2015-10-28 Canon Inc. Specific region selection device, specific region selection method, and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010128820A (ja) * 2008-11-27 2010-06-10 Fujifilm Corp Stereoscopic image processing device, method, and program, and stereoscopic imaging device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021135163A (ja) * 2020-02-27 2021-09-13 Saki Corporation Inspection apparatus
JP7437188B2 (ja) 2020-02-27 2024-02-22 Saki Corporation Inspection apparatus

Also Published As

Publication number Publication date
US20130222556A1 (en) 2013-08-29
JP5392415B2 (ja) 2014-01-22
JPWO2012039043A1 (ja) 2014-02-03
US9094672B2 (en) 2015-07-28

Similar Documents

Publication Publication Date Title
JP5392415B2 (ja) Stereo image generation device, stereo image generation method, and computer program for stereo image generation
JP6764533B2 (ja) Calibration device, calibration chart, chart pattern generation device, and calibration method
JP4095491B2 (ja) Distance measurement device, distance measurement method, and distance measurement program
JP5285619B2 (ja) Calibration of a camera system
CN109040738B (zh) Calibration method and non-transitory computer-readable medium
TWI555379B (zh) Panoramic fisheye camera image correction, synthesis and depth reconstruction method and system
US8928736B2 (en) Three-dimensional modeling apparatus, three-dimensional modeling method and computer-readable recording medium storing three-dimensional modeling program
JP2002324234A (ja) Method and device for rectifying distortion of stereoscopic images
CN102072706B (zh) Multi-camera positioning and tracking method and system
CN113592721B (zh) Photogrammetry method, apparatus, device and storage medium
KR20150120066A (ko) Distortion correction and alignment system using pattern projection, and method using the same
TW201715880A (zh) Panoramic fisheye camera image correction, synthesis and depth reconstruction method and system
JP5672112B2 (ja) Stereo image calibration method, stereo image calibration device, and computer program for stereo image calibration
JP2014192613A (ja) Image processing apparatus and method, and imaging apparatus
JP2011253376A (ja) Image processing device, image processing method, and program
CN106296825B (zh) Bionic three-dimensional information generation system and method
WO2013005265A1 (fr) Three-dimensional coordinate measuring device and three-dimensional coordinate measuring method
Liu et al. Epipolar rectification method for a stereovision system with telecentric cameras
KR20200142391A (ko) Method for estimating three-dimensional marker coordinates in an optical position tracking system
JP2022024688A (ja) Depth map generation device and program thereof, and depth map generation system
CN111998834B (zh) Crack monitoring method and system
CN117581260A (zh) Facial deformation compensation method for face depth images, imaging device, and storage medium
JP5925109B2 (ja) Image processing apparatus, control method thereof, and control program
CN118334116A (zh) LED lamp bead calibration method, apparatus, device and medium
CN111292380A (zh) Image processing method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10857537

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2012534860

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10857537

Country of ref document: EP

Kind code of ref document: A1