WO2013005244A1 - Device and method for measuring three-dimensional relative coordinates - Google Patents
Device and method for measuring three-dimensional relative coordinates
- Publication number
- WO2013005244A1 (PCT/JP2011/003774)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewpoint
- points
- point
- pixel
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Definitions
- the present invention relates to a three-dimensional relative coordinate measuring apparatus that measures a relative coordinate between any one of three reference points whose relative coordinates are known and one target point whose coordinates are unknown in a three-dimensional coordinate system.
- A stereo method is known as a method for measuring relative coordinates from an image.
- In the stereo method, two images are obtained by photographing with two cameras placed at different positions or by photographing from two different positions with one camera.
- Relative coordinate information is then obtained from subtle differences between the images due to parallax, as in human binocular vision.
- Patent Document 1 discloses a photo measurement system capable of performing highly accurate surveying by a stereo method.
- In Patent Document 1, however, it is assumed that the position and orientation of the camera at the photographing point are determined in advance by using a detection device such as a tilt angle sensor.
- The present invention has been made in view of such a situation, and its object is to provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without knowing in advance the position and orientation of the camera at the shooting point.
- In order to achieve this object, a three-dimensional relative coordinate measuring apparatus according to one aspect of the present invention measures relative coordinates between any one of three first reference points whose relative coordinates in a three-dimensional coordinate system are known and a single target point whose coordinates in the three-dimensional coordinate system are unknown. The apparatus includes: an image acquisition unit that acquires a first acquired image obtained by imaging the three first reference points from a first viewpoint with a first imaging device; a viewpoint projection angle extraction unit that holds information on a first pixel viewpoint projection angle for each pixel and, using this information, acquires three first reference viewpoint projection angles, which are the first pixel viewpoint projection angles corresponding to the three first reference points projected on the first acquired image, where the first pixel viewpoint projection angle is the angle formed between the optical axis of the first imaging device and the line segment connecting the first viewpoint with the point of the three-dimensional coordinate system projected onto the first pixel two-dimensional coordinates, which are the coordinates of each pixel in the first acquired image; an inclination angle calculation unit that calculates, using the three first reference viewpoint projection angles and the relative coordinates of the three first reference points, a first inclination angle formed between a first imaging surface, which is the imaging surface of the first acquired image, and a first reference plane containing the three first reference points; and a measurement unit that measures the relative coordinates between any one of the three first reference points and the target point using the first inclination angle.
- By using the viewpoint projection angle, it is possible to measure relative coordinates without knowing in advance the position and orientation of the camera at the shooting point.
- Preferably, the three first reference points are any three of three or more reference points whose relative coordinates in the three-dimensional coordinate system are known. The image acquisition unit acquires the first acquired image by imaging the three first reference points and the target point from the first viewpoint with the first imaging device, and also acquires a second acquired image by imaging, from a second viewpoint, three second reference points, which are any three of the three or more reference points, and the target point with a second imaging device that is the same as or different from the first imaging device.
- The viewpoint projection angle extraction unit further holds information on a second pixel viewpoint projection angle for each pixel and, using this information, acquires three second reference viewpoint projection angles, which are the second pixel viewpoint projection angles corresponding to the three second reference points projected on the second acquired image. The second pixel viewpoint projection angle is the angle formed between the optical axis of the second imaging device and the line segment connecting the second viewpoint with the point of the three-dimensional coordinate system projected onto the second pixel two-dimensional coordinates, which are the coordinates of each pixel in the second acquired image.
- The tilt angle calculation unit further calculates, using the three second reference viewpoint projection angles and the relative coordinates of the three second reference points, a second inclination angle formed between a second imaging surface, which is the imaging surface of the second acquired image, and a second reference plane containing the three second reference points.
- The measurement unit may then measure the relative coordinates between the target point and any one of the three first reference points and the three second reference points, using the relative positional relationship between the three first reference points and the three second reference points, the first inclination angle, and the second inclination angle.
- With this configuration, the relative coordinates between any one of the reference points and the target point can be measured without determining in advance either the relative positional relationship or the relative posture relationship of the camera at each shooting point.
- The tilt angle calculation unit may use the three first and second reference viewpoint projection angles and the relative coordinates of the three first and second reference points to determine the first reference plane containing the three first reference points such that each of them lies on the straight line connecting the first viewpoint with that reference point as projected on the first acquired image, and to determine the second reference plane containing the three second reference points such that each of them lies on the straight line connecting the second viewpoint with that reference point as projected on the second acquired image.
- The viewpoint projection angle extraction unit may further acquire a first target viewpoint projection angle, which is the first pixel viewpoint projection angle corresponding to the target point projected on the first acquired image, and a second target viewpoint projection angle, which is the second pixel viewpoint projection angle corresponding to the target point projected on the second acquired image. The measurement unit may then calculate, in the three-dimensional coordinate system, a first vector passing through the first viewpoint and the target point projected on the first acquired image using the first target viewpoint projection angle and the first inclination angle, calculate a second vector passing through the second viewpoint and the target point projected on the second acquired image using the second target viewpoint projection angle, the second inclination angle, and the relative positional relationship, and calculate the coordinates of the point of closest approach between the first vector and the second vector as the relative coordinates between the target point and any one of the three first reference points and the three second reference points.
- At least one of the three first reference points may be different from the three second reference points.
- Alternatively, the target point may be the first viewpoint.
- With this configuration, the relative coordinates between any one of the three reference points and the shooting point can be measured without previously knowing the posture of the camera at the shooting point.
- The tilt angle calculation unit may use the three first reference viewpoint projection angles and the relative coordinates of the three first reference points to determine the first reference plane containing the three first reference points such that each of them lies on the straight line connecting the first viewpoint with that reference point as projected on the first acquired image.
- the tilt angle calculation unit can calculate the posture of the camera at the shooting point.
- The measurement unit may calculate, using two of the three first reference viewpoint projection angles and the first inclination angle, two vectors that correspond to those two angles and pass through two of the three first reference points, and may calculate the coordinates of the point of closest approach of the two vectors as the relative coordinates between one of the three first reference points and the target point.
- The present invention can be realized not only as such a three-dimensional relative coordinate measuring apparatus, but also as a measurement method whose steps are the operations of the characteristic components included in such an apparatus, or as a program for causing a computer to execute these steps.
- As described above, the present invention can provide a three-dimensional relative coordinate measuring apparatus capable of measuring relative coordinates without knowing in advance the position and orientation of the camera at the photographing point.
- FIG. 1A is a diagram showing a configuration of a system including a three-dimensional relative coordinate measurement device at the time of three-dimensional relative coordinate measurement according to the present invention.
- FIG. 1B is a diagram showing a configuration of a system including a three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention.
- FIG. 2 is a block diagram showing a characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
- FIG. 3 is a flowchart showing a procedure for pixel calibration of an image according to the present invention.
- FIG. 4 is a diagram for explaining a viewpoint projection angle according to the present invention.
- FIG. 5A is a diagram showing an image when a calibration plate is photographed before distortion correction according to the present invention.
- FIG. 5B is a diagram showing an image when the calibration plate is photographed after distortion correction according to the present invention.
- FIG. 6 is a diagram for explaining processing for calculating the distance between the calibration board and the viewpoint according to the present invention.
- FIG. 7 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 1 of the present invention.
- FIG. 8 is a diagram three-dimensionally showing the relationship among the viewpoint, imaging surface, reference point, and target point according to Embodiment 1 of the present invention.
- FIG. 9A is a diagram showing a reference point and a target point projected on the first imaging surface according to Embodiment 1 of the present invention.
- FIG. 9B is a diagram showing a reference point and a target point projected on the second imaging surface according to Embodiment 1 of the present invention.
- FIG. 10 is a flowchart showing a processing procedure performed by the tilt angle calculation unit according to the present invention.
- FIG. 11 is a diagram for explaining the processing shown in the flowchart of FIG. 10 according to the present invention.
- FIG. 12 is a diagram when FIG. 11 is projected onto the Z 1 -X 1 plane according to the present invention.
- FIG. 13 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 2 of the present invention.
- FIG. 14 is a diagram three-dimensionally illustrating the relationship between the viewpoint, the imaging surface, and the reference point according to Embodiment 2 of the present invention.
- FIG. 15 is a diagram illustrating a reference point projected on the first imaging surface according to Embodiment 2 of the present invention.
- FIG. 1A is a diagram showing a configuration of a system including a three-dimensional relative coordinate measurement device at the time of three-dimensional relative coordinate measurement according to the present invention.
- FIG. 1A shows a system configuration when acquiring two images from two viewpoints.
- a three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via a cable 30.
- the cable 30 is, for example, a USB (Universal Serial Bus) cable.
- the three-dimensional relative coordinate measuring apparatus 90 can acquire the image data 60 captured by the camera 10 via the cable 30.
- the cable 30 may be a cable other than USB.
- the image data 60 may be acquired wirelessly or via a recording medium.
- M 1 and M 2 correspond to the positions of the principal points of the photographing lens of the camera 10.
- O1 and O2, drawn as two-dot chain lines, indicate the optical axes of the photographing lens (the optical axes of the camera) when the viewpoint is at M1 and M2, respectively.
- the camera 10 may capture an image from the first viewpoint so that the reference structure 40 is captured.
- the reference structure 40 has three reference points A, B, and C whose relative coordinates are known in a three-dimensional coordinate system. In other words, the triangle defined by the points A, B, and C is referred to as the reference structure 40.
- the target structure 50 has a target point W whose coordinates are unknown in a three-dimensional coordinate system. In addition, when acquiring one image from one viewpoint, the target structure 50 may not be used.
- FIG. 1B is a diagram showing a configuration of a system including a three-dimensional relative coordinate measuring apparatus at the time of pixel calibration of an image according to the present invention.
- a three-dimensional relative coordinate measuring apparatus 90 according to the present invention is connected to the camera 10 via a cable 30.
- the cable 30 is, for example, a USB cable.
- the three-dimensional relative coordinate measuring apparatus 90 can acquire the image data 61 captured by the camera 10 via the cable 30.
- the cable 30 may be a cable other than USB.
- the image data 61 may be acquired wirelessly or via a recording medium.
- the camera 10 captures an image at the viewpoint M so that the calibration plate 70 can be seen.
- M corresponds to the position of the principal point of the taking lens of the camera 10.
- O, drawn as a two-dot chain line, indicates the optical axis of the photographing lens (the optical axis of the camera) when the viewpoint is at M.
- the calibration plate 70 is a transparent plate having high rigidity and has a grid pattern on the surface.
- a weight 72 is attached to the calibration plate 70 with a thread 71.
- the three-dimensional relative coordinate measuring device 90 is, for example, a computer.
- a common computer is used for three-dimensional relative coordinate measurement and image pixel calibration, but separate computers may be used. Two or more computers may be used.
- FIG. 2 is a block diagram showing a characteristic functional configuration of the three-dimensional relative coordinate measuring apparatus according to the present invention.
- The three-dimensional relative coordinate measuring apparatus 90 includes a three-dimensional relative coordinate measurement unit 20 and an image pixel calibration unit 80 according to the present invention.
- the three-dimensional relative coordinate measurement unit 20 includes an image acquisition unit 100, a two-dimensional coordinate extraction unit 110, a viewpoint projection angle extraction unit 120, a calculation unit 130, and a display unit 140. Further, the image pixel calibration unit 80 includes an adjustment unit 200.
- The adjustment unit 200 performs distortion correction of the image pixels using the image data 61 obtained by imaging the calibration plate 70, the thread 71, and the weight 72 from the viewpoint M, and calculates the viewpoint projection angle of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention).
- The image acquisition unit 100 acquires, as the first acquired image, the image data 60 obtained by imaging the three reference points, or the three reference points and the target point, from the first viewpoint M1 with the camera 10 (corresponding to the first and second imaging devices of the present invention). The image acquisition unit 100 also acquires, as the second acquired image, the image data 60 obtained by imaging the three reference points and the target point from the second viewpoint M2 with the camera 10.
- The two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points, or of the three reference points and the target point, projected on the first acquired image acquired by the image acquisition unit 100. The two-dimensional coordinate extraction unit 110 likewise extracts the two-dimensional coordinates of the three reference points and the target point projected on the second acquired image acquired by the image acquisition unit 100.
- The viewpoint projection angle extraction unit 120 holds in advance the viewpoint projection angles of arbitrary image pixels calculated by the adjustment unit 200, and extracts, from the held viewpoint projection angles, the viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. The definition of the viewpoint projection angle will be described later with reference to FIG. 4.
- the calculation unit 130 further includes an inclination angle calculation unit 130a, a vector calculation unit 130b, a conversion vector calculation unit 130c, and a relative coordinate measurement unit 130d.
- the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle using the relative coordinates of the three reference points and the viewpoint projection angle extracted by the viewpoint projection angle extraction unit 120, respectively.
- the vector calculation unit 130b calculates the first vector and the vector 2 using the viewpoint projection angle extracted by the viewpoint projection angle extraction unit 120, respectively. Further, the vector calculation unit 130b calculates two vectors using the viewpoint projection angle extracted by the viewpoint projection angle extraction unit 120 and the first tilt angle calculated by the tilt angle calculation unit 130a.
- The conversion vector calculation unit 130c uses the first inclination angle and the second inclination angle calculated by the inclination angle calculation unit 130a and the vector 2 calculated by the vector calculation unit 130b to calculate the second vector.
- The relative coordinate measurement unit 130d measures the relative coordinates between any one of the three reference points and the target point using the first vector calculated by the vector calculation unit 130b and the second vector calculated by the conversion vector calculation unit 130c. In addition, the relative coordinate measurement unit 130d measures the relative coordinates between any one of the three reference points and the shooting point using the two vectors calculated by the vector calculation unit 130b.
- The display unit 140 displays, for example, the acquired images acquired by the image acquisition unit 100, the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120, and the various calculation results produced by the calculation unit 130.
- FIG. 3 is a flowchart showing a procedure for pixel calibration of an image according to the present invention.
- In the pixel calibration of an image, the optical center position Cp, which is the center point of the imaging plane, is set (S101), distortion correction is performed (S102), and the viewpoint projection angle θ of an arbitrary image pixel (corresponding to the first and second pixel viewpoint projection angles of the present invention) is calculated (S103).
- As shown in FIG. 4, the viewpoint projection angle θ of an arbitrary image pixel P is the angle formed between the camera optical axis O (corresponding to the optical axis of the first and second imaging devices of the present invention) and the line segment connecting the viewpoint M with the point of the three-dimensional coordinate system projected onto the two-dimensional coordinates of the pixel P on the image (corresponding to the first and second pixel two-dimensional coordinates of the present invention).
- Since the optical axis O and the imaging surface of the camera are orthogonal, the viewpoint projection angle θ of an arbitrary image pixel P can be calculated from the distance x between the viewpoint M and the optical center position Cp, which is the intersection of the optical axis O with the imaging surface of the camera, and from the two-dimensional coordinates of the pixel P with the optical center position Cp as the origin.
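- As a minimal illustration of this relation (not part of the original disclosure), the sketch below computes the viewpoint projection angle of a pixel from its two-dimensional coordinates with Cp as the origin and the distance x, assuming the coordinates and x are expressed in the same unit (physical length or pixel pitch); it also returns the Z-X-plane and Z-Y-plane components used in the later description.

```python
import math

def viewpoint_projection_angle(px, py, x):
    """Viewpoint projection angle of an image pixel P = (px, py).

    px, py: coordinates of P with the optical center Cp as the origin,
            in the same unit as x.
    x:      distance between the viewpoint M and the optical center Cp.
    Returns (theta, theta_zx, theta_zy), where theta is the angle between
    the ray M-P and the optical axis O, and theta_zx / theta_zy are its
    components in the Z-X and Z-Y planes.
    """
    d6 = math.hypot(px, py)        # distance Cp-P on the imaging surface
    theta = math.atan2(d6, x)      # full viewpoint projection angle
    theta_zx = math.atan2(px, x)   # Z-X plane component
    theta_zy = math.atan2(py, x)   # Z-Y plane component
    return theta, theta_zx, theta_zy
```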
- the optical center position Cp is set (S101).
- a weight 72 is attached by a thread 71 to a transparent plate having high rigidity (hereinafter referred to as a calibration plate 70).
- the calibration plate 70 is assumed to be kept horizontal by a level or the like. Further, it is assumed that grids are marked on the surface of the calibration plate 70.
- The calibration plate 70 is imaged by the camera 10 from above, with the camera positioned so that, on the imaging surface, the attachment position of the thread 71 on the calibration plate 70 and the attachment position of the weight 72 on the thread 71 overlap (that is, so that the plumb line formed by the thread 71 is aligned with the optical axis).
- The position in the image of the mounting position of the thread 71 on the calibration plate 70 is then set as the optical center position Cp.
- Next, the adjustment unit 200 performs image distortion correction (S102). Before distortion correction, the image obtained by photographing the calibration plate 70 shows a distorted grid, as in FIG. 5A, due to the characteristics of the photographing lens of the camera 10. The adjustment unit 200 converts the captured image into an image such as that in FIG. 5B by performing normalization processing on it.
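- The normalization processing itself is not detailed in this excerpt; one common way to realize a grid-based distortion correction is the standard OpenCV calibration workflow sketched below. This substitutes OpenCV's checkerboard calibration for the grid normalization described here, so it should be read as an illustrative stand-in rather than the method of the present invention; the grid size, spacing, and file name are assumptions.

```python
import cv2
import numpy as np

# Object points of the calibration grid (all on the Z = 0 plane); a 9 x 6
# grid with 25 mm spacing is assumed here for illustration only.
cols, rows, spacing = 9, 6, 25.0
obj = np.zeros((rows * cols, 3), np.float32)
obj[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * spacing

img = cv2.imread("calibration_plate.png")           # hypothetical file name
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, (cols, rows))
if found:
    # A single view gives a rough distortion model; several views of the
    # plate improve the estimate.
    ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
        [obj], [corners], gray.shape[::-1], None, None)
    undistorted = cv2.undistort(img, mtx, dist)      # grid as in FIG. 5B
```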
- Next, the adjustment unit 200 calculates the viewpoint projection angle θ of an arbitrary image pixel P (S103).
- FIG. 6 is a diagram for explaining processing for calculating the distance between the calibration board and the viewpoint according to the present invention.
- the adjustment unit 200 acquires an image of a plate before movement, which is an image after distortion correction obtained by photographing the calibration plate 70 as shown in FIG. 5B.
- The adjustment unit 200 then acquires an image of the plate after movement, which is the distortion-corrected image obtained by photographing the calibration plate 70 after it has been moved closer to the viewpoint M by a distance y while being kept horizontal.
- Using the image of the plate before movement and the image of the plate after movement, the adjustment unit 200 extracts, for example, the two-dimensional coordinates of the points P1′ and P2′ with the optical center position Cp as the origin.
- Here, P1 is a point on the calibration plate 70 before the movement, and P1′ is the intersection of the imaging surface with the straight line connecting the viewpoint M and P1. P2 is the point on the calibration plate 70 after the movement that corresponds to P1, and P2′ is the intersection of the imaging surface with the straight line connecting the viewpoint M and P2. Q is the intersection of the surface of the calibration plate 70 before the movement with the straight line passing through the point P2 and the point P2′.
- Here, d2 denotes the length of the line segment C1P1, d3 the length of the line segment P1′P2′, d4 the length of the line segment CpP1′, and d5 the length of the line segment C1Q, where C1 is the point of the calibration plate 70 before the movement that lies on the optical axis O.
- The distance x between the viewpoint M and the optical center position Cp can then be expressed by Equation 5.
- Using x, the viewpoint projection angle θ of an arbitrary image pixel P can be calculated by Equation 6, where d6 is the length of the line segment CpP. In this way, the viewpoint projection angle θ of an arbitrary image pixel P is obtained.
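- Equations 5 and 6 are not reproduced in this excerpt. The sketch below reconstructs them from the similar-triangle geometry of FIG. 6 under the assumptions that the calibration plate is perpendicular to the optical axis and that P2 lies directly above P1, so it should be read as a plausible reconstruction rather than the patent's exact formulas. Note that if d3, d4, and d6 are measured in pixel units and d2 and y in millimetres, x comes out in pixel units, which is sufficient for the angle calculation.

```python
import math

def viewpoint_distance(d2, d3, d4, y):
    """Distance x between the viewpoint M and the optical center Cp
    (assumed form of Equation 5).

    d2: distance C1-P1 on the calibration plate
    d3: distance P1'-P2' on the imaging surface
    d4: distance Cp-P1' on the imaging surface
    y:  amount by which the plate was moved toward the viewpoint M

    Before the move:  d4 / x = d2 / D          (D: plate-to-viewpoint distance)
    After the move:   (d4 + d3) / x = d2 / (D - y)
    Eliminating D gives x = y * d4 * (d4 + d3) / (d2 * d3).
    """
    return y * d4 * (d4 + d3) / (d2 * d3)

def projection_angle(d6, x):
    """Viewpoint projection angle of a pixel P at distance d6 from Cp
    (assumed form of Equation 6): theta = arctan(d6 / x)."""
    return math.atan2(d6, x)
```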
- In this manner, the optical center position Cp is set, distortion correction is performed, and the viewpoint projection angle θ of an arbitrary image pixel is calculated, whereby the pixel calibration of the image is completed.
- the image pixel calibration method has been described above based on the embodiment, but the image pixel calibration method is not limited to this embodiment.
- In the above description, a grid is drawn on the calibration plate 70, but the calibration plate may instead be provided with marks at known intervals.
- Alternatively, a light beam obtained by passing light from a lamp source through a cylindrical structure machined perpendicular to the calibration plate, or a laser beam, may be used.
- FIG. 7 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 1 of the present invention.
- the image acquisition unit 100 acquires a first acquired image and a second acquired image (S110).
- The first acquired image and the second acquired image are images in which the three reference points A, B, and C, whose relative coordinates are known, and the target point W, whose coordinates are unknown, are captured from the first viewpoint M1 and the second viewpoint M2, respectively. Here, the case where the three reference points included in the first acquired image and in the second acquired image are the same is described as an example; however, the three reference points included in the second acquired image may be different.
- Next, the two-dimensional coordinate extraction unit 110 extracts from the first acquired image, with the optical center position Cp1 in the first acquired image as the origin, the coordinates of the points A01, B01, and C01, which are the reference points A, B, and C projected on the first acquired image (three first reference two-dimensional coordinates), and the coordinates of the point W01, which is the target point W projected on the first acquired image (first target two-dimensional coordinates) (S120).
- Similarly, the two-dimensional coordinate extraction unit 110 extracts from the second acquired image, with the optical center position Cp2 in the second acquired image as the origin, the coordinates of the points A02, B02, and C02, which are the reference points A, B, and C projected on the second acquired image (three second reference two-dimensional coordinates), and the coordinates of the point W02, which is the target point W projected on the second acquired image (second target two-dimensional coordinates) (S120).
- Next, the viewpoint projection angle extraction unit 120, which holds in advance the viewpoint projection angles of arbitrary image pixels calculated by the adjustment unit 200, extracts from the held angles those corresponding to the three first and second reference two-dimensional coordinates and to the first and second target two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, as the three first and second reference viewpoint projection angles and the first and second target viewpoint projection angles (S130).
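- A minimal sketch of this per-pixel lookup (S130): the adjustment unit is assumed to have produced a two-dimensional array of viewpoint projection angles indexed by pixel position; the class and variable names are illustrative only.

```python
import numpy as np

class ViewpointProjectionAngleTable:
    """Holds the viewpoint projection angle of every image pixel (S103) and
    returns the angles corresponding to given two-dimensional coordinates."""

    def __init__(self, angle_map, optical_center):
        self.angle_map = np.asarray(angle_map)  # shape (height, width), radians
        self.cx, self.cy = optical_center       # optical center Cp in pixel indices

    def angle_at(self, u, v):
        """u, v: two-dimensional coordinates with the optical center Cp as origin."""
        row = int(round(self.cy + v))
        col = int(round(self.cx + u))
        return self.angle_map[row, col]

# Example: reference viewpoint projection angles for A01, B01, C01.
# table = ViewpointProjectionAngleTable(angle_map, (cx, cy))
# ref_angles = [table.angle_at(u, v) for (u, v) in (a01, b01, c01)]
```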
- Next, the inclination angle calculation unit 130a calculates the angles (first and second inclination angles) formed between the first and second imaging surfaces, which are the imaging surfaces of the first and second acquired images, and the reference planes (first and second reference planes) containing the reference points (S140).
- FIG. 10 is a flowchart showing a processing procedure performed by the inclination angle calculation unit according to the present invention.
- FIG. 11 is a diagram for explaining the processing shown in the flowchart of FIG. 10 according to the present invention.
- the process of calculating the first inclination angle will be described mainly based on FIGS. 10 and 11.
- First, the tilt angle calculation unit 130a sets an X1-Y1-Z1 coordinate system (first three-dimensional coordinate system) whose origin is one of the three reference points A, B, and C whose relative coordinates are known, for example the point A (S141). At this time, the tilt angle calculation unit 130a sets the X1 axis and the Y1 axis so as to be parallel to the first imaging plane captured from the first viewpoint M1, and sets the Z1 axis so as to be perpendicular to that imaging plane. The X1-Y1 plane at this time is also referred to as the first conversion plane.
- Next, the tilt angle calculation unit 130a sets first virtual points B1 and C1 whose coordinates are equivalent to the known relative coordinates of the three reference points A, B, and C (S142). The point B1 corresponds to the point B, and the point C1 corresponds to the point C. For example, the coordinates of the point B1 are set as (B1x, 0, 0) and the coordinates of the point C1 are set as (0, C1y, 0).
- Next, the tilt angle calculation unit 130a rotates the points B1 and C1 finely and separately by an angle α1n around the X1 axis, an angle β1n around the Y1 axis, and an angle γ1n around the Z1 axis, and thereby calculates a plurality of first conversion reference points B1n′ (B1nx′, B1ny′, B1nz′) and C1n′ (C1nx′, C1ny′, C1nz′), where n = 1, 2, 3, ... (S143).
- The tilt angle calculation unit 130a then uses the viewpoint projection angles extracted by the viewpoint projection angle extraction unit 120 to project the points B1n′ and C1n′ onto the X1-Y1 plane, and thereby calculates the first conversion projection reference points B1n″ (B1nx″, B1ny″, 0) and C1n″ (C1nx″, C1ny″, 0), where n = 1, 2, 3, ... (S144).
- Here, the viewpoint projection angle of the point B01 in the Z1-X1 plane is denoted ε1x, that of the point B01 in the Z1-Y1 plane ε1y, that of the point C01 in the Z1-X1 plane φ1x, and that of the point C01 in the Z1-Y1 plane φ1y.
- Since the triangle A01B01C01 connecting the points A01, B01, and C01, which are the reference points A, B, and C on the first imaging plane, is similar to the triangle AB1n″C1n″ connecting the point A, the point B1n″, and the point C1n″, and since ∠B1n′M1Cp1 and ∠B1n″M1Cp1 are both ε1x as shown in FIG. 12, B1nx″ can be calculated using Equation 7. Similarly, B1ny″, C1nx″, and C1ny″ can be calculated using Equations 8 to 10.
- The tilt angle calculation unit 130a then searches for the first conversion projection reference points B1″ and C1″ for which the triangle AB1n″C1n″ connecting the points A, B1n″, and C1n″ is most similar to the triangle A01B01C01 connecting the points A01, B01, and C01, which are the reference points A, B, and C on the first imaging surface, and thereby calculates the first inclination angle (α1, β1, γ1) (S145).
- As one method of comparing similarity, the line segment A01B01 is scaled to the same length as the line segment AB1n″, the points B01 and C01 are moved so that the point A01 coincides with the point A, and the first inclination angle (α1, β1, γ1) is calculated as the rotation for which the sum of the lengths of the line segments B01B1n″ and C01C1n″ is minimized.
- the similarity comparison calculation method is not limited to the above method.
- For example, the line segment A01B01 may be scaled to the same length as the line segment AB1n″ and moved so that it coincides with the line segment AB1n″, and the first inclination angle (α1, β1, γ1) may be calculated so that the deviation between the point C01 and the point C1n″ is minimized.
- In the same manner, the tilt angle calculation unit 130a can calculate the second inclination angle (α2, β2, γ2), which is the angle formed between the second imaging surface and the plane containing the reference points.
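- A minimal sketch of the orientation search of S143 to S145, under simplifying assumptions: the virtual points B1 and C1 are rotated over a coarse grid of angles (α, β, γ), projected orthographically onto the X1-Y1 plane (standing in for the perspective projection of Equations 7 to 10, which are not reproduced here), and compared against the observed image triangle with the line-segment-sum criterion described above. All names are illustrative.

```python
import itertools
import math
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation by alpha about X1, beta about Y1, gamma about Z1 (radians)."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def find_tilt_angle(b1, c1, a01, b01, c01, step_deg=5.0):
    """Search the rotation (alpha1, beta1, gamma1) whose projected triangle
    A-B1n''-C1n'' best matches the image triangle A01-B01-C01.

    b1, c1:        known coordinates of the virtual points B1, C1 (A is the origin)
    a01, b01, c01: two-dimensional image coordinates of the projected reference points
    """
    b1, c1 = np.asarray(b1, float), np.asarray(c1, float)
    a01, b01, c01 = (np.asarray(p, float) for p in (a01, b01, c01))
    best_angles, best_cost = None, math.inf
    angles = np.deg2rad(np.arange(-45.0, 45.0 + step_deg, step_deg))
    for alpha, beta, gamma in itertools.product(angles, repeat=3):
        r = rotation_matrix(alpha, beta, gamma)
        b_proj = (r @ b1)[:2]          # orthographic projection onto the X1-Y1 plane
        c_proj = (r @ c1)[:2]
        # Scale the image triangle so that |A01 B01| equals |A B1n''| and
        # move it so that A01 coincides with A (the origin).
        scale = np.linalg.norm(b_proj) / np.linalg.norm(b01 - a01)
        b_img = (b01 - a01) * scale
        c_img = (c01 - a01) * scale
        cost = np.linalg.norm(b_img - b_proj) + np.linalg.norm(c_img - c_proj)
        if cost < best_cost:
            best_angles, best_cost = (alpha, beta, gamma), cost
    return best_angles
```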
- the calculation of the tilt angle has been described based on the embodiment.
- the calculation of the tilt angle is not limited to this embodiment.
- In the above description, the tilt angle calculation unit 130a sets the X1-Y1-Z1 coordinate system with one of the three reference points A, B, and C, whose coordinates are known, as the origin, but the origin may also be selected from points other than these three reference points.
- For example, the point A may be fixed at a certain position on the viewpoint projection angle line of the point A, and the point B may be slid at a certain pitch, within a certain range, along the viewpoint projection angle line of the point B. Because the triangle ABC has a known shape based on the line segment AB, the point C then lies on a circle whose axis is the line segment AB. For each position of the point B, the closest approach of the point C to the viewpoint projection angle line of the point C is calculated and the closest condition is stored. The same calculation may further be performed with the point A placed at positions within a certain range on its viewpoint projection angle line, and the inclination angle obtained at the overall closest approach may be used.
- Next, the vector calculation unit 130b calculates the first vector and the vector 2 (S150). The first target two-dimensional coordinates of the point W01, which is the target point W on the first imaging surface, are converted by the same similarity transformation onto the first conversion plane in which the triangle AB1″C1″ lies, and the converted point is defined as W1 (W1x, W1y, 0). The first vector is then calculated using the point W1 and the viewpoint projection angles of the point W01 in the Z1-X1 plane and in the Z1-Y1 plane extracted by the viewpoint projection angle extraction unit 120. In the same manner, the vector calculation unit 130b calculates the vector 2 from the second acquired image.
- Next, the conversion vector calculation unit 130c selects a line segment connecting any two of the points A, B1″, and C1″, for example the line segment AB1″, and extracts the corresponding line segment among the points A, B2″, and C2″, in this example the line segment AB2″. Then, using the first inclination angle (α1, β1, γ1) and the second inclination angle (α2, β2, γ2) obtained by the inclination angle calculation unit 130a, the conversion vector calculation unit 130c calculates the second vector by moving the vector 2 so that the line segment AB2″ overlaps the line segment AB1″ (S160).
- Finally, the relative coordinate measurement unit 130d calculates the coordinates of the point of closest approach between the first vector and the second vector, and thereby calculates the relative coordinates between the point A, which is one of the reference points and the origin, and the target point W (S170).
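- A minimal sketch of the closest-approach calculation of S170, assuming that each vector has already been expressed in the common coordinate system as a point plus a direction; the midpoint of the shortest segment between the two lines is returned as the estimate of the target point W. Names are illustrative.

```python
import numpy as np

def closest_point_of_two_lines(p1, d1, p2, d2):
    """Point of closest approach between line 1 (through p1 with direction d1)
    and line 2 (through p2 with direction d2), all given as 3-D numpy arrays.
    Returns the midpoint of the shortest connecting segment."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2    # a = c = 1 after normalisation
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b                  # zero only if the lines are parallel
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1 = p1 + t1 * d1                      # closest point on the first vector
    q2 = p2 + t2 * d2                      # closest point on the second vector
    return (q1 + q2) / 2.0

# Example (illustrative): the first vector through the first viewpoint M1 and
# the target point projected on the first acquired image, the second vector
# after the transformation of S160.
# w = closest_point_of_two_lines(m1, w1 - m1, m2_t, w2_t - m2_t)
```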
- vector calculation unit 130b corresponds to the measurement unit of the present invention.
- Next, the operation in which the three-dimensional relative coordinate measurement unit 20 measures the relative coordinates between any one of the three reference points and the photographing point (corresponding to the first viewpoint of the present invention) will be described.
- FIG. 13 is a flowchart showing a processing procedure of relative coordinate measurement by the three-dimensional relative coordinate measurement apparatus according to Embodiment 2 of the present invention.
- the image acquisition unit 100 acquires a first acquired image as shown in FIG. 14 (S210).
- The first acquired image is the image data 60 obtained by imaging, from the first viewpoint M1, the three reference points A, B, and C, whose coordinates are known, so that they appear together, with the camera 10 (corresponding to the first imaging device of the present invention).
- Next, the two-dimensional coordinate extraction unit 110 extracts from the first acquired image, with the optical center position Cp1 in the first acquired image as the origin, the coordinates of the points A01, B01, and C01, which are the reference points A, B, and C projected on the first acquired image (three first reference two-dimensional coordinates) (S220).
- Next, the viewpoint projection angle extraction unit 120, which holds in advance the viewpoint projection angles of arbitrary image pixels calculated by the adjustment unit 200, extracts from the held angles those corresponding to the three first reference two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110, as the three first reference viewpoint projection angles (S230).
- Next, the tilt angle calculation unit 130a calculates the angle (first tilt angle) formed by the first imaging plane, which is the imaging plane of the first acquired image, and the first reference plane containing the reference points (S240).
- the vector calculation unit 130b calculates two vectors (S250).
- The two vectors are, for example, the straight line B1′B1″ connecting B1″ determined in (S145) and B1′, which corresponds to the point B1n′ at that time, and the straight line C1′C1″ connecting C1″ determined in (S145) and C1′, which corresponds to the point C1n′ at that time.
- Finally, the relative coordinate measurement unit 130d calculates the coordinates of the point of closest approach of the two vectors, and thereby calculates the relative coordinates between the point A, which is one of the reference points, and the shooting point (S260).
- Alternatively, as described above, the point A may be placed at a certain position on the viewpoint projection angle line of the point A and the point B slid along the viewpoint projection angle line of the point B; since the point C then lies on a circle around the line segment AB, the condition under which the point C comes closest to its viewpoint projection angle line is stored, and the inclination angle obtained when, with the point A also moved within a certain range, the point C comes closest to its viewpoint projection angle line may be used.
- Alternatively, in the triangle ABM1 connecting the point A, the point B, and the point M1, the length of the line segment AB and the angles ∠M1AB and ∠M1BA are known, so the relative coordinates between any one of the reference points and the shooting point can be calculated using the triangle ABM1. Further, since ∠AM1B is known from the viewpoint projection angles under which the line segment AB is seen from the viewpoint M1, the relative coordinates between any one of the reference points and the shooting point can likewise be calculated using the triangle ABM1.
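- A minimal sketch of this triangle-based alternative, assuming the point A is the origin, the length of the line segment AB and the angles ∠M1AB and ∠M1BA are already known, and M1 is expressed in the plane of the triangle with B on the positive x axis (an illustrative convention, not part of the original disclosure); the law of sines then fixes M1.

```python
import math

def shooting_point_from_triangle(ab_length, angle_a, angle_b):
    """Shooting point M1 in the plane of the triangle ABM1.

    ab_length: known length of the line segment AB
    angle_a:   angle M1-A-B at the point A (radians)
    angle_b:   angle M1-B-A at the point B (radians)
    Returns (|A M1|, two-dimensional coordinates of M1), with A at the origin
    and B on the positive x axis of the triangle plane.
    """
    angle_m = math.pi - angle_a - angle_b            # apex angle A-M1-B
    # Law of sines: |A M1| / sin(angle_b) = |A B| / sin(angle_m)
    am1 = ab_length * math.sin(angle_b) / math.sin(angle_m)
    m1 = (am1 * math.cos(angle_a), am1 * math.sin(angle_a))
    return am1, m1
```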
- In the above embodiments, the viewpoint projection angle extraction unit 120 holds in advance the viewpoint projection angles of arbitrary image pixels calculated by the adjustment unit 200, and extracts from them the reference and target viewpoint projection angles corresponding to the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110. Alternatively, the viewpoint projection angle extraction unit 120 may hold Equation 6 and calculate the reference and target viewpoint projection angles from the two-dimensional coordinates extracted by the two-dimensional coordinate extraction unit 110 and the distortion correction equation, using the held Equation 6.
- In that case as well, it is necessary for the adjustment unit 200 to obtain, by Equation 5, the distance x between the viewpoint M and the optical center position Cp.
- In the above embodiments, the two-dimensional coordinate extraction unit 110 extracts the two-dimensional coordinates of the three reference points and the target point in the first and second acquired images; however, the image acquisition unit 100 may extract these two-dimensional coordinates instead. In this case, the two-dimensional coordinate extraction unit 110 is not necessary.
- In the above embodiments, the tilt angle calculation unit 130a calculates the first tilt angle and the second tilt angle by setting two different three-dimensional coordinate systems, but the first inclination angle and the second inclination angle may instead be calculated by setting only one three-dimensional coordinate system.
- In that case, the vector calculation unit 130b calculates, in the single three-dimensional coordinate system, the first vector passing through the point M1 and the point W01 using the first target viewpoint projection angle and the first inclination angle, and the second vector passing through the point M2 and the point W02 using the second target viewpoint projection angle and the second inclination angle. The conversion vector calculation unit 130c is therefore not necessary in this case; alternatively, its calculation may be performed by the vector calculation unit 130b instead of the conversion vector calculation unit 130c.
- In the above embodiments, shooting is performed from two viewpoints, but shooting may be performed from three or more viewpoints, or the number of reference points whose relative coordinates are known may be increased to four or more.
- In the above embodiments, the inclination angles are angles with respect to different reference planes (the first and second reference planes).
- the relative coordinate measurement unit 130d calculates the relative coordinates of the target point using the relative positional relationship between the three first reference points and the three second reference points. Specifically, the relative coordinate measuring unit 130d calculates the closest point of two vectors using the relative positional relationship. Further, this relative positional relationship may be calculated when calculating the tilt angle. In other words, the calculated tilt angle may be adjusted to one of the reference planes. Alternatively, the calculated tilt angle may be calculated as an angle unified in the three-dimensional coordinate system.
- The first tilt angle and the second tilt angle may also be calculated by performing a nonlinear approximation calculation using a large number of reference points whose relative coordinates are known.
- the correspondence can be identified by using a special marker or line.
- a reference point for verification whose relative coordinates are known may be used in order to prevent reversal (turn over) due to rotation of the reference point and calculation of an incorrect solution (tilt angle).
- Alternatively, restrictions, for example rotation conditions, may be added so as not to calculate an incorrect solution.
- a part or all of the functions of the three-dimensional relative coordinate measuring apparatus according to the embodiment of the present invention may be realized by a processor such as a CPU executing a program.
- connection relationship between the constituent elements is exemplified for specifically explaining the present invention, and the connection relationship for realizing the function of the present invention is not limited to this.
- The division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, a single functional block may be divided into a plurality of blocks, or some functions may be transferred to other functional blocks. Further, the functions of a plurality of functional blocks having similar functions may be processed by the three-dimensional measuring apparatus in parallel or in time division.
- the present invention may be the above program or a non-transitory computer-readable recording medium on which the above program is recorded.
- the program can be distributed via a transmission medium such as the Internet.
- the configuration of the three-dimensional measurement apparatus is for illustrating the present invention in detail, and the three-dimensional measurement apparatus according to the present invention does not necessarily have all of the above-described configurations. In other words, the three-dimensional measuring apparatus according to the present invention only needs to have a minimum configuration that can realize the effects of the present invention.
- Similarly, the three-dimensional measurement method using the three-dimensional measurement apparatus is illustrated in order to specifically describe the present invention, and the three-dimensional measurement method according to the present invention does not need to include all of the above steps. In other words, the three-dimensional measurement method according to the present invention needs to include only the minimum steps that can realize the effects of the present invention.
- The present invention can be used for three-dimensional relative coordinate measurement, for example when the distance between two points is measured automatically in an environment where relative coordinates cannot be measured mechanically or electrically.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a three-dimensional relative coordinate measuring device (90) comprising: an image acquisition unit (100) that acquires a first acquired image obtained by capturing three reference points from a first viewpoint with an imaging device; a viewpoint projection angle extraction unit (120) that holds information on a pixel viewpoint projection angle for each pixel and uses this information to acquire three first reference viewpoint projection angles, which are the pixel viewpoint projection angles corresponding to the three reference points projected on the first acquired image; a tilt angle calculation unit (130a) that uses the three first reference viewpoint projection angles and the relative coordinates of the three reference points to calculate a first tilt angle formed between a first imaging plane, which is the imaging plane of the first acquired image, and a reference plane containing the three reference points; and a relative coordinate measurement unit (130d) that uses the first tilt angle to measure the relative coordinates between any one of the three reference points and a target point.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011541976A JP5070435B1 (ja) | 2011-07-01 | 2011-07-01 | 3次元相対座標計測装置およびその方法 |
| PCT/JP2011/003774 WO2013005244A1 (fr) | 2011-07-01 | 2011-07-01 | Dispositif et procédé de mesure de coordonnées relatives tridimensionnelles |
| JP2013522375A JP5629874B2 (ja) | 2011-07-01 | 2011-10-07 | 三次元座標計測装置及び三次元座標計測方法 |
| PCT/JP2011/005654 WO2013005265A1 (fr) | 2011-07-01 | 2011-10-07 | Dispositif de mesure de coordonnées tridimensionnelles et procédé de mesure de coordonnées tridimensionnelles |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2011/003774 WO2013005244A1 (fr) | 2011-07-01 | 2011-07-01 | Dispositif et procédé de mesure de coordonnées relatives tridimensionnelles |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013005244A1 true WO2013005244A1 (fr) | 2013-01-10 |
Family
ID=47277831
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/003774 Ceased WO2013005244A1 (fr) | 2011-07-01 | 2011-07-01 | Dispositif et procédé de mesure de coordonnées relatives tridimensionnelles |
| PCT/JP2011/005654 Ceased WO2013005265A1 (fr) | 2011-07-01 | 2011-10-07 | Dispositif de mesure de coordonnées tridimensionnelles et procédé de mesure de coordonnées tridimensionnelles |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/005654 Ceased WO2013005265A1 (fr) | 2011-07-01 | 2011-10-07 | Dispositif de mesure de coordonnées tridimensionnelles et procédé de mesure de coordonnées tridimensionnelles |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP5070435B1 (fr) |
| WO (2) | WO2013005244A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104634246A (zh) * | 2015-02-03 | 2015-05-20 | 李安澜 | 目标空间坐标的浮动式立体视觉测量系统及测量方法 |
| CN108195345A (zh) * | 2017-12-20 | 2018-06-22 | 合肥英睿系统技术有限公司 | 一种基于电子成像器的测距方法及系统 |
| CN110108203A (zh) * | 2019-04-11 | 2019-08-09 | 东莞中子科学中心 | 一种基于摄影测量技术的丝线位置测量方法及系统 |
| CN112325767A (zh) * | 2020-10-16 | 2021-02-05 | 华中科技大学鄂州工业技术研究院 | 一种融合机器视觉和飞行时间测量的空间平面尺寸测量方法 |
| CN112991742A (zh) * | 2021-04-21 | 2021-06-18 | 四川见山科技有限责任公司 | 一种实时交通数据的可视化仿真方法及系统 |
| CN113884081A (zh) * | 2016-11-01 | 2022-01-04 | 北京墨土科技有限公司 | 测定定位点三维坐标的方法及设备 |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103424104B (zh) * | 2013-09-04 | 2015-11-18 | 中测新图(北京)遥感技术有限责任公司 | 一种近景大幅面数字摄影测量系统及方法 |
| CN104748680B (zh) * | 2015-03-19 | 2018-09-14 | 酷派软件技术(深圳)有限公司 | 一种基于摄像头的尺寸测量方法及装置 |
| CN106441243A (zh) * | 2016-09-22 | 2017-02-22 | 云南电网有限责任公司电力科学研究院 | 一种地物净空距离的测量方法及装置 |
| JP6950273B2 (ja) * | 2017-05-17 | 2021-10-13 | 日本電気株式会社 | 飛行物体位置検知装置、飛行物体位置検知システム、飛行物体位置検知方法及びプログラム |
| CN110595433A (zh) * | 2019-08-16 | 2019-12-20 | 太原理工大学 | 一种基于双目视觉的输电杆塔倾斜的测量方法 |
| US12329556B2 (en) * | 2020-03-20 | 2025-06-17 | Koninklijke Philips N.V. | 3-D measurements grid tool for x-ray images |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1139506A (ja) * | 1997-07-16 | 1999-02-12 | Atr Chino Eizo Tsushin Kenkyusho:Kk | 任意視点画像生成装置 |
| JP2000121362A (ja) * | 1998-10-20 | 2000-04-28 | Asahi Optical Co Ltd | 写真測量用ターゲット測定装置 |
| JP2002090118A (ja) * | 2000-09-19 | 2002-03-27 | Olympus Optical Co Ltd | 3次元位置姿勢センシング装置 |
| JP2009248214A (ja) * | 2008-04-03 | 2009-10-29 | Kanto Auto Works Ltd | 画像処理装置、およびロボット制御システム |
| JP2010025759A (ja) * | 2008-07-18 | 2010-02-04 | Fuji Xerox Co Ltd | 位置計測システム |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0680402B2 (ja) * | 1985-05-28 | 1994-10-12 | 富士通株式会社 | 位置計測装置 |
| JPH0493705A (ja) * | 1990-08-09 | 1992-03-26 | Topcon Corp | 3次元位置測定装置及び測定方法 |
| JPH07218251A (ja) * | 1994-02-04 | 1995-08-18 | Matsushita Electric Ind Co Ltd | ステレオ画像計測方法およびステレオ画像計測装置 |
| JP3777067B2 (ja) * | 1999-07-07 | 2006-05-24 | ペンタックス株式会社 | 写真測量画像処理装置、写真測量画像処理方法、および写真測量画像処理プログラムを格納した記憶媒体 |
| JP2004271292A (ja) * | 2003-03-07 | 2004-09-30 | Meidensha Corp | 校正器及びステレオカメラ位置姿勢校正装置 |
| JP4095491B2 (ja) * | 2003-05-19 | 2008-06-04 | 本田技研工業株式会社 | 距離測定装置、距離測定方法、及び距離測定プログラム |
-
2011
- 2011-07-01 JP JP2011541976A patent/JP5070435B1/ja not_active Expired - Fee Related
- 2011-07-01 WO PCT/JP2011/003774 patent/WO2013005244A1/fr not_active Ceased
- 2011-10-07 WO PCT/JP2011/005654 patent/WO2013005265A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1139506A (ja) * | 1997-07-16 | 1999-02-12 | Atr Chino Eizo Tsushin Kenkyusho:Kk | 任意視点画像生成装置 |
| JP2000121362A (ja) * | 1998-10-20 | 2000-04-28 | Asahi Optical Co Ltd | 写真測量用ターゲット測定装置 |
| JP2002090118A (ja) * | 2000-09-19 | 2002-03-27 | Olympus Optical Co Ltd | 3次元位置姿勢センシング装置 |
| JP2009248214A (ja) * | 2008-04-03 | 2009-10-29 | Kanto Auto Works Ltd | 画像処理装置、およびロボット制御システム |
| JP2010025759A (ja) * | 2008-07-18 | 2010-02-04 | Fuji Xerox Co Ltd | 位置計測システム |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104634246A (zh) * | 2015-02-03 | 2015-05-20 | 李安澜 | 目标空间坐标的浮动式立体视觉测量系统及测量方法 |
| CN113884081A (zh) * | 2016-11-01 | 2022-01-04 | 北京墨土科技有限公司 | 测定定位点三维坐标的方法及设备 |
| CN113884081B (zh) * | 2016-11-01 | 2024-02-27 | 北京墨土科技有限公司 | 测定定位点三维坐标的方法及设备 |
| CN108195345A (zh) * | 2017-12-20 | 2018-06-22 | 合肥英睿系统技术有限公司 | 一种基于电子成像器的测距方法及系统 |
| CN110108203A (zh) * | 2019-04-11 | 2019-08-09 | 东莞中子科学中心 | 一种基于摄影测量技术的丝线位置测量方法及系统 |
| CN112325767A (zh) * | 2020-10-16 | 2021-02-05 | 华中科技大学鄂州工业技术研究院 | 一种融合机器视觉和飞行时间测量的空间平面尺寸测量方法 |
| CN112325767B (zh) * | 2020-10-16 | 2022-07-26 | 华中科技大学鄂州工业技术研究院 | 一种融合机器视觉和飞行时间测量的空间平面尺寸测量方法 |
| CN112991742A (zh) * | 2021-04-21 | 2021-06-18 | 四川见山科技有限责任公司 | 一种实时交通数据的可视化仿真方法及系统 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2013005244A1 (ja) | 2015-02-23 |
| WO2013005265A1 (fr) | 2013-01-10 |
| JP5070435B1 (ja) | 2012-11-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5070435B1 (ja) | 3次元相対座標計測装置およびその方法 | |
| JP6585006B2 (ja) | 撮影装置および車両 | |
| US10621753B2 (en) | Extrinsic calibration of camera systems | |
| TWI555379B (zh) | 一種全景魚眼相機影像校正、合成與景深重建方法與其系統 | |
| CN106548489B (zh) | 一种深度图像与彩色图像的配准方法、三维图像采集装置 | |
| CN110809786B (zh) | 校准装置、校准图表、图表图案生成装置和校准方法 | |
| JP3624353B2 (ja) | 3次元形状計測方法およびその装置 | |
| CN106331527B (zh) | 一种图像拼接方法及装置 | |
| CN106595528A (zh) | 一种基于数字散斑的远心显微双目立体视觉测量方法 | |
| CN103813151A (zh) | 图像处理装置和方法、图像处理系统和程序 | |
| JP2015203652A (ja) | 情報処理装置および情報処理方法 | |
| JP2016100698A (ja) | 校正装置、校正方法、プログラム | |
| KR20240089161A (ko) | 촬영 측정 방법, 장치, 기기 및 저장 매체 | |
| JPWO2008053649A1 (ja) | 広角画像取得方法及び広角ステレオカメラ装置 | |
| CN110580718A (zh) | 图像装置的校正方法及其相关图像装置和运算装置 | |
| JP5648159B2 (ja) | 3次元相対座標計測装置およびその方法 | |
| JP2022024688A (ja) | デプスマップ生成装置及びそのプログラム、並びに、デプスマップ生成システム | |
| JP2017098859A (ja) | 画像のキャリブレーション装置及びキャリブレーション方法 | |
| JP5727969B2 (ja) | 位置推定装置、方法、及びプログラム | |
| CN108062790B (zh) | 应用于物体三维重建的三维坐标系建立方法 | |
| TWM594322U (zh) | 全向立體視覺的相機配置系統 | |
| WO2025195304A1 (fr) | Procédé et appareil de calibrage de perle de lampe à del, dispositif et support | |
| JP2005275789A (ja) | 三次元構造抽出方法 | |
| CN112804515B (zh) | 全向立体视觉的摄像机配置系统及摄像机配置方法 | |
| JP4837538B2 (ja) | 端部位置測定方法および寸法測定方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | ENP | Entry into the national phase | Ref document number: 2011541976; Country of ref document: JP; Kind code of ref document: A |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11869130; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 11869130; Country of ref document: EP; Kind code of ref document: A1 |