CN111692992A - High-precision 2D size measurement method based on multi-image time-sharing exposure - Google Patents
- Publication number: CN111692992A
- Application number: CN202010572024.XA
- Authority
- CN
- China
- Prior art keywords
- measured
- exposure
- edge
- image
- sharing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Analysis (AREA)
Abstract
The invention relates to a high-precision 2D size measurement method based on multi-image time-sharing exposure, comprising the following steps: 1) recording the definition of all edges to be measured under the exposure environments of different light sources; 2) selecting the optimal combination of exposure environments according to the edge definitions; 3) acquiring multiple images of the object to be measured under the optimal combination of exposure environments with a camera in time-sharing exposure mode, extracting edge features, and obtaining the image coordinates of the straight edges; 4) calculating the size of the measured object from the image coordinates and the resolution of the edges. By exploiting the time-sharing exposure of a line-scan camera, the invention acquires multiple images of the object under different exposure conditions in a single shot and extracts all edge features to be measured on the object surface from these images together, so that the 2D size of the object can be measured accurately, improving measurement accuracy and reducing hardware cost.
Description
Technical Field
The invention relates to the technical field of image acquisition and processing, and in particular to a high-precision 2D size measurement method based on multi-image time-sharing exposure.
Background
The machine vision industry is developing rapidly, and demand for dimensional measurement of metal and similar parts grows daily. Machining inevitably introduces dimensional errors, so parts must be measured; in the past this was mostly done manually, which is slow and costly, making automated measuring equipment an urgent need.
As the technology has developed, approaches have appeared that use a single line-scan camera to extract all measurement edges from an image of a single field of view and then compute the part dimensions from the position coordinates of those edges. However, because parts vary widely in structure and metal surfaces reflect light easily, edges at different positions and in different orientations cannot all be imaged clearly in a single-field-of-view image, so not all edges to be measured can be extracted and the measurement algorithm fails. If, instead, several cameras capture images from different viewing angles, the relative positions of the cameras and their differing lens distortions mean the cameras must be calibrated against each other before dimensions can be computed; algorithm debugging is complex and hardware cost is high.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a high-precision 2D size measurement method based on multi-image time-sharing exposure in which all edge features to be measured are extracted with a single shot from a single camera, increasing measurement speed and reducing hardware cost.
The technical scheme adopted by the invention to solve this problem is a high-precision 2D size measurement method based on multi-image time-sharing exposure, comprising the following steps:
1) placing the object to be measured in an imaging environment consisting of a single line-scan camera and a plurality of light sources, and recording the definition of every edge to be measured on the object under the exposure condition of each light source;
2) selecting, according to the maximum definition value of each edge to be measured, the optimal combination of exposure conditions under which the edges to be measured attain their maximum definition;
3) performing a single shot with the line-scan camera, acquiring a plurality of images of the object under the optimal combination of exposure conditions in time-sharing exposure mode, and extracting the features of each edge to be measured from the image in which its definition is greatest, to obtain the image coordinates of the edges to be measured;
4) calculating the size of the object from the image coordinates and resolution of the edges to be measured.
Further, in step 1), the definition of an edge to be measured is D, equal to the absolute difference between the mean gray level of the edge region and the mean gray level of the background region; the larger D is, the better the edge images.
Still further, in step 2) of the invention, the optimal combination of exposure conditions is determined as follows:
A. selecting K groups of data from the data recorded in step 1), where K is at least 2 and at most the maximum number of time-sharing exposures supported by the line-scan camera;
B. finding the maximum definitions Dmax1, Dmax2, ..., Dmaxn of each edge to be measured within the K groups of data;
C. measuring the exposure effect of the K groups of data by Dmin, the smallest of Dmax1, Dmax2, ..., Dmaxn over all edges to be measured; the larger Dmin is, the better the combined exposure effect of the K groups, so the optimal combination of exposure conditions is the one that maximizes Dmin.
Furthermore, in step 3), the image coordinates of the edge points are computed with an edge-extraction algorithm, and line segments and circles are fitted to them by least squares to obtain the image coordinates of the line segments and circles.
Still further, in step 4), the physical coordinates of the object to be measured are calculated from the image coordinates and resolution of the edges to be measured according to the formulas
wx = ix * h
wy = iy * v
from which the size of the object is obtained, where (wx, wy) are the actual physical coordinates, (ix, iy) are the image coordinates, h is the horizontal resolution of the image, and v is the vertical resolution of the image.
The invention has the following beneficial effects:
1. using the time-sharing exposure of a line-scan camera, multiple images of the object under different exposure conditions are obtained in a single shot, and all edge features to be measured on the object surface are extracted from these images together, so the 2D size of the object can be measured accurately;
2. the optimal combination of time-sharing exposure conditions is selected from the definition of the edge features under different exposure conditions, and the multiple images acquired by time-sharing exposure are used together for high-precision size measurement, improving accuracy and reducing hardware cost;
3. the method guarantees that every edge can be extracted accurately, solving the problem that some edges cannot all be extracted under single-camera imaging; compared with multi-camera schemes, all edge features to be measured are extracted with a single shot from a single camera, reducing hardware cost, avoiding calibration of the relative positions of multiple cameras, and increasing measurement speed.
Drawings
Fig. 1 is image a, obtained under exposure condition 1;
Fig. 2 is image b, obtained under exposure condition 2;
Fig. 3 is a flow chart of multi-image time-sharing exposure measurement.
Detailed Description
The invention will now be described in further detail with reference to the drawings and preferred embodiments. The drawings are simplified schematic views that illustrate only the basic structure of the invention and therefore show only the parts relevant to it.
As shown in figs. 1-3, in time-sharing exposure the line-scan camera scans once and obtains multiple images of the object under multiple exposure environments. Although every image covers the region containing all edges to be measured, the definition of the edges differs between images obtained under different exposures. To extract the edge positions accurately, each edge must image with good definition in at least one of the images. The method of the invention consists of the following steps.
1) Record the definition of all edges to be measured under the exposure environments of the different light sources.
To quantify how well an edge images under different exposure conditions, an edge definition D is defined, equal to the absolute difference between the mean gray level of the edge region and the mean gray level of the background region. The larger D is, the better the edge images. The definition of all n edges to be measured is recorded under the exposure environment of each light source. Because the edges lie at different positions and orientations on the part, the light sources should be arranged at as many different angles as possible so that every edge has a chance of imaging well; for example, light sources can be arranged around and above the part, and the definition of all edges recorded under exposure from each of these five sources.
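As a minimal sketch, the edge definition D described above can be computed from a grayscale image and two pixel masks. The function name and mask layout are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def edge_definition(image, edge_mask, background_mask):
    """Edge definition D: absolute difference between the mean gray level
    of the edge region and the mean gray level of the background region.
    The larger D is, the more clearly the edge images."""
    edge_mean = float(image[edge_mask].mean())
    background_mean = float(image[background_mask].mean())
    return abs(edge_mean - background_mean)
```

Under one exposure a bright edge on a dark background gives a large D, while an edge washed out by glare gives a small D; recording D per edge and per light source produces the table of definitions used in the next step.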
2) Select the optimal combination of exposure environments according to edge definition.
In time-sharing exposure, the line-scan camera scans once and obtains k images of the object under multiple exposure environments (k depends on the number of time-sharing exposures the camera sensor supports; usually k is at least 2 and at most that maximum). Since each edge feature needs to be extracted only once, it is extracted only from the one of the k images in which its definition is highest, so it suffices that each edge be sharp in at least one of the k images. A combination of exposure environments must therefore be selected from the data recorded in step 1) such that every edge has at least one sufficiently sharp image. Select k exposure environments and find the maximum definitions Dmax1, Dmax2, ..., Dmaxn of each edge across them; at measurement time, each edge feature is extracted under the exposure condition where its definition is maximal. The exposure effect of the k groups of data is measured by Dmin, the smallest of Dmax1, Dmax2, ..., Dmaxn over all edges; the larger Dmin is, the better the combined exposure effect of the k groups. The optimal exposure environment is therefore the combination that maximizes Dmin.
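The patent does not give an algorithm listing for this selection; a minimal brute-force sketch, with illustrative names and data layout, is:

```python
from itertools import combinations

def best_exposure_combination(definitions, k):
    """definitions[j][i] is the definition D of edge i under light source j.
    Return the k light sources that maximize Dmin, where Dmin is the
    minimum over edges of the best definition achievable within the
    chosen sources (the weakest edge determines the combined effect)."""
    n_edges = len(definitions[0])
    best_combo, best_dmin = None, float("-inf")
    for combo in combinations(range(len(definitions)), k):
        # Dmax_i: best definition of edge i among the chosen light sources
        dmaxes = [max(definitions[j][i] for j in combo) for i in range(n_edges)]
        dmin = min(dmaxes)
        if dmin > best_dmin:
            best_combo, best_dmin = combo, dmin
    return best_combo, best_dmin
```

The exhaustive search is feasible here because the number of light sources (e.g. five) and the number of time-sharing exposures k are both small.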
3) Acquire multiple images of the object to be measured under the optimal combination of exposure environments using the camera's time-sharing exposure mode, extract edge features, and obtain the image coordinates of the straight edges.
Acquire multiple images of the object to be measured under the optimal combination of exposure environments using the camera's time-sharing exposure mode, determine from step 2) in which image each edge has the highest definition, and compute the edge features in that image. Extract the coordinates of the edge points with an edge-extraction algorithm, and fit line segments and circles to them by least squares to obtain the image coordinates of the line segments and circles.
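The patent does not specify the exact least-squares fitting scheme; the following sketch uses a slope-intercept line fit and the linearized (Kasa) circle fit as illustrative choices:

```python
import numpy as np

def fit_line(xs, ys):
    """Least-squares fit y = a*x + b through edge points.
    Assumes the edge is not vertical; a vertical edge can be
    fitted the same way with x and y swapped."""
    A = np.column_stack([xs, np.ones(len(xs))])
    (a, b), *_ = np.linalg.lstsq(A, ys, rcond=None)
    return a, b

def fit_circle(xs, ys):
    """Linearized least-squares circle fit: solve
    x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    for the center (cx, cy) and radius r."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones(len(xs))])
    b = xs**2 + ys**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r
```

Both fits reduce to a small linear system, so they are fast enough to run per edge on every scan.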
4) Calculate the size of the measured object from the image coordinates and resolution of the edges.
In figs. 1-2 there are four edge features to be measured on the part: distance 1 between straight-line feature 1 and straight-line feature 2, angle 2 between straight-line feature 1 and straight-line feature 3, and distance 3 between straight-line feature 1 and the center of circular feature 4. Varying the exposure conditions gives the definitions of the four edge features under the different conditions, as shown in table 1:
TABLE 1 Definition of the four edge features under different exposure conditions
Two groups of data are selected at a time, and the maxima Dmax1, Dmax2, Dmax3 and Dmax4 of the four edge features within the two groups are found. Their Dmin is computed and, as shown in table 2, the pair of groups with the largest Dmin, namely exposure conditions 1 and 2, gives the best overall edge imaging.
TABLE 2 Combined imaging effect under different combinations of exposure conditions
In actual measurement the flow is as shown in fig. 3: time-sharing exposure is performed under exposure conditions 1 and 2 to acquire the two images a and b (figs. 1 and 2). Edge features 1 and 4 image better under exposure condition 1, so these two features are extracted from image a to obtain the image coordinates of line 1 and circle 4; edge features 2 and 3 are extracted from image b to obtain the image coordinates of lines 2 and 3. The physical coordinates of the lines and the circle are then computed from the image resolution, and the size of the object is computed from the physical coordinates.
The calculated results are shown in table 3:
TABLE 3 measurement results
| Distance 1 | Angle 2 | Distance 3 |
|---|---|---|
| 2.001 mm | 90.01° | 4.762 mm |
The actual physical coordinates can be calculated from the image coordinates and resolution of the edges using the following formulas:
wx = ix * h
wy = iy * v
where (wx, wy) are the actual physical coordinates, (ix, iy) are the image coordinates, h is the horizontal resolution of the image, and v is the vertical resolution of the image.
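A minimal sketch of this image-to-physical conversion, assuming h and v are expressed in millimetres per pixel (the function name is illustrative):

```python
def image_to_physical(ix, iy, h, v):
    """Convert image coordinates (ix, iy) to physical coordinates
    (wx, wy) using the horizontal resolution h and the vertical
    resolution v of the image."""
    return ix * h, iy * v
```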
The distance d between a fitted line segment and the reference line segment can be calculated from the triangle-area formula (twice the area of the triangle spanned by the point and the reference segment, divided by the length of that segment):
d = |(lx2-lx1)*(py-ly1) - (px-lx1)*(ly2-ly1)| / sqrt((lx2-lx1)^2 + (ly2-ly1)^2)
where (px, py) are the physical coordinates of the midpoint of segment 1, and (lx1, ly1) and (lx2, ly2) are the physical coordinates of the start and end points of the reference segment. The distance between the center of the circle and the reference segment can be calculated with the same formula. The angle between a line segment and the reference segment can be calculated from the direction vectors of the two segments:
cos(angle) = ((lx2-lx1)*(lx4-lx3) + (ly2-ly1)*(ly4-ly3)) / (sqrt((lx2-lx1)^2 + (ly2-ly1)^2) * sqrt((lx4-lx3)^2 + (ly4-ly3)^2))
where (lx1, ly1) and (lx2, ly2) are the physical coordinates of the start and end points of the line segment, and (lx3, ly3) and (lx4, ly4) are the physical coordinates of the start and end points of the reference segment.
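A sketch of these two computations (function names are illustrative): the distance uses the triangle-area formula mentioned above, and the angle is computed from the direction vectors of the two segments.

```python
import math

def point_to_line_distance(px, py, lx1, ly1, lx2, ly2):
    """Distance from point (px, py) to the line through the reference
    segment: twice the triangle area divided by the base length."""
    twice_area = abs((lx2 - lx1) * (py - ly1) - (px - lx1) * (ly2 - ly1))
    return twice_area / math.hypot(lx2 - lx1, ly2 - ly1)

def segment_angle_deg(lx1, ly1, lx2, ly2, lx3, ly3, lx4, ly4):
    """Angle in degrees, in [0, 180), between segment (1 -> 2) and
    reference segment (3 -> 4), from their direction vectors."""
    a1 = math.atan2(ly2 - ly1, lx2 - lx1)
    a2 = math.atan2(ly4 - ly3, lx4 - lx3)
    return math.degrees(abs(a1 - a2)) % 180.0
```

The circle-to-segment distance (distance 3 in the example) is obtained by passing the fitted circle center as (px, py) to `point_to_line_distance`.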
While particular embodiments of the present invention have been described in the foregoing specification, various modifications and alterations to the previously described embodiments will become apparent to those skilled in the art from this description without departing from the spirit and scope of the invention.
Claims (5)
1. A high-precision 2D size measurement method based on multi-image time-sharing exposure, characterized by comprising the following steps:
1) placing the object to be measured in an imaging environment consisting of a single line-scan camera and a plurality of light sources, and recording the definition of every edge to be measured on the object under the exposure condition of each light source;
2) selecting, according to the maximum definition value of each edge to be measured, the optimal combination of exposure conditions under which the edges to be measured attain their maximum definition;
3) performing a single shot with the line-scan camera, acquiring a plurality of images of the object under the optimal combination of exposure conditions in time-sharing exposure mode, and extracting the features of each edge to be measured from the image in which its definition is greatest, to obtain the image coordinates of the edges to be measured;
4) calculating the size of the object from the image coordinates and resolution of the edges to be measured.
2. The high-precision 2D size measurement method based on multi-image time-sharing exposure according to claim 1, characterized in that: in step 1), the definition of an edge to be measured is D, equal to the absolute difference between the mean gray level of the edge region and the mean gray level of the background region; the larger D is, the better the edge images.
3. The high-precision 2D size measurement method based on multi-image time-sharing exposure according to claim 1, characterized in that: in step 2), the optimal combination of exposure conditions is determined as follows:
A. selecting K groups of data from the data recorded in step 1), where K is at least 2 and at most the maximum number of time-sharing exposures supported by the line-scan camera;
B. finding the maximum definitions Dmax1, Dmax2, ..., Dmaxn of each edge to be measured within the K groups of data;
C. measuring the exposure effect of the K groups of data by Dmin, the smallest of Dmax1, Dmax2, ..., Dmaxn over all edges to be measured; the larger Dmin is, the better the combined exposure effect of the K groups, so the optimal combination of exposure conditions is the one that maximizes Dmin.
4. The high-precision 2D size measurement method based on multi-image time-sharing exposure according to claim 1, characterized in that: in step 3), the image coordinates of the edge points are computed with an edge-extraction algorithm, and line segments and circles are fitted to them by least squares to obtain the image coordinates of the line segments and circles.
5. The high-precision 2D size measurement method based on multi-image time-sharing exposure according to claim 1, characterized in that: in step 4), the physical coordinates of the object to be measured are calculated from the image coordinates and resolution of the edges to be measured according to the formulas
wx = ix * h
wy = iy * v
from which the size of the object is obtained, where (wx, wy) are the actual physical coordinates, (ix, iy) are the image coordinates, h is the horizontal resolution of the image, and v is the vertical resolution of the image.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010572024.XA CN111692992A (en) | 2020-06-22 | 2020-06-22 | High-precision 2D size measurement method based on multi-image time-sharing exposure |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010572024.XA CN111692992A (en) | 2020-06-22 | 2020-06-22 | High-precision 2D size measurement method based on multi-image time-sharing exposure |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN111692992A true CN111692992A (en) | 2020-09-22 |
Family
ID=72482660
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010572024.XA Pending CN111692992A (en) | 2020-06-22 | 2020-06-22 | High-precision 2D size measurement method based on multi-image time-sharing exposure |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111692992A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113810621A (en) * | 2021-09-18 | 2021-12-17 | 凌云光技术股份有限公司 | Time-sharing exposure and TDI parallel processing device and method applied to multi-line linear array camera |
Citations (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1175687A (en) * | 1996-07-29 | 1998-03-11 | 埃尔帕特朗尼股份公司 | Method and apparatus for following and inspecting edge or border |
| JP2007271530A (en) * | 2006-03-31 | 2007-10-18 | Brother Ind Ltd | 3D shape detection apparatus and 3D shape detection method |
| CN101118153A (en) * | 2006-07-31 | 2008-02-06 | 三丰株式会社 | Multi-range non-contact probe |
| US20140226038A1 (en) * | 2013-02-12 | 2014-08-14 | Canon Kabushiki Kaisha | Image processing apparatus, image capturing apparatus, control method, and recording medium |
| CN105426848A (en) * | 2014-11-03 | 2016-03-23 | 倪蔚民 | Imaging method for improving success rate of biological recognition |
| CN106197320A (en) * | 2015-05-29 | 2016-12-07 | 苏州笛卡测试技术有限公司 | A kind of time-sharing multiplex quick three-dimensional scanning and data processing method thereof |
| CN106403819A (en) * | 2016-10-12 | 2017-02-15 | 颐中(青岛)烟草机械有限公司 | Irregular cigarette carton measurement positioning detection apparatus |
| CN109725503A (en) * | 2018-12-24 | 2019-05-07 | 无锡影速半导体科技有限公司 | A kind of multiband optical exposure device and method at times |
| CN110441321A (en) * | 2019-10-10 | 2019-11-12 | 征图新视(江苏)科技股份有限公司 | Transparent material Inner Defect Testing method based on different-time exposure image synthesis |
| CN110645911A (en) * | 2019-09-18 | 2020-01-03 | 重庆市光学机械研究所 | Device and method for obtaining complete outer surface 3D contour through rotary scanning |
| CN111047586A (en) * | 2019-12-26 | 2020-04-21 | 中国矿业大学 | A Pixel Equivalent Measurement Method Based on Machine Vision |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20200922 |