
WO2017046688A1 - Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms - Google Patents

Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms

Info

Publication number
WO2017046688A1
WO2017046688A1 (PCT/IB2016/055419)
Authority
WO
WIPO (PCT)
Prior art keywords
image
laser
original image
light
background
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2016/055419
Other languages
French (fr)
Inventor
Valeriano BALLARDINI
Matteo SOLAROLI
Cristian PISTOCCHI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Specialvideo Srl
Original Assignee
Specialvideo Srl
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Specialvideo Srl filed Critical Specialvideo Srl
Publication of WO2017046688A1 publication Critical patent/WO2017046688A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20224Image subtraction
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms, by means of laser scanning by a projector (PL) of a linear beam of laser light (LL), which illuminates a cross section of the object (P) while a high definition video camera (TEL) samples it at the desired frequency and in correct phase, detecting the profile of the product with the triangulation method. There is provided the detection of the original image (n) of each laser line diffused by the object and the production of a base or background image (n-k), which comprises any spurious disturbances present in the same original image (n), together with a dynamic subtraction step of the background image from the original image, to generate an image with only the laser line and without light disturbances. The process is characterised in that before the subtraction step there is provided a realignment or shift step to realign the background of the background image with that of the original image, so that the spurious light images of the background overlap and cancel each other out in said subtraction step, the sign of the correction or offset of said shift step being dependent on the scanning direction. Before said subtraction step, the background image (n-k) is sufficiently dilated.

Description

TITLE
PROCESS FOR THE ACQUISITION OF THE SHAPE, OF THE DIMENSIONS AND OF THE POSITION IN SPACE OF PRODUCTS TO BE SUBJECTED TO CONTROLS, TO MECHANICAL MACHINING AND/OR TO GRIPPING AND HANDLING BY ROBOTIC ARMS
DESCRIPTION
The invention relates to a process for the remote acquisition of the shape or of the profile, of the dimensions and of the position in space, of products or objects to be subjected to controls, to measurements, to mechanical machining and/or to gripping and handling by robotic arms. In particular the process in question, which uses systems of optoelectronic type and comes under the international class G01B11/00 and related subclasses, provides for the acquisition of three-dimensional (3D) images through laser scanning by a projector of a flat and linear beam of laser light, which illuminates a cross section of the object while a high definition video camera samples it at the desired frequency and in correct phase. By knowing the relative position of the video camera, of the laser and of the object point in the image of the video camera, it is possible to obtain the x, y, z coordinates of the various points illuminated by the laser in the absolute reference system and through these coordinates, with the known triangulation method, it is possible to obtain the profile of the section of the object being scanned. To carry out the scan and obtain a sufficiently dense image, useful for three-dimensional mapping of the scanned surface, it is necessary for there to be a relative movement between the object and the laser illuminating it; therefore the object and the video camera can be fixed and the laser can be moving, or, more frequently, as in the case in question, the laser and the video camera will be mounted on a single supporting structure that can be movable and aimed at the fixed object or that more simply can be fixed and aimed at the moving object, the feed direction and speed of which are known.
In these scanning processes there is often the tendency to use lower power laser light emitters, for example in class 2 or 2M, which are not dangerous for the human eye and therefore can also be used in unshielded environments. However, discrimination of the light emitted by these lasers is low and it can easily be confused with spurious illumination caused by environmental lights of another type, such as sunlight that filters through windows in the working environment, or with images of the disturbing light sources reflected on reflecting objects. To complicate matters, the reflecting nature and/or the light and dark contrast areas of the surface of the object being scanned often add further disturbances.
The object of the invention is the rapid and dynamic acquisition, by a laser projector and a video camera integral with each other, of the light line projected on the surface of the object being scanned, even in said worst operating conditions, in which the video camera can detect light areas at times with intensities even greater than those of the beam of laser light, but which are extraneous to the scan and for this reason must be ignored.
Cited as prior art are the patents US 4 961 155 and US 5 280 542, both with the title "XYZ coordinates measuring system", both by the Japanese company Toyota, and the patent US 5 739 912 with the title "Object profile measuring method and apparatus" by the Japanese company N.T.T.
Only the last of these patents poses the technical problem of eliminating the spurious light that can be caused, for example, by the light emitted by a welding electrode when the detection system is mounted on the moving arm of an automatic welding machine. This technical problem is solved by subtracting from the image cyclically detected by the video camera, in synchronism with the light beam emitted by the laser projector, a background image detected a few instants earlier by the video camera, when said projector was switched off, so that the value of the coordinates resulting from this subtraction is not influenced by the light emitted by components external to the system.
This solution has proved insufficient to solve said technical problem in a simple, fast and dynamic manner: it provides for the use of a pulsed light laser, which halves the maximum sampling frequency, since to obtain a useful image two images must be acquired, one with the laser switched on and one with the laser switched off; it does not take account of the temporal and spatial lag existing between the two images subjected to said subtraction; and it does not provide at least one validating and reiterated control.
The invention intends to overcome the limits of the prior art with the process according to the appended claim 1 and to the subsequent dependent claims, the features and advantages of which will be apparent from the following description of some preferred embodiments thereof, illustrated purely by way of non-limiting example in the figures of the four attached drawings, wherein:
- Fig. 1 illustrates, schematically and in perspective, the means that provide for projection and extraction of the laser line in the process in question;
- Fig. 2 illustrates a flow diagram of the process according to the invention;
- Figs. 3 and 4 are plan views of an original image and of a background image respectively after the realignment step and after the subtraction step;
- Figs. 5 and 6 illustrate in a plan view the images of Figs. 3 and 4 respectively after the dilation step and after the subtraction step;
- Figs. 7, 8, 9 and 10 illustrate the images of Figs. 3 to 6 sectioned along the line A-A and represented in a graph having on the ordinates the levels of grey and on the abscissas the pixels in the direction of the scanning movement;
- Fig. 11 illustrates a peak of the light signal chosen, during the validation by width step.
With reference to the flow diagram of Fig. 2, it can be seen that the first step of the process to be protected is the one indicated by the block 1, which provides for acquisition of the laser line with the prior art methods and means, mentioned in the preamble and illustrated schematically in Fig. 1. There, PL indicates the projector that emits a flat, vertical and linear beam of laser light LL, oriented downwards, which intersects, transverse to the movement, the underlying object or product P to be scanned resting on a surface T, while the profile of the laser line LL projected on the same product P is detected by a video camera TEL, assumed to be integral with the projector PL but outside the laser plane by a suitable triangulation angle, both these components TEL-PL being arranged with a relative movement in the direction of the longitudinal axis Y of the product P.
Therefore, video camera TEL and projector PL can be fixed while the product P is translated longitudinally in the direction Y by the supporting surface T of a conveyor, or the product P can be fixed on the surface T while the video camera TEL and the projector PL are jointly translated in the direction Y. The video camera TEL is connected to a processor El that obtains the laser line "n" indicative of the profile of the scanned product.
From Fig. 2 it can be seen that the video camera performs a series of detections, or photos, of the product illuminated by the laser line. The block 1, through the branch 101, transfers this information to a subsequent block 2, which has the function of performing a subtraction between the original image "n" and a background or base image coming from a block 3. The block 3 is connected to the branch 201 of the block 1 through a lag or delay block 4, such that it is able to provide an image "n-k" detected by the video camera k images before the original image "n": not too far before and not immediately before, so as not to visualize, and not to even partially subtract, the laser line, but in a temporal situation such that the background image can reasonably comprise the same light disturbances, if any, present in the original image of the laser line "n". These disturbances will thus be subtracted by the block 2 from the original image, so that the output 102 of the same block 2 will provide the data related to an original laser line "n" that has substantially no disturbances and that can better contribute to defining the three-dimensional profile of the object or product being scanned.
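By way of illustration only, the following is a minimal sketch, in Python with NumPy, of the role played by the delay block 4: a ring buffer that pairs each original image "n" with the background image "n-k" captured k detections earlier. The class name and the frame format (8-bit grayscale arrays) are assumptions made for the example, not details of the patent.

```python
from collections import deque
from typing import Optional

import numpy as np

class DelayBuffer:
    """Serves, for each new frame n, the frame n-k seen k detections before."""

    def __init__(self, k: int):
        self.k = k
        self.frames: deque = deque(maxlen=k)

    def push(self, frame: np.ndarray) -> Optional[np.ndarray]:
        """Store the current frame; return frame n-k once k frames are buffered."""
        background = self.frames[0] if len(self.frames) == self.k else None
        self.frames.append(frame)
        return background
```

As the text notes, k must be chosen so that the returned frame no longer contains the laser line but still shares the light disturbances of the current frame.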
Subtraction of the background means subtraction from the images that are progressively analysed of a reference image such that only the differences are substantially visible in the image resulting from the difference, bearing in mind that the result of the same difference must always be greater than or equal to zero to be considered useful. The subtraction is performed by subtracting the value of the corresponding pixel of the reference image from the brightness value of a pixel. Reference or background image means a "base" image, the exact definition of which depends on the type of scan used and therefore on the subtraction method applied, as specified below.
If, as was assumed in the preamble, the video camera and the laser are integral with each other, whether they are movable on the fixed object or fixed on the moving object, the dynamic subtraction method with realignment or shift will be used, in which the image "n-k", shifted by a given pixel offset or variance, is subtracted from each image "n". Because the laser and the video camera are rigidly connected, the laser line is always in the same position inside the images (for a given height of the scanned surface), while, on the contrary, the background and the framed object move in the images, so that it is necessary to realign these images before the block 2 performs the subtraction. The sign of the shift offset depends on the scanning direction, and it must be borne in mind that no offset in pixels allows perfect realignment of the whole background of the images, as the shift is not constant when the actual height of the scanned objects varies and because of perspective deformations.
For all the solutions indicated above, it must be borne in mind that:
Subtraction means that [Image(n) - Image(n-k)] ≥ 0 for each pixel, that is Max(A-B, 0). Since the 8-bit value (brightness) of each pixel is between 0 and 255, the acceptable result of the subtraction cannot be less than zero, as a negative value would mean that the background has pixels brighter than those of the original image, pixels that undoubtedly do not belong to the laser line being searched for and that must therefore be ignored and cleared.
The subtraction can create spurious spots in the form of bright pixels that could be erroneously mistaken for a part of the laser line being searched for.
This type of problem occurs particularly in the case of dynamic subtraction with shift and above all with objects having curved surfaces, where, as already stated, alignment between the two images to be subtracted is intrinsically imperfect.
By calculating the difference between two images not perfectly aligned with one another, some very bright points that are not eliminated by the subtraction could appear outside the laser line, above all at the contrasted edge of spurious bright spots, and in turn these could form a very marked, very thin and bright line that the laser line extraction algorithms could erroneously interpret as part of the laser line being searched for.
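By way of illustration, a minimal sketch of the dynamic subtraction with realignment described above, assuming 8-bit grayscale images, scanning motion along the image rows and a single constant pixel offset (which, as noted above, can never realign the whole background perfectly):

```python
import numpy as np

def subtract_background(original: np.ndarray, background: np.ndarray,
                        offset: int) -> np.ndarray:
    """Dynamic subtraction with shift: realign, subtract, clamp at zero."""
    # The sign of `offset` depends on the scanning direction; np.roll wraps
    # the border rows around, which a real implementation would discard.
    realigned = np.roll(background, offset, axis=0)
    diff = original.astype(np.int16) - realigned.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)  # Max(A - B, 0) per pixel
```

The clamp implements the Max(A-B, 0) rule: every pixel where the background is brighter than the original is cleared rather than left negative.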
To further improve the quality of the subtracted laser line output from the block 2, considering that the realignment operation must be able to be performed dynamically, at high speed, and that it might not bring about the theorised compensation results, the background image output from the block 3 or from the realignment block 5 is dilated in all directions by a block 6. In this way any light areas of this background image are dilated and enlarged, so that, even if perfect realignment or perfect phasing between the images subtracted in the block 2 does not exist, the light background of the original image "n" is without doubt covered and cancelled by the light and dilated background in the subtraction n-(n-k).
The dilation considered above can be performed in a known way by applying a Max filter to the image, with which the value of each pixel is replaced with the maximum value among those of the adjacent pixels, as in the table below:
[Table omitted in source: a neighbourhood of grey values in which the centre of the laser line occupies the central row; the pixel indicated in bold has value 102 and its neighbourhood maximum is 103]
where the centre of the laser line is in the central row, with the intensity gradually decreasing towards the rows above and below. When considering the pixel indicated in bold, its value of 102 will be replaced with 103. This process increases the width and the intensity of the disturbance spots to be eliminated and, as an undesirable side effect, also of the line to be subtracted, so that a correct choice of the range k considered above is necessary.
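A minimal sketch of this dilation, assuming a square NxN Max filter as in the example above; SciPy's maximum_filter is used here purely for convenience, since grey-level dilation with a flat NxN structuring element is exactly a moving maximum:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def dilate_background(background: np.ndarray, n: int = 3) -> np.ndarray:
    """Morphology Dilation NxN: each pixel takes the maximum value
    found in its NxN neighbourhood (a moving Max filter)."""
    return maximum_filter(background, size=n)
```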
To better understand the above, attention is drawn to Fig. 3, where the original image coming from the block 1 of Fig. 2 is indicated with lo and a continuous line, while the background image generated by the block 3 and realigned by the block 5 is indicated with Ibr and a dashed line. In this figure, the images LL relate to the laser line and are indicative of the profile of the scanned product, while LS indicates the spurious images related to an undesirable reflection. Fig. 4 illustrates the images of Fig. 3 as they would be output from the subtraction block 2 of Fig. 2. Everything that was less than zero after the subtraction has been eliminated, while everything that was greater than zero in the difference and was not perfectly overlapped remains. It can be seen in Fig. 4 how the important images LL of the laser line, but also a part of the spurious image LS, remain.
Fig. 5 illustrates how the images of Fig. 3 reach the subtraction block 2 after a step of sufficient dilation in all directions, according to the theory known as Morphology Dilation NxN, of the background image Ibrd generated by the block 6 of Fig. 2. Also in this representation, LL indicates the images related to the laser line, while LS indicates the spurious images. Fig. 6 illustrates how the realigned and dilated images of Fig. 5 are output from the subtraction block 2 of Fig. 2, with only the useful images LL related to the laser line. The images LS of Fig. 5 have disappeared, as the subtraction n-(n-k) they are subjected to has given a result of less than zero.
Figs. 7 to 10 illustrate the same images as the preceding Figs. 3 to 6, sectioned along the line A-A, in the direction y of the scanning movement and indicated on a graph having on the abscissas the pixels in the direction of said movement y and on the ordinates the relative grey values. Also in these figures the original image is indicated with lo and a continuous line and the realigned background image is indicated with Ibr, Ibrd and a dashed line. Here too the images respectively of the laser line and those of the spurious light line are indicated with LL and LS. Fig. 7 illustrates the images after the realignment step only and Fig. 8 illustrates the images after the subtraction step. From Fig. 8 it can be seen how the subtraction, with realignment of the background image only, causes all the subtracted parts with a value of less than zero to disappear, but that in addition to the desired peak of laser light LL, a peak of spurious light LS also remains. Fig. 9 illustrates the images of Fig. 7 after the dilation step as in Fig. 5, while Fig. 10 illustrates the subtraction of the images after the dilation step of Fig. 9. Just as in the representation of Fig. 6, Fig. 10 illustrates how from the subtraction of the realigned and dilated background image Ibrd only the part LL related to the laser light remains, while the whole of the part LS related to the spurious light and to the realigned and dilated background image, resulting from the subtraction with a value of less than zero, is cancelled.
The output 102 of the subtraction block 2 considered above in Fig. 2 reaches a subsequent register block 7 where the various laser lines rectified by the aforesaid blocks form the various scanning columns, which together will be subsequently used to define the three-dimensional profile of the object being scanned.
The output 107 of the block 7 goes to a subsequent extraction block 8 of the laser line with the known centre of gravity (C.O.G.) method, which is based on the following algorithm:
- in each column the brightest pixel, and in any case above a predefined absolute threshold, is searched for and selected;
- a predefined percentage, for example 90% of the peak selected, is considered to define a range of brightness values;
- the weighted average between the Y coordinates of the points adjacent to the peak that exceed the predefined threshold is calculated, to define the centre of gravity of each column.
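By way of illustration, a minimal sketch of this centre of gravity extraction; the absolute threshold value and the 90% band are free parameters chosen here for the example, and the laser line is assumed to run roughly across the image so that each column holds one peak:

```python
import numpy as np

def cog_per_column(image: np.ndarray, absolute_threshold: float = 40.0,
                   fraction: float = 0.9) -> np.ndarray:
    """Subpixel Y position of the laser line for each column (NaN if none)."""
    rows, cols = image.shape
    y = np.arange(rows, dtype=np.float64)
    positions = np.full(cols, np.nan)
    for c in range(cols):
        col = image[:, c].astype(np.float64)
        p = int(col.argmax())                       # brightest pixel
        if col[p] < absolute_threshold:
            continue                                # no credible line here
        band = fraction * col[p]                    # e.g. 90% of the peak
        lo, hi = p, p
        while lo > 0 and col[lo - 1] >= band:       # run of pixels adjacent
            lo -= 1                                 # to the peak above band
        while hi < rows - 1 and col[hi + 1] >= band:
            hi += 1
        positions[c] = np.average(y[lo:hi + 1], weights=col[lo:hi + 1])
    return positions
```

The weighted average over the run adjacent to the peak is what yields a non-integer Y position, as the text notes below.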
The output 108 of the block 8 will thus provide the position Y of the laser line for each column, which, due to the procedure used, will never correspond to a whole number of pixels.
The output 108 of the block 8 goes to a subsequent block 9 that, if necessary, performs validation of the laser line extracted for each column by the previous block 8. This validation verifies the width of the laser line extracted. After a possible peak of the column exceeding the absolute threshold SA has been identified, it must be verified whether the width of the line around this peak complies with the validation thresholds (minimum and maximum). To perform this verification, as illustrated in Fig. 11, for example, the following is considered for each peak P according to a possible known method:
- the relative brightness threshold SR, for which adjacent pixels with a brightness value of less than a certain percentage, for example less than around 80% of the peak P, are not considered as belonging to the laser line n'.
If, moving along the column away from the peak P, the number of pixels needed to reach said threshold SR, i.e. the width measurement of the peak, is greater or smaller than a certain number of pixels, for example 6, then the peak is not validated.
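A minimal sketch of this validation by width, assuming SR at 80% of the peak and a maximum width of 6 pixels as in the examples of the text; the minimum width is a free parameter added for the example:

```python
import numpy as np

def validate_width(column: np.ndarray, peak_row: int,
                   relative_fraction: float = 0.8,
                   min_width: int = 1, max_width: int = 6) -> bool:
    """Accept the peak only if the line width around it, measured down to
    the relative threshold SR, lies within the validation thresholds."""
    col = column.astype(np.float64)
    sr = relative_fraction * col[peak_row]   # SR, e.g. 80% of the peak P
    lo, hi = peak_row, peak_row
    while lo > 0 and col[lo - 1] >= sr:
        lo -= 1
    while hi < len(col) - 1 and col[hi + 1] >= sr:
        hi += 1
    width = hi - lo + 1
    return min_width <= width <= max_width
```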
If the peak is validated by the block 9, through the output 109 the validated information is sent to an optional further validation block 10, for example validation by symmetry or by gradient. If the peak is also validated by this block, from the output 110 the information is sent to a final peak confirmation block 11. It should be borne in mind that if the validation blocks 9 and 10 validate the modified images, those with the subtracted background, then the same blocks 9 and 10, through the branches 301, 401, can usefully subject the original image "n" coming from the block 1 to the same verification; only if the result of this further test is positive is the peak confirmed in the final block 11.
If the tests performed by the blocks 9 and/or 10 are negative, the outputs 209 and 210 of the same blocks activate a block 12 that reiterates the verifications, for a maximum and predetermined number of times, for example approximately 5-6 times, to search for the point of maximum brightness in the same column, discarding each time the candidate points already considered in previous tests.
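By way of illustration, a minimal sketch of the reiteration performed by the block 12, assuming that a rejected candidate is simply blanked before the column is searched again:

```python
import numpy as np

def find_validated_peak(column: np.ndarray, validate, max_attempts: int = 6):
    """Try successive brightness maxima of a column until one passes the
    validation test; rejected candidates are discarded between attempts."""
    col = column.astype(np.float64).copy()
    for _ in range(max_attempts):
        peak_row = int(col.argmax())
        if col[peak_row] <= 0:
            break                      # no candidates left in this column
        if validate(column, peak_row):
            return peak_row
        col[peak_row] = 0.0            # discard this candidate; a fuller
                                       # version would mask the whole blob
    return None
```

For example, `find_validated_peak(col, validate_width)` would combine it with the width test sketched above.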
As stated previously, the block 10 can perform a validation by symmetry test. In the image detected by the video camera, the laser line can be brighter in the centre and fainter at the edges, following a function of Gaussian or bell type. Therefore, column by column (i.e., along the Y axis), a particularly bright pixel should be found, with adjacent pixels above and below of gradually decreasing brightness. The lack of this symmetry is an indicator that the identified point of the light line could be a spurious line instead of the centre of the laser line in that column. Before calculation of the symmetry, pre-smoothing is performed, substituting each bright point with the mean of itself and of the adjacent bright points.
To calculate the symmetry, as a first step, reference can be made, for example, to the formula:
S = √( Σ (i - i')² / (n · m) )
where n and m are the dimensions of the kernel and i, i' are the intensity values of the pixels symmetrical with respect to the central pixel on which the kernel is centred. To calculate S, 5 or 6 points can be considered.
Example:
[Table omitted in source: an example 1x5 kernel of intensity values i1 ... i5 centred on pixel 3]
Given a kernel of 1x5 centred on the pixel 3, we consider the root of a fraction whose numerator is the sum of the squares of the following differences, (i1 - i5) and (i2 - i4), and whose denominator is simply n = 5, since m = 1.
In this case:
S = √( ((i1 - i5)² + (i2 - i4)²) / 5 )
Since the candidate brightness peak of a given column may be identified with a method such as the Centre of Gravity method indicated above, which can find a peak corresponding not to a pixel but to a point between two successive pixels (a subpixel position), this must be borne in mind in the symmetry calculation.
To do this, it is sufficient to calculate the symmetry with the usual formula both for the bottom pixel (floor) and for the top pixel (ceiling), and then to calculate the weighted average based on the exact position of the peak between the two pixels. If, for example, the peak is at a height of 50.2, S50 and S51 will be calculated, with the former having a weight of 1 - 0.2 = 0.8 in the average and the latter a weight of 0.2, hence S = S50 x 0.8 + S51 x 0.2.
Finally, it is necessary to normalise the value of S, as a certain brightness difference (e.g. 10 levels of grey) might or might not be too high in relation to the intensity of the peak (e.g. 50 or 150). To do this a normalised S' is calculated, which will be equal to S divided by the intensity of the peak and multiplied by 100, i.e. :
S' = (S / Peak) x 100
Finally, S' will be compared with an acceptance value, verifying the condition S' < threshold.
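Putting the above together, a minimal sketch of the validation by symmetry, assuming the 1x5 kernel of the worked example, a 3-point pre-smoothing mean and a user-chosen acceptance threshold; border pixels are not guarded, for brevity:

```python
import numpy as np

def symmetry_s(col: np.ndarray, row: int) -> float:
    """S for a 1x5 kernel centred on `row`:
    sqrt(((i1 - i5)^2 + (i2 - i4)^2) / 5)."""
    i1, i2, i4, i5 = (float(col[row + d]) for d in (-2, -1, 1, 2))
    return float(np.sqrt(((i1 - i5) ** 2 + (i2 - i4) ** 2) / 5.0))

def validate_symmetry(column: np.ndarray, subpixel_peak: float,
                      threshold: float) -> bool:
    """Weighted S at the two pixels bracketing the subpixel peak, then
    normalised by the peak intensity and compared with the threshold."""
    # Pre-smoothing: each point becomes the mean of itself and its neighbours.
    col = np.convolve(column.astype(np.float64), np.ones(3) / 3.0, mode="same")
    floor_row = int(np.floor(subpixel_peak))
    frac = subpixel_peak - floor_row   # e.g. 0.2 for a peak at height 50.2
    s = (symmetry_s(col, floor_row) * (1.0 - frac)
         + symmetry_s(col, floor_row + 1) * frac)
    peak = col[floor_row] * (1.0 - frac) + col[floor_row + 1] * frac
    s_norm = s / peak * 100.0          # the normalised S'
    return s_norm < threshold
```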
Another possible validation method that can be performed by the block 10 is the validation by gradient method.
Just as for the preceding validation by the symmetry criterion, in the case of validation by gradient the brightness values of the pixels above and below the spot considered are analysed, to verify whether the variation in brightness in this range is more compatible with the presence of the laser line or of a spurious line. The algorithm of the validation by gradient is based on the convolution of an NxM kernel, whose exact definition is chosen by the developer based on the specific application (first derivative, second derivative, etc.). Finally, during this process it must be borne in mind that the position of the peak on which the kernel is to be centred is not a whole number but a subpixel position.
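By way of illustration, a minimal sketch of the validation by gradient, assuming one possible first-derivative-like 1x5 kernel (the text leaves the exact kernel to the developer) and the test that the interpolated response at the subpixel peak is close to zero, as expected at the centre of a symmetric line:

```python
import numpy as np

def validate_gradient(column: np.ndarray, subpixel_peak: float,
                      threshold: float) -> bool:
    """Correlate a derivative-like 1x5 kernel around the peak: at the centre
    of a symmetric laser line the response should be close to zero."""
    kernel = np.array([-1.0, -1.0, 0.0, 1.0, 1.0])      # first-derivative-like
    col = column.astype(np.float64)
    grads = np.convolve(col, kernel[::-1], mode="same")  # correlation form
    floor_row = int(np.floor(subpixel_peak))
    frac = subpixel_peak - floor_row
    # Interpolate the response at the subpixel position of the peak.
    g = grads[floor_row] * (1.0 - frac) + grads[floor_row + 1] * frac
    return abs(g) < threshold
```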
It is understood that the description refers to a preferred embodiment of the invention, to which numerous variants, modifications and technical equivalents can be made, all without departing from the principle of the invention, as described and illustrated and as claimed below.
In the claims, the references provided in brackets are purely indicative and do not limit the scope of protection of the claims.

Claims

1. Process for the acquisition of the shape, of the dimensions and of the position in space of products or objects (P) to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms, by means of a laser projector (PL) that projects a linear beam of laser light (LL) to illuminate a cross section of the object (P) while a video camera (TEL) samples it at the desired frequency and in correct phase and detects the profile of the object with the triangulation method, in which the video camera is in the same fixed or moving condition as the laser projector (PL) and there is provided a relative movement (y) between the laser projector (PL) and the object (P), in which there is provided the detection of the original image (n) of each laser line reflected on the object (P), in which there is provided the production of a base or background image (n-k), which comprises any spurious disturbances present in the original image (n) and which provides for a dynamic subtraction step of the background image from the original image, to generate a new image with only the laser line and without light disturbances, characterised in that before the subtraction step there is provided a realignment or shift step to realign the background of the background image with that of the original image, so that the spurious light images of the background overlap and cancel each other out in said subtraction step, the sign of the correction or offset of said shift step being dependent on the scanning direction.
2. Process according to claim 1), characterised in that before said subtraction step, the background image (n-k) is sufficiently dilated in all directions according to the Morphology Dilation NxN theory, replacing each pixel with the maximum value among those of the adjacent pixels.
3. Process according to claim 2), wherein after said subtraction step the resulting image is decomposed into columns, with a search for the maximum peak of each column using the centre of gravity method.
4. Process according to claim 3), wherein the image resulting from the search for the maximum peak is subjected to at least one validation test, preferably validation by width.
5. Process according to claim 4), characterised in that the image resulting from the validation by width is subjected to at least one further validation test, to be selected from those by symmetry or by gradient.
6. Process according to claim 5), characterised in that if the result of the validation tests by width, by symmetry or by gradient is negative, the same tests are reiterated for a maximum and predetermined number of times, for example approximately 5-6 times, discarding each time the candidate points already considered in previous tests.
7. Process according to claim 5), characterised in that before the test by symmetry or by gradient, the identified point of the light line is preferably subjected to a pre-smoothing step, replacing each light point with the average of itself and of the adjacent points.
8. Process according to claims 4), 5) and 6), characterised in that the original image (n) is also subjected to at least one or to all of the validation tests by width, by symmetry or by gradient, pursuant to these claims.
PCT/IB2016/055419 2015-09-14 2016-09-12 Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms Ceased WO2017046688A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102015000051054(UB2015A003616) 2015-09-14
ITUB2015A003616A ITUB20153616A1 (en) 2015-09-14 2015-09-14 Process for acquiring the shape, dimensions and position in the space of products to be subjected to checks, mechanical processing and / or gripping and manipulation by robotic arms

Publications (1)

Publication Number Publication Date
WO2017046688A1 true WO2017046688A1 (en) 2017-03-23

Family

ID=55069972

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/055419 Ceased WO2017046688A1 (en) 2015-09-14 2016-09-12 Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms

Country Status (2)

Country Link
IT (1) ITUB20153616A1 (en)
WO (1) WO2017046688A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114096369A (en) * 2019-07-11 2022-02-25 欧姆龙株式会社 Control device, laser processing system with control device, and laser processing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5739912A (en) * 1991-04-26 1998-04-14 Nippon Telegraph And Telephone Corporation Object profile measuring method and apparatus
WO2011069191A1 (en) * 2009-12-08 2011-06-16 Radar Portal Systems Pty Ltd High speed photometric stereo pavement scanner
US20150153161A1 (en) * 2012-10-12 2015-06-04 Nireco Corporation Shape measuring method and shape measureing device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5739912A (en) * 1991-04-26 1998-04-14 Nippon Telegraph And Telephone Corporation Object profile measuring method and apparatus
WO2011069191A1 (en) * 2009-12-08 2011-06-16 Radar Portal Systems Pty Ltd High speed photometric stereo pavement scanner
US20150153161A1 (en) * 2012-10-12 2015-06-04 Nireco Corporation Shape measuring method and shape measureing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SHIBATA N ET AL: "DEVELOPMENT OF GROOVE RECOGNITION ALGORITHM WITH VISUAL SENSOR. PRACTICAL DEVELOPMENT OF VISUAL SENSOR FOR WELDING ROBOT (IST REPORT)", 1 January 1999, WELDING INTERNATIONAL, TAYLOR & FRANCIS, ABINGDON, GB, PAGE(S) 761 - 769, ISSN: 0950-7116, XP000912684 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114096369A (en) * 2019-07-11 2022-02-25 欧姆龙株式会社 Control device, laser processing system with control device, and laser processing method
CN114096369B (en) * 2019-07-11 2023-10-10 欧姆龙株式会社 Control device, laser processing system provided with control device, and laser processing method

Also Published As

Publication number Publication date
ITUB20153616A1 (en) 2017-03-14

Similar Documents

Publication Publication Date Title
US10976262B2 (en) Mobile and automated apparatus for the detection and classification of damages on the body of a vehicle
US20130057678A1 (en) Inspection system and method of defect detection on specular surfaces
CN114450711B (en) Workpiece surface defect detection device and detection method, workpiece surface inspection system and program
JP6507653B2 (en) Inspection apparatus and control method of inspection apparatus
EP3388781B1 (en) System and method for detecting defects in specular or semi-specular surfaces by means of photogrammetric projection
US20220011241A1 (en) Surface-defect detecting method, surface-defect detecting apparatus, steel-material manufacturing method, steel-material quality management method, steel-material manufacturing facility, surface-defect determination model generating method, and surface-defect determination model
JP7302599B2 (en) Defect discrimination method, defect discrimination device, defect discrimination program and recording medium
JP2015184143A (en) Inspection device and inspection method for painted surface of vehicle body
US10360684B2 (en) Method and apparatus for edge determination of a measurement object in optical metrology
KR102027986B1 (en) Bead recognition apparatus using vision camera and method thereof
CN103245671A (en) Surface defect detection device and method for stamping piece
CN105205803A (en) Display panel defect detection method
US20230186516A1 (en) Method and flat bed machine tool for detecting a fitting position of a supporting bar
JP2021056183A (en) Apparatus and method for detecting surface defect of workpiece, surface inspection system for workpiece, and program
WO2021065349A1 (en) Workpiece surface defect detection device and detection method, workpiece surface inspection system, and program
JP2021060392A (en) Device, method, and system for detecting surface defects of workpiece and program
JP3493979B2 (en) Method and apparatus for inspecting defects on inspected surface
US20120242984A1 (en) Surface defect inspection apparatus and surface defect inspection method
CN103093475B (en) Image processing method and electronic device
WO2017046688A1 (en) Process for the acquisition of the shape, of the dimensions and of the position in space of products to be subjected to controls, to mechanical machining and/or to gripping and handling by robotic arms
JP2000321039A (en) Paint defect inspection apparatus and method
JP3460541B2 (en) Method and apparatus for inspecting defects on inspected surface
US20170069110A1 (en) Shape measuring method
JP2019079338A (en) Object detection system
JP2018040657A (en) Inspection apparatus and inspection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16795138

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16795138

Country of ref document: EP

Kind code of ref document: A1