
WO2019000119A1 - Lighting correction apparatus and method - Google Patents


Info

Publication number
WO2019000119A1
WO2019000119A1 (PCT/CN2017/085146; CN2017085146W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
light
source
detection object
calibration plate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/085146
Other languages
English (en)
Chinese (zh)
Inventor
阳光
韩琨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen A&E Intelligent Technology Institute Co Ltd
Original Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen A&E Intelligent Technology Institute Co Ltd filed Critical Shenzhen A&E Intelligent Technology Institute Co Ltd
Priority to CN201780036245.1A priority Critical patent/CN109643444B/zh
Priority to PCT/CN2017/085146 priority patent/WO2019000119A1/fr
Publication of WO2019000119A1 publication Critical patent/WO2019000119A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/19 Image acquisition by sensing codes defining pattern positions
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/60 Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/06 Recognition of objects for industrial automation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras

Definitions

  • The present invention relates to the field of industrial vision technology, and in particular to a lighting correction method and apparatus.
  • Industrial visual inspection is an emerging detection method that achieves the required detection purpose by analyzing and calculating the pixel characteristics, brightness, and color of images of the object to be detected, collected by a machine vision device.
  • Industrial vision systems typically include a light source, an image acquisition device, an image processing device, a monitor, and the like.
  • The light source is an important factor affecting the inspection result, as it directly affects the quality of the image acquired by the image acquisition device.
  • An ideal light source would illuminate the surface of the test object evenly.
  • In practice, an ideal light source does not exist; the shape and layout of real light sources can only approximate it. That is, an actual light source illuminates the surface of the detection object unevenly, which degrades the quality of the acquired image, introduces error between the image and the actual detected object, and thereby reduces the accuracy of industrial visual inspection.
  • The present invention provides a lighting correction method and apparatus that address the reduced inspection accuracy caused, in the prior art, by uneven illumination of the detection object's surface.
  • The lighting correction method comprises: collecting an image of a detection object under illumination by a light source; positioning the detection object by using at least the image of the object to obtain its spatial position; obtaining the light field distribution of the object's visible surface from that spatial position and the pre-acquired light field information of the light source; and performing lighting correction on the image of the object according to the light field distribution.
  • The method for acquiring the pre-acquired light field information of the light source includes: collecting an image of the calibration plate under illumination by the light source; obtaining the incident light intensity at the spatial position of the reflective area according to the image of the calibration plate and the reflectance of the reflective area on the plate; and obtaining the light field information of the light source by using at least the incident light intensity.
  • Obtaining the incident light intensity at the spatial position of the reflective area according to the image of the calibration plate and the reflectance of the reflective area includes: positioning the reflective area according to the image of the calibration plate to obtain its spatial position; obtaining the reflected light intensity of the reflective area according to its grayscale value in the image of the calibration plate; and obtaining the incident light intensity at the spatial position of the reflective area according to the reflected light intensity and the reflectance of the reflective area.
  • Positioning the reflective area according to the image of the calibration plate to obtain its spatial position includes: positioning the calibration plate according to its image to obtain the spatial position information of the plate; and determining the spatial position of the reflective area according to the distribution information of the reflective areas on the plate and the spatial position information of the plate.
  • Obtaining the light field information of the light source by using at least the incident light intensity comprises: obtaining the light field information by using the incident light intensity and the radiation model corresponding to the light source.
  • Before obtaining the light field information of the light source, the method further comprises: establishing a corresponding radiation model for the light source.
  • The light source comprises a line light source and/or a surface light source.
  • Establishing the corresponding radiation model for the light source comprises: equivalencing the line source/area source to a plurality of point sources; establishing a corresponding radiation model for each equivalent point source; and taking the set of radiation models of all equivalent point sources as the radiation model of the line source/area source.
  • Equivalencing the line source/area source to a plurality of point sources includes: collecting an image of the line source/area source; determining the outgoing light intensity distribution of the source according to the image; and equivalencing the source to multiple point sources according to that distribution.
  • the light field information includes at least the intensity of the outgoing light in each visible direction of each light source.
  • Before obtaining the incident light intensity at the spatial position of the reflective area according to the image of the calibration plate and the reflectance of the reflective area, the method further includes: obtaining the reflectance of the reflective area in at least one direction.
  • Obtaining the reflectance of the reflective area in at least one direction includes: measuring the bidirectional reflectance distribution function of the reflective area to obtain its reflectance.
  • Performing lighting correction on the image of the detection object according to the light field distribution includes: adjusting the grayscale values of the pixels corresponding to the detection object in the image according to the light field distribution, so that the light field corresponding to the adjusted grayscale values is evenly distributed.
  • Positioning the detection object by using at least its image to obtain its spatial position includes: obtaining the spatial position of the detection object by using the depth image and/or the stereo model information of the object in combination with its image.
  • The present invention also provides a lighting correction apparatus, the apparatus comprising a processor and a memory, the memory storing instructions, the processor executing the instructions to implement any of the methods provided above.
  • the present invention also provides a readable storage medium storing instructions that, when executed, implement any of the methods provided above.
  • The beneficial effects of the present invention are: unlike the prior art, the present invention obtains the light field distribution of the visible surface of the detection object by using the spatial position of the object and the pre-acquired light field information of the light source, and performs lighting correction on the image of the object according to that distribution. This reduces the influence of uneven illumination on the image, so that the error between the image and the actual detected object is reduced, thereby improving the accuracy of industrial visual inspection.
  • FIG. 1 is a schematic flow chart of a first embodiment of a lighting correction method of the present invention
  • FIG. 2 is a schematic flow chart of a second embodiment of the lighting correction method of the present invention.
  • FIG. 3 is a schematic flow chart of a third embodiment of the lighting correction method of the present invention.
  • FIG. 4 is a schematic flow chart of positioning a reflective area in an embodiment of the lighting correction method of the present invention.
  • FIG. 5 is a schematic flow chart of a fourth embodiment of the lighting correction method of the present invention.
  • FIG. 6 is a schematic flow chart of a fifth embodiment of the lighting correction method of the present invention.
  • FIG. 7 is a schematic flow chart of a sixth embodiment of the lighting correction method of the present invention.
  • FIG. 8 is a schematic diagram of calibration of a reflective area in an embodiment of the lighting correction method of the present invention.
  • Figure 9 is a schematic structural view of a first embodiment of the lighting correction device of the present invention.
  • Figure 10 is a block diagram showing the structure of a second embodiment of the lighting correction device of the present invention.
  • the first embodiment of the lighting correction method of the present invention includes:
  • S11 Acquire an image of the detection object under illumination of the light source.
  • An image capture device, such as a camera, is controlled to collect an image of the detected object under illumination by the light source.
  • The image capture device can take a photograph to capture an image of the detected object when the object enters the device's field of view.
  • S12 Position the detection object by using at least the image of the detection object to obtain the spatial position of the detection object.
  • The spatial position of the detection object is used for the subsequent lighting correction of the object's surface.
  • It should include at least the spatial coordinates of the visible surface of the detection object, which can be represented by the contour shape of the visible surface and the spatial coordinates of key points; it can also be represented by the complete surface contour shape of the object and the spatial coordinates of key points.
  • The visible surface refers to the surface of the detected object that appears in the acquired image.
  • The detection object can be recognized in its image to obtain the coordinates of its visible surface. Since the image is a planar image taken from a certain angle, the coordinates obtained by recognition are only coordinates in the image plane and cannot fully reflect the spatial information of the object in all dimensions; other information may be needed to obtain the spatial position of the object.
  • Optionally, the spatial position of the detected object is obtained by using the image of the detected object in combination with the depth image and/or the stereo model information of the object.
  • The depth image is obtained by capturing the detection object with a depth sensor, and includes the depth value of each part of the object, that is, its distance from the depth sensor.
  • Combining the depth image with the image of the detection object, the spatial position of the detected object can be calculated.
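A minimal sketch of back-projecting a depth image to spatial coordinates: the pinhole-intrinsics form (fx, fy, cx, cy) and all numeric values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project each pixel's depth value to a 3D point in the sensor frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx   # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

depth = np.full((4, 4), 2.0)   # a flat surface 2 units from the sensor
pts = depth_to_points(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(pts.shape)  # → (4, 4, 3)
```

Each pixel thus becomes a 3D point, from which the visible-surface coordinates of the object can be read off.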
  • the stereo model information generally includes the shape of the detected object and the size information in each dimension.
  • When stereo model information is used, the distance of the detection object relative to the image acquisition device can be calculated from the parameters of the image acquisition device; combined with the position of the image acquisition device and the stereo model information of the detection object, the spatial position of the object can then be calculated.
  • Alternatively, the distance between the detection object and the image acquisition device may be known in advance, for example when the detection object is conveyed on a marked pipeline whose markers indicate the distance from the image acquisition device. The spatial position of the detected object can then be calculated from its coordinates in the image, the known distance, and the position and parameters of the image acquisition device.
  • S13 Acquire a light field distribution of the visible surface of the detection object by using the spatial position of the detection object and the light field information of the light source acquired in advance.
  • The light field distribution of the visible surface of the object comprises the incident light intensity at each part of the visible surface, and the light field information of the light source comprises the outgoing light intensity of each light source in each visible direction.
  • The light field distribution of the visible surface of the detection object can be calculated according to the light field function of the light source and the visible surface function of the detection object.
  • Specifically, the visible surface of the detection object may be divided into a plurality of parts. From the spatial position of the detection object and the known position of each light source, the distance from each part of the visible surface to each light source can be calculated; combined with the light attenuation coefficient (generally a constant) and the light attenuation model of the propagation medium between the light source and the detection object, the incident light intensity of each part can be calculated.
  • In the image, the visible surface of the detection object is composed of a number of pixel points, so the visible surface may be divided according to the distribution of these pixels, with each divided part corresponding to several adjacent pixels in the image. Other methods of dividing the visible surface can also be used.
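The incident-intensity calculation described above can be sketched for point sources with an inverse-square law and an exponential medium-attenuation term. The function name, the exponential attenuation form, and all numbers are illustrative assumptions; the patent only specifies "a light attenuation model".

```python
import numpy as np

def incident_intensity(patches, sources, strengths, mu=0.0):
    """Sum each source's contribution to each surface patch.

    patches: (n, 3) patch centres; sources: (m, 3) source positions;
    strengths: (m,) outgoing intensities; mu: medium attenuation coefficient.
    """
    d = np.linalg.norm(patches[:, None, :] - sources[None, :, :], axis=-1)
    contrib = strengths[None, :] * np.exp(-mu * d) / d**2  # inverse-square falloff
    return contrib.sum(axis=1)

patches = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 2.0]])  # two surface patches
sources = np.array([[0.0, 0.0, 0.0]])                   # one point source
print(incident_intensity(patches, sources, np.array([4.0])))  # → [4. 1.]
```

The patch twice as far from the source receives a quarter of the intensity, as the inverse-square model predicts.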
  • S14 Perform light correction on the image of the detection object according to the light field distribution.
  • The grayscale value of each pixel corresponding to the detected object reflects the reflected light intensity of the object's visible surface, and the reflected light intensity is proportional to the incident light intensity. Therefore, according to the light field distribution obtained in S13, the image of the detection object can be corrected by adjusting the grayscale values of at least some of the pixels corresponding to the object, so that the uniformity of the object's light field in the corrected image is improved.
  • For example, denote the grayscale values of the pixels corresponding to the detection object by {g_1, g_2, ... g_n} and the corresponding light field distribution by {i_1, i_2, ... i_n}. Each grayscale value can be scaled by a coefficient derived from the light field distribution, for example g'_j = min(int(g_j · ī / i_j), m), where g'_j is the adjusted grayscale value of the j-th pixel, ī is a reference incident light intensity (e.g. the mean of {i_1, ... i_n}), int(·) represents rounding to an integer, and m is the maximum grayscale value. Correspondingly, I_j = i_j · g'_j / g_j represents the light intensity after the adjustment of the j-th pixel point.
  • The adjusted light field distribution {I_1, I_2, ... I_n} is more uniform than the light field distribution of the original image; generally, the variance/standard deviation of {I_1, I_2, ... I_n} is smaller than that of {i_1, i_2, ... i_n}. If I_j equals the same constant for all pixels corresponding to the detection object, the adjusted light field is evenly distributed.
  • Adjusting the grayscale value by multiplying it by a coefficient obtained from the light field distribution is only illustrative; other methods may be used to adjust the grayscale values according to the light field distribution.
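A minimal numeric sketch of this coefficient-based adjustment, assuming the reference intensity is the mean incident intensity and using truncation plus clamping to the maximum gray value (illustrative choices, not mandated by the text):

```python
import numpy as np

def correct_grays(g, i, m=255):
    """Scale each gray value by (reference intensity / local incident intensity)."""
    i_ref = i.mean()                        # assumed reference intensity
    adjusted = (g * i_ref / i).astype(int)  # truncate to an integer
    return np.clip(adjusted, 0, m)          # clamp to the maximum gray value m

g = np.array([100, 200, 50])     # gray values of three pixels
i = np.array([1.0, 2.0, 0.5])    # uneven incident light field
print(correct_grays(g, i))       # → [116 116 116]
```

After correction all three pixels carry the same gray value, i.e. the light field corresponding to the adjusted grays is evenly distributed.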
  • In this embodiment, the light field distribution of the visible surface of the detection object is obtained from the spatial position of the object and the pre-acquired light field information of the light source, and the image of the object is corrected according to that distribution. The correction not only compensates for the uneven light field on the visible surface within a single image, but also for differences in the light field when the detected object is located at different positions in different images. It reduces the influence of uneven illumination on the image, so that the error between the image and the actual detected object is reduced, thereby improving the accuracy of industrial visual inspection.
  • The second embodiment of the lighting correction method of the present invention is based on the first embodiment, and the method for acquiring the light field information of the light source includes:
  • S21 Control the image capture device to collect an image of the calibration plate under illumination by the light source.
  • The image capture device in this step is the same as the one used in step S11, and the light source is the light source of step S11. "The same" generally means that the devices used in the two steps are the identical unit or the same model, installed at the same position and angle.
  • The calibration plate has at least one reflective area, which reflects incident light and whose color is not pure black. To avoid the influence of the ambient color temperature, the color of the reflective area is generally white or gray.
  • the calibration plate may have a black non-reflective area in addition to the reflective area.
  • The reflective and non-reflective areas may be combined to form a specific pattern for positioning, such as a checkerboard grid of alternating black-and-white, black-and-gray, or black, white, and gray squares. If there is more than one reflective area on the calibration plate, non-reflective areas may be placed between the reflective areas to separate them.
  • S22 Acquire the incident light intensity at the spatial position of the reflective area according to the image of the calibration plate and the reflectance of the reflective area. Before this step, the reflectance of the reflective area in at least one direction needs to be obtained. If the reflectance of the reflective area is isotropic, that is, independent of the incident light angle, then the angle of the incident light need not be considered when calculating the incident light intensity; if it is not isotropic, the angle of the incident light must be considered, and it can be calculated from the spatial position of the reflective area and the spatial position of the light source.
  • S23 Acquire at least the light field information of the light source by using the incident light intensity.
  • The resulting light field information of the light source needs to be stored for subsequent lighting correction of images of detected objects.
  • The calibration plate can be moved so that its reflective area covers each part of the field of view of the image capture device, repeating the first two steps after each move; the incident light intensity of each part of the field of view can then be taken as the light field information of the light source. Because of the sheer volume of data this produces, the relationship between incident light intensity and spatial position can instead be modeled to reduce the required storage space. Moving the calibration plate includes translating and/or rotating it to change its spatial position.
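As a sketch of modeling the intensity-versus-position relationship rather than storing every sample, one could fit a parametric attenuation model to the measurements. The single-point-source inverse-square model and all numbers below are illustrative assumptions:

```python
import numpy as np

# distance of the calibration plate's reflective area from the source,
# and the incident intensity recovered there (noise-free toy data)
d = np.array([1.0, 2.0, 4.0])
I = np.array([8.0, 2.0, 0.5])

# least-squares fit of S in the model I(d) = S / d^2:
# minimize sum_j (I_j - S/d_j^2)^2  =>  S = sum(I/d^2) / sum(1/d^4)
S = np.sum(I / d**2) / np.sum(1.0 / d**4)
print(S)  # → 8.0
```

Storing the single fitted parameter S then stands in for the full table of position/intensity samples.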
  • the light field information of the light source is obtained by using the radiation model corresponding to the light source and the acquired incident light intensity.
  • the radiation model corresponding to the light source may be its theoretical radiation model or a radiation model established according to the characteristics of the actual light source.
  • the radiation model corresponding to the light source generally includes at least the relationship between the intensity of the outgoing light of the light source and the direction, and may also include an attenuation model of the outgoing light during propagation.
  • the calibration plate is moved and the first two steps are repeated several times.
  • The unknown parameters in the radiation model of the light source are calculated from the incident light intensities at different spatial positions, yielding the outgoing light intensity of the light source in each visible direction.
  • The light field information of the light source includes at least the outgoing light intensity in each visible direction, and may also include an attenuation model of the outgoing light during propagation. If there is more than one light source, each light source can be measured independently to simplify the model and reduce the amount of calculation.
  • In this embodiment, the light field of the light source is calibrated using a calibration plate whose reflective area has a known reflectance, and the resulting light field information of the light source can be used for subsequent lighting correction.
  • step S22 includes:
  • S221 Position the reflective area according to the image of the calibration plate to obtain the spatial position where the reflective area is located.
  • The coordinates of the reflective area can be identified in the image of the calibration plate. Combined with the actual size of the reflective area and the position and parameters of the image acquisition device, the spatial coordinates of the reflective area can be calculated, realizing the positioning of the reflective area.
  • The reflective area can be positioned directly, or the calibration plate can be positioned first and the spatial position of the reflective area then derived from the spatial position of the plate. If there is more than one reflective area, one of them may be located first, and the spatial positions of the others determined from the distribution of the reflective areas.
  • step S221 specifically includes:
  • S2211 Position the calibration plate according to the image of the calibration plate to obtain spatial position information of the calibration plate.
  • S2212 Determine a spatial position where the reflective area is located according to the distribution information of the reflective area on the calibration plate and the spatial position information of the calibration plate.
  • In this way, the calibration plate is positioned first, and the spatial position of the reflective area is then determined from the spatial position of the plate and the known distribution of reflective areas on the plate. When there is more than one reflective area, this omits the identification and positioning of individual reflective areas; especially when the number of reflective areas is large, the amount of calculation is effectively reduced.
  • S222 Acquire a reflected light intensity of the reflective area according to a grayscale value of the reflective area in the image of the calibration plate.
  • the magnitude of the grayscale value reflects the reflected light intensity of the reflective area under illumination from the source.
  • The grayscale value can be used directly to represent the reflected light intensity, or it can first be processed, for example normalized, and then used to represent the reflected light intensity.
  • S223 Acquire an incident light intensity at a spatial position where the reflective region is located according to the reflected light intensity and the reflectance of the reflective region.
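The division in S223 can be sketched in a few lines; using the normalized grayscale value as the reflected intensity is an assumption for illustration:

```python
def incident_from_reflection(gray, max_gray, reflectance):
    """Incident intensity = reflected intensity / reflectance of the area."""
    reflected = gray / max_gray      # normalized reflected intensity
    return reflected / reflectance

print(round(incident_from_reflection(gray=153, max_gray=255, reflectance=0.8), 6))
# → 0.75
```

A brighter-than-expected gray at a known reflectance thus directly reveals a stronger incident light at that spatial position.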
  • The fourth embodiment of the lighting correction method of the present invention is based on the second embodiment. A radiation model of the light source is required, so before S22 the method further includes:
  • S24 Establish a corresponding radiation model for the light source.
  • the types of light sources generally include point light sources, line light sources, surface light sources and combinations thereof.
  • The theoretical radiation model of a point/line/surface light source assumes that the emitted light is uniformly distributed over the light-emitting point/line/surface. In practical applications, because of non-uniformity in the light-emitting elements and/or light-mixing materials, the outgoing light of a point source/line source/area source is often not uniformly distributed. To further improve the accuracy of the light source's light field information, a corresponding radiation model can be established for the actual light source. In the subsequent steps, the actual radiation model established in this step is combined with the incident light intensity of the reflective area to obtain the light field information of the light source.
  • the fifth embodiment of the lighting correction method of the present invention is based on the fourth embodiment of the lighting correction method of the present invention.
  • The light source includes a line source and/or a surface light source, and step S24 specifically includes:
  • S241 Collect an image of the line source/area source. The camera faces the emitting line/surface of the line source/area source to capture its image.
  • The camera used here is not necessarily the same device as the image acquisition device used previously to capture the images of the detection object and the calibration plate.
  • S242 Confirm the output light intensity distribution of the line light source/surface light source according to the image of the line light source/surface light source.
  • Specifically, the outgoing light intensity distribution is determined from the grayscale distribution of the pixels corresponding to the line source/area source in the image.
  • The larger the grayscale value, the higher the corresponding outgoing light intensity.
  • S243 Equivalence the line source/area source to a plurality of point sources according to the outgoing light intensity distribution.
  • Specifically, the line source/area source can be divided into multiple parts according to the outgoing light intensity distribution, such that the dispersion of the outgoing light intensity within each part (which can be expressed by the variance or standard deviation of the grayscale values) is less than a preset first threshold, and the difference between the mean outgoing light intensities of adjacent parts is greater than a preset second threshold.
  • Each divided part can either be directly equivalenced to a point source, or further divided according to its area: a part whose area is less than or equal to a third threshold is directly equivalenced to a point source, while a part whose area is greater than the third threshold is divided again into at least two sub-parts whose areas are less than or equal to the third threshold, each sub-part being equivalenced to a point source.
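The division by dispersion can be sketched as a greedy 1-D segmentation that uses only the first (dispersion) threshold; the adjacent-mean and area thresholds are omitted for brevity, and all names and numbers are illustrative assumptions:

```python
import numpy as np

def equivalent_point_sources(profile, std_threshold):
    """Grow segments while their standard deviation stays below the threshold;
    each finished segment becomes one equivalent point source
    (centre index, mean intensity)."""
    segments, start = [], 0
    for end in range(1, len(profile) + 1):
        if np.std(profile[start:end]) > std_threshold:
            segments.append((start, end - 1))   # close segment before this sample
            start = end - 1
    segments.append((start, len(profile)))
    return [((a + b - 1) / 2, float(np.mean(profile[a:b]))) for a, b in segments]

profile = np.array([10.0, 10.0, 10.0, 30.0, 30.0, 30.0])  # line-source intensities
print(equivalent_point_sources(profile, std_threshold=1.0))
# → [(1.0, 10.0), (4.0, 30.0)]
```

The two uniform halves of the profile each collapse to one equivalent point source at their centre.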
  • S244: For each equivalent point source, the theoretical point-source model can be used as its radiation model, or the radiation model can be established from the point-source light intensity distribution obtained in S202.
  • S245: The set of radiation models of all equivalent point sources is taken as the radiation model of the line source/area source.
  • The line source/area source may be controlled as a whole, or each of its equivalent point sources may be controlled independently.
  • For example, an optical device can be added in front of the line source/area source according to the equivalence result: one portion of the device transmits light while the remaining portion blocks it.
  • The device is controlled so that the position and size of the light-transmitting portion correspond to the equivalent point source currently being controlled.
  • In this embodiment, the line source/area source is made equivalent to a plurality of point sources according to its emitted light intensity distribution.
  • The radiation model of each equivalent point source is therefore closer to the ideal model; the measured model of each equivalent point source can also be used, making the resulting radiation model of the line source/area source more accurate.
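A minimal sketch of the part-division in S243, for the 1-D (line source) case. The greedy segment growth, the three threshold defaults, and all names below are assumptions for illustration; the patent does not prescribe a particular segmentation algorithm:

```python
import numpy as np

def equivalent_point_sources(gray, var_thresh=25.0, mean_diff_thresh=10.0, max_len=8):
    """Divide a 1-D grayscale profile of a line source into parts whose internal
    intensity dispersion (variance) stays below the first threshold, merge
    neighbours whose mean intensities differ by at most the second threshold,
    subdivide parts longer than the third threshold, and return one
    (center_position, mean_intensity) pair per equivalent point source."""
    gray = np.asarray(gray, dtype=float)
    # 1) Greedily grow segments while the grayscale variance stays below var_thresh.
    segments, start = [], 0
    for i in range(1, len(gray) + 1):
        if i == len(gray) or np.var(gray[start:i + 1]) >= var_thresh:
            segments.append((start, i))
            start = i
    # 2) Merge adjacent segments whose mean intensities differ by no more than
    #    mean_diff_thresh -- they belong to the same "part".
    merged = [segments[0]]
    for s, e in segments[1:]:
        ps, pe = merged[-1]
        if abs(gray[ps:pe].mean() - gray[s:e].mean()) <= mean_diff_thresh:
            merged[-1] = (ps, e)
        else:
            merged.append((s, e))
    # 3) Subdivide parts longer than max_len; each final piece becomes one
    #    equivalent point source at its center, carrying its mean intensity.
    sources = []
    for s, e in merged:
        n_sub = max(1, int(np.ceil((e - s) / max_len)))
        for idx in np.array_split(np.arange(s, e), n_sub):
            sources.append(((idx[0] + idx[-1]) / 2.0, gray[idx].mean()))
    return sources
```

For an area source the same thresholds would apply to 2-D regions (e.g. via connected-component labelling); the 1-D case is shown for brevity.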
  • The sixth embodiment of the lighting correction method of the present invention is based on the second embodiment.
  • Before the reflectance of the reflective region of the calibration plate is used, the method further includes:
  • By measuring the bidirectional reflectance distribution function (BRDF) of the reflective region, its reflectance can be obtained. For example, as shown in FIG. 8, the reflective region in the lower right corner of the calibration plate in the left half of the figure is calibrated: incident light of known intensity illuminates the region from various angles, the reflected light intensity is measured, and the ratio of the reflected intensity to the incident intensity is taken as the reflectance at the current angle. A reflectance-versus-angle plot, shown in the lower right of the figure, can then be drawn to represent the reflectance of the reflective region.
  • A reflective region made of a specular or diffuse material can be used. Since the BRDF model of a specular/diffuse material is known, a few measurements suffice to determine the reflectance of the region, which simplifies the process. In addition, if the calibration plate has more than one reflective region and the different regions are sufficiently uniform, only one region need be calibrated, and its reflectance can be used as the reflectance of all reflective regions.
  • This calibration step can be performed in full every time the light-source light field is calibrated; alternatively, it can be performed only once (for example, at the factory or during the initial calibration of the source light field) and the previously obtained reflectance used directly when calibrating the source light field,
  • or a few angles can be sampled to verify that the previously obtained reflectance is still correct before it is used.
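The per-angle reflectance computation in this embodiment reduces to dividing the measured reflected intensity by the known incident intensity. The function name and the dictionary interface below are assumptions; only the ratio itself comes from the text:

```python
def reflectance_curve(incident_intensity, reflected_by_angle):
    """Reflectance at each measurement angle: reflected intensity divided by
    the known incident intensity (the per-angle ratio described for FIG. 8)."""
    return {angle: reflected / incident_intensity
            for angle, reflected in reflected_by_angle.items()}
```

For a specular or diffuse region, the known BRDF model lets a few such samples determine the whole curve, as the text notes.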
  • The first embodiment of the lighting correction device of the present invention comprises a processor 110 and a memory 120, with the processor 110 connected to the memory 120.
  • The memory 120 stores the instructions and data required for the operation of the processor 110.
  • The processor 110 controls the operation of the lighting correction device; the processor 110 may also be referred to as a CPU (Central Processing Unit).
  • Processor 110 may be an integrated circuit chip with signal processing capabilities.
  • The processor 110 can also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • A general-purpose processor may be a microprocessor, or any conventional processor or the like.
  • The processor 110 executes the instructions stored in the memory 120 to implement the method provided by any embodiment of the lighting correction method of the present invention, or by any non-conflicting combination of embodiments.
  • The lighting correction device may be a separate device in the industrial vision system, disposed between the image acquisition device and the image processing device, or it may be integrated with the image processing device.
  • The first embodiment of the readable storage medium of the present invention includes a memory 210.
  • The memory 210 stores instructions that, when executed, implement the method provided by any embodiment of the lighting correction method of the present invention, or by any non-conflicting combination of embodiments.
  • The memory 210 may include a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk, an optical disk, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a lighting correction method, the method comprising: acquiring an image of a detection object under light-source illumination (S11); using at least the image of the detection object to locate the detection object and obtain its spatial position (S12); using the spatial position of the detection object and pre-acquired light-source light-field information to obtain the light-field distribution on the visible surface of the detection object (S13); and performing lighting correction on the image of the detection object based on the light-field distribution (S14). A lighting correction apparatus is also provided. The method can reduce the impact of uneven illumination on the image, so that the error between the image and the actual detection object is reduced, thereby improving the accuracy of industrial visual inspection.
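The four steps of the abstract can be sketched as a minimal pipeline. All function names and the divide-out-the-illumination correction model below are illustrative assumptions; the patent leaves the locating and correction details to the individual embodiments:

```python
import numpy as np

def lighting_correction(image, locate, light_field_at):
    """Sketch of S11-S14. `image` is the captured detection-object image (S11),
    `locate(image)` returns the object's spatial position (S12), and
    `light_field_at(position, shape)` returns the illumination falling on each
    visible-surface pixel (S13, from pre-acquired light-field information).
    S14 here divides out the relative illumination -- one common correction
    model, assumed for illustration."""
    position = locate(image)                       # S12: locate the object
    field = light_field_at(position, image.shape)  # S13: light-field distribution
    relative = field / field.mean()                # normalise to mean illumination
    return image / np.maximum(relative, 1e-6)      # S14: divide out the unevenness
```

With a uniform light field the image is returned unchanged; with an uneven field, brighter-lit pixels are attenuated and dimmer-lit pixels are boosted.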
PCT/CN2017/085146 2017-06-26 2017-06-26 Lighting correction apparatus and method Ceased WO2019000119A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780036245.1A CN109643444B (zh) Lighting correction method and device
PCT/CN2017/085146 WO2019000119A1 (fr) Lighting correction apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/085146 WO2019000119A1 (fr) Lighting correction apparatus and method

Publications (1)

Publication Number Publication Date
WO2019000119A1 true WO2019000119A1 (fr) 2019-01-03

Family

ID=64740292

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/085146 Ceased WO2019000119A1 (fr) Lighting correction apparatus and method

Country Status (2)

Country Link
CN (1) CN109643444B (fr)
WO (1) WO2019000119A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117750595A (zh) * 2024-02-19 2024-03-22 深圳市光脉电子有限公司 Hybrid light source control method and system, electronic device, and storage medium

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
CN111105365B (zh) * 2019-12-05 2023-10-24 深圳积木易搭科技技术有限公司 Color correction method, medium, terminal, and device for texture images
CN115631243B (zh) * 2022-09-14 2025-09-09 北京达佳互联信息技术有限公司 Method for calibrating light sources in a light-field acquisition system, related device, and storage medium
CN119758370A (zh) * 2024-12-24 2025-04-04 深圳亿维瑞光科技有限公司 Correlated imaging method and system for correcting light-field non-uniformity

Citations (4)

Publication number Priority date Publication date Assignee Title
US20040085477A1 (en) * 2002-10-30 2004-05-06 The University Of Chicago Method to smooth photometric variations across multi-projector displays
CN203632743U (zh) * 2013-12-16 2014-06-04 威海华菱光电股份有限公司 Image acquisition device
CN104156916A (zh) * 2014-07-31 2014-11-19 北京航空航天大学 Light-field projection method for scene illumination recovery
CN104539921A (zh) * 2014-11-26 2015-04-22 北京理工大学 Illumination compensation method based on a multi-projection system

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP2003168084A (ja) * 2001-11-30 2003-06-13 Sanyo Electric Co Ltd Personal authentication system and method
JP2012002541A (ja) * 2010-06-14 2012-01-05 Sony Corp Image processing device, image processing method, program, and electronic apparatus
JP2013221767A (ja) * 2012-04-13 2013-10-28 Panasonic Corp Appearance inspection device and appearance inspection method
CN103761713A (zh) * 2014-01-21 2014-04-30 中国石油大学(华东) Method for correcting uneven brightness in microscopic oil-displacement experiment images
CN105447834B (zh) * 2015-12-28 2018-03-13 浙江工业大学 Illumination non-uniformity correction method for mahjong images based on feature classification


Also Published As

Publication number Publication date
CN109643444A (zh) 2019-04-16
CN109643444B (zh) 2022-11-22

Similar Documents

Publication Publication Date Title
US20170359573A1 (en) Method and apparatus for camera calibration using light source
US8233157B2 (en) Method and apparatus of a portable imaging-based measurement with self calibration
WO2019000119A1 (fr) Lighting correction apparatus and method
JP5133626B2 (ja) 表面反射特性測定装置
Bertin et al. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation
US12055384B2 (en) Apparatus and method for capturing an object surface by electromagnetic radiation
TWI490445B (zh) 用於估計一物件之一三維表面形狀之方法、裝置及機器可讀非暫時性儲存媒體
JP6519265B2 (ja) 画像処理方法
US10746536B2 (en) Optical displacement meter
US20080151194A1 (en) Method and System for Illumination Adjustment
CN105277558B (zh) 一种研究表面的多步方法及其对应设备
WO2016145582A1 (fr) Procédé d'étalonnage d'écart de phase, procédé et système de détection de forme 3d et système de projection
JP2017511038A (ja) 2つの投影手段の改良された位置合わせ方法
US20160313180A1 (en) Apparatus and method for profiling a beam of a light emitting semiconductor device
KR102370888B1 (ko) 모델-기반 피크 선택을 이용하는 3 차원 프로파일 결정을 위한 시스템 및 방법
Frangez et al. Surface finish classification using depth camera data
US8334908B2 (en) Method and apparatus for high dynamic range image measurement
KR102602369B1 (ko) 매끄러운 표면을 지닌 측정 대상물의 광학 측정 영상 처리 방법 및 그 측정 시스템
KR101739096B1 (ko) 디스플레이 패널 외관 검사 장치 및 그 검사 방법
CN108240800B (zh) 表面形貌的量测方法
EP3869542B1 (fr) Dispositif et procédé d'inspection
KR100983877B1 (ko) 물체의 반사율을 측정하는 시스템 및 방법
KR102368707B1 (ko) 라인 스캔용 논-램버시안 표면 검사 시스템
JP2021032580A (ja) 測定装置及び測定方法
JP2020129187A (ja) 外形認識装置、外形認識システム及び外形認識方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17916048

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17916048

Country of ref document: EP

Kind code of ref document: A1