WO2014061372A1 - Image processing apparatus, image processing method, and image processing program - Google Patents
Image processing apparatus, image processing method, and image processing program
- Publication number
- WO2014061372A1 (application PCT/JP2013/074316, JP2013074316W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- template
- scale
- correction
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39393—Camera detects projected image, compare with reference image, position end effector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40609—Camera to monitor end effector as well as object to be handled
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30164—Workpiece; Machine component
Definitions
- The present invention relates to an image processing apparatus, an image processing method, and an image processing program that perform position detection by template matching using a template image.
- Template matching, which searches a photographed image for the position corresponding to a template image registered in advance, has been used in various fields.
- When the accuracy of the arrangement of the measurement target (work) with respect to the photographing apparatus (camera) is not high, however, the relative positional relationship between the camera and the work is not constant, and template matching may fail.
- In connection with this, the following prior art is known.
- Patent Document 1 discloses an appearance inspection method for an inspection object. More specifically, the method includes a step of inputting in advance the three-dimensional shape data of the inspection surface of the inspection object and the relative position between the inspection surface and a CCD camera, a step of acquiring a plurality of images by imaging the inspection surface with the CCD camera so that the images at least partially overlap, and a step of correcting the distortion of each image based on the three-dimensional shape data and developing it on a plane, thereby creating a plurality of at least partially overlapping plane images.
- Patent Document 2 discloses an object position detection method for detecting the position of an object by pattern matching. More specifically, in this method, a substrate is imaged by a substrate recognition camera, and a part of the image is registered as a template image. When there is a difference between the mounting angle of the camera when the template image was acquired and the mounting angle of the camera when the substrate is imaged for pattern matching, the registered template image is corrected according to the difference in mounting angle, and the position of the substrate is detected by pattern matching with the corrected template image.
- Patent Document 3 discloses an image processing apparatus that processes an image obtained by a visual sensor and acquires information on the position and/or orientation of an object. More specifically, this apparatus includes means for creating, by two-dimensional geometric transformation of a model pattern, a transformed model pattern that represents the appearance of the object in a relative posture that differs three-dimensionally from the reference relative posture, and means for performing pattern matching on the image data using the transformed model pattern; the creation of the transformed model pattern and the pattern matching are repeated for a plurality of three-dimensional relative postures.
- Non-Patent Document 1 discloses a template matching method that uses the Fourier-Mellin invariant to handle scale and rotation variation.
- When the work is only roughly arranged with respect to the camera, the work surface may be inclined with respect to the camera. In this case, the photographed image may be deformed by the tilt of the work surface, and template matching may fail. Also, if the distance between the work surface and the camera differs from that at template registration, a scale shift may occur between the captured image and the template image, and template matching may fail.
- The method of Non-Patent Document 1 assumes only uniform XY magnification variation and does not account for trapezoidal distortion caused by the inclination of the work surface.
- An object of the present invention is to provide an image processing apparatus, an image processing method, and an image processing program capable of determining the position corresponding to a template image with higher accuracy even when the arrangement state of the measurement target with respect to the imaging apparatus varies.
- An image processing system according to an aspect of the present invention includes an imaging device that captures a measurement target and acquires a captured image, an arrangement detection unit that detects the arrangement state of the measurement target and determines the tilt angle of the measurement target with respect to the imaging device, a storage unit that stores a template image, a tilt correction unit that generates a tilt-corrected image by correcting the captured image based on the tilt angle determined by the arrangement detection unit, a scale correction unit that calculates the amount of scale shift between the tilt-corrected image and the template image and generates a scale-corrected image by correcting the tilt-corrected image based on the calculated scale shift amount, and a position search unit that determines the position corresponding to the template image on the captured image by performing template matching on the scale-corrected image using the template image.
- An image processing apparatus according to another aspect is used in an image processing system that includes an imaging device that captures a measurement target and acquires a captured image, and an arrangement detection unit that detects the arrangement state of the measurement target and determines the tilt angle of the measurement target with respect to the imaging device. The image processing apparatus includes a storage unit that stores a template image, a tilt correction unit that generates a tilt-corrected image by correcting the captured image based on the tilt angle determined by the arrangement detection unit, a scale correction unit that calculates the amount of scale shift between the tilt-corrected image and the template image and generates a scale-corrected image by correcting the tilt-corrected image based on the calculated scale shift amount, and a position search unit that determines the position corresponding to the template image on the captured image by performing template matching on the scale-corrected image using the template image.
- An image processing method according to still another aspect performs template matching using a template image registered in advance, and includes a step of acquiring a captured image of the measurement target photographed by the photographing apparatus, a step of detecting the arrangement state of the measurement target and determining the tilt angle of the measurement target with respect to the photographing apparatus, a step of generating a tilt-corrected image by correcting the captured image based on the tilt angle, a step of calculating the amount of scale shift between the tilt-corrected image and the template image and generating a scale-corrected image, and a step of determining the position corresponding to the template image on the captured image by performing template matching on the scale-corrected image using the template image.
- An image processing program according to still another aspect performs template matching using a template image registered in advance. The image processing program causes a computer to execute a step of acquiring a captured image of the measurement target photographed by the imaging device, a step of acquiring the tilt angle of the measurement target with respect to the imaging device determined by the arrangement detection unit, a step of generating a tilt-corrected image by correcting the captured image based on the tilt angle, a step of calculating the amount of scale shift between the tilt-corrected image and the template image and generating a scale-corrected image, and a step of determining the position corresponding to the template image on the captured image by performing template matching on the scale-corrected image using the template image.
- According to the present invention, the position corresponding to the template image can be determined with higher accuracy even when the arrangement state of the measurement target with respect to the photographing apparatus varies.
- FIG. 1 is a schematic diagram showing an image processing system including an image processing apparatus according to an embodiment of the present invention. FIG. 2 is a block diagram showing the configuration when the image processing apparatus is realized by a personal computer.
- In the present embodiment, the posture of the work surface is measured using a stereo camera, an optical distance measuring device, or the like, the inclination between the imaging device (camera) and the work surface is specified, and the distortion of the captured image caused by this inclination is corrected, thereby realizing stable template matching. Furthermore, template matching is further stabilized by correcting the scale (magnification) variation caused by the tilt correction.
- FIG. 1 is a schematic diagram showing an image processing system including an image processing apparatus according to an embodiment of the present invention.
- The image processing apparatus captures an image of a measurement target (hereinafter referred to as "work 4") that is appropriately arranged by a robot arm 8 or the like. Template matching using a template image 18 registered in advance is performed on the acquired image (hereinafter referred to as "captured image 10"), and the image processing apparatus 100 thereby determines the position corresponding to the template image on the captured image 10.
- The image processing systems 1A and 1B include the image processing apparatus 100, a photographing device (camera 1) that photographs the measurement target (work 4) and acquires the captured image 10, and an arrangement detection unit that detects the arrangement state of the measurement target.
- The arrangement detection unit has a function of measuring distances to a plurality of points, and detects the tilt angle of the surface of the work 4 (measurement target surface) with respect to the camera by estimating the measurement surface from the distance measurement results at those points.
- The image processing system 1A shown in FIG. 1(A) employs a stereo camera including a pair of cameras 1 and 2 as the arrangement detection unit, while the image processing system 1B shown in FIG. 1(B) employs a distance measuring device 3 such as a laser distance measuring device.
- In either system, the captured image 10 is generated by the camera 1; in the image processing system 1A, one of the cameras constituting the stereo camera thus also functions as the photographing device.
- FIG. 2 is a block diagram showing a configuration when image processing apparatus 100 according to the embodiment of the present invention is realized by a personal computer.
- The image processing apparatus 100 realized by a personal computer is typically implemented on a computer having a general-purpose architecture.
- The image processing apparatus 100 includes, as main components, a CPU (Central Processing Unit) 102, a RAM (Random Access Memory) 104, a ROM (Read Only Memory) 106, a network interface (I/F) 108, an auxiliary storage device 110, a display unit 120, an input unit 122, a memory card interface (I/F) 124, a camera interface (I/F) 128, and a sensor interface (I/F) 132.
- Each component is communicably connected to each other via a bus 130.
- the CPU 102 executes various programs such as an operating system (OS) and a template matching processing program 112 stored in the ROM 106 and the auxiliary storage device 110.
- the RAM 104 functions as a working memory for executing a program by the CPU 102, and temporarily stores various data necessary for executing the program.
- the ROM 106 stores an initial program (boot program) that is executed when the image processing apparatus 100 is started.
- The network interface 108 exchanges data with other devices (such as server devices) via various communication media. More specifically, the network interface 108 performs data communication via a wired line such as Ethernet (registered trademark) (LAN (Local Area Network), WAN (Wide Area Network), etc.) and/or a wireless line such as a wireless LAN.
- The auxiliary storage device 110 typically includes a large-capacity magnetic recording medium such as a hard disk, and stores the image processing program (such as the template matching processing program 112) for realizing the various processes according to the present embodiment, the template image 18, and the like. The auxiliary storage device 110 may further store a program such as the operating system.
- The display unit 120 displays images generated by executing the template matching processing program 112 in addition to the GUI (Graphical User Interface) screen provided by the operating system.
- the input unit 122 typically includes a keyboard, a mouse, a touch panel, and the like, and outputs the content of the instruction received from the user to the CPU 102 or the like.
- The memory card interface 124 reads and writes data from and to various memory cards (nonvolatile recording media) 126 such as an SD (Secure Digital) card or a CF (CompactFlash (registered trademark)) card.
- The camera interface 128 takes in, from the camera 1, a captured image acquired by photographing a measurement target such as the work 4.
- the camera 1 functions as a photographing device that photographs a measurement target and acquires a photographed image.
- The image processing apparatus 100 need not be directly connected to the camera 1; a captured image acquired by photographing the measurement target with a camera may be taken in via the memory card 126. In this case, the memory card 126 is mounted in the memory card interface 124, and the captured image is read from the memory card 126 and stored (copied) in the auxiliary storage device 110 or the like.
- the sensor interface 132 captures a distance measurement result measured by the distance measuring device 3.
- FIG. 2 shows a configuration example that can handle both the case where a stereo camera (camera 1 and camera 2) is used and the case where the distance measuring device 3 is used as the arrangement detection unit for detecting the arrangement state of the measurement target.
- When the stereo camera is used, the sensor interface 132 need not be provided; when the distance measuring device 3 is used, the camera interface 128 may be connected only to the camera 1.
- The template matching processing program 112 stored in the auxiliary storage device 110 may be stored and distributed on a recording medium such as a CD-ROM (Compact Disk Read Only Memory), or distributed from a server device or the like via a network.
- The template matching processing program 112 may realize its processing by calling, at predetermined timings and in a predetermined order, the necessary modules among the program modules provided as part of the operating system executed by the image processing apparatus 100 (personal computer). In that case, the template matching processing program 112 itself does not include the modules provided by the operating system, and the image processing is realized in cooperation with the operating system.
- The template matching processing program 112 may also be provided incorporated as part of another program rather than as a stand-alone program. In that case as well, the program itself does not include the modules shared with the other program, and the image processing is realized in cooperation with that other program. Even such a configuration that does not include some modules does not depart from the spirit of the image processing apparatus 100 according to the present embodiment.
- The template matching processing program 112 may alternatively be realized by dedicated hardware.
- Realization example with another configuration
- Furthermore, a form such as a so-called cloud service, in which at least one server device realizes the processing according to the present embodiment, may be employed. In this case, a configuration is assumed in which the client device transmits the captured image to be processed and the necessary information to the server device (cloud side), and the server device performs the necessary processing on the transmitted captured image. It is not necessary for the server device to provide all the required functions (processes); the client device and the server device may cooperate to realize the necessary processing.
- FIG. 3 is a schematic diagram showing an example of a functional configuration of image processing apparatus 100 according to the embodiment of the present invention.
- The image processing apparatus 100 according to the present embodiment has, as its functional configuration, an arrangement detection unit 30, a tilt correction unit 150, a scale correction unit 152, a template matching unit 158, and a result output unit 160.
- These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG.
- Although the arrangement detection unit 30 includes the stereo camera (camera 1 and camera 2) or the distance measuring device 3, part of its function is realized by the CPU 102, so it is described here as part of the functional configuration of the image processing apparatus 100. Each functional configuration is described below.
- The arrangement detection unit 30 includes a stereo camera (camera 1 and camera 2) or a distance measuring device 3 and, based on their output, detects and outputs the posture and distance of the measurement target (work 4) with respect to the camera 1.
- the information of the arrangement state 12 includes the tilt angle of the measurement target (work 4) with respect to the camera 1.
- the tilt correction unit 150 receives the captured image 10 generated by the camera 1 and information on the arrangement state 12 of the workpiece 4 detected by the arrangement detection unit 30.
- The tilt correction unit 150 generates the tilt-corrected image 14, in which the tilt angle of the measurement target has been corrected, using the information on the tilt angle of the measurement target with respect to the camera 1 detected by the arrangement detection unit 30. That is, the tilt correction unit 150 generates the tilt-corrected image 14 by applying, to the captured image 10 of the work 4 captured by the camera 1, a correction for the tilt angle of the surface of the work 4 (measurement target surface) with respect to the camera 1 obtained by the arrangement detection unit 30.
- The template image 18 is basically generated by extracting a partial image, including the region whose position is to be detected, from an image photographed with the camera 1 directly facing the surface of a reference work (measurement target surface).
- Distance information 20, which includes the output result from the arrangement detection unit 30 (the information on the arrangement state 12) at the time the template image 18 was photographed, is also held.
- The image processing apparatus 100 includes a template generation unit that generates the template image 18 from the captured image 10. This template generation unit stores the arrangement state at the time the captured image 10 corresponding to the generated template image 18 was captured, in association with the generated template image 18.
- That is, the template image 18 is set using the captured image 10 captured by the camera 1, and the template image 18 (captured image 10) is stored together with the arrangement state 12 output from the arrangement detection unit 30 at the time of shooting.
- By holding the template image 18 together with the information on the arrangement state 12 of the work 4 at the time of template photographing, the template image 18 can be corrected even if it was photographed with the measurement target surface tilted with respect to the camera 1, and template matching can be executed more stably.
- The scale correction unit 152 calculates the amount of scale shift between the tilt-corrected image 14 and the template image 18, and generates the scale-corrected image 16 by correcting the tilt-corrected image 14 based on the calculated scale shift amount.
- the scale correction unit 152 receives the tilt correction image 14 generated by the tilt correction unit 150 and the template image 18.
- the scale correction unit 152 includes a scale deviation amount calculation unit 154 and a scale correction image generation unit 156.
- The scale deviation amount calculation unit 154 calculates the amount of scale deviation between the tilt-corrected image 14 and the template image 18 based on the distance from the camera 1 to the surface of the work 4 (measurement target surface) obtained by the arrangement detection unit 30 and the distance information 20 recorded when the template image 18 was captured.
- the scale correction image generation unit 156 performs scale correction on the tilt correction image 14 based on the information on the scale shift amount calculated by the scale shift amount calculation unit 154 to generate the scale correction image 16.
- the template matching unit 158 performs template matching on the scale correction image 16 using the template image 18. That is, the template matching unit 158 performs a template matching process between the scale correction image 16 and the template image 18 and acquires a position detection result corresponding to the template image 18 on the scale correction image 16.
- the result output unit 160 determines a position corresponding to the template image 18 on the captured image 10. That is, the result output unit 160 calculates a position on the captured image 10 corresponding to the position based on the position detection result corresponding to the template image 18 on the scale correction image 16. The result output unit 160 calculates the three-dimensional position of the workpiece 4 based on the calculation result of the position on the captured image 10 and the arrangement state 12 detected by the arrangement detection unit 30.
- the result output unit 160 includes a pre-correction position calculation unit 162 and a three-dimensional position calculation unit 164.
- the pre-correction position calculation unit 162 converts the position detection result by the template matching unit 158 into a position on the captured image 10.
- the three-dimensional position calculation unit 164 calculates the three-dimensional position of the workpiece 4 viewed from the camera 1 corresponding to the template position, based on the template position on the captured image 10 and the information on the arrangement state 12 by the arrangement detection unit 30. And output as a position calculation result.
- the image processing apparatus 100 shown in FIG. 1 corrects the distortion generated by the inclination of the surface of the workpiece 4 (measurement target surface) with respect to the camera 1. As a result, an image close to the template image 18 can be acquired, and template matching can be stabilized.
- As the arrangement detection unit 30, typically, a method using a stereo camera (camera 1 and camera 2) or a method using the distance measuring device 3 can be adopted.
- FIG. 4 is a schematic diagram showing an arrangement state of the measurement object detected by the arrangement detection unit 30 of the image processing apparatus 100 according to the embodiment of the present invention.
- FIG. 4A shows an example in which the camera 1 and the measurement target are facing each other.
- the tilt angle of the measurement target (work 4) with respect to the camera 1 can be defined as 0 °.
- FIG. 4B shows an example in which the camera 1 and the measurement object are deviated from the state of facing each other.
- the angle between the distance direction axis of the camera 1 and the orientation (normal direction) of the plane of the measurement object can be defined as the tilt angle ⁇ .
- the tilt angle may be defined using an angle between the distance direction axis of the camera 1 and the plane direction (tangential direction) of the measurement target.
- The arrangement detection unit 30 outputs the distance between the camera 1 and the work 4 as part of the information on the arrangement state 12. The arrangement detection unit 30 also includes a distance measurement unit that measures the distances to a plurality of points on the surface of the work 4, and determines the tilt angle of the work 4 with respect to the camera 1 by estimating the measurement surface of the work 4 from the distance measurement results for those points.
- the arrangement detection unit 30 includes a stereo camera or an optical distance measuring device (laser distance measuring device or millimeter wave radar distance measuring device) as a distance measuring unit (or a part thereof).
- A plurality of data points are acquired as position information corresponding to the surface of the work 4, and from them the surface of the work 4 is appropriately estimated. By estimating the surface of the work 4 in this way, the posture and distance of the work 4 with respect to the camera 1 can be grasped.
- Configurations employing a stereo camera and a distance measuring device as the arrangement detection unit 30 are described below.
- FIG. 5 is a diagram illustrating an example in which a stereo camera is employed as the arrangement detection unit 30.
- the arrangement detection unit 30 acquires a three-dimensional point group indicating a subject in the measurement range by three-dimensional measurement using a stereo camera including the camera 1 and the camera 2.
- The arrangement detection unit 30 extracts the data corresponding to the measurement target from the three-dimensional point group acquired over the measurement range.
- The arrangement detection unit 30 performs a plane approximation on the extracted data and, from the plane equation calculated by the approximation, determines the orientation of the measurement target plane (its normal vector) and the distance from the camera 1 (the distance from the camera corresponding to one or more points on the captured image 10).
- When the measurement target surface is cylindrical, the arrangement detection unit 30 performs a cylindrical approximation on the extracted data and determines the orientation of the cylindrical surface (the angle of the approximated cylinder axis) and the distance between the camera and the cylindrical surface (the distance from the camera corresponding to one or more points on the cylindrical surface on the captured image 10).
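- As a concrete illustration of the plane-approximation step, the following is a minimal Python sketch (not code from the patent; numpy is assumed and the function names are hypothetical) that fits an approximate plane to a 3D point group and derives the tilt angle as the angle between the plane normal and the camera's distance-direction axis.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) array of 3D points.
    Returns a unit normal (oriented toward +Z) and the centroid."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the normal.
    normal = np.linalg.svd(points - centroid)[2][-1]
    return (normal if normal[2] >= 0 else -normal), centroid

def tilt_angle_deg(normal):
    """Angle between the plane normal and the camera optical axis (0, 0, 1)."""
    cos_a = np.clip(abs(normal[2]), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_a)))
```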
- the following processing can be used as a method for extracting data corresponding to the measurement target.
- Data extraction method (1): For example, as shown in FIG. 5, consider a case where measurement targets are stacked.
- From the three-dimensional point group indicating the subject measured by the stereo camera, the arrangement detection unit 30 calculates a normal vector for each three-dimensional point from its positional relationship with neighboring three-dimensional points.
- Next, the arrangement detection unit 30 searches the three-dimensional point group for the point closest to the camera and extracts the three-dimensional points around it; for the closest point and each surrounding point, it calculates the angle between the point's normal vector and the distance-direction axis vector of the stereo camera.
- The arrangement detection unit 30 then counts, for each group of points whose angles can be regarded as the same, the number of three-dimensional points the group contains.
- Next, the arrangement detection unit 30 extracts from the point group the three-dimensional points belonging to the angle group with the largest count and provisionally performs a plane approximation based on them. Furthermore, the arrangement detection unit 30 calculates the distance of each three-dimensional point from the calculated provisional approximate plane, and extracts the points within a predetermined distance of that plane as the three-dimensional point group constituting the measurement target surface.
- Finally, the arrangement detection unit 30 performs a plane approximation on the point group extracted as constituting the measurement target surface, thereby determining the approximate plane of the measurement target surface, and calculates the inclination (tilt angle) of the measurement target surface with respect to the camera from its normal vector.
- FIG. 6 is a schematic diagram illustrating another example of the arrangement state of the measurement target.
- Data extraction method (2): For example, as shown in FIG. 6, consider the case where the robot arm 8 grips the measurement target (work 4).
- In this case, three-dimensional points existing within a predetermined distance from the camera 1 (the measurement target surface extraction range in FIG. 6) can be extracted as the three-dimensional points corresponding to the measurement target surface. That is, data indicating points in the image that exist within the predetermined distance from the camera 1 are extracted.
- The arrangement detection unit 30 performs a plane approximation on the extracted three-dimensional points to determine the approximate plane of the measurement target surface, and calculates the inclination (tilt angle) of the measurement target surface with respect to the camera from the normal vector of the approximate plane.
- Data extraction method (3): For example, consider the case where the relative position between the measurement target and the camera is roughly determined. In this case, the region on the captured image 10 in which the subject corresponding to the template image appears is set in advance.
- The arrangement detection unit 30 extracts the three-dimensional point group corresponding to the set region from the point group measured by the stereo camera and performs a plane approximation on the extracted points, thereby determining the approximate plane of the measurement target surface; the inclination (tilt angle) of the measurement target surface with respect to the camera is calculated from the normal vector of the approximate plane. That is, three-dimensional points within the predetermined range are extracted from the captured image 10, and the measurement target surface 5 of the work 4 is approximated by a plane.
- FIG. 7 is a schematic diagram showing an example of data extraction processing by the arrangement detection unit 30 according to the embodiment of the present invention.
- the arrangement detection unit 30 performs plane approximation using all of the three-dimensional point groups measured by the stereo camera, and calculates the distance of each of the three-dimensional points with respect to the approximate plane 6.
- Although most of the measured three-dimensional point group can be regarded as points on the measurement target surface, points on other surfaces, such as the floor, may also be measured as three-dimensional points. As a result, the calculated approximate plane 6 may deviate from the actual measurement target surface.
- Therefore, the arrangement detection unit 30 extracts only the three-dimensional points included within a predetermined distance (d in the figure) of the approximate plane 6 as the point group constituting the measurement target surface. By performing a plane approximation on the extracted points, the approximate plane of the measurement target surface is determined, and the inclination (tilt angle) of the measurement target surface with respect to the camera is calculated from its normal vector.
- In other words, the arrangement detection unit 30 estimates the approximate plane 6 for the positions corresponding to the work 4 as a whole and then extracts the three-dimensional points within the predetermined distance of that plane. By performing such extraction without limiting the range in advance, the portion corresponding to the measurement target surface can be identified; three-dimensional points off the measurement target surface are eliminated, and a more accurate approximate plane can be estimated, as in the sketch below.
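- A minimal sketch of the refinement in FIG. 7 (an assumed implementation, not the patent's verbatim algorithm): fit a provisional plane to all measured points, keep only the points within a distance d of it, and refit to obtain the measurement-target plane.

```python
import numpy as np

def fit_plane(points):
    c = points.mean(axis=0)
    n = np.linalg.svd(points - c)[2][-1]
    return (n if n[2] >= 0 else -n), c

def refine_plane(points, d, iterations=2):
    """Provisional fit on all points, then refit on points within distance d."""
    normal, centroid = fit_plane(points)
    inliers = points
    for _ in range(iterations):
        dist = np.abs((points - centroid) @ normal)   # point-to-plane distances
        inliers = points[dist < d]
        normal, centroid = fit_plane(inliers)
    return normal, centroid, inliers
```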
- FIG. 8 is a schematic diagram illustrating a configuration example using a laser distance measuring device as the arrangement detection unit.
- laser distance measuring devices 3-1, 3-2 and 3-3 as shown in FIG. 8 may be used.
- a plurality of (preferably three or more) laser distance measuring devices 3-1, 3-2 and 3-3 are arranged around the camera 1 which is an imaging unit.
- The relative positions of the laser distance measuring devices 3-1, 3-2, and 3-3 with respect to the camera 1 are known, and the devices are arranged so that their distance measuring directions are parallel to the optical axis direction of the camera 1.
- The arrangement detection unit 30 determines the approximate plane of the measurement target surface using the distance measurement results of the laser distance measuring devices 3-1, 3-2, and 3-3 and the information on their relative positions, and calculates the inclination (tilt angle) of the measurement target surface with respect to the camera from the normal vector of the approximate plane. In the configuration shown in FIG. 8, the measurement target surface 5 of the work 4 is measured.
- The arrangement detection unit may also output the distance from the camera 1 to the work 4. With this information, the distance between the camera 1 and the work 4 can be acquired, and the scale deviation caused by a deviation in distance can be calculated.
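- With three rangefinders whose beams are parallel to the optical axis, the measurement target plane can be recovered directly from the three measured points. A sketch under that assumption (offsets and distances in the same units; the helper name is hypothetical):

```python
import numpy as np

def plane_from_rangefinders(offsets_xy, distances):
    """offsets_xy: three (x, y) sensor positions relative to the camera;
    distances: the three measured distances along the optical axis (Z)."""
    p = np.array([[x, y, z] for (x, y), z in zip(offsets_xy, distances)])
    normal = np.cross(p[1] - p[0], p[2] - p[0])
    normal /= np.linalg.norm(normal)
    if normal[2] < 0:
        normal = -normal
    return normal, p.mean(axis=0)   # plane normal and a point on the plane
```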
- Millimeter-wave radar distance measuring device: Instead of the laser distance measuring devices shown in FIG. 8, millimeter-wave radar distance measuring devices may be used. In this case as well, the millimeter-wave radar distance measuring device is arranged at a known relative position with respect to the camera 1 serving as the imaging unit. The arrangement detection unit 30 can associate the distance measurement results of the millimeter-wave radar device with the pixel positions of the captured image 10 by converting them using the known relative position information. Based on the associated information, the approximate plane of the measurement target surface is determined, and the inclination (tilt angle) of the measurement target surface with respect to the camera is calculated from the normal vector of the approximate plane.
- The tilt correction unit 150 corrects the captured image 10 based on the inclination (tilt angle) of the measurement target surface with respect to the camera acquired by the arrangement detection unit 30, and generates a corrected image corresponding to the state in which the measurement target directly faces the camera.
- FIG. 9 is a diagram for describing the processing contents in inclination correction unit 150 of image processing apparatus 100 according to the embodiment of the present invention.
- First, a virtual plane 7 is set that has the inclination detected by the arrangement detection unit 30 and passes through a point at a predetermined distance from the camera (the rotation center O).
- the inclination correction unit 150 calculates a three-dimensional point on the virtual plane 7 corresponding to a predetermined point on the captured image 10 (for example, four corners of the image, point A in FIG. 9).
- Next, the tilt correction unit 150 calculates a rotation matrix that aligns the direction of the normal vector of the virtual plane 7 with the optical axis direction (Z direction) of the camera 1, and applies the rotation matrix, about the rotation center O, to the previously calculated points on the virtual plane 7.
- The tilt correction unit 150 then calculates the positions on the tilt-corrected image (point B in FIG. 9) by projecting the rotated points on the virtual plane 7 onto the image, and determines the homography (transformation matrix) that transforms point A into point B.
- Finally, the tilt correction unit 150 generates the tilt-corrected image 14 by applying the homography thus determined to the captured image 10.
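- A minimal OpenCV sketch of this tilt correction (an assumed implementation: a pinhole camera with intrinsic matrix K and a non-zero tilt are presupposed, and the function name is hypothetical). The image corners are back-projected onto the virtual plane, rotated about the rotation center so the plane normal aligns with the optical axis, reprojected, and the resulting homography is applied.

```python
import cv2
import numpy as np

def tilt_correct(image, K, normal, center_3d):
    """normal: unit normal of the virtual plane; center_3d: rotation center O."""
    h, w = image.shape[:2]
    corners = np.array([[0, 0], [w, 0], [w, h], [0, h]], dtype=np.float64)

    # Back-project each corner onto the virtual plane (point A in FIG. 9).
    rays = (np.linalg.inv(K) @ np.c_[corners, np.ones(4)].T).T
    t = (normal @ center_3d) / (rays @ normal)
    pts3d = rays * t[:, None]

    # Rotation taking the plane normal onto the optical axis (assumes tilt != 0).
    z = np.array([0.0, 0.0, 1.0])
    axis = np.cross(normal, z)
    angle = np.arccos(np.clip(normal @ z, -1.0, 1.0))
    R, _ = cv2.Rodrigues((axis / np.linalg.norm(axis) * angle).reshape(3, 1))

    # Rotate about the rotation center O and reproject (point B in FIG. 9).
    proj = (K @ ((R @ (pts3d - center_3d).T).T + center_3d).T).T
    proj = proj[:, :2] / proj[:, 2:3]

    H = cv2.getPerspectiveTransform(corners.astype(np.float32),
                                    proj.astype(np.float32))
    return cv2.warpPerspective(image, H, (w, h)), H
```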
- The rotation center O is not limited to the position corresponding to the center of the image; it can be set at an arbitrary point (XY position) on the image.
- FIG. 10 is a schematic diagram for illustrating the tilt correction processing of image processing apparatus 100 according to the embodiment of the present invention.
- FIG. 11 is a diagram illustrating an example of a processing result corresponding to the processing illustrated in FIG.
- FIG. 10 shows the virtual plane positions after tilt correction for different rotation centers (indicated by the marks in the figure) used at the time of the correction.
- FIG. 10 shows the result of tilt correction performed on the captured image 10 at each of the three rotation centers O1, O2, and O3.
- By the tilt correction about the rotation centers O1, O2, and O3, the template-equivalent region 9 corresponding to the template image 18 on the captured image 10 moves to the virtual positions 9-1, 9-2, and 9-3 in virtual space, respectively.
- The difference in rotation center at the time of tilt correction thus causes a difference in the tilt-corrected image, as shown in FIG. 11. That is, the virtual distance to the camera 1 changes as the virtual positions 9-1, 9-2, and 9-3 move, so the region to be detected as the template position is enlarged or reduced.
- FIG. 11(A) shows an example of the captured image 10, and FIGS. 11(B) to 11(D) show examples of the tilt-corrected image 14 corresponding to the virtual positions 9-1, 9-2, and 9-3, respectively.
- In this way, a difference in scale occurs in the tilt-corrected image 14 depending on the position of the virtual plane on which the tilt correction is performed.
- Using the output result from the arrangement detection unit 30 at the time the template image 18 was photographed (the position and orientation of the measurement target plane used for generating the template image 18: the information on the arrangement state 12 in FIG. 2), the scale deviation amount calculation unit 154 calculates the three-dimensional position of the region corresponding to the template image 18 on the measurement target surface from the position, on the captured image 10, of the region set as the template image 18.
- The scale deviation amount calculation unit 154 then calculates the scale deviation amount from the distance (Dt) between the camera 1 and the measurement target (work 4) calculated from that three-dimensional position, the output result from the arrangement detection unit 30 acquired when the work 4 was captured, and the position of the rotation center used in the tilt correction (for example, the position corresponding to the center of the image).
- More specifically, the scale deviation amount calculation unit 154 calculates the ratio (Dt/Di) of the distance Dt to the distance (Di) between the camera 1 and the rotation center position used in the tilt correction, calculated from the three-dimensional position of the rotation center, and determines the calculated ratio as the scale deviation amount. That is, the scale deviation amount calculation unit 154 calculates the scale deviation amount based on the distance of the position corresponding to the template image 18 (the distance from the camera 1 to the work 4 at the template position, acquired by the arrangement detection unit 30 when the template image 18 was captured) and the distance of the position corresponding to the rotation center used when the tilt correction unit 150 generated the tilt-corrected image 14 (the distance between the camera 1 and the work 4 at the rotation center of the tilt correction processing, acquired by the arrangement detection unit 30 when the work 4 was captured), as in the sketch below.
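- In other words, the scale deviation reduces to a ratio of two distances. A trivial sketch (variable names assumed):

```python
def scale_deviation(Dt, Di):
    """Dt: camera-to-work distance at the template position when the template
    image was taken; Di: camera distance of the tilt-correction rotation center."""
    return Dt / Di

scale_deviation(400.0, 500.0)   # e.g. 0.8 when the work is now 100 mm farther away
```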
- Alternatively, the scale shift may be calculated by applying the Fourier-Mellin invariant method to the tilt-corrected image 14 and the template image 18 (for details, see Q. Chen et al., "Symmetric Phase-Only Matched Filtering of Fourier-Mellin Transforms for Image Registration and Recognition" (Non-Patent Document 1)).
- That is, the scale deviation amount calculation unit 154 calculates the scale deviation amount using the Fourier-Mellin invariant. When a method that uses frequency-space information, such as POC (Phase Only Correlation), is adopted for template matching, it is compatible with the Fourier-Mellin invariant method, which uses the same frequency-space information, and template matching can be further stabilized.
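- A sketch in the spirit of the Fourier-Mellin approach (an assumed implementation, not Non-Patent Document 1's exact algorithm): image-domain scaling becomes a shift along the log-radius axis of the log-polar magnitude spectrum, which phase correlation recovers. Both inputs are assumed to be grayscale arrays of the same size (e.g. the template zero-padded to the image size).

```python
import cv2
import numpy as np

def estimate_scale_fm(img, tmpl):
    def logpolar_spectrum(a):
        f = np.fft.fftshift(np.abs(np.fft.fft2(a))).astype(np.float32)
        h, w = f.shape
        r = min(h, w) / 2
        lp = cv2.warpPolar(f, (w, h), (w / 2, h / 2), r,
                           cv2.WARP_POLAR_LOG | cv2.INTER_LINEAR)
        return lp, r

    lp1, r = logpolar_spectrum(img)
    lp2, _ = logpolar_spectrum(tmpl)
    (dx, _dy), _resp = cv2.phaseCorrelate(lp1, lp2)
    # warpPolar maps radius rho to x = w * log(rho) / log(maxRadius), so a
    # horizontal shift dx corresponds to a scale factor (sign depends on order).
    return float(np.exp(dx * np.log(r) / lp1.shape[1]))
```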
- Alternatively, the scale deviation amount may be calculated by associating feature points between the tilt-corrected image 14 and the template image 18.
- First, the scale deviation amount calculation unit 154 applies a Sobel filter or the like to each of the tilt-corrected image 14 and the template image 18 and extracts feature points such as corners.
- Let Pa denote the feature point group extracted from the tilt-corrected image 14, and Pt the feature point group extracted from the template image 18.
- Next, the scale deviation amount calculation unit 154 associates the feature points extracted from the tilt-corrected image 14 and the template image 18 based on SIFT (Scale-Invariant Feature Transform) feature amounts or the like.
- Then, the scale deviation amount calculation unit 154 calculates the distances between feature points on each image (||Pa_i − Pa_j|| for the tilt-corrected image 14 and ||Pt_i − Pt_j|| for the template image 18) and extracts the combinations for which the distance between the feature points is at least a predetermined value. Finally, the scale deviation amount calculation unit 154 determines, as the scale deviation amount, the average of the ratios of the distance between feature points on the template image 18 to the corresponding distance on the tilt-corrected image 14 (||Pt_i − Pt_j|| / ||Pa_i − Pa_j||).
- In other words, the scale correction unit 152 performs feature point extraction on both the tilt-corrected image 14 and the template image 18, and calculates the scale deviation amount based on the ratios of the distances between the extracted feature points. According to this modification, the scale deviation can be calculated purely from information between the images; a sketch follows.
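- A sketch of this feature-based modification, assuming OpenCV's SIFT implementation (the threshold and helper name are illustrative):

```python
import cv2
import numpy as np

def estimate_scale_sift(tilt_corr, tmpl, min_dist=10.0):
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(tilt_corr, None)   # Pa side
    kp_t, des_t = sift.detectAndCompute(tmpl, None)        # Pt side
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des_t, des_a)

    pt = np.array([kp_t[m.queryIdx].pt for m in matches])  # points on template
    pa = np.array([kp_a[m.trainIdx].pt for m in matches])  # points on tilt-corrected image
    ratios = []
    for i in range(len(matches)):
        for j in range(i + 1, len(matches)):
            dt, da = np.linalg.norm(pt[i] - pt[j]), np.linalg.norm(pa[i] - pa[j])
            if dt >= min_dist and da >= min_dist:   # skip unstable short baselines
                ratios.append(dt / da)              # ||Pt_i - Pt_j|| / ||Pa_i - Pa_j||
    return float(np.mean(ratios))
```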
- Modified Example 3 of Scale Deviation Calculation
- In this modification, a plurality of template images are generated by applying different magnifications to the reference template image 18, and the scale deviation amount calculation unit 154 performs template matching between each of these template images and the tilt-corrected image 14, computing the SAD (Sum of Absolute Differences) value for each. The scale deviation amount calculation unit 154 then determines, as the scale deviation amount, the magnification corresponding to the template image with the smallest SAD value.
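- A sketch of this multi-template variant (assumed details: the candidate magnification range is illustrative, and OpenCV's TM_SQDIFF is used as a stand-in for a plain SAD score; both are minimized by the best match):

```python
import cv2
import numpy as np

def estimate_scale_multi_template(tilt_corr, tmpl, scales=np.linspace(0.8, 1.25, 10)):
    best_scale, best_score = 1.0, np.inf
    for s in scales:
        t = cv2.resize(tmpl, None, fx=s, fy=s)
        if t.shape[0] > tilt_corr.shape[0] or t.shape[1] > tilt_corr.shape[1]:
            continue   # the template must fit inside the search image
        score = cv2.matchTemplate(tilt_corr, t, cv2.TM_SQDIFF).min()
        if score < best_score:
            best_scale, best_score = s, score
    return best_scale
```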
- The scale correction image generation unit 156 generates the scale-corrected image 16 from the tilt-corrected image 14, based on the scale shift amount calculated by the scale deviation amount calculation unit 154, so that its scale equals that of the template image 18.
- The template matching unit 158 performs template matching on the scale-corrected image 16 generated by the scale correction image generation unit 156 (scale correction unit 152) using the template image 18, and detects the position corresponding to the template image 18 on the scale-corrected image 16.
- Since both the inclination (tilt angle) of the measurement target surface and the scale have been corrected in the scale-corrected image 16, basically any template matching method can be applied.
- For example, the template matching unit 158 executes pattern matching by the RIPOC (Rotation Invariant Phase Only Correlation) method. A pattern matching method that uses the frequency components of the image, such as RIPOC, is preferable in terms of matching accuracy and robustness.
- the scale shift amount calculation unit 154 calculates the scale shift amount by applying the Fourier-Mellin Invariant method, information regarding the shift in the rotation direction can also be acquired.
- Alternatively, template matching may be performed by the POC (Phase Only Correlation) method or the SAD (Sum of Absolute Differences) method.
- In particular, the POC method, which uses the frequency components of the image, is suitable; a minimal sketch follows.
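- For reference, a minimal POC sketch (an assumed implementation; RIPOC additionally estimates rotation from a polar-remapped spectrum before this step). The translation between two same-size images appears as a sharp peak in the inverse FFT of the normalized cross-power spectrum.

```python
import numpy as np

def poc_peak(img, tmpl, eps=1e-10):
    """img and tmpl must be the same size (e.g. the template zero-padded)."""
    F = np.fft.fft2(img.astype(np.float64))
    G = np.fft.fft2(tmpl.astype(np.float64))
    cross = F * np.conj(G)
    corr = np.fft.ifft2(cross / (np.abs(cross) + eps)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past half the image size wrap around to negative displacements.
    h, w = corr.shape
    if dy > h // 2: dy -= h
    if dx > w // 2: dx -= w
    return (dx, dy), float(corr.max())
```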
- After the position is detected, the result output unit 160 calculates and outputs the position on the captured image 10 from the detected position.
- That is, the result output unit 160 calculates the position on the pre-correction image (captured image 10) corresponding to the position on the scale-corrected image 16.
- Because template matching is performed after the captured image 10 has been corrected, the template-corresponding position on the captured image 10 cannot be specified unless the detected position is converted back to a position on the pre-correction image.
- The pre-correction position calculation unit 162 converts the position detection result on the scale-corrected image 16 acquired by the template matching unit 158 into a position on the image before correction. More specifically, the pre-correction position calculation unit 162 applies the inverse of the scale correction applied by the scale correction image generation unit 156 and the inverse of the homography applied by the tilt correction unit 150, thereby converting the result into a position on the captured image 10 that was input to the tilt correction unit 150.
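- A sketch of this inverse mapping (assumed interface: `scale` is the factor applied by the scale correction and `H` the homography applied by the tilt correction; the function name is hypothetical):

```python
import cv2
import numpy as np

def to_precorrection_position(pos_xy, scale, H):
    """Map a position detected on the scale-corrected image back onto the
    original captured image by undoing the scale correction and the homography."""
    x, y = pos_xy[0] / scale, pos_xy[1] / scale           # inverse scale correction
    pt = np.array([[[x, y]]], dtype=np.float64)
    out = cv2.perspectiveTransform(pt, np.linalg.inv(H))  # inverse tilt correction
    return float(out[0, 0, 0]), float(out[0, 0, 1])
```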
- The three-dimensional position calculation unit 164 calculates the position in three-dimensional space corresponding to the position on the captured image 10 (the position on the pre-correction image) based on that position, acquired by the pre-correction position calculation unit 162, and the distance measurement result acquired by the arrangement detection unit 30.
- the three-dimensional position calculation unit 164 employs the result of measurement with a stereo camera corresponding to the position on the captured image 10 to determine the position in the three-dimensional space.
- FIG. 12 is a schematic diagram for illustrating the process of calculating the three-dimensional position of image processing apparatus 100 according to the embodiment of the present invention.
- In FIGS. 12(A) and 12(B), the crosses indicate the template position on the captured image 10, and the circle marks indicate the calculated three-dimensional position on the approximate plane 6 determined by the arrangement detection unit 30.
- That is, the three-dimensional position calculation unit 164 calculates the position in three-dimensional space (the circle mark in FIG. 12(B)) as the intersection of the line-of-sight vector 11 (see FIG. 12(B)), calculated from the position on the captured image 10 (the position indicated by the cross in FIG. 12(A)), and the approximate plane 6 estimated by the arrangement detection unit 30.
- the result output unit 160 calculates the three-dimensional position of the measurement target from the position calculation result on the pre-correction image (captured image 10) and the information output from the arrangement detection unit 30.
- a three-dimensional position corresponding to the template position can be calculated as the intersection of the line of sight and the plane.
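- A sketch of this line-of-sight/plane intersection (assuming a pinhole camera with intrinsic matrix K and the approximate plane given by its unit normal and a point on it):

```python
import numpy as np

def template_position_3d(pos_xy, K, normal, point_on_plane):
    """Intersect the viewing ray through pos_xy with the plane n.x = n.p0."""
    ray = np.linalg.inv(K) @ np.array([pos_xy[0], pos_xy[1], 1.0])
    t = (normal @ point_on_plane) / (normal @ ray)
    return ray * t   # 3D position in the camera coordinate system
```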
- a stereo camera is employed as the arrangement detection unit 30, a three-dimensional position corresponding to the template position can be determined by acquiring a three-dimensional measurement result corresponding to the position on the pre-correction image.
- FIG. 13 is a schematic diagram showing a first modification of the functional configuration of the image processing apparatus 100 according to the embodiment of the present invention.
- the image processing apparatus 100 according to the present modification has, as its functional configuration, an arrangement detection unit 30, an inclination correction unit 150, a scale correction unit 152A, a template matching unit 158, and a result output unit 160A.
- These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG.
- The functional configuration shown in FIG. 13 differs from that shown in FIG. 3 in that the scale correction unit 152A (scale deviation amount calculation unit 154A and scale correction image generation unit 156A) and the result output unit 160A (pre-correction position calculation unit 162A and three-dimensional position calculation unit 164A) do not use the information on the arrangement state 12 detected by the arrangement detection unit 30. That is, the scale correction unit 152A calculates the scale deviation amount based on the tilt-corrected image 14, the template image 18, and the distance information 20, and the result output unit 160A outputs the position calculation result based on the position detection result from the template matching unit 158. For this purpose, the scale correction unit 152A and the result output unit 160A use the Fourier-Mellin invariant method.
- FIG. 14 is a schematic diagram showing a second modification of the functional configuration of the image processing apparatus 100 according to the embodiment of the present invention.
- The image processing apparatus 100 according to this modification has, as its functional configuration, an arrangement detection unit 30, a tilt correction unit 150B, a scale correction unit 152B (scale deviation amount calculation unit 154B and scale correction image generation unit 156B), a template matching unit 158B, and a result output unit 160B (pre-correction position calculation unit 162B and three-dimensional position calculation unit 164B).
- These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG.
- The tilt correction unit 150B generates the tilt-corrected image 14 by correcting the captured image 10 based on the information on the arrangement state 12 detected by the arrangement detection unit 30, and also generates a tilt-corrected template image 24 by correcting the template image 18 based on the distance information 20.
- the scale correction unit 152B generates the scale correction image 16 and the scale correction template image 26 by performing scale correction on the inclination correction image 14 and the inclination correction template image 24 based on the calculated scale deviation amount.
- the template matching unit 158B performs template matching using the scale correction image 16 and the scale correction template image 26.
- the result output unit 160B outputs a position calculation result from the position detection result by the template matching unit 158B.
- In this way, the image processing apparatus 100 includes a template correction unit (scale correction unit 152B) that corrects the template image 18 in parallel with the generation of the scale-corrected image 16.
- That is, the template image 18 is also corrected when the scale-corrected image 16 is generated. Thereby, even if the registered template image 18 was photographed tilted with respect to the camera 1, more stable template matching can be realized by correcting the template image 18.
- 《j3: Modification 3》 As described with reference to FIG. 10, in the tilt correction process it is preferable to make the position of the rotation center coincide with the template position. If the rotation center of the tilt correction is offset from the template position, enlargement or reduction occurs at the template position on the generated tilt-corrected image 14, which raises the concern of a loss of information. Therefore, after template matching, the tilt correction and the template matching may be executed again, using the template position obtained by the first matching as the rotation center, and the position calculation result may be output based on this second result. With this configuration, the tilt correction can be executed with the template position as the rotation center; the loss of information caused by tilt correction is avoided, and a more accurate template matching result can be obtained.
- FIG. 15 is a schematic diagram showing a third modification of the functional configuration of the image processing apparatus 100 according to the embodiment of the present invention.
- The image processing apparatus 100 according to the present modification includes an arrangement detection unit 30, an inclination correction unit 150, a scale correction unit 152A (scale deviation amount calculation unit 154A and scale correction image generation unit 156A), a template matching unit 158, a pre-correction position calculation unit 162, an inclination correction unit 150C, a scale correction unit 152C (scale deviation amount calculation unit 154C and scale correction image generation unit 156C), a template matching unit 158C, and a result output unit 160C (pre-correction position calculation unit 162C and three-dimensional position calculation unit 164C).
- These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG. 2.
- The arrangement detection unit 30, the inclination correction unit 150, the scale correction unit 152A, the template matching unit 158, and the pre-correction position calculation unit 162 correspond to preprocessing: they provisionally calculate the position on the captured image 10 corresponding to the template image 18 (the template position).
- The scale correction unit 152A and the pre-correction position calculation unit 162A use the Fourier-Mellin Invariant method.
- The inclination correction unit 150C generates the tilt-corrected image 14C by performing tilt correction with the calculated template position as the rotation center.
- The scale correction unit 152C generates a scale-corrected image by performing scale correction on the tilt-corrected image 14C.
- The template matching unit 158C performs template matching on the scale-corrected image.
- The result output unit 160C outputs a position calculation result based on the position detection result of the template matching unit 158C.
- In other words, the image processing apparatus 100 corrects the captured image 10 obtained by photographing the workpiece 4, converts the position obtained as the template matching result into a position on the captured image 10, sets that converted position as the rotation center for a new tilt correction, tilt-corrects the captured image 10 again based on the set rotation center to generate the tilt-corrected image 14C, performs template matching once more, and outputs the result as the final template matching result.
- As preprocessing, the pre-correction position calculation unit 162A determines the position on the captured image 10 corresponding to the template image 18 by performing template matching, using the template image 18, on the scale-corrected image 16 generated by the inclination correction unit 150 and the scale correction unit 152A.
- The inclination correction unit 150C generates the tilt-corrected image 14C by correcting the captured image 10 around the position, determined in the preprocessing, corresponding to the template image 18 on the captured image 10.
- The scale correction unit 152C generates a scale-corrected image 16C by performing scale correction on the tilt-corrected image 14C generated by the inclination correction unit 150C.
- The template matching unit 158C determines the position on the captured image 10 corresponding to the template image 18 by performing template matching, using the template image 18, on the scale-corrected image 16C.
- If the rotation center is offset from the template position, the region of the tilt-corrected image 14 corresponding to the template image 18 is enlarged or reduced, and information may be lost. Performing tilt correction around the template position avoids this loss of information, so a more accurate template position detection result can be obtained.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Quality & Reliability (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
Abstract
Description
The present embodiment measures the posture of the workpiece surface using a stereo camera, an optical rangefinder, or the like, identifies the tilt between the imaging device (camera) and the workpiece surface, and corrects the distortion of the captured image caused by this tilt, thereby realizing stable template matching. At the same time, the scale (magnification) variation caused by the tilt correction is also corrected, making the template matching even more stable.
First, the image processing apparatus 100 according to the embodiment of the present invention will be described.
FIG. 1 is a schematic diagram showing an image processing system including the image processing apparatus according to the embodiment of the present invention. As shown in FIGS. 1(a) and 1(b), the image processing apparatus according to the present embodiment performs template matching, using a pre-registered template image 18, on an image (hereinafter referred to as "captured image 10") acquired by photographing a measurement target (hereinafter referred to as "workpiece 4") positioned as appropriate by a robot arm 8 or the like. Through this template matching, the image processing apparatus 100 determines the position on the captured image 10 that corresponds to the template image.
FIG. 2 is a block diagram showing the configuration of the image processing apparatus 100 according to the embodiment of the present invention when implemented on a personal computer.
In addition to the personal-computer implementation described above, the processing according to the present embodiment may be provided as a so-called cloud service in which at least one server device carries out the processing. In this case, a configuration is assumed in which a client device transmits the captured image to be processed and the necessary information to the server device (cloud side), and the server device executes the necessary processing on the transmitted captured image. Furthermore, the server device need not perform all of the required functions (processing); the client device and the server device may cooperate to realize the necessary processing.
Next, an example of the functional configuration of the image processing apparatus 100 according to the embodiment of the present invention will be described. FIG. 3 is a schematic diagram showing an example of this functional configuration. Referring to FIG. 3, the image processing apparatus 100 according to the present embodiment includes, as its functional configuration, an arrangement detection unit 30, an inclination correction unit 150, a scale correction unit 152, a template matching unit 158, and a result output unit 160. These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG. 2. Note that, as described above, the arrangement detection unit 30 includes the stereo camera (camera 1 and camera 2) or the rangefinder 3, but because part of its functionality is realized by the CPU 102, it is described here as part of the functional configuration of the image processing apparatus 100. Each functional configuration is described below.
[D. Arrangement Detection Unit]
As the arrangement detection unit 30, a method using a stereo camera (camera 1 and camera 2) or a method using the rangefinder 3 can typically be adopted.
FIG. 5 shows an example in which a stereo camera is adopted as the arrangement detection unit 30. The arrangement detection unit 30 acquires, by three-dimensional measurement using the stereo camera formed by camera 1 and camera 2, a 3D point cloud representing the subject within the measurement range. The arrangement detection unit 30 then extracts, from the acquired 3D point cloud, the data corresponding to the measurement target.
For example, consider the case shown in FIG. 5, where the measurement targets are piled in bulk. From the 3D point cloud representing the subject measured by the stereo camera, the arrangement detection unit 30 calculates, for each 3D point, a normal vector from its positional relationship to the surrounding 3D points; that is, a normal vector is computed for every 3D point. The arrangement detection unit 30 searches the point cloud for the 3D point closest to the camera and extracts the points around it, and calculates, for the closest point and each of its surrounding points, the angle between the point's normal vector and the distance-direction axis vector of the stereo camera. Furthermore, the arrangement detection unit 30 counts the number of 3D points whose angles can be regarded as identical; that is, the number of 3D points is counted for each group in which the angles between the normal vector and the distance-direction axis vector are the same.
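For illustration, the neighborhood-based normal estimation and angle grouping described above can be sketched as follows, assuming the stereo measurement yields an N×3 NumPy array of camera-frame points and that SciPy is available for the neighbor search (function names such as `estimate_normals` are illustrative, not part of the disclosure):

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=16):
    """Unit normal per 3D point from the covariance of its k nearest
    neighbours: the eigenvector with the smallest eigenvalue."""
    tree = cKDTree(points)
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)
        nbrs = points[idx] - points[idx].mean(axis=0)
        _, vecs = np.linalg.eigh(nbrs.T @ nbrs)
        n = vecs[:, 0]
        normals[i] = -n if n @ p > 0 else n  # orient toward the camera at the origin
    return normals

def count_by_angle(normals, axis=(0.0, 0.0, 1.0), bin_deg=2.0):
    """Group points whose normal-to-depth-axis angles fall in the same bin,
    mirroring the counting of 'identical' angles in the text."""
    ang = np.degrees(np.arccos(np.clip(np.abs(normals @ np.asarray(axis)), 0.0, 1.0)))
    bins = np.round(ang / bin_deg).astype(int)
    return dict(zip(*np.unique(bins, return_counts=True)))
```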
FIG. 6 is a schematic diagram showing another example of the arrangement state of the measurement target. Consider, for example, the case shown in FIG. 6, where the measurement target (workpiece 4) is gripped by the robot arm 8. In this case, the measurement target surface lies at a more or less fixed position (distance) relative to the camera, so from the 3D point cloud measured by the stereo camera, the 3D points lying within a predetermined distance of the camera (the measurement-target-surface extraction range in FIG. 6) can be extracted as the points corresponding to the measurement target surface. That is, data representing points in the image that lie within the predetermined distance from camera 1 are extracted. The arrangement detection unit 30 performs plane approximation on the extracted 3D points to determine an approximate plane of the measurement target surface, and further calculates, from the normal vector of this approximate plane, the tilt angle of the measurement target surface relative to the camera.
For example, consider the case where the relative position between the measurement target and the camera is roughly fixed. In this case, the region of the captured image 10 in which the subject corresponding to the template image is expected to appear is set in advance. The arrangement detection unit 30 extracts, from the 3D point cloud measured by the stereo camera, the points corresponding to the preset region, performs plane approximation on the extracted points to determine an approximate plane of the measurement target surface, and further calculates, from the normal vector of this approximate plane, the tilt angle of the measurement target surface relative to the camera. That is, the 3D points within the predetermined range of the captured image 10 are extracted, and the measurement target surface 5 of the workpiece 4 is approximated by a plane.
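A minimal sketch of this plane approximation and of the tilt angle taken from the plane normal, assuming the extracted points form an N×3 array and the camera's optical axis is the z-axis (the least-squares parameterization z = ax + by + c is one common choice, not necessarily the embodiment's):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c through Nx3 points;
    returns the unit normal and the offset c."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    (a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    n = np.array([a, b, -1.0])
    return n / np.linalg.norm(n), c

def tilt_angle(normal, optical_axis=(0.0, 0.0, 1.0)):
    """Tilt of the measurement surface relative to the camera axis (rad)."""
    cos = abs(float(np.dot(normal, optical_axis)))
    return float(np.arccos(np.clip(cos, 0.0, 1.0)))
```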
For example, consider the case where the measurement target is assumed to fill the entire field of view of the captured image 10. FIG. 7 is a schematic diagram showing an example of the data extraction process performed by the arrangement detection unit 30 according to the embodiment of the present invention.
FIG. 8 is a schematic diagram showing a configuration example using laser rangefinders as the arrangement detection unit. Instead of the stereo camera described above, laser rangefinders 3-1, 3-2, and 3-3 as shown in FIG. 8 may be used. Specifically, a plurality of (preferably three or more) laser rangefinders 3-1, 3-2, and 3-3 are arranged around camera 1, which serves as the imaging unit. The positions of the laser rangefinders 3-1, 3-2, and 3-3 relative to camera 1 are known, and they are arranged so that their ranging directions are parallel to the optical axis of camera 1. Using the ranging results of the laser rangefinders 3-1, 3-2, and 3-3 and the information on their relative positions, the arrangement detection unit 30 determines an approximate plane of the measurement target surface, and further calculates, from the normal vector of this approximate plane, the tilt angle of the measurement target surface relative to the camera. In the configuration shown in FIG. 8, for example, the measurement target surface 5 of the workpiece 4 is measured.
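With three rangefinders measuring parallel to the optical axis, the three measured points define the approximate plane directly; a sketch under that assumption (the 50 mm mounting offsets are illustrative values):

```python
import numpy as np

def plane_from_rangefinders(offsets_xy, distances):
    """Surface normal and tilt from three rangefinders mounted at known
    (x, y) offsets around the camera, all ranging along the optical axis."""
    pts = np.array([[x, y, d] for (x, y), d in zip(offsets_xy, distances)])
    n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
    n /= np.linalg.norm(n)
    tilt = np.arccos(np.clip(abs(n[2]), 0.0, 1.0))  # vs. the optical axis
    return n, tilt

# Rangefinders 50 mm around the lens, slightly unequal readings (mm):
n, tilt = plane_from_rangefinders(
    [(50, 0), (-25, 43), (-25, -43)], [500.0, 505.0, 498.0])
```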
Instead of the laser rangefinders shown in FIG. 8, a millimeter-wave radar rangefinder may be used. In this case as well, the millimeter-wave radar rangefinder is placed at a known position relative to camera 1, the imaging unit. By converting the ranging results of the millimeter-wave radar rangefinder using the information on this known relative position, the arrangement detection unit 30 can associate them with the pixel positions of the captured image 10 taken by the camera; based on this associated information, it determines an approximate plane of the measurement target surface and further calculates, from the normal vector of this approximate plane, the tilt angle of the measurement target surface relative to the camera.
Based on the tilt angle of the measurement target surface relative to the camera acquired by the arrangement detection unit 30, the inclination correction unit 150 corrects the captured image 10 and generates a corrected image equivalent to a state in which the measurement target squarely faces the camera.
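One way to realize such a correction is to warp the captured image with the homography K·R·K⁻¹, where R rotates the measured surface normal onto the optical axis; a hedged OpenCV sketch, assuming a pinhole intrinsic matrix K, a unit normal pointing toward the camera, and a chosen rotation-center pixel (sign conventions depend on the coordinate frame):

```python
import cv2
import numpy as np

def tilt_correct(image, normal, K, center):
    """Warp `image` so the plane with unit `normal` (camera frame) appears
    fronto-parallel, keeping the pixel `center` fixed."""
    z = np.array([0.0, 0.0, 1.0])
    axis = np.cross(normal, z)
    s = np.linalg.norm(axis)
    if s < 1e-9:                        # already fronto-parallel
        return image
    angle = np.arcsin(np.clip(s, -1.0, 1.0))
    R, _ = cv2.Rodrigues(axis / s * angle)
    H = K @ R @ np.linalg.inv(K)
    c = H @ np.array([center[0], center[1], 1.0])
    c /= c[2]
    T = np.array([[1.0, 0.0, center[0] - c[0]],   # re-anchor the rotation
                  [0.0, 1.0, center[1] - c[1]],   # centre onto itself
                  [0.0, 0.0, 1.0]])
    return cv2.warpPerspective(image, T @ H, image.shape[1::-1])
```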
《f1: Scale Deviation》
In the tilt correction described above, if the rotation center does not coincide with the position that is to be detected as corresponding to the template image 18, the region of the tilt-corrected image 14 corresponding to the template image 18 ends up enlarged or reduced. That is, a scale deviation arises between the template image 18 and the tilt-corrected image 14. The scale deviation amount calculation unit 154 calculates the scale deviation amount needed to correct this.
The scale deviation amount calculation unit 154 calculates the 3D position of the region on the measurement target surface corresponding to the template image 18 from the output of the arrangement detection unit 30 at the time the template image 18 was captured (the plane position and orientation of the measurement target used to generate the template image 18: the arrangement state 12 information in FIG. 2) and from the position, on the captured image 10, of the region set as the template image 18. The scale deviation amount calculation unit 154 then calculates the scale deviation amount from the distance (Dt) between camera 1 and the measurement target (workpiece 4) computed from this 3D position, the output of the arrangement detection unit 30 acquired when the workpiece 4 is photographed, and the position of the rotation center used in the tilt correction (for example, the position corresponding to the image center).
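Because apparent size on the image is inversely proportional to the object distance, the two distances translate directly into a resize factor; a trivial sketch (the function name is illustrative):

```python
def scale_factor_to_template(dist_template, dist_current):
    """Apparent size ~ 1/distance, so resizing the tilt-corrected image
    by D / Dt puts it on the template's scale."""
    return dist_current / dist_template

# Template registered at Dt = 500 mm, workpiece now at D = 450 mm:
# the workpiece looks larger, and a factor of 450/500 = 0.9 shrinks it back.
```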
As a modification, the scale deviation may be calculated by applying the Fourier-Mellin Invariant method to the tilt-corrected image 14 and the template image 18 (for details, see Q. Chen et al., "Symmetric Phase-Only Matched Filtering of Fourier-Mellin Transforms for Image Registration and Recognition" (Non-Patent Document 1)).
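A hedged sketch of the scale-only part of this approach, using OpenCV's log-polar warp and phase correlation: scaling becomes a horizontal shift in the log-polar map of the Fourier magnitude spectrum. Non-Patent Document 1 combines this with symmetric phase-only matched filtering and also recovers rotation; the shift's sign may need flipping depending on the OpenCV version:

```python
import cv2
import numpy as np

def fm_scale(img_a, img_b):
    """Scale ratio between two same-sized grayscale images via the
    Fourier-Mellin idea."""
    def log_mag(img):
        f = np.fft.fftshift(np.fft.fft2(img.astype(np.float32)))
        return np.log1p(np.abs(f)).astype(np.float32)
    h, w = img_a.shape
    center = (w / 2.0, h / 2.0)
    m = min(h, w) / 2.0
    flags = cv2.INTER_LINEAR | cv2.WARP_POLAR_LOG
    lp_a = cv2.warpPolar(log_mag(img_a), (w, h), center, m, flags)
    lp_b = cv2.warpPolar(log_mag(img_b), (w, h), center, m, flags)
    (dx, _), _ = cv2.phaseCorrelate(lp_a, lp_b)
    # dx columns of shift in log-polar space correspond to exp(dx*log(m)/w).
    return float(np.exp(dx * np.log(m) / w))
```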
As another modification, the scale deviation amount may be calculated by establishing correspondences between feature points of the tilt-corrected image 14 and the template image 18.
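For example, the scale can be taken as the median ratio of inter-point distances over matched feature pairs; a sketch using ORB features (the choice of detector is illustrative, not specified by the embodiment):

```python
import cv2
import numpy as np

def feature_scale(img_corrected, img_template):
    """Scale deviation from feature correspondences: median ratio of
    distances between consecutive matched point pairs."""
    orb = cv2.ORB_create(500)
    kp1, des1 = orb.detectAndCompute(img_corrected, None)
    kp2, des2 = orb.detectAndCompute(img_template, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]
    p1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    p2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    ratios = []
    for i in range(len(matches) - 1):
        d1 = np.linalg.norm(p1[i] - p1[i + 1])
        d2 = np.linalg.norm(p2[i] - p2[i + 1])
        if d2 > 1e-6:
            ratios.append(d1 / d2)
    return float(np.median(ratios))
```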
As yet another modification, a plurality of template images may be generated in advance by applying different magnification variations to the reference template image 18, and the scale deviation amount calculation unit 154 performs template matching between these template images and the tilt-corrected image 14. For this template matching, the SAD (Sum of Absolute Differences) method, for example, can be used. The scale deviation amount calculation unit 154 then determines, as the scale deviation amount, the magnification corresponding to the template image that yields the smallest SAD value among the plurality of template images.
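A sketch of this multi-template search follows. OpenCV's matchTemplate has no SAD mode, so the sketch substitutes TM_SQDIFF (sum of squared differences), which ranks candidate scales and positions for the same purpose; the image file names are placeholders:

```python
import cv2
import numpy as np

def best_scale(image, template, scales):
    """Pre-scale the template over a range of magnifications and keep the
    scale whose best match position scores lowest."""
    best = (None, np.inf, None)                 # (scale, score, position)
    for s in scales:
        t = cv2.resize(template, None, fx=s, fy=s,
                       interpolation=cv2.INTER_LINEAR)
        if t.shape[0] > image.shape[0] or t.shape[1] > image.shape[1]:
            continue
        res = cv2.matchTemplate(image, t, cv2.TM_SQDIFF)
        minv, _, minloc, _ = cv2.minMaxLoc(res)
        if minv < best[1]:
            best = (s, minv, minloc)
    return best

scale, score, pos = best_scale(cv2.imread("scene.png", 0),
                               cv2.imread("template.png", 0),
                               np.linspace(0.8, 1.2, 9))
```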
Based on the scale deviation amount calculated by the scale deviation amount calculation unit 154, the scale correction image generation unit 156 generates the scale-corrected image 16 from the tilt-corrected image 14 so that its scale matches that of the template image 18.
The template matching unit 158 performs template matching, using the template image 18, on the scale-corrected image 16 generated by the scale correction image generation unit 156 (scale correction unit 152), and detects the position on the scale-corrected image 16 corresponding to the template image 18.
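A minimal sketch of this matching step, using normalized cross-correlation as one common similarity measure (the embodiment does not fix a particular measure for this unit):

```python
import cv2

def match_template(scale_corrected, template):
    """Top-left corner of the best template position on the
    scale-corrected image, plus its correlation score."""
    res = cv2.matchTemplate(scale_corrected, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)
    return top_left, score
```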
Since the position detected by the template matching unit 158 is a position on the scale-corrected image 16, the result output unit 160 calculates and outputs the corresponding position on the captured image 10. That is, the result output unit 160 calculates the position on the pre-correction image (captured image 10) corresponding to the position on the scale-corrected image 16. In the present embodiment, template matching is performed after correcting the captured image 10, so the template-corresponding position on the captured image 10 cannot be identified without converting back to the position on the pre-correction image.
The pre-correction position calculation unit 162 converts the position detection result on the scale-corrected image 16 obtained by the template matching unit 158 into a position on the pre-correction image. More specifically, the pre-correction position calculation unit 162 inversely applies the scale correction applied by the scale correction image generation unit 156 and the homography applied by the inclination correction unit 150, converting the result into a position on the captured image 10 that was input to the inclination correction unit 150.
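Assuming the tilt correction applied a single homography H (as in the warp sketched earlier) and the scale correction a uniform factor s, the conversion back is the two inverses in reverse order:

```python
import numpy as np

def to_original_coords(pos, s, H):
    """Map a detection on the scale-corrected image back onto the
    captured image: undo the scale first, then the homography."""
    x, y = pos[0] / s, pos[1] / s
    p = np.linalg.inv(H) @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```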
Based on the position on the captured image 10 (the position on the pre-correction image) obtained by the pre-correction position calculation unit 162 and the ranging result acquired by the arrangement detection unit 30, the three-dimensional position calculation unit 164 calculates the position in 3D space corresponding to the position on the captured image 10. For example, the three-dimensional position calculation unit 164 adopts the stereo camera measurement result corresponding to the position on the captured image 10 to determine the position in 3D space.
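Given the matched pixel and the ranged depth at that pixel, back-projection through the pinhole model yields the 3D position; a sketch assuming the intrinsic matrix K of camera 1:

```python
import numpy as np

def pixel_to_3d(u, v, depth, K):
    """Back-project a pixel with known depth (e.g. the stereo measurement
    at that pixel) into camera coordinates."""
    x = (u - K[0, 2]) * depth / K[0, 0]
    y = (v - K[1, 2]) * depth / K[1, 1]
    return np.array([x, y, depth])
```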
《j1: Modification 1》
FIG. 13 is a schematic diagram showing Modification 1 of the functional configuration of the image processing apparatus 100 according to the embodiment of the present invention. Referring to FIG. 13, the image processing apparatus 100 according to the present modification includes, as its functional configuration, an arrangement detection unit 30, an inclination correction unit 150, a scale correction unit 152A, a template matching unit 158, and a result output unit 160A. These functional configurations are typically realized by the CPU 102 executing the template matching processing program 112 in the image processing apparatus 100 shown in FIG. 2.
When neither the captured image 10 nor the template image 18 was photographed squarely facing camera 1, the following configuration may be adopted: performing tilt correction and scale deviation correction on both the captured image 10 and the template image 18 before template matching improves the matching accuracy.
As described with reference to FIG. 10, in the tilt correction process it is preferable to make the rotation center coincide with the template position. If the rotation center of the tilt correction is offset from the template position, enlargement or reduction occurs at the template position on the generated tilt-corrected image 14, raising the concern of a loss of information. Therefore, after template matching, the tilt correction and the template matching may be executed again, using the template position obtained by the first matching as the rotation center, and the position calculation result may be output based on this second result. With this configuration, the tilt correction can be executed with the template position as the rotation center, avoiding the loss of information caused by tilt correction and yielding a more accurate template matching result.
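Combining the illustrative helpers sketched above, the two-pass flow of this modification might look as follows (a sketch of the control flow, not the embodiment's exact processing order):

```python
import cv2

def two_pass_match(captured, template, normal, K, dist_t, dist_c):
    """Coarse pass rotating about the image centre, then a refined pass
    whose tilt correction rotates about the coarsely detected template
    position; tilt_correct, scale_factor_to_template and match_template
    are the sketches defined earlier."""
    h, w = captured.shape[:2]
    s = scale_factor_to_template(dist_t, dist_c)

    def one_pass(center):
        corrected = tilt_correct(captured, normal, K, center)
        scaled = cv2.resize(corrected, None, fx=s, fy=s)
        pos, score = match_template(scaled, template)
        # tilt_correct keeps `center` fixed, so near the match the warp is
        # locally close to identity and undoing the uniform scale suffices
        # for a rotation-centre estimate.
        return (pos[0] / s, pos[1] / s), score

    coarse, _ = one_pass((w / 2.0, h / 2.0))    # pass 1: centre as pivot
    refined, score = one_pass(coarse)           # pass 2: template as pivot
    return refined, score
```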
Claims (19)
- An image processing system comprising:
an imaging device that photographs a measurement target and acquires a captured image;
an arrangement detection unit that detects an arrangement state of the measurement target and determines a tilt angle of the measurement target relative to the imaging device;
a storage unit that stores a template image;
an inclination correction unit that generates a tilt-corrected image by correcting the captured image based on the tilt angle determined by the arrangement detection unit;
a scale correction unit that calculates a scale deviation amount between the tilt-corrected image and the template image and generates a scale-corrected image by correcting the tilt-corrected image based on the calculated scale deviation amount; and
a position search unit that determines a position on the captured image corresponding to the template image by performing template matching on the scale-corrected image using the template image.
- The image processing system according to claim 1, wherein the arrangement detection unit includes a ranging unit that measures distances to a plurality of points on the surface of the measurement target, and estimates the measurement surface of the measurement target based on the ranging results for the plurality of points.
- The image processing system according to claim 1, wherein the arrangement detection unit outputs the distance between the imaging device and the measurement target.
- The image processing system according to any one of claims 1 to 3, wherein the scale correction unit calculates the scale deviation amount based on information on the position corresponding to the rotation center used when the inclination correction unit generates the tilt-corrected image.
- The image processing system according to any one of claims 1 to 3, wherein the scale correction unit calculates the scale deviation amount using the Fourier-Mellin invariant.
- The image processing system according to any one of claims 1 to 3, wherein the scale correction unit extracts feature points from each of the tilt-corrected image and the template image and calculates the scale deviation amount based on the ratio of distances between the extracted feature points.
- The image processing system according to any one of claims 1 to 6, further comprising a template generation unit that generates the template image from the captured image, wherein the template generation unit stores, in the storage unit, the arrangement state of the measurement target at the time the captured image corresponding to the generated template image was taken, in association with the generated template image.
- The image processing system according to any one of claims 1 to 7, further comprising a template correction unit that corrects the template image.
- The image processing system according to any one of claims 1 to 8, wherein the arrangement detection unit includes a stereo camera.
- The image processing system according to claim 9, wherein one camera of the stereo camera is constituted by the imaging device.
- The image processing system according to any one of claims 1 to 8, wherein the arrangement detection unit includes an optical rangefinder whose position relative to the imaging device is fixed.
- The image processing system according to any one of claims 1 to 11, wherein the arrangement detection unit extracts, as position information corresponding to the surface of the measurement target, data indicating one of: the point closest to the imaging device, points lying within a predetermined distance of the imaging device, or points lying within a predetermined range of the measurement region.
- The image processing system according to any one of claims 1 to 11, wherein the arrangement detection unit estimates, as position information corresponding to the surface of the measurement target, the surface shape of the measurement target using all of the three-dimensional data in the measurement region, and extracts the three-dimensional data lying within a predetermined distance of the estimated surface shape.
- The image processing system according to any one of claims 1 to 13, wherein the position search unit includes a result output unit that calculates the position on the captured image corresponding to the position on the scale-corrected image identified by the template matching using the template image.
- The image processing system according to claim 14, wherein the result output unit calculates the three-dimensional position of the measurement target based on the calculation result of the position on the captured image and the arrangement state detected by the arrangement detection unit.
- The image processing system according to any one of claims 1 to 15, wherein:
as preprocessing, the position search unit determines the position on the captured image corresponding to the template image by performing template matching, using the template image, on a first scale-corrected image generated by the inclination correction unit and the scale correction unit;
the scale correction unit generates a second scale-corrected image by performing scale correction on a tilt-corrected image that the inclination correction unit generated by correcting the captured image around the position, determined in the preprocessing, corresponding to the template image on the captured image; and
as main processing, the position search unit determines the position on the captured image corresponding to the template image by performing template matching on the second scale-corrected image using the template image.
- An image processing apparatus used in an image processing system that includes an imaging device that photographs a measurement target and acquires a captured image, and an arrangement detection unit that detects an arrangement state of the measurement target and determines a tilt angle of the measurement target relative to the imaging device, the image processing apparatus comprising:
a storage unit that stores a template image;
an inclination correction unit that generates a tilt-corrected image by correcting the captured image based on the tilt angle determined by the arrangement detection unit;
a scale correction unit that calculates a scale deviation amount between the tilt-corrected image and the template image and generates a scale-corrected image by correcting the tilt-corrected image based on the calculated scale deviation amount; and
a position search unit that determines a position on the captured image corresponding to the template image by performing template matching on the scale-corrected image using the template image.
- An image processing method for performing template matching using a pre-registered template image, comprising the steps of:
acquiring a captured image of a measurement target photographed by an imaging device;
detecting an arrangement state of the measurement target and determining a tilt angle of the measurement target relative to the imaging device;
generating a tilt-corrected image by correcting the captured image based on the tilt angle;
calculating a scale deviation amount between the tilt-corrected image and the template image and generating a scale-corrected image by correcting the tilt-corrected image based on the calculated scale deviation amount; and
determining a position on the captured image corresponding to the template image by performing template matching on the scale-corrected image using the template image.
- An image processing program for performing template matching using a pre-registered template image, the program causing a computer to execute the steps of:
acquiring a captured image of a measurement target photographed by an imaging device;
acquiring a tilt angle of the measurement target relative to the imaging device determined by an arrangement detection unit;
generating a tilt-corrected image by correcting the captured image based on the tilt angle;
calculating a scale deviation amount between the tilt-corrected image and the template image and generating a scale-corrected image by correcting the tilt-corrected image based on the calculated scale deviation amount; and
determining a position on the captured image corresponding to the template image by performing template matching on the scale-corrected image using the template image.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP13846554.7A EP2911116A4 (en) | 2012-10-18 | 2013-09-10 | PICTURE PROCESSING DEVICE, PICTURE PROCESSING METHOD AND PICTURE PROCESSING PROGRAM |
| US14/435,667 US20150262346A1 (en) | 2012-10-18 | 2013-09-10 | Image processing apparatus, image processing method, and image processing program |
| JP2014541995A JPWO2014061372A1 (ja) | 2012-10-18 | 2013-09-10 | 画像処理装置、画像処理方法および画像処理プログラム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012230872 | 2012-10-18 | ||
| JP2012-230872 | 2012-10-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014061372A1 true WO2014061372A1 (ja) | 2014-04-24 |
Family
ID=50487949
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/074316 Ceased WO2014061372A1 (ja) | 2012-10-18 | 2013-09-10 | 画像処理装置、画像処理方法および画像処理プログラム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20150262346A1 (ja) |
| EP (1) | EP2911116A4 (ja) |
| JP (1) | JPWO2014061372A1 (ja) |
| WO (1) | WO2014061372A1 (ja) |
Families Citing this family (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5924020B2 (ja) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
| EP3118812B1 (en) * | 2014-03-14 | 2025-05-14 | Omron Corporation | Image processing device, image sensor, and image processing method |
| CN106695880B (zh) * | 2015-11-13 | 2019-09-17 | 联合汽车电子有限公司 | 机器人的设备零位的误差校正装置及其设备零位校正方法 |
| JP6586239B2 (ja) * | 2016-08-29 | 2019-10-02 | 株式会社日立製作所 | 撮影装置及び撮影方法 |
| WO2018163572A1 (ja) * | 2017-03-10 | 2018-09-13 | 富士フイルム株式会社 | 画像処理システム、画像処理装置、画像処理方法及び画像処理プログラム |
| CN108965690B (zh) * | 2017-05-17 | 2021-02-26 | 欧姆龙株式会社 | 图像处理系统、图像处理装置及计算机可读存储介质 |
| JP2019015575A (ja) * | 2017-07-05 | 2019-01-31 | 株式会社東芝 | 画像処理装置、測距装置および処理システム |
| US20190188513A1 (en) * | 2017-12-20 | 2019-06-20 | Datalogic Usa Inc. | Systems and methods for object deskewing using stereovision or structured light |
| CN108280846B (zh) * | 2018-01-16 | 2020-12-29 | 中国科学院福建物质结构研究所 | 基于几何图形匹配的目标跟踪修正方法及其装置 |
| JP6693981B2 (ja) | 2018-02-19 | 2020-05-13 | ファナック株式会社 | ロボットの動作をシミュレーションするシミュレーション装置 |
| CN108600609A (zh) * | 2018-03-19 | 2018-09-28 | 维沃移动通信有限公司 | 辅助拍照的方法及装置 |
| US10558944B1 (en) * | 2018-08-27 | 2020-02-11 | Invia Robotics, Inc. | Inventory verification device |
| CN112334732B (zh) * | 2018-10-12 | 2023-06-09 | 松下知识产权经营株式会社 | 预测装置及预测方法 |
| JP7337495B2 (ja) * | 2018-11-26 | 2023-09-04 | キヤノン株式会社 | 画像処理装置およびその制御方法、プログラム |
| DE102019206977B4 (de) * | 2019-05-14 | 2021-06-02 | Carl Zeiss Industrielle Messtechnik Gmbh | Vorrichtung und Verfahren zur Vermessung mindestens eines schräg im Raum stehenden Messobjekts |
| US12174445B2 (en) * | 2020-02-26 | 2024-12-24 | Pfa Corporation | Camera module manufacturing device |
| CN111531544B (zh) * | 2020-05-13 | 2023-05-23 | 深圳赛动生物自动化有限公司 | 基于图像几何匹配的机器手控制系统及其控制方法 |
| CN112184713A (zh) * | 2020-11-06 | 2021-01-05 | 上海柏楚电子科技股份有限公司 | 切割含焊缝管材的控制方法、装置、切割系统、设备与介质 |
| CN117140539B (zh) * | 2023-11-01 | 2024-01-23 | 成都交大光芒科技股份有限公司 | 基于空间坐标变换矩阵的机器人三维协同巡检方法 |
| CN117726623B (zh) * | 2024-02-07 | 2024-05-24 | 深圳新视智科技术有限公司 | 二叉树线路检测方法、装置及计算机设备 |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6711293B1 (en) * | 1999-03-08 | 2004-03-23 | The University Of British Columbia | Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image |
| EP1766552A2 (en) * | 2004-06-23 | 2007-03-28 | Strider Labs, Inc. | System and method for 3d object recognition using range and intensity |
| JP5375488B2 (ja) * | 2009-09-29 | 2013-12-25 | 富士通株式会社 | 外観検査装置,外観検査方法および外観検査プログラム |
| JP5589423B2 (ja) * | 2010-02-15 | 2014-09-17 | 株式会社リコー | 透明平板検出システム |
| JP5317250B2 (ja) * | 2010-08-31 | 2013-10-16 | 国立大学法人 熊本大学 | 画像処理方法および画像処理装置 |
- 2013
- 2013-09-10 US US14/435,667 patent/US20150262346A1/en not_active Abandoned
- 2013-09-10 JP JP2014541995A patent/JPWO2014061372A1/ja active Pending
- 2013-09-10 EP EP13846554.7A patent/EP2911116A4/en not_active Withdrawn
- 2013-09-10 WO PCT/JP2013/074316 patent/WO2014061372A1/ja not_active Ceased
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10124676A (ja) * | 1996-10-24 | 1998-05-15 | Fujitsu Ltd | 形状追跡方法および装置 |
| JP2003216955A (ja) * | 2002-01-23 | 2003-07-31 | Sharp Corp | ジェスチャ認識方法、ジェスチャ認識装置、対話装置及びジェスチャ認識プログラムを記録した記録媒体 |
| JP2004295223A (ja) | 2003-03-25 | 2004-10-21 | Fanuc Ltd | 画像処理装置及びロボットシステム |
| JP2007309808A (ja) | 2006-05-19 | 2007-11-29 | Juki Corp | 対象物の位置検出方法及び装置 |
| JP2008232805A (ja) * | 2007-03-20 | 2008-10-02 | Kyushu Institute Of Technology | 物体検出方法 |
| WO2009028489A1 (ja) * | 2007-08-30 | 2009-03-05 | Kabushiki Kaisha Yaskawa Denki | 物体検出方法と物体検出装置およびロボットシステム |
| JP2009128261A (ja) | 2007-11-27 | 2009-06-11 | Takashima Giken Kk | 外観検査方法および装置 |
| JP2012059030A (ja) * | 2010-09-09 | 2012-03-22 | Optex Co Ltd | 距離画像カメラを用いた人体識別方法および人体識別装置 |
Non-Patent Citations (2)
| Title |
|---|
| Q.CHEN; M.DEFRISE; F. DECONINCK: "Symmetric Phase-Only Matched Filtering of Fourier-Mellin Transforms for Image Registration and Recognition", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 16, no. 12, December 1994 (1994-12-01), XP000486818, DOI: doi:10.1109/34.387491 |
| See also references of EP2911116A4 |
Cited By (25)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016083972A1 (en) * | 2014-11-25 | 2016-06-02 | Quartesan Diego | Robotic system comprising a telemetric device with a laser measuring device and a passive video camera |
| JP2017083523A (ja) * | 2015-10-23 | 2017-05-18 | 国立大学法人帯広畜産大学 | 撮影装置及び撮影方法、並びに枝肉の肉質評価方法 |
| JP2018124281A (ja) * | 2017-02-01 | 2018-08-09 | カール・ツアイス・インダストリーエレ・メステクニク・ゲーエムベーハー | 3d記録の露光時間を判断する方法 |
| JP7077031B2 (ja) | 2017-02-01 | 2022-05-30 | カール・ツアイス・インダストリーエレ・メステクニク・ゲーエムベーハー | 3d記録の露光時間を判断する方法 |
| JP2018153899A (ja) * | 2017-03-21 | 2018-10-04 | 株式会社リコー | 部品供給システム |
| JP2018194542A (ja) * | 2017-05-17 | 2018-12-06 | オムロン株式会社 | 画像処理システム、画像処理装置および画像処理プログラム |
| WO2020250761A1 (ja) * | 2019-06-12 | 2020-12-17 | 株式会社アマダ | ワーク検出装置及びワーク検出方法 |
| JP2020199612A (ja) * | 2019-06-12 | 2020-12-17 | 株式会社アマダ | ワーク検出装置及びワーク検出方法 |
| JP2021033712A (ja) * | 2019-08-26 | 2021-03-01 | 川崎重工業株式会社 | 画像処理装置、撮像装置、ロボット及びロボットシステム |
| WO2021039775A1 (ja) * | 2019-08-26 | 2021-03-04 | 川崎重工業株式会社 | 画像処理装置、撮像装置、ロボット及びロボットシステム |
| US12450766B2 (en) | 2019-08-26 | 2025-10-21 | Kawasaki Jukogyo Kabushiki Kaisha | Image processor, imaging device, robot and robot system |
| JP7453762B2 (ja) | 2019-08-26 | 2024-03-21 | 川崎重工業株式会社 | 画像処理装置、撮像装置、ロボット及びロボットシステム |
| JP2021051548A (ja) * | 2019-09-25 | 2021-04-01 | 東芝テック株式会社 | 物品認識装置 |
| JP7337628B2 (ja) | 2019-09-25 | 2023-09-04 | 東芝テック株式会社 | 物品認識装置 |
| JP2021163151A (ja) * | 2020-03-31 | 2021-10-11 | パイオニア株式会社 | 情報処理装置、制御方法、プログラム及び記憶媒体 |
| JP2022048686A (ja) * | 2020-09-15 | 2022-03-28 | 日立建機株式会社 | 画像処理装置 |
| JP7579095B2 (ja) | 2020-09-15 | 2024-11-07 | 日立建機株式会社 | 画像処理装置 |
| JP2022125966A (ja) * | 2021-02-17 | 2022-08-29 | 株式会社デンソー | 測距補正装置、測距補正方法、測距補正プログラム、および測距装置 |
| JP7375838B2 (ja) | 2021-02-17 | 2023-11-08 | 株式会社デンソー | 測距補正装置、測距補正方法、測距補正プログラム、および測距装置 |
| WO2022176679A1 (ja) * | 2021-02-17 | 2022-08-25 | 株式会社デンソー | 測距補正装置、測距補正方法、測距補正プログラム、および測距装置 |
| CN113752260B (zh) * | 2021-09-07 | 2023-12-26 | 京东方科技集团股份有限公司 | 一种取料定位修正方法及装置 |
| CN113752260A (zh) * | 2021-09-07 | 2021-12-07 | 京东方科技集团股份有限公司 | 一种取料定位修正方法及装置 |
| CN117274102A (zh) * | 2023-10-08 | 2023-12-22 | 陕西科技大学 | 一种用于铸管表面三维点云数据的滤波方法 |
| CN117372505A (zh) * | 2023-10-08 | 2024-01-09 | 九江精博精密科技有限公司 | 神经网络与光学干涉对心校准方法及系统、存储介质 |
| CN117274102B (zh) * | 2023-10-08 | 2025-10-10 | 陕西科技大学 | 一种用于铸管表面三维点云数据的滤波方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20150262346A1 (en) | 2015-09-17 |
| EP2911116A1 (en) | 2015-08-26 |
| JPWO2014061372A1 (ja) | 2016-09-05 |
| EP2911116A4 (en) | 2016-09-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014061372A1 (ja) | 画像処理装置、画像処理方法および画像処理プログラム | |
| Ahmadabadian et al. | A comparison of dense matching algorithms for scaled surface reconstruction using stereo camera rigs | |
| JP5580164B2 (ja) | 光学情報処理装置、光学情報処理方法、光学情報処理システム、光学情報処理プログラム | |
| CN102472609B (zh) | 位置和姿势校准方法及设备 | |
| US20240394899A1 (en) | Method, Apparatus and Device for Photogrammetry, and Storage Medium | |
| CN102278946A (zh) | 摄像装置以及长度测量方法 | |
| WO2017153793A1 (en) | Methods and computer program products for calibrating stereo imaging systems by using a planar mirror | |
| JP6172432B2 (ja) | 被写体識別装置、被写体識別方法および被写体識別プログラム | |
| KR100951309B1 (ko) | 광학식 모션 캡처 장비를 위한 다중 카메라 보정 방법 | |
| WO2014084181A1 (ja) | 画像計測装置 | |
| JP6086491B2 (ja) | 画像処理装置およびそのデータベース構築装置 | |
| CN108830797A (zh) | 一种基于仿射投影矩阵模型的直线匹配方法 | |
| KR101673144B1 (ko) | 부분 선형화 기반의 3차원 영상 정합 방법 | |
| JP2008224323A (ja) | ステレオ写真計測装置、ステレオ写真計測方法及びステレオ写真計測用プログラム | |
| JP2006113832A (ja) | ステレオ画像処理装置およびプログラム | |
| JP4701848B2 (ja) | 画像マッチング装置、画像マッチング方法および画像マッチング用プログラム | |
| KR101705330B1 (ko) | 스테레오 카메라 이미지에서 물체의 기울어진 각도를 찾기 위한 특징점 선택 방법 | |
| Georgiev et al. | A fast and accurate re-calibration technique for misaligned stereo cameras | |
| KR20150119770A (ko) | 카메라를 사용한 3차원 좌표 측정 장치 및 방법 | |
| JP6080424B2 (ja) | 対応点探索装置、そのプログラムおよびカメラパラメータ推定装置 | |
| WO2008032375A1 (fr) | Dispositif et procédé de correction d'image, et programme informatique | |
| CN117670990A (zh) | 三维相机的定位方法、装置、电子设备及存储介质 | |
| JP2010041416A (ja) | 画像処理装置、画像処理方法、画像処理プログラム、及び、撮像装置 | |
| JP6168601B2 (ja) | 画像変換装置 | |
| CN113330275B (zh) | 相机信息计算装置、系统、相机信息计算方法及记录介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13846554; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2014541995; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 14435667; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2013846554; Country of ref document: EP |