WO2005008174A1 - Three-dimensional shape detection device, imaging device, and three-dimensional shape detection method - Google Patents
Three-dimensional shape detection device, imaging device, and three-dimensional shape detection method
- Publication number
- WO2005008174A1 (PCT/JP2004/010298; application JP2004010298W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- pattern
- dimensional shape
- image
- slit light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00681—Detecting the presence, position or size of a sheet or correcting its position before scanning
- H04N1/00684—Object of the detection
- H04N1/00726—Other properties of the sheet, e.g. curvature or reflectivity
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00031—Testing, i.e. determining the result of a trial
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00045—Methods therefor using a reference pattern designed for the purpose, e.g. a test chart
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00092—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for relating to the original or to the reproducing medium, e.g. imperfections or dirt
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/19—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
- H04N1/195—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a two-dimensional array or a combination of two-dimensional arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/19—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
- H04N1/195—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a two-dimensional array or a combination of two-dimensional arrays
- H04N1/19594—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a two-dimensional array or a combination of two-dimensional arrays using a television camera or a still video camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04753—Control or error compensation of scanning position or velocity
- H04N2201/04758—Control or error compensation of scanning position or velocity by controlling the position of the scanned image area
- H04N2201/04787—Control or error compensation of scanning position or velocity by controlling the position of the scanned image area by changing or controlling the addresses or values of pixels, e.g. in an array, in a memory, by interpolation
- 3D shape detection device, imaging device, and 3D shape detection method
- The present invention relates to a three-dimensional shape detection device that detects the three-dimensional shape of a target object using a light beam, an imaging device using the three-dimensional shape detection device, and a three-dimensional shape detection method.
- As a three-dimensional shape detection device that projects slit light onto a target object, captures an image of the target object on which the slit light is projected with an imaging device, and detects the three-dimensional shape of the target object based on the image data captured by the imaging device, the device described in JP-A-7-318315 is known.
- That three-dimensional shape detection device is configured to convert a light beam from one light source into slit light, split the slit light into two rows with a half mirror, and project the split light onto the target object.
- The device images the positions at which the two rows of slit light are reflected on the target object (hereinafter referred to as the loci of the slit light), and determines the position of the slit light on the target object from the position of each pixel corresponding to the loci of the slit light in the captured image.
- When the target object is a sheet, the device infers the three-dimensional shape of the entire target object from these positions, and the three-dimensional shape of the target object is thereby detected.
- The three-dimensional shape detection device described above has a configuration in which one slit light generated by the light source unit is divided into two by a half mirror. In this case, two slit lights with the same spread angle are emitted toward the target object, and two slit lights of substantially the same length are projected on the target object.
- Such a configuration has the following disadvantages.
- a single point light source having a limited total output power is used for the light source unit.
- When the light is split in two by the half mirror, the power of each slit light is halved. With this halved power, the brightness of the slit-light locus required for accurate reading may not be obtained.
- The light source of the light beam used in the above-described three-dimensional shape detection device is a small one, for example a laser diode whose rated total output power is about 1 mW.
- The power per unit angular width of the resulting slit light is about 21 µW/degree.
- After splitting, the power of each slit light is about 10 µW/degree, which is half of that.
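- As a rough check of these figures (an illustration added here, assuming a slit spread angle of about 48 degrees as in the embodiments described later):

$$
\frac{1\ \mathrm{mW}}{48^{\circ}} \approx 21\ \mu\mathrm{W}/{}^{\circ},
\qquad
\frac{0.5\ \mathrm{mW}}{48^{\circ}} \approx 10\ \mu\mathrm{W}/{}^{\circ}.
$$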
- the present invention has been made in view of these problems.
- An object of the present invention is to provide a three-dimensional shape detection device, an imaging device, and a three-dimensional shape detection method capable of reliably discriminating the trajectory of the pattern light on the target object without increasing the total emission power of the light beam converted into the pattern light.
- To achieve this object, the three-dimensional shape detection device of the present invention includes pattern light projecting means for projecting a plurality of pattern lights including two pattern lights having different angular widths, imaging means for capturing an image of the target object, on which the plurality of pattern lights are projected, from a position at a fixed distance from the pattern light projecting means, and means for calculating the positions of the pattern lights projected on the target object based on the image captured by the imaging means and thereby obtaining the three-dimensional shape of the target object.
- With this configuration, the power per angular width of each of the two pattern lights having different angular widths can be made the same as when a single row of slit light is used. Therefore, even when two rows of pattern light are used, it is possible to reliably discriminate the trajectory of the pattern light without increasing the total output power, unlike the conventional three-dimensional shape detection device.
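- The following is a short illustration of this point (added here, not part of the original text): if the total power P is divided between the two pattern lights in proportion to their angular widths θ1 and θ2, the power per angular width of each remains that of a single undivided row:

$$
P_i = P\,\frac{\theta_i}{\theta_1+\theta_2}
\;\Longrightarrow\;
\frac{P_i}{\theta_i} = \frac{P}{\theta_1+\theta_2}
\qquad (i = 1, 2),
$$

which is exactly the power per angular width of a single row of slit light with spread angle θ1 + θ2.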
- The present invention also provides an imaging device having the following configuration.
- The imaging device includes imaging means for imaging a predetermined surface of a target object from an arbitrary direction, storage means for storing the image captured by the imaging means as image data, three-dimensional shape obtaining means for obtaining the three-dimensional shape of the target object, and image correction means for correcting, based on the three-dimensional shape obtained by the three-dimensional shape obtaining means, the image data stored in the storage means into plane image data as observed from a direction substantially perpendicular to the predetermined surface of the target object.
- the three-dimensional shape obtaining means includes pattern light projecting means for projecting a plurality of pattern lights including two pattern lights having different angular widths.
- the imaging unit captures an image of the target object on which the plurality of pattern lights are projected from a position at a fixed distance from the pattern light projection unit.
- The three-dimensional shape obtaining means further includes three-dimensional shape calculation means for calculating the positions of the plurality of pattern lights projected on the target object based on the image captured by the imaging means and obtaining the three-dimensional shape of the target object.
- The present invention also provides a three-dimensional shape detection method for detecting the three-dimensional shape of a target object. The method outputs a light beam, converts the light beam into pattern light, which is a light beam radiated in a plane with a predetermined angular width, and emits the pattern light so as to project it onto the target object. An image of the target object on which the pattern light is projected is captured from a position at a fixed distance from the emission point of the pattern light, the position of the pattern light projected on the target object is calculated based on the captured image, and the three-dimensional shape of the target object is calculated.
- The emitted pattern light includes first and second pattern lights having different angular widths.
- FIG. 1 (a) shows an overall perspective view of an imaging device according to a first embodiment of the present invention
- FIG. 1 (b) shows a schematic sectional view of the imaging device.
- FIG. 2 is a block diagram illustrating an entire configuration of the imaging device in FIG. 1.
- FIG. 3 is a diagram showing a configuration of a slit light projecting unit in the imaging device in FIG. 1.
- FIG. 4 (a) shows a cross-sectional shape of a rod lens and a reflecting mirror
- FIG. 4 (b) shows a perspective view of a reflecting mirror
- FIG. 5 is a flowchart showing a process in a processor of the imaging device in FIG. 1.
- FIG. 6 (a) is a diagram for explaining an image with slit light by the imaging device of FIG. 1, and FIG. 6 (b) shows an example in which three slit lights are projected on a document.
- FIG. 6 (c) is a diagram showing an example in which dotted slit light is projected on the document.
- FIG. 7 (a) is a diagram for explaining a three-dimensional space position calculation method and also shows a YZ plane
- FIG. 7 (b) is a diagram for explaining the three-dimensional space position calculation method and also shows the XZ plane.
- FIG. 8 (a) shows the coordinate system used in the document posture calculation and shows a document in a tilted state
- FIG. 8 (b) shows the same coordinate system with the document made parallel to the XY plane
- FIG. 8 (c) shows the same coordinate system with the document in a curved state.
- FIG. 9 is a flowchart illustrating a process performed by a plane conversion program according to the first embodiment.
- FIG. 10 is a diagram showing a modification of the slit light projecting unit.
- FIG. 11 (a) is a diagram showing a configuration in which slit light is generated using a one-dimensional minute displacement mirror array
- FIG. 11 (b) is a diagram showing the slit light projected onto a document by the one-dimensional minute displacement mirror array.
- FIG. 12 (a) is a diagram for explaining a state of imaging when smear occurs
- FIG. 12 (b) is a diagram showing a state of smear occurring on a document.
- FIG. 13 (a) is a perspective view of an entire imaging device according to a second embodiment of the present invention
- FIG. 13 (b) is a schematic sectional view of the imaging device.
- FIG. 14 is a block diagram showing the entire configuration of the imaging device in FIG.
- FIG. 15 is a diagram illustrating a configuration of a slit light projecting unit of the imaging device in FIG. 13.
- FIG. 16 (a) is a diagram for explaining the angular width of slit light generated by a rod lens
- FIG. 16 (b) is a diagram for explaining the angular width of slit light generated by using a cylindrical lens.
- FIG. 17 is a flowchart illustrating a process in the processor of the imaging device in FIG. 13.
- FIG. 18 (a) is a diagram for explaining an image with slit light captured by the imaging device of FIG. 13, and FIG. 18 (b) shows an example in which three slit lights are projected on a document.
- FIG. 19 (a) is a diagram for explaining a three-dimensional space position calculation method and also shows a YZ plane
- FIG. 19 (b) is a diagram for explaining the three-dimensional space position calculation method and also shows the XZ plane.
- FIG. 20 (a) shows the coordinate system used when calculating the orientation of the original and shows the original in a tilted state
- FIG. 20 (b) shows the same coordinate system with the original made parallel to the XY plane
- FIG. 20 (c) shows the same coordinate system with the original in a curved state.
- FIG. 21 is a flowchart showing processing by the plane conversion program according to the second embodiment.
- FIG. 22 (a) is a diagram for explaining the method of identifying the peak of the locus of the slit light according to the second embodiment
- FIG. 22 (b) is a comparative example of FIG. 22 (a), showing a case where the peak of the slit light differs.
- FIG. 23 is a diagram illustrating a modification of the slit light projecting unit of the second embodiment.
- The imaging device 1 according to the present embodiment, when compared with a conventional three-dimensional shape detection device such as that described in the above-mentioned Japanese Patent Application Laid-Open No. 7-318315, achieves the object of reliably discriminating the trajectory of the slit light without increasing the total output power of the light beam converted into slit light.
- FIG. 1A is a perspective view of the entire imaging device 1 according to the first embodiment of the present invention.
- FIG. 1B is a schematic cross-sectional view of the imaging device 1.
- FIG. 1B also shows a state of the first slit light 71 and the second slit light 72 projected on the document P.
- Figure 2 shows the block diagram of imaging device 1.
- The imaging device 1 includes a rectangular box-shaped main body case 10, an imaging lens 31 provided on the front of the main body case 10, a CCD image sensor 32 provided behind the imaging lens 31 (inside the imaging device 1), and a slit light projecting unit 20 provided below the imaging lens 31.
- The imaging device 1 further includes a processor 40 built into the main body case 10, a release button 52 and a mode switching switch 59 provided on the upper part of the main body case 10, and a card memory 55 built into the main body case 10. As shown in FIG. 2, each of these components is connected by a signal line.
- The imaging device 1 further includes an LCD (Liquid Crystal Display) 51 provided on the back of the main body case 10 and a finder 53 disposed from the back to the front of the main body case 10. These are used when the user determines the imaging range of the imaging device 1.
- The LCD 51 is configured by a liquid crystal display or the like that displays an image, and receives an image signal from the processor 40 and displays the image. Depending on the situation, the processor 40 sends the LCD 51 an image signal for displaying the real-time image received by the CCD image sensor 32, an image stored in the card memory 55, characters indicating the device settings, and so on.
- the imaging device 1 has a “normal mode” function corresponding to a function as a so-called digital camera and a “corrected imaging mode” function.
- In the "normal mode", when the release button 52 is pressed by the user, the image formed on the CCD image sensor 32 by the imaging lens 31 is captured as image data and written to the card memory 55.
- The "corrected imaging mode" is a function that, when the subject is a document P such as paper, makes it possible to create image data corrected as if the document had been captured from the front, even if the document P was imaged from an oblique direction.
- The slit light projecting unit 20 of the imaging device 1 includes a laser diode 21, a collimating lens 22, an aperture 23, a rod lens 24, and a reflection mirror 25.
- the laser diode 21 emits a red laser beam.
- The emission and stopping of the laser beam by the laser diode 21 are switched in response to a command from the processor 40.
- The output of the laser diode 21 is adjusted, within its maximum power rating (for example, 5 mW), so that a constant output (for example, 1 mW) is obtained for the portion of the beam passing through the aperture 23, taking into account individual variation in the spread angle of the laser beam.
- the collimating lens 22 focuses the laser beam from the laser diode 21 so as to focus on a reference distance VP (for example, 330 mm) from the slit light projecting unit 20.
- the aperture 23 is formed of a plate having a rectangular opening.
- the laser beam from the collimating lens 22 is shaped into a rectangle by passing through the opening of the aperture 23.
- the rod lens 24 is a cylindrical lens having a short positive focal length. The rod lens 24 is disposed downstream of the laser beam emitted from the aperture 23.
- FIG. 4 (a) shows a cross-sectional shape of the rod lens 24 and the reflection mirror, and also shows a state in which the laser beam is condensed.
- FIG. 4B is a perspective view of the reflection mirror 25.
- The laser beam passing through the rod lens 24 is condensed at a focal point in the immediate vicinity of the rod lens 24, spreads thereafter, and is emitted as slit light with a predetermined spread angle (for example, 48 degrees).
- The reflection mirror 25 is made of an optical plastic such as moldable polymethyl methacrylate (PMMA). As shown in FIG. 4B, the reflection mirror 25 includes a first mirror surface 25a inclined at a predetermined angle (for example, 45 degrees) with respect to a plane parallel to the optical axis of the imaging lens 31, and a second mirror surface 25b formed on a wedge-shaped projection on the first mirror surface 25a. The second mirror surface 25b is located at the central portion of the first mirror surface 25a in the direction orthogonal to the inclination direction, and is inclined at a predetermined angle (for example, 12 degrees) with respect to the first mirror surface 25a. On the surface of the reflection mirror 25, an aluminum film and a silicon oxide protective film are deposited so that the laser beam is substantially totally reflected.
- the reflection mirror 25 is disposed downstream of the slit light emitted from the rod lens 24.
- The slit light incident on the first mirror surface 25a is reflected so that its direction changes by twice the predetermined angle (90 degrees), and is emitted as the first slit light 71.
- The slit light incident on the second mirror surface 25b is reflected away from the first slit light 71 by twice the predetermined angle (24 degrees), and is emitted as the second slit light 72.
- the direction in which the first slit light 71 is emitted is called a first direction
- the direction in which the second slit light 72 is emitted is called a second direction.
- In response to a command from the processor 40, the slit light projecting unit 20 emits a laser beam from the laser diode 21 and thereby emits the first slit light 71 in the first direction and the second slit light 72 in the second direction.
- the first slit light 71 and the second slit light 72 are emitted from a window 29 provided below the imaging lens 31 of the main body case 10.
- the imaging lens 31 is composed of a plurality of lenses.
- the imaging device 1 has an autofocus function.
- the imaging lens 31 is driven by the autofocus function to adjust the focal length and the aperture so that light from the outside is formed on the CCD image sensor 32.
- the CCD image sensor 32 has a matrix arrangement of photoelectric conversion elements such as CCD (Charge Coupled Device) elements.
- the CCD image sensor 32 generates a signal corresponding to the color and intensity of the light of the image formed on the surface, converts the signal into digital data, and outputs the digital data to the processor 40.
- the data of one CCD element is pixel data of a pixel forming an image, and the image data is composed of pixel data of the number of CCD elements.
- the release button 52 is constituted by a push button type switch.
- The release button 52 is connected to the processor 40, and the processor 40 detects a pressing operation by the user.
- the card memory 55 is constituted by a nonvolatile and rewritable memory, and is detachable from the main body case 10.
- the mode switching switch 59 is configured by a slide switch or the like that can be switched to two positions.
- the processor 40 detects the position of the button of the mode switching switch 59. In the processor 40, one of the positions of the mode switching switch 59 is detected as the "normal mode", and the other is detected as the "corrected imaging mode".
- the processor 40 includes a CPU (Central Processing Unit) 41, a ROM 42, and a RAM 43.
- the CPU 41 executes processing by a program stored in the ROM 42 while using the RAM 43.
- The CPU 41 performs various processes such as detecting the pressing operation of the release button 52, capturing image data from the CCD image sensor 32, writing image data to the card memory 55, detecting the state of the mode switching switch 59, and switching the emission of slit light by the slit light projecting unit 20.
- the ROM 42 includes a camera control program 421, a difference extraction program 422, a triangulation calculation program 423, a document orientation calculation program 424, and a plane conversion program 425 (see FIG. 2).
- the camera control program 421 is a program relating to control of the entire imaging apparatus 1 including the processing of the flowchart shown in FIG. 5 (details will be described later).
- the difference extraction program 422 is a program for generating image data in which the trajectory of the slit light is extracted from the image of the document P on which the slit light is projected.
- the triangulation calculation program 423 is a program for calculating the three-dimensional spatial position of each trajectory of the slit light based on the image data generated by the difference extraction program with respect to each pixel.
- the document attitude calculation program 424 is a program for estimating the position and the three-dimensional shape of the document P from the three-dimensional spatial positions of the first slit light locus 71a and the second slit light locus 72a.
- the plane conversion program 425 converts the image data stored in the slit-lightless image storage unit 432 into an image as if the image was taken from the front of the document P based on the given position and orientation of the document P.
- In the RAM 43, an image storage unit 431 with slit light, an image storage unit 432 without slit light, and a difference image storage unit 433, each large enough to store data in the form of image data from the CCD image sensor 32, are allocated as storage areas.
- In the RAM 43, a triangulation calculation result storage section 434 large enough to store the calculated position of each point of the slit light image, a document posture calculation result storage section 435 large enough to store the calculated position and orientation of the document P, and a working area 436 used for temporarily storing data for calculation in the CPU 41 are also allocated.
- the finder 53 has an optical lens.
- the finder 53 is configured such that when a user looks into the image pickup apparatus 1 from behind, a range substantially coincident with a range where the imaging lens 31 forms an image on the CCD image sensor 32 can be seen.
- The flowchart of FIG. 5 is executed under the control of the processor 40 of the imaging device 1.
- In S110, the position of the mode switching switch 59 is detected. If the detection in S110 shows that the mode switching switch 59 is in the "corrected imaging mode" position, the process proceeds to S120. If the mode switching switch 59 is instead in the "normal mode" position, the process proceeds to S200.
- a command to emit light from the laser diode 21 is issued to the slit light projecting unit 20, and the first slit light 71 and the second slit light 72 are emitted.
- image data is acquired from the CCD image sensor 32 as an image with slit light. The acquired image data is stored in the image storage unit 431 with slit light in the RAM 43.
- the slit light projecting unit 20 is instructed to stop emitting the laser diode 21, and the first slit light 71 and the second slit light 72 are no longer emitted.
- Image data is acquired from the image sensor 32.
- the acquired image is stored in the image storage unit 432 without slit light of the RAM 43.
- The difference extraction program 422 generates image data representing the difference between the image data in the image storage unit 431 with slit light and the image data in the image storage unit 432 without slit light, and stores it in the difference image storage unit 433. That is, images of the locus 71a of the first slit light and the locus 72a of the second slit light projected on the document P are extracted.
- Using the three-dimensional spatial positions of the first slit light trajectory 71a and the second slit light trajectory 72a stored in the triangulation calculation result storage unit 434, the document posture calculation program 424 calculates the position and orientation of the document P.
- In step S170, based on the position and orientation of the document P calculated in step S160, the plane conversion program 425 converts the image data stored in the image storage unit 432 without slit light into image data of an image as observed from the front.
- Specifically, the image data in the image storage unit 432 without slit light is subtracted from the image data in the image storage unit 431 with slit light to obtain the difference. That is, subtraction of RGB values is performed for each pixel of both image data. Thereby, a multi-valued image in which only the trajectory of the slit light is extracted is obtained.
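- As an illustration of this step (not code from the original disclosure; the use of NumPy arrays and the function name are assumptions), the per-pixel RGB subtraction could be sketched as follows:

```python
import numpy as np

def extract_slit_trace(img_with_slit: np.ndarray, img_without_slit: np.ndarray) -> np.ndarray:
    """Per-pixel RGB subtraction: keep only what the slit light added to the scene."""
    diff = img_with_slit.astype(np.int16) - img_without_slit.astype(np.int16)
    # negative values (noise, shadows) are clipped; the result is a multi-valued
    # image in which essentially only the slit-light trajectory remains
    return np.clip(diff, 0, 255).astype(np.uint8)
```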
- The processing by the triangulation calculation program 423 in S150 is specifically as follows. For example, in the image data of the difference image storage unit 433, the vertical peak position of the trajectory 71a of the first slit light and of the trajectory 72a of the second slit light is obtained for each horizontal coordinate of the image data by calculating the center of gravity. The three-dimensional spatial position corresponding to each extracted coordinate is then determined as described below.
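- The per-column center-of-gravity peak extraction could look like the following sketch (illustrative only; in practice each column would first be restricted to the band containing trace 71a or trace 72a):

```python
import numpy as np

def slit_peak_per_column(diff_img: np.ndarray) -> np.ndarray:
    """For each horizontal coordinate, return the vertical peak position of the
    slit-light trajectory as the intensity-weighted centre of gravity of the column."""
    gray = diff_img.astype(np.float64)
    if gray.ndim == 3:                        # collapse RGB to a single intensity
        gray = gray.sum(axis=2)
    rows = np.arange(gray.shape[0])[:, None]
    col_weight = gray.sum(axis=0)             # total intensity per column
    centroid = (gray * rows).sum(axis=0) / np.where(col_weight > 0, col_weight, 1.0)
    return np.where(col_weight > 0, centroid, np.nan)   # NaN where no slit light is present
```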
- The coordinate system of the imaging device 1 with respect to the horizontally curved original P imaged as shown in FIG. 6 (a) is defined as shown in FIGS. 7 (a) and 7 (b). That is, the optical axis direction of the imaging lens 31 is the Z axis, and the position at the reference distance VP from the imaging device 1 is the origin of the X, Y, and Z axes. The horizontal direction is defined as the X axis and the vertical direction as the Y axis.
- the number of pixels in the X-axis direction of the CCD image sensor 32 is referred to as ResX
- the number of pixels in the Y-axis direction is referred to as ResY.
- the upper end of the position where the CCD image sensor 32 is projected on the XY plane through the imaging lens 31 is called Yftop, the lower end is called Yfbottom, the left end is called Xfstart, and the right end is called Xfend.
- The distance from the optical axis of the imaging lens 31 to the optical axis of the first slit light 71 emitted from the slit light projecting unit 20 is D, the position in the Y-axis direction where the first slit light 71 intersects the XY plane is las1, and the position in the Y-axis direction where the second slit light 72 intersects the XY plane is las2.
- The three-dimensional spatial position (X1, Y1, Z1) corresponding to the coordinates (ccdx1, ccdy1) on the CCD image sensor 32 of attention point 1, one of the pixels of the image of the trajectory 71a of the first slit light, is obtained from the relationships of the triangle formed by the corresponding point on the image plane of the CCD image sensor 32, the emission point of the first slit light 71 and the second slit light 72, and the point where the slit light intersects the XY plane.
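- A simplified sketch of this triangulation is given below. It relies on assumptions that go beyond the text (a pinhole projection, the lens center placed at (0, 0, VP), and the first slit light treated as a plane fanning out in X from a point a distance D below the lens and crossing the XY plane at Y = las1), so it illustrates the geometry rather than reproducing the patent's exact relational expressions:

```python
def triangulate_slit_point(ccdx, ccdy, ResX, ResY,
                           Xfstart, Xfend, Yftop, Yfbottom,
                           VP, D, las1):
    """Return the 3-D position (X1, Y1, Z1) of a point on trajectory 71a imaged
    at CCD coordinates (ccdx, ccdy).

    Simplifying assumptions (not the patent's exact expressions):
      * lens centre at (0, 0, VP); the XY plane (Z = 0) lies at the reference distance,
      * the pixel grid maps linearly onto the rectangle [Xfstart, Xfend] x [Yfbottom, Yftop],
      * the first slit light fans out in X from (0, -D, VP) and crosses the XY plane
        along the line Y = las1.
    """
    # point on the XY plane through which this pixel's viewing ray passes
    tx = Xfstart + (ccdx / ResX) * (Xfend - Xfstart)
    ty = Yftop - (ccdy / ResY) * (Yftop - Yfbottom)

    # intersect the viewing ray (lens centre -> (tx, ty, 0)) with the slit-light plane
    t = D / (las1 + D - ty)
    return t * tx, t * ty, VP * (1.0 - t)
```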
- The processing by the document orientation calculation program 424 in S160 is specifically as follows. For example, from the data in the triangulation calculation result storage unit 434, a curve is fitted by regression to the three-dimensional spatial positions of the points of the locus 71a of the first slit light, and the three-dimensional position on that curve where the position in the X-axis direction is "0" is determined; likewise, the three-dimensional position of the locus 72a of the second slit light where the position in the X-axis direction is "0" is determined, and the straight line connecting these two points is considered.
- The point at which this straight line intersects the Z axis, that is, the point at which the optical axis intersects the document P, is defined as the three-dimensional spatial position (0, 0, L) of the document P (see FIG. 8 (a)).
- The angle between the straight line and the XY plane is defined as the inclination θ of the document P about the X axis.
- Next, the curve fitted by regression to the locus 71a of the first slit light is rotationally transformed about the X axis in the opposite direction by the previously obtained inclination θ; that is, a state in which the document P is parallel to the XY plane is considered (see FIG. 8 (b)). Further, as shown in FIG. 8 (c), the cross-sectional shape of the document P in the XZ plane is obtained by calculating the displacement in the Z-axis direction at a plurality of points in the X-axis direction. From these displacements, the curvature φ(X), a function of the tilt in the X-axis direction with the position in the X-axis direction as a variable, is obtained.
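- The following sketch illustrates one way such a posture calculation could be organized (the quadratic regression and the finite-difference slope are illustrative stand-ins for the regression and curvature calculation, which the text does not spell out):

```python
import numpy as np

def document_posture(trace1_xyz: np.ndarray, trace2_xyz: np.ndarray):
    """Estimate the position L, the tilt theta about the X axis, and the cross-sectional
    slope profile phi(X) of the document from the 3-D points of traces 71a and 72a.

    trace1_xyz, trace2_xyz: arrays of shape (N, 3) holding (X, Y, Z) points.
    """
    def point_at_x0(pts):
        # regress Y and Z on X (here with a quadratic) and evaluate at X = 0
        y0 = np.polyval(np.polyfit(pts[:, 0], pts[:, 1], 2), 0.0)
        z0 = np.polyval(np.polyfit(pts[:, 0], pts[:, 2], 2), 0.0)
        return y0, z0

    (y1, z1), (y2, z2) = point_at_x0(trace1_xyz), point_at_x0(trace2_xyz)

    # straight line through the two X = 0 points: its crossing with Y = 0 gives (0, 0, L),
    # and its angle to the XY plane gives the tilt theta about the X axis
    L = z1 - y1 * (z2 - z1) / (y2 - y1)
    theta = np.arctan2(z2 - z1, y2 - y1)

    # rotate trace 71a back by -theta about the X axis and take dZ/dX as phi(X)
    z_rot = trace1_xyz[:, 1] * np.sin(-theta) + trace1_xyz[:, 2] * np.cos(-theta)
    order = np.argsort(trace1_xyz[:, 0])
    xs, zs = trace1_xyz[order, 0], z_rot[order]
    phi = np.gradient(zs, xs)                 # slope of the cross-section at each X
    return L, theta, xs, phi
```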
- The processing by the plane conversion program 425 in S170 is specifically the process described below, which is represented by the flowchart of FIG. 9.
- First, a processing area for this process is allocated in the working area 436 of the RAM 43, and the initial values of variables used in the process, such as counter variables, are set (S1002).
- The area of the erect image, which is the image obtained when the surface of the original P on which characters and the like are written is observed from a substantially perpendicular direction, is set by transforming the four corner points of the image without slit light based on the three-dimensional spatial position (0, 0, L) of the original P, the inclination θ around the X axis, and the curvature φ(X) obtained by the original posture calculation program 424, and the number of pixels a included in this area is obtained (S1003).
- The area of the set erect image is first arranged on the XY plane (S1005), and the three-dimensional spatial position of each pixel included in it is determined based on the curvature φ(X).
- the coordinates are converted to the coordinates (ccdcx, ccdcy) on the CCD image captured by the ideal camera according to the above triangulation relational expression (S1009).
- the coordinates are converted into the coordinates (ccdx, ccdy) on the CCD image captured by the actual camera by a known calibration method (S1010).
- The pixel value of the image without slit light at the position (ccdx, ccdy) is then obtained and stored in the working area 436 of the RAM 43 (S1011). This is repeated for the number of pixels a to generate the image data of the erect image.
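- A compact sketch of the whole erect-image loop is shown below. It reuses the simplified pinhole model of the triangulation sketch above; the scale parameter, the nearest-neighbour sampling, and the omission of the S1010 lens-distortion step are assumptions made for illustration:

```python
import numpy as np

def make_erect_image(img_no_slit, width, height, scale,
                     L, theta, xs, phi,
                     ResX, ResY, Xfstart, Xfend, Yftop, Yfbottom, VP):
    """Generate an erect (front-view, flattened) image of the document.

    'scale' is the document-plane length covered by one erect-image pixel.
    """
    erect = np.zeros((height, width, 3), dtype=img_no_slit.dtype)
    # integrate the slope profile phi(X) to get the Z displacement of the cross-section
    zs = np.concatenate(([0.0], np.cumsum(phi[:-1] * np.diff(xs))))

    for j in range(height):
        for i in range(width):
            # pixel laid out flat on the document surface, centred on the optical axis
            x = (i - width / 2) * scale
            y = (height / 2 - j) * scale
            z = np.interp(x, xs, zs)                      # curvature displacement
            # tilt by theta about the X axis and push the document out to depth L
            yc = y * np.cos(theta) - z * np.sin(theta)
            zc = y * np.sin(theta) + z * np.cos(theta) + L
            # project through the lens centre (0, 0, VP) onto the XY plane,
            # then into ideal CCD coordinates (ccdcx, ccdcy)
            s = VP / (VP - zc)
            ccdcx = ResX * (x * s - Xfstart) / (Xfend - Xfstart)
            ccdcy = ResY * (Yftop - yc * s) / (Yftop - Yfbottom)
            u, v = int(round(ccdcx)), int(round(ccdcy))
            if 0 <= v < img_no_slit.shape[0] and 0 <= u < img_no_slit.shape[1]:
                erect[j, i] = img_no_slit[v, u]
    return erect
```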
- In use, as shown in FIG. 6 (a), the imaging device 1 forms the trajectory 71a of the first slit light on the original P with a portion missing near its center, and forms, above that missing portion, the trajectory 72a of the second slit light having the same length as the missing portion.
- the image of the document P is formed on the CCD image sensor 32 by the imaging lens 31 and then imaged. Subsequently, an image of the document P on which the locus of the slit light is not formed is imaged.
- The images of the trajectories 71a and 72a of the first and second slit lights are extracted from the image data, and the three-dimensional spatial position of each part of the trajectories 71a and 72a is calculated based on the principle of triangulation.
- The position L, the inclination θ, and the curvature φ(X) of the document P are obtained from the calculation result, and the shape of the trajectory 71a of the first slit light is regarded as the cross-sectional shape of the entire document P, whereby the three-dimensional shape of the document P is obtained.
- the user switches the mode switching switch 59 to the “corrected imaging mode” side, and determines whether the desired range of the document P is within the imaging range with the viewfinder 53 or the LCD 51.
- Image data as if the flat original P had been imaged from the front can thus be stored in the card memory 55.
- The image data stored in the card memory 55 can be displayed on the LCD 51 to check the imaged content, or the card memory 55 can be removed from the imaging device 1 so that the data can be displayed or printed on an external personal computer or the like.
- the power per unit angular width of the first slit light 71 and the second slit light 72 is the same as that of the slit light before being deflected, and is not different from the case where one row of slit light is emitted. Therefore, according to the present embodiment, the luminance difference between the trajectory of the slit light and the document P is sufficient, and the trajectory image of the slit light can be reliably extracted by the difference extraction program 422.
- the imaging device 1 can reliably detect the three-dimensional shape of the target object without increasing the output of the laser diode 21 as the light source. Therefore, the configuration of the imaging device 1 can be made simple and small.
- In the imaging device 1 of the present embodiment, since the central portion of the first slit light 71 is deflected, there is little possibility that specular reflection of the first slit light 71 enters the imaging lens 31. Moreover, the angle of the second slit light 72 with respect to the document P is larger. For this reason, in order for specular reflection of the second slit light 72 from the document P to enter the imaging lens 31, the image would have to be taken from a position at more than 90 degrees with respect to the document P, a situation that is hard to imagine in realistic use.
- Thus, the imaging device 1 prevents the specular reflection of the slit light applied to the original P from entering the imaging lens 31, and can therefore avoid the problem that a bright spot or smear caused by the specularly reflected light appears in the captured image and makes accurate detection of the three-dimensional shape impossible. FIGS. 12 (a) and 12 (b) show the situation that would arise if the central portion of the first slit light 71 were not deflected as in the present embodiment and smear occurred due to that central portion.
- The vapor-deposited film forming the reflection film on the mirror surfaces is usually deposited from one specific direction, so a proper reflective film is not formed on the side surface orthogonal to the first mirror surface 25a and the second mirror surface 25b, and the reflective film there is interrupted or has insufficient reflectance. However, if the cross section is convex as in the reflection mirror 25 of the present embodiment, as shown in FIG. 4B, the slit light from the rod lens 24 that is not reflected by the second mirror surface 25b does not enter the side surface or the corner portion between the first mirror surface 25a and the second mirror surface 25b at all. Therefore, the slit light is not disturbed by being incident on a portion where the reflective film is insufficient, and the three-dimensional shape can be detected accurately.
- the slit light projecting unit 20 corresponds to a pattern light projecting unit, and the imaging lens 31 and the CCD image sensor 32 are used as a projected image photographing unit.
- the processing from S140 to S160 by the processor 40 corresponds to the three-dimensional shape calculation means.
- the imaging lens 31 and the CCD image sensor 32 correspond to an imaging unit
- the processing in S170 by the processor 40 corresponds to an image correcting unit
- the RAM 43 corresponds to a storage unit.
- the target object imaged by the imaging apparatus 1 may be a smooth surface of a solid block, or a surface of an object having a ridgeline in some cases, in addition to the sheet-shaped original P.
- the effect of detecting the three-dimensional shape of the target object can be similarly exhibited in all applications in which the three-dimensional shape in the three-dimensional space is desired to be obtained from the trajectories of the slit light in approximately two rows.
- When the target object is a sheet-shaped document P as in the present embodiment, the entire shape of the document P is estimated by assuming that the trajectory 71a of the first slit light represents the cross-sectional shape of the document P. When the target object has a three-dimensional shape that is substantially uniform in the direction perpendicular to the longitudinal direction of the slit light, detection errors caused by a locally unique shape, such as a projection of the target object at the position where the slit light happens to be projected, need not be taken into account, so no particular attention has to be paid to the location where the slit light is projected.
- In the above embodiment, the first slit light 71 and the second slit light 72 emitted from the imaging device 1 are formed by deflecting the slit light output from the rod lens 24 with the first mirror surface 25a and the second mirror surface 25b of the reflection mirror 25.
- Instead, as shown in FIG. 10, a reflecting mirror 26 consisting of a single mirror surface and a transparent flat plate 27 having a predetermined section (near the center) formed as a diffraction grating 27a may be used. In this configuration, the slit light from the rod lens 24 is simply reflected by the reflection mirror 26 to change its direction.
- At the transparent flat plate 27, the portion of the slit light incident on the diffraction grating 27a is deflected in the direction corresponding to the grating width to form the second slit light 72, while the slit light transmitted through the part other than the diffraction grating 27a is emitted as the first slit light 71.
- the power distribution ratio between the zero-order light and the high-order light can be changed depending on the grating width, so that the power per angular width of the second slit light 72 can be changed.
- Alternatively, as shown in FIG. 11, a mirror device 28 having a one-dimensional minute displacement mirror array 28a may be used. In such a configuration, the tilt angle of the one-dimensional minute displacement mirror array 28a near the center of the mirror device 28 is changed in response to a command from the processor 40 to deflect the slit light emitted from the rod lens 24, so that the first slit light 71 and the second slit light 72 can be formed with two kinds of angles.
- With this configuration, the length and position of the second slit light can be changed according to the conditions of use. For example, if smear occurs in the first slit light 71 depending on the shape of the target object, the tilt angle of the one-dimensional minute displacement mirror array 28a at that position can be changed so that the position where smear occurs becomes the second slit light 72, and the specularly reflected light no longer enters the CCD image sensor 32.
- On the other hand, the configuration in which the slit light is deflected by the reflection mirror 25 composed of the first mirror surface 25a and the second mirror surface 25b to form the first slit light 71 and the second slit light 72 is simple, and the entire device can be reduced in size and cost.
- The part where the first slit light is partially deflected away need not be limited to a single point; a plurality of points may be deflected. For example, a plurality of second mirror surfaces 25b may be provided on the reflection mirror 25 so that a substantially dotted, linear second slit light deflected at a plurality of positions is formed, like the locus image of the slit light projected on the document P shown in FIG. 6 (c).
- In the above, the slit light projecting unit 20 is configured to emit two rows of slit light, the first slit light 71 and the second slit light 72, but the number is not limited to two rows and three or more rows may be emitted. For example, a third mirror surface inclined at a predetermined angle may be provided on the second mirror surface 25b of the reflection mirror 25 so that, as in the locus image of the slit light projected on the original P shown in FIG. 6 (b), the locus of a third slit light is formed above the locus 72a of the second slit light.
- In the above, the second slit light 72 is deflected so as to lie above the first slit light 71, but the positional relationship between the two is not particularly limited; the slit light projecting unit 20 may, for example, be configured so that the locus 71a of the first slit light is formed above the locus 72a of the second slit light. Alternatively, the portion where the first slit light is missing due to the formation of the second slit light may be located near an end rather than near the center.
- The angular width of the second slit light 72 is not particularly limited either. For example, the second mirror surface 25b may be made wide so that most of the angular width of the first slit light is deflected to form the second slit light 72.
- In the above, a laser diode 21 that emits a red laser beam is used as the light source, but any other light source capable of outputting a light beam, such as a surface-emitting laser, an LED, or an EL element, may be used.
- The slit light emitted from the slit light projecting unit 20 may also be a stripe-shaped light pattern having a certain width, rather than a fine line sharply narrowed in the direction perpendicular to the longitudinal direction.
- In the above, the imaging device 1 captures both the image with slit light and the image without slit light using the same imaging lens 31 and CCD image sensor 32, but an additional imaging lens and CCD image sensor dedicated to capturing the image with slit light may be provided. With the shared configuration of the present embodiment, however, there is no shift in the imaging range between the image without slit light and the image with slit light, so the accuracy of the detected three-dimensional shape of the target object can be improved, and the imaging device 1 can be made smaller and less expensive with fewer components.
- the term “pattern light” includes a shape in which a part is missing like the first slit light 71.
- The size of the missing part in one pattern light may vary; for example, the proportion of the missing portion to the entire length of one pattern light may be greater than 1/2.
- The imaging device 201 according to the present embodiment, when compared with a conventional three-dimensional shape detection device such as that described in Japanese Patent Application Laid-Open No. 7-318315, achieves the object of reliably discriminating the trajectory of the slit light without increasing the total emission power of the light beam converted into slit light, with a simple, low-cost, low-power-consumption configuration.
- FIG. 13A is a perspective view of the entire imaging device 201 according to the second embodiment of the present invention.
- FIG. 13B is a schematic cross-sectional view of the imaging device 201.
- FIG. 13 (b) also shows the state of the first slit light 171 and the second slit light 172 projected on the document P.
- FIG. 14 is a block diagram of the imaging device 201.
- The imaging device 201 includes a rectangular box-shaped main body case 10, an imaging lens 31 provided on the front of the main body case 10, and a CCD image sensor 32 provided behind the imaging lens 31 (inside the imaging device 201). Further, the imaging device 201 includes a slit light projecting unit 120 provided below the imaging lens 31, a processor 40 built into the main body case 10, a release button 52 and a mode switching switch 59 provided on the upper part of the main body case 10, and a card memory 55 built into the main body case 10. These components are connected by signal lines as shown in FIG. 14.
- the imaging device 201 includes an LCD (Liquid Crystal Display) 51 provided on the back of the main body case 10 and a finder 53 provided from the back to the front of the main body case 10. These are used when the user determines the imaging range of the imaging device 201.
- the LCD 51 is configured by a liquid crystal display or the like that displays an image, and receives an image signal from the processor 40 and displays an image.
- the imaging device 201 has a “normal mode” function and a “corrected imaging mode” function corresponding to functions as a so-called digital camera.
- In the "normal mode", when the release button 52 is pressed by the user, the image formed on the CCD image sensor 32 by the imaging lens 31 is captured as image data and written into the card memory 55.
- The "correction imaging mode" is a function that can generate image data corrected as if the document had been captured from the front, even if the document P is imaged from an oblique direction, when the subject is a document P such as paper.
- The slit light projecting unit 120 of the imaging device 201 includes a laser diode 21, a collimating lens 22, an aperture 23, a transparent flat plate 124, a cylindrical lens 125, a reflection mirror 126, and a rod lens 127.
- the laser diode 21 emits a red laser beam. Under the control of the processor 40, the emission and the stop of the laser beam at the laser diode 21 are switched.
- The output of the laser diode 21 is adjusted, within its maximum output rating (for example, 5 mW), so that a constant output (for example, 1 mW) is obtained for the portion of the beam passing through the aperture 23.
- the collimating lens 22 focuses the laser beam from the laser diode 21 so as to focus on a reference distance VP (for example, 330 mm) from the slit light projecting unit 120.
- the aperture 23 is formed of a plate having a rectangular opening.
- the laser beam of the collimating lens 22 is shaped into a rectangle by passing through the opening of the aperture 23.
- The transparent flat plate 124 is a flat plate made of a solid glass material or the like, and has an AR coat (anti-reflection coating) on its back surface.
- The transparent flat plate 124 is arranged so as to be inclined by a predetermined angle (for example, 33 degrees) toward the front of the main body case 10 with respect to a plane orthogonal to the optical axis of the laser beam from the aperture 23, so that the light reflected from the transparent flat plate 124 becomes the second slit light 172. About 5% (about 50 µW) of the laser beam incident on the transparent flat plate 124 from the aperture 23 is reflected by its surface, and about 95% (about 950 µW) is transmitted through the transparent flat plate 124.
- the direction in which the laser beam is reflected by the transparent flat plate 124 and travels (upward by 33 degrees with respect to the horizontal plane in front of the imaging device 1) is referred to as a second direction.
- the reflection mirror 126 is made of a member such as a mirror that totally reflects the laser beam.
- the reflection mirror 126 is disposed downstream of the laser beam transmitted through the transparent flat plate 124 at an angle of 45 degrees in front of the main body case 10 with respect to a horizontal plane.
- the laser beam is totally reflected and changes the direction of the optical path by 90 degrees.
- the direction in which the laser beam reflected by the reflecting mirror 126 travels (the direction of 0 ° with respect to the horizontal plane in front of the imaging device 1) is referred to as a first direction.
- the rod lens 127 is formed of a cylindrical lens having a short positive focal length.
- the rod lens 127 is disposed downstream of the laser beam reflected by the reflection mirror 126 so that the axial direction of the cylindrical shape is vertical.
- The focal length of the rod lens 127 is short. Therefore, as shown in FIG. 16 (a), the laser beam incident on the rod lens 127 is condensed at a focal point near the rod lens 127, spreads beyond that point, and is emitted in the first direction as slit light having a predetermined spread angle (for example, 48 degrees).
- the slit light emitted from the rod lens 127 is referred to as first slit light 171.
- the cylindrical lens 125 is a lens having a concave shape in one direction so as to have a negative focal length.
- the cylindrical lens 125 is disposed downstream of the laser beam reflected by the transparent flat plate 124 such that the lens surface is orthogonal to the second direction.
- The cylindrical lens 125 emits the laser beam incident from the transparent flat plate 124 as slit light spread at a predetermined spread angle in the second direction.
- the slit light emitted from the cylindrical lens 125 is referred to as a second slit light 172.
- The ratio of the spread angle of the second slit light 172 after passing through the cylindrical lens 125 to the spread angle of the first slit light 171 is made equal to the ratio at which the power of the laser beam is split by the transparent flat plate 124. That is, the spread angle of the second slit light 172 is 5% (2.4 degrees) of the spread angle of the first slit light 171.
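- With these example values, the power per unit angular width of the two slit lights is roughly equal:

$$
\frac{0.95 \times 1\ \mathrm{mW}}{48^{\circ}} \approx 20\ \mu\mathrm{W}/{}^{\circ},
\qquad
\frac{0.05 \times 1\ \mathrm{mW}}{2.4^{\circ}} \approx 21\ \mu\mathrm{W}/{}^{\circ}.
$$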
- The slit light projecting unit 120 emits a laser beam from the laser diode 21 in response to a command from the processor 40, and thereby emits the first slit light 171 in the first direction and the second slit light 172 in the second direction.
- the imaging lens 31 includes a plurality of lenses.
- The imaging device 201 has an autofocus function.
- the imaging lens 31 is driven by the autofocus function so that the focal length and the aperture are adjusted so that light from the outside forms an image on the CCD image sensor 32.
- the CCD image sensor 32 has a matrix arrangement of photoelectric conversion elements such as CCD (Charge Coupled Device) elements.
- the CCD image sensor 32 generates a signal corresponding to the color and intensity of light of the image formed on the surface, converts the signal into digital data, and outputs the digital data to the processor 40.
- the data for one CCD element is pixel data of a pixel forming an image, and the image data is composed of pixel data of the number of CCD elements.
- the release button 52 is constituted by a push button type switch.
- The release button 52 is connected to the processor 40, and the processor 40 detects a pressing operation by the user.
- The card memory 55 is composed of a nonvolatile and rewritable memory and can be attached to and detached from the main body case 10.
- the mode switching switch 59 is configured by a slide switch that can be switched to two positions.
- the processor 40 detects the position of the button of the mode switching switch 59. In the processor 40, one of the positions of the mode switching switch 59 is detected as the "normal mode", and the other is detected as the "corrected imaging mode".
- the processor 40 includes a CPU (Central Processing Unit) 41, a ROM 42, and a RAM 43.
- the CPU 41 executes processing by a program stored in the ROM 42 while using the RAM 43.
- the CPU 41 performs various processes such as detecting the pressing operation of the release button 52, capturing image data from the CCD image sensor 32, writing the image data to the card memory 55, detecting the state of the mode switching switch 59, and switching the emission of slit light by the slit light projecting unit 120.
- the ROM 42 includes a camera control program 421, a difference extraction program 422, a triangulation calculation program 423, a document attitude calculation program 424, and a plane conversion program 425 (see FIG. 2).
- the camera control program 421 is a program relating to the control of the entire imaging apparatus 1 including the processing of the flowchart shown in FIG. 17 (details will be described later).
- the difference extraction program 422 is a program for generating image data in which the trajectory of the slit light is extracted from the image of the document P on which the slit light is projected.
- the triangulation calculation program 423 is a program for calculating the three-dimensional spatial position of each trajectory of the slit light based on the image data generated by the difference extraction program with respect to each pixel.
- the document attitude calculation program 424 is a program for estimating and obtaining the position and the three-dimensional shape of the document P from the three-dimensional space positions of the trajectory 171a of the first slit light and the trajectory 172a of the second slit light.
- the plane conversion program 425 is a program for converting the image data stored in the image storage unit 432 without slit light into an image as if the document P had been photographed from the front, based on the obtained position and orientation of the document P.
- in the RAM 43, storage areas are allocated as follows: an image storage unit 431 with slit light, an image storage unit 432 without slit light, and a difference image storage unit 433, each large enough to store image data from the CCD image sensor 32; a triangulation calculation result storage unit 434 large enough to store the calculated position of each point of the slit light image; a document attitude calculation result storage unit 435 large enough to store the calculated position and orientation of the document P; and a working area 436 used for temporarily storing data for calculations in the CPU 41.
- the viewfinder 53 is configured by an optical lens.
- the viewfinder 53 is configured such that when a user looks into the image capturing apparatus 201 from behind, a range that substantially matches the range where the imaging lens 31 forms an image on the CCD image sensor 32 can be seen.
- in S2110, the position of the mode switching switch 59 is detected. If the result of the detection in S2110 is that the mode switching switch 59 is at the "corrected imaging mode" position (S2110: YES), the process proceeds to S2120. If it is at the "normal mode" position (S2110: NO), the process proceeds to S2200.
- the slit light projecting unit 120 is instructed to stop the light emission of the laser diode 21, and image data is acquired from the CCD image sensor 32 in a state where the first slit light 171 and the second slit light 172 are no longer emitted.
- the acquired image is stored in the image storage unit 432 without slit light as an image without slit light.
- the difference extraction program 422 extracts the difference between the image data in the image storage unit 431 with slit light and the image data in the image storage unit 432 without slit light. That is, image data in which the trajectory 171a of the first slit light and the trajectory 172a of the second slit light projected on the document P are extracted is generated and stored in the difference image storage unit 433.
- the three-dimensional spatial position of each pixel of the trajectory 171a of the first slit light and the trajectory 172a of the second slit light extracted into the difference image storage unit 433 is calculated by triangulation using the triangulation calculation program 423. The calculation results are stored in the triangulation calculation result storage unit 434.
- the position and orientation of the document P are calculated by the document attitude calculation program 424 using the three-dimensional spatial positions of the trajectories of the first slit light 171 and the second slit light 172 stored in the triangulation calculation result storage unit 434.
- the image data stored in the image storage unit 432 without slit light is converted into image data of an image as viewed from the front of the document P.
- the processing by the difference extraction program 422 in S2140 is specifically as follows.
- the image data of the image storage unit 432 without slit light is subtracted from the image data of the image storage unit 431 with slit light for each pixel. That is, the RGB values of both image data are subtracted for each pixel. Thereby, a multi-valued image in which only the trajectory of the slit light is extracted is obtained.
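- as a minimal sketch of this per-pixel subtraction (the array names and the clipping step are illustrative assumptions, not taken from the patent), the processing could look like this:

```python
import numpy as np

def extract_slit_trace(img_with_slit: np.ndarray, img_without_slit: np.ndarray) -> np.ndarray:
    """Subtract the slit-light-free image from the slit-light image per pixel.

    Both inputs are H x W x 3 RGB arrays (uint8). The result is a multi-valued
    image in which, ideally, only the slit-light trajectory remains above the
    noise floor.
    """
    diff = img_with_slit.astype(np.int16) - img_without_slit.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)   # negative differences carry no trace information
    return diff
```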
- the processing by the triangulation calculation program 423 in S2150 is specifically as follows.
- the vertical peaks of the locus 171a of the first slit light and the locus 172a of the second slit light are obtained for each horizontal coordinate of the image data by calculating the center of gravity.
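- a minimal sketch of this column-wise center-of-gravity peak extraction, assuming the difference image has already been reduced to a single-channel intensity map (variable names are illustrative):

```python
import numpy as np

def peak_per_column(intensity: np.ndarray, min_mass: float = 1.0) -> np.ndarray:
    """Return, for every horizontal coordinate, the vertical centroid of the
    slit-light trace, or NaN where the column contains no detectable trace."""
    h, w = intensity.shape
    ys = np.arange(h, dtype=np.float64)
    peaks = np.full(w, np.nan)
    for x in range(w):
        col = intensity[:, x].astype(np.float64)
        mass = col.sum()
        if mass >= min_mass:                       # skip columns with no trace
            peaks[x] = (ys * col).sum() / mass     # center of gravity in the vertical direction
    return peaks
```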
- the three-dimensional space position for this peak extraction coordinate is obtained as follows.
- the coordinate system of the image capturing apparatus 201 with respect to the horizontally curved document P to be imaged, shown in FIG. 18A, is defined as shown in FIG. 19(a) and FIG. 19(b): the optical axis direction of the imaging lens 31 is the Z axis, the horizontal direction with respect to the imaging device 201 is the X axis, and the vertical direction is the Y axis.
- the number of pixels in the X-axis direction of the CCD image sensor 32 is called ResX, and the number of pixels in the Y-axis direction is called ResY.
- the upper end of the position where the CCD image sensor 32 is projected on the XY plane through the imaging lens 31 is called Yftop, the lower end is called Yfbottom, the left end is called Xfstart, and the right end is called Xfend.
- the distance from the optical axis of the imaging lens 31 to the optical axis of the first slit light 171 emitted from the slit light projecting unit 120 is D, the position in the Y-axis direction at which the first slit light 171 intersects the X-Y plane is las1, and the position in the Y-axis direction at which the second slit light 172 intersects the X-Y plane is las2.
- the three-dimensional spatial position (X1, Y1, Z1) corresponding to the coordinates (ccdx1, ccdy1) on the CCD image sensor 32 of a point of interest 1, taken as one of the pixels of the image of the first slit light trajectory 171a, is derived from the solution of five simultaneous equations set up for the triangle formed by the point on the image plane of the CCD image sensor 32, the emission point of the slit light, and the point where the slit light intersects the X-Y plane.
- likewise, the three-dimensional spatial position (X2, Y2, Z2) corresponding to the coordinates (ccdx2, ccdy2) on the CCD image sensor 32 of a point of interest 2, taken as one of the pixels of the image of the second slit light trajectory 172a, is derived from the solution of five simultaneous equations.
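- the patent's five simultaneous equations are not reproduced in this text; the following is only a generic ray-plane (similar-triangles) sketch of slit-light triangulation under simplified assumptions (pinhole camera at the origin with focal length f, slit-light plane given by a point and a normal), so the symbols and geometry are illustrative rather than the patent's actual formulation:

```python
import numpy as np

def triangulate_point(ccdx, ccdy, f, cx, cy, pixel_pitch, plane_point, plane_normal):
    """Intersect the viewing ray through CCD pixel (ccdx, ccdy) with the slit-light plane.

    f            : focal length of the imaging lens [mm]
    (cx, cy)     : principal point in pixels
    pixel_pitch  : CCD pixel size [mm/pixel]
    plane_point  : any 3-D point on the slit-light plane (e.g. the emission point)
    plane_normal : normal vector of the slit-light plane
    Returns the 3-D position (X, Y, Z) in the camera coordinate system.
    """
    # Direction of the ray from the projection centre through the pixel
    ray = np.array([(ccdx - cx) * pixel_pitch,
                    (ccdy - cy) * pixel_pitch,
                    f], dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    t = plane_point.dot(plane_normal) / ray.dot(plane_normal)  # ray-plane intersection parameter
    return t * ray
```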
- the processing by the document attitude calculation program 424 in S2160 is specifically as follows. First, from the data in the triangulation calculation result storage unit 434, a curve approximating, by regression, the points in three-dimensional space corresponding to the trajectory 171a of the first slit light is obtained. A straight line is then assumed to connect the point on this curve whose position in the X-axis direction is "0" and the three-dimensional position on the trajectory 172a of the second slit light whose position in the X-axis direction is "0". The intersection of this assumed straight line with the Z axis, that is, the point where the optical axis intersects the document P, is defined as the three-dimensional spatial position of the document P (see FIG. 20(a)), and the angle between this straight line and the X-Y plane is defined as the inclination of the document P about the X axis.
- a curve obtained by approximating the trajectory 171a of the first slit light by regression is rotationally transformed about the X axis in the reverse direction by the previously obtained inclination. That is, a state in which the document P is parallel to the X-Y plane is considered.
- the cross-sectional shape of the document P in the X-axis direction is determined with respect to the cross section of the document P in the X-Z plane, and the displacement in the Z-axis direction is obtained at a plurality of points in the X-axis direction.
- from this displacement, a curvature function of the position in the X-axis direction, giving the tilt in the X-axis direction at each position, is obtained.
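- as an illustration only (the polynomial degree and the function names are assumptions, not the patent's), the regression-curve fit and the resulting tilt-as-a-function-of-X can be sketched as follows:

```python
import numpy as np

def fit_cross_section(xs, zs, degree=4):
    """Fit a regression curve Z = g(X) to the 3-D points of the first
    slit-light trace (already rotated so the document is parallel to the
    X-Y plane), and return both the displacement and the local tilt."""
    coeffs = np.polyfit(xs, zs, degree)
    g = np.poly1d(coeffs)          # displacement in the Z-axis direction
    dg = g.deriv()                 # slope dZ/dX
    def tilt(x):
        return np.degrees(np.arctan(dg(x)))   # tilt angle at position x
    return g, tilt
```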
- the process by the plane conversion program 425 in S2170 is, for example, the process shown in the flowchart of FIG. 21, which is described below.
- a processing area for this process is allocated in the working area 436 of the RAM 43, and initial values of variables used in the process, such as counter variables, are set (S3002).
- the area of the erect image, which is an image of the surface of the document P on which the characters and the like are written as observed from a substantially perpendicular direction, is set by converting the points at the four corners of the image without slit light based on the position and orientation of the document P calculated by the document attitude calculation program 424, and the number a of pixels included in this area is determined (S3003).
- the set erect image region is first arranged on the X-Y plane (S3005), and each pixel included in it is displaced in the Z-axis direction based on the curvature function (S3006), rotated about the X axis by the inclination (S3007), and shifted by the distance L in the Z-axis direction (S3008).
- the three-dimensional spatial position thus obtained is converted, using the triangulation relational expressions given earlier, into coordinates (ccdcx, ccdcy) on the CCD image of an ideal camera (S3009).
- these are then converted into coordinates (ccdx, ccdy) on the CCD image of the actual camera with the imaging lens 31 by a known calibration method (S3010).
- the state of the pixel of the image without slit light at the position of the converted coordinates is obtained and stored in the working area 436 of the RAM 43 (S3011).
- the above processing is repeatedly executed for the number of pixels a (S3012, S3004), and image data of an erect image is generated.
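- a minimal sketch of this per-pixel plane conversion loop, assuming an output buffer the same size as the source image, a caller-supplied list of sample positions and a caller-supplied projection/calibration function (all of which are illustrative assumptions rather than the patent's interfaces):

```python
import numpy as np

def flatten_document(img_no_slit, region_xy, curvature, tilt_deg, L, project_to_ccd):
    """Generate an erect (front-view) image of the document.

    img_no_slit    : image captured without slit light (H x W x 3)
    region_xy      : iterable of (x, y, out_col, out_row) sample positions on the X-Y plane
    curvature      : function giving the Z displacement at a given X (see fit above)
    tilt_deg       : document inclination about the X axis
    L              : distance from the lens to the document along the optical axis
    project_to_ccd : function mapping a 3-D point to (ccdx, ccdy) of the real camera
    """
    theta = np.radians(tilt_deg)
    rot = np.array([[1, 0, 0],
                    [0, np.cos(theta), -np.sin(theta)],
                    [0, np.sin(theta),  np.cos(theta)]])
    out = np.zeros_like(img_no_slit)
    for x, y, col, row in region_xy:
        p = np.array([x, y, curvature(x)])       # displace in the Z direction by the curvature
        p = rot @ p                               # rotate about the X axis by the inclination
        p[2] += L                                 # shift by the distance L along the Z axis
        ccdx, ccdy = project_to_ccd(p)            # ideal projection followed by lens calibration
        iy, ix = int(round(ccdy)), int(round(ccdx))
        if 0 <= iy < img_no_slit.shape[0] and 0 <= ix < img_no_slit.shape[1]:
            out[row, col] = img_no_slit[iy, ix]   # sample the slit-light-free image
    return out
```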
- as described above, in the imaging device 201, the first slit light 171 and the second slit light 172, two rows of slit light, are projected onto the document P, and the document P is imaged onto the CCD image sensor 32 by the imaging lens 31. Subsequently, an image of the document P onto which the slit light is not projected is captured. The trajectory image of the slit light is extracted from these image data, and the three-dimensional spatial position of each part of the slit light trajectory is calculated based on the principle of triangulation. From these, the position, inclination, and bending state of the document P are obtained, and, based on the result of estimating the shape of the trajectory 171a of the first slit light as the cross-sectional shape of the entire document P, corrected image data is generated as if a flat document P had been imaged from the front.
- the generated image data is recorded in the card memory 55.
- therefore, with the imaging apparatus 201, even when a document P whose shape is deformed, for example curved, is imaged obliquely, the user can store in the card memory 55 image data as if the flat document P had been imaged from the front.
- the user simply performs the same operation as normal shooting (that is, switches the mode switching switch 59 to the "corrected imaging mode" side, confirms with the viewfinder 53 or the LCD 51 that the desired range of the document P is within the imaging range, and presses the release button 52 to take an image) to obtain an image as if the flat document P had been imaged from the front.
- the image data stored in the card memory 55 can be displayed on the LCD 51 to confirm the imaged content.
- the card memory 55 can also be removed from the image pickup device 201 and attached to an external device such as a personal computer, so that the image data can be displayed on the PC or printed.
- of the power output from the laser diode 21, the power of the first slit light 171 split off by the transparent flat plate 124 is 95%, while the power of the second slit light 172 is as small as about 5%. However, the first slit light 171, with a spread angle of 48 degrees, has a power per unit angle of about 20 μW/degree, and the second slit light 172, with a spread angle of 2.4 degrees, has a power per unit angle of about 21 μW/degree, which is almost the same.
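- the per-unit-angle figures quoted above can be checked with a short calculation; the 1 mW total laser output used here is an assumption chosen only to make the arithmetic concrete (it is consistent with, but not stated by, the quoted values):

```python
# Illustrative check of the power-per-unit-angle figures.
total_uW = 1000.0                 # assumed total laser diode output (1 mW)
p1 = 0.95 * total_uW / 48.0       # first slit light:  95% over 48 degrees  -> ~19.8 uW/degree
p2 = 0.05 * total_uW / 2.4        # second slit light:  5% over 2.4 degrees -> ~20.8 uW/degree
print(round(p1, 1), round(p2, 1))
```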
- the illuminance produced by the first slit light 171 and the second slit light 172 is about 1,260 lux. Even in a room of typical brightness under general white lighting, i.e. about 500-1000 lux, there is a sufficient difference in brightness between the trajectory of the slit light and the document P. Accordingly, the trajectory image of the slit light can be extracted by the difference extraction program 422.
- as described above, the imaging device 201 emits two rows of slit light with the same power per angular width as when projecting one row of slit light, without increasing the output of the laser diode 21 serving as the light source. With such a configuration, the three-dimensional shape of the target object can be detected, and the configuration of the imaging device 201 can be simplified and reduced in size.
- since the first slit light 171 and the second slit light 172 have substantially the same power per unit angle, the signals of the CCD image sensor 32 corresponding to the trajectory 171a of the first slit light and the trajectory 172a of the second slit light have signal levels representing substantially the same luminance. For example, as shown in FIG. 22A, in the pixel signal viewed along a vertical line at a predetermined position of the CCD image sensor 32, the portions where the trajectory 171a of the first slit light and the trajectory 172a of the second slit light appear have signals that are comparably high relative to the background level.
- therefore, the difference extraction program 422 can use a common threshold value for extracting the trajectories of the slit light.
- in contrast, when there is a difference in power per unit angle between the first slit light 171 and the second slit light 172, as shown in FIG. 22(b), the lower of the two peaks may be buried in the fluctuation of the background level, and the trajectory of the slit light may not be detected accurately. If a threshold were set for each detection point in accordance with the background noise in order to detect the trajectory of the slit light, the number of calculations would increase and the processing would become inefficient; according to the present embodiment, however, this is not necessary.
- since the transparent flat plate 124 is provided with an AR coating on its back surface, reflection of the laser beam when it exits the transparent flat plate 124 is reduced, and the loss of the laser beam within the transparent flat plate 124 is reduced.
- since the ratio of the laser beam reflected by the transparent flat plate 124, which is set to 5%, corresponds to the surface reflectance determined by the refractive index of the material of the transparent flat plate 124, it can be realized with an ordinary transparent plate; unlike a half mirror, it is not necessary to perform a manufacturing process of forming a metal deposition film on the reflecting surface.
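- the "surface reflectance determined by the refractive index" can be illustrated with the normal-incidence Fresnel formula; the actual plate is used at an oblique angle, so this is only an order-of-magnitude sketch, and n = 1.5 is an assumed value for ordinary glass:

```python
def normal_incidence_reflectance(n: float) -> float:
    """Fresnel reflectance of a single air-glass surface at normal incidence."""
    return ((n - 1.0) / (n + 1.0)) ** 2

print(normal_incidence_reflectance(1.5))   # ~0.04, i.e. a few percent per surface
```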
- the laser diode 21 corresponds to the light source of the light output means, and the transparent flat plate 124 corresponds to the splitting means of the light output means.
- the slit light projecting unit 120 corresponds to the pattern light projecting means, and the imaging lens 31 and the CCD image sensor 32 correspond to the projected image capturing means.
- the processing from S2140 to S2160 in the processor 40 corresponds to the three-dimensional shape calculating means.
- the imaging lens 31 and the CCD image sensor 32 correspond to imaging means
- the processing of S2170 in the processor 40 corresponds to the image correction means.
- the RAM 43 corresponds to the storage means.
- the target object imaged by the imaging device 201 is not limited to the sheet-shaped document P; it may in some cases be, for example, the smooth surface of a solid block or the surface of an object having a ridge line. That is, the imaging device 201 detects the three-dimensional shape in three-dimensional space from the trajectories of the two rows of slit light, and can exhibit its effect in detecting the three-dimensional shape of the target object in almost all applications.
- the entire shape of the document P is estimated by assuming that the trajectory 171a of the first slit light is the cross-sectional shape of the document P.
- if the target object has a three-dimensional shape that is substantially uniform in the direction perpendicular to the longitudinal direction of the slit light, it is not necessary to consider deviations in the detected shape caused by a peculiar local shape, such as a projection of the target object, included in the position where the slit light happens to be projected, and therefore it is not necessary to pay particular attention to the location where the slit light is projected.
- the slit light projecting unit 120 is configured to emit the first slit light 171 and the second slit light 172 in two rows of slit light.
- the slit light is not limited to two rows, and may be configured to emit three or more rows.
- for example, in addition to the first slit light 171 and the second slit light 172, the slit light projecting unit 120 may be configured to project a similar third slit light onto the document P, above the second slit light 172 on the document P.
- in the above embodiment, a laser diode 21 that emits a red laser beam is used as the light source; however, various light sources capable of emitting a light beam, such as a surface-emitting laser, an LED, or an EL element, can also be used.
- in the above embodiment, a transparent flat plate 124 that reflects a predetermined ratio of the incident power is used as the splitting means; however, a diffraction grating that diffracts a predetermined ratio of the power of the incident laser beam in a predetermined direction provides a function equivalent to that of the transparent flat plate 124. In that case, the first-order laser beam diffracted by the diffraction grating can be used as the second slit light 172, and the zero-order beam transmitted as it is can be used as the first slit light 171.
- the first and second slit lights 171 and 172 can be generated by using a reflection type diffraction grating 128.
- the reflection type diffraction grating 128 can also take the place of the reflection mirror 126 in the above embodiment.
- the first-order laser beam diffracted by the reflection type diffraction grating 128 can be used as the second slit light 172, and the zero-order light reflected as it is can be used as the first slit light 171.
- Such a diffraction grating can change the power distribution ratio between the zero-order light and the high-order light depending on its cross-sectional shape, and is therefore suitable as a dividing means.
- the slit light emitted from the slit light projecting unit 120 is not limited to a thin line sharply narrowed in the direction orthogonal to the longitudinal direction; it may also be a stripe-shaped light pattern having a certain width.
- the optical elements may also be arranged such that the second slit light 172 is emitted in the first direction, that is, toward the lower side as viewed from the imaging device 201, and the first slit light 171 is emitted in the second direction.
- the imaging device 201 is configured to capture an image with slit light and an image without slit light using the imaging lens 31 and the CCD image sensor 32.
- alternatively, an imaging lens and a CCD image sensor dedicated to capturing the image with slit light may be added separately to the imaging device 201.
- with the shared configuration, there is no shift in the imaging range of the image without slit light with respect to the image with slit light, so the accuracy of the detected three-dimensional shape of the target object can be improved; moreover, the imaging device 201 of the present embodiment can be made smaller and less expensive because it has fewer components.
- the pattern light projecting means may have light output means for outputting light, and may form and emit: a first pattern light obtained by converting the light from the light output means into light spread in a plane at a predetermined angular width, in which a part of the light in the predetermined angular width direction is missing because it has been deflected; and a second pattern light formed from that deflected part, deflected with respect to the first pattern light.
- the distance between the position of the pattern light projecting means and the position of the projected image capturing means is fixed, and the angle of the pattern light emitted from the pattern light projecting means is constant (known from the physical configuration). Therefore, based on the image of the target object captured by the projected image capturing means, the three-dimensional shape calculating means obtains the angle, with respect to the optical axis direction of the projected image capturing means, of the line connecting a predetermined point of the pattern light reflection position (the trajectory of the pattern light) on the target object and the projected image capturing means. Using this angle, so-called triangulation is applied to the triangle connecting the predetermined point of the trajectory of the pattern light, the pattern light projecting means, and the projected image capturing means, whereby the three-dimensional spatial position of the predetermined point is determined.
- by determining the three-dimensional spatial position of each point of the trajectory in this way, the position of the pattern light projected on the target object can be determined. The three-dimensional shape of the missing portion of the trajectory of the first pattern light can be inferred from an approximate curve fitted to the portions other than the missing portion.
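- a minimal sketch of interpolating the missing portion of the first pattern light's trajectory from an approximate curve fitted to the surviving portion (the polynomial order and names are assumptions for illustration):

```python
import numpy as np

def fill_missing_trace(xs, zs, missing_xs, degree=3):
    """Estimate the 3-D trace at positions where the first pattern light is
    missing, using a polynomial fitted to the measured (xs, zs) samples."""
    poly = np.poly1d(np.polyfit(xs, zs, degree))
    return poly(missing_xs)
```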
- since the trajectory of the first pattern light and the trajectory of the second pattern light are formed on the same surface of the target object, if the target object is assumed to have a three-dimensional shape that is substantially uniform in the direction orthogonal to the length direction of the pattern light, the three-dimensional shape of the target object can be inferred by extending the shape obtained by the three-dimensional shape calculation means for the trajectory of the first pattern light in the direction of the trajectory of the second pattern light.
- with the three-dimensional shape detection device having the above configuration, it is possible to detect the three-dimensional shape of the target object without touching it. Since the first pattern light and the second pattern light are formed by deflecting a part of one row of pattern light converted from the light beam, the power per angular width of the first pattern light and the second pattern light can be kept the same as when the pattern light (slit light) is in one row, and the trajectories of the slit light can be reliably discriminated even with two rows of slit light, without increasing the total output power as in the conventional case.
- the pattern light projecting means may be configured to emit the first pattern light by reflecting the light beam with a first reflecting surface that reflects light at a predetermined angle, and to emit the second pattern light by reflecting the light beam with a second reflecting surface that reflects light at a predetermined angle with respect to the optical path direction of the first pattern light.
- in this way, the first pattern light and the second pattern light can be formed from the pattern light converted from the light beam by using two mirrors or the like, so a simpler and less expensive configuration can be achieved than when a device for splitting a light beam, such as a prism or a diffraction grating, is used.
- the three-dimensional shape calculation means obtains the three-dimensional shape by interpolating the missing part of the partially missing first pattern light projected onto the target object.
- the interpolation accuracy of the three-dimensional shape is higher when the data of the missing part is interpolated from a trajectory whose both ends are captured in the image. Therefore, in one embodiment of the present invention, the first pattern light emitted from the pattern light projecting means may be configured such that the portion missing due to the deflection does not include an end of the light in the angular width direction.
- in this case, the first pattern light retains the angular width of the pattern light converted from the light beam, without its end portions being cut off and its angular width being narrowed in order to form the second pattern light. For this reason, a first pattern light having the widest possible angular width can be projected onto the target object.
- the trajectory of the pattern light (that is, the reflected light) captured by the projected image capturing means includes, as shown in the figure, components due to light specularly reflected by the target object and components due to light diffused by the target object. The luminance of the captured image is much higher in the trajectory caused by the specular reflection light than in the trajectory caused by the diffused light.
- when the projected image capturing means is arranged, relative to the pattern light projecting means, in the direction perpendicular to the plane of the pattern light, the substantially central portion of the pattern light emitted from the pattern light projecting means is specularly reflected in such a way that it is likely to be incident on the projected image capturing means.
- in view of this, the three-dimensional shape detection device may be configured such that the first pattern light emitted by the pattern light projecting means is formed with the light missing at the substantially central portion in the direction of the predetermined angular width. In this case, the substantially central portion, which is likely to produce specular reflection of the first pattern light, becomes the missing portion, so the influence of the specular reflection light can be reduced.
- the three-dimensional shape detection device is preferably used to detect the three-dimensional shape of a target object having a substantially uniform three-dimensional shape in a direction intersecting the projection direction of the plurality of pattern lights.
- if the shape of the portion where the pattern light is projected is a three-dimensional shape that is substantially uniform in a direction intersecting the direction in which the pattern light is projected, the three-dimensional shape of the whole or a part of the target object can be estimated, for example by assuming that the portions above and below the trajectory have the same shape as the portion corresponding to the trajectory of the pattern light.
- if the target object has a substantially uniform three-dimensional shape in the direction orthogonal to the direction of the pattern light, it is not necessary to take into account deviations in the detected three-dimensional shape caused by a peculiar shape, such as a projection of the target object, included in the position where the pattern light is projected, and it is not necessary to pay attention to the position where the pattern light is projected when detecting the three-dimensional shape.
- the target object having a substantially uniform three-dimensional shape in a direction intersecting the projection direction of the plurality of pattern lights may be a substantially sheet shape.
- the pattern light projecting means has a light output means for outputting a plurality of light beams, and is configured to convert the plurality of light beams into the plurality of pattern lights.
- the plurality of pattern lights may include at least a long pattern light having a predetermined angular width and a short pattern light having an angular width relatively smaller than the long pattern light.
- the distance between the position of the pattern light projecting means and the position of the projected image capturing means is fixed, and the angle of the pattern light emitted from the pattern light projecting means is constant (known from the physical configuration).
- the three-dimensional shape calculation means uses the image of the target object captured by the projected image capturing means to obtain the angle, with respect to the optical axis direction of the projected image capturing means, of the line connecting a predetermined point of the pattern light reflection position (pattern light trajectory) on the target object and the projected image capturing means.
- using this angle, so-called triangulation is applied to the triangle connecting the predetermined point of the trajectory of the pattern light, the pattern light projecting means, and the projected image capturing means, whereby the three-dimensional spatial position of the predetermined point is obtained; by obtaining the three-dimensional spatial position of each point of the trajectory of the pattern light, the position of the pattern light projected on the target object can be obtained.
- the shape of the target object can then be inferred by extending the shape obtained from the trajectory of the long pattern light in the direction of the trajectory of the short pattern light, and the three-dimensional shape of the target object can thus be detected.
- since the pattern light irradiating the target object consists of a long pattern light and a short pattern light, the total length of the pattern light can be shortened compared with the case of projecting two rows of long pattern light, and therefore the total output power of the three-dimensional shape detecting device can be reduced.
- the light output means may be configured to output the light beam serving as the short pattern light with less power than the light beam serving as the long pattern light.
- the power ratio of the light beams output from the light output means may be substantially equal to the ratio of the angular widths of the pattern lights into which the respective light beams are converted by the pattern light projecting means.
- in this case, the reference value used by the three-dimensional shape calculation means to identify the trajectory of the pattern light can be set in the same manner for the long pattern light and the short pattern light, so that the trajectories of the pattern light can be identified accurately and efficiently.
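- in equation form, a substantially equal power per unit angle (and hence a common detection threshold) follows when the split ratio approximately matches the angular-width ratio; here P denotes the powers and Δθ the angular widths, symbols chosen only for this illustration:

```latex
\frac{P_1}{\Delta\theta_1} \approx \frac{P_2}{\Delta\theta_2}
\quad\text{when}\quad
\frac{P_1}{P_2} \approx \frac{\Delta\theta_1}{\Delta\theta_2},
\qquad\text{e.g. } \frac{95\%}{5\%} = 19 \approx \frac{48^\circ}{2.4^\circ} = 20 .
```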
- as methods of converting a light beam into pattern light in the pattern light projecting means, there are, for example, a method in which the light beam is spread evenly over a predetermined surface and passed through an elongated slit of a predetermined length to be converted into slit light, and a method in which the light beam is directed at a rotating polygon mirror and scanned to form slit light.
- the pattern light projecting means may be configured to convert the plurality of light beams into the plurality of pattern lights using cylindrical lenses having focal lengths corresponding to the angular widths of the respective pattern lights.
- with this configuration, the pattern light projecting means can be made compact and simple, with little power loss from the light source; in the above-described method in which the slit light is formed by transmission through a slit, the power loss of the light beam is large because much of the light is blocked by the slit.
- the light output means could be provided with separate light sources so as to output light beams of different powers in accordance with the angular widths of the converted pattern lights; in that case, however, the number and types of light sources increase, and the apparatus becomes large and expensive.
- the light output means includes: a light source that generates one light beam; and a dividing means that divides the light beam from the light source into a plurality of light beams and outputs the plurality of light beams. It may be configured.
- the minimum number of light sources is required, and the three-dimensional shape detection device can be reduced in size and cost.
- since the power of the light beam that becomes the short pattern light is small and the total output power of the light beams from the light output means is kept low, the output required of the light source does not become large, which makes this configuration well suited to using a single light source.
- the splitting means may be configured to split the light beam by a substantially transparent flat plate having at least one surface subjected to a non-reflection treatment.
- the light beam can be divided by the substantially transparent flat plate into a light beam reflected on entering the substantially transparent flat plate and a light beam transmitted through it, and the power distribution between them can be set by the surface reflectance. Since the loss due to reflection when the beam passes through the substantially transparent flat plate is small and the plate is lighter than a prism or the like, the splitting means can be realized simply and inexpensively.
- the three-dimensional shape detecting device is preferably used to detect the three-dimensional shape of a target object having a substantially uniform three-dimensional shape in a direction intersecting the projection direction of the long pattern light. In that case, the portion where the pattern light is projected can be assumed to have a substantially uniform three-dimensional shape in the direction intersecting the projection direction, and by assuming that the portions above and below the trajectory of the pattern light have a three-dimensional shape similar to that corresponding to the trajectory, the shape of the whole or a part of the target object can be estimated.
- if the target object has a three-dimensional shape that is substantially uniform in a direction intersecting the pattern light projection direction, it is not necessary to consider deviations in the detected three-dimensional shape caused by a peculiar shape, such as a projection of the target object, included in the position where the pattern light is projected, and it is not necessary to pay attention to where the pattern light is projected when detecting the three-dimensional shape.
- the target object having a substantially uniform three-dimensional shape in a direction intersecting the projection direction of the long pattern light may be substantially sheet-shaped.
- the pattern light projecting means may include light output means for outputting light, and may form and emit a first pattern light obtained by converting the light from the light output means into light spread in a plane at a predetermined angular width, and a second pattern light deflected with respect to the first pattern light.
- the pattern light projecting means may include light output means for outputting a plurality of light beams, and may be configured to convert the plurality of light beams into the plurality of pattern lights.
- the plurality of pattern lights may include at least a long pattern light having a predetermined angular width and a short pattern light having an angular width relatively smaller than the long pattern light.
- the imaging unit, the storage unit, the three-dimensional shape acquisition unit, and the image correction unit may be configured to be built into the main body case of the imaging device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/296,347 US7440119B2 (en) | 2003-07-23 | 2005-12-08 | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting method |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2003-278411 | 2003-07-23 | ||
| JP2003278410A JP4360145B2 (ja) | 2003-07-23 | 2003-07-23 | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 |
| JP2003278411A JP4608855B2 (ja) | 2003-07-23 | 2003-07-23 | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 |
| JP2003-278410 | 2003-07-23 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/296,347 Continuation-In-Part US7440119B2 (en) | 2003-07-23 | 2005-12-08 | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2005008174A1 true WO2005008174A1 (ja) | 2005-01-27 |
Family
ID=34082389
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2004/010298 Ceased WO2005008174A1 (ja) | 2003-07-23 | 2004-07-20 | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US7440119B2 (ja) |
| WO (1) | WO2005008174A1 (ja) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5092738B2 (ja) * | 2007-12-26 | 2012-12-05 | ソニー株式会社 | 画像処理装置及び方法、並びにプログラム |
| DE102008010965B4 (de) * | 2008-02-25 | 2016-03-31 | Mathias Reiter | Roboter-Bahnführung |
| US8432395B2 (en) * | 2009-06-16 | 2013-04-30 | Apple Inc. | Method and apparatus for surface contour mapping |
| JP2014044060A (ja) * | 2012-08-24 | 2014-03-13 | Canon Inc | 形状測定装置、および形状測定方法 |
| KR102025716B1 (ko) * | 2013-03-21 | 2019-09-26 | 삼성전자주식회사 | 3차원 형상 측정장치 |
| US11169268B1 (en) | 2017-12-12 | 2021-11-09 | Philip Raymond Schaefer | System and method for measuring the position of a moving object |
| GB201721451D0 (en) * | 2017-12-20 | 2018-01-31 | Univ Manchester | Apparatus and method for determining spectral information |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001309132A (ja) * | 2000-04-26 | 2001-11-02 | Minolta Co Ltd | 読み取り装置 |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE4130237A1 (de) * | 1991-09-11 | 1993-03-18 | Zeiss Carl Fa | Verfahren und vorrichtung zur dreidimensionalen optischen vermessung von objektoberflaechen |
| US5636025A (en) * | 1992-04-23 | 1997-06-03 | Medar, Inc. | System for optically measuring the surface contour of a part using more fringe techniques |
| US5668631A (en) * | 1993-12-20 | 1997-09-16 | Minolta Co., Ltd. | Measuring system with improved method of reading image data of an object |
| JPH07318315A (ja) | 1994-05-27 | 1995-12-08 | Minolta Co Ltd | 測定器の姿勢検出装置 |
| JPH0827176A (ja) | 1994-07-15 | 1996-01-30 | Taiyo Kagaku Co Ltd | シアリルリン脂質の製造方法 |
| US5825495A (en) * | 1995-02-27 | 1998-10-20 | Lockheed Martin Corporation | Bright field illumination system |
| JP2000076460A (ja) * | 1998-06-18 | 2000-03-14 | Minolta Co Ltd | モニタ表示装置 |
| US6291817B1 (en) * | 1998-06-23 | 2001-09-18 | Fuji Photo Optical Co., Ltd. | Moire apparatus having projection optical system and observation optical system which have optical axes parallel to each other |
| US6553138B2 (en) * | 1998-12-30 | 2003-04-22 | New York University | Method and apparatus for generating three-dimensional representations of objects |
| JP4298155B2 (ja) * | 2000-11-17 | 2009-07-15 | 本田技研工業株式会社 | 距離測定装置、及び距離測定方法 |
| KR100389017B1 (ko) * | 2000-11-22 | 2003-06-25 | (주) 인텍플러스 | 모아레무늬 발생기를 적용한 위상천이 영사식 모아레방법및 장치 |
| JP2002296020A (ja) * | 2001-03-30 | 2002-10-09 | Nidek Co Ltd | 表面形状測定装置 |
| JP3884321B2 (ja) * | 2001-06-26 | 2007-02-21 | オリンパス株式会社 | 3次元情報取得装置、3次元情報取得における投影パターン、及び、3次元情報取得方法 |
| US7061628B2 (en) * | 2001-06-27 | 2006-06-13 | Southwest Research Institute | Non-contact apparatus and method for measuring surface profile |
- 2004-07-20: WO PCT/JP2004/010298 patent/WO2005008174A1/ja not_active Ceased
- 2005-12-08: US US11/296,347 patent/US7440119B2/en not_active Expired - Fee Related
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001309132A (ja) * | 2000-04-26 | 2001-11-02 | Minolta Co Ltd | 読み取り装置 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103292748A (zh) * | 2013-01-05 | 2013-09-11 | 中国航空工业集团公司西安飞机设计研究所 | 一种基于激光测量的多基板拼合检测方法 |
| CN103292748B (zh) * | 2013-01-05 | 2015-12-02 | 中国航空工业集团公司西安飞机设计研究所 | 一种基于激光测量的多基板拼合检测方法 |
| CN106651959A (zh) * | 2016-11-15 | 2017-05-10 | 东南大学 | 一种光场相机微透镜阵列几何参数的标定方法 |
| CN106651959B (zh) * | 2016-11-15 | 2019-05-31 | 东南大学 | 一种光场相机微透镜阵列几何参数的标定方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| US7440119B2 (en) | 2008-10-21 |
| US20060152738A1 (en) | 2006-07-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7627196B2 (en) | Image processing device and image capturing device | |
| US9826216B1 (en) | Systems and methods for compact space-time stereo three-dimensional depth sensing | |
| US7161682B2 (en) | Method and device for optical navigation | |
| JP3877058B2 (ja) | 小型装置およびその製造方法 | |
| JPH1065882A (ja) | 媒体表面形状データ取得方法 | |
| JP4379056B2 (ja) | 三次元画像撮像装置および方法 | |
| JPH1183459A (ja) | 凹凸面情報検出装置 | |
| WO2005008174A1 (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 | |
| EP3353489B1 (en) | Method and apparatus for measuring the height of a surface | |
| JP4360145B2 (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 | |
| JP3991501B2 (ja) | 3次元入力装置 | |
| TWI258706B (en) | Method and device for optical navigation | |
| CN208921064U (zh) | 一种激光相机及其光学成像系统 | |
| JP4608855B2 (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出方法 | |
| US7391522B2 (en) | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program | |
| US7365301B2 (en) | Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting program | |
| JP2005148813A5 (ja) | ||
| US7372580B2 (en) | Three-dimensional shape detecting device, three-dimensional shape detecting system, and three-dimensional shape detecting program | |
| JP3360505B2 (ja) | 3次元計測方法及び装置 | |
| KR101332024B1 (ko) | 반사 선택적 집적 영상 소자, 이를 이용한 영상 디스플레이 장치 및 그 방법 | |
| JP2005189021A (ja) | 撮像装置 | |
| JP2005128006A (ja) | 3次元形状検出装置、撮像装置、及び、3次元形状検出プログラム | |
| WO2005080915A1 (ja) | 3次元形状検出装置および撮像装置 | |
| JP2005092629A (ja) | 画像処理装置、及び、撮像装置 | |
| JP2005092629A5 (ja) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 11296347 Country of ref document: US |
|
| WWP | Wipo information: published in national office |
Ref document number: 11296347 Country of ref document: US |
|
| 122 | Ep: pct application non-entry in european phase |