WO2016002510A1 - Image processing device and method - Google Patents
- Publication number: WO2016002510A1
- Application number: PCT/JP2015/067422
- Authority: WIPO (PCT)
- Prior art keywords: image, pattern, projection, unit, patterns
- Prior art date
- Legal status: Ceased (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- the present technology relates to an image processing apparatus and method, and more particularly, to an image processing apparatus and method that can more easily determine a relative posture between a projection unit and an imaging unit.
- As a method for obtaining this pixel-to-pixel correspondence, a method using a Gray code has been considered.
- In this method, however, a large number of pattern images must be used to identify each pixel of the projector, and the process of grasping the pixel correspondence may become complicated.
- A method has therefore been considered that detects which pixel of the projector each corner of a checker pattern corresponds to (see, for example, Non-Patent Document 1).
- the present technology has been proposed in view of such a situation, and an object thereof is to make it possible to more easily determine the relative posture between the projection unit and the imaging unit.
- One aspect of the present technology is an image processing apparatus including a detection unit that detects a correspondence between a projected image and a captured image of the projected image, using a pattern captured image, which is a captured image of a pattern projection image, that is, a projection image of an image including a pattern.
- The image may include a plurality of patterns, each pattern being composed of a plurality of sub-patterns, with the number of sub-patterns or the positional relationship between the sub-patterns differing from that of the other patterns.
- The pattern may have a start code, in which the sub-patterns are arranged with a number and positional relationship common to every pattern, and a unique code, in which the number or positional relationship of the sub-patterns differs from that of the other patterns.
- The start code may include a plurality of sub-patterns arranged as a figure, and the unique code may be arranged as a figure differently from the other patterns.
- The start code may be composed of three sub-patterns arranged in one row and three columns, and the unique code may be composed of one or more sub-patterns arranged within two rows and three columns so that their number or positional relationship differs from that of the other patterns.
- Each sub-pattern may be arranged as a dot-shaped figure.
- The detection unit can detect the correspondence by detecting corresponding points between the projected image and the captured image of the projected image, using the number and positional relationship of the sub-patterns of each pattern included in the pattern captured image.
- the detection unit can detect the corresponding points by analyzing the number and the positional relationship of the sub-patterns for each pattern using the adjacent relationship between the patterns.
- the detection unit can detect the adjacent relationship between the sub-patterns using an edge image indicating the edge of the pattern in a captured image of a projected image of a predetermined pattern image.
- The image processing apparatus may further include an edge detection unit that detects the edges of the pattern in a captured image of a projection image of the predetermined pattern image and generates the edge image; the detection unit can use the edge image generated by the edge detection unit to detect the adjacency between the sub-patterns.
- the detection unit can detect the adjacent relationship between the sub-patterns using the edge image in which the edge is expanded.
- the detection unit can detect the adjacent relationship between the sub-patterns using the center of gravity of the sub-patterns for each pattern included in the pattern-captured image.
- the detection unit can detect the center of gravity of the sub-pattern using an image obtained by binarizing the pattern captured image.
- The detection unit can detect corresponding points between the projected image and the captured image using corresponding points between a homography image, generated based on the patterns included in the pattern captured image, and the captured image of the projected image.
- the image having the predetermined pattern may be a check pattern image.
- The detection unit can detect corresponding points between the homography image and the captured image for all corners of the predetermined pattern in the captured image, and can then detect corresponding points between the projected image and the captured image for all of the detected corners of the predetermined pattern.
- the detection unit can detect the corner using an image obtained by binarizing the captured image and expanding a predetermined component.
- The image processing apparatus may further include an imaging unit that captures the projected image to obtain the captured image; the detection unit can detect the correspondence by detecting corresponding points between the projected image and the captured image obtained by that imaging unit, using the pattern captured image captured by the imaging unit.
- The image processing apparatus may likewise further include a projection unit that projects the image; the correspondence can then be detected by detecting corresponding points between the image projected by that projection unit and its captured image.
- An image processing unit may further be provided that performs image processing on a portion of an image to be projected that overlaps another projection image, using the correspondence between the projection image and the captured image detected by the detection unit.
- One aspect of the present technology is also an image processing method that detects a correspondence between a projected image and a captured image of the projected image, using a pattern captured image that is a captured image of a pattern projection image, that is, a projection image of an image including a pattern.
- In one aspect of the present technology, a pattern captured image, which is a captured image of a pattern projection image obtained by projecting an image including patterns, is used to detect the correspondence between the projected image and the captured image.
- According to the present technology, information can be processed based on images. Moreover, according to the present technology, the relative posture between the projection unit and the imaging unit can be determined more easily.
- <First Embodiment> <Overlap region and correction> Conventionally, there is a method of projecting one image using a plurality of projectors. In this way, for example, the size of the projected image can be increased without reducing the resolution, or an image can be projected onto a curved surface with less distortion.
- projection is performed such that some or all of the projection images overlap each other. That is, in this case, a portion where a plurality of projection images overlap (also referred to as an overlap region) occurs.
- For example, when the projection unit 11 of the projection imaging apparatus 10 projects an image onto the screen 30, the image is projected in the range from P0L to P0R of the screen 30.
- Similarly, when the projection unit 21 of the projection imaging apparatus 20 projects an image toward the screen 30, the image is projected in the range from P1L to P1R of the screen 30.
- In this case, the range from P1L to P0R of the screen 30 is the overlap region.
- In such an overlap region, the brightness may differ from other regions, or the overlapping projection images may be shifted relative to each other, so level correction, distortion correction, and the like are required.
- For example, suppose the projection unit 11 and the projection unit 21 each project an image with uniform brightness over the entire projection range.
- In that case, the image is projected from both the projection unit 11 and the projection unit 21 in the overlap region, which therefore becomes brighter than the other ranges (the range from P0L to P1L and the range from P0R to P1R).
- That is, only the overlap region is brightened, and the projected image may look unnatural (the image quality of the projected image deteriorates). In such a case, level correction is therefore necessary to suppress this reduction in image quality.
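- As a concrete illustration of such level correction, the following is a minimal sketch of one common approach, a linear cross-fade across the overlap region, in Python with NumPy. The patent does not prescribe this particular weighting; the ramp shape and all names here are assumptions for illustration only.

```python
import numpy as np

def crossfade_weights(width, overlap_start, overlap_end):
    """Per-column weights for one projector so that two overlapping
    projections sum to uniform brightness.

    width         -- projector image width in pixels
    overlap_start -- first column of the overlap region
    overlap_end   -- column just past the end of the overlap region
    """
    w = np.ones(width)
    n = overlap_end - overlap_start
    # Ramp from 1 down to 0 across the overlap; the partner projector
    # uses the mirrored ramp, so the summed brightness stays constant.
    w[overlap_start:overlap_end] = np.linspace(1.0, 0.0, n)
    return w

# Example: a 1920-wide image whose rightmost 300 columns overlap
# the neighboring projection.
weights = crossfade_weights(1920, 1920 - 300, 1920)
# corrected = image * weights[np.newaxis, :, np.newaxis]
```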
- the projection imaging apparatus 10 has a projection function (projection unit 11) that projects an image and an imaging function (imaging unit 12) that captures a subject and obtains a captured image.
- the projection imaging device 20 has a projection function (projection unit 21) that projects an image and an imaging function (imaging unit 22) that captures a subject and obtains a captured image.
- the imaging unit 12 images a range from C0L to C0R of the screen 30, and the imaging unit 22 images a range from C1L to C1R of the screen 30. That is, the imaging unit 12 can capture the range from P1L to C0R of the projection image projected by the projection unit 21. Further, the imaging unit 22 can capture a range from C1L to P0R of the projection image projected by the projection unit 11.
- Assuming that the positional relationship between the projection unit and the imaging unit within each projection imaging apparatus is known, the overlap region can be detected if the relative posture between the imaging unit 12 and the projection unit 21 (double arrow 41) and the relative posture between the imaging unit 22 and the projection unit 11 (double arrow 42) can be grasped.
- As a method for obtaining the pixel correspondence between the projection unit (projector) and the imaging unit (camera), a method using a Gray code has been considered.
- In this method, for example, predetermined pattern images as shown in FIG. 3A are projected from the projector while being switched in time series, and each pattern is captured by the camera.
- At each camera pixel, 1 (white) or 0 (black) is detected for each captured pattern, and as shown in FIG. 3B, the position of the corresponding projector pixel is obtained by decoding this bit sequence. The pixel correspondence can thereby be acquired.
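- As an illustration of this decoding step, the following Python/NumPy sketch converts the per-pixel bit sequence read from the captured Gray-code patterns into a projector coordinate. It assumes the captures have already been binarized to 0 and 1; the function name and array shapes are illustrative assumptions, not part of the patent.

```python
import numpy as np

def decode_gray(bits):
    """Decode a stack of binarized captures of Gray-code patterns.

    bits -- array of shape (n_patterns, H, W) with values 0 or 1,
            where bits[k] is the camera's view of the k-th pattern
            (most significant bit first).
    Returns, per camera pixel, the decoded projector coordinate.
    """
    gray = bits.astype(np.uint32)
    binary = gray[0].copy()        # b_0 = g_0
    value = binary.copy()
    for k in range(1, len(gray)):
        binary ^= gray[k]          # Gray -> binary: b_k = b_{k-1} XOR g_k
        value = (value << 1) | binary
    return value                   # shape (H, W)
```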
- In Non-Patent Document 1, as shown in FIG. 4, a checker pattern and four patterns each lacking one of the four corners of the checker are projected, and by matching the projection order of the patterns with the information on the missing corners, it is detected which pixel of the projector each corner of the checker corresponds to.
- With this method, the number of pattern images to be projected and imaged can be greatly reduced compared with the Gray-code method described above.
- However, five pattern images must still be projected and imaged, which is not the smallest possible number.
- In addition, at least one missing corner must be visible from the camera.
- Furthermore, the robustness of feature point detection against the background color of the screen and external light conditions is insufficient.
- In the present technology, therefore, the correspondence between the projection image and the captured image of the projection image is detected using a pattern captured image, that is, a captured image of a pattern projection image obtained by projecting an image including patterns.
- the corresponding point can be detected without increasing the number of pattern images. That is, the relative posture between the projection unit and the imaging unit can be obtained more easily.
- In the image processing apparatus, the image including the patterns may include a plurality of patterns, each pattern being composed of a plurality of sub-patterns, with the number of sub-patterns or the positional relationship between the sub-patterns differing from that of the other patterns.
- The pattern may have a start code, in which the sub-patterns are arranged with a number and positional relationship common to every pattern, and a unique code, in which the number or positional relationship of the sub-patterns differs from that of the other patterns.
- The start code may be composed of a plurality of sub-patterns arranged as a figure, and the unique code may be arranged as a figure differently from the other patterns. Furthermore, the start code may consist of three sub-patterns arranged in one row and three columns, and the unique code may consist of one or more sub-patterns arranged within two rows and three columns so that their number or positional relationship differs from that of the other patterns.
- the sub-pattern may be arranged as a dot-like figure.
- The detection unit may detect the correspondence by detecting corresponding points between the projection image and the captured image of the projection image, using the number and positional relationship of the sub-patterns of each pattern included in the pattern captured image.
- the detection unit may detect the corresponding points by analyzing the number and the positional relationship of the sub-patterns using the adjacent relationship between the patterns for each pattern.
- the detection unit may detect the adjacent relationship between the sub-patterns by using an edge image indicating the edge of the pattern in the captured image of the projection image of the image of the predetermined pattern.
- An edge detection unit that detects the edges of the pattern in a captured image of a projection image of the predetermined pattern image and generates the edge image may further be provided, and the detection unit may use the edge image generated by the edge detection unit to detect the adjacency between the sub-patterns.
- the detection unit may detect the adjacent relationship between the sub-patterns using the edge image in which the edge is expanded.
- the detection unit may detect the adjacent relationship between the sub-patterns using the center of gravity of the sub-patterns for each pattern included in the pattern captured image.
- the detection unit may detect the center of gravity of the sub-pattern using an image obtained by binarizing the pattern captured image.
- The detection unit may detect corresponding points between the projection image and the captured image using corresponding points between a homography image, generated based on the patterns included in the pattern captured image, and the captured image of the projection image.
- the image of the predetermined pattern may be a check pattern image.
- The detection unit may detect corresponding points between the homography image and the captured image for all corners of the predetermined pattern in the captured image, and may then detect corresponding points between the projection image and the captured image for all of the detected corners of the predetermined pattern.
- the detection unit may detect a corner using an image obtained by binarizing the captured image and expanding a predetermined component.
- The image processing apparatus may further include an imaging unit that captures the projected image to obtain the captured image, and the detection unit may detect the correspondence by detecting corresponding points between the projected image and the captured image obtained by that imaging unit, using the pattern captured image obtained by capturing the pattern projection image with the imaging unit.
- The image processing apparatus may further include a projection unit that projects an image, and the detection unit may detect the correspondence by detecting corresponding points between the image projected by that projection unit and the captured image of the projection image, using a pattern captured image that is a captured image of the pattern projection image projected by the projection unit.
- An image processing unit that performs image processing on a portion of an image to be projected that overlaps with another projection image using a correspondence relationship between the projection image detected by the detection unit and the captured image may be further provided.
- FIG. 5 shows a main configuration example of a projection imaging system that uses a control apparatus which is an embodiment of an image processing apparatus to which the present technology is applied.
- a projection imaging system 100 shown in FIG. 5 is a system that projects an image.
- the projection imaging system 100 can project one image using a plurality of projection devices (projection imaging devices) as described above, for example.
- the projection imaging system 100 includes a control device 101, a projection imaging device 102-1, a projection imaging device 102-2, and a network 103.
- the control device 101 is connected to the projection imaging device 102-1 and the projection imaging device 102-2 via the network 103, communicates with them, and controls their operations.
- the control device 101 causes the projection imaging device 102-1 and the projection imaging device 102-2 to project an image or cause a projection image to be captured.
- the control device 101 can control the projection imaging device 102-1 and the projection imaging device 102-2 to project one image.
- For example, the control device 101 performs processing related to estimating the relative posture (rotation component, translation component, and so on) between the projection unit and the imaging unit across the projection imaging device 102-1 and the projection imaging device 102-2. Further, using the estimation result, the control device 101 can also perform image processing such as level correction and distortion correction on the overlap region between the projection image of the projection imaging device 102-1 and the projection image of the projection imaging device 102-2.
- the projection imaging apparatus 102-1 has a projection function and can project an image on the screen 104.
- the projection imaging apparatus 102-1 also has an imaging function and can capture a projection image projected on the screen 104.
- the projection imaging apparatus 102-2 is the same apparatus as the projection imaging apparatus 102-1, has the same configuration, and has the same function.
- the projection imaging apparatus 102-1 and the projection imaging apparatus 102-2 are referred to as the projection imaging apparatus 102 when there is no need to distinguish between them.
- the projection imaging apparatus 102-1 and the projection imaging apparatus 102-2 are connected to each other via the network 103, and can exchange information (can perform communication).
- Further, the projection imaging apparatus 102-1 and the projection imaging apparatus 102-2 are connected to the control device 101 via the network 103, and each can exchange information with (communicate with) the control device 101.
- Under the control of the control device 101, the projection imaging device 102-1 and the projection imaging device 102-2 can each project an image onto the screen 104 so as to jointly display one image on the screen 104.
- the projection imaging apparatus 102-1 can project an image on the screen 104, and the projection imaging apparatus 102-2 can capture the projection image to obtain a captured image.
- the control device 101 can control the projection imaging device 102-2 to project an image on the screen 104, and the projection imaging device 102-1 can capture the projection image to obtain a captured image.
- the network 103 is a communication network serving as a communication medium between the control device 101 and the projection imaging device 102.
- the network 103 may be any communication network, a wired communication network, a wireless communication network, or both of them.
- For example, the network 103 may be a wired LAN, a wireless LAN, a public telephone network, a wide-area communication network for wireless mobile devices such as so-called 3G or 4G lines, the Internet, or a combination of these.
- the network 103 may be a single communication network or a plurality of communication networks.
- The network 103 may also be partially or entirely configured by communication cables of a predetermined standard, such as USB (Universal Serial Bus) cables or HDMI (High-Definition Multimedia Interface) cables.
- The screen 104 is an object onto which the projection imaging apparatus 102 projects an image.
- the surface on which the image of the screen 104 is projected may be flat, curved, or uneven.
- In the above description, the projection imaging apparatus 102 projects an image onto the screen 104, but the screen 104 is merely an example of a surface onto which the projection imaging apparatus 102 projects an image.
- The projection imaging apparatus 102 can project an image onto any target capable of receiving a projection, such as a wall surface, a building, a floor, a ceiling, a calendar, tableware, a doll, or stationery.
- FIG. 6A shows an example of the appearance of the projection imaging apparatus 102.
- The projection imaging apparatus 102 has a projection function and an imaging function as described above, and its casing is provided with optical devices such as a projection port (lens mechanism) for projecting images and a camera (lens mechanism) for imaging subjects.
- the projection imaging device 102 may be any size device, but may be a portable (small) device, for example. In that case, as shown in FIG. 6A, a battery may be provided in the housing of the projection imaging apparatus 102 to improve portability.
- A plurality of projection imaging apparatuses 102 can cooperate to project a single image.
- FIG. 6B shows an example of a state in which one image is projected on the screen 104 using the four projection imaging devices 102 (projection imaging device 102-1 to projection imaging device 102-4).
- In this case, one image area 111 is formed by the projection images projected from the projection imaging apparatuses 102-1 to 102-4, and a single image is displayed in the image area 111.
- each projection image and this one image may be a still image or a moving image.
- the projection imaging devices 102 are arranged so that at least a part of the projection images overlap each other.
- For example, as shown in FIG. 7, the projection image 112-1 of the projection imaging apparatus 102-1 and the projection image 112-2 of the projection imaging apparatus 102-2 partially overlap each other (the hatched portion in FIG. 7).
- The area indicated by the hatching is the overlap region 113.
- In the overlap region 113, the brightness may differ from other areas, or the overlapping projection images may be misaligned; image processing such as level correction and distortion correction is therefore performed to suppress such deterioration in image quality.
- FIG. 8 is a diagram illustrating a main configuration example of the control device 101 which is an embodiment of an image processing device to which the present technology is applied.
- In the control device 101, a CPU (Central Processing Unit) 151, a ROM (Read Only Memory) 152, and a RAM (Random Access Memory) 153 are connected to one another via a bus 154.
- the input / output interface 160 is also connected to the bus 154.
- An input unit 161, an output unit 162, a storage unit 163, a communication unit 164, and a drive 165 are connected to the input / output interface 160.
- the input unit 161 includes an input device that accepts external information such as user input.
- the input unit 161 includes operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like.
- Various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor may be included in the input unit 161.
- the output unit 162 includes an output device that outputs information such as images and sounds.
- the output unit 162 includes a display, a speaker, an output terminal, and the like.
- the storage unit 163 includes, for example, a hard disk, a RAM disk, and a nonvolatile memory.
- the communication unit 164 includes a network interface, for example.
- the communication unit 164 is connected to the network 103 and communicates with other devices connected via the network 103.
- the drive 165 drives a removable medium 171 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- the CPU 151 performs various processes by, for example, loading a program stored in the storage unit 163 into the RAM 153 via the input / output interface 160 and the bus 154 and executing the program.
- the RAM 153 also appropriately stores data necessary for the CPU 151 to execute various processes.
- the program executed by the CPU 151 can be recorded on the removable medium 171 as a package medium or the like and provided to the control device 101, for example.
- the program can be installed in the storage unit 163 via the input / output interface 160 by attaching the removable medium 171 to the drive 165.
- this program can be provided to the control apparatus 101 via a wired or wireless transmission medium such as a LAN, the Internet, or digital satellite broadcasting.
- the program can be received by the communication unit 164 via a wired or wireless transmission medium and installed in the storage unit 163.
- this program can be installed in the ROM 152 or the storage unit 163 in advance.
- FIG. 9 is a functional block diagram illustrating an example of main functions realized by the CPU 151.
- the CPU 151 includes functional blocks such as a correspondence relationship detection unit 201, a projection control unit 202, and an imaging control unit 203.
- the correspondence relationship detection unit 201 performs processing related to the detection of the correspondence relationship between the projection unit and the imaging unit between the projection imaging devices 102.
- the projection control unit 202 performs processing related to the control of the projection function of the projection imaging apparatus 102.
- the imaging control unit 203 performs processing related to control of the imaging function of the projection imaging apparatus 102.
- the correspondence relationship detection unit 201 includes a pattern image projection imaging processing unit 211, a corresponding point detection processing unit 212, and a projection image processing unit 213.
- the pattern image projection imaging processing unit 211 performs a process related to pattern image projection and imaging to detect the correspondence.
- the pattern image projection imaging processing unit 211 can project the predetermined pattern image on the screen 104 by controlling the projection imaging apparatus 102 via the projection control unit 202.
- Further, the pattern image projection imaging processing unit 211 controls the projection imaging apparatus 102 via the imaging control unit 203 to capture the projection image of the predetermined pattern image, and supplies the captured image to the corresponding point detection processing unit 212.
- the corresponding point detection processing unit 212 uses the captured image obtained via the imaging control unit 203 to perform processing related to detection of the corresponding point between the projection image and the captured image.
- the projection image processing unit 213 performs image processing such as level correction and distortion correction on the projection image based on the detected corresponding points.
- the projection image processing unit 213 can also control the projection imaging apparatus 102 via the projection control unit 202 and project the image subjected to the image processing.
- FIG. 10 is a functional block diagram illustrating a main configuration example of the corresponding point detection processing unit 212.
- the corresponding point detection processing unit 212 includes a corner detection processing unit 221, an edge detection processing unit 222, a centroid detection processing unit 223, and an inter-image corresponding point detection processing unit 224.
- the corner detection processing unit 221 performs processing related to corner detection.
- the corner detection processing unit 221 includes a captured image noise reduction unit 231, a difference image generation unit 232, a difference image binarization unit 233, a binary image expansion unit 234, and a binarized expansion image corner detection unit 235.
- the captured image noise reduction unit 231 performs image processing for reducing noise on a captured image of a projected image having a predetermined pattern supplied from the projection imaging apparatus 102.
- the difference image generation unit 232 generates a difference image between the captured images with reduced noise.
- the difference image binarization unit 233 binarizes the generated difference image.
- The binarized image expansion unit 234 dilates (expands the area of) a predetermined component (for example, the white portion) of the binarized image, that is, the binarized difference image.
- the binarized expanded image corner detection unit 235 performs processing related to detection of a corner portion of a predetermined symbol included in the binarized expanded image that is a binarized image in which a predetermined component is expanded.
- the edge detection processing unit 222 performs processing related to edge detection.
- the edge detection processing unit 222 includes a binarized expanded image edge detection unit 241, an edge detection image generation unit 242, and an edge detection image expansion unit 243.
- the binarized dilated image edge detection unit 241 detects an edge portion of a predetermined symbol included in the binarized dilated image.
- the edge detection image generation unit 242 generates an edge detection image indicating the edge portion based on the edge detection result by the binarized expanded image edge detection unit 241.
- the edge detection image expansion unit 243 expands (enlarges the area) each edge of the generated edge detection image.
- the center-of-gravity detection processing unit 223 performs processing for detecting the center of gravity of each sub-pattern constituting the unique pattern. Details of the unique pattern will be described later.
- the centroid detection processing unit 223 includes a unique pattern captured image noise detection unit 251, a difference image generation unit 252, a difference image binarization unit 253, and a binarized image centroid detection unit 254.
- The unique pattern captured image noise detection unit 251 performs image processing to reduce noise in a unique pattern captured image, that is, a captured image of a unique pattern projection image obtained by projecting a unique pattern image including a plurality of patterns whose sub-patterns differ from those of the others.
- the difference image generation unit 252 generates a difference image between the unique pattern captured image and a captured image obtained by capturing a projection image of another pattern.
- the difference image binarization unit 253 binarizes the generated difference image.
- the binarized image centroid detection unit 254 performs processing relating to the detection of the centroid of each sub-pattern of the unique pattern included in the binarized image that is a binarized difference image.
- the inter-image corresponding point detection processing unit 224 performs processing related to detection of corresponding points between images.
- The inter-image corresponding point detection processing unit 224 includes a unique pattern adjacency relationship detection unit 261, a unique pattern decoding unit 262, a homography image generation unit 263, a homography-image/captured-image corresponding point detection unit 264, and a projection-image/captured-image corresponding point detection unit 265.
- the unique pattern adjacency relationship detection unit 261 performs processing related to detection of adjacency relationships between sub-patterns forming a unique pattern.
- the unique pattern decoding unit 262 performs processing for analyzing the unique pattern based on the adjacent relationship between the detected sub-patterns.
- the homography image generation unit 263 performs processing related to generation of an image (homography image) obtained by projective conversion of a projection image using a unique pattern.
- The homography-image/captured-image corresponding point detection unit 264 performs processing for obtaining the correspondence between the homography image and the captured image based on the unique pattern and the like.
- The projection-image/captured-image corresponding point detection unit 265 performs processing for obtaining the correspondence between the projection image and the captured image, using the correspondence between the homography image and the captured image.
- FIG. 11 is a block diagram illustrating a main configuration example of the projection imaging apparatus 102.
- the projection imaging apparatus 102 includes a control unit 301, a projection unit 302, an imaging unit 303, an input unit 311, an output unit 312, a storage unit 313, a communication unit 314, and a drive 315.
- the control unit 301 includes, for example, a CPU, a ROM, a RAM, and the like, and controls each processing unit in the apparatus and executes various processes necessary for the control such as image processing.
- the projection unit 302 is controlled by the control unit 301 to perform processing related to image projection.
- the projection unit 302 projects the image supplied from the control unit 301 to the outside of the projection imaging apparatus 102 (for example, the screen 104). That is, the projection unit 302 realizes a projection function.
- the projection unit 302 projects an image by using laser light as a light source and scanning the laser light using a MEMS mirror.
- the light source of the projection unit 302 is arbitrary, and is not limited to laser light, and may be an LED, xenon, or the like. Details of the projection unit 302 will be described later.
- The imaging unit 303 is controlled by the control unit 301 to capture a subject outside the apparatus, generate a captured image, and supply the captured image to the control unit 301. That is, the imaging unit 303 realizes an imaging function. For example, the imaging unit 303 captures a projection image projected on the screen 104 by the projection unit 302.
- the input unit 311 includes an input device that accepts external information such as user input.
- the input unit 311 includes operation buttons, a touch panel, a camera, a microphone, an input terminal, and the like.
- Various sensors such as an acceleration sensor, an optical sensor, and a temperature sensor may be included in the input unit 311.
- the output unit 312 includes an output device that outputs information such as images and sounds.
- the output unit 312 includes a display, a speaker, an output terminal, and the like.
- the storage unit 313 includes, for example, a hard disk, a RAM disk, and a nonvolatile memory.
- the communication unit 314 includes a network interface, for example.
- the communication unit 314 is connected to the network 103 and communicates with other devices connected via the network 103.
- the drive 315 drives a removable medium 321 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- the control unit 301 performs various processes by, for example, loading a program stored in the storage unit 313 into a RAM built therein and executing the program.
- the RAM also appropriately stores data necessary for the control unit 301 to execute various processes.
- the program executed by the control unit 301 can be recorded on a removable medium 321 as a package medium or the like and provided to the projection imaging apparatus 102, for example.
- the program can be installed in the storage unit 313 by attaching the removable medium 321 to the drive 315.
- This program can also be provided to the projection imaging apparatus 102 via a wired or wireless transmission medium such as a LAN, the Internet, or digital satellite broadcasting.
- the program can be received by the communication unit 314 via a wired or wireless transmission medium and installed in the storage unit 313.
- this program can be installed in advance in the ROM or the storage unit 313 built in the control unit 301.
- Both the projection imaging device 102-1 and the projection imaging device 102-2 have such a configuration.
- FIG. 12 is a block diagram illustrating a main configuration example of the projection unit 302.
- As shown in FIG. 12, the projection unit 302 includes a video processor 351, a laser driver 352, laser output units 353-1 to 353-3, mirrors 354-1 to 354-3, a MEMS (Micro Electro Mechanical Systems) driver 355, and a MEMS mirror 356.
- the video processor 351 holds an image supplied from the control unit 301 or performs necessary image processing on the image.
- the video processor 351 supplies the projected image to the laser driver 352 and the MEMS driver 355.
- the laser driver 352 controls the laser output unit 353-1 to the laser output unit 353-3 so as to project the image supplied from the video processor 351.
- the laser output units 353-1 to 353-3 output laser beams having different colors (wavelength ranges) such as red, blue, and green. That is, the laser driver 352 controls the laser output of each color so that the image supplied from the video processor 351 is projected.
- the laser output unit 353-1 to the laser output unit 353-3 are referred to as a laser output unit 353 when it is not necessary to distinguish between them.
- the mirror 354-1 reflects the laser beam output from the laser output unit 353-1 and guides it to the MEMS mirror 356.
- the mirror 354-2 reflects the laser beam output from the laser output unit 353-2 and guides it to the MEMS mirror 356.
- the mirror 354-3 reflects the laser beam output from the laser output unit 353-3 and guides it to the MEMS mirror 356. Note that the mirrors 354-1 to 354-3 are referred to as mirrors 354 when there is no need to distinguish them from each other.
- the MEMS driver 355 controls the drive of the mirror of the MEMS mirror 356 so as to project the image supplied from the video processor 351.
- The MEMS mirror 356 scans the laser light of each color, for example as in the example of FIG. 13, by driving the mirror mounted on the MEMS in accordance with the control of the MEMS driver 355. The laser light is output from the projection port to the outside of the apparatus and irradiates, for example, the screen 104. As a result, the image supplied from the video processor 351 is projected onto the screen 104.
- three laser output units 353 are provided to output laser beams of three colors, but the number of laser beams (or the number of colors) is arbitrary.
- For example, the number of laser output units 353 may be four or more, or two or less. That is, the number of laser beams output from the projection imaging apparatus 102 (projection unit 302) may be two or less, or four or more.
- the number of colors of the laser light output from the projection imaging apparatus 102 (projection unit 302) is arbitrary, and may be two or less colors or four or more colors.
- the configurations of the mirror 354 and the MEMS mirror 356 are also arbitrary, and are not limited to the example of FIG. Of course, the scanning pattern of the laser beam is arbitrary and is not limited to the example of FIG.
- the correspondence relationship detection unit 201 of the control device 101 executes a correspondence relationship detection process.
- This process can be executed at an arbitrary timing.
- For example, the correspondence relationship detection unit 201 may execute this correspondence detection processing before starting to project one image using both the projection imaging device 102-1 and the projection imaging device 102-2, or at a predetermined timing while the image is being projected.
- the pattern image projection imaging processing unit 211 performs pattern image projection imaging processing in step S101.
- the corresponding point detection processing unit 212 performs a corresponding point detection process in step S102.
- In step S103, the projection image processing unit 213 corrects the image to be projected or being projected, based on the correspondence detected in step S102. That is, the projection image processing unit 213 performs image processing on the overlap region of the projection image.
- When the process of step S103 is completed, the correspondence detection process ends.
- the pattern image projection imaging processing unit 211 selects a pattern image to be projected from unprocessed (not projected) pattern images in step S121.
- the pattern image is an image of a predetermined pattern prepared in advance, and corresponding points are detected (overlapping area is detected) using the pattern image.
- FIG. 16 shows an example of the pattern image.
- the pattern image of pattern 1 shown in FIG. 16A is a two-color (for example, black and white) check pattern (checker) image.
- a pattern image of pattern 2 shown in FIG. 16B is an image of a check pattern (checker) of two colors (for example, black and white) similar to pattern 1. However, pattern 2 is inverted from pattern 1 in positive and negative (black and white).
- the pattern image of pattern 3 shown in FIG. 16C is a black image.
- the pattern image of pattern 4 shown in FIG. 16D is an image in which a plurality of (for example, white) unique patterns are superimposed on a black image.
- Each unique pattern is composed of a plurality of sub-patterns and differs from the other unique patterns in the number of sub-patterns or in the positional relationship between the sub-patterns. That is, each unique pattern is a group of sub-patterns in a unique arrangement, and can be distinguished from the other unique patterns by the number and positional relationship of its sub-patterns.
- The number, shape, and size of the sub-patterns constituting a unique pattern are arbitrary; all sub-patterns may have the same size and shape, or sub-patterns of different sizes and shapes may be mixed. In the example of FIG. 16D, each sub-pattern is a dot-shaped figure (a white point).
- That is, in step S121, the pattern image projection imaging processing unit 211 selects, from the unprocessed (not yet projected) pattern images among patterns 1 to 4, the pattern image to be processed (projected) this time.
- In step S122, the pattern image projection imaging processing unit 211 selects, as the processing target, a projection unit 302 that has not yet projected the selected pattern image from among the projection units 302 of the projection imaging apparatuses 102.
- In step S123, the pattern image projection imaging processing unit 211 causes the projection unit 302 selected in step S122 to project the pattern image selected in step S121 onto the screen 104.
- In step S124, the pattern image projection imaging processing unit 211 selects, as the processing target, an imaging unit 303 that has not yet captured the projected pattern image (the pattern projection image projected on the screen 104) from among the imaging units 303 of the projection imaging apparatuses 102.
- In step S125, the pattern image projection imaging processing unit 211 causes the imaging unit 303 selected in step S124 to capture the pattern projection image projected on the screen 104 in step S123.
- In step S126, the pattern image projection imaging processing unit 211 determines whether all the imaging units 303 have captured the pattern projection image projected on the screen 104. If it is determined that an unprocessed imaging unit 303 remains, the process returns to step S124, and the subsequent processing is repeated.
- That is, by repeating steps S124 to S126, the projected pattern projection image is captured by all the imaging units 303.
- If it is determined in step S126 that all the imaging units 303 have captured the pattern projection image projected on the screen 104, the process proceeds to step S127.
- In step S127, the pattern image projection imaging processing unit 211 determines whether the pattern image selected as the processing target has been projected by all the projection units 302. If it is determined that an unprocessed projection unit 302 remains, the process returns to step S122, and the subsequent processing is repeated.
- That is, by repeating steps S122 to S127, the pattern image to be processed is projected by all the projection units 302, and the pattern projection image projected by each projection unit 302 is captured by all the imaging units 303.
- If it is determined in step S127 that all the projection units 302 have projected the pattern image to be processed, the process proceeds to step S128.
- In step S128, the pattern image projection imaging processing unit 211 determines whether all the pattern images have been projected. If it is determined that an unprocessed pattern image remains, the process returns to step S121, and the subsequent processing is repeated.
- If it is determined in step S128 that all the pattern images have been projected, the pattern image projection imaging process ends, and the process returns to FIG.
- the outline of the processing order is a loop of pattern image selection, projection unit 302 selection, and imaging unit 303 selection.
- a pattern image to be processed is selected, then a projection unit 302 that projects the pattern is selected, and pattern projection images projected by the projection unit 302 are captured by all the imaging units 303.
- the projection unit 302 that projects the pattern image to be processed is changed, and the pattern projection image is captured by all the imaging units 303.
- the pattern image to be processed is changed, and the above-described processing is repeated.
- When all the pattern images have been processed in this way, the pattern image projection imaging process ends.
- the imaging by the imaging unit 303 of the same apparatus as the projection unit 302 that projects the pattern image may be omitted.
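- The overall loop of steps S121 to S128 can be summarized with the following Python sketch. The `project` and `capture` methods and the device objects are hypothetical stand-ins for the control paths through the projection control unit 202 and the imaging control unit 203.

```python
def pattern_projection_imaging(patterns, projectors, cameras):
    """Sketch of the loop structure of steps S121 to S128."""
    captured = {}  # (pattern, projector, camera) -> captured image
    for pattern in patterns:                   # S121 / S128
        for projector in projectors:           # S122 / S127
            projector.project(pattern)         # S123
            for camera in cameras:             # S124 / S126
                # Optionally skip the imaging unit housed in the same
                # apparatus as the projecting unit (see above).
                captured[(pattern, projector, camera)] = camera.capture()  # S125
    return captured
```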
- the corresponding point detection processing unit 212 selects a captured image group to be processed from the unprocessed captured image group in step S141.
- the captured image of the pattern image obtained as described above is grouped by the projection unit 302 that projects the pattern image and the imaging unit 303 from which the captured image is obtained. That is, the captured images of all the pattern images projected by one projection unit 302 and captured by one imaging unit 303 are set as a set of captured image groups.
- For example, the captured images of patterns 1 to 4 projected by the projection unit 302 of the projection imaging apparatus 102-1 and captured by the imaging unit 303 of the projection imaging apparatus 102-2 form one captured image group, and the captured images of patterns 1 to 4 projected by the projection unit 302 of the projection imaging apparatus 102-2 and captured by the imaging unit 303 of the projection imaging apparatus 102-1 form another captured image group (FIG. 16).
- In step S141, a captured image group having such a configuration is selected.
- the corner detection processing unit 221 performs corner detection processing on the checker captured image of the processing target captured image group in step S142.
- In step S143, the edge detection processing unit 222 performs edge detection processing on the binarized dilated image of the captured image group to be processed.
- In step S144, the center-of-gravity detection processing unit 223 performs center-of-gravity detection processing on the unique pattern captured image of the captured image group to be processed.
- In step S145, the inter-image corresponding point detection processing unit 224 performs inter-image corresponding point detection processing for the captured image group to be processed.
- In step S146, the corresponding point detection processing unit 212 determines whether all the captured image groups have been processed. If it is determined that an unprocessed captured image group remains, the process returns to step S141, and the subsequent processing is repeated.
- That is, steps S141 to S146 are executed for each captured image group. If it is determined in step S146 that all the captured image groups have been processed, the corresponding point detection process ends, and the process returns to FIG.
- In the following, the pattern image of pattern 1 or pattern 2 shown in FIG. 16 is also referred to as a checker image, the projection image of the checker image as a checker projection image, and the captured image of the checker projection image as a checker captured image.
- When the corner detection process is started, in step S161 the captured image noise reduction unit 231 performs noise reduction processing on each checker captured image (the pattern 1 captured image and the pattern 2 captured image in FIG. 19), as in the example illustrated in FIG. 19.
- In step S162, the difference image generation unit 232 generates a difference image between the noise-reduced checker captured images, as in the example illustrated in FIG. 19. This difference image is also referred to as a checker difference image.
- In step S163, the difference image binarization unit 233 binarizes the checker difference image, as in the example shown in FIG. 19. This binarized checker difference image is also referred to as a checker binarized image.
- In step S164, the binarized image dilation unit 234 performs dilation processing on the checker binarized image, as in the example shown in FIG. 19, dilating a predetermined component (for example, the white portion). The checker binarized image that has undergone this dilation processing is also referred to as a checker binarized dilated image.
- In step S165, the binarized dilated image corner detection unit 235 performs corner detection on the checker binarized dilated image and detects each corner of the checker (check pattern).
- When the process of step S165 is completed, the corner detection process ends, and the process returns to FIG.
- Since pattern 2 is the black-and-white inverse of pattern 1, the checker difference image is less affected by the color of the screen 104 and by external light. Therefore, by performing corner detection using such a checker difference image, the corners of the checker (check pattern) can be detected more robustly (the robustness of corner detection can be improved).
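- As an illustration, the corner detection pipeline of steps S161 to S165 can be sketched with OpenCV as below. The specific operators (Gaussian blur, Otsu threshold, Shi-Tomasi corner detector) are assumptions for illustration; the patent fixes only the sequence of noise reduction, difference, binarization, dilation, and corner detection.

```python
import cv2
import numpy as np

def detect_checker_corners(cap1, cap2, max_corners=200):
    """Sketch of steps S161-S165 for two grayscale uint8 checker
    captures (pattern 1 and its black/white inverse, pattern 2)."""
    # S161: noise reduction on each checker captured image
    f1 = cv2.GaussianBlur(cap1, (5, 5), 0)
    f2 = cv2.GaussianBlur(cap2, (5, 5), 0)
    # S162: checker difference image (cancels screen color / ambient light)
    diff = cv2.absdiff(f1, f2)
    # S163: binarize the difference image
    _, binary = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # S164: dilate the predetermined (white) component
    dilated = cv2.dilate(binary, np.ones((3, 3), np.uint8))
    # S165: detect the checker corners on the binarized dilated image
    corners = cv2.goodFeaturesToTrack(dilated, max_corners,
                                      qualityLevel=0.01, minDistance=10)
    if corners is None:
        return dilated, np.empty((0, 2), np.float32)
    return dilated, corners.reshape(-1, 2)
```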
- When the edge detection process is started, in step S181 the binarized dilated image edge detection unit 241 performs edge detection on the checker binarized dilated image obtained by the corner detection process and detects each edge of the checker (check pattern), as in the example shown in FIG.
- In step S182, the edge detection image generation unit 242 generates a checker edge detection image, an image indicating each edge, from the edge detection result of step S181, as in the example shown in FIG.
- In step S183, the edge detection image dilation unit 243 performs dilation processing on the checker edge detection image to dilate each edge, as in the example illustrated in FIG. The checker edge detection image subjected to this dilation processing is also referred to as a checker edge detection dilated image.
- When the process of step S183 is completed, the edge detection process ends, and the process returns to FIG.
- As described above, the image used for edge detection is derived from the checker difference image, so the edges of the checker (check pattern) can likewise be detected more robustly (the robustness of edge detection can be improved).
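- The edge detection process of steps S181 to S183 can likewise be sketched with OpenCV. The Canny detector, its thresholds, and the 3x3 structuring element are assumptions; only the sequence of edge detection, edge image generation, and edge dilation follows the text.

```python
import cv2
import numpy as np

def detect_checker_edges(binarized_dilated):
    """Sketch of steps S181-S183: build the checker edge detection
    dilated image from the checker binarized dilated image."""
    # S181-S182: detect the checker edges and form the edge detection image
    edges = cv2.Canny(binarized_dilated, 100, 200)
    # S183: dilate the edges so that a line between the centroids of
    # adjacent sub-patterns reliably stays on an edge (see step S221)
    return cv2.dilate(edges, np.ones((3, 3), np.uint8), iterations=2)
```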
- In the following, the pattern image of pattern 3 shown in FIG. 16C is also referred to as a black image, the projection image of the black image as a black projection image, and the captured image of the black projection image as a black captured image.
- Similarly, the pattern image of pattern 4 shown in FIG. 16D is also referred to as a unique pattern image, the projection image of the unique pattern image as a unique pattern projection image, and the captured image of the unique pattern projection image as a unique pattern captured image.
- When the center-of-gravity detection process is started, in step S201 the unique pattern captured image noise detection unit 251 performs noise reduction processing on each of the black captured image (the pattern 3 captured image in FIG. 23) and the unique pattern captured image (the pattern 4 captured image in FIG. 23), as in the example illustrated in FIG. 23.
- In step S202, the difference image generation unit 252 generates a difference image between the noise-reduced black captured image and the noise-reduced unique pattern captured image, as in the example illustrated in FIG. 23. This difference image is also referred to as a unique pattern difference image.
- In step S203, the difference image binarization unit 253 binarizes the unique pattern difference image, as in the example shown in FIG. 23. This binarized unique pattern difference image is also referred to as a unique pattern binarized image.
- In step S204, the binarized image centroid detection unit 254 performs centroid detection processing on the unique pattern binarized image, as in the example illustrated in FIG. 23, and obtains the barycentric coordinates of each sub-pattern of each unique pattern included in the unique pattern binarized image.
- When the process of step S204 is completed, the center-of-gravity detection process ends, and the process returns to FIG.
- Since pattern 4 is an image in which unique patterns are superimposed on the same black image as pattern 3, the unique pattern difference image is less affected by the color of the screen 104 and by external light. Therefore, by performing centroid detection using such a unique pattern difference image, the centroid of each sub-pattern can be detected more robustly (the robustness of centroid detection can be improved).
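- The center-of-gravity detection of steps S201 to S204 can be sketched as follows; connected-component centroids stand in for the patent's centroid detection, and the individual operators are illustrative assumptions.

```python
import cv2

def detect_subpattern_centroids(black_cap, unique_cap):
    """Sketch of steps S201-S204 for grayscale uint8 captures of the
    black image (pattern 3) and the unique pattern image (pattern 4)."""
    # S201: noise reduction
    b = cv2.GaussianBlur(black_cap, (5, 5), 0)
    u = cv2.GaussianBlur(unique_cap, (5, 5), 0)
    # S202: unique pattern difference image
    diff = cv2.absdiff(u, b)
    # S203: binarize the difference image
    _, binary = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # S204: centroid of each white blob (each dot-shaped sub-pattern)
    n, _, _, centroids = cv2.connectedComponentsWithStats(binary)
    return centroids[1:]  # drop the background component
```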
- In step S221, the unique pattern adjacent relationship detection unit 261 detects, as in the example illustrated in FIG., the adjacent relationship between the barycentric coordinates of the sub-patterns of the unique pattern binarized image detected by the centroid detection process, using the checker edge detection dilation image generated by the edge detection process.
- When attention is paid to a certain sub-pattern, as in the example shown in FIG. 25, the pixel values of the checker edge detection dilation image are referred to along a straight line connecting barycentric coordinates that are a short distance apart; if the two sub-patterns are adjacent along an edge, those pixel values should be 1. Taking advantage of this, the pixel values of the checker edge detection dilation image are sampled along the straight line connecting the sub-patterns, and if the ratio of pixels whose value is 1 to the total number of reference points is greater than or equal to a threshold value, the two sub-patterns are regarded as adjacent in the vertical or horizontal direction.
- The purpose of using the checker edge detection dilated image is to prevent sub-patterns in oblique directions, which do not lie along an edge, from being judged adjacent, and thereby to prevent erroneous decoding in the subsequent decoding of the unique pattern.
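- The adjacency test can be sketched as follows; the sampling count and the ratio threshold are illustrative assumptions, and `edge_dilated` is assumed to be the checker edge detection dilation image with non-zero pixels where the text's pixel value is 1.

```python
import numpy as np

def are_adjacent(c0, c1, edge_dilated, ratio_threshold=0.8, samples=64):
    """Sketch of the adjacency test of step S221 for two centroids c0, c1."""
    # Sample reference points on the straight line joining the centroids.
    ts = np.linspace(0.0, 1.0, samples)
    xs = np.round(c0[0] + ts * (c1[0] - c0[0])).astype(int)
    ys = np.round(c0[1] + ts * (c1[1] - c0[1])).astype(int)
    xs = np.clip(xs, 0, edge_dilated.shape[1] - 1)
    ys = np.clip(ys, 0, edge_dilated.shape[0] - 1)

    # Pairs adjacent along a checker edge should see an edge pixel at
    # most reference points; oblique pairs cross non-edge regions.
    on_edge = edge_dilated[ys, xs] != 0
    return on_edge.mean() >= ratio_threshold
```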
- In step S222, the unique pattern decoding unit 262 decodes each unique pattern based on the adjacent relationship detected in step S221.
- FIG. 26 shows an example of the unique pattern.
- The unique pattern has a start code (StartCode) in which three sub-patterns are arranged in the horizontal direction (sub-patterns arranged in one row and three columns); the start code is used as a mark indicating that a unique code (UniqueCode) exists below it.
- The unique code (UniqueCode) is a combination of sub-patterns arranged within two rows and three columns in which three sub-patterns are never arranged in the horizontal direction, and which occurs only once in the screen.
- The unique pattern shown in FIG. 26 is merely an example, and the start code and unique code of the unique pattern may each be formed as any pattern.
- If the correspondence relationship is to be acquired more densely, this can be realized by increasing the number of available unique code (UniqueCode) combinations, for example by increasing the number of horizontal sub-patterns to four or by increasing the number of sub-pattern rows. That is, the start code is composed of a plurality of sub-patterns arranged as a figure, and the unique code may be anything as long as its arrangement as a figure differs from that of the other patterns.
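- As one possible reading of this decoding step, the sketch below packs the occupancy of the 2x3 unique code cells into an integer identifier; the 3x3 occupancy grid and the packing scheme are assumptions for illustration, not the encoding defined by the embodiment.

```python
def decode_unique_code(grid):
    """Sketch of step S222 for one detected pattern.

    `grid` is an assumed 3x3 occupancy array built from the detected
    adjacency relations: row 0 is the start code (always [1, 1, 1]),
    and rows 1-2 hold the 2x3 unique code cells.
    """
    assert list(grid[0]) == [1, 1, 1], "start code must be three in a row"

    # Pack the six unique-code cells into a 6-bit identifier. Because no
    # unique code row may be [1, 1, 1], a unique code is never mistaken
    # for a start code, and each identifier occurs only once per screen.
    code = 0
    for cell in list(grid[1]) + list(grid[2]):
        code = (code << 1) | int(cell)
    return code
```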
- In step S223, the homography image generation unit 263 uses the decoding result of the unique patterns obtained in step S222 to projectively transform the checker projection image and generate a homography image, as in the example illustrated in FIG.
- More specifically, the homography image generation unit 263 obtains a homography matrix using the correspondence relationship given by the decoded unique patterns, and generates a homography image of the checker projection image. For all the checker corner coordinates of pattern 1 in the checker projection image, it then obtains the corresponding coordinates in the homography image; that is, corresponding points between the projection image and its homography image are detected.
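- With OpenCV, this step can be sketched as below; the argument names are assumptions, and the RANSAC option is an illustrative choice for robust fitting.

```python
import cv2
import numpy as np

def map_corners_via_homography(proj_pts, cap_pts, checker_corners_proj):
    """Sketch of step S223: fit a homography from the decoded unique
    pattern correspondences and map the checker corners through it."""
    src = np.asarray(proj_pts, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(cap_pts, dtype=np.float32).reshape(-1, 1, 2)

    # Fit the homography matrix from the decoded correspondences.
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC)

    # Transform every checker corner of pattern 1 into the homography
    # image, yielding corresponding points between the projection image
    # and its homography image.
    corners = np.asarray(checker_corners_proj, dtype=np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(corners, H).reshape(-1, 2)
```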
- Note that the homography image is not the same as the checker image (checker binarized expansion image).
- Therefore, in step S224, the homography image-captured image corresponding point detection unit 264 detects corresponding points between the homography image and the checker binarized dilated image by associating the nearest coordinates between them, as shown in FIG.
- In step S225, the projected image-captured image corresponding point detection unit 265 uses the corresponding point detection results of steps S223 and S224 to detect, for all corner coordinates of the checker image, the correspondence between the checker projection image and the checker binarized expansion image (checker captured image). In this way, corresponding coordinates can be obtained between the projected image and the captured image.
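- The chaining of steps S224 and S225 can be sketched as below, using a k-d tree for the nearest-coordinate association; the use of SciPy and the array names are assumptions for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def match_projection_to_capture(corners_proj, corners_homog, corners_cap):
    """Sketch of steps S224-S225: projection corners -> captured corners.

    `corners_proj`: checker corners in the projection image,
    `corners_homog`: the same corners mapped by the homography (S223),
    `corners_cap`: corner coordinates detected in the checker binarized
    dilation image; all assumed to be (N, 2) float arrays.
    """
    corners_cap = np.asarray(corners_cap)

    # S224: associate each homography-image corner with the nearest
    # detected corner in the captured image.
    tree = cKDTree(corners_cap)
    _dists, idx = tree.query(np.asarray(corners_homog))

    # S225: chain the two correspondences to pair each projection-image
    # corner with its captured-image coordinates.
    return [(tuple(p), tuple(corners_cap[i]))
            for p, i in zip(np.asarray(corners_proj), idx)]
```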
- When the process of step S225 is completed, the inter-image corresponding point detection process ends, and the process returns to FIG.
- As described above, the control device 101 can detect corresponding points between the projected image and the captured image using only four pattern images (or two if background erasure by difference images is not performed), which is fewer than the method described in Non-Patent Document 1. That is, the relative posture between the projection unit and the imaging unit can be obtained more easily.
- Moreover, the control device 101 can acquire the corresponding points with higher accuracy. Furthermore, the control device 101 can robustly acquire the unique pattern using feature values obtained from the projection pattern.
- <Configuration example of projection imaging system>
- In the above description, the projection imaging system 100 has two projection imaging apparatuses 102. However, the number of projection imaging apparatuses 102 constituting the projection imaging system 100 may be three or more (projection imaging apparatus 102-1, projection imaging apparatus 102-2, projection imaging apparatus 102-3, and so on), as in the example shown in FIG.
- Part or all of the correspondence relationship detection processing described above may be executed by a device other than the control device 101.
- For example, the control device 101 may be omitted, and the above-described correspondence relationship detection processing may be executed by any one of the projection imaging devices 102.
- Alternatively, a plurality of projection imaging devices 102 may cooperate with one another to execute the processes of the correspondence relationship detection processing described above.
- The projection imaging apparatus 412 may be connected to the network 103 (that is, to the control apparatus 101, other projection imaging apparatuses 102, and the like) via another information processing apparatus 411.
- The projection imaging apparatus 412 is the same apparatus as the above-described projection imaging apparatus 102, but it is connected to the network 103 via the information processing apparatus 411, which has a communication function and is, for example, a mobile phone, a smartphone, a tablet computer, or a notebook computer.
- The projection imaging apparatus 412 is controlled and driven by the information processing apparatus 411. In this way, the information processing apparatus 411, which originally has high processing capability, can be made to perform the processing related to communication and the control of projection and imaging, so that the functions (such as information processing capability) required of the projection imaging apparatus 412 can be kept low, and an increase in cost can be suppressed.
- The information processing apparatus 413 in FIG. 30 is an information processing apparatus with originally high processing capability, such as a mobile phone, a smartphone, a tablet computer, or a notebook computer, into which a module having the functions of the projection imaging apparatus 102 described above is incorporated. That is, the information processing apparatus 413 is a device having the functions of both the information processing apparatus 411 and the projection imaging apparatus 412. The projection imaging apparatus 102 can also be realized as such an image processing apparatus.
- Devices having different functions may also be mixed as the projection imaging apparatuses 102.
- For example, the projection imaging system 100 may include a projection device 421 having only the projection unit 302, an imaging device 422 having only the imaging unit 303, and the like. A plurality of projection units 302 and imaging units 303 may also be provided in one device. Furthermore, in the projection imaging system 100 as a whole, the number of projection units 302 and the number of imaging units 303 need not match.
- The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed from a network or a recording medium.
- This recording medium is constituted by, for example, the removable medium 171 or the removable medium 321 on which the program is recorded and which is distributed to deliver the program to users separately from the apparatus main body.
- Examples of the removable medium 171 and the removable medium 321 include a magnetic disk (including a flexible disk) and an optical disk (including a CD-ROM and a DVD), as well as a magneto-optical disk (including an MD (Mini-Disc)) and a semiconductor memory.
- In the case of the control device 101, the program can be installed in the storage unit 163 by mounting the removable medium 171 in the drive 165; in the case of the projection imaging apparatus 102, the program can be installed in the storage unit 313 by mounting the removable medium 321 in the drive 315.
- This program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- In the case of the control device 101, the program can be received by the communication unit 164 and installed in the storage unit 163; in the case of the projection imaging apparatus 102, it can be received by the communication unit 314 and installed in the storage unit 313.
- The program can also be installed in advance in a storage unit, a ROM, or the like. For example, in the case of the control device 101, the program can be installed in advance in the storage unit 163, the ROM 153, or the like; in the case of the projection imaging apparatus 102, it can be installed in advance in the storage unit 313, a ROM in the control unit 301, or the like.
- The program executed by the computer may be a program whose processes are performed in time series in the order described in this specification, or a program whose processes are performed in parallel or at necessary timing, such as when a call is made.
- In this specification, the steps describing the program recorded on the recording medium include not only processes performed in chronological order according to the described order, but also processes executed in parallel or individually without necessarily being processed in chronological order.
- Each step described above can be executed by each of the devices described above, or by any device other than those devices. In that case, the device that executes the process should have the functions (functional blocks and the like) necessary for executing it, and the information necessary for the processing should be transmitted to that device as appropriate.
- In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
- The configuration described above as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units). Conversely, the configurations described above as a plurality of devices (or processing units) may be combined into a single device (or processing unit). A configuration other than those described above may also be added to the configuration of each device (or each processing unit). Furthermore, a part of the configuration of a certain device (or processing unit) may be included in the configuration of another device (or another processing unit).
- For example, the present technology can take a configuration of cloud computing in which one function is shared by a plurality of devices via a network and processed jointly.
- Each step described in the above flowcharts can be executed by one device or shared and executed by a plurality of devices.
- Furthermore, when a plurality of processes are included in one step, the plurality of processes can be executed by one device or shared and executed by a plurality of devices.
- The present technology is not limited to the above, and can also be implemented as any configuration mounted on such a device or on a device constituting the system: for example, a processor as a system LSI (Large Scale Integration), a module using a plurality of processors, a unit using a plurality of modules, or a set in which other functions are further added to a unit (that is, a partial configuration of an apparatus).
- (1) An image processing apparatus including a detection unit that detects a correspondence relationship between a projected image and a captured image of the projected image, using a pattern captured image that is a captured image of a pattern projection image, which is a projection image of an image including a pattern.
- (2) The image processing apparatus according to (1), wherein the image includes a plurality of the patterns, each pattern is composed of a plurality of sub-patterns, and the number of the sub-patterns or the positional relationship between the sub-patterns differs from that of the other patterns.
- (3) The image processing apparatus according to (2), wherein the pattern includes a start code in which the sub-patterns are arranged in a number and positional relationship common to each pattern, and a unique code in which the number or positional relationship of the sub-patterns differs from that of the other patterns.
- (4) The image processing apparatus according to (3), wherein the start code includes a plurality of the sub-patterns arranged as a figure, and the unique code is arranged so as to differ, as a figure, from the other patterns.
- (5) The image processing apparatus according to (3), wherein the start code includes three of the sub-patterns arranged in one row and three columns, and the unique code includes one or a plurality of the sub-patterns arranged in two rows and three columns so that the number or positional relationship differs from that of the other patterns.
- (6) The image processing apparatus according to any one of (2) to (5), wherein the sub-pattern is arranged as a dot-like figure.
- (7) The image processing apparatus according to any one of (2) to (6), wherein the detection unit detects the correspondence relationship by detecting corresponding points between the projection image and the captured image of the projection image using the number and positional relationship of the sub-patterns for each pattern included in the pattern captured image.
- (8) The image processing apparatus according to (7), wherein the detection unit detects the corresponding points by analyzing the number and positional relationship of the sub-patterns using the adjacent relationship between the patterns for each pattern.
- (12) The image processing apparatus, wherein the detection unit detects an adjacent relationship between the sub-patterns using the centers of gravity of the sub-patterns for each pattern included in the pattern captured image.
- (13) The image processing apparatus according to (12), wherein the detection unit detects the center of gravity of the sub-pattern using an image obtained by binarizing the pattern captured image.
- (14) The image processing apparatus according to any one of the preceding items up to (13), wherein the detection unit detects corresponding points between the projected image and the captured image using corresponding points between a homography image, generated based on the pattern included in the pattern captured image, and the captured image of the projected image.
- (15) The image processing apparatus according to (14), wherein the image of the predetermined pattern is a check pattern image.
- (16) The image processing apparatus according to (14) or (15), wherein the detection unit detects corresponding points between the homography image and the captured image for all corners of the predetermined pattern in the captured image, and detects corresponding points between the projected image and the captured image for all the detected corners of the predetermined pattern.
- (18) The image processing apparatus according to any one of (1) to (17), further including an imaging unit that captures a projected image to obtain a captured image, wherein the detection unit detects the correspondence relationship by detecting corresponding points using a pattern captured image obtained by the imaging unit capturing the pattern projection image and a captured image obtained by the imaging unit capturing the projection image.
- (19) The image processing apparatus according to any one of (1) to (18), further including a projection unit that projects images, wherein the detection unit detects the correspondence relationship by detecting corresponding points between the projection image projected by the projection unit and the captured image of that projection image, using a pattern captured image that is a captured image of the pattern projection image projected by the projection unit.
- (20) The image processing apparatus according to any one of (1) to (19), further including an image processing unit that performs, using the correspondence relationship between the projection image and the captured image detected by the detection unit, image processing on a portion of an image to be projected that overlaps another projection image.
- 100 projection imaging system, 101 control device, 102 projection imaging device, 103 network, 104 screen, 111 and 112 projection images, 113 overlap region, 151 CPU, 201 correspondence relationship detection unit, 202 projection control unit, 203 imaging control unit, 211 pattern image projection imaging processing unit, 212 corresponding point detection processing unit, 213 projection image processing unit, 221 corner detection processing unit, 222 edge detection processing unit, 223 center-of-gravity detection processing unit, 224 inter-image corresponding point detection processing unit, 231 captured image noise reduction unit, 232 difference image generation unit, 233 difference image binarization unit, 234 binarized image dilation unit, 235 binarized dilation image corner detection unit, 241 binarized dilation image edge detection unit, 242 edge detection image generation unit, 243 edge detection image expansion unit, 251 unique pattern captured image noise detection unit, 252 difference image generation unit, 253 difference image binarization unit, 254 binarized image centroid detection unit, 261 unique pattern adjacent relationship detection unit, 262 unique pattern decoding unit, 263 homography image generation
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Computational Biology (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Projection Apparatus (AREA)
- Controls And Circuits For Display Device (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Image Processing (AREA)
Abstract
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201580027226.3A CN106464825B (zh) | 2014-07-01 | 2015-06-17 | 图像处理设备和方法 |
| JP2016531252A JP6711271B2 (ja) | 2014-07-01 | 2015-06-17 | 画像処理装置および方法 |
| US15/308,741 US10349023B2 (en) | 2014-07-01 | 2015-06-17 | Image processing apparatus and method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014135790 | 2014-07-01 | ||
| JP2014-135790 | 2014-07-01 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016002510A1 true WO2016002510A1 (fr) | 2016-01-07 |
Family
ID=55019057
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/067422 Ceased WO2016002510A1 (fr) | 2014-07-01 | 2015-06-17 | Dispositif et procédé de traitement d'image |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10349023B2 (fr) |
| JP (1) | JP6711271B2 (fr) |
| CN (1) | CN106464825B (fr) |
| WO (1) | WO2016002510A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019054204A1 (fr) * | 2017-09-14 | 2019-03-21 | ソニー株式会社 | Dispositif et procédé de traitement d'image |
| WO2020235400A1 (fr) * | 2019-05-20 | 2020-11-26 | ソニー株式会社 | Dispositif de traitement d'image, procédé de traitement d'image et programme |
| JP2022527943A (ja) * | 2019-11-11 | 2022-06-07 | チョントゥー ジミー テクノロジー カンパニー リミテッド | 超短焦点スクリーンの位置合わせ方法、装置、超短焦点投影機器及び媒体 |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6594170B2 (ja) * | 2015-11-12 | 2019-10-23 | キヤノン株式会社 | 画像処理装置、画像処理方法、画像投影システムおよびプログラム |
| EP3392608A4 (fr) * | 2015-12-18 | 2019-11-13 | Sony Corporation | Dispositif et procédé de traitement d'image, données et support d'enregistrement |
| US10771751B2 (en) * | 2016-02-02 | 2020-09-08 | Panasonic Intellectual Property Management Co., Ltd. | Projection image adjustment system and projection image adjustment method |
| JPWO2018155269A1 (ja) * | 2017-02-27 | 2019-12-19 | ソニー株式会社 | 画像処理装置および方法、並びにプログラム |
| WO2018167999A1 (fr) * | 2017-03-17 | 2018-09-20 | パナソニックIpマネジメント株式会社 | Projecteur et système de projecteur |
| CN110663249B (zh) * | 2017-05-26 | 2022-04-15 | 索尼公司 | 用于图像处理的装置和方法 |
| US10924718B2 (en) * | 2017-06-09 | 2021-02-16 | Sony Corporation | Image processing device and method |
| JP2019047311A (ja) * | 2017-09-01 | 2019-03-22 | セイコーエプソン株式会社 | 画像投写システム及びその制御方法 |
| US11763482B2 (en) * | 2017-09-27 | 2023-09-19 | Panasonic Intellectual Property Management Co., Ltd. | Baggage recognition device, baggage sorting system, and baggage recognition method |
| US10612912B1 (en) | 2017-10-31 | 2020-04-07 | Facebook Technologies, Llc | Tileable structured light projection system |
| US10521926B1 (en) | 2018-03-21 | 2019-12-31 | Facebook Technologies, Llc | Tileable non-planar structured light patterns for wide field-of-view depth sensing |
| CN110322527B (zh) * | 2019-05-21 | 2021-04-20 | 华为技术有限公司 | 一种图案生成方法及终端 |
| JP7163947B2 (ja) * | 2020-10-22 | 2022-11-01 | セイコーエプソン株式会社 | 投写領域の設定支援方法、設定支援システム、及びプログラム |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH04181106A (ja) * | 1990-11-15 | 1992-06-29 | Komatsu Ltd | 位置寸法計測装置のキャリブレーション装置 |
| JP2005252804A (ja) * | 2004-03-05 | 2005-09-15 | Seiko Epson Corp | マルチプロジェクションシステムのための画像補正方法 |
| JP2009070061A (ja) * | 2007-09-12 | 2009-04-02 | Ricoh Co Ltd | 2次元コード読み取り装置、2次元コード読み取り方法、2次元コード読み取りプログラム及び記録媒体 |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6754370B1 (en) * | 2000-08-14 | 2004-06-22 | The Board Of Trustees Of The Leland Stanford Junior University | Real-time structured light range scanning of moving scenes |
| JP2003078925A (ja) * | 2001-08-31 | 2003-03-14 | Matsushita Electric Ind Co Ltd | カメラ校正システムおよびカメラ校正方法 |
| US7146036B2 (en) * | 2003-02-03 | 2006-12-05 | Hewlett-Packard Development Company, L.P. | Multiframe correspondence estimation |
| US8807762B2 (en) * | 2008-01-11 | 2014-08-19 | Nikon Corporation | Projector |
| JP5440250B2 (ja) * | 2010-02-26 | 2014-03-12 | セイコーエプソン株式会社 | 補正情報算出装置、画像処理装置、画像表示システム、および画像補正方法 |
| JP5461452B2 (ja) * | 2010-03-31 | 2014-04-02 | 三洋電機株式会社 | 制御装置および投写型映像表示装置 |
| CN102215395B (zh) * | 2010-04-09 | 2013-10-09 | 华为技术有限公司 | 一种视频编解码方法和装置 |
| US8941750B2 (en) * | 2011-12-27 | 2015-01-27 | Casio Computer Co., Ltd. | Image processing device for generating reconstruction image, image generating method, and storage medium |
| JP2013168922A (ja) * | 2012-01-18 | 2013-08-29 | Sony Corp | 投影型画像表示装置及び画像投影方法、並びにコンピューター・プログラム |
| US9389067B2 (en) * | 2012-09-05 | 2016-07-12 | Canon Kabushiki Kaisha | Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, program, and storage medium |
- 2015
- 2015-06-17 US US15/308,741 patent/US10349023B2/en active Active
- 2015-06-17 CN CN201580027226.3A patent/CN106464825B/zh not_active Expired - Fee Related
- 2015-06-17 WO PCT/JP2015/067422 patent/WO2016002510A1/fr not_active Ceased
- 2015-06-17 JP JP2016531252A patent/JP6711271B2/ja active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH04181106A (ja) * | 1990-11-15 | 1992-06-29 | Komatsu Ltd | 位置寸法計測装置のキャリブレーション装置 |
| JP2005252804A (ja) * | 2004-03-05 | 2005-09-15 | Seiko Epson Corp | マルチプロジェクションシステムのための画像補正方法 |
| JP2009070061A (ja) * | 2007-09-12 | 2009-04-02 | Ricoh Co Ltd | 2次元コード読み取り装置、2次元コード読み取り方法、2次元コード読み取りプログラム及び記録媒体 |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019054204A1 (fr) * | 2017-09-14 | 2019-03-21 | ソニー株式会社 | Dispositif et procédé de traitement d'image |
| US11109006B2 (en) | 2017-09-14 | 2021-08-31 | Sony Corporation | Image processing apparatus and method |
| WO2020235400A1 (fr) * | 2019-05-20 | 2020-11-26 | ソニー株式会社 | Dispositif de traitement d'image, procédé de traitement d'image et programme |
| US11785188B2 (en) | 2019-05-20 | 2023-10-10 | Sony Group Corporation | Image processing apparatus and image processing method |
| JP2022527943A (ja) * | 2019-11-11 | 2022-06-07 | チョントゥー ジミー テクノロジー カンパニー リミテッド | 超短焦点スクリーンの位置合わせ方法、装置、超短焦点投影機器及び媒体 |
| JP7263546B2 (ja) | 2019-11-11 | 2023-04-24 | チョントゥー ジミー テクノロジー カンパニー リミテッド | 超短焦点スクリーンの位置合わせ方法、装置、超短焦点投影機器及び媒体 |
| US12177613B2 (en) | 2019-11-11 | 2024-12-24 | Chengdu Xgimi Technology Co., Ltd. | Ultra-short-throw picture and screen alignment method and apparatus, ultra-short-throw projection device, and medium |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106464825B (zh) | 2020-11-10 |
| JP6711271B2 (ja) | 2020-06-17 |
| US20170142381A1 (en) | 2017-05-18 |
| JPWO2016002510A1 (ja) | 2017-04-27 |
| US10349023B2 (en) | 2019-07-09 |
| CN106464825A (zh) | 2017-02-22 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6711271B2 (ja) | 画像処理装置および方法 | |
| JP7067554B2 (ja) | 画像処理装置および方法 | |
| CN103765870B (zh) | 图像处理装置、包括图像处理装置的投影仪和投影仪系统、图像处理方法 | |
| JP6798491B2 (ja) | 情報処理装置および方法、並びにプログラム | |
| CN108369091B (zh) | 图像处理装置和方法、数据及纪录介质 | |
| JP6794981B2 (ja) | 情報処理装置および方法 | |
| JP7074052B2 (ja) | 画像処理装置および方法 | |
| JP6459194B2 (ja) | プロジェクター、及び投写画像制御方法 | |
| JP6915537B2 (ja) | 情報処理装置および方法、並びに、投影撮像装置および情報処理方法 | |
| JP7010209B2 (ja) | 画像処理装置および方法 | |
| KR20190072549A (ko) | 모바일 디바이스들을 위한 강화된 심도 맵 이미지들 | |
| JP2021118413A (ja) | プロジェクターの制御方法、プロジェクター、及び表示システム | |
| CN102736378B (zh) | 投影装置及投影方法 | |
| JP7371753B2 (ja) | 投影制御装置、投影装置、補正用画像投影方法及びプログラム | |
| JP2021127998A (ja) | 距離情報取得装置および距離情報取得方法 | |
| JP2010288062A (ja) | プロジェクター、プログラム、情報記憶媒体および画像投写方法 | |
| US20110157314A1 (en) | Image Processing Apparatus, Image Processing Method and Recording Medium | |
| JP2007017516A (ja) | 2次元の位置情報を投影する機能を備えたプロジェクタ | |
| JP2013005073A (ja) | プロジェクター、およびプロジェクターの制御方法 | |
| JP2019114889A (ja) | 投影装置および投影装置の較正方法 | |
| JP2007048136A (ja) | 2次元の位置情報を投影する機能を備えたプロジェクタ | |
| JP2019185532A (ja) | 画像処理装置、画像処理方法、およびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15814182; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2016531252; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15308741; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15814182; Country of ref document: EP; Kind code of ref document: A1 |