
WO2025069752A1 - Imaging device, information processing device, and imaging system


Info

Publication number
WO2025069752A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
imaging device
imaging
information processing
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/028916
Other languages
French (fr)
Japanese (ja)
Inventor
徹 天野
学 木村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of WO2025069752A1 publication Critical patent/WO2025069752A1/en


Classifications

    • H04N 23/45: Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/50: Cameras or camera modules comprising electronic image sensors; Control thereof; Constructional details
    • H04N 23/56: Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • This disclosure relates to an imaging device, an information processing device, and an imaging system.
  • An information processing device that generates a single panoramic image from a plurality of images is known. Such a panoramic image is created by mechanically driving the imaging device into multiple imaging directions using a motor and then synthesizing the captured images so that they are continuous.
  • the present disclosure provides an imaging device, an information processing device, and an imaging system that can suppress a decrease in accuracy of image synthesis processing.
  • a first camera and a second camera fixed to the first camera.
  • a reference image plate disposed opposite to an imaging surface of the second camera and having a chart including a plurality of sections;
  • the first camera and the second camera are rotatable about a pan axis, and a second optical axis of the imaging system of the second camera may have an intersection with the pan axis.
  • the reference image plate may have a chart including multiple compartments on at least a portion of the inner surface of a hollow sphere.
  • the chart may be drawn as a black rectangle on a white background.
  • the first camera and the second camera are rotatable about a pan axis, and the center of the reference image plate may be on the pan axis.
  • the imaging system of the second camera may be positioned closer to the first camera than the pan axis.
  • the first camera and the second camera are rotatable about a tilt axis, and the tilt axis may pass through the intersection point.
  • a light source that illuminates the chart and a light-shielding cover that shields the illumination may further be provided.
  • a camera frame supporting the first camera and the second camera;
  • the light source and the light-shielding cover may be supported by the camera frame.
  • the device may further include a transparent outer dome that covers the first and second cameras, the camera frame, the light source, and the light-shielding cover.
  • the system may further include a drive control unit that causes the second camera to capture images of multiple sections of the chart in a predetermined order and causes the first camera to capture images in response to the images captured by the second camera.
  • the angles of view of the first camera and the second camera are fixed, and the angle of view of the first camera may be greater than or equal to the angle of view of the second camera.
  • an information processing device may include a synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera that captures an image of a chart including a plurality of partitions.
  • a partition processing unit that generates first partition image data of the first camera corresponding to the partition in the second image data of the second camera may further be provided, and the synthesis processing unit may synthesize the first partition image data.
  • a storage unit that stores arrangement information corresponding to the partition may further be provided, and the synthesis processing unit may synthesize the first partition image data based on the arrangement information.
  • the partition processing unit may recognize an identification symbol corresponding to the partition, and the synthesis processing unit may synthesize the first partition image data based on the identification symbol.
  • an imaging system may include an imaging device and an information processing device, the imaging device including a first camera, a second camera fixed to the first camera, and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections, and the information processing device including a synthesis processing unit that synthesizes a plurality of image data output from the first camera based on image data of the second camera that captures the chart including the plurality of sections.
  • the information processing device may further include a camera control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture images in accordance with the imaging of the second camera.
  • FIG. 1 is a block diagram showing an example of an imaging system according to an embodiment of the present disclosure.
  • FIG. 2 is a vertical cross-sectional view of the imaging device.
  • FIG. 3 is a perspective view showing a second camera, a light source, and a light-shielding cover.
  • FIG. 4 is a perspective view showing the imaging device.
  • FIG. 5 is a perspective view showing the entire imaging device.
  • FIG. 6 is a diagram schematically showing a manufacturing process of a reference image plate.
  • FIG. 7 is a schematic diagram showing a zx-axis cross section passing through an intersection point of a spherical reference image plate.
  • FIG. 8 is a schematic diagram showing a yx-axis cross section passing through an intersection point of a spherical reference image plate.
  • FIG. 9 is a conceptual diagram schematically showing the inner surface of the reference image plate on the second camera side.
  • FIG. 10 is a schematic diagram showing an example in which a square of the chart is captured at different tilt angles.
  • FIG. 11 is a diagram schematically showing a relationship between a first captured image and a second captured image.
  • FIG. 12 is a block diagram showing an example of the configuration of an information processing device.
  • FIG. 13 is a diagram schematically showing a processing example of the information processing device.
  • FIG. 14 is a block diagram showing an example of the configuration of a monitoring device.
  • FIG. 15 is a vertical cross-sectional view of an imaging device according to a second embodiment.
  • FIG. 16 is a cross-sectional perspective view of the reference image plate.
  • FIG. 17 is a vertical cross-sectional view of the reference image plate.
  • FIG. 18 is a diagram showing an example of a tilting operation about a tilt axis.
  • FIG. 19 is a perspective view showing the entire imaging device.
  • FIG. 20 is a vertical cross-sectional view of an imaging device according to a third embodiment.
  • FIG. 21 is a perspective cross-sectional view with the outer dome removed.
  • FIG. 22 is a perspective view showing the entire imaging device with the outer dome removed.
  • FIG. 23 is a diagram showing an image captured by the first camera and a square area corresponding to a square of the chart.
  • the imaging device, information processing device, and imaging system may have components and functions that are not shown or described.
  • the following description does not exclude components and functions that are not shown or described.
  • FIG. 1 is a block diagram showing an example of an imaging system 1 according to an embodiment of the present disclosure.
  • the imaging system 1 is a system capable of generating a panoramic image, and includes an imaging device 10, an information processing device 20, a first display device 30, a monitoring device 40, and a second display device 50.
  • the imaging system 1 can be used, for example, as a monitoring or observation system. That is, the imaging system 1 can be used, for example, for environmental monitoring (disasters, forest fires, air pollution, etc.), recording the ecology of a wide range of animals, and monitoring plant growth and irrigation.
  • the panoramic image according to this embodiment is an example of a wide-angle image.
  • the information processing device 20 is, for example, an edge computer, and is configured to include a CPU (Central Processing Unit), and is capable of controlling the imaging device 10 and performing image synthesis processing of a panoramic image.
  • the information processing device 20 can communicate with the monitoring device 40 via a network nw.
  • the network nw may be wired or wireless. Details of the information processing device 20 will be described later.
  • the imaging device 10 and the information processing device 20 can also be configured as an integrated unit.
  • the circuit board of the information processing device 20, which includes a CPU, can be built into the driving space 104s (see FIG. 2) together with the drive control unit of the imaging device 10.
  • the driving control unit has, for example, a mechanical driving control unit including a CPU and a mechanical driving unit including a motor.
  • the mechanical driving control unit controls the entire imaging device 10.
  • the mechanical driving unit of the driving control unit executes mechanical driving of the imaging device 10 according to the control of the mechanical driving control unit.
  • This mechanical driving control unit also controls the imaging of the first camera 100 and the second camera 102.
  • the imaging device 10 and the information processing device 20 can also be configured as stand-alone devices without being connected via the network nw.
  • the monitoring device 40 is, for example, a cloud-side server that includes a CPU and can be used to monitor, via the network nw, the panoramic image generated by the information processing device 20.
  • the second display device 50 may be, for example, a monitor, and displays the panoramic image provided by the monitoring device 40. Details of the monitoring device 40 will be described later.
  • Fig. 2 is a vertical cross-sectional view of the imaging device 10.
  • the vertical direction is the Z axis
  • the directions perpendicular to the Z axis are the x and y axes.
  • the positive direction of the Z axis is the vertically upward direction, but is not limited to this.
  • the positive direction of the Z axis may be the vertically downward direction.
  • the imaging device 10 can also be installed upside down as compared to Fig. 2.
  • the imaging device 10 includes a first camera 100, a second camera 102, a camera platform 104, a reference image plate 106, a light-shielding cover 108, a support 109, an external dome 110, and a light source 112 (see FIG. 3).
  • the first camera 100 is a camera that captures images of the outside world, such as scenery.
  • the imaging system of the first camera 100 has a first optical axis OL1.
  • the imaging system may be composed of a single lens, or may be composed of multiple lenses.
  • the second camera 102 is fixed to the first camera 100 and captures an image of the reference image plate 106.
  • the imaging system of the second camera 102 has a second optical axis OL2.
  • the imaging system may be composed of a single lens, or may be composed of multiple lenses.
  • the gimbal 104 supports the first camera 100 and the second camera 102, and is capable of panoramic and tilt movements.
  • the gimbal 104 is supported by a support 109.
  • the camera platform 104 has a pan axis rotation mechanism 104a that rotates the first camera 100 and the second camera 102 around a pan axis OP, and a tilt axis rotation mechanism 104b that rotates the first camera 100 and the second camera 102 around a tilt axis OT.
  • the pan axis OP is a rotation axis that passes through the pan axis rotation mechanism 104a in the vertical direction
  • the tilt axis OT is a rotation axis that passes through the tilt axis rotation mechanism 104b in the horizontal direction.
  • the first camera 100, the second camera 102, and the light-shielding cover 108 are fixed to the camera frame 104c.
  • the camera frame 104c is fixed to the support frame 104d via the tilt axis rotation mechanism 104b.
  • the support frame 104d is also configured to be rotatable by the pan axis rotation mechanism 104a.
  • the pan head 104 forms a pan axis OP and a tilt axis OT that pass through, for example, the intersection C of the first optical axis OL1 of the first camera 100 and the second optical axis OL2 of the second camera 102.
  • a spherical reference image plate 106 is fixed to the pan head 104.
  • the center point of the sphere of the reference image plate 106 is, for example, the intersection C.
  • the first angle of view of the first camera 100 and the second angle of view of the second camera 102 are fixed.
  • the first angle of view of the first camera 100 is set to be equal to or larger than the second angle of view of the second camera 102.
  • the pan axis rotation mechanism 104a of the pan head 104 can rotate the optical axes OL1 and OL2 of the first camera 100 and the second camera 102, respectively, along a horizontal plane (the x, y plane) about the pan axis OP.
  • the tilt axis rotation mechanism 104b of the pan head 104 can rotate the optical axes OL1 and OL2 of the first camera 100 and the second camera 102, respectively, along a vertical plane (the z, x plane) about the tilt axis OT.
  • the reference image plate 106 is fixed to the support 109.
  • the reference image plate 106 is disposed opposite the imaging surface of the second camera 102 and has a chart including multiple sections.
  • the light-shielding cover 108 blocks external light, as shown in FIG. 3.
  • the support 109 supports the camera platform (gimbal) 104, the reference image plate 106, and the external dome 110.
  • the external dome 110 is a transparent cover that seals the first camera 100, the second camera 102, the camera platform 104, the reference image plate 106, the light-shielding cover 108, the light source 112, and the like off from the outside air. This makes it possible to suppress the effects of rain and the like.
  • FIG. 3 is a perspective view showing the second camera 102, the light source 112, and the light-shielding cover 108.
  • the reference image plate 106 and the external dome 110 are removed.
  • the light source 112 is, for example, an LED (Light Emitting Diode), and illuminates the reference image plate 106.
  • the second camera 102 captures the reference image plate 106 illuminated by the light source 112.
  • the light-shielding cover 108 blocks external light. This allows the illuminance when the second camera 102 captures the reference image plate 106 to be kept constant. This makes it possible to process the second image captured by the second camera 102 under the same conditions without performing gradation processing, etc., in the panoramic image synthesis process described below.
  • FIG. 4 is a perspective view of the imaging device 10, with the external dome 110 removed.
  • FIG. 5 is a perspective view of the entire imaging device 10.
  • the overall size of the imaging device 10 is, for example, within 10 centimeters in both length and width.
  • Such an imaging device 10 can also be attached, for example, to the ceiling of the monitoring area.
  • FIG. 6 is a diagram showing a schematic diagram of the manufacturing process of the reference image plate 106.
  • FIG. 6(a) is a cross-sectional view of the base member 1060 of the reference image plate 106.
  • the base member 1060 is a transparent spherical member.
  • the base member 1060 is, for example, acrylic, polycarbonate, or the like.
  • the spherical member is, for example, a part of a hollow sphere with a center point c106.
  • Figure 6(b) is a diagram showing the process of applying white paint 1062 to base member 1060.
  • White paint 1062 is applied from the outside of base member 1060.
  • FIG. 6(c) is a diagram showing the process of drawing the lines of the chart in the white paint 1062: the lines are drawn by peeling off the paint with a very fine laser from the outside of the white paint 1062.
  • FIG. 6(d) is a diagram showing the process of applying the light-blocking black paint 1064 from the outside of the white paint 1062 after the lines of the chart have been drawn.
  • FIG. 6(e) is a diagram showing the squares 106a that make up the chart diagram. In this way, the squares 106a are drawn in a checkerboard pattern on the base member 1060, and the reference image plate 106 is manufactured.
  • the combination of black and white used in the chart maximizes contrast, but the chart is not limited to this. For example, other color combinations are also possible as long as they provide the level of contrast required for the chart to be recognized.
  • Fig. 7 is a schematic diagram showing a zx-axis cross section passing through intersection point C of the spherical reference image plate 106.
  • Fig. 8 is a schematic diagram showing a yx-axis cross section passing through intersection point C of the spherical reference image plate 106.
  • the center point c106 of the sphere of the reference image plate 106 is disposed, for example, at the same position as intersection point C (see Fig. 2). With such an arrangement, a constant distance is always maintained along the second optical axis OL2 between the imaging surface of the second camera 102 and the inner surface of the sphere of the reference image plate 106.
  • FIG. 9 is a conceptual diagram that shows a schematic representation of the inner surface of the reference image plate 106 facing the second camera 102.
  • a chart of squares 106a is drawn in a checkerboard pattern.
  • the squares 106a are drawn as straight lines, but because they are drawn on a spherical surface as described above (see FIG. 6), when viewed in a plan view, each side will be curved depending on its position.
  • each square 106a may have an identification symbol 106c drawn on it to indicate its position.
  • the identification symbol 106c is, for example, a different number or QR code for each square 106a.
  • the center coordinates and arrangement coordinates of each rectangle 106a are associated with the identification symbol 106c and stored in the memory unit 202 (see FIG. 12).
  • the coordinates can be acquired when generating a panoramic image. This makes it possible to grasp the arrangement position of the image 100a (see FIG. 10) when arranging the image 100a of each section, eliminating the need to adjust the zero point of the image 100a. This also eliminates the need for angle sensors and encoders, enabling further miniaturization.
  • the second angle of view of the second camera 102 is set to correspond to the second captured image 106b, which is a margin range that includes the reference rectangle 106a.
  • the range of this second captured image 106b reflects the mechanical accuracy of the mechanical drive unit in the imaging device 10.
  • the second camera 102 captures the rectangle 106a, for example, in the numerical order of FIG. 9. In this case, if there is no deviation in the mechanical control of the mechanical drive unit, it is possible to capture only the rectangle 106a, but the range of the second captured image 106b is set by expanding the rectangle 106a in all directions, taking into account the control deviation.
  • the range of the rectangle in this second captured image 106b corresponds to the second angle of view of the second camera 102.
  • when the imaging device 10 captures the squares 106a in sequence, the shift of the center point of the square 106a within the second captured image 106b is taken into account. For example, if the range of the center point shift is ±5 mm, the range of the second captured image 106b is the range of the square 106a expanded by 5 mm in every direction. The second angle of view is set so that the square 106a is always fully included; this also makes it possible to combine images without using the poorer-quality image areas surrounding the square 106a (a worked numeric example of this margin is sketched after this list).
  • FIG. 10 is a schematic diagram showing an example of the squares 106a of the chart captured at different tilt angles in the arrangement example shown in FIG. 7. That is, this is an example of the squares 106a captured by tilting the second camera 102 about the tilt axis OT (see FIG. 2) that passes through the intersection C.
  • each square 106a can be captured with the same shape regardless of the tilt angle at which it is captured.
  • each square 106a can be captured with the same shape regardless of the pan angle at which it is captured.
  • the first angle of view of the first camera 100 is set to be equal to or greater than the second angle of view of the second camera 102.
  • the first captured images 100b of the first camera 100 are likewise captured in succession so that their capturing ranges partially overlap.
  • the first captured image 100b includes a first square area 100a corresponding to the area of the rectangle 106a included in the second image of the second camera 102.
  • the vertical and horizontal angles of view of the area of the rectangle 106a and the first square area 100a are configured to be equal.
  • FIG. 11 is a diagram showing a schematic relationship between the first captured image 100b of the first camera 100 and the second captured image 106b of the second camera 102.
  • the first captured images 100b are also captured in succession.
  • the first square area 100a corresponding to the area of the square 106a is included in the capture range of the first captured image 100b.
  • the capture of the first captured image 100b and the second captured image 106b is performed under the control of the information processing device 20. That is, the drive control unit arranged in the drive space 104s (see FIG. 2) causes the second camera 102 to capture the squares 106a, which are the multiple sections of the chart, in a predetermined order, and causes the first camera 100 to capture images in accordance with the capture of the second camera 102 (a capture-sequence sketch is given after this list).
  • FIG. 12 is a block diagram showing an example configuration of the information processing device 20.
  • the information processing device 20 has a camera control unit 200, a memory unit 202, a partition processing unit 204, a synthesis processing unit 206, a calibration processing unit 208, a display control unit 210, and a communication unit 212.
  • the CPU of the information processing device 20 can configure the camera control unit 200, the partition processing unit 204, the synthesis processing unit 206, the calibration processing unit 208, and the display control unit 210 by executing a program stored in the memory unit 202.
  • each unit may be configured with an electronic circuit.
  • the camera control unit 200 controls the imaging positions of the first camera 100 and the second camera 102 through control of the drive control unit in the driving space 104s (see FIG. 2), and causes the first captured image 100b and the second captured image 106b to be captured at each imaging position.
  • the storage unit 202 stores programs, various control parameters, captured images, etc.
  • the partition processing unit 204 recognizes the numbers in the second captured image 106b, and recognizes the coordinates indicating the range of the rectangle 106a from within the second captured image 106b. The partition processing unit 204 then generates a first rectangular area 100a from the first captured image 100b based on the coordinates indicating the range of the rectangle 106a, and stores this in the memory unit 202 in association with the recognized identification symbol.
  • the memory unit 202 also stores placement information corresponding to the rectangle 106a, which is a partition.
  • the synthesis processing unit 206 synthesizes a panoramic image based on the identification symbol associated with the first square area 100a stored in the memory unit 202 and the arrangement information (a processing sketch is given after this list).
  • the calibration processing unit 208 causes the second camera 102 to capture an image of each square 106a of the reference image board 106.
  • the partition processing unit 204 calculates the coordinates and center coordinates indicating the range of the square 106a as arrangement information, associates it with the recognized identification symbol (e.g., a number), and stores it in the memory unit 202 as shooting parameters.
  • the display control unit 210 causes the first display device 30 to display the panoramic image generated by the synthesis processing unit 206.
  • the communication unit 212 transmits the panoramic image to the monitoring device 40 via a communication interface (not shown).
  • FIG. 13 is a diagram that illustrates a processing example of the information processing device 20.
  • FIG. 13(a) is a diagram showing a first captured image 100b of the first camera 100 and a second captured image 106b of the second camera 102.
  • the camera control unit 200 controls the imaging positions of the first camera 100 and the second camera 102 to capture the first captured image 100b and the second captured image 106b at each imaging position.
  • the camera control unit 200 controls the second optical axis OL2 of the second camera 102 to coincide with the central coordinates of the rectangle 106a stored in the storage unit.
  • the camera control unit 200 then causes the first camera 100 to capture the first captured image 100b at this imaging position, and the second camera 102 to capture the second captured image 106b at this imaging position.
  • FIG. 13(b) is a diagram showing an example of the recognition process of the range of the square 106a from the second captured image 106b.
  • the partition processing unit 204 recognizes the coordinates indicating the square 106a and the identification symbol while referring to the coordinate information of the square 106a stored in the memory unit 202.
  • the partition processing unit 204 sets the range corresponding to the coordinates indicating the recognized square 106a in the first captured image 100b, generates the first square area 100a, associates it with the recognized identification symbol, and stores it in the memory unit 202.
  • because the imaging range of the second captured image 106b is limited and the square 106a is a simple rectilinear figure, the coordinate information of the square 106a can be obtained with higher recognition accuracy than feature points in a general image.
  • FIG. 13(c) is a conceptual diagram of the panoramic synthesis process using the first square region 100a.
  • the synthesis processing unit 206 synthesizes a panoramic image based on the identification symbol associated with the first square region 100a stored in the memory unit 202 and the coordinates at which the region should be placed. This process is repeated for all of the squares 106a in order.
  • the synthesis processing unit 206 may perform the process after all of the first square regions 100a have been generated, or the synthesis process may be performed sequentially during shooting. In this way, by performing the panoramic synthesis process based on the coordinate information of the squares 106a, it is possible to perform the panoramic synthesis process with a higher alignment accuracy than the accuracy of mechanical driving in the imaging device 10.
  • FIG. 14 is a block diagram showing an example of the configuration of the monitoring device 40.
  • the monitoring device 40 has a storage unit 402, a partition processing unit 204, a composition processing unit 206, a calibration processing unit 208, a display control unit 410, and a communication unit 412.
  • the monitoring device 40 may also have a configuration including the partition processing unit 204, the composition processing unit 206, and the calibration processing unit 208.
  • the CPU of the monitoring device 40 can configure the partition processing unit 204, the composition processing unit 206, the calibration processing unit 208, and the display control unit 410 by executing a program stored in the storage unit 402. Alternatively, each unit may be configured using electronic circuits.
  • the storage unit 402 stores programs, various control parameters, captured images, etc.
  • the display control unit 410 displays the panoramic image acquired via the communication unit 412 on the second display device 50.
  • the communication unit 412 receives the panoramic image from the information processing device 20.
  • the panoramic image can be generated by the information processing device 20 on the edge side, or by the monitoring device 40 on the server side. This makes it possible to select a configuration that is adapted to the processing volume, communication frequency, communication volume, etc. of the information processing device 20, the cost on the camera side, available power, communication infrastructure, etc.
  • the imaging device 10 is configured to include a first camera 100, a second camera 102 fixed to the first camera 100, and a reference image board 106 having a chart including multiple sections, which is disposed opposite the imaging surface of the second camera 102. This makes it possible to synthesize image data captured by the first camera 100 based on image data including the sections captured by the second camera 102.
  • the imaging device 10a according to the second embodiment differs from the imaging device 10 according to the first embodiment in that the second camera 102 is disposed on the first camera 100 side of the pan axis OP.
  • the differences from the imaging device 10 according to the first embodiment will be described below.
  • FIG. 15 is a vertical cross-sectional view of the imaging device 10a according to the second embodiment.
  • the second camera 102 according to the second embodiment differs from the imaging device 10 according to the first embodiment in that it is disposed on the first camera 100 side of the pan axis OP. This shortens the distance from the second camera 102 to the reference image plate 106, making it possible to miniaturize the reference image plate 106.
  • the second camera 102 can be configured to be enclosed within the inner surface of the reference image plate 106. This makes it possible to expand the pan angle of the imaging device 10a according to the second embodiment from 0 degrees to 360 degrees.
  • because the overlapping cover prevents light from entering, it is possible to obtain the same effect as the light-shielding cover 108 (see FIG. 5).
  • FIG. 16 is a cross-sectional perspective view of the reference image plate 106, showing the light source 112, which is an illumination LED.
  • FIG. 17 is a vertical cross-sectional view of the reference image plate 106.
  • the illumination light L112 from the light source 112 suppresses reflections on the second camera 102 and can indirectly illuminate the interior in a uniform manner, resulting in a more stable exposure of the second camera 102.
  • the overlapping cover prevents light from entering the interior, making it possible to obtain the same effect as the light-shielding cover 108 (see FIG. 5).
  • Figure 18 is a vertical cross-sectional view showing an example of tilting the first camera 100 and the second camera 102 with respect to the tilt axis OT.
  • Figure 18(a) shows the case where the tilt angle is 15 degrees downward
  • Figure 18(b) shows the case where the tilt angle is 15 degrees upward.
  • the hole on the lower side of the reference image plate 106 is configured to be covered by a cover that overlaps with the hole in the camera frame 104c, even when tilted up and down.
  • Figure 19 is a perspective view showing the entire imaging device 10a.
  • the overall size of the imaging device 10a is, for example, within 10 centimeters in both length and width.
  • Such an imaging device 10a can also be attached to, for example, the ceiling of a monitoring area.
  • the second camera 102 is arranged on the first camera 100 side of the pan axis OP. This shortens the distance from the second camera 102 to the reference image plate 106, making it possible to miniaturize the reference image plate 106.
  • the second camera 102 can be configured to be contained within the inner surface of the reference image plate 106. This makes it possible to expand the pan angle of the imaging device 10a from 0 degrees to 360 degrees.
  • the imaging device 10b according to the third embodiment differs from the imaging device 10a according to the second embodiment in that the second camera 102 is disposed on the first camera 100 side of the pan axis OP and performs only pan (panoramic) operations.
  • the differences from the imaging device 10a according to the second embodiment will be described below.
  • FIG. 20 is a vertical cross-sectional view of an imaging device 10b according to the third embodiment.
  • FIG. 21 is an oblique cross-sectional view with the external dome 110 removed.
  • the imaging device 10b according to the third embodiment differs from the imaging device 10a according to the second embodiment in that it does not have a tilt axis rotation mechanism 104b. This allows the imaging device 10b according to the third embodiment to be made smaller, and also allows for a simplified structure and reduced costs.
  • Figure 22 is a perspective view showing the entire imaging device 10b with the external dome 110 removed.
  • the overall size of the imaging device 10b is, for example, within 10 centimeters in both length and width.
  • Such an imaging device 10b can also be attached to, for example, the ceiling of the monitoring area.
  • FIG. 23 shows a first captured image 100b captured by the first camera 100, and a first square area 100a corresponding to the area of the square 106a.
  • the first captured image 100b is captured as a rectangle with its long side in the vertical direction. In this way, when the tilt axis rotation mechanism 104b is not provided, the pixels of the first camera 100 can be used effectively by positioning the camera so that its angle of view is vertically long.
  • the second camera 102 is placed on the first camera 100 side of the pan axis OP, and is only allowed to perform pan (panoramic) movements. This makes it possible to further miniaturize the imaging device 10b.
  • This technology can be configured as follows:
  • An imaging device comprising:
  • the first camera and the second camera are rotatable about a pan axis;
  • the first camera and the second camera are rotatable about a tilt axis;
  • (10) a light source for illuminating the chart;
  • the imaging device further comprising a drive control unit that causes the second camera to capture images of a plurality of sections of the chart in a predetermined order and causes the first camera to capture images in response to the images captured by the second camera.
  • an information processing device provided with a synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera that captures an image of a chart including a plurality of partitions.
  • the information processing device according to (17), wherein the partition processing unit recognizes an identification symbol corresponding to the partition, and the synthesis processing unit synthesizes the first partition image data based on the identification symbol.
  • the information processing device further comprising a calibration unit that generates the arrangement information based on an image captured of a chart including the plurality of sections and stores the arrangement information in the storage unit.
  • an imaging system including an imaging device and an information processing device, wherein the imaging device includes a first camera, a second camera fixed to the first camera, and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections, and the information processing device includes a synthesis processing unit that synthesizes a plurality of image data output from the first camera based on image data of the second camera that captures the chart including the plurality of sections.
  • An imaging system comprising:
  • the imaging system according to (19), wherein the information processing device further includes a camera control unit that causes the second camera to capture the multiple sections of the chart in a predetermined order and causes the first camera to capture images in accordance with the imaging of the second camera.
  • 1 Imaging system
  • 10, 10a, 10b Imaging device
  • 20 Information processing device
  • 100 First camera
  • 102 Second camera
  • 108 Light shielding cover
  • 106 Reference image plate
  • 110 External dome
  • 112 Light source
  • OL1 First optical axis
  • OL2 Second optical axis
  • OP Pan axis
  • OT Tilt axis.
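
The margin setting described above (the second captured image 106b being the square 106a expanded by the expected center-point shift) can be translated into a required angle of view with simple geometry. The following is only an illustrative sketch: the sphere radius and square size are assumed values, not dimensions given in this disclosure, and the chart is treated as locally flat.

```python
import math

# Assumed, illustrative dimensions (this disclosure does not specify them):
sphere_radius_mm = 40.0   # distance from the intersection C to the chart surface
square_size_mm = 10.0     # side length of one chart square 106a
center_shift_mm = 5.0     # expected mechanical center-point shift (the +/-5 mm example)

# The second captured image 106b must cover the square expanded by the shift
# in every direction, so its half-extent is half the square plus the shift.
half_extent_mm = square_size_mm / 2 + center_shift_mm

# Required full angle of view of the second camera so that the square 106a
# always stays inside the second captured image 106b (flat-chart approximation).
required_fov_deg = 2 * math.degrees(math.atan(half_extent_mm / sphere_radius_mm))

print(f"half extent: {half_extent_mm:.1f} mm")                        # 10.0 mm
print(f"required second angle of view: {required_fov_deg:.1f} deg")   # about 28 deg
```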
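
The capture sequence run by the drive control unit (or the camera control unit 200) amounts to pointing the second camera 102 at the stored center coordinates of each square 106a in a predetermined order and triggering both cameras at every position. The following is a minimal sketch under assumed interfaces; the move_to and capture methods are hypothetical stand-ins, not an API defined by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Square:
    symbol: str      # identification symbol 106c (e.g. a number)
    pan_deg: float   # pan angle that centers the square on the second optical axis OL2
    tilt_deg: float  # tilt angle that centers the square on OL2
    bounds: tuple    # stored coordinates of the square 106a within the second image

def capture_sequence(squares, platform, first_camera, second_camera):
    """Visit the squares 106a in the predetermined order and, at each position,
    capture one second captured image 106b and one first captured image 100b."""
    shots = []
    for sq in squares:
        platform.move_to(sq.pan_deg, sq.tilt_deg)   # hypothetical pan/tilt command
        second_img = second_camera.capture()        # second captured image 106b (chart)
        first_img = first_camera.capture()          # first captured image 100b (scene)
        shots.append((sq.symbol, sq.bounds, second_img, first_img))
    return shots
```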
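
After capture, the partition processing unit 204 locates the square 106a and its identification symbol in the second captured image 106b, maps the square's bounds onto the first captured image 100b (possible because the two cameras are fixed to each other and their angles of view are matched), and crops the first square area 100a; the synthesis processing unit 206 then places each crop at the arrangement position stored for its symbol. The sketch below assumes each capture is available as a tuple (symbol, bounds, second image, first image) and simplifies the arrangement information to grid positions; the coordinate mapping by simple rescaling is an assumption of the sketch, not the disclosed implementation.

```python
import numpy as np

def extract_first_square_area(first_img, second_img, bounds_in_second):
    """Crop from the first captured image 100b the first square area 100a that
    corresponds to the square 106a found in the second captured image 106b.
    Assumes matched angles of view, so normalized coordinates carry over."""
    x0, y0, x1, y1 = bounds_in_second          # pixel bounds in the second image
    h2, w2 = second_img.shape[:2]
    h1, w1 = first_img.shape[:2]
    px0, px1 = int(x0 / w2 * w1), int(x1 / w2 * w1)
    py0, py1 = int(y0 / h2 * h1), int(y1 / h2 * h1)
    return first_img[py0:py1, px0:px1]

def resize_nearest(img, out_h, out_w):
    """Minimal nearest-neighbour resize so that every placed tile has the same size."""
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[rows][:, cols]

def synthesize_panorama(shots, arrangement, tile_h=240, tile_w=240):
    """shots: list of (symbol, bounds_in_second, second_img, first_img).
    arrangement: identification symbol -> (row, col), i.e. the stored
    arrangement information for each square 106a."""
    n_rows = 1 + max(r for r, _ in arrangement.values())
    n_cols = 1 + max(c for _, c in arrangement.values())
    panorama = np.zeros((n_rows * tile_h, n_cols * tile_w, 3), dtype=np.uint8)
    for symbol, bounds, second_img, first_img in shots:
        tile = extract_first_square_area(first_img, second_img, bounds)
        tile = resize_nearest(tile, tile_h, tile_w)
        r, c = arrangement[symbol]
        panorama[r * tile_h:(r + 1) * tile_h, c * tile_w:(c + 1) * tile_w] = tile
    return panorama
```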

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

[Problem] To provide, through the present disclosure, an imaging device, an information processing device, and an imaging system capable of suppressing a reduction in the accuracy of image synthesis. [Solution] An imaging device according to the present disclosure comprises: a first camera; a second camera fixed to the first camera; and a reference image plate disposed facing an imaging surface of the second camera and having a chart including a plurality of sections.

Description

Imaging device, information processing device, and imaging system

This disclosure relates to an imaging device, an information processing device, and an imaging system.

An information processing device that generates one panoramic image from multiple images is known. Such a panoramic image is created by mechanically driving the imaging device into multiple imaging directions using a motor and then synthesizing the images so that the multiple captured images are continuous.

JP 2019-41208 A

However, the accuracy required for the image synthesis process can be higher than the accuracy of the mechanical drive, and the composite image produced by the image synthesis process may therefore be misaligned. Increasing the accuracy of the mechanical drive to keep pace with higher-resolution imaging devices raises costs. Furthermore, suppressing such misalignment with general image synthesis processing risks increasing the load of searching for joint positions and of the stitching process when the images are joined together.

The present disclosure therefore provides an imaging device, an information processing device, and an imaging system that can suppress a decrease in the accuracy of image synthesis processing.

In order to solve the above problems, according to the present disclosure, there is provided an imaging device comprising: a first camera; a second camera fixed to the first camera; and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections.

The first camera and the second camera are rotatable about a pan axis, and a second optical axis of the imaging system of the second camera may have an intersection with the pan axis.

A first optical axis of the imaging system of the first camera may have an intersection with the pan axis.

The reference image plate may have a chart including a plurality of sections on at least a portion of the inner surface of a hollow sphere.

The chart may be drawn as black square figures on a white background.

The first camera and the second camera are rotatable about a pan axis, and the center of the reference image plate may be on the pan axis.

The first optical axis and the second optical axis may be coaxial.

The imaging system of the second camera may be disposed closer to the first camera than the pan axis.

The first camera and the second camera are rotatable about a tilt axis, and the tilt axis may pass through the intersection point.

A light source that illuminates the chart and a light-shielding cover that shields the illumination may further be provided.

A camera frame that supports the first camera and the second camera may further be provided, and the light source and the light-shielding cover may be supported by the camera frame.

A transparent outer dome that covers the first camera, the second camera, the camera frame, the light source, and the light-shielding cover may further be provided.

A drive control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture images in accordance with the imaging of the second camera may further be provided.

The angles of view of the first camera and the second camera are fixed, and the angle of view of the first camera may be greater than or equal to the angle of view of the second camera.

In order to solve the above problems, according to the present disclosure, there is provided an information processing device comprising a synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera that captures a chart including a plurality of partitions.

A synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera that captures a chart including a plurality of partitions, may be provided.

A partition processing unit that generates first partition image data of the first camera corresponding to the partition in the second image data of the second camera may further be provided, and the synthesis processing unit may synthesize the first partition image data.

A storage unit that stores arrangement information corresponding to the partition may further be provided, and the synthesis processing unit may synthesize the first partition image data based on the arrangement information.

The partition processing unit may recognize an identification symbol corresponding to the partition, and the synthesis processing unit may synthesize the first partition image data based on the identification symbol.

In order to solve the above problems, according to the present disclosure, there is provided an imaging system comprising an imaging device and an information processing device, wherein the imaging device includes a first camera, a second camera fixed to the first camera, and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections, and the information processing device includes a synthesis processing unit that synthesizes a plurality of image data output from the first camera based on image data of the second camera that captures the chart including the plurality of sections.

The information processing device may further include a camera control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture images in accordance with the imaging of the second camera.

FIG. 1 is a block diagram showing an example of an imaging system according to an embodiment of the present disclosure.
FIG. 2 is a vertical cross-sectional view of the imaging device.
FIG. 3 is a perspective view showing a second camera, a light source, and a light-shielding cover.
FIG. 4 is a perspective view showing the imaging device.
FIG. 5 is a perspective view showing the entire imaging device.
FIG. 6 is a diagram schematically showing a manufacturing process of a reference image plate.
FIG. 7 is a schematic diagram showing a zx-axis cross section passing through an intersection point of a spherical reference image plate.
FIG. 8 is a schematic diagram showing a yx-axis cross section passing through an intersection point of a spherical reference image plate.
FIG. 9 is a conceptual diagram schematically showing the inner surface of the reference image plate on the second camera side.
FIG. 10 is a schematic diagram showing an example in which a square of the chart is captured at different tilt angles.
FIG. 11 is a diagram schematically showing a relationship between a first captured image and a second captured image.
FIG. 12 is a block diagram showing an example of the configuration of an information processing device.
FIG. 13 is a diagram schematically showing a processing example of the information processing device.
FIG. 14 is a block diagram showing an example of the configuration of a monitoring device.
FIG. 15 is a vertical cross-sectional view of an imaging device according to a second embodiment.
FIG. 16 is a cross-sectional perspective view of the reference image plate.
FIG. 17 is a vertical cross-sectional view of the reference image plate.
FIG. 18 is a diagram showing an example of a tilting operation about a tilt axis.
FIG. 19 is a perspective view showing the entire imaging device.
FIG. 20 is a vertical cross-sectional view of an imaging device according to a third embodiment.
FIG. 21 is a perspective cross-sectional view with the outer dome removed.
FIG. 22 is a perspective view showing the entire imaging device with the outer dome removed.
FIG. 23 is a diagram showing an image captured by the first camera and a square area corresponding to a square of the chart.

Embodiments of the present disclosure will be described below with reference to the drawings. The following description focuses on the main components of the imaging device, the information processing device, and the imaging system, but these may have components and functions that are not shown or described. The following description does not exclude such components and functions.

(First Embodiment)
FIG. 1 is a block diagram showing an example of an imaging system 1 according to an embodiment of the present disclosure. As shown in FIG. 1, the imaging system 1 is a system capable of generating a panoramic image, and includes an imaging device 10, an information processing device 20, a first display device 30, a monitoring device 40, and a second display device 50. The imaging system 1 can be used, for example, as a monitoring or observation system; that is, it can be used for environmental monitoring (disasters, forest fires, air pollution, and the like), recording the ecology of animals over a wide area, and monitoring plant growth and irrigation. The panoramic image according to this embodiment is an example of a wide-angle image.

The imaging device 10 is a device having a second camera 102 (see FIG. 2) fixed to a first camera 100 (see FIG. 2, described later). Using a chart image captured by the second camera 102 as an example of a reference image, the imaging device 10 can generate one panoramic image from multiple images captured by the first camera 100. Details of the imaging device 10 will be described later.

The information processing device 20 is, for example, an edge-side computer that includes a CPU (Central Processing Unit); it controls the imaging device 10 and can perform image synthesis processing for a panoramic image. The information processing device 20 can communicate with the monitoring device 40 via a network nw, which may be wired or wireless. Details of the information processing device 20 will be described later. The imaging device 10 and the information processing device 20 can also be configured as an integrated unit; for example, the circuit board of the information processing device 20, which includes the CPU, can be built into the driving space 104s (see FIG. 2) together with the drive control unit of the imaging device 10. The drive control unit has, for example, a mechanical drive control unit including a CPU and a mechanical drive unit including a motor. The mechanical drive control unit controls the entire imaging device 10, and the mechanical drive unit executes the mechanical driving of the imaging device 10 under its control. The mechanical drive control unit also controls the imaging of the first camera 100 and the second camera 102. Furthermore, the imaging device 10 and the information processing device 20 can also be configured as stand-alone devices without being connected via the network nw.
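
As a rough structural illustration of this split (an assumed sketch for explanation, not code from the disclosure; the class and method names are hypothetical), the mechanical drive control unit can be seen as the layer that turns high-level commands from the edge-side information processing device 20 into motor movements and camera triggers:

```python
class MechanicalDrive:
    """Motor layer of the drive control unit (hypothetical interface)."""
    def rotate_pan(self, deg): ...
    def rotate_tilt(self, deg): ...

class MechanicalDriveControl:
    """CPU layer of the drive control unit: controls the whole imaging device 10,
    including the imaging of the first camera 100 and the second camera 102."""
    def __init__(self, drive, first_camera, second_camera):
        self.drive = drive
        self.first_camera = first_camera
        self.second_camera = second_camera

    def point_and_shoot(self, pan_deg, tilt_deg):
        """Move to the requested pan/tilt position and trigger both cameras."""
        self.drive.rotate_pan(pan_deg)
        self.drive.rotate_tilt(tilt_deg)
        return self.second_camera.capture(), self.first_camera.capture()
```

Under this view, the information processing device 20, whether built into the drive space 104s or connected externally, would issue one such point-and-shoot request per imaging position.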

The monitoring device 40 is, for example, a cloud-side server that includes a CPU and can be used to monitor, via the network nw, the panoramic image generated by the information processing device 20. The second display device 50 is, for example, a monitor, and displays the panoramic image supplied from the monitoring device 40. Details of the monitoring device 40 will also be described later.

The imaging device 10 will now be described in detail with reference to FIGS. 2 to 8. FIG. 2 is a vertical cross-sectional view of the imaging device 10. In the following description, the vertical direction is the z axis, and the directions perpendicular to the z axis are the x and y axes. The positive direction of the z axis is taken to be vertically upward, but this is not limiting; for example, the positive direction of the z axis may be vertically downward. In other words, the imaging device 10 can also be installed upside down relative to FIG. 2.

As shown in FIG. 2, the imaging device 10 includes a first camera 100, a second camera 102, a camera platform 104, a reference image plate 106, a light-shielding cover 108, a support 109, an external dome 110, and a light source 112 (see FIG. 3). The first camera 100 is a camera that captures images of the outside world, such as scenery. The imaging system of the first camera 100 has a first optical axis OL1 and may be composed of a single lens or of multiple lenses.

 The second camera 102 is fixed to the first camera 100 and captures the reference image plate 106. The imaging system of the second camera 102 has a second optical axis OL2 and may likewise consist of a single lens or of a plurality of lenses.

 The camera platform (gimbal) 104 supports the first camera 100 and the second camera 102 and can pan and tilt them. The camera platform (gimbal) 104 is supported by the support 109.

 Mechanical drive units such as the motors that pan and tilt the camera platform 104 are disposed in the drive space 104s. The camera platform 104 has a pan-axis rotation mechanism 104a that rotates the first camera 100 and the second camera 102 about a pan axis OP, and a tilt-axis rotation mechanism 104b that rotates the first camera 100 and the second camera 102 about a tilt axis OT. The pan axis OP is a rotation axis passing vertically through the pan-axis rotation mechanism 104a, and the tilt axis OT is a rotation axis passing horizontally through the tilt-axis rotation mechanism 104b.

 The first camera 100, the second camera 102, and the light-shielding cover 108 are fixed to a camera frame 104c. The camera frame 104c is fixed to a support frame 104d via the tilt-axis rotation mechanism 104b, and the support frame 104d is rotatable by means of the pan-axis rotation mechanism 104a.

 More specifically, the pan axis OP and the tilt axis OT of the camera platform 104 are arranged to pass through, for example, the intersection C of the first optical axis OL1 of the first camera 100 and the second optical axis OL2 of the second camera 102. A spherical reference image plate 106 is fixed to the camera platform 104, and the center point of its spherical surface is, for example, the intersection C. In this embodiment, the first angle of view of the first camera 100 and the second angle of view of the second camera 102 are fixed; for example, the first angle of view of the first camera 100 is set equal to or larger than the second angle of view of the second camera 102. The pan-axis rotation mechanism 104a of the camera platform 104 can thus rotate the first optical axis OL1 of the first camera 100 and the second optical axis OL2 of the second camera 102 along the horizontal plane (the x-y plane) about the pan axis OP, while the tilt-axis rotation mechanism 104b can rotate the optical axes OL1 and OL2 along a vertical plane (the z-x plane) about the tilt axis OT.

 The reference image plate 106 is fixed to the support 109. It is disposed facing the imaging surface of the second camera 102 and has a chart that includes a plurality of sections. The light-shielding cover 108 blocks external light, as shown in FIG. 3. The support 109 supports the camera platform (gimbal) 104, the reference image plate 106, and the external dome 110.

 The external dome 110 is a transparent cover that shields the first camera 100, the second camera 102, the camera platform 104, the reference image plate 106, the light-shielding cover 108, the light source 112, and so on from the outside air. This suppresses the effects of rainfall and the like.

 FIG. 3 is a perspective view showing the second camera 102, the light source 112, and the light-shielding cover 108, with the reference image plate 106 and the external dome 110 removed. The light source 112 is, for example, an LED (Light Emitting Diode) and illuminates the reference image plate 106. The second camera 102 captures the reference image plate 106 illuminated by the light source 112. As shown in FIGS. 2 and 3, the light-shielding cover 108 blocks external light, so that the illuminance at which the second camera 102 captures the reference image plate 106 is kept constant. The second images captured by the second camera 102 can therefore be processed under identical conditions, without gradation correction or the like, in the panoramic image synthesis processing described later.

 FIG. 4 is a perspective view of the imaging device 10 with the external dome 110 removed. FIG. 5 is a perspective view of the entire imaging device 10. The overall size of the imaging device 10 is, for example, within 10 centimeters in both height and width. Such an imaging device 10 can be mounted, for example, on the ceiling of the area to be monitored.

 FIG. 6 schematically shows the manufacturing process of the reference image plate 106. FIG. 6(a) is a cross-sectional view of the base member 1060 of the reference image plate 106. The base member 1060 is a transparent spherical member made, for example, of acrylic or polycarbonate, and is, for example, a part of a hollow spherical shell centered on a center point c106.

 FIG. 6(b) shows the step of applying white paint 1062 to the base member 1060; the white paint 1062 is applied from the outside of the base member 1060. FIG. 6(c) shows the step of drawing the line pattern of the chart in the white paint 1062: the lines of the chart are drawn by stripping the paint from the outside of the white paint 1062 with an ultra-fine laser.

 FIG. 6(d) shows the step of applying black paint 1064 over the white paint 1062 in which the chart lines have been drawn: a light-blocking black paint 1064 is applied from the outside of the white paint 1062. FIG. 6(e) shows a square 106a that constitutes the chart pattern. In this way, the squares 106a are drawn in a grid pattern on the base member 1060 to manufacture the reference image plate 106. The combination of white and black in the chart maximizes contrast, but the chart is not limited to it; any other color combination may be used as long as it provides at least the contrast required for the chart to be recognized.

 Here, the relationship between the reference image plate 106, the first optical axis OL1 of the first camera 100, and the second optical axis OL2 of the second camera 102 will be described with reference to FIGS. 7 and 8. FIG. 7 schematically shows a z-x cross section passing through the intersection C of the spherical reference image plate 106, and FIG. 8 schematically shows a y-x cross section passing through the intersection C. As shown in FIGS. 7 and 8, the center point c106 of the spherical surface of the reference image plate 106 is placed, for example, at the same position as the intersection C (see FIG. 2). With this arrangement, the distance between the imaging surface of the second camera 102 on the second optical axis OL2 and the inner spherical surface of the reference image plate 106 is always maintained.

 FIG. 9 is a conceptual diagram schematically showing the inner surface of the reference image plate 106 on the second camera 102 side. On this inner surface, a chart of squares 106a is drawn in a grid pattern. For simplicity of explanation the squares 106a are drawn with straight lines, but because they are actually drawn on a spherical surface (see FIG. 6), each side becomes a curve in a plan view, depending on its position. Each square 106a may also carry an identification symbol 106c indicating its position, for example a number or a QR code that differs for each square 106a.

 The center coordinates and placement coordinates of each square 106a are associated with its identification symbol 106c and stored in the storage unit 202 (see FIG. 12). When an identification symbol 106c is recognized, the coordinates used for generating the panoramic image can therefore be obtained. Because the placement position of each sectional image 100a (see FIG. 10) is known when it is arranged, zero-point alignment of the images 100a becomes unnecessary. This also eliminates the need for angle sensors and encoders, allowing further miniaturization.
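 Purely as an illustration, the placement information described above can be thought of as a lookup table keyed by the identification symbol 106c. The Python names below (PlacementInfo, placement_table, lookup) and the example values are assumptions introduced for this sketch and are not part of the disclosure.

from dataclasses import dataclass

@dataclass
class PlacementInfo:
    center_xy: tuple   # center coordinates of the square 106a (calibration result)
    origin_xy: tuple   # coordinates at which the corresponding tile is placed in the panorama

# Keyed by the identification symbol 106c (for example, the number drawn in each square).
placement_table = {
    1: PlacementInfo(center_xy=(320.0, 240.0), origin_xy=(0, 0)),
    2: PlacementInfo(center_xy=(320.0, 240.0), origin_xy=(640, 0)),
}

def lookup(symbol):
    # Once the symbol is recognized in the second captured image, the coordinates
    # needed for panorama generation follow from a simple dictionary lookup,
    # with no zero-point alignment and no angle sensor or encoder.
    return placement_table[symbol]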

 Referring again to FIGS. 7 and 8, the second angle of view of the second camera 102 is set to correspond to the second captured image 106b, a margin range that encompasses the reference square 106a. The extent of this second captured image 106b reflects the mechanical precision of the mechanical drive unit of the imaging device 10. In this embodiment, the second camera 102 captures the squares 106a, for example, in the numerical order shown in FIG. 9. If the mechanical control of the mechanical drive unit had no deviation, capturing only the square 106a would suffice; to allow for control deviation, however, the range of the second captured image 106b is set by expanding the square 106a in all directions. The square range of this second captured image 106b corresponds to the second angle of view of the second camera 102.

 For example, the deviation of the center point of the square 106a within the second captured image 106b when the imaging device 10 captures the squares 106a in sequence is reflected. If the range of the center-point deviation is ±5 mm, for instance, the square range of the second captured image 106b is the square 106a expanded by 5 mm. In other words, the second angle of view is set so that, given the mechanical precision of the mechanical drive unit of the imaging device 10, the square 106a is always contained in the image when the squares 106a are captured in sequence. This also makes it possible to synthesize images without using the peripheral regions of the square 106a, where image quality is inferior.
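 This margin setting amounts to a trivial calculation. Assuming, purely as an example, a 40 mm reference square and a worst-case center-point deviation of ±5 mm, the window that the second angle of view must cover can be obtained as sketched below; both numbers are hypothetical.

def expanded_window(square_size_mm, deviation_mm):
    # The square 106a is widened by the worst-case deviation on every side,
    # so the window grows by twice the deviation along each dimension.
    return square_size_mm + 2.0 * deviation_mm

print(expanded_window(40.0, 5.0))  # 50.0 mm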

 FIG. 10 is a schematic diagram showing an example in which the square 106a of the chart is captured at different tilt angles in the arrangement of FIG. 7, that is, with the second camera 102 tilted about the tilt axis OT (see FIG. 2) passing through the intersection C. Because the center point c106 of the spherical surface of the reference image plate 106 coincides with the intersection C, each square 106a is captured with the same shape regardless of the tilt angle. Likewise, each square 106a is captured with the same shape regardless of the pan angle.

 As shown in FIG. 10, the first angle of view of the first camera 100 is set equal to or larger than the second angle of view of the second camera 102. It follows that, when the squares 106a of the chart are captured one after another, the imaging ranges of the first captured images 100b of the first camera 100 are also captured consecutively with overlap. A first square region 100a corresponding to the region of the square 106a contained in the second image of the second camera 102 is then contained in the first captured image 100b, and the region of the square 106a and the first square region 100a are configured to have equal vertical and horizontal angles of view.

 FIG. 11 schematically shows the relationship between the first captured image 100b of the first camera 100 and the second captured image 106b of the second camera 102. As shown in FIG. 11, when the squares 106a of the chart on the reference image plate 106 are captured consecutively in order, the first captured images 100b are also captured consecutively. As described above, the first square region 100a corresponding to the region of the square 106a is contained in the imaging range of the first captured image 100b. The capture of the first captured image 100b and of the second captured image 106b is executed under the control of the information processing device 20: the drive control unit disposed in the drive space 104s (see FIG. 2) causes the second camera 102 to capture the squares 106a, which are the plurality of sections of the chart, in a predetermined order, and causes the first camera 100 to capture an image in response to each capture by the second camera 102.
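 A minimal sketch of this capture sequence is shown below. The platform, camera, and placement-table objects are hypothetical interfaces assumed only for illustration; the disclosure does not prescribe any particular API.

def capture_sequence(platform, first_camera, second_camera, placement_table):
    captures = []
    for symbol in sorted(placement_table):        # predetermined order of the squares
        info = placement_table[symbol]
        platform.point_at(info.center_xy)         # drive until OL2 meets the stored square center
        second_image = second_camera.capture()    # chart image containing the square 106a
        first_image = first_camera.capture()      # scene image taken at the same pose
        captures.append((symbol, first_image, second_image))
    return captures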

 FIG. 12 is a block diagram showing a configuration example of the information processing device 20. The information processing device 20 has a camera control unit 200, a storage unit 202, a partition processing unit 204, a synthesis processing unit 206, a calibration processing unit 208, a display control unit 210, and a communication unit 212. By executing a program stored in the storage unit 202, the CPU of the information processing device 20 can implement the camera control unit 200, the partition processing unit 204, the synthesis processing unit 206, the calibration processing unit 208, and the display control unit 210. Alternatively, each unit may be implemented as an electronic circuit.

 The camera control unit 200 controls the imaging positions of the first camera 100 and the second camera 102 via the drive control unit in the drive space 104s (see FIG. 2), and has the first captured image 100b and the second captured image 106b captured at each imaging position. The storage unit 202 stores programs, various control parameters, captured images, and the like.

 The partition processing unit 204 recognizes the number in the second captured image 106b and also recognizes the coordinates indicating the extent of the square 106a within the second captured image 106b. Based on these coordinates, the partition processing unit 204 generates a first square region 100a from the first captured image 100b, associates it with the recognized identification symbol, and stores it in the storage unit 202. The storage unit 202 thus stores the placement information corresponding to each square 106a, that is, each section.
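 A sketch of this partition processing is given below, assuming hypothetical helpers: recognize_symbol reads the number or QR code 106c, find_square_corners returns two opposite corners of the square 106a in the second captured image, and map_to_first converts those coordinates into pixel coordinates of the first captured image (for example, from a prior calibration). None of these helpers is specified in the disclosure.

def extract_first_square_region(first_image, second_image,
                                recognize_symbol, find_square_corners, map_to_first):
    # first_image is an image array (e.g. a NumPy array) indexed as [rows, cols].
    symbol = recognize_symbol(second_image)                    # identification symbol 106c
    (u0, v0), (u1, v1) = find_square_corners(second_image)     # square 106a in the second image
    (x0, y0) = map_to_first((u0, v0))
    (x1, y1) = map_to_first((u1, v1))
    region = first_image[int(y0):int(y1), int(x0):int(x1)]     # first square region 100a
    return symbol, region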

 The synthesis processing unit 206 synthesizes a panoramic image based on the identification symbols and the placement information associated with the first square regions 100a stored in the storage unit 202. The calibration processing unit 208, for example, causes the second camera 102 to capture each square 106a of the reference image plate 106, causes the partition processing unit 204 to calculate, as placement information, the coordinates indicating the extent of each square 106a and its center coordinates, and stores them in the storage unit 202 as imaging parameters in association with the recognized identification symbol (for example, a number).
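 The calibration pass can be sketched as follows, reusing the hypothetical detector helpers from the previous sketch; storage stands in for the storage unit 202 and platform.point_at_symbol for a rough mechanical move. All of these names are illustrative assumptions.

def calibrate(platform, second_camera, symbols,
              recognize_symbol, find_square_corners, storage):
    for expected_symbol in symbols:
        platform.point_at_symbol(expected_symbol)        # rough mechanical positioning
        image = second_camera.capture()
        symbol = recognize_symbol(image)                 # identification symbol actually seen
        (u0, v0), (u1, v1) = find_square_corners(image)
        center = ((u0 + u1) / 2.0, (v0 + v1) / 2.0)
        # Extent and center of the square are stored as imaging parameters,
        # keyed by the recognized identification symbol.
        storage[symbol] = {"corners": ((u0, v0), (u1, v1)), "center": center}
    return storage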

 The display control unit 210 causes the first display device 30 to display the panoramic image generated by the synthesis processing unit 206. The communication unit 212 transmits the panoramic image to the monitoring device 40 via a communication interface (not shown).

 Here, processing examples of the units in FIG. 12 will be described in more detail with reference to FIGS. 11 and 13. FIG. 13 schematically shows a processing example of the information processing device 20.

 FIG. 13(a) shows a first captured image 100b of the first camera 100 and a second captured image 106b of the second camera 102. As shown in FIG. 13(a), the camera control unit 200 controls the imaging positions of the first camera 100 and the second camera 102 and has the first captured image 100b and the second captured image 106b captured at each imaging position. At this time, the camera control unit 200 performs control so that the second optical axis OL2 of the second camera 102 coincides with the center coordinates of the square 106a stored in the storage unit. The camera control unit 200 then causes the first camera 100 to capture the first captured image 100b at this imaging position and the second camera 102 to capture the second captured image 106b at this imaging position.

 FIG. 13(b) shows an example of the processing that recognizes the extent of the square 106a in the second captured image 106b. As shown in FIG. 13(b), the partition processing unit 204 recognizes the coordinates indicating the square 106a and the identification symbol while referring to the coordinate information of the square 106a stored in the storage unit 202. The partition processing unit 204 then sets, in the first captured image 100b, the range corresponding to the recognized coordinates of the square 106a, generates the first square region 100a, associates it with the recognized identification symbol, and stores it in the storage unit 202. Because the imaging range of the second captured image 106b is limited and the square 106a is a straight-line figure, the coordinate information of the square 106a can be obtained with higher recognition accuracy than with, for example, feature points in the image.

 FIG. 13(c) is a conceptual diagram of the panorama synthesis processing using the first square regions 100a. As shown in FIG. 13(c), the synthesis processing unit 206 synthesizes the panoramic image based on the identification symbols associated with the first square regions 100a stored in the storage unit 202 and the coordinates at which they are to be placed. This processing is repeated in order for all of the squares 106a. The synthesis processing unit 206 may run after all of the first square regions 100a have been generated, or the synthesis may be executed sequentially during shooting. By performing the panorama synthesis processing based on the coordinate information of the squares 106a in this way, the panorama can be synthesized with an alignment accuracy higher than the mechanical drive accuracy of the imaging device 10.
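 A sketch of the tile placement is shown below; the canvas size and the integer placement coordinates are hypothetical examples, and regions is assumed to map each identification symbol to its first square region 100a.

import numpy as np

def synthesize_panorama(regions, placement, canvas_shape=(2000, 6000, 3)):
    # regions: {symbol: first square region 100a (H x W x 3 array)}
    # placement: {symbol: (x, y) integer pixel coordinates in the panorama}
    panorama = np.zeros(canvas_shape, dtype=np.uint8)
    for symbol, tile in regions.items():
        x, y = placement[symbol]
        h, w = tile.shape[:2]
        panorama[y:y + h, x:x + w] = tile    # paste the tile at its chart-derived coordinates
    return panorama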

 FIG. 14 is a block diagram showing a configuration example of the monitoring device 40. The monitoring device 40 has a storage unit 402, a partition processing unit 204, a synthesis processing unit 206, a calibration processing unit 208, a display control unit 410, and a communication unit 412. The monitoring device 40 may thus also be configured to include the partition processing unit 204, the synthesis processing unit 206, and the calibration processing unit 208. By executing a program stored in the storage unit 402, the CPU of the monitoring device 40 can implement the partition processing unit 204, the synthesis processing unit 206, the calibration processing unit 208, and the display control unit 410. Alternatively, each unit may be implemented as an electronic circuit.

 The storage unit 402 stores programs, various control parameters, captured images, and the like. The display control unit 410 causes the second display device 50 to display the panoramic image acquired via the communication unit 412. The communication unit 412 receives the panoramic image from the information processing device 20. With this configuration, the panoramic image can be generated either on the edge side, by the information processing device 20, or on the server side, by the monitoring device 40. A configuration can therefore be selected that suits the processing load on the information processing device 20, the communication frequency and volume, the camera-side cost, the available power, the communication infrastructure, and so on.

 As described above, the imaging device 10 according to this embodiment includes the first camera 100, the second camera 102 fixed to the first camera 100, and the reference image plate 106 that is disposed facing the imaging surface of the second camera 102 and has a chart including a plurality of sections. The image data captured by the first camera 100 can therefore be synthesized based on the image data, including the sections, captured by the second camera 102.

(Second Embodiment)
 The imaging device 10a according to the second embodiment differs from the imaging device 10 according to the first embodiment in that the second camera 102 is disposed on the first camera 100 side of the pan axis OP. The differences from the imaging device 10 according to the first embodiment are described below.

 FIG. 15 is a vertical cross-sectional view of the imaging device 10a according to the second embodiment. As shown in FIG. 15, the imaging device 10a differs from the imaging device 10 according to the first embodiment in that the second camera 102 is disposed on the first camera 100 side of the pan axis OP. This shortens the distance from the second camera 102 to the reference image plate 106, so that the reference image plate 106 can be made smaller. The second camera 102 can also be configured to be enclosed within the inner surface of the reference image plate 106, which makes it possible to extend the pan angle of the imaging device 10a from 0 degrees up to 360 degrees. Moreover, because the overlapping covers keep ambient light out, an effect equivalent to that of the light-shielding cover 108 (see FIG. 5) is obtained.

 FIG. 16 is a cross-sectional perspective view of the reference image plate 106. As shown in FIG. 16, the light source 112, an illumination LED, is disposed behind the second camera 102. FIG. 17 is a vertical cross-sectional view of the reference image plate 106. As shown in FIG. 17, the illumination light L112 of the light source 112 is kept from being reflected into the second camera 102 and illuminates the interior indirectly and uniformly, so that a more stable exposure of the second camera 102 is obtained. In addition, because the overlapping covers keep ambient light out, an effect equivalent to that of the light-shielding cover 108 (see FIG. 5) is obtained.

 FIG. 18 is a vertical cross-sectional view showing an example in which the first camera 100 and the second camera 102 are tilted about the tilt axis OT. FIG. 18(a) shows a tilt angle of 15 degrees downward, and FIG. 18(b) shows a tilt angle of 15 degrees upward. The opening on the lower side of the reference image plate 106 is closed by a cover overlapping the opening of the camera frame 104c even when the cameras tilt up or down. FIG. 19 is a perspective view of the entire imaging device 10a. The overall size of the imaging device 10a is, for example, within 10 centimeters in both height and width, and such an imaging device 10a can be mounted, for example, on the ceiling of the area to be monitored.

 As described above, in the imaging device 10a according to this embodiment, the second camera 102 is disposed on the first camera 100 side of the pan axis OP. This shortens the distance from the second camera 102 to the reference image plate 106, so that the reference image plate 106 can be made smaller. The second camera 102 can also be configured to be enclosed within the inner surface of the reference image plate 106, which makes it possible to extend the pan angle of the imaging device 10a from 0 degrees up to 360 degrees.

(Third Embodiment)
 The imaging device 10b according to the third embodiment differs from the imaging device 10a according to the second embodiment in that the second camera 102 is disposed on the first camera 100 side of the pan axis OP and only pan operation is performed. The differences from the imaging device 10a according to the second embodiment are described below.

 FIG. 20 is a vertical cross-sectional view of the imaging device 10b according to the third embodiment. FIG. 21 is a perspective cross-sectional view with the external dome 110 removed. As shown in FIGS. 20 and 21, the imaging device 10b according to the third embodiment differs from the imaging device 10a according to the second embodiment in that it does not have the tilt-axis rotation mechanism 104b. This allows the imaging device 10b to be made even smaller and its structure to be simplified at lower cost.

 FIG. 22 is a perspective view of the entire imaging device 10b with the external dome 110 removed. The overall size of the imaging device 10b is, for example, within 10 centimeters in both height and width, and such an imaging device 10b can be mounted, for example, on the ceiling of the area to be monitored.

 FIG. 23 shows a first captured image 100b captured by the first camera 100 and the first square region 100a corresponding to the region of the square 106a. In the imaging device 10b according to the third embodiment, the first captured image 100b is captured as a rectangle whose long side is vertical. When the tilt-axis rotation mechanism 104b is omitted, arranging the camera so that its angle of view is vertically elongated in this way allows the pixels of the first camera 100 to be used effectively.

 As described above, in the imaging device 10b according to this embodiment, the second camera 102 is disposed on the first camera 100 side of the pan axis OP and only pan operation is performed. This allows the imaging device 10b to be made even smaller.

 The present technology can also take the following configurations.

(1)
 An imaging device comprising:
 a first camera;
 a second camera fixed to the first camera; and
 a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections.

(2)
 The imaging device according to (1), wherein the first camera and the second camera are rotatable about a pan axis, and a second optical axis of an imaging system of the second camera has an intersection with the pan axis.

(3)
 The imaging device according to (2), wherein a first optical axis of an imaging system of the first camera has an intersection with the pan axis.

(4)
 The imaging device according to (1), wherein the reference image plate has the chart including the plurality of sections on at least a part of an inner surface of a hollow sphere.

(5)
 The imaging device according to (4), wherein the chart is drawn as black square line figures on a white background.

(6)
 The imaging device according to (5), wherein the first camera and the second camera are rotatable about a pan axis, and the center of the reference image plate is on the pan axis.

(7)
 The imaging device according to (3), wherein the first optical axis and the second optical axis are coaxial.

(8)
 The imaging device according to (7), wherein the imaging system of the second camera is disposed closer to the first camera than the pan axis.

(9)
 The imaging device according to (8), wherein the first camera and the second camera are rotatable about a tilt axis, and the tilt axis passes through the intersection.

(10)
 The imaging device according to (9), further comprising: a light source that illuminates the chart; and a light-shielding cover that shields the illumination.

(11)
 The imaging device according to (10), further comprising a camera frame that supports the first camera and the second camera, wherein the light source and the light-shielding cover are supported by the camera frame.

(12)
 The imaging device according to (11), further comprising a transparent external dome that covers the first camera, the second camera, the camera frame, the light source, and the light-shielding cover.

(13)
 The imaging device according to (12), further comprising a drive control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture an image in response to each capture by the second camera.

(14)
 The imaging device according to (13), wherein the angles of view of the first camera and the second camera are fixed, and the angle of view of the first camera is equal to or larger than the angle of view of the second camera.

(15)
 An information processing device comprising a synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera capturing a chart including a plurality of sections.

(16)
 The information processing device according to (15), further comprising a partition processing unit that generates first section image data of the first camera corresponding to the section in second image data of the second camera, wherein the synthesis processing unit synthesizes the first section image data.

(17)
 The information processing device according to (16), further comprising a storage unit that stores placement information corresponding to the sections, wherein the synthesis processing unit synthesizes the first section image data based on the placement information.

(18)
 The information processing device according to (17), wherein the partition processing unit recognizes an identification symbol corresponding to the section, and the synthesis processing unit synthesizes the first section image data based on the identification symbol.

(19)
 The information processing device according to (17), further comprising a calibration unit that generates the placement information based on an image obtained by capturing the chart including the plurality of sections and stores the placement information in the storage unit.

(20)
 An imaging system comprising an imaging device and an information processing device, wherein
 the imaging device includes: a first camera; a second camera fixed to the first camera; and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections, and
 the information processing device includes a synthesis processing unit that synthesizes a plurality of image data output from the first camera based on image data of the second camera capturing the chart including the plurality of sections.

(21)
 The imaging system according to (20), wherein the information processing device further includes a camera control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture an image in response to each capture by the second camera.

 The aspects of the present disclosure are not limited to the individual embodiments described above but include various modifications that those skilled in the art may conceive, and the effects of the present disclosure are likewise not limited to those described above. That is, various additions, changes, and partial deletions are possible without departing from the conceptual idea and spirit of the present disclosure derived from the matters defined in the claims and their equivalents.

1: imaging system; 10, 10a, 10b: imaging device; 20: information processing device; 100: first camera; 102: second camera; 108: light-shielding cover; 106: reference image plate; 110: external dome; 112: light source; OL1: first optical axis; OL2: second optical axis; OP: pan axis; OT: tilt axis

Claims (20)

1. An imaging device comprising:
 a first camera;
 a second camera fixed to the first camera; and
 a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections.

2. The imaging device according to claim 1, wherein the first camera and the second camera are rotatable about a pan axis, and a second optical axis of an imaging system of the second camera has an intersection with the pan axis.

3. The imaging device according to claim 2, wherein a first optical axis of an imaging system of the first camera has an intersection with the pan axis.

4. The imaging device according to claim 1, wherein the reference image plate has the chart including the plurality of sections on at least a part of an inner surface of a hollow sphere.

5. The imaging device according to claim 4, wherein the chart is drawn as black square line figures on a white background.

6. The imaging device according to claim 5, wherein the first camera and the second camera are rotatable about a pan axis, and the center of the reference image plate is on the pan axis.

7. The imaging device according to claim 3, wherein the first optical axis and the second optical axis are coaxial.

8. The imaging device according to claim 7, wherein the imaging system of the second camera is disposed closer to the first camera than the pan axis.

9. The imaging device according to claim 8, wherein the first camera and the second camera are rotatable about a tilt axis, and the tilt axis passes through the intersection.

10. The imaging device according to claim 9, further comprising: a light source that illuminates the chart; and a light-shielding cover that shields the illumination.

11. The imaging device according to claim 10, further comprising a camera frame that supports the first camera and the second camera, wherein the light source and the light-shielding cover are supported by the camera frame.

12. The imaging device according to claim 11, further comprising a transparent external dome that covers the first camera, the second camera, the camera frame, the light source, and the light-shielding cover.

13. The imaging device according to claim 12, further comprising a drive control unit that causes the second camera to capture the plurality of sections of the chart in a predetermined order and causes the first camera to capture an image in response to each capture by the second camera.

14. The imaging device according to claim 13, wherein the angles of view of the first camera and the second camera are fixed, and the angle of view of the first camera is equal to or larger than the angle of view of the second camera.

15. An information processing device comprising a synthesis processing unit that synthesizes a plurality of image data output from a first camera fixed to a second camera, based on image data of the second camera capturing a chart including a plurality of sections.

16. The information processing device according to claim 15, further comprising a partition processing unit that generates first section image data of the first camera corresponding to the section in second image data of the second camera, wherein the synthesis processing unit synthesizes the first section image data.

17. The information processing device according to claim 16, further comprising a storage unit that stores placement information corresponding to the sections, wherein the synthesis processing unit synthesizes the first section image data based on the placement information.

18. The information processing device according to claim 17, wherein the partition processing unit recognizes an identification symbol corresponding to the section, and the synthesis processing unit synthesizes the first section image data based on the identification symbol.

19. The information processing device according to claim 17, further comprising a calibration unit that generates the placement information based on an image obtained by capturing the chart including the plurality of sections and stores the placement information in the storage unit.

20. An imaging system comprising an imaging device and an information processing device, wherein
 the imaging device includes: a first camera; a second camera fixed to the first camera; and a reference image plate disposed opposite an imaging surface of the second camera and having a chart including a plurality of sections, and
 the information processing device includes a synthesis processing unit that synthesizes a plurality of image data output from the first camera based on image data of the second camera capturing the chart including the plurality of sections.
PCT/JP2024/028916 2023-09-25 2024-08-13 Imaging device, information processing device, and imaging system Pending WO2025069752A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-161226 2023-09-25
JP2023161226 2023-09-25

Publications (1)

Publication Number Publication Date
WO2025069752A1 true WO2025069752A1 (en) 2025-04-03

Family

ID=95203757

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/028916 Pending WO2025069752A1 (en) 2023-09-25 2024-08-13 Imaging device, information processing device, and imaging system

Country Status (1)

Country Link
WO (1) WO2025069752A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005012619A (en) * 2003-06-20 2005-01-13 Mitsubishi Electric Corp Panorama image generator
JP2006229802A (en) * 2005-02-21 2006-08-31 Hitachi Ltd Image composition device and imaging system
US20140084789A1 (en) * 1993-02-26 2014-03-27 Donnelly Corporation Vehicular vision system
WO2017146224A1 (en) * 2016-02-24 2017-08-31 株式会社Tbwa Hakuhodo Photographing system and data generation method
JP2020024687A (en) * 2018-07-27 2020-02-13 大日本印刷株式会社 Information processing device, information processing system, information processing method, and program


Similar Documents

Publication Publication Date Title
US20210207954A1 (en) Apparatus and method for measuring a three-dimensional shape
US12075182B2 (en) Background display device, background display system, recording system, camera system, digital camera and method of controlling a background display device
US9197800B2 (en) Imaging robot
US20110001818A1 (en) Three dimensional shape measurement apparatus
JP5086687B2 (en) Laser processing equipment
JP3025255B1 (en) Image data converter
US20040169827A1 (en) Projection display apparatus
CN113508454B (en) Light-emitting inspection device for micro-light-emitting diodes and inspection method using the device
CN108803198B (en) Shooting device, control method and device applied to shooting device
US20150185462A1 (en) Microscope And Magnifying Observation Method Using The Same
JP2020128931A (en) Inspection equipment
US20220235915A1 (en) Color mixing from different light sources
US7791805B2 (en) Multifocal lens array and three-dimensional stereoscopic image display apparatus
WO2025069752A1 (en) Imaging device, information processing device, and imaging system
KR20200134188A (en) System of generating 3D image data
KR20190110429A (en) Pattern drawing apparatus and pattern drawing method
JP7082927B2 (en) Exposure device
JP3873163B2 (en) Photographed image processing apparatus, photographing apparatus, and storage medium
JP7328824B2 (en) Three-dimensional shape measuring device and three-dimensional shape measuring method
WO2018154871A1 (en) Observation device, observation system, and method for controlling observation device
US8885051B2 (en) Camera calibration method and camera calibration apparatus
WO2022059546A1 (en) Photographing apparatus, photographing system, and control method
CN116503240A (en) Configuration and splicing method and system for annular rotation ground projection panoramic picture
JPH09180513A (en) Illumination control device
JP7636824B1 (en) Method and device for visual inspection of mobile terminals

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24871554

Country of ref document: EP

Kind code of ref document: A1