
WO2023013108A1 - Multi-camera device - Google Patents

Multi-camera device

Info

Publication number
WO2023013108A1
WO2023013108A1 · PCT/JP2022/005257 · JP2022005257W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
imaging
camera device
cameras
imaging timing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/005257
Other languages
French (fr)
Japanese (ja)
Inventor
琢馬 大里
春樹 的野
健 永崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo Ltd filed Critical Hitachi Astemo Ltd
Priority to: DE112022001445.8T (published as DE112022001445T5)
Publication of: WO2023013108A1
Legal status: Ceased

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/10 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C3/14 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof

Definitions

  • The present invention relates to a multi-camera device.
  • As background art in this technical field, there is Japanese Patent Application Laid-Open No. 2011-205388 (Patent Document 1).
  • That publication describes the problem as "to make it possible to more easily detect the deviation of the photographing timing of a stereo camera." Images obtained by photographing a presentation signal, whose luminance changes periodically at a predetermined frequency, with the two left and right cameras constituting a stereo camera are input to a generation unit 91, which generates left and right capture signals representing the temporal change in the luminance. If the frequency of the presentation signal is greater than the Nyquist frequency of the camera frame rate, a filtering unit 93 extracts left and right aliased signals of a specific frequency from the capture signals by filtering. A matching processing unit 94 detects the phase shift between the left and right aliased signals, and a shift amount calculation unit 95 detects the shift in the imaging timing of the cameras from that phase shift.
  • A stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distances. Distance accuracy and processing cycle are two indicators of such a device's performance, but they are in a trade-off relationship.
  • A synchronous stereo camera, which synchronizes multiple cameras and captures images at the same time, excels in distance accuracy.
  • An asynchronous stereo camera, which captures images at different times with multiple cameras, excels in processing cycle. Because the required distance accuracy and processing cycle change with the situation, adopting only one of the two means that the required performance cannot be satisfied in every scene.
  • An object of the present invention is to provide a single multi-camera device that can satisfy a required distance accuracy and imaging cycle that change according to the situation.
  • According to the present invention, it is possible to provide a multi-camera device that satisfies the required performance by appropriately switching the cameras between synchronous and asynchronous operation even when the required distance accuracy and imaging cycle change.
  • FIG. 1 is an explanatory diagram showing the configuration of the multi-camera device according to the first embodiment.
  • FIG. 2 is an explanatory diagram showing the configuration of the in-vehicle multi-camera device according to the first embodiment.
  • FIG. 3 is an explanatory diagram of the image processing unit according to the first embodiment.
  • FIG. 4 is an explanatory diagram of the parallax calculation unit according to the first embodiment.
  • FIG. 5 is an explanatory diagram of the difference in distance measurement processing between a synchronous stereo camera and an asynchronous stereo camera.
  • FIG. 6 is an explanatory diagram of the difference in imaging cycle between a synchronous stereo camera and an asynchronous stereo camera.
  • FIG. 7 is an explanatory diagram of the synchronous/asynchronous switching unit according to the first embodiment.
  • FIG. 8 is an explanatory diagram of the parallax calculation unit according to the second embodiment.
  • FIG. 9 is an explanatory diagram of the required performance calculation unit according to the second embodiment.
  • FIG. 10 is an explanatory diagram of the required performance integration determination unit according to the second embodiment.
  • FIG. 11 is an explanatory diagram of a case in which imaging timing is set for each camera according to the third embodiment.
  • Embodiment 1 will be described with reference to FIGS. 1 to 7.
  • FIG. 1 shows the configuration of the multi-camera device.
  • The multi-camera device 10 is composed of a multi-camera 12 made up of a plurality of cameras 11, a memory 13, a CPU 14, an image processing unit 15, and an external output unit 16. Images captured by (each camera 11 of) the multi-camera 12 are stored in the memory 13.
  • The image processing unit 15 performs processing such as object detection on the stored images, and the obtained results are output to the outside through the external output unit 16.
  • The multi-camera device 10 is mounted on a vehicle (hereinafter also referred to as the own vehicle) 1 and captures images of the vehicle's surroundings using the plurality of cameras 11.
  • Using the parallax obtained from a plurality of images taken by an arbitrary set of two or more of the cameras, the device recognizes the vehicle's surroundings (the outside world), for example by detecting target objects, and outputs the obtained information to the control command unit 2 serving as a controller.
  • Based on the input information (such as information on detected target objects), the control command unit 2 determines the operating states of the vehicle's control means, such as the brakes (not shown), the engine 3, and the steering 4.
  • The vehicle 1 is thereby controlled (acceleration/deceleration control and steering control). Since the vehicle 1 travels in a variety of scenes, such as highways and shopping streets, the synchronous/asynchronous switching of this embodiment is necessary in order to always satisfy the required performance (specifically, the required distance accuracy and imaging cycle (processing cycle)).
  • The image processing unit 15 of this embodiment includes a parallax calculation unit 31 and a synchronous/asynchronous switching unit 32.
  • The synchronous/asynchronous switching unit 32 determines whether (each camera 11 of) the multi-camera 12 captures images synchronously or asynchronously according to the required distance accuracy and imaging cycle (specifically, it outputs a synchronous or asynchronous flag), as described later. This result is transmitted to the multi-camera 12, where it controls the imaging timing of the cameras 11, and is also used by the parallax calculation unit 31 to correct the parallax of the asynchronous stereo camera. In other words, the synchronous/asynchronous switching unit 32 functions as an imaging timing switching unit that switches the plurality of cameras 11 between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle.
  • The parallax calculation unit 31 is composed of the blocks shown in FIG. 4.
  • In the parallax calculation processing unit 41, the parallax calculation unit 31 receives at least two images, image 1 and image 2, from the multi-camera 12 and executes parallax calculation processing.
  • For a specific point in image 1, the position in image 2 where the same point appears is searched for.
  • The displacement between the coordinate positions of the points found by this search is called parallax, and it is converted into distance.
  • A parallax correction amount is then calculated by the parallax correction amount calculation processing unit 43, based on the flag (synchronous flag or asynchronous flag) indicating the synchronous/asynchronous state set in the multi-camera 12 by the synchronous/asynchronous switching unit 32, and the parallax correction processing unit 42 applies the correction to obtain the final parallax.
  • FIG. 5 illustrates this parallax correction processing.
  • In the synchronous stereo camera, the parallax for the target object 51 is calculated from the image obtained from the left camera 52 at a certain time t and the image obtained from the right camera 53 at the same time t.
  • In the asynchronous stereo camera, the parallax is calculated from the images obtained from the left camera 52 at time t and the right camera 54 at a different time t+1. If the sensor or the target object is moving, the distance to the target object 51 differs between the moments at which these two images are captured, so the calculated parallax must be corrected by the time-change correction 55.
  • FIG. 6 shows the difference in imaging cycle between the two configurations.
  • When the imaging cycle of a single camera is Δt, the imaging cycle of the synchronous stereo camera shown in FIG. 6 is also Δt.
  • In the asynchronous stereo camera, when the images are captured shifted by exactly half the cycle, the effective imaging cycle is Δt/2. Since no new image can be obtained until one imaging cycle has elapsed after an image is acquired, the shorter the imaging cycle, the higher the responsiveness of the system.
  • A shorter imaging cycle also means less deformation between images for objects whose shape changes moment to moment, and less movement between images for moving objects, which makes tracking easier.
  • The synchronous/asynchronous switching unit 32 determines whether (each camera 11 of) the multi-camera 12 captures images synchronously or asynchronously according to the required distance accuracy and imaging cycle. As described above, a synchronous flag is output when distance accuracy is needed, and an asynchronous flag is output when a short imaging cycle is needed.
  • In S73, a determination is made about the surrounding environment based on the object detection results.
  • It is determined whether there is a blind spot area around the vehicle (in other words, whether there is a blind spot from which an object could suddenly appear). If there are few blind spots around the vehicle and visibility is good, an object is unlikely to appear suddenly; if visibility is poor, the asynchronous stereo camera, which can shorten the imaging cycle, is desirable. Therefore, if visibility is good in S73, the process proceeds to S74; if it is poor, the process proceeds to S76 and an asynchronous flag is output.
  • In S74, a determination is made based on the travel road.
  • It is determined whether the vehicle will continue straight or turn, judged from map information or the like.
  • The vehicle's travel path is estimated using map information and white-line detection results from the sensors, and when the possibility of turning is high, the device switches to the asynchronous stereo camera. That is, if the estimated travel road is straight in S74, the process proceeds to S75 and a synchronous flag is output; if it is a curve, the process proceeds to S76 and an asynchronous flag is output.
  • Each judgment may be weighted so that important judgments take priority, or each judgment may be scored and the total calculated.
  • The total score may then determine the final decision.
  • A stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distances. Distance accuracy and processing cycle are two indicators of such a device's performance, but they are in a trade-off relationship.
  • When multiple cameras are synchronized, the processing cycle of the entire device is equal to the imaging cycle of a single camera.
  • A device in which multiple cameras capture images at different times is called an asynchronous stereo camera.
  • In Patent Document 1, the matching processing unit 94 detects the phase shift between the left and right aliased signals, and the shift amount calculation unit 95 detects the shift in the imaging timing of the cameras from that phase shift; the amount of deviation in measured distance caused by imaging at different times is estimated, and the ranging result is corrected.
  • Although the ranging accuracy in this case is inferior to that of a synchronous stereo camera, the processing cycle of the entire device is shorter than the imaging cycle of a single camera because the cameras capture images at different times.
  • Synchronous stereo cameras excel in distance accuracy and asynchronous stereo cameras excel in processing cycle, but because the required distance accuracy and processing cycle change with the situation, adopting only one of the two means that the required performance cannot be satisfied in every scene.
  • The multi-camera device 10 of this embodiment described above recognizes the outside world using parallax obtained from a plurality of images taken by an arbitrary set of two or more of a plurality of cameras.
  • It comprises an imaging timing switching unit (the synchronous/asynchronous switching unit 32) that switches the plurality of cameras between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle, and a parallax calculation unit 31 that calculates the parallax using the plurality of images and the amount of deviation in imaging timing between the cameras.
  • The desired distance accuracy and imaging cycle are determined from at least one of the following: the type of target object that determines the vehicle's control means; the presence or absence of a blind spot area around the vehicle (whether there is a blind spot from which something could suddenly dart out); and whether the vehicle is traveling straight or turning, as judged from map information or vehicle information such as the steering angle.
  • By switching the multi-camera device 10 between synchronous and asynchronous operation according to the required distance accuracy and processing cycle, a multi-camera device 10 that satisfies the required performance in a wide variety of scenes can be realized.
  • Example 2: In the first embodiment, switching the cameras between synchronous and asynchronous operation was described. In the second embodiment, in addition to that switching, an embodiment is described in which the amount of imaging timing deviation between the cameras of an asynchronous stereo camera is changed smoothly.
  • FIG. 8 is an explanatory diagram of the parallax calculation unit in the second embodiment.
  • In addition to the configuration of the first embodiment, the parallax calculation unit 31 of this embodiment includes a required performance calculation unit 81 and an imaging timing deviation amount calculation unit 82.
  • The required performance calculation unit 81 calculates the required performance that the multi-camera device 10 must satisfy (specifically, the required distance accuracy and imaging cycle).
  • FIG. 9 is an explanatory diagram of the required performance calculation unit 81. Blocks 91, 92, 93, and 94 in FIG. 9 calculate required performance from the same viewpoints as S71, S72, S73, and S74 in FIG. 7, respectively.
  • In this embodiment, the imaging timing deviation is not simply switched but changed smoothly, so instead of simple yes/no determinations, each viewpoint is scored and the required performance is calculated so that it varies smoothly with the scene.
  • The target-type-based required performance calculation unit 91 calculates the required performance according to the type of the target object. For example, for a sensor that can distinguish vehicles from pedestrians, pedestrians are more prone to sudden changes in movement than vehicles, so it is desirable to increase the imaging timing deviation and raise the response speed. In addition, since a higher response speed is desirable the shorter the time to collision TTC [s] is, the target imaging cycle Δt_a [s] is set from a table defined for each target object type and TTC. Vehicles and pedestrians are given as example types, but further types with different likelihoods of sudden movement changes may be added, such as trucks, which rarely change movement abruptly.
  • The blind-spot-information-based required performance calculation unit 92 calculates the required performance so that a collision can be avoided in time when an object darts out of a blind spot.
  • Let n [frames] be the number of frames needed to detect the target object and make a collision determination,
  • z [m] the distance to the target object's blind spot,
  • v_car [m/s] the speed of the own vehicle,
  • and z_th [m] the distance within which a collision can still be avoided.
  • The required imaging cycle Δt_b [s] should then be chosen to satisfy n · Δt_b · v_car ≤ z − z_th. The value of n may be changed according to the distance to the target object: at long range it can be increased to suppress erroneous braking, and at short range, where the danger is high, it can be decreased to widen the margin for applying the brakes.
  • In the vehicle-behavior-based required performance calculation unit 93, the target imaging cycle Δt_c [s] can be set, for example, so that it varies linearly with the angular velocity ω of the own vehicle. Any relationship in which Δt_c decreases as ω increases may be used, whether another formula or a lookup table of values stored in advance for each ω.
  • The surrounding-environment-based required performance calculation unit 94 varies the required imaging cycle Δt_d [s] for the scene as a whole. For example, it receives map information for the surrounding area based on GPS information; on a highway, where visibility is good, it increases Δt_d, trading responsiveness for distance accuracy, and in an environment such as a residential area, where visibility is poor and all kinds of objects may dart out, it decreases Δt_d to raise responsiveness. As with the target-type-based required performance calculation unit 91, switching among three or more settings may be performed for environments with different required imaging cycles, beyond just highways and residential areas. The environment may also be judged by scene understanding through image recognition instead of GPS.
  • FIG. 10 shows a flowchart of the integration processing (that is, the processing that calculates the final target imaging cycle) of the required performance integration determination unit 95.
  • Because Δt_a and Δt_b are set with reference to a target object with which a collision is assumed, the times to collision TTC_a and TTC_b can be calculated for the case where that collision occurs.
  • TTC_a and TTC_b are each compared with a threshold (S101, S102) and with each other (S103); if either is below the threshold, the corresponding target imaging cycle Δt_a [s] or Δt_b [s] is adopted (S104, S105). If both are at or above the thresholds, there is time to spare before collision-avoidance control would be executed even if an object darts out, so the final required imaging cycle is calculated as a weighted average with the cycles obtained by the other means (S106).
  • The imaging timing deviation amount calculation unit 82 calculates an imaging timing deviation amount that satisfies the received required performance.
  • The imaging timing deviation amount calculation unit 82 thus functions as an imaging timing determination unit that determines the amount of imaging timing deviation between the plurality of cameras 11 according to the desired distance accuracy and imaging cycle.
  • The calculated (determined) imaging timing deviation amount is transmitted to the multi-camera 12, where it controls the imaging timing of the cameras 11, and is also sent to the parallax correction amount calculation processing unit 43 to be used in the parallax correction processing for the asynchronous stereo camera.
  • The multi-camera device 10 of this embodiment described above determines the amount of imaging timing deviation between the plurality of cameras according to the desired distance accuracy and imaging cycle, and for this purpose includes an imaging timing determination unit (the imaging timing deviation amount calculation unit 82).
  • The device further includes a required performance calculation unit 81 that calculates the (final) required imaging cycle used to determine the amount of imaging timing deviation between the plurality of cameras.
  • According to this embodiment, by smoothly changing the imaging timing deviation amount of the multi-camera device 10 in accordance with the required distance accuracy and processing cycle, a multi-camera device 10 with characteristics better suited to a wide variety of scenes than in the first embodiment can be realized.
  • Embodiment 3 describes an embodiment in which the cameras of the multi-camera device are used for different purposes.
  • Here, the multi-camera device 10 is composed of a total of three cameras: one monitoring camera installed in the environment and one camera on each of two vehicles, and the two vehicles are controlled individually.
  • FIG. 11 shows an overview of the system.
  • A base camera 111, which constitutes the monitoring camera installed in the environment, captures images at a fixed timing and transmits the images and the timing to the (in-vehicle) cameras 112 and 113 mounted on the vehicles 7 and 8.
  • The camera 112, mounted on the vehicle 7 traveling straight, must detect and measure the distance to the preceding vehicle 9 far ahead of the own vehicle, so the required ranging accuracy is high. The camera 112 therefore captures images at the same timing as the base camera 111 and is processed as a synchronous stereo camera to satisfy the required performance.
  • The camera 113, mounted on the turning vehicle 8, requires high responsiveness because the visible range changes greatly from moment to moment as the vehicle's heading changes.
  • In this way, the multi-camera device 10 of this embodiment uses the imaging timing of at least one base camera 111 among the plurality of cameras as a reference, and a plurality of imaging timing determination units are provided that determine, for each of the cameras 112 and 113 other than the base camera 111, the amount of imaging timing deviation according to the desired distance accuracy and imaging cycle of that camera.
  • As methods of setting the timing, it is conceivable to communicate and synchronize the imaging times between the cameras, or to have the base camera 111 fire (project) an infrared flash at its imaging timing and have the other cameras 112 and 113 adjust their imaging timing using, as a reference, the infrared flash appearing in their captured images (a sketch of this flash-based alignment is given after this section).
  • As described above, the multi-camera device 10 of this embodiment uses the imaging timing of at least one base camera 111 among the plurality of cameras as a reference, and includes imaging timing determination units that determine, for each of the cameras 112 and 113 other than the base camera 111, the amount of imaging timing deviation according to the desired distance accuracy and imaging cycle of that camera.
  • The base camera 111 transmits its own imaging timing and image to the cameras 112 and 113 other than the base camera 111.
  • Alternatively, the base camera 111 emits an infrared flash at its own imaging timing, and the cameras 112 and 113 other than the base camera 111 determine their imaging timing deviation, according to the desired distance accuracy and imaging cycle of each camera, based on the timing of the captured infrared flash.
  • According to this embodiment, a plurality of cameras installed for different purposes, such as a surveillance camera and in-vehicle cameras, adjust their imaging timings according to their respective purposes, so that a multi-camera device 10 in which each camera satisfies different performance requirements can be realized.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above embodiments are described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements.
  • Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the above configurations may be realized partly or wholly in hardware, or by executing a program on a processor.
  • The control lines and information lines shown are those considered necessary for explanation; not all control lines and information lines in an actual product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
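
As referenced in the flash-based method of Embodiment 3 above, the following is a minimal sketch (hypothetical Python; the flash-detection timestamp and the scheduling interface are assumed stand-ins, not the patent's implementation) of how a non-base camera could project the base camera 111's timing grid forward and schedule its own exposure with a desired offset:

    def next_capture_time(flash_seen_at_s, base_period_s, desired_offset_s, now_s):
        # The infrared flash marks one exposure of the base camera 111.
        # Project its timing grid forward and schedule our own exposure
        # desired_offset_s after the next base exposure
        # (offset 0 -> synchronous stereo, offset > 0 -> asynchronous).
        elapsed = now_s - flash_seen_at_s
        periods_ahead = int(elapsed // base_period_s) + 1
        return flash_seen_at_s + periods_ahead * base_period_s + desired_offset_s

    # Base camera flashes every 100 ms; we saw one at t = 2.000 s, it is now
    # t = 2.130 s, and this camera wants to trail the base by 50 ms (async).
    print(next_capture_time(2.000, 0.100, 0.050, 2.130))  # 2.25 s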

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

Provided is a multi-camera device that makes it possible to satisfy, with a single multi-camera device, a required distance accuracy and imaging cycle that vary with circumstance. The present invention recognizes the outside world using parallax obtained from a plurality of images taken by an arbitrary set of two or more of a plurality of cameras, and comprises an imaging timing switching unit (the synchronous/asynchronous switching unit 32) that switches the plurality of cameras between synchronous and asynchronous operation in accordance with a desired distance accuracy and imaging cycle, and a parallax calculation unit 31 that calculates the parallax using the plurality of images and the deviation in imaging timing between the plurality of cameras.

Description

Multi-camera device

The present invention relates to a multi-camera device.

As background art in this technical field, there is Japanese Patent Application Laid-Open No. 2011-205388 (Patent Document 1). This publication describes the problem as "to make it possible to more easily detect the deviation of the photographing timing of a stereo camera," and describes the solution as follows: "Images obtained by photographing, with the two left and right cameras constituting a stereo camera, a presentation signal whose luminance changes periodically at a predetermined frequency are input to a generation unit 91. From the supplied left and right images, the generation unit 91 generates left and right capture signals that represent the temporal change in the luminance of the presentation signal. If the frequency of the presentation signal is greater than the Nyquist frequency of the camera frame rate, a filtering unit 93 extracts, by filtering, left and right aliased signals of a specific frequency from the capture signals. A matching processing unit 94 detects the phase shift between the left and right aliased signals, and a shift amount calculation unit 95 detects the shift in the imaging timing of the cameras from the phase shift of the aliased signals."
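
For intuition about this prior-art method, the following is a minimal sketch (Python with NumPy; not code from Patent Document 1, and the signal frequency, frame rate, sample count, and names are assumed values) of recovering a timing offset from the phase difference of two aliased capture signals:

    import numpy as np

    def capture(f_sig, fps, n, offset):
        # Luminance samples of a sinusoidal presentation signal as seen by a
        # camera whose exposures start `offset` seconds late.
        t = np.arange(n) / fps + offset
        return np.sin(2 * np.pi * f_sig * t)

    def phase_at(freq, x, fps):
        # Phase of the component of x at `freq` (single-bin DFT).
        k = np.arange(len(x))
        return np.angle(np.sum(x * np.exp(-2j * np.pi * freq * k / fps)))

    fps, f_sig, true_offset = 30.0, 100.0, 0.004     # f_sig > Nyquist (15 Hz)
    f_alias = abs(f_sig - round(f_sig / fps) * fps)  # folded-back frequency
    left = capture(f_sig, fps, 510, 0.0)             # 510 = whole alias periods
    right = capture(f_sig, fps, 510, true_offset)
    dphi = phase_at(f_alias, right, fps) - phase_at(f_alias, left, fps)
    # A timing shift appears as phase at the original signal frequency; the
    # estimate is ambiguous modulo one signal period (here 10 ms), and this
    # parameter choice folds 100 Hz to 10 Hz without spectral mirroring.
    print((dphi / (2 * np.pi * f_sig)) % (1 / f_sig))  # ~0.004 s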

JP 2011-205388 A

A stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distances. Distance accuracy and processing cycle are two indicators of such a device's performance, but they are in a trade-off relationship.

A synchronous stereo camera, which synchronizes multiple cameras and captures images at the same time, excels in distance accuracy, while an asynchronous stereo camera, which captures images at different times with multiple cameras (see Patent Document 1), excels in processing cycle. Because the required distance accuracy and processing cycle change with the situation, adopting only one of the two means that the required performance cannot be satisfied in every scene.

An object of the present invention is to provide a single multi-camera device that can satisfy a required distance accuracy and imaging cycle that change according to the situation.

To achieve the above object, the present invention is a multi-camera device that recognizes the outside world using parallax obtained from a plurality of images taken by an arbitrary set of two or more of a plurality of cameras, and comprises an imaging timing switching unit that switches the plurality of cameras between synchronous and asynchronous operation according to a desired distance accuracy and imaging cycle, and a parallax calculation unit that calculates the parallax using the plurality of images and the amount of deviation in imaging timing between the plurality of cameras.

According to the present invention, it is possible to provide a multi-camera device that satisfies the required performance by appropriately switching the cameras between synchronous and asynchronous operation even when the required distance accuracy and imaging cycle change.

Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.

FIG. 1 is an explanatory diagram showing the configuration of the multi-camera device according to the first embodiment.
FIG. 2 is an explanatory diagram showing the configuration of the in-vehicle multi-camera device according to the first embodiment.
FIG. 3 is an explanatory diagram of the image processing unit according to the first embodiment.
FIG. 4 is an explanatory diagram of the parallax calculation unit according to the first embodiment.
FIG. 5 is an explanatory diagram of the difference in distance measurement processing between a synchronous stereo camera and an asynchronous stereo camera.
FIG. 6 is an explanatory diagram of the difference in imaging cycle between a synchronous stereo camera and an asynchronous stereo camera.
FIG. 7 is an explanatory diagram of the synchronous/asynchronous switching unit according to the first embodiment.
FIG. 8 is an explanatory diagram of the parallax calculation unit according to the second embodiment.
FIG. 9 is an explanatory diagram of the required performance calculation unit according to the second embodiment.
FIG. 10 is an explanatory diagram of the required performance integration determination unit according to the second embodiment.
FIG. 11 is an explanatory diagram of a case in which imaging timing is set for each camera according to the third embodiment.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.

[Example 1]
Embodiment 1 will be described with reference to FIGS. 1 to 7.

FIG. 1 shows the configuration of the multi-camera device. The multi-camera device 10 is composed of a multi-camera 12 made up of a plurality of cameras 11, a memory 13, a CPU 14, an image processing unit 15, and an external output unit 16. Images captured by (each camera 11 of) the multi-camera 12 are stored in the memory 13. The image processing unit 15 performs processing such as object detection on the stored images, and the obtained results are output to the outside through the external output unit 16.

In this embodiment, the in-vehicle multi-camera device shown in FIG. 2 will be described. The multi-camera device 10 is mounted on a vehicle (hereinafter also referred to as the own vehicle) 1 and captures images of the vehicle's surroundings. Using the parallax obtained from a plurality of images taken by an arbitrary set of two or more of the plurality of cameras 11, it recognizes the vehicle's surroundings (the outside world), for example by detecting target objects, and outputs the obtained information to the control command unit 2 serving as a controller. Based on the input information (such as information on detected target objects), the control command unit 2 determines the operating states of the vehicle's control means, such as the brakes (not shown), the engine 3, and the steering 4, and controls the vehicle 1 (acceleration/deceleration control and steering control). Since the vehicle 1 travels in a variety of scenes, such as highways and shopping streets, the synchronous/asynchronous switching of this embodiment is necessary in order to always satisfy the required performance (specifically, the required distance accuracy and imaging cycle (processing cycle)).

Next, the image processing unit 15 will be described with reference to FIG. 3. The image processing unit 15 of this embodiment includes a parallax calculation unit 31 and a synchronous/asynchronous switching unit 32.

First, the synchronous/asynchronous switching unit 32 determines whether (each camera 11 of) the multi-camera 12 captures images synchronously or asynchronously according to the required distance accuracy and imaging cycle (specifically, it outputs a synchronous flag or an asynchronous flag), as described later. This result is transmitted to the multi-camera 12, where it controls the imaging timing of the cameras 11, and is also used by the parallax calculation unit 31 to correct the parallax of the asynchronous stereo camera. In other words, the synchronous/asynchronous switching unit 32 functions as an imaging timing switching unit that switches the plurality of cameras 11 between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle.

The parallax calculation unit 31 is composed of the blocks shown in FIG. 4. In the parallax calculation processing unit 41, the parallax calculation unit 31 receives at least two images, image 1 and image 2, from the multi-camera 12 and executes parallax calculation processing. In this processing, for a specific point in image 1, the position in image 2 where the same point appears is searched for. The displacement between the coordinate positions of the points found by this search is called parallax, and it is converted into distance. For the calculated parallax (distance), a parallax correction amount is calculated by the parallax correction amount calculation processing unit 43 based on the flag (synchronous flag or asynchronous flag) indicating the synchronous/asynchronous state set in the multi-camera 12 by the synchronous/asynchronous switching unit 32, and the parallax correction processing unit 42 applies the correction to obtain the final parallax.

FIG. 5 illustrates this parallax correction processing. In the synchronous stereo camera shown in the upper part of FIG. 5, the parallax for the target object 51 is calculated from the image obtained from the left camera 52 at a certain time t and the image obtained from the right camera 53 at the same time t. In contrast, in the asynchronous stereo camera shown in the lower part of FIG. 5, the parallax is calculated from the images obtained from the left camera 52 at time t and the right camera 54 at a different time t+1. If the sensor or the target object is moving, the distance to the target object 51 differs between the moments at which these two images are captured, so the distance cannot be obtained from the raw parallax; the calculated parallax must be corrected by the time-change correction 55. In other words, the parallax must be calculated (corrected) using the two images and the amount of deviation in imaging timing between the left and right cameras (corresponding to the time-change correction 55). Since this correction contains error, the distance accuracy of an asynchronous stereo camera is generally inferior to that of a synchronous stereo camera.
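
As a rough illustration of this correction, here is a minimal sketch (Python; the constant image-velocity model and all names and numbers are assumptions for illustration, not the patent's implementation) that shifts the later image's coordinate back to the reference time before triangulating:

    def distance_from_disparity(d_px, focal_px, baseline_m):
        # Standard triangulation for a rectified stereo pair.
        return focal_px * baseline_m / d_px

    def corrected_disparity(x_left_t, x_right_t1, flow_x_px_s, dt_offset_s):
        # The right image was exposed dt_offset_s after the left one, so the
        # point had already moved by roughly flow * dt in the image. Shift its
        # right-image coordinate back to the left camera's exposure time.
        x_right_t = x_right_t1 - flow_x_px_s * dt_offset_s
        return x_left_t - x_right_t

    # Example: 1000 px focal length, 0.35 m baseline, right camera 20 ms late,
    # point drifting +50 px/s horizontally in the image.
    d = corrected_disparity(x_left_t=412.0, x_right_t1=398.5,
                            flow_x_px_s=50.0, dt_offset_s=0.020)
    print(distance_from_disparity(d, focal_px=1000.0, baseline_m=0.35))

The quality of such a correction depends directly on how well the point's image motion can be estimated, which is why the asynchronous mode trades distance accuracy for cycle time.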

FIG. 6 shows the difference in imaging cycle between the two configurations. When the imaging cycle of a single camera is Δt, the imaging cycle of the synchronous stereo camera shown in the upper part of FIG. 6 is also Δt. In the asynchronous stereo camera, as shown in the lower part of FIG. 6, when the images are captured shifted by exactly half the cycle, the effective imaging cycle is Δt/2. Since no new image can be obtained until one imaging cycle has elapsed after an image is acquired, the shorter the imaging cycle, the higher the responsiveness of the system. In addition, for objects whose shape changes from moment to moment, such as people, a shorter imaging cycle means less deformation between images, and for moving objects it means less movement between images, which makes tracking easier.
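
The cycle argument can be checked with a small sketch (plain Python; the parameter names are assumed) that interleaves the frame times of N cameras sharing period Δt but started with evenly spaced offsets:

    def frame_times(n_cameras, dt, n_frames):
        # Camera i exposes at i*dt/n, i*dt/n + dt, ...: offsets of dt/n
        # interleave the streams so a new frame arrives every dt/n seconds.
        return sorted(k * dt + i * dt / n_cameras
                      for i in range(n_cameras) for k in range(n_frames))

    t = frame_times(n_cameras=2, dt=0.1, n_frames=3)
    print([round(x, 3) for x in t])                      # 0.0, 0.05, 0.1, ...
    print([round(b - a, 3) for a, b in zip(t, t[1:])])   # constant 0.05 = dt/2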

Next, the synchronous/asynchronous switching unit will be described with reference to FIG. 7. The synchronous/asynchronous switching unit 32 determines whether (each camera 11 of) the multi-camera 12 captures images synchronously or asynchronously according to the required distance accuracy and imaging cycle. As described above, a synchronous flag is output when distance accuracy is needed, and an asynchronous flag is output when a short imaging cycle is needed.

First, in S71, a determination is made according to the type of target object that determines (the operating state of) the vehicle's control means. If the target object is a vehicle, no sudden change in behavior or shape is expected, so the synchronous stereo camera is desirable. If the target object is not a vehicle (for example, a pedestrian), sudden changes in behavior or shape are expected, the imaging cycle must be shortened, and the asynchronous stereo camera is desirable. Therefore, if the target object is a vehicle in S71, the process proceeds to S72; otherwise it proceeds to S76 and an asynchronous flag is output.

Next, in S72, a determination is made based on the behavior of the own vehicle: whether the vehicle is traveling straight or turning, judged from vehicle information such as the steering angle. While the vehicle is traveling straight, the surrounding environment is unlikely to change abruptly, but while it is turning, the sensor's orientation changes greatly and the surroundings are likely to change rapidly, so the imaging cycle must be shortened and the asynchronous stereo camera is desirable. Therefore, if the vehicle is traveling straight in S72, the process proceeds to S73; if it is turning, the process proceeds to S76 and an asynchronous flag is output.

Next, in S73, a determination is made about the surrounding environment based on the object detection results: whether there is a blind spot area around the vehicle (in other words, whether there is a blind spot from which an object could suddenly appear). If there are few blind spots around the vehicle and visibility is good, an object is unlikely to appear suddenly; if many objects create blind spots and visibility is poor, an object is likely to dart out of a blind spot, so the asynchronous stereo camera, which can shorten the imaging cycle, is desirable. Therefore, if visibility is good in S73, the process proceeds to S74; if it is poor, the process proceeds to S76 and an asynchronous flag is output.

Next, in S74, a determination is made based on the travel road: whether the vehicle will continue straight or turn, judged from map information or the like. The vehicle's travel path is estimated using map information and white-line detection results from the sensors, and for the same reason as in S72 it is determined whether the vehicle will go straight or turn; when the possibility of turning is high, the device switches to the asynchronous stereo camera. That is, if the estimated travel road is straight in S74, the process proceeds to S75 and a synchronous flag is output; if it is a curve, the process proceeds to S76 and an asynchronous flag is output.

In this embodiment, the conditions are treated with equal weight, but in practice each judgment may be weighted so that important judgments take priority, or each judgment may be scored and the final decision made from the total score.
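
Collecting S71 to S76, the decision logic amounts to something like the following sketch (hypothetical Python; the boolean inputs stand in for the judgments described above, and a weighted or scored variant would replace the early returns with a running total):

    def choose_imaging_mode(target_is_vehicle, driving_straight,
                            good_visibility, road_ahead_is_straight):
        # Mirrors S71-S74: any condition that demands a short imaging cycle
        # (pedestrians, turning, blind spots, an upcoming curve) selects the
        # asynchronous mode; otherwise the synchronous mode wins for accuracy.
        if not target_is_vehicle:        # S71: sudden motion/shape changes
            return "ASYNC"
        if not driving_straight:         # S72: sensor heading changes quickly
            return "ASYNC"
        if not good_visibility:          # S73: objects may dart out of blind spots
            return "ASYNC"
        if not road_ahead_is_straight:   # S74: a turn is likely soon
            return "ASYNC"
        return "SYNC"                    # S75: prioritize distance accuracy

    print(choose_imaging_mode(True, True, True, True))    # SYNC
    print(choose_imaging_mode(True, True, False, True))   # ASYNC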

(Effects of Example 1)
A stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distances. Distance accuracy and processing cycle are two indicators of such a device's performance, but they are in a trade-off relationship.

To improve ranging accuracy, it is desirable to synchronize multiple cameras and capture images at the same time, which removes the effects of movement and deformation of the target object. In this case, the processing cycle of the entire device is equal to the imaging cycle of a single camera.

A device in which multiple cameras capture images at different times, on the other hand, is called an asynchronous stereo camera. Patent Document 1 above states that "the matching processing unit 94 detects the phase shift between the left and right aliased signals, and the shift amount calculation unit 95 detects the shift in the imaging timing of the cameras from the phase shift of the aliased signals"; the amount of deviation in measured distance caused by imaging at different times is estimated, and the ranging result is corrected. Although the ranging accuracy in this case is inferior to that of a synchronous stereo camera, the processing cycle of the entire device is shorter than the imaging cycle of a single camera because the cameras capture images at different times.

Thus, synchronous stereo cameras excel in distance accuracy and asynchronous stereo cameras excel in processing cycle, but because the required distance accuracy and processing cycle change with the situation, adopting only one of the two means that the required performance cannot be satisfied in every scene.

The multi-camera device 10 of this embodiment described above is a multi-camera device 10 that recognizes the outside world using parallax obtained from a plurality of images taken by an arbitrary set of two or more of a plurality of cameras, and comprises an imaging timing switching unit (the synchronous/asynchronous switching unit 32) that switches the plurality of cameras between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle, and a parallax calculation unit 31 that calculates the parallax using the plurality of images and the amount of deviation in imaging timing between the plurality of cameras.

The desired distance accuracy and imaging cycle are determined from at least one of the following: the type of target object that determines the vehicle's control means; the presence or absence of a blind spot area around the vehicle (whether there is a blind spot from which something could suddenly dart out); and whether the vehicle is traveling straight or turning, as judged from map information or from vehicle information such as the steering angle.

According to this embodiment, by switching the multi-camera device 10 between synchronous and asynchronous operation according to the required distance accuracy and processing cycle, a multi-camera device 10 that satisfies the required performance in a wide variety of scenes can be realized.

[Example 2]
In the first embodiment, switching the cameras between synchronous and asynchronous operation was described. In the second embodiment, in addition to that switching, an embodiment is described in which the amount of imaging timing deviation between the cameras of an asynchronous stereo camera is changed smoothly.

FIG. 8 is an explanatory diagram of the parallax calculation unit in the second embodiment. In addition to the configuration of the first embodiment described above, the parallax calculation unit 31 of this embodiment includes a required performance calculation unit 81 and an imaging timing deviation amount calculation unit 82.

The required performance calculation unit 81 calculates the required performance that the multi-camera device 10 must satisfy (specifically, the required distance accuracy and imaging cycle). FIG. 9 is an explanatory diagram of the required performance calculation unit 81. Blocks 91, 92, 93, and 94 in FIG. 9 calculate required performance from the same viewpoints as S71, S72, S73, and S74 in FIG. 7, respectively. In this embodiment, the imaging timing deviation is not simply switched but changed smoothly, so instead of simple yes/no determinations, each viewpoint is scored and the required performance is calculated so that it varies smoothly with the scene.

The target-type-based required performance calculation unit 91 calculates the required performance according to the type of the target object. For example, for a sensor that can distinguish vehicles from pedestrians, pedestrians are more prone to sudden changes in movement than vehicles, so it is desirable to increase the imaging timing deviation and raise the response speed. In addition, since a higher response speed is desirable the shorter the time to collision TTC [s] is, the target imaging cycle Δt_a [s] is set from a table defined for each target object type and TTC. Vehicles and pedestrians are given here as example types, but further types with different likelihoods of sudden movement changes may be added, such as trucks, which rarely change movement abruptly.
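
One possible shape for such a table is sketched below (Python; the TTC bands and cycle values are invented placeholders for illustration, not calibrated values from the patent):

    # Target imaging cycle dt_a [s], indexed by object type and TTC band.
    # Shorter TTC and more erratic object types demand a shorter cycle.
    DT_A_TABLE = {
        "vehicle":    [(1.0, 0.050), (3.0, 0.075), (float("inf"), 0.100)],
        "pedestrian": [(1.0, 0.025), (3.0, 0.040), (float("inf"), 0.050)],
    }

    def target_cycle_dt_a(obj_type, ttc_s):
        # Walk the bands in order and return the first one containing ttc_s.
        for ttc_upper, dt_a in DT_A_TABLE[obj_type]:
            if ttc_s < ttc_upper:
                return dt_a

    print(target_cycle_dt_a("pedestrian", 0.8))  # 0.025 s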

The blind-spot-information-based required performance calculation unit 92 calculates the required performance so that a collision can be avoided in time when an object darts out of a blind spot. When determining the target imaging cycle Δt_b [s], let n [frames] be the number of frames needed to detect the target object and make a collision determination, z [m] the distance to the target object's blind spot, v_car [m/s] the speed of the own vehicle, and z_th [m] the distance within which a collision can still be avoided. Since the vehicle must not pass the collision-avoidable distance between the moment the object appears and the collision-avoidance decision, the required imaging cycle should be chosen to satisfy the following formula. The value of n may be changed according to the distance to the target object: at long range it can be increased to suppress erroneous braking, and at short range, where the danger is high, it can be decreased to widen the margin for applying the brakes.

    n · Δt_b · v_car ≤ z − z_th,  i.e.  Δt_b ≤ (z − z_th) / (n · v_car)
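
Read as code, the constraint gives an upper bound on Δt_b directly (a sketch; the numbers are illustrative):

    def max_cycle_for_blind_spot(z_m, z_th_m, v_car_ms, n_frames):
        # The n detection frames must elapse before the vehicle, travelling
        # at v_car, closes from the blind spot distance z to the avoidance
        # limit z_th:  n * dt_b * v_car <= z - z_th.
        return (z_m - z_th_m) / (n_frames * v_car_ms)

    # 20 m to the blind spot, braking still possible at 8 m, 10 m/s, 4 frames:
    print(max_cycle_for_blind_spot(20.0, 8.0, 10.0, 4))  # 0.3 s upper bound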

In the vehicle-behavior-based required performance calculation unit 93, one conceivable method is to vary the required distance accuracy and required imaging cycle linearly with the angular velocity ω [rad/s] of the own vehicle. The larger ω is, the sharper the turn and the higher the required sensor response speed, so the target imaging cycle Δt_c [s] can be expressed, for example, by the following formula. Any relationship in which Δt_c decreases as ω increases may be used instead, whether another formula or a lookup table of values stored in advance for each ω.

    Δt_c = Δt_0 − k · ω   (k > 0; Δt_0 is the base imaging cycle)
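
One concrete linear form, as a sketch (the base cycle Δt_0, slope k, and lower clamp are assumed design constants; the text only requires that Δt_c shrink as ω grows):

    def target_cycle_dt_c(omega_rad_s, dt_base=0.10, k=0.08, dt_min=0.02):
        # Linear decrease with yaw rate, clamped so the cycle stays
        # realizable: sharper turns demand a faster sensor response.
        return max(dt_min, dt_base - k * abs(omega_rad_s))

    for w in (0.0, 0.5, 2.0):
        print(w, target_cycle_dt_c(w))   # 0.1, 0.06, 0.02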

 周辺環境による要求性能算出部94では、シーン全体における要求撮像周期Δt_d[s]を変化させる。例えばGPS情報を基に周辺の地図情報を受けとり、高速道路であれば周辺の見通しが良いためΔt_dを大きくして応答性を落として距離精度を上げ、住宅街など周辺の見通しが悪く多種多様な物体の飛び出しが想定される環境であればΔt_dを小さくして応答性を上げる。対象種別による要求性能算出部92と同じように、高速道路と住宅街だけでなく、必要となる要求撮像周期の異なる環境であれば3種類以上の切り替えを実施しても良い。また、GPSではなく画像認識によるシーン理解などで上記環境の判定をしても構わない。 The required performance calculation unit 94 based on the surrounding environment changes the required imaging cycle Δt_d[s] for the entire scene. For example, it receives map information of the surrounding area based on GPS information, and increases Δt_d to reduce responsiveness and improve distance accuracy because the visibility of the surrounding area is good on highways, and the visibility of the surrounding areas such as residential areas is poor and diverse. In an environment where an object is expected to pop out, Δt_d is decreased to increase responsiveness. As with the target type-based required performance calculation unit 92, three or more types of switching may be performed as long as the environment requires different required imaging cycles, in addition to highways and residential areas. Also, the above environment may be determined by scene understanding by image recognition instead of GPS.

The required performance values calculated from the viewpoints described above are integrated by the required performance integration determination unit 95. When they cannot all be satisfied at once, they are prioritized by the urgency of the own-vehicle control involved and integrated so that the highest-priority requirements are met first. FIG. 10 shows a flowchart of the integration processing of the required performance integration determination unit 95 (that is, the calculation of the final target imaging cycle). Specifically, because Δt_a and Δt_b are each derived from a target object for which a collision is assumed in setting the required imaging cycle, the times to collision TTC_a and TTC_b for such a collision can be calculated. TTC_a and TTC_b are compared against a threshold (S101, S102) and against each other (S103); if either is below the threshold, it takes priority over the requirements for the scene as a whole, and the corresponding target imaging cycle Δt_a [s] or Δt_b [s] is adopted (S104, S105). If both are at or above the threshold, there is enough time before collision avoidance control would have to be executed even if an object darts out, so the final required imaging cycle (the final target imaging cycle) is calculated as a weighted average (weights w_a, w_b, w_c, w_d) of the required imaging cycles obtained by the individual methods, and that value is adopted (S106).
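A sketch of the S101–S106 flow; the TTC threshold, the weights, and the tie-breaking between Δt_a and Δt_b are assumptions, since FIG. 10 is not reproduced here:

    def integrate_cycles(dt_a: float, dt_b: float, dt_c: float, dt_d: float,
                         ttc_a: float, ttc_b: float,
                         ttc_threshold: float = 2.0,
                         w: tuple = (0.25, 0.25, 0.25, 0.25)) -> float:
        """Final target imaging cycle: an imminent collision (TTC below
        the threshold) overrides the scene-wide weighted average."""
        urgent_a = ttc_a < ttc_threshold          # S101
        urgent_b = ttc_b < ttc_threshold          # S102
        if urgent_a or urgent_b:                  # S103: pick the more imminent
            if urgent_a and (not urgent_b or ttc_a <= ttc_b):
                return dt_a                       # S104
            return dt_b                           # S105
        w_a, w_b, w_c, w_d = w                    # S106: weighted average
        return ((w_a * dt_a + w_b * dt_b + w_c * dt_c + w_d * dt_d)
                / (w_a + w_b + w_c + w_d))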

The required distance accuracy and required imaging cycle obtained above are sent to the imaging timing deviation amount calculation unit 82 (FIG. 8), which calculates an imaging timing deviation amount that satisfies the received required performance. In other words, the imaging timing deviation amount calculation unit 82 functions as an imaging timing determination unit that determines the amount of deviation in imaging timing between the plurality of cameras 11 according to the desired distance accuracy and imaging cycle. The calculated (determined) imaging timing deviation amount is transmitted to the multi-camera 12 to control the imaging timing of the cameras 11, and is also sent to the parallax correction amount calculation processing unit 43, where it is used for parallax correction processing in the asynchronous stereo camera.

(Operation and Effects of Example 2)
 The multi-camera device 10 of this example described above comprises, in place of the imaging timing switching unit, an imaging timing determination unit (the imaging timing deviation amount calculation unit 82) that determines the amount of deviation in imaging timing between the plurality of cameras according to the desired distance accuracy and imaging cycle. It further comprises a required performance calculation unit 81 that calculates the (final) required imaging cycle used to determine that amount of deviation.

According to this example, by smoothly varying the imaging timing deviation amount of the multi-camera device 10 in accordance with the required distance accuracy and processing cycle, a multi-camera device 10 can be realized whose characteristics suit a wider variety of scenes than in Example 1.

[Example 3]
 Example 3 describes an embodiment in which the cameras of the multi-camera device serve different purposes. In this example, the multi-camera device 10 is composed of three cameras in total, one monitoring camera and two cameras each mounted on a vehicle, and the two vehicles are controlled individually.

FIG. 11 shows an overview of the system. The base camera 111, a monitoring camera installed in the environment, captures images at fixed timing and transmits the imaging timing and the images to the (in-vehicle) cameras 112 and 113 mounted on vehicles 7 and 8. The camera 112 on vehicle 7, which is traveling straight, must detect and range the preceding vehicle 9 far ahead of the own vehicle, so its required ranging accuracy is high. The camera 112 therefore captures images at the same timing as the base camera 111 and is processed as a synchronous stereo camera, satisfying the requirement. The camera 113 on vehicle 8, which is turning, faces a field of view that changes greatly from moment to moment as the vehicle's heading changes, so its required responsiveness is high. The camera 113 therefore shifts its imaging timing from that of the base camera 111 and is processed as an asynchronous stereo camera, satisfying the requirement. That is, the multi-camera device 10 of this example takes the imaging timing of at least one base camera 111 among the plurality of cameras as the reference, and the cameras 112 and 113 other than the base camera 111 each have an imaging timing determination unit that determines the amount of imaging timing deviation according to the distance accuracy and imaging cycle desired for that camera.
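As a sketch of this per-camera decision (the offset rule and the values are assumptions, not the disclosed control law):

    def offset_vs_base(needs_ranging_accuracy: bool, dt_required: float,
                       base_cycle: float = 0.10) -> float:
        """Shutter offset [s] of a vehicle camera relative to base camera
        111: zero keeps it synchronous (camera 112, straight travel,
        accuracy); a sub-cycle offset makes it asynchronous, shortening
        the effective sampling period toward dt_required (camera 113,
        turning)."""
        if needs_ranging_accuracy:
            return 0.0
        return min(base_cycle / 2.0, dt_required)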

This makes it possible to satisfy the required performance set individually for each camera even when images are aggregated from multiple independent cameras and the image processing serves a different purpose for each.

The timing can be set by communicating the imaging times between the cameras to synchronize them; alternatively, the base camera 111 may fire (project) an infrared flash at its imaging timing, and the other cameras 112 and 113 may adjust their imaging timing using the infrared flash captured in their images as the reference.
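One hedged sketch of the flash-based adjustment, treating the frame in which the flash appears as the base camera's capture instant and steering the local shutter phase toward the desired offset; the phase arithmetic below is an assumption about how such a loop could work, not the disclosed method:

    def shutter_correction(t_flash: float, t_next_capture: float,
                           desired_offset: float, cycle: float) -> float:
        """Correction [s] to the local shutter schedule so that captures
        land at t_flash + desired_offset (mod cycle); returns the shortest
        signed shift. A real system would filter this over many flashes."""
        target = (t_flash + desired_offset) % cycle
        current = t_next_capture % cycle
        err = (target - current) % cycle
        return err if err <= cycle / 2.0 else err - cycle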

(Operation and Effects of Example 3)
 The multi-camera device 10 of this example described above takes the imaging timing of at least one base camera 111 among the plurality of cameras as the reference, and the cameras 112 and 113 other than the base camera 111 each comprise an imaging timing determination unit that determines the amount of imaging timing deviation according to the distance accuracy and imaging cycle desired for that camera.

The base camera 111 also transmits its own imaging timing and images to the cameras 112 and 113 other than the base camera 111.

Further, the base camera 111 projects an infrared flash at its own imaging timing, and the cameras 112 and 113 other than the base camera 111 each comprise an imaging timing determination unit that determines the amount of imaging timing deviation according to the distance accuracy and imaging cycle desired for that camera, using the timing of the captured infrared flash as the reference.

According to this example, multiple cameras installed for different purposes, such as a monitoring camera and in-vehicle cameras, adjust their imaging timing to suit their respective purposes, realizing a multi-camera device 10 in which each camera satisfies its own, different required performance.

The present invention is not limited to the examples described above and includes various modifications. For example, the above examples are described in detail to explain the invention clearly and are not necessarily limited to configurations having all of the described elements. Part of the configuration of one example can be replaced with the configuration of another example, the configuration of one example can be added to that of another, and part of the configuration of each example can have other configurations added, deleted, or substituted.

Each of the above configurations may be realized partly or wholly in hardware, or by a processor executing a program. The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all configurations may be regarded as interconnected.

1: vehicle (own vehicle), 10: multi-camera device, 11: camera, 12: multi-camera, 13: memory, 14: CPU, 15: image processing unit, 16: external output unit, 31: parallax calculation unit, 32: synchronous/asynchronous switching unit (imaging timing switching unit), 81: required performance calculation unit (Example 2), 82: imaging timing deviation amount calculation unit (imaging timing determination unit) (Example 2)

Claims (9)

1. A multi-camera device that recognizes the external environment using a parallax obtained from a plurality of images captured by any set of two or more of a plurality of cameras, the device comprising:
 an imaging timing switching unit that switches the plurality of cameras between synchronous and asynchronous operation according to a desired distance accuracy and imaging cycle; and
 a parallax calculation unit that calculates the parallax using the plurality of images and an amount of deviation in imaging timing between the plurality of cameras.

2. The multi-camera device according to claim 1, comprising, in place of the imaging timing switching unit, an imaging timing determination unit that determines the amount of deviation in imaging timing between the plurality of cameras according to the desired distance accuracy and imaging cycle.

3. The multi-camera device according to claim 2, further comprising a required performance calculation unit that calculates a required imaging cycle for determining the amount of deviation in imaging timing between the plurality of cameras.

4. The multi-camera device according to claim 1, wherein the desired distance accuracy and imaging cycle are determined by the type of the target object that determines the control means of the own vehicle.

5. The multi-camera device according to claim 1, wherein the desired distance accuracy and imaging cycle are determined by the presence or absence of a blind-spot area around the own vehicle.

6. The multi-camera device according to claim 1, wherein the desired distance accuracy and imaging cycle are determined by whether the own vehicle is traveling straight or turning, as judged from at least one of map information and vehicle information.

7. The multi-camera device according to claim 1, wherein the imaging timing of at least one base camera among the plurality of cameras is used as a reference, and each camera other than the base camera comprises an imaging timing determination unit that determines the amount of deviation in imaging timing according to the distance accuracy and imaging cycle desired for that camera.

8. The multi-camera device according to claim 7, wherein the base camera transmits its own imaging timing and images to the cameras other than the base camera.

9. The multi-camera device according to claim 7, wherein the base camera projects an infrared flash at its own imaging timing, and each camera other than the base camera comprises an imaging timing determination unit that determines the amount of deviation in imaging timing according to the distance accuracy and imaging cycle desired for that camera, using the timing of the captured infrared flash as a reference.
PCT/JP2022/005257 2021-08-02 2022-02-10 Multi-camera device Ceased WO2023013108A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112022001445.8T DE112022001445T5 (en) 2021-08-02 2022-02-10 MULTICAMERA DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-126945 2021-08-02
JP2021126945A JP7717524B2 (en) 2021-08-02 2021-08-02 Multi-camera equipment

Publications (1)

Publication Number Publication Date
WO2023013108A1 (en)

Family

ID=85155463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/005257 Ceased WO2023013108A1 (en) 2021-08-02 2022-02-10 Multi-camera device

Country Status (3)

Country Link
JP (1) JP7717524B2 (en)
DE (1) DE112022001445T5 (en)
WO (1) WO2023013108A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002344800A (en) * 2001-05-18 2002-11-29 Minolta Co Ltd Synchronized photographing method and photographing system
JP2005100278A (en) * 2003-09-26 2005-04-14 Chube Univ 3D position measurement system and 3D position measurement method
JP2006246307A (en) * 2005-03-07 2006-09-14 Seiko Epson Corp Image data processing device
JP2007172035A (en) * 2005-12-19 2007-07-05 Fujitsu Ten Ltd Onboard image recognition device, onboard imaging device, onboard imaging controller, warning processor, image recognition method, imaging method and imaging control method
WO2008102764A1 (en) * 2007-02-23 2008-08-28 Toyota Jidosha Kabushiki Kaisha Vehicle environment monitoring device and car environment monitoring method
JP2012138671A (en) * 2010-12-24 2012-07-19 Kyocera Corp Stereo camera device
KR20150090647A (en) * 2014-01-29 2015-08-06 시모스 미디어텍(주) The synchronization optimization method of non-synchronized stereoscophic camera
JP2018191248A (en) * 2017-05-11 2018-11-29 ソニーセミコンダクタソリューションズ株式会社 Imaging device, imaging method, and program
JP2021012155A (en) * 2019-07-09 2021-02-04 株式会社小野測器 State measurement device and state measurement method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011205388A (en) 2010-03-25 2011-10-13 Sony Corp Signal processing apparatus and method, and program

Also Published As

Publication number Publication date
DE112022001445T5 (en) 2024-01-25
JP2023021833A (en) 2023-02-14
JP7717524B2 (en) 2025-08-04

Similar Documents

Publication Publication Date Title
JP5862785B2 (en) Collision determination device and collision determination method
US8175797B2 (en) Vehicle drive assist system
JP3915746B2 (en) Vehicle external recognition device
US9734415B2 (en) Object detection system
JP6787157B2 (en) Vehicle control device
JP7261588B2 (en) Traffic light recognition method and traffic light recognition device
JP2010198552A (en) Driving state monitoring device
JP2009169776A (en) Detector
JP6011625B2 (en) Speed calculation device, speed calculation method, and collision determination device
WO2014132748A1 (en) Imaging device, and vehicle control device
WO2019044625A1 (en) Collision prediction device, collision prediction method, and program
JP6490747B2 (en) Object recognition device, object recognition method, and vehicle control system
JP4850963B1 (en) Vehicle driving support device
JP2013054399A (en) Vehicle periphery monitoring device
CN113875223A (en) External environment recognition device
JP7112255B2 (en) VEHICLE DATA TIME SYNCHRONIZATION DEVICE AND METHOD
JP6531689B2 (en) Moving trajectory detection device, moving object detecting device, moving trajectory detection method
JP7717524B2 (en) Multi-camera equipment
KR20060021922A (en) Obstacle detection technology and device using two cameras
JP2012118682A (en) Driving support controller
JP6253175B2 (en) Vehicle external environment recognition device
JP2018179782A (en) Obstacle detection system
JP2006040029A (en) Obstacle recognition method and obstacle recognition device
JP2019016053A (en) Outside-vehicle environment recognition device and outside-vehicle environment recognition method
JP2000030181A (en) Road surface abnormality detecting device and impact alarming device using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22852537

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112022001445

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22852537

Country of ref document: EP

Kind code of ref document: A1