
WO2023013108A1 - Multi-camera device (Dispositif multicaméra) - Google Patents

Multi-camera device (Dispositif multicaméra)

Info

Publication number
WO2023013108A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
imaging
camera device
cameras
imaging timing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/005257
Other languages
English (en)
Japanese (ja)
Inventor
琢馬 大里
春樹 的野
健 永崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo Ltd filed Critical Hitachi Astemo Ltd
Priority to DE112022001445.8T (DE112022001445T5)
Publication of WO2023013108A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/10 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C3/14 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof

Definitions

  • The present invention relates to a multi-camera device.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2011-205388.
  • In Patent Document 1, the problem is described as "to make it possible to more easily detect the deviation of the photographing timing of the stereo camera." Images obtained by photographing a presentation signal with the two left and right cameras constituting a stereo camera are input. Since the frequency of the presentation signal is greater than the Nyquist frequency of the camera frame rate, a filtering unit 93 extracts left and right aliased (folded-back) signals of a specific frequency from the captured signals by filtering. A matching processing unit 94 detects the phase shift between the left and right aliased signals, and a shift amount calculation unit 95 detects the deviation of the shooting timing of the cameras from the phase shift of the aliased signals.
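  • (Illustrative sketch, not from the patent:) the scheme of Patent Document 1 can be pictured as follows. If both cameras film a light blinking at a frequency above the Nyquist frequency of the frame rate, each camera records an aliased sinusoid, and the phase difference between the two aliased signals encodes the imaging-timing deviation. The function below is our own hypothetical reconstruction of that idea; the names and the demodulation approach are assumptions.

```python
import numpy as np

def timing_deviation_from_alias(sig_left, sig_right, fps, blink_hz):
    """Estimate the imaging-timing deviation [s] between two cameras from
    the phase shift of an aliased blinking-light signal (hypothetical
    sketch of the Patent Document 1 idea, not its actual implementation).

    sig_left, sig_right: per-frame brightness of the blinking light seen by
    each camera; blink_hz must exceed fps / 2 so that the signal aliases.
    """
    # Frequency at which the blink appears after folding (aliasing).
    alias_hz = abs(blink_hz - round(blink_hz / fps) * fps)
    t = np.arange(len(sig_left)) / fps
    ref = np.exp(-2j * np.pi * alias_hz * t)       # complex demodulator
    phase_left = np.angle(np.sum(sig_left * ref))
    phase_right = np.angle(np.sum(sig_right * ref))
    # Wrap the phase difference into (-pi, pi].
    dphi = np.angle(np.exp(1j * (phase_right - phase_left)))
    # A capture-time offset d shifts the underlying blink phase by
    # 2*pi*blink_hz*d (the sign may flip depending on the folding direction).
    return dphi / (2 * np.pi * blink_hz)
```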
  • A stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distance. Distance accuracy and processing cycle are indicators of device performance, but the two are in a trade-off relationship.
  • A synchronous stereo camera, in which multiple cameras are synchronized and capture images at the same time, has excellent distance accuracy.
  • An asynchronous stereo camera, in which multiple cameras capture images at different times, is characterized by an excellent processing cycle. Since the required distance accuracy and processing cycle change depending on the situation, if only one of the two schemes is adopted, there are scenes in which the required performance cannot be satisfied.
  • An object of the present invention is to provide a multi-camera device that can meet, with a single multi-camera device, the required distance accuracy and imaging cycle that change according to the situation.
  • According to the present invention, it is possible to provide a multi-camera device that can satisfy the required performance by appropriately switching the cameras between synchronous and asynchronous operation even when the required distance accuracy and imaging cycle change.
  • FIG. 1 is an explanatory diagram showing the configuration of the multi-camera device according to the first embodiment.
  • FIG. 2 is an explanatory diagram showing the configuration of the in-vehicle multi-camera device according to the first embodiment.
  • FIG. 3 is an explanatory diagram of the image processing unit according to the first embodiment.
  • FIG. 4 is an explanatory diagram of the parallax calculation unit according to the first embodiment.
  • FIG. 5 is an explanatory diagram of the difference in distance measurement processing between a synchronous stereo camera and an asynchronous stereo camera.
  • FIG. 6 is an explanatory diagram of the difference in imaging cycle between a synchronous stereo camera and an asynchronous stereo camera.
  • FIG. 7 is an explanatory diagram of the synchronous/asynchronous switching unit according to the first embodiment.
  • FIG. 8 is an explanatory diagram of the parallax calculation unit according to the second embodiment.
  • FIG. 9 is an explanatory diagram of the required performance calculation unit according to the second embodiment.
  • FIG. 10 is an explanatory diagram of the required performance integrated determination unit according to the second embodiment.
  • FIG. 11 is an explanatory diagram of a case in which imaging timing is set for each camera according to the third embodiment.
  • Embodiment 1 will be described with reference to FIGS. 1 to 7.
  • FIG. 1 shows the configuration of the multi-camera device.
  • The multi-camera device 10 is composed of a multi-camera 12 made up of a plurality of cameras 11, a memory 13, a CPU 14, an image processing unit 15, and an external output unit 16. Images captured by (each camera 11 of) the multi-camera 12 are stored in the memory 13.
  • The image processing unit 15 performs processing such as object detection on the stored images, and the obtained results are output to the outside through the external output unit 16.
  • The multi-camera device 10 is mounted on a vehicle (hereinafter also referred to as the own vehicle) 1 and captures images around the vehicle. Using the parallax obtained from a plurality of images taken by an arbitrary set of two or more of the plurality of cameras 11, it recognizes the surroundings of the vehicle (the external world), for example by detecting target objects, and outputs the obtained information to the control command unit 2 serving as a controller.
  • Based on the input information (such as information on detected target objects), the control command unit 2 determines the operating states of the control means of the own vehicle, such as the brakes (not shown), the engine 3, and the steering 4, and thereby controls the vehicle 1 (acceleration/deceleration control and steering control). Since the vehicle 1 travels in various scenes such as highways and shopping streets, the synchronous/asynchronous switching according to this embodiment is necessary in order to always satisfy the required performance (more specifically, the required distance accuracy and imaging cycle (processing cycle)).
  • The image processing unit 15 of this embodiment includes a parallax calculation unit 31 and a synchronous/asynchronous switching unit 32.
  • The synchronous/asynchronous switching unit 32 determines whether the multi-camera 12 (each camera 11 thereof) performs synchronous imaging or asynchronous imaging according to the required distance accuracy and imaging cycle (specifically, it outputs a synchronous or asynchronous flag, as explained later). This result is transmitted to the multi-camera 12 and used to control the imaging timing of the cameras 11, and also to correct the parallax of the asynchronous stereo camera in the parallax calculation unit 31. That is, the synchronous/asynchronous switching unit 32 functions as an imaging timing switching unit that switches the plurality of cameras 11 between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle.
  • The parallax calculation unit 31 is configured by the blocks shown in FIG. 4.
  • In the parallax calculation processing unit 41, the parallax calculation unit 31 receives at least two images 1 and 2 as inputs from the multi-camera 12 and executes parallax calculation processing.
  • For a specific point in image 1, the position at which the same point appears in image 2 is searched for.
  • The displacement between the coordinate positions of the points obtained as a result of this search is called parallax, and it is converted into distance.
  • A parallax correction amount is calculated by the parallax correction amount calculation processing unit 43 based on the flag (synchronous flag or asynchronous flag) indicating the synchronous/asynchronous state set in the multi-camera 12 by the synchronous/asynchronous switching unit 32, and the parallax correction processing unit 42 applies the correction to obtain the final parallax.
  • This parallax correction processing is illustrated in FIG. 5.
  • In a synchronous stereo camera, the parallax for the target object 51 is calculated from the image obtained from the left camera 52 at a certain time t and the image obtained from the right camera 53 at the same time t.
  • In an asynchronous stereo camera, the parallax is calculated from the image obtained from the left camera 52 at a certain time t and the image obtained from the right camera 54 at a different time t+1. If the sensor or the target object is moving, the distance to the target object 51 differs between the moments these two images are captured, so the parallax correction 55 needs to correct the calculated parallax.
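  • A minimal sketch of the kind of correction the parallax correction processing unit 42 performs (the constant-velocity assumption and all names below are ours, not the patent's): if the right image is captured a known offset after the left image and the target's apparent motion is estimated by tracking, the matched coordinate can be shifted back to the left image's capture time before the parallax is converted to distance.

```python
def corrected_parallax(x_left, x_right, px_velocity_right, timing_offset_s):
    """Correct the raw parallax of an asynchronous stereo pair (sketch).

    x_left:            horizontal coordinate of the point in the left image [px]
    x_right:           matched coordinate in the right image, captured
                       timing_offset_s seconds later [px]
    px_velocity_right: apparent horizontal velocity of the target in the
                       right camera, estimated by tracking [px/s]
    Assumes approximately constant apparent velocity over the offset.
    """
    # Shift the right-image coordinate back to the left image's capture time.
    x_right_at_t = x_right - px_velocity_right * timing_offset_s
    return x_left - x_right_at_t

def parallax_to_distance(parallax_px, focal_px, baseline_m):
    """Standard stereo triangulation: distance = f * B / d."""
    return focal_px * baseline_m / parallax_px
```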
  • FIG. 6 shows the difference in imaging cycle at this time. If the imaging cycle of a single camera is Δt, the imaging cycle of the synchronous stereo camera shown in FIG. 6 is also Δt, whereas the asynchronous stereo camera achieves an imaging cycle of Δt/2 when the two cameras capture images shifted by exactly half the single-camera cycle. Since a new image cannot be obtained until the imaging cycle has elapsed after the previous image, the shorter the imaging cycle, the higher the responsiveness of the system.
  • In addition, the shorter the imaging cycle, the smaller the deformation and the amount of movement between successive images, which brings advantages such as easier tracking.
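  • To make the Δt versus Δt/2 relationship concrete, here is a small helper of our own (not from the patent) that computes the interval at which some new image arrives when two cameras with single-camera cycle Δt are staggered:

```python
def effective_cycle(dt_single, stagger_fraction):
    """Worst-case wait [s] for a new frame from either of two cameras.

    stagger_fraction in [0, 0.5]: 0 means fully synchronous capture
    (effective cycle dt_single), 0.5 means a half-cycle shift
    (effective cycle dt_single / 2), as in the asynchronous case.
    """
    assert 0.0 <= stagger_fraction <= 0.5
    gap_a = stagger_fraction * dt_single          # camera 1 -> camera 2
    gap_b = (1.0 - stagger_fraction) * dt_single  # camera 2 -> camera 1
    return max(gap_a, gap_b)

# Example with 10 Hz cameras (dt = 0.1 s):
#   effective_cycle(0.1, 0.0) -> 0.1  (synchronous)
#   effective_cycle(0.1, 0.5) -> 0.05 (half-cycle shift, i.e. Δt/2)
```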
  • The synchronous/asynchronous switching unit 32 determines whether the multi-camera 12 (each camera 11 thereof) performs synchronous imaging or asynchronous imaging according to the required distance accuracy and imaging cycle: as described above, a synchronous flag is output when distance accuracy is required, and an asynchronous flag is output when a short imaging cycle is required.
  • In S73, a determination is made from the surrounding environment based on the result of object detection: it is determined whether or not there is a blind spot area around the own vehicle (in other words, whether there is a blind spot from which an object may suddenly appear). If there are few blind spots around the vehicle and visibility is good, the possibility of an object suddenly appearing is low; if visibility is poor, an asynchronous stereo camera capable of shortening the imaging cycle is desirable. Therefore, in S73, if the view of the surroundings is good, the process proceeds to S74, and if the view of the surroundings is poor, the process proceeds to S76 to output an asynchronous flag.
  • In S74, a determination is made based on the travel road: it is determined whether the vehicle is traveling straight ahead or turning, based on map information or the like. When traveling straight, distant objects ahead must be ranged accurately, so the device switches to a synchronous stereo camera; when turning, responsiveness matters more, so it switches to an asynchronous stereo camera. That is, if the estimated travel road is a straight road in S74, the process proceeds to S75 to output a synchronous flag, and if the vehicle is turning, the process proceeds to S76 to output an asynchronous flag.
  • Each judgment may be weighted so that important judgments are prioritized, or each judgment may be scored and the total score used as the final decision.
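  • The decision flow of FIG. 7 might be sketched as follows (a hedged illustration: the priority order, the threshold, and the names are placeholders, and as noted above the patent also allows weighting or scoring the judgments instead):

```python
def choose_imaging_mode(ttc_s, blind_spots_nearby, is_turning):
    """Return 'sync' or 'async' in the spirit of the FIG. 7 flow (sketch).

    ttc_s:              time to collision of the nearest target, or None
    blind_spots_nearby: True if visibility around the vehicle is poor
    is_turning:         True if the estimated travel road is a curve
    """
    TTC_THRESHOLD_S = 2.0              # placeholder, not from the patent
    if ttc_s is not None and ttc_s < TTC_THRESHOLD_S:
        return "async"                 # imminent target: short cycle first
    if blind_spots_nearby:
        return "async"                 # S73: poor visibility -> S76
    if is_turning:
        return "async"                 # S74: turning -> S76
    return "sync"                      # S74: straight road -> S75
```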
  • As background: a stereo camera that measures distance by triangulation is known as a multi-camera device that detects objects and measures distance. Distance accuracy and processing cycle are indicators of device performance, but the two are in a trade-off relationship.
  • In a synchronous stereo camera, the processing cycle of the entire device is equal to the imaging cycle of a single camera.
  • A device in which multiple cameras capture images at different times is called an asynchronous stereo camera. In Patent Document 1, the matching processing unit 94 detects the phase shift between the left and right aliased signals, and the shift amount calculation unit 95 detects the shift in the shooting timing of the cameras from that phase shift, then corrects the distance measurement result by estimating the error in the measured distance caused by imaging at different times.
  • Although the distance measurement accuracy of such an asynchronous stereo camera is inferior to that of a synchronous stereo camera, the processing cycle of the entire apparatus is shorter than the imaging cycle of each individual camera because the images are taken at different times by the plurality of cameras.
  • Synchronous stereo cameras thus have excellent distance accuracy, while asynchronous stereo cameras have an excellent processing cycle; there was the problem that, with only one of the two, the required performance could not be satisfied in some scenes.
  • The multi-camera device 10 of the present embodiment described above recognizes the external world using parallax obtained from a plurality of images taken by an arbitrary set of two or more of a plurality of cameras, and includes an imaging timing switching unit (synchronous/asynchronous switching unit 32) that switches the plurality of cameras between synchronous and asynchronous operation according to the desired distance accuracy and imaging cycle, and a parallax calculation unit 31 that calculates the parallax using the plurality of images and the amount of deviation in imaging timing between the cameras.
  • The desired distance accuracy and imaging cycle are determined from at least one of: the type of target object that determines the control means of the own vehicle; the presence or absence of a blind spot area around the own vehicle (whether there is a blind spot from which an object may suddenly appear); and whether the own vehicle is traveling straight or turning, as determined from map information or vehicle information such as the steering angle.
  • By switching the multi-camera device 10 between synchronous and asynchronous operation according to the required distance accuracy and processing cycle, a multi-camera device 10 that satisfies the required performance in various scenes can be realized.
  • Embodiment 2: In the first embodiment, switching the cameras between synchronous and asynchronous operation was described. In the second embodiment, in addition to this switching, an embodiment is described in which an asynchronous stereo camera changes the amount of imaging timing deviation between cameras smoothly.
  • FIG. 8 shows an explanatory diagram of the parallax calculation unit in the second embodiment.
  • The parallax calculation unit 31 of this embodiment includes a required performance calculation unit 81 and an imaging timing deviation amount calculation unit 82 in addition to the configuration of the first embodiment.
  • The required performance calculation unit 81 calculates the required performance that the multi-camera device 10 should satisfy (more specifically, the required distance accuracy and imaging cycle).
  • An explanatory diagram of the required performance calculation unit 81 is shown in FIG. 9. Blocks 91, 92, 93, and 94 in FIG. 9 calculate required performances from the same viewpoints as S71, S72, S73, and S74 in FIG. 7, respectively.
  • In this embodiment, the imaging timing shift amount is not simply switched but changed smoothly. Therefore, each required performance is scored, rather than determined by a simple yes/no judgment, so that it changes smoothly with the scene.
  • The target-type-based required performance calculation unit 91 calculates the required performance according to the type of the target object. For example, for a sensor that can distinguish vehicles from pedestrians, pedestrians are more likely to undergo rapid changes in movement than vehicles, so a shorter imaging cycle is appropriate for them. In addition, since a higher response speed is desirable the shorter the time to collision TTC [s], the target imaging cycle Δt_a [s] is set from a table indexed by the type of target object and by TTC. Vehicles and pedestrians are used here as example types, but further types with different likelihoods of movement changes may be added, such as trucks, which rarely change movement suddenly.
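  • One way to realize such a "type × TTC" table (all numbers below are illustrative placeholders; the patent does not give concrete values):

```python
# Target imaging cycle dt_a [s] indexed by target type and TTC bucket.
DT_A_TABLE = {
    #              TTC < 2 s  2-5 s   >= 5 s
    "pedestrian": (0.025,     0.05,   0.10),  # rapid movement changes
    "vehicle":    (0.05,      0.10,   0.10),
    "truck":      (0.10,      0.10,   0.10),  # few sudden movement changes
}

def target_cycle_by_type(target_type, ttc_s):
    """Look up the target imaging cycle dt_a [s] for a detected object."""
    row = DT_A_TABLE[target_type]
    bucket = 0 if ttc_s < 2.0 else (1 if ttc_s < 5.0 else 2)
    return row[bucket]
```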
  • The blind-spot-based required performance calculation unit 92 calculates the required performance so that a collision can be avoided in time when an object pops out of a blind spot.
  • Let n [frames] be the number of frames required to detect the target object and perform collision determination, z [m] the distance to the blind spot from which the target object may appear, v_car [m/s] the speed of the own vehicle, and z_th [m] the collision-avoidable distance. The required imaging cycle Δt_b should then be determined so as to satisfy the following relationship. At this time, n may be varied: for a distant blind spot, increasing n suppresses erroneous braking, while for a nearby one, where the danger is high, decreasing n widens the allowable range of braking.
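  • The formula itself is rendered as an image in the source; a plausible reading of the surrounding definitions is that the vehicle must still be at least z_th away after the n frames needed for detection and judgment, i.e. z − v_car · n · Δt_b ≥ z_th, which bounds the required imaging cycle as sketched below (our reconstruction, not a quoted formula):

```python
def required_cycle_for_blind_spot(z_m, z_th_m, v_car_mps, n_frames):
    """Upper bound on the imaging cycle dt_b [s] such that an object
    emerging from a blind spot z_m ahead can still be handled while the
    vehicle is at least z_th_m away (reconstructed reading of the text:
    z - v_car * n * dt_b >= z_th).
    """
    if v_car_mps <= 0 or n_frames <= 0:
        return float("inf")   # stationary vehicle: any cycle suffices
    # A non-positive result means the blind spot is already inside z_th.
    return (z_m - z_th_m) / (n_frames * v_car_mps)
```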
  • The target imaging cycle Δt_c [s] for the turning judgment can be expressed, for example, by the following kind of formula. As long as the relationship is such that Δt_c becomes smaller when the turning rate ω is large, another formula, or a lookup table in which values corresponding to ω are stored in advance, may be used instead.
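  • Since only the qualitative constraint is given (the formula is an image in the source), one placeholder form of our own would be:

```python
def target_cycle_while_turning(omega_rad_s, k=0.05, dt_max=0.1):
    """Target imaging cycle dt_c [s] that decreases as the turning rate
    omega grows (placeholder form k / omega capped at dt_max; the text
    only requires that dt_c shrink for large omega, and also allows a
    pre-computed lookup table instead).
    """
    if omega_rad_s <= 0.0:
        return dt_max
    return min(dt_max, k / omega_rad_s)
```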
  • The surrounding-environment-based required performance calculation unit 94 changes the required imaging cycle Δt_d [s] for the entire scene. For example, it receives map information of the surrounding area based on GPS information; on highways, where visibility of the surroundings is good, it increases Δt_d to trade responsiveness for distance accuracy, and in environments with poor visibility where diverse objects may pop out, such as residential areas, it decreases Δt_d to increase responsiveness. As with the target-type-based required performance calculation unit 91, three or more kinds of switching may be performed for environments requiring different imaging cycles, beyond highways and residential areas. The environment may also be determined by scene understanding through image recognition instead of GPS.
  • FIG. 10 shows a flowchart of the integration processing of the required performance integration determination unit 95, that is, the processing for calculating the final target imaging cycle.
  • Since Δt_a and Δt_b are set with reference to target objects that could collide, the corresponding times to collision TTC_a and TTC_b can also be calculated.
  • TTC_a and TTC_b are each compared with a threshold value (S101, S102) and then compared with each other (S103), and the corresponding Δt_a [s] or Δt_b [s] is adopted (S104, S105). On the other hand, if these values are equal to or greater than the thresholds, there is time to spare before collision avoidance control is performed even if an object pops out.
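  • The integration could be sketched as below (hedged: the exact branch structure of S101 to S105 is only partially legible in the translation, so this follows its apparent intent of adopting the cycle tied to the more urgent collision and otherwise falling back to the scene-level candidates Δt_c and Δt_d):

```python
def integrate_required_cycle(dt_a, ttc_a, dt_b, ttc_b, dt_c, dt_d,
                             ttc_threshold_s=3.0):
    """Final target imaging cycle [s] (sketch of the FIG. 10 integration).

    dt_a/ttc_a: cycle and time to collision from the target-type judgment
    dt_b/ttc_b: cycle and time to collision from the blind-spot judgment
    dt_c, dt_d: cycles from the turning and surrounding-environment judgments
    ttc_threshold_s is a placeholder, not a value from the patent.
    """
    urgent_a = ttc_a is not None and ttc_a < ttc_threshold_s   # S101
    urgent_b = ttc_b is not None and ttc_b < ttc_threshold_s   # S102
    if urgent_a or urgent_b:
        # S103-S105: adopt the cycle belonging to the more urgent collision.
        if urgent_a and (not urgent_b or ttc_a <= ttc_b):
            return dt_a
        return dt_b
    # No imminent collision: time to spare, so the scene-level candidates rule.
    return min(dt_c, dt_d)
```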
  • The imaging timing deviation amount calculation unit 82 calculates an imaging timing deviation amount that satisfies the received required performance.
  • The imaging timing deviation amount calculation unit 82 thus functions as an imaging timing determination unit that determines the imaging timing deviation amount between the plurality of cameras 11 according to the desired distance accuracy and imaging cycle.
  • The calculated (determined) imaging timing deviation amount is transmitted to the multi-camera 12 to control the imaging timing of the cameras 11, and is also sent to the parallax correction amount calculation processing unit 43 for use in the parallax correction processing of the asynchronous stereo camera.
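  • Converting the final required cycle into a concrete timing offset between two cameras could look like this (our sketch; the patent gives no formula for this step):

```python
def timing_offset_for_cycle(dt_single, dt_required):
    """Stagger [s] between two cameras so that the worst-case interval
    between new frames is at most dt_required (sketch). An offset of 0 is
    fully synchronous (best distance accuracy); dt_single / 2 is fully
    asynchronous (best responsiveness, effective cycle dt_single / 2).
    """
    if dt_required >= dt_single:
        return 0.0                        # synchronous imaging suffices
    # With offset d, frames arrive with alternating gaps d and dt - d;
    # the worst-case gap dt - d must not exceed dt_required.
    offset = dt_single - dt_required
    return min(offset, dt_single / 2)     # dt/2 is the best achievable cycle
```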
  • In addition to the configuration of the first embodiment, the multi-camera device 10 of the present embodiment described above includes an imaging timing determination unit (imaging timing deviation amount calculation unit 82) that determines the imaging timing shift amount between the plurality of cameras according to the desired distance accuracy and imaging cycle.
  • The apparatus further includes a required performance calculation unit 81 that calculates the (final) required imaging cycle used to determine the amount of deviation in imaging timing between the plurality of cameras.
  • According to the present embodiment, by smoothly changing the imaging timing deviation amount of the multi-camera device 10 in accordance with the required distance accuracy and processing cycle, a multi-camera device 10 with characteristics even better suited to various scenes than in the first embodiment can be realized.
  • Embodiment 3 describes an embodiment in which the cameras of the multi-camera device are used for different purposes.
  • In this embodiment, the multi-camera device 10 is composed of a total of three cameras: one monitoring camera installed in the environment and one camera mounted on each of two vehicles, and the two vehicles are controlled respectively.
  • FIG. 11 shows an overview of the system.
  • A base camera 111, which constitutes the monitoring camera installed in the environment, takes images at a fixed timing and transmits the image and its timing to the (in-vehicle) cameras 112 and 113 mounted on the vehicles 7 and 8.
  • The camera 112 mounted on the vehicle 7 traveling straight ahead needs to detect and measure the distance to the preceding vehicle 9 far ahead of the own vehicle, so the required distance measurement accuracy is high. Therefore, the camera 112 takes images at the same timing as the base camera 111 and the pair is processed as a synchronous stereo camera to satisfy the required performance.
  • The camera 113 mounted on the turning vehicle 8 is required to have high responsiveness because the visible range changes greatly from moment to moment as the direction of the own vehicle changes; it therefore takes images at a timing shifted from that of the base camera 111 and the pair is processed as an asynchronous stereo camera.
  • In this way, the multi-camera device 10 of the present embodiment uses the imaging timing of at least one base camera 111 among the plurality of cameras as a reference, and is provided with imaging timing determination units that determine, for each of the cameras 112 and 113 other than the base camera 111, the amount of imaging timing deviation according to the desired distance accuracy and imaging cycle of that camera.
  • As methods of setting the timing, it is conceivable to communicate and synchronize the imaging times between the cameras, or to have the base camera 111 fire (project) an infrared flash at its imaging timing so that the other cameras 112 and 113 can adjust their imaging timing using, as a reference, the infrared flash reflected in their captured images.
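  • A sketch of the flash-based alignment (hedged: the detection of the flash in the image and the proportional control loop below are our assumptions):

```python
def adjust_local_timing(flash_seen_at_s, local_frame_at_s, desired_offset_s,
                        gain=0.5):
    """Correction [s] to apply to the local camera's next frame trigger so
    that it fires desired_offset_s after the base camera's infrared flash
    (simple proportional nudge; repeat each time a flash is observed).

    flash_seen_at_s:  timestamp of the frame in which the flash appeared,
                      approximating the base camera's imaging time
    local_frame_at_s: timestamp of the local camera's latest frame
    """
    current_offset = local_frame_at_s - flash_seen_at_s
    error = desired_offset_s - current_offset
    return gain * error
```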
  • As described above, the multi-camera device 10 of the present embodiment uses the imaging timing of at least one base camera 111 among the plurality of cameras as a reference, and the cameras 112 and 113 other than the base camera 111 are each provided with an imaging timing determination unit that determines the amount of imaging timing deviation according to the desired distance accuracy and imaging cycle for that camera.
  • The base camera 111 transmits its own imaging timing and image to the cameras 112 and 113 other than the base camera 111.
  • Alternatively, the base camera 111 emits an infrared flash at its own imaging timing, and the cameras 112 and 113 other than the base camera 111 each determine their amount of imaging timing deviation, according to the desired distance accuracy and imaging cycle, based on the timing of the captured infrared flash.
  • In this way, a plurality of cameras installed for different purposes, such as a surveillance camera and in-vehicle cameras, adjust their imaging timings according to their respective purposes, so that a multi-camera device 10 in which each camera satisfies different performance requirements can be realized.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to configurations having all of the described elements.
  • It is possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • Each of the above configurations may be partially or wholly implemented in hardware, or may be realized by executing a program on a processor.
  • Control lines and information lines are shown where considered necessary for explanation; not all control lines and information lines in a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.


Abstract

Provided is a multi-camera device that can satisfy, with a single multi-camera device, a required distance accuracy and imaging cycle that vary with circumstances. The present invention recognizes the external world using parallax determined from a plurality of images obtained from an arbitrary set of at least two cameras among a plurality of cameras, and comprises an imaging timing switching unit (synchronous/asynchronous switching unit 32) that switches the plurality of cameras between synchronous and asynchronous operation according to a desired distance accuracy and imaging cycle, and a parallax calculation unit 31 that calculates the parallax using the plurality of images and the deviation in imaging timing between the plurality of cameras.
PCT/JP2022/005257 2021-08-02 2022-02-10 Dispositif multicaméra Ceased WO2023013108A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112022001445.8T DE112022001445T5 (de) 2021-08-02 2022-02-10 Multikameravorrichtung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-126945 2021-08-02
JP2021126945A JP7717524B2 (ja) 2021-08-02 2021-08-02 マルチカメラ装置

Publications (1)

Publication Number Publication Date
WO2023013108A1 true WO2023013108A1 (fr) 2023-02-09

Family

ID=85155463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/005257 Ceased WO2023013108A1 (fr) 2021-08-02 2022-02-10 Dispositif multicaméra

Country Status (3)

Country Link
JP (1) JP7717524B2 (fr)
DE (1) DE112022001445T5 (fr)
WO (1) WO2023013108A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002344800A (ja) * 2001-05-18 2002-11-29 Minolta Co Ltd 同期撮影方法および撮影システム
JP2005100278A (ja) * 2003-09-26 2005-04-14 Chube Univ 3次元位置計測システムおよび3次元位置計測方法
JP2006246307A (ja) * 2005-03-07 2006-09-14 Seiko Epson Corp 画像データ処理装置
JP2007172035A (ja) * 2005-12-19 2007-07-05 Fujitsu Ten Ltd 車載画像認識装置、車載撮像装置、車載撮像制御装置、警告処理装置、画像認識方法、撮像方法および撮像制御方法
WO2008102764A1 (fr) * 2007-02-23 2008-08-28 Toyota Jidosha Kabushiki Kaisha Dispositif de surveillance d'environnement de véhicule et procédé de surveillance d'environnement de véhicule
JP2012138671A (ja) * 2010-12-24 2012-07-19 Kyocera Corp ステレオカメラ装置
KR20150090647A (ko) * 2014-01-29 2015-08-06 시모스 미디어텍(주) 비동기식 입체카메라의 동기최적화 방법
JP2018191248A (ja) * 2017-05-11 2018-11-29 ソニーセミコンダクタソリューションズ株式会社 撮像装置、撮像方法、並びにプログラム
JP2021012155A (ja) * 2019-07-09 2021-02-04 株式会社小野測器 状態計測装置及び状態計測方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011205388A (ja) 2010-03-25 2011-10-13 Sony Corp 信号処理装置および方法、並びにプログラム

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002344800A (ja) * 2001-05-18 2002-11-29 Minolta Co Ltd 同期撮影方法および撮影システム
JP2005100278A (ja) * 2003-09-26 2005-04-14 Chube Univ 3次元位置計測システムおよび3次元位置計測方法
JP2006246307A (ja) * 2005-03-07 2006-09-14 Seiko Epson Corp 画像データ処理装置
JP2007172035A (ja) * 2005-12-19 2007-07-05 Fujitsu Ten Ltd 車載画像認識装置、車載撮像装置、車載撮像制御装置、警告処理装置、画像認識方法、撮像方法および撮像制御方法
WO2008102764A1 (fr) * 2007-02-23 2008-08-28 Toyota Jidosha Kabushiki Kaisha Dispositif de surveillance d'environnement de véhicule et procédé de surveillance d'environnement de véhicule
JP2012138671A (ja) * 2010-12-24 2012-07-19 Kyocera Corp ステレオカメラ装置
KR20150090647A (ko) * 2014-01-29 2015-08-06 시모스 미디어텍(주) 비동기식 입체카메라의 동기최적화 방법
JP2018191248A (ja) * 2017-05-11 2018-11-29 ソニーセミコンダクタソリューションズ株式会社 撮像装置、撮像方法、並びにプログラム
JP2021012155A (ja) * 2019-07-09 2021-02-04 株式会社小野測器 状態計測装置及び状態計測方法

Also Published As

Publication number Publication date
DE112022001445T5 (de) 2024-01-25
JP2023021833A (ja) 2023-02-14
JP7717524B2 (ja) 2025-08-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22852537

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 112022001445

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22852537

Country of ref document: EP

Kind code of ref document: A1