WO2020255746A1 - Weather identification device and weather identification method - Google Patents
Weather identification device and weather identification method
- Publication number
- WO2020255746A1 (PCT/JP2020/022261)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional object
- edge strength
- type
- weather
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS > G01—MEASURING; TESTING > G01W—METEOROLOGY > G01W1/00—Meteorology
- G—PHYSICS > G06—COMPUTING OR CALCULATING; COUNTING > G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T7/00—Image analysis
- G—PHYSICS > G08—SIGNALLING > G08G—TRAFFIC CONTROL SYSTEMS > G08G1/00—Traffic control systems for road vehicles > G08G1/16—Anti-collision systems
Landscapes
- Engineering & Computer Science (AREA)
- Environmental & Geological Engineering (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Atmospheric Sciences (AREA)
- Biodiversity & Conservation Biology (AREA)
- Ecology (AREA)
- Environmental Sciences (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Description
The present invention relates to a weather determination device and a weather determination method.
In recent years, attention has been focused on the automation of vehicle control; for example, the automation of light distribution control is also being studied. To perform such light distribution control properly, it is necessary to accurately detect the weather around the vehicle, such as fog, snow, or heavy rain, and to carry out control according to the detected weather.
As a technique related to such weather detection, Patent Document 1, for example, discloses an in-vehicle fog determination device provided with a fog determination means that performs fog determination based on an image captured by an in-vehicle camera mounted on a vehicle. The device includes a road area determination means that determines, in the image captured by the in-vehicle camera, the road area on which the vehicle is traveling, and a distant road surface area determination means that determines, based on the road area determined by the road area determination means, a distant road surface area in the image, which is a road surface area a predetermined distance away from the vehicle; the fog determination means performs the fog determination based on the brightness of the distant road surface area determined by the distant road surface area determination means.
In this prior art, the image captured by the in-vehicle camera is divided into a near road surface region, a distant road surface region, and a sky region, and fog is detected by comparing the near road surface region with the distant road surface region and the distant road surface region with the sky region. However, when an image captured by an in-vehicle camera is used, weather such as fog and cloudiness of the windshield in front of the camera are detected in the same way, so the weather may not be detected properly.
In addition, if a three-dimensional object (a vehicle, a roadside tree, a road sign, etc.) appears in any of the divided regions, noise enters the comparison between the near road surface region and the distant road surface region and between the distant road surface region and the sky region, so the weather cannot be detected accurately.
The present invention has been made in view of the above, and its object is to provide a weather determination device and a weather determination method capable of accurately determining the weather even when a three-dimensional object is captured in the image used for determining the weather.
The present application includes a plurality of means for solving the above problems. To give one example, it includes: a three-dimensional object detection unit that detects a three-dimensional object and a distance to the three-dimensional object from an image captured by an imaging unit of a vehicle; a three-dimensional object type determination unit that determines the type of the three-dimensional object from a feature amount of the three-dimensional object; an actual edge strength calculation unit that calculates an actual edge strength, which is the edge strength in the region of the image where the three-dimensional object is detected; a type edge strength acquisition unit that acquires a type edge strength based on a type edge strength table in which the edge strength in good visibility weather is determined in advance for the type and distance of the three-dimensional object determined by the three-dimensional object type determination unit; and a weather determination unit that determines that the weather in the range of the image is poor visibility weather when the difference between the actual edge strength calculated by the actual edge strength calculation unit and the type edge strength acquired by the type edge strength acquisition unit is larger than a predetermined threshold value, and outputs the determination result.
According to the present invention, the weather can be accurately determined even when a three-dimensional object is captured in the image used for determining the weather.
The present embodiment focuses on the fact that the feature amount (for example, the edge strength) of a three-dimensional object in an image differs between weather with good visibility (hereinafter referred to as good visibility weather) and weather with poor visibility (hereinafter referred to as poor visibility weather). By determining the weather based on the feature amount of the three-dimensional object in the image, the weather is determined accurately even when a three-dimensional object is captured in the image used for the determination.
Hereinafter, an embodiment of the present invention will be described with reference to FIGS. 1 to 6.
FIG. 1 is a functional block diagram showing the processing contents of the weather determination device according to the present embodiment.
In FIG. 1, the weather determination device 100 includes: a three-dimensional object detection unit 110 that detects a three-dimensional object and a distance to the three-dimensional object from an image captured by the imaging unit of the vehicle; an edge strength calculation unit 120 that calculates the actual edge strength and the type edge strength of the three-dimensional object in the image; and a poor visibility weather determination unit 130 that, when the difference between the actual edge strength and the type edge strength is larger than a predetermined threshold value, determines that the weather in the range of the image is poor visibility weather and outputs the determination result.
The edge strength calculation unit 120 includes: an actual edge strength calculation unit 121 that calculates the actual edge strength, which is the edge strength in the region of the image where the three-dimensional object is detected; a three-dimensional object type determination unit 122 that determines the type of the three-dimensional object from its feature amount; a type edge strength acquisition unit 123 that acquires the type edge strength based on a type edge strength table in which the edge strength in good visibility weather is determined in advance for the type and distance of the three-dimensional object determined by the three-dimensional object type determination unit 122; and a type edge strength table storage unit 124 that stores the type edge strength table.
The three-dimensional object detection unit 110 detects a three-dimensional object in an image captured by an imaging device, such as a stereo camera installed so as to image the area in front of the vehicle, and calculates the distance of the detected three-dimensional object from the imaging device (since the relative position of the imaging device on the vehicle can be obtained from design information or the like, this can also be regarded as the distance of the three-dimensional object from the vehicle).
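The source does not detail how the stereo camera turns image data into a distance; for orientation only, the standard stereo triangulation relation (an assumption about the setup, not a statement of the patent's method) links the distance Z to the focal length f in pixels, the camera baseline B, and the measured disparity d. The numbers below are invented purely for illustration:

```latex
Z = \frac{f\,B}{d},
\qquad \text{e.g. } f = 1200\ \text{px},\; B = 0.3\ \text{m},\; d = 18\ \text{px}
\;\Rightarrow\; Z = \frac{1200 \times 0.3}{18} = 20\ \text{m}.
```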
The actual edge strength calculation unit 121 calculates the edge strength in the region of the image where the detected three-dimensional object exists. The edge strength is one of the feature amounts of a three-dimensional object in an image.
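The text does not fix a particular edge operator. As a minimal sketch of one plausible choice (mean gradient magnitude over the detection rectangle, which is an assumption rather than the patent's definition), the actual edge strength of a region could be computed as:

```python
import numpy as np

def mean_edge_strength(gray_image: np.ndarray, box: tuple[int, int, int, int]) -> float:
    """Mean gradient magnitude inside a detection box (x, y, width, height).

    `gray_image` is a 2-D grayscale array; the mean gradient magnitude is one
    possible stand-in for the 'edge strength' feature described in the text.
    """
    x, y, w, h = box
    roi = gray_image[y:y + h, x:x + w].astype(np.float64)
    gy, gx = np.gradient(roi)       # finite-difference gradients along rows/columns
    magnitude = np.hypot(gx, gy)    # per-pixel edge strength
    return float(magnitude.mean())  # aggregate over the detected region
```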
The three-dimensional object type determination unit 122 determines the type of the three-dimensional object detected in the image and outputs the determination result together with the distance information to the three-dimensional object obtained by the three-dimensional object detection unit 110. The types of three-dimensional object to be determined are, for example, vehicles, roadside trees, and road signs. The method for determining the type of the three-dimensional object is not particularly limited; for example, a method such as pattern matching, or a method of determination based on various feature amounts (including, for example, the edge strength calculated by the actual edge strength calculation unit 121), may be used.
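Since the determination method is explicitly left open, the following is only a placeholder heuristic based on the size and aspect ratio of the detection box; the thresholds, labels, and the very choice of heuristic are assumptions, not taken from the patent. In practice a pattern-matching or feature-based classifier, as the text suggests, would take this role.

```python
def classify_object(box_width_m: float, box_height_m: float) -> str:
    """Very rough type guess from real-world box dimensions (illustrative only)."""
    aspect = box_height_m / max(box_width_m, 1e-6)
    if box_width_m > 1.4 and aspect < 1.2:
        return "vehicle"        # wide, squat boxes
    if aspect > 2.0 and box_height_m > 3.0:
        return "roadside_tree"  # tall, narrow boxes
    return "road_sign"          # fallback for small objects
```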
The type edge strength acquisition unit 123 reads the corresponding type edge strength table from the type edge strength table storage unit 124 based on the type of the three-dimensional object determined by the three-dimensional object type determination unit 122, and, based on the distance information to the three-dimensional object, acquires and outputs the edge strength of the three-dimensional object estimated for the weather specified by the type edge strength table (here, good visibility weather), that is, the type edge strength.
The type edge strength table stored in the type edge strength table storage unit 124 is obtained experimentally in advance, for each type of three-dimensional object, as the relationship between the distance to the three-dimensional object and the edge strength in good visibility weather (for example, in fine weather). The type edge strength table storage unit 124 stores a type edge strength table for each of at least some of the types of three-dimensional object that are subject to type determination by the three-dimensional object type determination unit 122.
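A minimal sketch of how such a table and its lookup might be represented, assuming the table stores sampled distance/edge-strength pairs measured in good visibility and that intermediate distances are linearly interpolated; all numeric values below are invented for illustration:

```python
import numpy as np

# Hypothetical good-visibility reference curves: sampled distances [m] -> edge strength.
TYPE_EDGE_STRENGTH_TABLE = {
    "vehicle":       (np.array([10.0, 30.0, 60.0, 100.0]), np.array([48.0, 40.0, 31.0, 22.0])),
    "roadside_tree": (np.array([10.0, 30.0, 60.0, 100.0]), np.array([42.0, 33.0, 24.0, 16.0])),
    "road_sign":     (np.array([10.0, 30.0, 60.0, 100.0]), np.array([55.0, 47.0, 38.0, 28.0])),
}

def type_edge_strength(object_type: str, distance_m: float) -> float:
    """Expected edge strength of `object_type` at `distance_m` in good visibility."""
    distances, strengths = TYPE_EDGE_STRENGTH_TABLE[object_type]
    return float(np.interp(distance_m, distances, strengths))  # linear interpolation
```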
In the present embodiment, the case of using a type edge strength table for good visibility weather has been described as an example, but the invention is not limited to this; for example, type edge strength tables may be stored for each type of weather and each type of travel path, and read out according to the type of the three-dimensional object.
The poor visibility weather determination unit 130 determines that the weather in the range of the image is poor visibility weather when the difference between the actual edge strength calculated by the actual edge strength calculation unit 121 and the type edge strength acquired by the type edge strength acquisition unit 123 is larger than a predetermined threshold value, and outputs the determination result.
Here, the basic principle of weather determination in the present embodiment will be described.
FIG. 2 is a diagram showing the relationship between the distance from the own vehicle to a three-dimensional object and the edge strength. FIG. 3 is a diagram showing that the relationship between the distance from the own vehicle to a three-dimensional object and the edge strength differs depending on the weather.
As shown in FIG. 2, the edge strength changes according to the distance to the three-dimensional object, and the longer the distance, the lower the edge strength. Further, as shown in FIG. 3, the way the edge strength changes with the distance to the three-dimensional object also differs depending on the road surface condition and the weather. FIGS. 2 and 3 illustrate, as an example, a case where a roadside tree, a preceding vehicle, and the like are within the imaging range of the camera, but in practice it is sufficient that one or more features such as road signs are present. As shown in FIG. 3, in poor visibility weather the edge strength is attenuated more strongly at a distance than in good visibility weather. Therefore, by recording the edge strength for each type of three-dimensional object and each distance in good visibility weather and comparing it with the edge strength of the detected three-dimensional object, it is possible to determine whether the weather is poor visibility weather.
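The patent states this relationship only qualitatively. As intuition, a Koschmieder-style exponential attenuation model (an assumption introduced here, not a formula from the source) captures why the gap between the clear-weather reference and the measured edge strength grows with distance in fog:

```latex
E(d) \approx E_0 \, e^{-\beta d},
\qquad \beta_{\text{fog}} \gg \beta_{\text{clear}},
```

so that at the same distance d the measured edge strength in fog falls well below the stored good-visibility curve, which is the gap the threshold comparison described below exploits.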
FIG. 4 is a flowchart showing the processing contents of the actual edge strength calculation unit, FIG. 5 is a flowchart showing the processing contents of the type edge strength acquisition unit, and FIG. 6 is a flowchart showing the processing contents of the poor visibility weather determination unit.
In FIG. 4, the actual edge strength calculation unit 121 acquires information on the position at which the three-dimensional object was detected in the image by the three-dimensional object detection unit 110, extracts the rectangular region in which the three-dimensional object is imaged based on that information (step S100), and calculates and outputs the edge strength for the obtained rectangular region (step S110).
In FIG. 5, the type edge strength acquisition unit 123 acquires the determination result of the type of the three-dimensional object determined by the three-dimensional object type determination unit 122 (step S200), acquires the distance to the three-dimensional object calculated by the three-dimensional object detection unit 110 (step S210), and, using the type edge strength table read from the type edge strength table storage unit 124 according to the acquired type determination result, acquires the type edge strength based on the distance to the three-dimensional object (step S220).
In FIG. 6, the poor visibility weather determination unit 130 acquires the actual edge strength calculated by the actual edge strength calculation unit 121 (step S300), acquires the type edge strength acquired by the type edge strength acquisition unit 123 (step S310), and determines whether the difference between the actual edge strength and the type edge strength is equal to or greater than a predetermined threshold value (step S320). If the determination result in step S320 is YES, it determines that the weather is poor visibility weather (step S321) and ends the processing. If the determination result in step S320 is NO, it determines that the weather is not poor visibility weather (here, that it is good visibility weather) (step S322) and ends the processing.
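Putting steps S300 to S322 together, a minimal sketch of the decision could look like the following; the function name, the sign convention for the difference, and the default threshold are assumptions, since the source only specifies a difference that is equal to or greater than a predetermined threshold:

```python
def is_poor_visibility(actual_edge_strength: float,
                       type_edge_strength: float,
                       threshold: float = 10.0) -> bool:
    """Steps S300-S322: poor visibility if the measured edge strength falls far
    below the good-visibility reference for the detected type and distance.

    `type_edge_strength` is the reference value from the table; `actual_edge_strength`
    is measured in the image. The default threshold is a placeholder, not a value
    given in the patent.
    """
    difference = type_edge_strength - actual_edge_strength  # attenuation vs. reference
    return difference >= threshold                          # S320 -> S321 (True) / S322 (False)
```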
In the present embodiment configured as described above, the weather is determined based on the difference in the edge strength of a three-dimensional object in the image between good visibility weather and poor visibility weather, so the weather can be determined accurately even when a three-dimensional object is captured in the image used for the determination.
<Additional notes>
The present invention is not limited to the embodiment described above and includes various modifications and combinations within a range that does not depart from its gist. The present invention is not limited to a configuration including all of the components described in the above embodiment, and also includes configurations from which some components have been deleted. Each of the above configurations, functions, and the like may be realized in part or in whole by, for example, designing them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized by software, by a processor interpreting and executing a program that realizes each function.
100 … Weather determination device, 110 … Three-dimensional object detection unit, 120 … Edge strength calculation unit, 121 … Actual edge strength calculation unit, 122 … Three-dimensional object type determination unit, 123 … Type edge strength acquisition unit, 124 … Type edge strength table storage unit, 130 … Poor visibility weather determination unit
Claims (2)

1. A weather determination device comprising:
a three-dimensional object detection unit that detects a three-dimensional object from an image captured by an imaging unit of a vehicle and calculates a distance to the three-dimensional object;
an actual edge strength calculation unit that calculates an actual edge strength, which is an edge strength in a region of the image in which the three-dimensional object is detected;
a three-dimensional object type determination unit that determines a type of the three-dimensional object from a feature amount of the three-dimensional object;
a type edge strength acquisition unit that acquires a type edge strength based on a type edge strength table in which an edge strength in good visibility weather is determined in advance and which corresponds to the type and the distance of the three-dimensional object determined by the three-dimensional object type determination unit; and
a weather determination unit that, when a difference between the actual edge strength calculated by the actual edge strength calculation unit and the type edge strength acquired by the type edge strength acquisition unit is larger than a predetermined threshold value, determines that the weather in the range of the image is poor visibility weather and outputs a determination result.

2. A weather determination device comprising:
a procedure for detecting a three-dimensional object and a distance to the three-dimensional object from an image captured by an imaging unit of a vehicle;
a procedure for calculating an actual edge strength, which is an edge strength in a region of the image in which the three-dimensional object is detected;
a procedure for determining a type of the three-dimensional object from a feature amount of the three-dimensional object;
a procedure for acquiring a type edge strength based on a type edge strength table in which an edge strength in good visibility weather is determined in advance and which corresponds to the type and the distance of the three-dimensional object; and
a procedure for determining, when a difference between the actual edge strength and the type edge strength is larger than a predetermined threshold value, that the weather in the range of the image is poor visibility weather, and outputting a determination result.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019113264A JP2022116377A (en) | 2019-06-19 | 2019-06-19 | WEATHER DETERMINATION DEVICE AND WEATHER DETERMINATION METHOD |
| JP2019-113264 | 2019-06-19 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020255746A1 true WO2020255746A1 (en) | 2020-12-24 |
Family
ID=74040035
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/022261 (WO2020255746A1, Ceased) | Weather identification device and weather identification method | 2019-06-19 | 2020-06-05 |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2022116377A (en) |
| WO (1) | WO2020255746A1 (en) |
- 2019-06-19: JP JP2019113264A, published as JP2022116377A (active, Pending)
- 2020-06-05: WO PCT/JP2020/022261, published as WO2020255746A1 (not active, Ceased)
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3444192B2 (en) * | 1998-05-21 | 2003-09-08 | 日産自動車株式会社 | Imaging environment estimation device |
| US20040046866A1 (en) * | 2000-07-15 | 2004-03-11 | Poechmueller Werner | Method for determining visibility |
| JP2006349637A (en) * | 2005-06-20 | 2006-12-28 | Denso Corp | On-vehicle foggy state determination device, and automatic fog lamp system |
| JP2009025050A (en) * | 2007-07-17 | 2009-02-05 | Sumitomo Electric Ind Ltd | Visibility determination device, visibility determination method, computer program |
| US20110013839A1 (en) * | 2009-07-08 | 2011-01-20 | Valeo Vision | Method for determining a region of interest in an image |
| JP2011108175A (en) * | 2009-11-20 | 2011-06-02 | Alpine Electronics Inc | Driving support system, driving support method and driving support program |
| JP2012243051A (en) * | 2011-05-19 | 2012-12-10 | Fuji Heavy Ind Ltd | Environment recognition device and environment recognition method |
| JP2017117105A (en) * | 2015-12-22 | 2017-06-29 | トヨタ自動車株式会社 | Line-of-sight judgment device |
| JP2017142760A (en) * | 2016-02-12 | 2017-08-17 | 日立オートモティブシステムズ株式会社 | Ambient environment recognition device for moving objects |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2022116377A (en) | 2022-08-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111712731B (en) | Target detection method, target detection system and movable platform | |
| JP6795027B2 (en) | Information processing equipment, object recognition equipment, device control systems, moving objects, image processing methods and programs | |
| KR101411668B1 (en) | A calibration apparatus, a distance measurement system, a calibration method, and a computer readable medium recording a calibration program | |
| JP3861781B2 (en) | Forward vehicle tracking system and forward vehicle tracking method | |
| US9589080B2 (en) | Method and control unit for validating an illumination-range test value of a light cone of a vehicle headlight | |
| US8643721B2 (en) | Method and device for traffic sign recognition | |
| JP6667065B2 (en) | Position estimation device and position estimation method | |
| US20070225933A1 (en) | Object detection apparatus and method | |
| US20100259609A1 (en) | Pavement marker recognition device, pavement marker recognition method and pavement marker recognition program | |
| US10672141B2 (en) | Device, method, system and computer-readable medium for determining collision target object rejection | |
| CN104732233A (en) | Method and apparatus for recognizing object reflections | |
| JP2008039763A (en) | Method for determining distance of visibility for driver of vehicle | |
| US11783597B2 (en) | Image semantic segmentation for parking space detection | |
| US20160207473A1 (en) | Method of calibrating an image detecting device for an automated vehicle | |
| US8442273B2 (en) | Method and device for detecting the course of a traffic lane | |
| CN112485807B (en) | object recognition device | |
| JP6756507B2 (en) | Environmental recognition device | |
| CN112639877A (en) | Image recognition device | |
| CN101978392B (en) | Image processing device for vehicle | |
| WO2021054339A1 (en) | Object detecting device, object detecting system, moving body, and method for detecting object | |
| JP2007057331A (en) | In-vehicle system for determining fog | |
| WO2020255746A1 (en) | Weather identification device and weather identification method | |
| JP3605955B2 (en) | Vehicle identification device | |
| JP2004062519A (en) | Lane mark detection device | |
| JP7210208B2 (en) | Providing device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20827755; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20827755; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |