EP1846794A1 - Procede et dispositif de visualisation de l'environnement d'un vehicule par fusion d'une image infrarouge et d'une representation visuelle - Google Patents
Procede et dispositif de visualisation de l'environnement d'un vehicule par fusion d'une image infrarouge et d'une representation visuelle
- Publication number
- EP1846794A1 (application EP06700891A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- infrared
- visual
- vehicle
- processing unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 230000000007 visual effect Effects 0.000 title claims abstract description 58
- 238000000034 method Methods 0.000 title claims abstract description 27
- 230000005855 radiation Effects 0.000 claims abstract description 15
- 230000004927 fusion Effects 0.000 claims description 41
- 230000007613 environmental effect Effects 0.000 claims description 13
- 230000001953 sensory effect Effects 0.000 claims description 2
- 230000002123 temporal effect Effects 0.000 claims description 2
- 230000003287 optical effect Effects 0.000 description 8
- 230000004075 alteration Effects 0.000 description 6
- 230000003595 spectral effect Effects 0.000 description 5
- 230000001419 dependent effect Effects 0.000 description 4
- 230000007794 irritation Effects 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 238000012935 Averaging Methods 0.000 description 3
- 230000008901 benefit Effects 0.000 description 2
- 238000004422 calculation algorithm Methods 0.000 description 2
- 230000009849 deactivation Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 239000000725 suspension Substances 0.000 description 2
- 241001465754 Metazoa Species 0.000 description 1
- 230000004913 activation Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000005764 inhibitory process Effects 0.000 description 1
- 238000010606 normalization Methods 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 230000001960 triggered effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/30—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/106—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/302—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8053—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
Definitions
- The invention relates to a method and a device for visualizing the surroundings of a vehicle, in which a visual image containing digital data of the environment, showing the visually visible objects, and an infrared image containing digital data of the environment, showing the infrared radiation emanating from the visually visible and / or further objects, are fused into a target image displayable in an image display unit, in order to facilitate the assignment of objects emitting infrared radiation within the recorded environment.
- Image fusion devices superimpose the images of at least two cameras that record the current environment of a vehicle in different spectral ranges.
- the spectral regions may comprise, for example, visually visible light and infrared radiation.
- A target image determined from an image fusion enables the driver of the vehicle to interpret more easily the information about the surroundings of the vehicle provided in an image display unit.
- A camera device has at least two cameras with largely parallel optical axes that are spatially offset from one another. Because of this staggered mounting, i.e. the offset optical axes, the images supplied by the cameras cannot be aligned in a completely object-faithful way over a wide range of distances.
- Object fidelity means that the radiation reflected or emitted by one and the same object in the driving environment appears in the target image on precisely this object, so that the driver can assign it unambiguously.
- Alignment errors, and hence the quality of the object fidelity, depend on the distance between the cameras as well as on the distance between the cameras and the recorded object.
- By calibrating the camera device it is possible to image objects very well either at close range (corresponding to a typical city driving situation, with a distance range of about 15 to 75 m), which results in poor object fidelity in the far range,
- or the camera device is optimized for objects in the far range (corresponding, for example, to country-road or highway driving, with distance ranges between 30 and 150 m or 50 and 250 m). This results in an alignment error in the near range.
- Devices and methods for image fusion are known, for example, from DE 102 27 171 A1 and DE 103 04 703 A1 of the Applicant.
- The devices described therein each have a camera device with a visual camera providing a visual image containing digital data of the environment and an infrared camera providing an infrared image containing digital data of the surroundings.
- the visual image shows the visually visible objects.
- the infrared image shows the infrared radiation emanating from the visually visible and / or further objects.
- An image processing unit performs image fusion of the visual image and the infrared image, wherein the merged target image can be displayed in an image display unit.
- The image fusion comprises a complete or partial overlay of the visual image and the infrared image, e.g. pixel by pixel or pixel area by pixel area; in particular, temporally and spatially identical image pairs are superimposed. Brightness and / or color values of the pixels or pixel regions can be superimposed or averaged. For further optimization, the superimposition of the images can be carried out by means of weighting factors; optionally, the brightness and / or the visibility conditions around the vehicle are taken into account. Different weighting of individual pixels or pixel areas is also proposed. The disadvantage of these known methods is the high computational complexity required to display object-faithful target images over a wide range of distances.
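Such a pixel-wise superposition with weighting factors can be sketched as a weighted average of gray values; the function name, the single weight `alpha`, and the 8-bit value range are illustrative assumptions, not taken from the patent:

```python
def fuse_pixels(visual, infrared, alpha=0.5):
    """Fuse two equally sized grayscale images pixel by pixel.

    alpha weights the infrared contribution, (1 - alpha) the visual
    contribution. Pixel values are assumed to be in the 0-255 range.
    """
    assert len(visual) == len(infrared)
    fused = []
    for v_row, ir_row in zip(visual, infrared):
        fused.append([round((1 - alpha) * v + alpha * ir)
                      for v, ir in zip(v_row, ir_row)])
    return fused

# With equal weighting, each fused pixel is the plain average.
visual = [[100, 200], [50, 0]]
infrared = [[200, 100], [150, 255]]
fused = fuse_pixels(visual, infrared, alpha=0.5)
```

In practice the weight could itself vary per pixel or per region, which is where the computational complexity criticized above comes from.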
- the object of the present invention is therefore to provide an improved method and an improved device for visualizing the surroundings of a vehicle, which enable a reliable interpretation of the contents of a target image generated from an image fusion.
- a visual image containing digital data of the environment wherein the visual image shows the visually visible objects.
- an infrared image containing digital data of the environment which shows the infrared radiation emanating from the visually visible and / or further objects.
- An image fusion is performed which fuses the visual image and the infrared image into a target image displayable in an image display unit; this facilitates the assignment of objects from which infrared radiation emanates within the recorded environment.
- According to the invention, the image fusion of the visual image and / or the infrared image is suspended as a function of an environmental parameter, so that only one of the two images, or neither image, is displayed as the target image.
- the invention thus proposes to suppress the image fusion as a function of an environmental parameter.
- Suspending the image fusion upon occurrence of the environmental parameter removes information from the displayable target image.
- This makes it easier for the user of a vehicle to interpret the target image displayed in the image display unit, so that the user is less distracted from the traffic in the surroundings of the vehicle.
- The visual image and / or the infrared image is hidden from the target image in order to avoid fuzziness and / or double images.
- the latter means that no target image is displayed in the image display unit.
- In accordance with a further expedient embodiment of the method according to the invention, the environmental parameter is determined from at least one sensed vehicle-dynamics variable and / or at least one further parameter.
- As the vehicle-dynamics variable, the speed of the vehicle, in particular the undershooting or exceeding of a predetermined speed, and / or the distance to an object detected by the camera, in particular the undershooting or exceeding of a predetermined distance, is processed.
- Both parameters can also be taken into account in combination.
- a further expedient embodiment provides to process the current position of the vehicle as the vehicle dynamics variable.
- the topology and / or the current weather conditions are processed in a further embodiment.
- A deactivation of the image fusion depending on the topology or on the current position of the vehicle, which can be determined e.g. from GPS data provided by a GPS system, is expedient, for example, when driving through a tunnel.
- In a tunnel, an infrared camera is hardly able to present the information relevant to the vehicle user in a meaningful way. Overlaying the infrared image with the visual image would therefore not increase the information content for the vehicle user, so that it makes sense, e.g., to hide the infrared image from the target image.
- The same applies to bad weather conditions such as rain, in which the infrared camera cannot provide a sufficiently good resolution of the environment.
- A further embodiment provides that a parameter selectable by the vehicle user is processed by the image processing unit as the further parameter.
- the further parameter would thus correspond to a deactivation and / or activation of the image fusion actively triggered by the vehicle user.
- As a further parameter, the entropy of an image of the environment can be processed; it measures the informational value of the image. For example, if an image is in saturation, i.e. overexposed, it provides no information that would be meaningful to the vehicle user. Upon detection of such a situation, the image can be hidden, e.g. when it has too little contrast to allow a situation to be interpreted.
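A minimal sketch of such a saturation check, assuming 8-bit gray values: the patent mentions entropy, but a simple min-max spread is used here as an illustrative stand-in, and the threshold value is an assumption:

```python
def too_little_contrast(pixels, min_spread=30):
    """Flag an image whose gray values are nearly uniform (e.g. fully
    saturated), so it carries too little information to be worth fusing.

    min_spread is an illustrative threshold in gray levels (0-255).
    """
    flat = [p for row in pixels for p in row]
    return max(flat) - min(flat) < min_spread

saturated = [[255, 255], [254, 255]]   # almost entirely overexposed
normal = [[0, 128], [255, 64]]         # wide spread of gray values
```

A real implementation would likely use a histogram-based entropy measure instead of the raw spread, but the decision structure is the same: compute a scalar information measure, compare it to a threshold, hide the image if it falls below.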
- The decision as to whether the image fusion of the visual image and / or the infrared image is performed or not can be made dependent on the occurrence of one or more of the above-mentioned environmental parameters.
- The suspension of the image fusion can in particular be made dependent on the simultaneous occurrence of several parameters. It is also conceivable to suspend the image fusion when environmental parameters occur in temporal succession.
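The suspension logic described above might be sketched as follows; the parameter names, threshold values, and the choice of which single image to fall back to are illustrative assumptions, since the patent deliberately leaves them open:

```python
def fusion_mode(distance_m, speed_kmh, in_tunnel=False,
                min_distance_m=30.0, min_speed_kmh=30.0):
    """Decide what to display, combining several environmental parameters.

    Returns "fusion", "infrared_only", or "visual_only". All thresholds
    are illustrative; the patent does not fix concrete values.
    """
    if in_tunnel:
        # An infrared camera adds little inside a tunnel: show visual only.
        return "visual_only"
    if distance_m < min_distance_m or speed_kmh < min_speed_kmh:
        # Close range / low speed: alignment errors of a far-calibrated
        # camera pair would dominate, so suspend the fusion.
        return "infrared_only"
    return "fusion"
```

The same structure extends naturally to further parameters (weather, user selection, image entropy) by adding branches or combining conditions, matching the simultaneous or successive occurrence of parameters mentioned above.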
- A device for visualizing the surroundings of a vehicle, in particular in the dark, has a camera device comprising a visual camera providing a visual image containing digital data of the surroundings and an infrared camera providing an infrared image containing digital data of the surroundings, wherein the visual image shows the visually visible objects and the infrared image shows the infrared radiation emanating from the visually visible and / or further objects.
- An image processing unit is provided for processing the visual image and the infrared image, the image processing unit being designed and / or configured to perform an image fusion of the visual image and the infrared image.
- An image display unit serves to display the image of the environment generated by the image processing unit. According to the invention, the image processing unit is set up to suppress the image fusion as a function of an environmental parameter.
- At least one sensor coupled to the image processing unit is provided for determining a dynamic driving variable.
- the image processing unit can be supplied with a further parameter which can be determined either by sensors or is supplied by an external means.
- the further parameter could, for example, be transmitted to the image processing unit using mobile technologies or via GPS.
- the image processing unit is set up to determine the environmental parameter from the driving-dynamics variable and / or the further parameter.
- the camera device is calibrated to a defined distance range.
- the defined distance range can be either near or far.
- A near range is understood to mean a situation corresponding to city driving, in which distances between 15 and 75 m are relevant.
- A far range is understood in this application to mean a driving situation typical of a country road, in particular with a distance range of about 30 to 150 m, or a driving situation typical of a highway, which in particular includes a distance range of about 50 to 250 m.
- the camera device can be calibrated at any distance.
- If the camera apparatus is calibrated to the far range, there are alignment errors in the near range due to the offset optical axes of the cameras, which can cause irritation.
- Essential parameters influencing the image fusion are therefore the distance of the vehicle to a vehicle ahead located in its vicinity and / or the undershooting of a certain speed.
- Conversely, the camera device can be calibrated to the near range, resulting in a corresponding aberration in the far range. If a certain distance and / or a certain speed is exceeded, the image fusion can then be suspended.
- Fig. 1 is a block diagram of a device according to the invention for visualizing the environment of a vehicle
- Fig. 2 shows the relationship between the parallax error and the distance in a camera device with two cameras having substantially parallel optical axes.
- The device according to the invention, shown as a block diagram in FIG. 1, has a camera device 10 with an electronic visual camera 101 (e.g. a CCD sensor) sensitive in the visible spectral range and an electronic infrared camera 102 (e.g. an IR sensor) sensitive in the infrared spectral range of approximately 8 to 10 µm.
- the visual camera 101 preferably provides a color, visual image.
- Both cameras 101 and 102 are preferably aligned parallel to each other, which minimizes the parallax error, and are preferably mounted close to each other, which minimizes alignment errors.
- the image planes of both cameras or sensors are preferably aligned parallel to each other and perpendicular to the optical axis and are close to each other.
- The photosensitive sensor surfaces of both cameras or sensors are preferably neither rotated nor tilted relative to one another, but arranged largely parallel to one another. Both cameras or sensors preferably also have the same aperture angle. As a result, the cameras or sensors provide images of different spectral regions that show largely the same detail of the surroundings and are not twisted relative to one another or to the actual situation. In this way the effort needed to process the images into a fused image, and thus the hardware and software outlay, can be significantly reduced.
- An image processing unit 20 connected to the camera apparatus 10 includes a first normalizing device 103, a second normalizing device 104, an alignment device 105, and a superposition or fusion device 106.
- the target image generated by the fusion device 106 can be displayed in an image display unit 30.
- the device shown in Fig. 1 is calibrated by a calibration device to a certain distance range.
- The calibration device has, for example, several incandescent lamps, which are preferably arranged in a checkerboard pattern.
- The incandescent lamps are characterized by the fact that they emit both heat radiation and visually visible light. A plate provided with a plurality of incandescent lamps, or the like, is arranged at a distance in front of the two cameras 101, 102.
- The calibration device located in front of the cameras 101, 102, which is preferably arranged in a dark environment and away from heat sources, generates in the visual camera 101 a so-called visual image showing the checkerboard-like arrangement of lamps as the human eye would see it. Furthermore, the calibration device generates in the infrared camera 102 a thermal image which also shows the arrangement of the incandescent lamps. Typically, both the visual image and the infrared image show distortions at the edges of the respective image, in particular due to optical aberrations, etc. In a known manner, the distortions or aberrations in the visual image are largely eliminated by the first normalizing device 103.
- the distortions or aberrations in the infrared image are largely eliminated by the second normalizing device 104.
- The normalization or error correction is preferably carried out by means of known software measures on the digital data of the images, using calibration parameters for the visual image and calibration parameters 108 for the infrared image.
- The normalized images are aligned with each other by the alignment device 105 using registration parameters 109, by a registration process known per se in digital image processing.
- one of the images preferably remains unchanged and serves as a reference for the other image.
- the second image is changed in size and position so that a largely object-like image is created relative to the first image.
- The alignment of the normalized images can be divided into three steps: translation, rotation, and scaling.
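The three alignment steps can be sketched as an affine mapping of pixel coordinates from one image into the other image's frame; the order of operations (scale, then rotate, then translate) and the parameter names are assumptions for illustration, since registration conventions vary:

```python
import math

def align_point(x, y, dx=0.0, dy=0.0, angle_rad=0.0, scale=1.0):
    """Map a pixel coordinate of the second image onto the first image's
    frame: scale, then rotate, then translate (one common convention)."""
    xs, ys = x * scale, y * scale
    xr = xs * math.cos(angle_rad) - ys * math.sin(angle_rad)
    yr = xs * math.sin(angle_rad) + ys * math.cos(angle_rad)
    return (xr + dx, yr + dy)

# A pure translation leaves the geometry unchanged and just shifts it.
shifted = align_point(10, 20, dx=5, dy=-2)
```

Determining the registration parameters themselves (here `dx`, `dy`, `angle_rad`, `scale`) from the calibration images is the expensive part in practice; applying them, as above, is cheap.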
- the aligned images are superimposed or merged in the overlay or fusion device 106 by the processing of their digital data.
- A fused or superimposed image, which is displayed to the driver in an image display unit 30 in the vehicle, is generated from each temporally and spatially identical, i.e. object-faithful, image pair of visual image and infrared image.
- The fusion of the temporally and spatially identical image pairs from the visual image and the infrared image takes place on the basis of individual mutually associated pixel pairs from both images, or by using a plurality of pixels from the two images.
- This choice can in particular be based on which resolution is desired and / or which computing power is available for digital image processing.
- The images, preprocessed as described, are superimposed and displayed by digital processing of their image data. The process can be roughly compared to superimposing transparencies or slides of the same scene or driving environment.
- Computationally, i.e. in digital image processing, this is achieved by averaging the pixel information, in particular taking into account its brightness in the respective images and the color information contained in the visual image and / or in the infrared image. This need not be done pixel by pixel, but can also be done by averaging over spatially and temporally identical pixel areas in both images.
- It can also be expedient to weight the pixel information of the infrared image differently in the averaging from the temporally and spatially identical pixel information of the visual image.
- This different weighting can be done, for example, depending on daylight and / or weather and / or the headlights of the motor vehicle and / or the color in the visual image. In this way it can be achieved, for example, that a red traffic light stands out particularly clearly in the fusion image.
- To avoid irritation of the vehicle user, when the image fusion is suspended the target image displayed in the image display unit 30 is limited to the reproduction of either the visual image or the infrared image.
- FIG. 2 shows the relationship between the parallax error and the distance of the camera device from the object it records. A situation is shown in which the device is optimized for a parallax-free representation in the far range. If the parallax error exceeds a certain threshold, which depends on various environmental parameters, e.g. on the speed and / or the distance and / or the weather conditions and / or the topology and / or the vehicle environment, the image fusion is suspended and the image of the visual camera or of the infrared camera that is more appropriate to the situation is displayed instead. In the exemplary embodiment, this threshold is set at a distance which is smaller than the "optimum distance" set by calibration. The threshold need not be fixed; it may depend dynamically on different parameters or be defined by a range.
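For two cameras with parallel optical axes, the hyperbolic shape of the curve in FIG. 2 follows from stereo geometry: an object at distance Z appears offset between the two images by roughly f·b/Z (the disparity), where b is the camera baseline and f the focal length in pixels. A minimal sketch of the resulting threshold test; the baseline, focal length, and threshold values below are illustrative assumptions, not taken from the patent:

```python
def parallax_px(baseline_m, focal_px, distance_m):
    """Disparity in pixels for an object at distance_m, for two parallel
    cameras separated by baseline_m, with focal length focal_px in pixels."""
    return focal_px * baseline_m / distance_m

def fusion_suspended(baseline_m, focal_px, distance_m, threshold_px):
    """Suspend the fusion once the parallax error exceeds the threshold."""
    return parallax_px(baseline_m, focal_px, distance_m) > threshold_px

# Illustrative values: 20 cm baseline, 800 px focal length.
far_error = parallax_px(0.2, 800, 100.0)     # small error at 100 m
near = fusion_suspended(0.2, 800, 10.0, 5.0) # large error at 10 m
```

This makes the trade-off in the description concrete: a device calibrated for the far range keeps the disparity negligible there, while at close range the error grows as 1/Z and eventually crosses the suspension threshold.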
- If the vehicle moves closer to a vehicle ahead, the misalignment of the merged images of the visual camera and the infrared camera in the target image increases.
- By means of a sensor 40 coupled to the image processing unit, it is determined at which distance between the object and the camera device 10 a certain alignment error is exceeded. Once this alignment error is exceeded, the image fusion is deactivated and only the image of the infrared camera is displayed. Alternatively, in another embodiment, only the image of the visual camera can be displayed. If, in another exemplary embodiment, the vehicle falls below a predetermined speed, the range of the vehicle's own headlights is sufficient for the vehicle user to obtain adequate information about events in the immediate vicinity.
- In this case the image fusion can be suspended, the image of the visual camera omitted from the target image, and only the image of the infrared camera displayed. This ensures that people, animals, etc. are displayed even in a poorly lit environment, such as a housing estate or a forest parking lot.
- the speed is determined by a speed sensor 41 coupled to the image processing unit 20.
- The two embodiments describe applications in which the image fusion is suspended at small distances and / or at low speeds of the vehicle.
- The invention can also be applied when the device is calibrated to the near range. In this case, the image fusion would be suspended if a predetermined distance and / or a certain speed were exceeded.
- The advantage of the procedure according to the invention is that suspending the image fusion prevents aberrations in the target image that could lead to irritation.
- The device according to the invention thus allows the use of an image fusion device that is calibrated only for one distance range. Complex control algorithms that avoid aberrations in all distance ranges, and that would have to be determined by elaborate calculations, can therefore be dispensed with.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Closed-Circuit Television Systems (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
The invention relates to a method for visualizing the surroundings of a vehicle, in particular in the dark. The method comprises: providing a visual image containing digital data of the surroundings and showing the visually perceptible objects; and also providing an infrared image containing digital data of the surroundings and showing the infrared radiation emitted by the visually perceptible and/or other objects. To facilitate the assignment of objects emitting infrared radiation within the recorded environment, an image fusion is performed in which the visual image and the infrared image are fused into a target image that can be shown on a display device. The image fusion is suspended as a function of an environmental parameter, so that only one of the two images, or neither image, is displayed as the target image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102005006290A DE102005006290A1 (de) | 2005-02-11 | 2005-02-11 | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs durch Fusion eines Infrarot- und eines Visuell-Abbilds |
| PCT/EP2006/000133 WO2006084534A1 (fr) | 2005-02-11 | 2006-01-10 | Procede et dispositif de visualisation de l'environnement d'un vehicule par fusion d'une image infrarouge et d'une representation visuelle |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP1846794A1 true EP1846794A1 (fr) | 2007-10-24 |
Family
ID=36061713
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP06700891A Ceased EP1846794A1 (fr) | 2005-02-11 | 2006-01-10 | Procede et dispositif de visualisation de l'environnement d'un vehicule par fusion d'une image infrarouge et d'une representation visuelle |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US9088737B2 (fr) |
| EP (1) | EP1846794A1 (fr) |
| JP (1) | JP2008530667A (fr) |
| CN (1) | CN101107556B (fr) |
| DE (1) | DE102005006290A1 (fr) |
| WO (1) | WO2006084534A1 (fr) |
Families Citing this family (46)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4466571B2 (ja) * | 2005-05-12 | 2010-05-26 | 株式会社デンソー | ドライバ状態検出装置、車載警報装置、運転支援システム |
| JP4685050B2 (ja) * | 2007-03-19 | 2011-05-18 | 本田技研工業株式会社 | 表示装置 |
| JP4853437B2 (ja) * | 2007-09-18 | 2012-01-11 | 株式会社デンソー | 車両周辺監視システム |
| DE102008047644B4 (de) * | 2008-09-17 | 2015-09-10 | Siemens Aktiengesellschaft | Verfahren zur Registrierung zweier Bildgebungsmodalitäten |
| JP5749442B2 (ja) * | 2010-03-11 | 2015-07-15 | 株式会社日立情報通信エンジニアリング | 監視装置 |
| CN102128686B (zh) * | 2010-12-14 | 2012-11-21 | 天津理工大学 | 红外显微测温仪 |
| KR101349025B1 (ko) * | 2011-11-25 | 2014-01-10 | 현대자동차주식회사 | 원적외선 스마트 나이트 뷰를 위한 차선 정보 합성 장치 및 방법 |
| US8994845B2 (en) * | 2012-04-27 | 2015-03-31 | Blackberry Limited | System and method of adjusting a camera based on image data |
| EP2658245B1 (fr) * | 2012-04-27 | 2016-04-13 | BlackBerry Limited | Système et procédé de réglage de données d'image de caméra |
| CN102700472B (zh) * | 2012-06-13 | 2015-04-15 | 博立码杰通讯(深圳)有限公司 | 车辆驾驶辅助设备及方法 |
| CN102745135A (zh) * | 2012-07-24 | 2012-10-24 | 苏州工业园区七星电子有限公司 | 一种主动式车辆红外夜视系统 |
| DE102012215465A1 (de) * | 2012-08-31 | 2014-03-06 | Robert Bosch Gmbh | Verfahren und Informationssystem zum Filtern von Objektinformationen |
| KR101858646B1 (ko) | 2012-12-14 | 2018-05-17 | 한화에어로스페이스 주식회사 | 영상 융합 장치 및 방법 |
| KR101470198B1 (ko) * | 2013-07-29 | 2014-12-05 | 현대자동차주식회사 | 영상 합성 장치 및 방법 |
| US10154208B2 (en) * | 2013-07-31 | 2018-12-11 | Maxell, Ltd. | Imaging device, imaging method, and on-board imaging system |
| JP6136948B2 (ja) * | 2014-01-24 | 2017-05-31 | 株式会社Jvcケンウッド | 撮像装置、映像信号処理方法及び映像信号処理プログラム |
| US9990730B2 (en) | 2014-03-21 | 2018-06-05 | Fluke Corporation | Visible light image with edge marking for enhancing IR imagery |
| US9723224B2 (en) * | 2014-03-31 | 2017-08-01 | Google Technology Holdings LLC | Adaptive low-light identification |
| KR101601475B1 (ko) * | 2014-08-25 | 2016-03-21 | 현대자동차주식회사 | 야간 주행 시 차량의 보행자 검출장치 및 방법 |
| DE102014115294A1 (de) * | 2014-10-21 | 2016-04-21 | Connaught Electronics Ltd. | Kamerasystem für ein Kraftfahrzeug, Fahrerassistenzsystem, Kraftfahrzeug und Verfahren zum Zusammenführen von Bilddaten |
| CN107107834A (zh) * | 2014-12-22 | 2017-08-29 | 富士胶片株式会社 | 投影型显示装置、电子设备、驾驶者视觉辨认图像共享方法以及驾驶者视觉辨认图像共享程序 |
| CN104811624A (zh) * | 2015-05-06 | 2015-07-29 | 努比亚技术有限公司 | 红外拍摄方法及装置 |
| CN107851311B (zh) * | 2015-06-15 | 2023-01-13 | 前视红外系统股份公司 | 对比度增强的结合图像生成系统和方法 |
| KR102384175B1 (ko) * | 2015-07-29 | 2022-04-08 | 주식회사 만도모빌리티솔루션즈 | 차량의 카메라 장치 |
| US10152811B2 (en) | 2015-08-27 | 2018-12-11 | Fluke Corporation | Edge enhancement for thermal-visible combined images and cameras |
| DE102015216908A1 (de) * | 2015-09-03 | 2017-03-09 | Robert Bosch Gmbh | Verfahren zum Erkennen von Objekten auf einer Abstellfläche |
| US10359618B2 (en) | 2016-01-11 | 2019-07-23 | Nikon Corporation | Multispectral stereoscopic endoscope system and use of same |
| CN105759273A (zh) * | 2016-02-17 | 2016-07-13 | 吴伟民 | 车辆障碍物检测方法及系统 |
| JP6445480B2 (ja) * | 2016-03-23 | 2018-12-26 | トヨタ自動車株式会社 | Soi基板の製造方法 |
| US10412381B1 (en) * | 2016-11-22 | 2019-09-10 | Northrop Grumman Systems Corporation | Calibration target for image sensor |
| DE102017200915A1 (de) * | 2017-01-20 | 2018-07-26 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zum Anzeigen eines Hinweises für einen Anwender und Arbeitsvorrichtung |
| TW201928769A (zh) * | 2017-12-27 | 2019-07-16 | 宇博先進電子工業有限公司 | 標示物體溫度的方法 |
| CN108995591A (zh) * | 2018-08-01 | 2018-12-14 | 北京海纳川汽车部件股份有限公司 | 车辆全景透视显示方法、系统及具有其的车辆 |
| JP7254461B2 (ja) * | 2018-08-01 | 2023-04-10 | キヤノン株式会社 | 撮像装置、制御方法、記録媒体、および、情報処理装置 |
| EP3715993B1 (fr) | 2019-03-25 | 2022-09-14 | Hiab AB | Véhicule comprenant un équipement de travail, équipement de travail et procédé associé |
| SE1950427A1 (en) | 2019-04-05 | 2020-10-06 | Cargotec Patenter Ab | A vehicle comprising a working equipment, and a working equipment, and a method in relation thereto |
| CN110362071B (zh) * | 2019-05-24 | 2022-07-22 | 江西理工大学 | 基于多光谱成像技术的人工智能控制方法及装置 |
| US12366855B2 (en) * | 2020-02-26 | 2025-07-22 | Polaris Industries Inc. | Environment monitoring system and method for a towed recreational vehicle |
| US11418719B2 (en) * | 2020-09-04 | 2022-08-16 | Altek Semiconductor Corp. | Dual sensor imaging system and calibration method which includes a color sensor and an infrared ray sensor to perform image alignment and brightness matching |
| US11568526B2 (en) | 2020-09-04 | 2023-01-31 | Altek Semiconductor Corp. | Dual sensor imaging system and imaging method thereof |
| US11496694B2 (en) | 2020-09-04 | 2022-11-08 | Altek Semiconductor Corp. | Dual sensor imaging system and imaging method thereof |
| US11689822B2 (en) | 2020-09-04 | 2023-06-27 | Altek Semiconductor Corp. | Dual sensor imaging system and privacy protection imaging method thereof |
| US11496660B2 (en) | 2020-09-04 | 2022-11-08 | Altek Semiconductor Corp. | Dual sensor imaging system and depth map calculation method thereof |
| DE102021132334A1 (de) * | 2021-12-08 | 2023-06-15 | Bayerische Motoren Werke Aktiengesellschaft | Abtasten eines Umfelds eines Fahrzeugs |
| CN115170817B (zh) * | 2022-07-21 | 2023-04-28 | 广州大学 | 基于三维人-物网格拓扑增强的人物交互检测方法 |
| CN116645417B (zh) * | 2023-05-29 | 2024-12-24 | 深圳市镭神智能系统有限公司 | 阴影区域调整方法、装置、车辆和可读存储介质 |
Family Cites Families (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5001558A (en) * | 1985-06-11 | 1991-03-19 | General Motors Corporation | Night vision system with color video camera |
| US4751571A (en) * | 1987-07-29 | 1988-06-14 | General Electric Company | Composite visible/thermal-infrared imaging apparatus |
| JPH01268284A (ja) * | 1988-04-19 | 1989-10-25 | Fujitsu Ltd | 画像合成装置 |
| GB2291304A (en) * | 1994-07-07 | 1996-01-17 | Marconi Gec Ltd | Head-mountable display system |
| US6163309A (en) * | 1998-01-16 | 2000-12-19 | Weinert; Charles L. | Head up display and vision system |
| JP4103179B2 (ja) * | 1998-06-30 | 2008-06-18 | マツダ株式会社 | 環境認識装置 |
| JP2001101596A (ja) * | 1999-09-27 | 2001-04-13 | Mazda Motor Corp | 車両の表示装置 |
| JP2002083285A (ja) * | 2000-07-07 | 2002-03-22 | Matsushita Electric Ind Co Ltd | 画像合成装置および画像合成方法 |
| US6646799B1 (en) * | 2000-08-30 | 2003-11-11 | Science Applications International Corporation | System and method for combining multiple energy bands to improve scene viewing |
| JP2002091407A (ja) * | 2000-09-20 | 2002-03-27 | Ricoh Co Ltd | 画像表示装置 |
| JP2002237969A (ja) * | 2001-02-09 | 2002-08-23 | Hitachi Ltd | 車載カメラおよび画像処理システム |
| KR100866450B1 (ko) * | 2001-10-15 | 2008-10-31 | 파나소닉 주식회사 | 차량 주위 감시 장치 및 그 조정 방법 |
| JP2003200755A (ja) * | 2001-12-28 | 2003-07-15 | Yazaki Corp | 車両用表示装置 |
| DE10207039A1 (de) * | 2002-02-20 | 2003-09-04 | Bayerische Motoren Werke Ag | Verfahren und Vorrichtung zur Sichtbarmachung eines Ausschnitts der Umgebung eines Fahrzeugs sowie eine Kalibriervorrichtung zur Kalibrierung der Vorrichtung |
| DE10227171B4 (de) * | 2002-06-18 | 2019-09-26 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs mit abstandsabhängiger Fusion eines Infrarot- und eines Visuell-Abbilds |
| US20070035625A9 (en) * | 2002-12-20 | 2007-02-15 | Hamdan Majed M | Vehicle video processing system |
| JP2004235987A (ja) * | 2003-01-30 | 2004-08-19 | Matsushita Electric Ind Co Ltd | 運転支援装置 |
| DE10304703B4 (de) * | 2003-02-06 | 2023-03-16 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs mit umgebungsabhängiger Fusion eines Infrarot- und eines Visuell-Abbilds |
| JP2004312402A (ja) * | 2003-04-08 | 2004-11-04 | Hitachi Ltd | 道路監視システム,道路監視装置 |
| JP3879696B2 (ja) * | 2003-04-25 | 2007-02-14 | 日産自動車株式会社 | 運転支援装置 |
| US7538326B2 (en) * | 2004-12-03 | 2009-05-26 | Fluke Corporation | Visible light and IR combined image camera with a laser pointer |
2005
- 2005-02-11 DE DE102005006290A patent/DE102005006290A1/de not_active Ceased

2006
- 2006-01-10 CN CN200680002767.1A patent/CN101107556B/zh active Active
- 2006-01-10 JP JP2007554449A patent/JP2008530667A/ja active Pending
- 2006-01-10 EP EP06700891A patent/EP1846794A1/fr not_active Ceased
- 2006-01-10 WO PCT/EP2006/000133 patent/WO2006084534A1/fr not_active Ceased

2007
- 2007-08-07 US US11/890,495 patent/US9088737B2/en active Active
Non-Patent Citations (1)
| Title |
|---|
| See references of WO2006084534A1 * |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2008530667A (ja) | 2008-08-07 |
| DE102005006290A1 (de) | 2006-08-24 |
| CN101107556A (zh) | 2008-01-16 |
| WO2006084534A1 (fr) | 2006-08-17 |
| US9088737B2 (en) | 2015-07-21 |
| US20080024608A1 (en) | 2008-01-31 |
| CN101107556B (zh) | 2010-12-22 |
Similar Documents
| Publication | Title |
|---|---|
| EP1846794A1 (fr) | Procede et dispositif de visualisation de l'environnement d'un vehicule par fusion d'une image infrarouge et d'une representation visuelle |
| DE10304703B4 (de) | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs mit umgebungsabhängiger Fusion eines Infrarot- und eines Visuell-Abbilds |
| DE102012025322B4 (de) | Kraftfahrzeug mit Kamera-Monitor-System |
| EP2765031B1 (fr) | Système de vision pour véhicules, notamment véhicules utilitaires |
| DE112018000171B4 (de) | Vorrichtung und Verfahren zur Anzeige von Informationen |
| EP1504960B1 (fr) | Appareil et méthode pour améliorer la vision dans un vehicule |
| DE102014018040A1 (de) | Sichtsystem |
| DE102020107789B4 (de) | Sichtsystem für ein Fahrzeug und Verfahren zum Umschalten zwischen von dem Sichtsystem dargestellten Bildbereichen |
| EP3434523A1 (fr) | Système de vision indirecte pour un véhicule |
| EP1339228B1 (fr) | Méthode et dispositif pour la visualisation d'une portion de l'environnement d'un véhicule et unité d'étalonnage pour calibrer le dispositif |
| DE102012200133A1 (de) | Verfahren und Vorrichtung zur Fahrerinformation |
| DE10218175B4 (de) | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs mit fahrsituationsabhängiger Fusion eines Infrarot- und eines Visuell-Abbilds |
| DE102013224954A1 (de) | Verfahren und Vorrichtung zum Erzeugen einer Warnung mittels zweier durch Kameras erfasster Bilder einer Fahrzeugumgebung |
| DE102012018556B4 (de) | Assistenzsystem zur Ermöglichung einer erweiterten Vorausschau für nachfolgende Verkehrsteilnehmer |
| DE10227171B4 (de) | Verfahren und Vorrichtung zur Sichtbarmachung der Umgebung eines Fahrzeugs mit abstandsabhängiger Fusion eines Infrarot- und eines Visuell-Abbilds |
| DE102022120236B3 (de) | Verfahren zum harmonisierten Anzeigen von Kamerabildern in einem Kraftfahrzeug und entsprechend eingerichtetes Kraftfahrzeug |
| DE102013220022B4 (de) | Fahrzeugkamera zur Erfassung von Bildern aus einem Umgebungsbereich eines Fahrzeugs und Fahrzeug |
| DE102018202683B3 (de) | Kamerasystem sowie Verfahren zum Erfassen eines Einsatzfahrzeugs mit Sonderzeichen |
| DE102018110597B4 (de) | Verfahren zur Bildharmonisierung, Bildverarbeitungseinrichtung, Kamerasystem und Kraftfahrzeug |
| WO2022083833A1 (fr) | Système pour éviter les accidents provoqués par des animaux sauvages qui traversent au crépuscule et la nuit |
| DE102015210870A1 (de) | Vorrichtung und Verfahren zum Erfassen eines Bildes einer Fahrzeugumgebung |
| DE102014111186A1 (de) | Objekthervorhebung und -erkennung bei Fahrzeugbildanzeigesystemen |
| DE102005000775B4 (de) | Verfahren zur Überwachung eines Objektraumes von einem Kraftfahrzeug aus |
| DE102022101893A1 (de) | Verfahren zur Darstellung von Augmented Reality Informationen in Fahrzeugen |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | 17P | Request for examination filed | Effective date: 20070623 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): DE FR GB |
| | 17Q | First examination report despatched | Effective date: 20071114 |
| | DAX | Request for extension of the european patent (deleted) | |
| | RBV | Designated contracting states (corrected) | Designated state(s): DE FR GB |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20090528 |