
WO2019072579A1 - Dynamic combination of partial images into a representation of an environment of a vehicle - Google Patents

Dynamic combination of partial images into a representation of an environment of a vehicle

Info

Publication number
WO2019072579A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
sensor
partial
joint line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/076299
Other languages
German (de)
English (en)
Inventor
Philipp Hoffmann
Guenter Bauer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG filed Critical Bayerische Motoren Werke AG
Publication of WO2019072579A1 publication Critical patent/WO2019072579A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing

Definitions

  • The present invention relates to a method of assembling partial images into an image of a contiguous surrounding area of a vehicle.
  • In particular, the present invention relates to a dynamic joining of partial images that avoids distortions in the area of a joint line or joint lines.
  • The present invention also relates to a driver assistance system using the method for joining partial images, as well as to a corresponding vehicle.
  • Driver assistance systems are known which record ambient signals via sensors and present the environment to the driver. For example, it is known to provide cameras for image capture to monitor the rear traffic space, whose images are displayed to the driver on a display.
  • WO 1999/015360 A1 discloses a viewing device with cameras for a vehicle. Through the use of image sensors it replaces interior and exterior rearview mirrors in the motor vehicle and serves e.g. as a parking aid and to reduce wind noise to a minimum.
  • DE 10 2014 213 536 A1 discloses a driver assistance system and a method for joining partial images into an image of a contiguous surrounding area, using image-processing techniques such as warping, scaling and alpha blending.
  • From a 2D camera mounted at the rear of the vehicle, a variety of data packets can be calculated whose perspectives correspond to virtual sensors.
  • There, the perspective of the outer virtual sensors at the joint lines is identical to the perspective of the respective outside camera, so that the two joint lines are arranged statically, symmetrically and parallel to one another.
  • The two outdoor cameras provide the outer parts of the composite image.
  • The present invention relates in particular to a method for assembling partial images into an image of a contiguous surrounding area of a vehicle, in which a first partial area of the surrounding area is detected by means of a first environment sensor.
  • The surrounding area may be, for example, a rear area of the vehicle, lying approximately behind the front bumper and to the left or right of the vehicle.
  • The first environment sensor may be, for example, a two-dimensional (2D) optical camera. It can be arranged in the area of a fender and/or an exterior mirror of the vehicle.
  • A second partial area of the same surrounding area, which is at least not completely identical to the first partial area, is detected by means of a second environment sensor.
  • The second environment sensor may also be an optical sensor and, like the first environment sensor, in particular a two-dimensional (2D) optical camera.
  • The first and second environment sensors are embodied and/or arranged in such a way that they jointly detect one area of the first and second partial areas, so that more than one perspective exists for the area jointly detected by the two sensors.
  • The above common area detected by the first and second environment sensors is also detected by means of a further environment sensor, which is provided by a sensor arrangement.
  • The further environment sensor provided by the sensor arrangement can in particular advantageously be a 3D sensor arrangement, which is particularly suitable for capturing a 3D image of the area it covers and in particular a 3D image of an object located at least partially therein.
  • This creates the possibility of recognizing a critical object, which may in particular be a vehicle, and transferring it as a protected (not divided) region into a coordinate system of the visualization.
  • A critical object here, as in the entire context of the application, denotes in particular a vehicle of particular interest or risk potential for the given driving situation.
  • A first partial image of the first partial area is created on the basis of a signal of the first environment sensor, and a second partial image of the second partial area on the basis of a signal of the second environment sensor; corresponding to the area jointly detected by both 2D sensors, the partial images also include a common sub-picture.
  • A first joint line between the first and second partial images is determined dynamically, taking into account a signal of the further environment sensor; thereby a region bounded by the first joint line is dynamically assigned to the first or second partial image.
  • The joint line is determined with regard to its shape/course and/or its position.
  • The first and second partial images are joined together along the first joint line to form an image of the surrounding area.
  • A third partial area of the surrounding area is detected by means of a third environment sensor.
  • The second and third partial areas comprise a second common area detected by the second and third environment sensors, which is advantageously also detected by the further environment sensor and in particular the 3D sensor arrangement.
  • A third partial image of the third partial area is suitably generated on the basis of a signal of the third environment sensor, and a second joint line between the second and third partial images is determined dynamically, taking into account a signal of the further environment sensor and in particular the 3D sensor arrangement; thereby a region bounded by the second joint line is dynamically assigned to the second or third partial image.
  • The second and third partial images are joined along the second joint line to form an image of the surrounding area.
  • The third environment sensor can thus detect, for example, a surrounding area which corresponds to the first surrounding area mirrored with respect to the second.
  • The dynamic determination of the first joint line between the first and second partial images advantageously runs along an image of an object in the first and/or second partial image, the object being detected by the further environment sensor, and in particular the 3D sensor arrangement, at least partially in the first common area; the region bounded by the first joint line is thereby associated with the image of the object in the first or second partial image.
  • Likewise, the dynamic determination of the second joint line between the second and third partial images runs along an image of an object in the second and/or third partial image, the object being detected by the further environment sensor, and in particular the 3D sensor arrangement, at least partially in the second common area; the region bounded by the second joint line is likewise associated with the image of the object in the second or third partial image.
  • This creates the possibility, as stated above, of recognizing a critical object and transferring it as a protected (not divided) region into an image of a surrounding area displayed on a screen.
  • The assignment of the image of an object can be performed individually for each partial image, independently, symmetrically or asymmetrically to the other partial images.
  • The first and/or second joint line of the surrounding area may suitably be formed as a vertical or inclined line, which at least in a partial area may also be formed as a polygon along the outlines of the image of an object, and which for example may ideally be a cut along the contours of a critical object.
  • As first, second and third environment sensor, an optical 2D sensor may suitably be used, and as the 3D sensor arrangement a stereo camera or in particular a radar sensor, a lidar sensor or an ultrasonic sensor. The determination of the first and second joint line takes place dynamically for each frame, and/or the assignment of the region bounded by the first and second joint line, with the image of an object, to the first or second partial image and/or to the second or third partial image can be carried out dynamically and individually for each frame, depending on the frame rate of the image sequence. In this way, information loss in the display can advantageously be avoided and the individual images of an image sequence can be displayed fluidly.
  • Suitably, a region of the first partial image potentially located beyond the first joint line is removed prior to assembly, and/or a region of the second partial image potentially located beyond the first joint line and/or the second joint line is removed prior to assembly, and/or a region of the third partial image potentially located beyond the second joint line is removed prior to assembly.
  • Furthermore, a first 2D data set can be generated from the first partial image with respect to a first virtual projection surface, and/or a second 2D data set from the second partial image with respect to a second virtual projection surface, and/or a third 2D data set from the third partial image with respect to a third virtual projection surface, such that the first and/or second and/or third data sets projected onto these surfaces offer the user a suitable representation.
  • An advantageous position of the respective projection surface may also be determined dynamically and individually for each frame, taking into account a size and/or brightness of an object and/or a position of an object relative to the vehicle and/or a relative movement of an object and/or a relative speed of an object to the vehicle.
  • The first and third virtual projection surfaces respectively adjoin the second virtual projection surface at the first and second joint lines.
  • The first and/or second and/or third virtual projection surfaces may suitably have a concave, cylinder-section-like shape, with the user located inside the cylinder.
  • Suitably, the respective cylinder-section-like shapes are associated with a common cylinder, while the user or vehicle is substantially at the cylinder center.
  • The present invention more particularly relates to a driver assistance system with a first environment sensor, a second environment sensor, a further environment sensor, which may particularly advantageously be a 3D sensor arrangement, an evaluation unit, and a 2D display unit on which at least a portion of the image is displayed.
  • The evaluation unit may comprise, for example, a programmable processor in the form of a microcontroller or nanocontroller.
  • Such evaluation units are also referred to as electronic control units (ECU); the evaluation unit processes the image data and controls the display unit (screen) on the basis of the signals of the first and second environment sensors and of the further environment sensor.
  • The 2D display unit may be provided, for example, as a matrix display for mounting in a dashboard of a vehicle and/or in an instrument cluster of a vehicle.
  • What was stated above regarding the first environment sensor, the second environment sensor, the further environment sensor, in particular the 3D sensor arrangement, and an optionally usable third environment sensor applies correspondingly in connection with the method according to the invention.
  • The driver assistance system is set up to realize the features, feature combinations and the resulting advantages in accordance with the inventive method described above.
  • The present invention also relates, in particular, to a vehicle which can be designed, for example, as a car, a van, a truck, a watercraft and/or an aircraft.
  • The vehicle has a driver assistance system as described above.
  • Fig. 1 shows components of an exemplary embodiment of a vehicle according to the invention, taking into account the method according to the invention;
  • Fig. 3a shows a simplified representation of the components of the vehicle according to the invention of Fig. 1 together with legally prescribed fields of view; Fig. 3b shows an artifact-afflicted image of a driving situation;
  • Figs. 4a, 4b and 4c respectively show partial images merged into an image of a surrounding area according to the method of the invention;
  • Fig. 5a is an enlarged detail of Fig. 4a;
  • Fig. 5b is a modification of Fig. 5a.
  • Fig. 1 shows components of an exemplary embodiment of a vehicle 10 according to the invention, taking into account the method according to the invention, with dynamic joint lines 6, 61 in conjunction with static joint lines 60, 601 known from the prior art according to DE 10 2014 213 536 A1.
  • The embodiment of the vehicle 10 of Fig. 1 comprises a first optical 2D outdoor camera 1 at the position of the left exterior mirror, further 2D outdoor cameras 2 and 3, and a sensor arrangement 20 which is suitably designed and arranged to create a 3D image of the detected area II of the surrounding area IV and in particular of an object 11 at least partially located therein.
  • The sensor arrangement 20, thus formed as a 3D sensor arrangement, is designed such that a control unit 4 cooperating with it can detect in particular a relative position of the object 11 to the vehicle 10 and/or a relative movement of the object 11 to the vehicle 10 and/or a relative speed of the object 11 to the vehicle 10.
  • The sensor arrangement 20 is designed and arranged such that the area covered by it substantially corresponds to the area II detected by the 2D outdoor camera 2, so that the sensor arrangement 20 also detects the jointly detected areas I/II and II/III.
  • The sensor arrangement 20 may suitably be a stereo camera and in particular a radar sensor, lidar sensor or ultrasonic sensor.
  • The environment sensors 1, 2, 3 and the sensor arrangement 20 can detect in particular a rear surrounding area IV of the vehicle 10, also taking into account legal requirements, which will be described below with reference to Fig. 3a.
  • The signals of the cameras 1, 2, 3 and the sensor arrangement 20 are captured by the control unit 4 and used for displaying an image IV' of the surrounding area IV on a screen 5.
  • Further suitable signals, such as the speed and/or the direction of travel R10 of the vehicle 10, are available to the control unit 4.
  • The arrow B in Fig. 1 shows the viewing direction on the screen 5, and the oppositely directed arrow A shows the viewing direction of the cameras.
  • For ease of understanding of the application, Fig. 1 also shows conventional static joint lines 60, 601 together with, and in comparison to, dynamic joint lines (joining planes) 6, 61, each on the projection surfaces 7, 8 and 9, together with the partial images I', II' and III' of the first I, second II and third III surrounding areas projected onto them; along the corresponding joint lines 6, 61 the partial images I', II' and III' are composed into an image IV' of the surrounding area IV.
  • Taking into account signals of the sensor arrangement 20, the dynamic joint lines 6, 61 can advantageously be moved symmetrically or asymmetrically to each other along different directions R6, R61, can in particular also be inclined, and can also be moved to the edge of an object 11 detected by the sensor arrangement 20.
  • Figs. 2a, 2b and 2c respectively show steps of the conventional method using static joint lines 60, 601 for assembling the partial images I', II', III' into the image IV' of the surrounding area IV of a vehicle 10, in a driving situation and environmental situation suitable for the conventional method, with appropriate brightness of the surroundings.
  • Fig. 2a shows the partial image I' of the first partial area I with the first static joint line 60, the correspondingly rectified partial image II' of the second partial area II, and the third partial image III' of the third partial area III with the second static joint line 601, produced according to the conventional method with image processing suitable for rectification and arranged correspondingly symmetrical to each other.
  • Fig. 2b shows the partial images I', II' and III' composed into an image IV' of the surrounding area IV at the static joint lines 60 and 601, from the viewing direction A, i.e. directed backwards from the vehicle 10 in the viewing direction of the cameras 1, 2, 3 and the sensor arrangement 20.
  • Fig. 2c shows a mirrored representation IV'(B) of the image IV'(A) of Fig. 2b for display on the screen 5 in the vehicle 10.
  • The second partial image II' comprises a data-free region which is suitable, e.g., for displaying a section of the vehicle 10.
  • Fig. 3a shows a simplified representation of the components of the vehicle according to the invention of Fig. 1, together with legally prescribed fields of view: the statutory field of view of the class 3 main exterior mirror and the legal field of view of the class 1 interior mirror.
  • Fig. 3a uses identical reference numerals for the same components as the representation of Fig. 1, for which reason the relevant description is omitted here and reference is made instead to the description of Fig. 1.
  • Fig. 3b shows a driving situation with conditions unsuitable for the conventional method, such as limited brightness and/or visibility due to, for example, bad weather, in which an object 11 (due to its position and/or size and/or speed, such as a motorcyclist as shown in Fig. 3b) is distorted and, for example when both images are blended linearly, can disappear as a critical vehicle 11 in the area of the alpha blend.
  • With a clear separation of the partial images this would remain recognizable, but such a separation would destroy the panoramic impression of the image IV'.
  • Figs. 4a, 4b and 4c respectively show images of exemplary embodiments of partial images joined together to form an image of a surrounding area; as in Fig. 3b, for clarity, the reference numerals of the first partial image I' and the joint line 6 are not shown.
  • In the images of Figs. 4a, 4b and 4c, as in Fig. 3b, a section of the vehicle 10 is shown in the data-free region 12, in particular in the partial image II'.
  • Fig. 4a shows a driving situation with the dynamic joint line 61 shifted to one side of the object 11, in the direction of the partial image III', in which the object 11 is imaged solely by the rear-view camera 2 in the partial image II'.
  • Fig. 4b shows a driving situation with the dynamic joint line 61 shifted inward, in the direction of the center of the second partial image II', in which the object 11 is imaged solely by the camera 3 in the partial image III'.
  • Fig. 4c additionally shows a driving situation with three objects in the field of view.
  • Fig. 5a shows an enlarged detail of Fig. 4a with only the object 11, the joint line 61 between the partial image II', in which the object 11 is shown, and the partial image III'; as in Fig. 4a, the joint line 61 is a straight line shifted to the side of the object 11.
  • Fig. 5b shows a modification of Fig. 5a, in which the joint line 61 is shifted inward toward the center of the image, into the partial image II', and in the vicinity of the object 11 is also formed as a polygon oriented along the silhouette of the object 11.
  • first outer camera position: left outer rear view mirror
  • second outer camera position: rear side
  • third outer camera position: right outer rear view mirror
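To make the dynamic determination of a joint line described above concrete, the following sketch shifts a vertical seam inside the overlap region of two partial images to one side of a detected critical object, so the object is rendered undivided from a single camera. This is an illustrative example only, not the disclosed implementation; the function names and the column-based seam model are assumptions.

```python
from typing import Optional

import numpy as np


def choose_seam_column(overlap_width: int,
                       obj_left: Optional[int] = None,
                       obj_right: Optional[int] = None) -> int:
    """Pick a seam column inside the overlap region of two partial images.

    Without a detected object the seam stays at the static, symmetric
    position (prior-art behaviour). If a critical object spans
    [obj_left, obj_right) within the overlap, the seam is shifted to
    whichever side of the object requires the smaller displacement, so
    the object stays as a protected region in one partial image.
    """
    default = overlap_width // 2  # static symmetric seam as fallback
    if obj_left is None or obj_right is None:
        return default
    if abs(obj_left - default) <= abs(obj_right - default):
        return max(0, obj_left)            # seam left of the object
    return min(overlap_width, obj_right)   # seam right of the object


def stitch(left_img: np.ndarray, right_img: np.ndarray, seam: int) -> np.ndarray:
    """Join two equally sized, pre-warped overlap crops along a vertical
    seam; regions beyond the joint line are discarded before assembly."""
    return np.concatenate([left_img[:, :seam], right_img[:, seam:]], axis=1)
```

Evaluated once per frame, this corresponds to the shifted straight joint line of Fig. 5a; a polygonal seam along the object silhouette as in Fig. 5b would replace the single column by a per-row column index.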
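The concave, cylinder-section-like virtual projection surfaces with the user at the cylinder center can be illustrated with a small geometric sketch: each image column is mapped to a viewing angle and placed on a circle around the viewer. This is hypothetical Python for illustration; the radius, field of view and yaw are free parameters, not values from the disclosure.

```python
import math


def cylinder_point(column: int, width: int, radius: float,
                   fov_deg: float, yaw_deg: float) -> tuple:
    """Map an image column to a point on a concave cylindrical projection
    surface whose axis passes through the viewer (the vehicle).

    The column is converted to an angle inside the camera's horizontal
    field of view, offset by the camera's yaw relative to the vehicle,
    and placed on a circle of the given radius around the viewer, so
    that all three partial surfaces can share one common cylinder.
    """
    angle = math.radians(yaw_deg + (column / (width - 1) - 0.5) * fov_deg)
    return radius * math.sin(angle), radius * math.cos(angle)
```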

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a vehicle (10) comprising a driver assistance system, and to a method for dynamically combining partial images into a representation of a contiguous surrounding area (IV) of a vehicle (10), comprising the following steps: detecting a first partial area (I) and a second partial area (II) of the surrounding area (IV) by means of environment sensors (1, 2); detecting at least one area (I/II) jointly detected by the environment sensors (1, 2) by means of a further sensor device (20), which can in particular be a three-dimensional sensor device; generating first and second partial images of the first (I) and second (II) partial areas respectively on the basis of the signals of the environment sensors (1, 2); dynamically defining a joint line (6) between the first and second partial images along a representation of an object (11) in the first and/or second partial image, taking into account the signals of the sensor device (20); and joining the first and second partial images along the first joint line (6).
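The assembly steps summarized in the abstract, extended by the optional third sensor described in the description, can be condensed into one compositing routine. This is an illustrative sketch under the assumption that all partial images have already been warped into a common panorama frame of identical shape; all names are hypothetical.

```python
import numpy as np


def compose_three(left: np.ndarray, centre: np.ndarray, right: np.ndarray,
                  seam1: int, seam2: int) -> np.ndarray:
    """Assemble three pre-warped partial images of identical shape into
    a panorama along two joint lines.

    Regions lying beyond a joint line are removed before assembly: the
    left image contributes columns [0, seam1), the centre image
    [seam1, seam2), and the right image [seam2, end). Mirroring the
    result then yields the representation shown on the in-vehicle
    screen.
    """
    assert left.shape == centre.shape == right.shape
    assert 0 <= seam1 <= seam2 <= left.shape[1]
    return np.concatenate([left[:, :seam1],
                           centre[:, seam1:seam2],
                           right[:, seam2:]], axis=1)
```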
PCT/EP2018/076299 2017-10-09 2018-09-27 Combinaison dynamique des images partielles en une représentation d'un environnement d'un véhicule Ceased WO2019072579A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017217870.1A DE102017217870B4 (de) 2017-10-09 2017-10-09 Verfahren zum Zusammenfügen von Teilbildern zu einem Abbild eines zusammenhängenden Umgebungsbereiches eines Fahrzeugs sowie Fahrerassistenzsystem und Fahrzeug
DE102017217870.1 2017-10-09

Publications (1)

Publication Number Publication Date
WO2019072579A1 true WO2019072579A1 (fr) 2019-04-18

Family

ID=63713871

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/076299 Ceased WO2019072579A1 (fr) 2017-10-09 2018-09-27 Combinaison dynamique des images partielles en une représentation d'un environnement d'un véhicule

Country Status (2)

Country Link
DE (1) DE102017217870B4 (fr)
WO (1) WO2019072579A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020215198A1 (fr) * 2019-04-23 2020-10-29 深圳市大疆创新科技有限公司 Procédé, appareil et dispositif de traitement de données et plate-forme mobile
DE102022125210A1 (de) 2022-09-29 2024-04-04 Bayerische Motoren Werke Aktiengesellschaft Zusammenfügen von Teilbildern eines Fahrzeugumfelds
DE102024107630A1 (de) * 2024-03-18 2025-09-18 Motherson Innovations Company Limited Verfahren, Anordnung und Datenverarbeitungsprogrammprodukt zur Erweiterung des Sichtfeldes einer kamerabasierten Außensichtbaugruppe für Straßenfahrzeuge

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999015360A1 (fr) 1997-09-19 1999-04-01 Robert Bosch Gmbh Dispositif visuel pour un vehicule
DE102013211271A1 (de) * 2013-06-17 2014-12-18 Robert Bosch Gmbh System und Verfahren zum Zusammenfügen mittels mehrerer optischer Sensoren aufgenommener Bilder
DE102014213536A1 (de) 2014-07-11 2016-01-14 Bayerische Motoren Werke Aktiengesellschaft Zusammenfügen von Teilbildern zu einem Abbild einer Umgebung eines Fortbewegungsmittels
DE102015204213A1 (de) * 2015-03-10 2016-09-15 Robert Bosch Gmbh Verfahren zum Zusammensetzen von zwei Bildern einer Fahrzeugumgebung eines Fahrzeuges und entsprechende Vorrichtung

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012018326B4 (de) * 2012-09-15 2019-12-19 Zf Friedrichshafen Ag Verfahren und Vorrichtung für ein bildgebendes Fahrerassistenzsystem mit verdeckungsfreier Umsichtfunktion


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113393407A (zh) * 2020-03-12 2021-09-14 平湖莱顿光学仪器制造有限公司 一种获取试样的显微图像信息的方法与设备
CN117455792A (zh) * 2023-12-25 2024-01-26 武汉车凌智联科技有限公司 一种车辆内置360度全景影像合成处理方法
CN117455792B (zh) * 2023-12-25 2024-03-22 武汉车凌智联科技有限公司 一种车辆内置360度全景影像合成处理方法

Also Published As

Publication number Publication date
DE102017217870A1 (de) 2019-04-11
DE102017217870B4 (de) 2023-09-28

Similar Documents

Publication Publication Date Title
DE102005000739B4 (de) Fahrzeug-Sichtunterstützungssystem
DE19923964C2 (de) Umgebungs-Überwachungsgerät für ein Fahrzeug
EP2623374B1 (fr) Système de vision pour véhicules utilitaires destiné à la représentation de champs de vision règlementaires d'un rétroviseur principal et d'un rétroviseur grand angle
DE102012025322B4 (de) Kraftfahrzeug mit Kamera-Monitor-System
EP3512739B1 (fr) Procédé permettant de produire une vue dans un rétroviseur de l'environnement d'un véhicule
WO2019072579A1 (fr) Combinaison dynamique des images partielles en une représentation d'un environnement d'un véhicule
DE102006003538B3 (de) Verfahren zum Zusammenfügen mehrerer Bildaufnahmen zu einem Gesamtbild in der Vogelperspektive
DE102015008042B3 (de) Anzeigeeinrichtung für Fahrzeuge, insbesondere Nutzfahrzeuge
DE102014213536B4 (de) Verfahren zum Zusammenfügen von Teilbildern zu einem Abbild eines zusammenhängenden Umgebungsbereiches eines Fortbewegungsmittels, Fahrerassistenzsystem und Fortbewegungsmittel
DE102014115037A1 (de) Sichtbasierte Objekterkennung und -hervorhebung bei Fahrzeugbildanzeigesystemen
DE102014018040A1 (de) Sichtsystem
DE102013002111A1 (de) Sichtsystem für Fahrzeuge, insbesondere Nutzfahrzeuge
DE102012012501B4 (de) Kamerasystem für ein Kraftfahrzeug
EP3292535B1 (fr) Procédé de production d'une image complète d'un environnement d'un véhicule et d'un dispositif correspondant
DE102018100211A1 (de) Verfahren zum Erzeugen einer Darstellung einer Umgebung durch Verschieben einer virtuellen Kamera in Richtung eines Innenspiegels eines Fahrzeugs; sowie Kameraeinrichtung
DE102013224954A1 (de) Verfahren und Vorrichtung zum Erzeugen einer Warnung mittels zweier durch Kameras erfasster Bilder einer Fahrzeugumgebung
EP2500216A1 (fr) Procédé et dispositif pour un système d'assistance au conducteur produisant des images
EP3106349B1 (fr) Système de vision pour véhicule utilitaire destiné à la représentation de champs de vision règlementaires d'un rétroviseur principal et d'un rétroviseur grand angle
DE102022114245B3 (de) Kamera-Monitor-System für ein Fahrzeug und Verfahren zur Steuerung eines solchen Kamera-Monitor-Systems
DE10016184A1 (de) Vorrichtung zur Anzeige der Umgebung eines Fahrzeugs
DE102017206175A1 (de) Surround-View-System für ein Fahrzeug
DE102006037600A1 (de) Verfahren zur auflösungsabhängigen Darstellung der Umgebung eines Kraftfahrzeugs
DE102011005368A1 (de) Fahrerassistenzsystem für ein Fahrzeug mit einem angereicherten Videobild in einem Anzeigemittel
DE102015113039A1 (de) Virtuelles Außenspiegelsystem eines Kraftfahrzeugs und Kraftfahrzeug
DE102017009935B4 (de) Außenspiegel für ein Fahrzeug

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18779639

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18779639

Country of ref document: EP

Kind code of ref document: A1