
WO2008138670A1 - Procédé d'affichage d'images vidéo, et système vidéo correspondant - Google Patents

Procédé d'affichage d'images vidéo, et système vidéo correspondant

Info

Publication number
WO2008138670A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
objects
video
transport
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2008/053471
Other languages
German (de)
English (en)
Inventor
Reinhard Meschenmoser
Dirk Luedtke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to EP08718162A priority Critical patent/EP2145157A1/fr
Publication of WO2008138670A1 publication Critical patent/WO2008138670A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3647 Guidance involving output of stored or live camera images or video streams
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3602 Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera

Definitions

  • The present invention relates to a method for displaying video images in a means of transport and to a video system for carrying out the method.
  • Such displays have become more and more accurate and realistic in recent years, made possible by the increasing performance of the microprocessors used in these systems for the graphics calculations.
  • Navigation systems for motor vehicles are already known which offer three-dimensional or perspective representations of inner cities. For example, this already allows a virtual drive-through of a calculated route during trip preparation, in order to prevent possible confusion of the driver in difficult driving situations.
  • For this, the graphic representation of the virtual drive-through must convey a realistic impression.
  • In aircraft, passengers are shown the current aircraft position and other information about the flight, such as the altitude, the flight speed, the temperature and/or the elapsed or remaining flight time.
  • The aircraft position is visualized not only by an indication of location coordinates, but often by arranging an aircraft symbol on a displayed map of the flight area.
  • When visualizing in the multimedia systems of the aircraft, the passengers expect map presentations whose elements can be recognized easily and quickly when looking out the window. More accurate and realistic map representations are therefore required.
  • The following method steps are provided: acquisition of image data by a video camera arranged in the means of transport, determination of current position data of the means of transport, determination of object data from a database, and linking of the acquired image data, the determined position data and the determined object data in such a way that display data are generated in which additional information from the database is inserted into the video display.
  • From the database, object data are determined for objects that are shown in the corresponding video image.
  • A suitable combination of the object data then enables the generation of the display data.
  • The essence of the invention is therefore not, as before, to improve the graphical representation of a virtual image with respect to resolution and realistic appearance and to insert additional information into that virtual image, but to integrate the additional, graphically generated information into a real video image.
  • The displayed video image is thus a combination of real images with graphic image components.
  • The method is performed in real time. This allows the viewer of the video display to receive additional information about the objects he currently sees outside the means of transport.
  • The object data are determined for objects which are located in the detection space of the video camera, i.e. in the space in front of the video camera that it can observe (a minimal sketch of such a frustum test is given after this list).
  • The detection space can also be called the "view frustum".
  • A graphical analysis of the image data is performed in order to recognize imaged objects.
  • Using graphical analysis techniques such as edge analysis, objects present in the video image can be identified.
  • The determined objects and the recognized objects are compared with each other and, depending on the comparison result, assigned to each other. Since several objects can come into question for an assignment, the comparison is preferably carried out on the basis of the position data of the determined and the recognized objects (see the matching sketch after this list).
  • The shape, size and/or color of the determined and recognized objects are also taken into account in the comparison.
  • Possible occlusion by other objects can additionally be taken into account.
  • The currently determined position data of the means of transport are corrected, since a higher accuracy can be achieved by comparing the geographical data from the database with the position data from the sensors.
  • This makes it possible to recalculate the position data, in particular if these were originally determined by a satellite-based location system, for example GPS and/or Galileo, which usually involves a certain degree of inaccuracy.
  • The video system 1 has at least one video camera 2, which is arranged in a means of transport (not shown) and captures an image of the area outside the means of transport.
  • The camera 2 may be permanently installed, i.e. immovably mounted, so that the area covered by the camera, the view frustum, is fixed with respect to the orientation of the means of transport.
  • Alternatively, additional sensors may be provided on the video camera 2 which detect the viewing direction of the video camera 2 with respect to the orientation of the means of transport.
  • The current position data, with information about the location and the orientation of the means of transport, are determined by means of position determination sensors 3.
  • The position determination sensors 3 may be sensors for a satellite-based location system, for example GPS (Global Positioning System) and/or the European Galileo system.
  • In addition, speed sensors, distance sensors and/or curve sensors can be provided, with which the current geographical position and the orientation of the means of transport are determined.
  • The video system 1 further has a memory 4 in which a database with object data is stored; the object data comprise geographical and topographical as well as any further suitable information.
  • The image data acquired with the video camera 2, the position data determined by the position sensors 3 and the object data retrievable from the memory 4 are made available to a data processing unit 5, for example a suitable microprocessor.
  • The data transmission from the video camera 2, the position sensors 3 and the memory 4 to the data processing unit 5 is indicated in the figure by corresponding arrows.
  • The actual data processing is carried out within the data processing unit 5 by a plurality of modules.
  • In module 6, the current position and orientation of the means of transport is first calculated on the basis of the position data received from the position sensors 3.
  • In module 7, the video image of the video camera 2 is evaluated. This can be done by graphical analysis methods, such as edge analysis, with which objects present in the video image are identified. With the aid of the previously calculated position of the means of transport and knowledge of the orientation and viewing direction of the video camera, a position of the identified objects is determined.
  • This position determination can only be carried out with a certain accuracy.
  • However, the direction from the means of transport to a recognized object can be determined with sufficient accuracy.
  • At this stage, only a rough approximation of the object position is required.
  • In module 8, the objects detected in the video image are compared with objects for which corresponding object data are stored in the database.
  • The starting point for this comparison is the object position determined in module 7, or the indication of the direction in which the object lies relative to the means of transport.
  • In the database, those objects are searched for and determined which lie in the vicinity of the determined position data and thus come into question as candidates for an assignment.
  • In this step, a correction of the position determination calculated by means of the sensor data can furthermore be carried out (see the position-fix sketch after this list).
  • From the video image, the direction in which detected objects are located in front of the means of transport can be determined. From this direction indication and the exact positions of the objects, which can be read out of the database stored in the memory 4, the current position of the means of transport can be deduced.
  • Finally, the display data are calculated, i.e. the additional information from the database is inserted into the video image provided by the video camera 2 (a sketch of such a label placement is given after this list).
  • Here, the assignment of objects in the video image to objects in the database, which took place in the previous step in module 8, plays a decisive role.
  • The assigned objects, such as roads, bodies of water, localities, buildings and/or points of interest, can be labeled with their names or designations. Further additional information can also be displayed.
  • Objects not recognized in the video image can be labeled as well, because their position relative to identified objects follows from the geographic database. This can be useful, for example, when objects are partially obscured by clouds and/or fog.
  • Furthermore, direction indicators can be drawn into the video image on assigned roads. Finally, it is also possible to trace edges of objects that are not or only poorly recognizable.
  • Such display data are reproduced on a monitor 10; the multimedia system generally includes a plurality of such monitors 10. In applications with only one monitor 10, this may be, for example, a TFT screen.
  • If input devices are present, for example a touch screen, interactive output is possible. The user can then retrieve additional information by selecting a displayed object and have it shown on the monitor.
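
To make the geometric step concrete, the following is a minimal, hypothetical sketch of restricting the object data to objects inside the camera's view frustum, as described above. All names (Pose, MapObject, objects_in_frustum) and the flat-earth range/bearing approximation are illustrative assumptions, not the patent's implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    lat: float        # degrees
    lon: float        # degrees
    heading: float    # degrees, clockwise from north


@dataclass
class MapObject:
    name: str
    lat: float
    lon: float


def range_and_bearing(pose: Pose, obj: MapObject):
    """Flat-earth approximation: distance in metres and bearing in degrees."""
    metres_per_deg = 111_320.0
    dx = (obj.lon - pose.lon) * metres_per_deg * math.cos(math.radians(pose.lat))
    dy = (obj.lat - pose.lat) * metres_per_deg
    return math.hypot(dx, dy), math.degrees(math.atan2(dx, dy)) % 360.0


def objects_in_frustum(pose: Pose, database, fov_deg=60.0, max_range_m=3000.0):
    """Keep only database objects lying inside the camera's horizontal view frustum."""
    visible = []
    for obj in database:
        rng, bearing = range_and_bearing(pose, obj)
        rel = ((bearing - pose.heading + 180.0) % 360.0) - 180.0   # -180..+180
        if rng <= max_range_m and abs(rel) <= fov_deg / 2.0:
            visible.append(obj)
    return visible


if __name__ == "__main__":
    pose = Pose(lat=48.137, lon=11.575, heading=90.0)              # looking east
    db = [MapObject("Tower", 48.137, 11.585), MapObject("Lake", 48.147, 11.565)]
    print([o.name for o in objects_in_frustum(pose, db)])          # ['Tower']
```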
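The assignment of recognized image objects to determined database objects can be illustrated with a simple greedy nearest-bearing matcher. A real system would, as described above, also weigh shape, size, colour and possible occlusion; the function and identifiers below are hypothetical and serve only to make the comparison step tangible.

```python
def match_objects(detected, candidates, max_diff_deg=3.0):
    """Greedily assign each detected image object (id -> bearing in degrees) to the
    database candidate (name -> predicted bearing) whose bearing is closest,
    provided the angular difference stays below a threshold."""
    assignment = {}
    free = dict(candidates)
    for det_id, det_bearing in sorted(detected.items()):
        best_name, best_diff = None, max_diff_deg
        for name, cand_bearing in free.items():
            diff = abs(((det_bearing - cand_bearing + 180.0) % 360.0) - 180.0)
            if diff <= best_diff:
                best_name, best_diff = name, diff
        if best_name is not None:
            assignment[det_id] = best_name
            del free[best_name]            # each database object is assigned once
    return assignment


if __name__ == "__main__":
    detected = {"blob_0": 88.5, "blob_1": 101.2}       # bearings from edge analysis
    predicted = {"Tower": 90.0, "Church": 100.5, "Bridge": 140.0}
    print(match_objects(detected, predicted))          # {'blob_0': 'Tower', 'blob_1': 'Church'}
```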
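The position correction mentioned above can be sketched as a least-squares fix from the directions measured towards objects whose exact positions are known from the database. This is one possible formulation under assumed local metric coordinates; the patent does not prescribe a particular algorithm.

```python
import math


def correct_position(observations):
    """Least-squares position fix from bearings towards landmarks with known positions.

    observations: iterable of (east_m, north_m, bearing_deg) triples, where
    (east_m, north_m) is the landmark position from the database in a local
    metric frame and bearing_deg the direction measured towards it.
    Returns the estimated observer position (east_m, north_m)."""
    # Each bearing defines a line of position n . x = n . p, with n normal to
    # the bearing direction; accumulate the 2x2 normal equations and solve.
    a11 = a12 = a22 = b1 = b2 = 0.0
    for east, north, bearing_deg in observations:
        theta = math.radians(bearing_deg)
        nx, ny = math.cos(theta), -math.sin(theta)
        rhs = nx * east + ny * north
        a11 += nx * nx
        a12 += nx * ny
        a22 += ny * ny
        b1 += nx * rhs
        b2 += ny * rhs
    det = a11 * a22 - a12 * a12
    if abs(det) < 1e-9:
        raise ValueError("bearings are (nearly) parallel; no unique fix")
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det


if __name__ == "__main__":
    # Observer actually at (0, 0): a landmark 1000 m to the east is seen at 90 deg,
    # one 1000 m to the north at 0 deg; the fix recovers (0, 0) up to rounding.
    print(correct_position([(1000.0, 0.0, 90.0), (0.0, 1000.0, 0.0)]))
```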
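Finally, inserting labels into the video image amounts to projecting the assigned objects into image coordinates. The sketch below assumes a simple pinhole model and computes only the horizontal placement; drawing the text or direction indicators into the frame would then be done with any image library.

```python
import math


def label_positions(heading_deg, linked_objects, image_width_px=1280, fov_deg=60.0):
    """Horizontal pixel position at which each assigned object's name would be
    drawn into the video image (simple pinhole model, horizontal axis only).

    linked_objects: iterable of (name, bearing_deg) pairs for objects that were
    assigned to database entries in the matching step."""
    focal_px = (image_width_px / 2.0) / math.tan(math.radians(fov_deg / 2.0))
    placements = []
    for name, bearing_deg in linked_objects:
        rel = ((bearing_deg - heading_deg + 180.0) % 360.0) - 180.0
        if abs(rel) > fov_deg / 2.0:
            continue                                   # outside the view frustum
        x = int(round(image_width_px / 2.0 + focal_px * math.tan(math.radians(rel))))
        placements.append((name, x))
    return placements


if __name__ == "__main__":
    # Heading east (90 deg): the tower dead ahead lands at the image centre.
    print(label_positions(90.0, [("Tower", 90.0), ("Church", 100.0)]))
    # [('Tower', 640), ('Church', 835)]
```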

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Instructional Devices (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a method for displaying video images in a means of transport, the method comprising the following steps: acquisition of image data by a video camera (2) mounted in the means of transport; determination of the current position data of the means of transport; determination of object data from a database; and linking of the acquired image data, the determined position data and the determined object data so as to generate display data in which additional information from the database is inserted into the video image display. The invention further relates to a video system (1) for carrying out said method.
PCT/EP2008/053471 2007-05-14 2008-03-25 Procédé d'affichage d'images vidéo, et système vidéo correspondant Ceased WO2008138670A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP08718162A EP2145157A1 (fr) 2007-05-14 2008-03-25 Procédé d'affichage d'images vidéo, et système vidéo correspondant

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102007022588A DE102007022588A1 (de) 2007-05-14 2007-05-14 Verfahren zur Anzeige von Videobildern und Videosystemen
DE102007022588.3 2007-05-14

Publications (1)

Publication Number Publication Date
WO2008138670A1 (fr) 2008-11-20

Family

ID=39434145

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/053471 Ceased WO2008138670A1 (fr) 2007-05-14 2008-03-25 Procédé d'affichage d'images vidéo, et système vidéo correspondant

Country Status (3)

Country Link
EP (1) EP2145157A1 (fr)
DE (1) DE102007022588A1 (fr)
WO (1) WO2008138670A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2185894A4 (fr) * 2007-08-07 2012-05-02 Hewlett Packard Development Co Affichage de données d'image et de données d'élément géographique
US9329052B2 (en) 2007-08-07 2016-05-03 Qualcomm Incorporated Displaying image data and geographic element data

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011084596A1 (de) 2011-10-17 2013-04-18 Robert Bosch Gmbh Verfahren zum Assistieren eines Fahrers in einer fremden Umgebung
DE102015226178A1 (de) 2015-12-21 2017-06-22 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Mobilgerät zur Anzeige einer geografischen Bereichsdarstellung

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0867690A1 (fr) * 1997-03-27 1998-09-30 Nippon Telegraph And Telephone Corporation Dispositif et système d'étiquetage des images digitales
US6208353B1 (en) * 1997-09-05 2001-03-27 ECOLE POLYTECHNIQUE FEDéRALE DE LAUSANNE Automated cartographic annotation of digital images
WO2005038402A1 (fr) * 2003-10-21 2005-04-28 Waro Iwane Dispositif de navigation
US20060195858A1 (en) * 2004-04-15 2006-08-31 Yusuke Takahashi Video object recognition device and recognition method, video annotation giving device and giving method, and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005124594A1 (fr) * 2004-06-16 2005-12-29 Koninklijke Philips Electronics, N.V. Etiquetage automatique en temps reel de points superposes et d'objets d'interet dans une image visualisee

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0867690A1 (fr) * 1997-03-27 1998-09-30 Nippon Telegraph And Telephone Corporation Dispositif et système d'étiquetage des images digitales
US6208353B1 (en) * 1997-09-05 2001-03-27 ECOLE POLYTECHNIQUE FEDéRALE DE LAUSANNE Automated cartographic annotation of digital images
WO2005038402A1 (fr) * 2003-10-21 2005-04-28 Waro Iwane Dispositif de navigation
US20060195858A1 (en) * 2004-04-15 2006-08-31 Yusuke Takahashi Video object recognition device and recognition method, video annotation giving device and giving method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2145157A1 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2185894A4 (fr) * 2007-08-07 2012-05-02 Hewlett Packard Development Co Affichage de données d'image et de données d'élément géographique
US8994851B2 (en) 2007-08-07 2015-03-31 Qualcomm Incorporated Displaying image data and geographic element data
US9329052B2 (en) 2007-08-07 2016-05-03 Qualcomm Incorporated Displaying image data and geographic element data

Also Published As

Publication number Publication date
DE102007022588A1 (de) 2008-11-27
EP2145157A1 (fr) 2010-01-20

Similar Documents

Publication Publication Date Title
EP2769373B1 (fr) Transfert de données de services cartographiques à base de données d'images dans un système d'aide à la conduite
DE102016117659B4 (de) Fahrunterstützungseinrichtung
DE112008003424B4 (de) Navigationsgerät, das Videobilder einer Kamera verwendet
DE102010042063A1 (de) Verfahren und Vorrichtung zum Bestimmen von aufbereiteten Bilddaten über ein Umfeld eines Fahrzeugs
DE10138719A1 (de) Verfahren und Vorrichtung zur Darstellung von Fahrhinweisen, insbesondere in Auto-Navigationssystemen
DE102013114928B4 (de) Vorrichtung und Verfahren zum Verarbeiten von Straßendaten
DE202005021607U1 (de) Navigationsvorrichtung mit Kamerainformation
DE69815940T2 (de) Verfahren und Anordnung zur Informationsdarstellung in Form einer Landkarte für Fahrzeugsnavigationsgeräte
WO2009149960A1 (fr) Procédé de sortie combinée d'une image et d'une information locale, et véhicule à moteur associé
DE102010007091A1 (de) Verfahren zur Positionsermittlung für ein Kraftfahrzeug
DE102017208854B4 (de) Verfahren, Vorrichtungen und computerlesbares Speichermedium mit Instruktionen zum Ermitteln von geltenden Verkehrsregeln für ein Kraftfahrzeug
DE102010003851A1 (de) Verfahren und Informationssystem zum Markieren eines Zielorts für ein Fahrzeug
DE60121944T2 (de) Verfahren und vorrichtung zum anzeigen von navigationsinformationen im echtzeitbetrieb
DE102017211613A1 (de) Verfahren zur Verifizierung einer digitalen Karte eines höher automatisierten Fahrzeugs (HAF), insbesondere eines hochautomatisierten Fahrzeugs
WO2021110412A1 (fr) Procédé d'affichage destiné à afficher un modèle d'environnement d'un véhicule, programme informatique, dispositif de commande et véhicule
EP2145157A1 (fr) Procédé d'affichage d'images vidéo, et système vidéo correspondant
WO2013056954A1 (fr) Procédé pour assister un conducteur dans un environnement non familier
EP2813999A2 (fr) Système à réalité augmentée et procédé de production et d'affichage de représentations d'objet à réalité augmentée pour un véhicule
DE102008043756B4 (de) Verfahren und Steuergerät zum Bereitstellen einer Verkehrszeicheninformation
EP1283406A2 (fr) Dispositif et méthode de traitement d'images pour un véhicule
EP1832848B1 (fr) Procédé et dispositif destinés à l'affichage d'une section de carte numérique
DE102017215868A1 (de) Verfahren und Vorrichtung zum Erstellen einer Karte
DE102010042314A1 (de) Verfahren zur Ortsbestimmung mit einem Navigationssystem und Navigationssystem hierzu
DE102020102278A1 (de) Verfahren zur Navigationsunterstützung in einem Kraftfahrzeug auf Grundlage von Informationen aus mehreren erfassenden Kraftfahrzeugen
DE102018121274B4 (de) Verfahren zum Visualisieren einer Fahrabsicht, Computerprogrammprodukt und Visualisierungssystem

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08718162

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008718162

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010507863

Country of ref document: JP