WO2007090660A1 - Procede et dispositif pour l'affichage d'informations géodépendantes dans une representation ou une vue visuelle d'une scene - Google Patents
- Publication number
- WO2007090660A1 (PCT/EP2007/001108)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scene
- measuring system
- distance measuring
- distance
- radiation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/51—Display arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- the invention relates to a method and an arrangement for inserting location-related information into a visual representation or view of a scene.
- the invention relates in particular to the technical field of augmented reality (AR).
- AR (augmented reality) is intended as a new form of interaction between humans and machines.
- information can be faded into the user's field of view so that it appears in the correct position within the observed scene.
- the information is linked to the scene by location.
- the information can relate to portions of the scene, such as individual components of a machine.
- a service technician or fitter, for example, is supported by having information about the individual work steps faded in.
- a particular problem of AR applications lies in the so-called tracking, i.e. following the movement of the person or of the presentation device that visually displays the faded-in location-related information.
- the presentation device may be, for example, partially transparent data glasses or a portable display (e.g. a hand-held device).
- tracking is required so that the location-related information can be displayed in the correct position as a function of the current position and orientation (in particular of the user's line of sight or of the orientation of the display).
- the user and/or the display device should be able to move, and the location-related information should nevertheless be displayed in the correct position.
- an AR arrangement may therefore include a camera that is connected to the display device so that the two cannot move significantly relative to each other during use.
- for example, a small, portable computer unit can be provided, which processes the signals recorded by the camera and controls the fading-in of the information by the display device.
- depending on the application, the presentation device can be transparent (for example data glasses through which the user views the scene) or can display, in addition to the location-related information, also an image of the scene itself (for example a portable display).
- such systems, in which the camera is connected to the display device, represent an advance over earlier tracking systems that used stationary cameras or other stationary sensors to track the movement of the display device.
- systems with moving cameras generally allow a much larger radius of action and are less dependent on the action space in which the application runs being specially prepared.
- systems with moving cameras are currently not independent of their environment.
- special markers are placed in the environment that are captured by the camera to enable tracking.
- Pure image-based tracking methods that manage without markers in the scene and only calculate position and orientation in space from the acquired two-dimensional image are still the subject of research.
- the camera position and orientation calculation is usually still numerically unstable.
- the results are noise sensitive and often inaccurate in identifying and locating image features of the scene.
- a specially designed camera can be used to measure the distance.
- a single camera is sufficient.
- the use of a stereotactic arrangement with at least two cameras is not required.
- the elements of the scene are, for example, areas that correspond to individual pixels of a digital camera or of an image recorded by the camera, or areas that correspond to several pixels of the image or of the camera.
- the image data provided by the camera already contain (locally) three-dimensional position information about the scene. With this information, it is possible to determine the current position and orientation of the camera and / or of an object connected to the camera (in particular of the display device for the visual presentation of the displayed location-related information).
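Purely as an illustration of this point (not part of the patent): a minimal sketch of turning such per-pixel distance values into local 3D coordinates, assuming a simple pinhole model in which the measured value approximates the depth along the optical axis; the function name, intrinsic parameters and array shapes are illustrative assumptions.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a per-pixel distance image into 3D coordinates in the camera frame.

    depth  : (H, W) array, distance of each scene element in metres
             (treated here as depth along the optical axis for simplicity)
    fx, fy : assumed focal lengths in pixels
    cx, cy : assumed principal point in pixels
    Returns an (H, W, 3) array of X, Y, Z points.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack((x, y, depth), axis=-1)

# illustrative call with made-up values: a 120x160 image that is 1.5 m away everywhere
points = depth_to_points(np.full((120, 160), 1.5), fx=200.0, fy=200.0, cx=80.0, cy=60.0)
```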
- a so-called time-of-flight camera can be used for this purpose.
- Such cameras are known for example from US 2004/0008394 A1. The content of this US patent application is hereby incorporated by reference in its entirety into the present description as an exemplary embodiment of a time-of-flight camera.
- the time-of-flight camera may be configured as described in paragraphs 78 to 90 on page 6 of the US application.
- the depth information, i.e. the information about the distance to the elements of the scene, can be determined from the image data obtained by the time-of-flight camera together with additional information about the time dependence of the radiation used to illuminate the scene, as described in paragraphs 95, 96 and 97 of the US application.
- a method is proposed for inserting location-related information into a visual representation or view of a scene, in particular an augmented reality method, wherein
- the scene is irradiated with electromagnetic radiation so that the scene reflects at least a portion of the electromagnetic radiation in the direction of the distance measuring system, the distance measuring system repeatedly records images of the radiation reflected by the scene, and from the images information about a distance of a plurality of elements of the scene is obtained in each case; from the distance information, a position and orientation of the distance measuring system and/or of the presentation device relative to the scene is determined.
- the display of the location-related information in the visual representation or view of the scene can be done in any way.
- the techniques common for fading information can be used.
- the location-related information can be displayed directly through the display.
- the location-related information can be superimposed on the view of the scene, i.e. the scene and the superimposed location-related information together form the image information that is displayed on the screen.
- alternatively, the user views the scene through the semitransparent display device.
- presentation devices, for example display devices onto which the view of the scene is projected, can also be used.
- the location-related information can in each case be displayed at the respective correct position relative to the visual representation or view of the scene, this position depending on the current position and orientation of the viewer's eyes and/or on the current position and orientation of the presentation device relative to the scene.
- the distance measuring system (in particular a time-of-flight camera) is connected to the presentation device, so that at least during a partial period of use no change in their relative position and relative orientation takes place.
- the scene is actively lit.
- electromagnetic radiation can be radiated onto the scene, wherein a radiation flux density of the radiation emitted by the distance measuring system is varied, in particular periodically varied.
- Frequencies of this modulation of the radiation used for the illumination are for example in the range of 15 to 25 MHz.
- correspondingly, the radiation reflected by the scene varies as well. From this, the distance can be determined, in particular from a phase and/or intensity of the radiation received by the distance measuring system. Evaluating the phase is in itself already sufficient. However, evaluating the strength of the received reflected radiation can provide additional information, since the strength or radiation flux density decreases with the square of the distance between the scene and the distance measuring system. The relation between phase and distance is summarized below.
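As an aside, the standard time-of-flight relations behind this (textbook material, not quoted from the application): the phase shift Δφ of the received modulated radiation corresponds to the round-trip travel time, so

$$ d = \frac{c\,\Delta\varphi}{4\pi f_{\mathrm{mod}}}, \qquad d_{\max} = \frac{c}{2 f_{\mathrm{mod}}} = \frac{3\times 10^{8}\,\mathrm{m/s}}{2\cdot 20\times 10^{6}\,\mathrm{Hz}} = 7.5\ \mathrm{m}, $$

where c is the speed of light and f_mod the modulation frequency; with e.g. 20 MHz (within the 15 to 25 MHz range mentioned above) the unambiguous measuring range d_max is 7.5 m, because the phase repeats every 2π.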
- the acquisition of the distance information and the determination of the position and orientation are repeated continuously, so that the fading of the location-related information in real time or quasi in real time according to a current position and orientation is possible.
- the user can carry a computer of the distance measuring system. The user is therefore mobile and can therefore take almost any position relative to the scene.
- the computer can also be stationary and the data of the camera (in particular wireless) can be transmitted to the computer.
- the computer supplies the data necessary for the position-correct presentation of the location-related information back to the presentation device which the user carries or which is arranged in the area of the user.
- a three-dimensional model of the scene or a part of the scene can be used.
- Such a three-dimensional model can be created by means known per se, for example from the data of a CAD system with which one or more objects of the scene were constructed.
- the distance measuring system can be used during an initialization phase to obtain a three-dimensional model of the scene.
- an arrangement for fading location-related information into a visual representation or view of a scene is also proposed, comprising: a presentation device configured for the visual presentation of at least the faded-in information, and a distance measuring system connected to the presentation device,
- wherein the distance measuring system comprises an illumination device for generating electromagnetic radiation that can be radiated onto the scene, wherein the distance measuring system comprises an image recording device for recording images of the scene, and wherein the distance measuring system comprises a distance determination device for determining a distance of a plurality of elements of the scene to the presentation device and/or to the image recording device.
- FIG. 1 schematically shows a scene viewed by a user
- FIG. 2 shows the arrangement of FIG. 1 and additionally a transparent presentation device and a time-of-flight camera connected to it,
- FIG. 3 schematically shows an arrangement with which location-related information can be displayed on the display device illustrated in FIG. 2, the representation of the location-related information depending on the position and orientation of the camera relative to the scene,
- FIG. 4 shows a first flowchart illustrating a method sequence for the positionally correct insertion of the location-related information, and FIG. 5 shows a further flowchart illustrating steps for the correct display of the location-related information.
- the scene 1 shown schematically in FIG. 1 has a plurality of elements 5, 6, 7, 8, 9, which can be connected to one another and are generally arranged in an immovable relative position to each other at least over a period of time.
- An example of a scene is the arrangement of objects in the engine compartment of a road vehicle.
- Some of the objects 5-9 of the scene 1 have a greater distance from the eye 3 of one viewer (user 2) than others.
- the distance between the eye 3 and the object 5 is greater than the distance between the eye 3 and the object 6.
- in the preferred embodiment of the invention described here with reference to FIGS. 1-5, not only distances to entire objects of the scene are taken into account, but also distances to partial areas of the objects that are visible to a time-of-flight camera 11 (FIG. 2). Which partial areas are taken into account depends solely on the image resolution of the camera and on which partial areas or regions of the objects 5 to 9 are visible to the camera 11 in its given instantaneous position.
- the arrangement shown schematically in Fig. 1 and Fig. 2 is not to scale. Rather, the distance between the user 2 and the scene 1 will vary in proportion to the extent of the scene 1 in practice.
- the time-of-flight camera 11 is connected to a presentation device 10, through which the eye 3 of the user 2 views the scene 1.
- on the presentation device 10, the location-related information can be displayed, so that the user 2 sees the location-related information in the correct position relative to the view of the scene.
- location-related means that the information is related to the location of an object or element of the scene. The location-relatedness thus exists e.g. in a coordinate system of the scene.
- a positionally correct representation of the location-related information means that, for a given relative position between the user (in particular the eye or eyes of the user) on the one hand and the scene on the other hand, and for a given orientation of the user's line of sight, the respective location-related information is displayed so that it appears to the user at, or assigned to, the correct place in the scene.
- the object 5 of the scene 1 may be assigned the color orange as additional color information in order to particularly highlight the object 5 compared to other objects of the scene 1, although the object 5 has a different color, eg gray.
- in this case, an area on the presentation device would be colored orange, so that the object 5 appears to the observer to have an orange color.
- other location-related information may also be displayed, for example text that is linked to an associated object of the scene via an arrow and informs the user that a specific action is to be performed in connection with the object.
- the position and the orientation of the presentation device 10 with respect to the scene 1 are now determined using the camera 11.
- the presentation device 10 is in turn in a fixed position in a coordinate system that moves with the eye of the user.
- data glasses are therefore particularly suitable as a display device.
- the (conventional) two-dimensional image generated by the time-of-flight camera can also be used for this purpose.
- the additional three-dimensional (depth) information of the time-of-flight camera is then used in addition for determining the relative position and orientation.
- the camera 11 shown in FIGS. 2 and 3 is, for example, designed as described in the above-mentioned publication by Oggier, Lehmann et al.
- FIG. 3 shows schematically that the camera 11 is connected to a lighting device 13 in order to actively illuminate the scene.
- preferably, the radiation emitted by the illumination device 13 lies in the visible wavelength range. However, this is not necessary in order to determine the distance or the depth information of the scene.
- the lighting device 13 and the camera 11 may be arranged in a common housing or otherwise firmly connected to each other.
- as FIG. 2 further shows, the distance to the various objects or regions of the scene 1 can be determined with the aid of the camera 11.
- the distance d1 to the object 5 and the distance d2 to the object 6 are shown.
- the difference in the viewing directions of the camera 11 and the eye 3 is exaggerated compared to practical embodiments.
- FIG. 3 also shows a computer 14, which controls the illumination device 13 and receives and evaluates the image signals recorded by the camera 11.
- the connection to the illumination device 13 serves, in particular, for the computer 14 to know the phase and frequency of the radiation emitted by the illumination device 13 for the evaluation of the image signals.
- the computer 14 may be arranged in a common housing with the camera 11.
- the computer is preferably designed as a separate component, as shown in FIG. 3.
- the determination of the depth information and / or the determination of the position and orientation with respect to the scene to be described below can be carried out by the computer.
- alternatively, a computer for determining the depth information from the image signals supplied by the camera and a second computer for determining the relative position and orientation may be provided.
- a further computer may be provided to calculate and / or control the correct position representation of the location-related information on the display device 10.
- the illumination device 13 generates visible light by means of an array with a plurality of light-emitting diodes whose radiation flux density is modulated at a frequency of 20 MHz.
- the camera receives the reflected light resolved for each pixel of the camera sensor, which uses CCD/CMOS technology.
- a 12-bit A/D converter of the camera digitizes each of the analog output signals of the sensor pixels, and the data are transmitted to an FPGA (field programmable gate array) of the computer. There, the signals are processed and stored in a RAM memory.
- the FPGA controls the entire camera. It calculates the phase shift (to determine the distance to the respective element of the scene) as well as the offset and the amplitude, in order to determine the intensity or radiation flux density at the receiver; a sketch of this kind of evaluation follows below.
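For illustration, a hedged sketch of the widely used four-sample ("four-bucket") demodulation that such a sensor/FPGA pipeline typically performs; the sample names a0..a3, the function and the sign convention are illustrative assumptions and are not taken from the application or from US 2004/0008394 A1.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def demodulate(a0, a1, a2, a3, f_mod=20e6):
    """Four-bucket demodulation of a modulated time-of-flight signal.

    a0..a3 : per-pixel samples taken at 0, 90, 180 and 270 degrees of the
             modulation period (scalars or numpy arrays)
    Returns phase shift [rad], amplitude, offset and distance [m].
    Note: sign conventions for the phase vary with the sampling order.
    """
    phase = np.arctan2(a3 - a1, a0 - a2) % (2 * np.pi)   # phase shift of the reflection
    amplitude = 0.5 * np.hypot(a3 - a1, a0 - a2)          # strength of the modulated signal
    offset = 0.25 * (a0 + a1 + a2 + a3)                   # background / DC level
    distance = C * phase / (4 * np.pi * f_mod)            # see the relation given above
    return phase, amplitude, offset, distance
```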
- the depth determination can be carried out as described in US 2004/0008394 A1.
- for further details of the depth determination, reference is made to this document.
- in step S1, the three-dimensional image information about the scene (classical two-dimensional image information with additional depth information) is determined.
- three-dimensional data D of a model of the scene are also used.
- in step S2, an alignment of the 3D data obtained by the camera with the 3D data from the model takes place.
- the area of the 3D model that corresponds to the area covered by the camera is determined.
- in step S3, which follows step S2, the current position and orientation of the camera, and thus also of the display device, are determined from the two 3D data sets.
- for this purpose, the transformation matrix is determined with which one of the two data sets (in particular the data set of the model) can be transformed into the coordinate system of the other data set.
- this alignment is commonly referred to as registration.
- an ICP (iterative closest point) method is used.
- a preferred embodiment of the ICP method is disclosed in the publication by Paul J. Besl and Neil D. McKay, "A Method for Registration of 3-D Shapes", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 14, No. 2, February 1992, pages 239 to 256.
- the content of this publication, in particular section III (beginning on page 241) up to the end of section IV (page 246), is hereby incorporated in full in this description.
- the ICP method is, as the name implies, an iterative method that is used to optimize the registration of 3D data sets. As a result, a registration is obtained in which the deviation between the transformed first 3D data set and the second 3D data set is minimal; a sketch of such a procedure is given below.
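A minimal point-to-point ICP sketch in the spirit of Besl and McKay, using an SVD-based rigid alignment in each iteration; the nearest-neighbour search, the convergence criterion and all names are illustrative choices, not the implementation prescribed by the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares R, t with dst ≈ R @ src + t (Kabsch / SVD method)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(source, target, iterations=30, tol=1e-6):
    """Align the source cloud to the target cloud; returns cumulative R, t."""
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    src, prev_err = source.copy(), np.inf
    for _ in range(iterations):
        dist, idx = tree.query(src)                    # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                            # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:                  # stop once the mean error settles
            break
        prev_err = err
    return R_total, t_total
```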
- a filtering (smoothing) of the 3D data obtained by the camera is preferably carried out.
- the filtering serves, in particular, to compensate for (smooth out) statistical fluctuations in the sensor signals of the individual pixels of the camera; one possible variant is sketched below.
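One plausible way to do this smoothing, purely as an assumption (the patent only speaks of compensating statistical fluctuations): a small median filter over the distance image.

```python
from scipy.ndimage import median_filter

def smooth_depth(depth, size=3):
    """Suppress per-pixel noise in the distance image with a small median filter.

    size is the window size in pixels; both the filter type and the window
    size are illustrative choices, not taken from the patent.
    """
    return median_filter(depth, size=size)
```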
- step S3 is followed by step S4, in which the position of the location-related information to be displayed on the presentation device is calculated.
- for this purpose, the three-dimensional information from the model of the scene, which is now registered with the three-dimensional information obtained using the camera, is projected onto the two-dimensional representation plane of the presentation device 10.
- the projection can be done for selected objects, areas or parts thereof to which the location-related information is assigned.
- it can be determined in an optional additional step whether the object, the area or the part thereof to which the respective location-related information is assigned is hidden by other parts of the scene and is therefore not or only partially visible in the plane of the presentation device.
- it can then be decided not to display the location-related information for hidden parts of the scene; a sketch of such a visibility test is given below.
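A sketch (illustrative assumptions throughout: intrinsics, tolerance, names) of how a registered 3D anchor point could be projected onto the 2D plane of the presentation device and tested against the measured depth image to decide whether it is hidden:

```python
import numpy as np

def project_and_check(point_cam, depth_image, fx, fy, cx, cy, tol=0.05):
    """Project a 3D point given in camera coordinates and report its visibility.

    The point counts as occluded when the measured distance at its pixel is
    noticeably smaller than the point's own depth (tol in metres is an assumption).
    Returns (u, v, visible).
    """
    x, y, z = point_cam
    if z <= 0:
        return None, None, False                 # behind the camera
    u, v = int(round(fx * x / z + cx)), int(round(fy * y / z + cy))
    h, w = depth_image.shape
    if not (0 <= u < w and 0 <= v < h):
        return u, v, False                       # outside the field of view
    return u, v, bool(depth_image[v, u] >= z - tol)
```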
- in step S5, the location-related information is displayed.
- the method is one of many consecutive iterative procedures that are run during an AR application running over a period of time.
- the starting point for a method sequence is that registration of the data of the 3D model of the scene with the 3D data of the scene generated by the camera has already taken place for a past time.
- in step S11, starting from this, a 3D image of the scene is again acquired by the camera 11 for a new, later time, and the associated depth information is determined. Filtering can again be performed.
- in step S12, the comparison between the two 3D data sets and the determination of the transformation matrix take place again, again according to the ICP method.
- in step S12, however, the alignment or transformation is determined between the camera image present in the previous step and the camera image present in the current step. Since the position and orientation usually have not changed significantly between times that follow each other closely, the transformation can be determined with much less computational effort.
- this has the advantage that the correct transformation matrix is found with high certainty, and not merely a transformation matrix corresponding to a secondary solution, i.e. a local but not absolute minimum of the distances; the composition of such incremental transforms is sketched below.
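A sketch of the incremental update described here: the small frame-to-frame transform (e.g. from the ICP sketch above) is composed onto the previously known pose, so every registration starts from a good initial guess. The 4x4 homogeneous-matrix representation and the sample values are assumptions.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# transform from scene coordinates into the previous camera frame
# (from the initial model registration or from the last iteration)
T_prev = np.eye(4)

# small transform from the previous camera frame into the current one,
# e.g. obtained by registering the two consecutive depth point clouds
R_delta, t_delta = np.eye(3), np.array([0.01, 0.0, 0.0])   # illustrative values

# compose: the new pose differs from the old one only by the small increment
T_curr = to_homogeneous(R_delta, t_delta) @ T_prev
```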
- the position of the location-related information to be displayed is then determined (step S15), and the information is displayed in step S16.
- if a 3D model of the scene to be considered does not yet exist at the beginning of an application, such a model can be created using the time-of-flight camera and the associated further components of the arrangement (for example the arrangement described with reference to FIG. 3).
- the 3D data record is determined and stored in each case for each position and orientation.
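A sketch of how the per-pose 3D data records could be merged into one common scene model: each cloud is transformed with its associated pose into the scene frame and accumulated; the voxel thinning and all names are assumptions, not the patent's prescription.

```python
import numpy as np

def build_model(clouds_and_poses, voxel=0.01):
    """Fuse point clouds captured at different positions/orientations into one model.

    clouds_and_poses : list of (points (N, 3), T (4, 4)) pairs, where T maps
                       camera coordinates into the common scene frame
    voxel            : grid size in metres used to thin out duplicate points
    """
    merged = []
    for points, T in clouds_and_poses:
        hom = np.hstack([points, np.ones((len(points), 1))])
        merged.append((hom @ T.T)[:, :3])        # transform into the scene frame
    model = np.vstack(merged)
    # crude voxel down-sampling: keep one point per occupied grid cell
    keys = np.round(model / voxel).astype(np.int64)
    _, unique_idx = np.unique(keys, axis=0, return_index=True)
    return model[unique_idx]
```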
- the location-related information is entered by the user, for example.
- the user establishes the association between the object or element of the scene and the location-related information for each object of the scene for which location-related information exists, and checks that the assignment remains correct even for the most diverse relative positions and orientations.
- the latter is facilitated by the fact that the differences between two successive relative positions and/or orientations are not too great.
- the process sequence corresponds to the iterative procedure explained with reference to FIG. 5.
- in step S16, it is then to be verified in each case that the assignment of the location-related information to the respective object or element of the scene is still correct.
- the mobile use of the display device with variable relative position and orientation with respect to the scene is possible.
- the depth information or 3D information available in addition to two-dimensional camera images makes the method independent of additional facilities or measures on the scene, such as the attachment of markers in the scene.
- the computational effort for determining the relative position and orientation is low compared to methods that evaluate only two-dimensional camera images, and leads to a stable solution for optical tracking.
- cameras and the associated hardware for evaluating the camera signals make it possible to produce very handy, compact designs with low weight, which contain the display device, the camera and optionally even the computer.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention relates to a method and an arrangement for fading location-related information into a visual representation or view of a scene (1), in particular for augmented reality. A distance measuring system is connected to a presentation device (10) for the visual presentation of the faded-in information. The distance measuring system (11) emits electromagnetic radiation onto the scene (1) so that the scene (1) reflects at least a portion of the electromagnetic radiation back toward the distance measuring system (11). The distance measuring system (11) repeatedly records images of the radiation reflected by the scene (1), and information about a distance of a plurality of elements of the scene (1) is obtained from these images in each case. From the distance information, a position and orientation of the distance measuring system (11) and/or of the presentation device (10) relative to the scene (1) is determined.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102006006001A DE102006006001B3 (de) | 2006-02-08 | 2006-02-08 | Verfahren und Anordnung zum Einblenden ortsbezogener Informationen in eine visuelle Darstellung oder Ansicht einer Szene |
| DE102006006001.6 | 2006-02-08 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2007090660A1 (fr) | 2007-08-16 |
Family
ID=37963892
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2007/001108 Ceased WO2007090660A1 (fr) | 2006-02-08 | 2007-02-05 | Procede et dispositif pour l'affichage d'informations géodépendantes dans une representation ou une vue visuelle d'une scene |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE102006006001B3 (fr) |
| WO (1) | WO2007090660A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2012033043A (ja) * | 2010-07-30 | 2012-02-16 | Toshiba Corp | 情報表示装置及び情報表示方法 |
| US8315674B2 (en) | 2010-10-08 | 2012-11-20 | Research In Motion Limited | System and method for displaying object location in augmented reality |
| CN106199216A (zh) * | 2016-08-02 | 2016-12-07 | 海信集团有限公司 | 辐射值显示方法及装置 |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102010017630B4 (de) * | 2010-06-29 | 2016-06-02 | Leica Microsystems Cms Gmbh | Verfahren und Einrichtung zur lichtmikroskopischen Abbildung einer Probenstruktur |
| DE102014009608A1 (de) * | 2014-06-27 | 2015-12-31 | Audi Ag | Betrieb einer AR-Brille im Kraftfahrzeug |
| DE102014213021A1 (de) | 2014-07-04 | 2016-01-07 | Bayerische Motoren Werke Aktiengesellschaft | Lokalisierung eines HMD im Fahrzeug |
| DE102017215163B4 (de) | 2017-08-30 | 2019-04-11 | Volkswagen Aktiengesellschaft | System aus einem Kraftfahrzeug und einer Augmented-Reality-Brille und Verfahren zum Bestimmen einer Pose einer Augmented-Reality-Brille im Innenraum eines Fahrzeugs |
| WO2025119433A2 (fr) | 2023-12-07 | 2025-06-12 | Viega Technology Gmbh & Co.Kg | Procédé de mise à disposition d'eau potable, dispositif d'alimentation en eau potable et unité de commutation d'eau chaude et de rinçage |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10116331A1 (de) * | 2000-08-22 | 2002-03-14 | Siemens Ag | System und Verfahren zum kombinierten Einsatz verschiedener Display-/Gerätetypen mit systemgesteuerter kontextabhängiger Informationsdarstellung |
-
2006
- 2006-02-08 DE DE102006006001A patent/DE102006006001B3/de not_active Expired - Fee Related
-
2007
- 2007-02-05 WO PCT/EP2007/001108 patent/WO2007090660A1/fr not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6445815B1 (en) * | 1998-05-08 | 2002-09-03 | Canon Kabushiki Kaisha | Measurement of depth image considering time delay |
| GB2376397A (en) * | 2001-06-04 | 2002-12-11 | Hewlett Packard Co | Virtual or augmented reality |
Non-Patent Citations (4)
| Title |
|---|
| BESL P J ET AL: "A method for registration of 3-D shapes", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE USA, vol. 14, no. 2, February 1992 (1992-02-01), pages 239 - 256, XP000248481, ISSN: 0162-8828 * |
| GORDON G ET AL: "The use of dense stereo range data in augmented reality", PROCEEDINGS OF THE IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY IEEE COMPUT. SOC LOS ALAMITOS, CA, USA, 2002, pages 14 - 23, XP010620938, ISBN: 0-7695-1781-1 * |
| OGGIER T ET AL: "An all-solid-state optical range camera for 3D real-time imaging with sub-centimeter depth resolution (SwissRanger(TM))", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE-INT. SOC. OPT. ENG USA, vol. 5249, no. 1, 18 February 2004 (2004-02-18), pages 534 - 545, XP002432315, ISSN: 0277-786X * |
| SCHUTZ C L ET AL: "Augmented reality using range images", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE-INT. SOC. OPT. ENG USA, vol. 3012, 1997, pages 472 - 478, XP002432314, ISSN: 0277-786X * |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2012033043A (ja) * | 2010-07-30 | 2012-02-16 | Toshiba Corp | 情報表示装置及び情報表示方法 |
| US8466894B2 (en) | 2010-07-30 | 2013-06-18 | Kabushiki Kaisha Toshiba | Apparatus and method for displaying information |
| US8315674B2 (en) | 2010-10-08 | 2012-11-20 | Research In Motion Limited | System and method for displaying object location in augmented reality |
| US8571579B2 (en) | 2010-10-08 | 2013-10-29 | Blackberry Limited | System and method for displaying object location in augmented reality |
| US8971970B2 (en) | 2010-10-08 | 2015-03-03 | Blackberry Limited | System and method for displaying object location in augmented reality |
| CN106199216A (zh) * | 2016-08-02 | 2016-12-07 | 海信集团有限公司 | 辐射值显示方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102006006001B3 (de) | 2007-10-04 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 07711478; Country of ref document: EP; Kind code of ref document: A1 |