
HK1116861A - Navigation device with camera-info - Google Patents

Info

Publication number: HK1116861A
Application number: HK08108498.2A
Authority: HK (Hong Kong)
Prior art keywords: camera, navigation device, navigation, display, directions
Other languages: Chinese (zh)
Inventors: 彼得‧安德烈亚斯‧吉莱恩, 马克‧丹尼尔‧马图
Original Assignee: 通腾科技股份有限公司
Application filed by 通腾科技股份有限公司
Abstract

The present invention relates to a navigation device (10). The navigation device (10) is arranged to display navigation directions (3, 4, 5) on a display (18). The navigation device (10) is further arranged to receive a feed from a camera (24). The navigation device (10) is further arranged to display a combination of a camera image from the feed from the camera (24) and the navigation directions (3, 4, 5) on the display (18).

Description

Navigation device with camera information
Technical Field
The invention relates to a navigation device arranged to display navigation directions on a display.
Furthermore, the invention relates to a vehicle comprising such a navigation device and to a method of providing navigation directions. The invention further relates to a computer program and a data carrier.
Background
Global Positioning System (GPS) based prior art navigation devices are well known and widely used as in-vehicle navigation systems. Such GPS-based navigation devices involve a computing device that forms a functional connection with an external (or internal) GPS receiver and is capable of determining its global position. Additionally, the computing device is capable of determining a route between a start address and a destination address, which may be input by a user of the computing device. The computing device is typically enabled by software for calculating a "best" or "optimal" route between the start address location and the destination address location from a map database. The "best" or "optimal" route is determined based on predetermined criteria and need not necessarily be the fastest or shortest route.
The navigation device may typically be mounted on the dashboard of a vehicle, but may also form part of an on-board computer of a vehicle or car radio. The navigation device may also be (part of) a handheld system like a PDA.
Using position information derived from the GPS receiver, the computing device can determine its position at regular intervals and can display the current position of the vehicle to the user. The navigation device may also include a memory device for storing map data and a display for displaying selected portions of the map data.
Further, the device may give instructions on how to navigate the determined route by means of appropriate navigation directions, which are displayed on the display and/or generated as audible signals from a speaker (e.g., "turn left in 100 meters"). Graphics depicting the action to be performed (e.g., a left arrow indicating a left turn ahead) may be displayed in a status bar, and may also be superimposed over the relevant junctions/turnings, etc. in the map itself.
It is known to enable an in-vehicle navigation system to allow a driver to initiate a recalculation of a route while driving a car along the route calculated by the navigation system. This is useful when the vehicle is faced with construction work or severe traffic jams.
It is also known to enable a user to select the type of route calculation algorithm deployed by a navigation device, for example to select a "normal" mode or a "fast" mode (which calculates a route in less time, but explores fewer alternative routes than the normal mode).
It is also known to allow routes to be calculated with user-defined criteria; for example, the user may prefer a scenic route to be calculated by the device. The device software would then calculate various routes and weight more favourably those that include along their route the highest number of points of interest (known as POIs) marked as, for example, being of scenic beauty.
In the prior art, the maps displayed by navigation devices are, like most maps, highly stylized or schematic representations of the real world. Many people find it difficult to relate such a fairly abstract version of the real world to something that can be easily recognized and understood. Navigation devices are known that display a (pseudo) three-dimensional projection of the map, as seen from above and/or behind the vehicle. This is done to make it easier for the user to interpret the displayed map data, as it corresponds to the user's visual perception of the world. However, such (pseudo) perspective views remain stylized or schematic representations that are still relatively difficult for a user to understand.
The need to enable people to follow directions on a display easily and quickly is particularly acute in personal navigation systems (e.g., those usable as in-vehicle navigation systems). It is easy to understand that the driver of a vehicle should spend as little time as possible viewing and understanding the displayed map data, because his/her main attention should be focused on the road and traffic.
Disclosure of Invention
It is therefore an object of the present invention to provide a navigation device that overcomes at least one of the above problems and displays instructions for a user that are easy to understand.
To achieve this object, the invention provides a navigation device according to the preamble, characterized in that the navigation device is further arranged to receive a feed from a camera, and the navigation device is arranged to display on the display a combination of a camera image from the camera feed and the navigation directions.
By superimposing or combining the navigation directions with the camera image, the driver is presented with a user-friendly view that facilitates easy and quick understanding. The user does not need to translate an abstract representation of the real world, as the camera image is a one-to-one representation of the real-life view seen by the user. The combination of the feed from the camera and the navigation directions may be any type of combination, e.g. superimposing one over the other, or showing them on different parts of the display. However, the combination may also be a combination in time, i.e. alternately showing the camera feed and the navigation directions. This may change after a predetermined time interval (e.g., 5 seconds) or may change due to user input.
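As a rough illustration only (the patent does not prescribe an implementation), a pixel-level superposition of a rendered directions layer onto a camera frame could look like the following sketch, assuming both inputs are equally sized BGR image arrays:

```python
import numpy as np

def overlay_directions(camera_frame, directions_layer):
    """Copy the non-black pixels of a rendered navigation-directions
    layer (arrows, route, POI icons) onto the camera frame.

    Both inputs are assumed to be HxWx3 uint8 arrays of equal size;
    black (all-zero) pixels in the directions layer are treated as
    transparent.
    """
    frame = np.asarray(camera_frame).copy()
    layer = np.asarray(directions_layer)
    mask = layer.any(axis=2)          # True wherever a direction was drawn
    frame[mask] = layer[mask]
    return frame
```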
According to a further embodiment, the invention relates to a navigation device, wherein the camera is integrally formed with the navigation device. Such a navigation device does not require an external camera feed. The navigation device may, for example, simply be mounted on the dashboard of the vehicle in such a way that the camera looks through the windscreen.
According to a further embodiment, the invention relates to a navigation device, wherein the navigation direction is one or more of: a location arrow, a route, an arrow, a point of interest, a road, a building, and map data such as vector data, stored in at least one memory unit, such as a hard disk, a read only memory, an electrically erasable programmable read only memory and a random access memory. All types of navigation directions may be displayed. Note that these navigation directions may also convey information that is not needed for navigation (finding a route) as such, but simply provides additional information to the user.
According to a further embodiment, the invention relates to a navigation device further arranged to superimpose the navigation directions on the camera image such that the position of the navigation directions is in a predefined spatial relationship with respect to the respective part of the camera image. This provides the user with a very easily understandable image, since all navigation directions can be displayed such that they match the actual positions of the respective items in the camera image. For example, an arrow indicating a right turn may be superimposed over the camera image such that it matches the turn visible in the camera image.
According to another embodiment, the invention relates to a navigation device, wherein the navigation device comprises a processing unit, a positioning device and an orientation sensor, the positioning device and the orientation sensor being arranged to communicate with the processing unit, the processing unit being arranged to calculate a position and an orientation of the camera and/or the navigation device using readings from the positioning device and the orientation sensor, the processing unit calculating a position of the navigation direction on the display based on the position and orientation. Knowing the exact position and orientation of the camera and/or navigation device, the navigation directions can be more accurately superimposed on the camera feed.
According to another embodiment, the invention relates to a navigation device, wherein the positioning device determines the geographical position using position sensing technology, such as GPS, the European Galileo System or any other global navigation satellite system, or position sensing technology based on ground-based beacons.
According to another embodiment, the invention relates to a navigation device, wherein the processing unit calculates the orientation of the camera with respect to a first rotation axis which is substantially vertical in use by comparing the positions of the camera and/or the navigation device determined by the positioning device at successive points in time. By comparing the positions of the camera and/or the navigation device at successive points in time, the direction of travel of the camera and/or the navigation device can be calculated. From this, the orientation and orientation changes of the camera can be calculated.
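A minimal sketch of this idea follows; the formula is the standard great-circle initial bearing, which the patent does not spell out, so its use here is an assumption:

```python
import math

def heading_from_fixes(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees from north, computed from two
    successive GPS fixes given in decimal degrees. The result can
    serve as the camera orientation about the vertical axis, under
    the assumption that the camera faces the direction of travel."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0
```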
According to a further embodiment, the invention relates to a navigation device, wherein the navigation device comprises a compass providing compass readings to a processing unit, the processing unit being arranged to calculate an orientation of the camera relative to a first rotational axis, which in use is substantially vertical, based on the compass readings. A compass provides an easy and advantageous way of determining the orientation of the camera.
According to another embodiment, the invention relates to a navigation device, wherein the orientation sensor comprises a tilt sensor for determining the orientation of the camera relative to a second and a third axis of rotation, the second and third axes of rotation being substantially horizontal in use. In order to combine or superimpose the navigation directions with the camera image in a more accurate way, the rotational orientation of the camera is measured with respect to the second and/or third axis.
According to another embodiment, the invention relates to a navigation device, wherein the processing unit superimposes the navigation directions on the camera image using pattern recognition techniques such that the position of the navigation directions is in a predefined spatial relationship with respect to the respective part of the camera image. By using pattern recognition techniques, navigation directions can be combined and/or superimposed on the camera feed without knowing the exact orientation of the camera. The determination of the position of the navigation direction on the displayed camera image can be done by using pattern recognition techniques alone, but pattern recognition techniques can also be used in combination with the determined orientation of the camera, which can further improve the accuracy.
According to a further embodiment, the invention relates to a navigation device, wherein the navigation device uses map data as input for the pattern recognition techniques. Using map data may simplify the pattern recognition, because roads are easier to recognize when it is roughly known from the map data where the road is. This makes pattern recognition more accurate and/or may save computation time.
According to another embodiment, the invention relates to a navigation device, wherein the navigation device is arranged to receive calibration corrections, to store these calibration corrections, and to apply them when combining navigation directions with camera images. This is particularly advantageous when the navigation directions are superimposed on the camera image such that they have a predefined spatial relationship with respect to it. Calibration may be used to eliminate offset errors.
According to a further embodiment, the invention relates to a navigation device, wherein the navigation device is arranged to receive or read in camera settings and to use the camera settings to calculate the position of navigation directions on the display. Different camera settings may result in different camera feeds. Providing these camera settings to the navigation device further improves the accuracy of the combination of the navigation directions and the camera images.
According to a further embodiment, the invention relates to a navigation device, wherein the navigation device is further arranged to receive feeds from more than one camera, and the navigation device is arranged to select one of the feeds to be displayed on the display. The multiple camera feeds provide different perspectives that can, for example, be combined mathematically by pattern recognition techniques to improve the quality of the recognition. The more than one camera may also be used to give the user the option of selecting between different camera angles.
According to a further embodiment, the invention relates to a navigation device, wherein the camera is sensitive to electromagnetic radiation outside the range of the electromagnetic spectrum visible to the unaided human eye.
According to a further embodiment, the invention relates to a navigation device, wherein the camera is an infrared camera. Such a camera enables the navigation device to be used at night.
According to a further embodiment, the invention relates to a navigation device, wherein the camera is arranged to zoom in and/or zoom out. This allows the user to adjust the camera view according to his or her preferences.
According to a further embodiment, the invention relates to a navigation device, wherein the camera is arranged to zoom in or out depending on, for example, the speed of the navigation device/vehicle. This provides a camera feed that is automatically adjusted to the speed of the navigation device. Thus, when the speed of the navigation device is relatively high, the camera may zoom in to give the user a clearer view farther ahead.
According to another aspect, the invention relates to an instrument panel comprising a navigation device according to the above.
According to another aspect, the invention relates to a vehicle comprising a navigation device according to the above.
According to another embodiment, the invention relates to a vehicle, wherein the vehicle comprises a vehicle tilt sensor to determine the tilt of the vehicle to provide vehicle tilt readings to a navigation device. This is an advantageous way of measuring the inclination of the vehicle.
According to another aspect, the invention relates to a method of providing navigation directions, the method comprising:
-displaying the navigation directions on a display, characterized in that the method further comprises:
-receiving a feed from a camera, and
-displaying on the display a camera image according to the feed from the camera in combination with the navigation direction on the camera image.
According to another aspect, the invention relates to a computer program which, when loaded on a computer arrangement, is arranged to perform the above-mentioned method.
According to a further aspect, the invention relates to a data carrier comprising a computer program as described above.
Drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying schematic drawings in which corresponding reference symbols indicate corresponding parts, and in which:
figure 1 depicts a schematic block diagram of a navigation device,
figure 2 depicts a schematic view of a navigation device,
figure 3 depicts a schematic block diagram of a navigation device according to an embodiment of the invention,
figure 4 depicts a vehicle comprising a navigation device according to an embodiment of the invention,
figure 5 depicts a navigation device according to an embodiment of the invention,
figure 6 depicts a navigation device according to an embodiment of the invention,
figure 7 depicts a camera according to an embodiment of the invention,
figures 8a and 8b depict different movements of the camera image on the display due to different tilt angles of the camera,
figure 9 depicts a flow diagram of the functionality of a navigation device 10 according to an embodiment of the invention,
figure 10 depicts a navigation device according to an embodiment of the invention,
figure 11 depicts a navigation device according to an embodiment of the invention, and
figure 12 depicts a navigation device according to another embodiment of the invention.
Detailed Description
Fig. 1 shows a schematic block diagram of an embodiment of a navigation device 10 comprising a processor unit 11 for performing arithmetic operations. The processor unit 11 is arranged to communicate with memory units storing instructions and data, such as a hard disk 12, a Read Only Memory (ROM) 13, an Electrically Erasable Programmable Read Only Memory (EEPROM) 14 and a Random Access Memory (RAM) 15. The memory units may include map data 22. This map data may be two-dimensional map data (latitude and longitude), but may also include a third dimension (altitude). The map data may further comprise additional information, such as information about gas stations and points of interest. The map data may also include information about the shape of buildings and objects along the road.
The processor unit 11 may also be arranged to communicate with one or more input devices, such as a keyboard 16 and a mouse 17. The keyboard 16 may, for example, be a virtual keyboard provided on the display 18 (being a touch screen). The processor unit 11 may further be arranged to communicate with one or more output devices, such as the display 18, a speaker 29, and one or more reading units 19 to read, for example, floppy disks 20 or CD ROMs 21. The display 18 may be a conventional computer display (e.g., an LCD) or may be a projection-type display, such as a head-up display of the type used to project instrumentation data onto a car windscreen or windshield. The display 18 may also be arranged to act as a touch screen, which allows the user to input instructions and/or information by touching the display 18 with a finger.
The processor unit 11 may further be arranged to communicate with other computing devices or communication devices using the input/output device 25. The input/output device 25 is shown as being arranged to effect communication via a network 27.
The speaker 29 may be formed as part of the navigation device 10. In the case where the navigation device 10 is used as an in-vehicle navigation device, the navigation device 10 may instead use the speakers of the car radio, the on-board computer, etc.
The processor unit 11 may further be arranged to communicate with a positioning device 23, such as a GPS receiver, which provides information about the position of the navigation device 10. According to this embodiment, the positioning device 23 is a GPS based positioning device 23. It will be appreciated, however, that the navigation device 10 may implement any type of location sensing technology and is not limited to GPS. Thus, it may be implemented using other types of GNSS (Global navigation satellite System), such as the European Galileo System. Again, it is not limited to satellite-based position/velocity systems, but may equally be deployed using ground-based beacons or any other type of system that enables a device to determine its geographic location.
It should be understood, however, that more and/or other memory units, input devices, and reading devices known to those skilled in the art may be provided. In addition, one or more of them may be located physically remote from the processor unit 11, if desired. The processor unit 11 is shown as one block, however it may comprise several processing units running in parallel or controlled by one main processor, possibly located remotely from each other, as is known to those skilled in the art.
The navigation device 10 is shown as a computer system, but it may be any signal processing system with analog and/or digital and/or software techniques arranged to perform the functions discussed herein. It will be appreciated that although the navigation device 10 is shown in figure 1 as multiple components, the navigation device 10 may be formed as a single device.
The navigation device 10 may use navigation software, such as the software known as Navigator from TomTom B.V. Navigator software can run on a touch-screen (i.e., stylus-controlled) Pocket PC-powered PDA device (e.g., a Compaq iPaq), as well as on devices with an integral GPS receiver 23. The combined PDA and GPS receiver system is designed for use as an in-vehicle navigation system. The invention may also be implemented with any other arrangement of the navigation device 10, such as one with an integral GPS receiver/computer/display, or a device designed for non-vehicle use (e.g., for walkers) or for vehicles other than cars (e.g., aircraft).
Fig. 2 depicts a navigation device 10 as described above.
Navigator software, when running on the navigation device 10, causes the navigation device 10 to display a normal navigation mode screen at the display 18, as shown in fig. 2. This view may provide driving instructions using a combination of text, symbols, voice guidance, and moving maps. The main user interface elements are as follows: the 3D map occupies most of the screen. It should be noted that the map may also be shown as a 2D map.
The map shows the location of the navigation device 10 and its immediate surroundings, rotated in such a way that the direction of movement of the navigation device 10 is always "up". Running across the lower quarter of the screen may be a status bar 2. The current location of the navigation device 10 (as determined by the navigation device 10 itself using conventional GPS location finding) and its orientation (as inferred from its direction of travel) are depicted by a location arrow 3. The route 4 calculated by the device (using route calculation algorithms stored in the memory devices 12, 13, 14, 15 and applied to the map data stored there) is shown as a shaded path. On the route 4, all major actions (e.g. turns, crossroads, detours, etc.) are schematically depicted by arrows 5 overlaying the route 4. The status bar 2 may also contain, on its left side, a schematic icon 6 depicting the next action, here a right turn. The status bar 2 also shows the distance to the next action (i.e. the right turn), here 50 meters, extracted from a database of the entire route calculated by the device, i.e. a list of all roads and related actions defining the route to be taken. The status bar 2 also shows the name of the current road 8, the estimated time before arrival 9 (here 2 minutes 40 seconds), the estimated actual arrival time 25 (11:36 a.m.), and the distance to the destination 26 (1.4 km). The status bar 2 may further show additional information, such as GPS signal strength in the form of a mobile-phone-style signal strength indicator.
As already mentioned above, the navigation device may comprise an input device, such as a touch screen, which enables a user to invoke a navigation menu (not shown). From this menu, other navigation functions may be initiated or controlled. Allowing navigation functions to be selected from a menu screen that is itself very easy to invoke (e.g., only one step apart from the map display to the menu screen) greatly simplifies user interaction and makes it faster and easier. The navigation menu contains options for the user to enter a destination.
The actual physical structure of the navigation device 10 itself may be substantially indistinguishable from any conventional handheld computer, apart from the integral GPS receiver 23 or a GPS data feed from an external GPS receiver. Thus, the memory devices 12, 13, 14, 15 store the route calculation algorithms, the map database and the user interface software; the processor unit 11 interprets and processes user input (e.g., using the touch screen to enter start and destination addresses and all other control inputs) and deploys the route calculation algorithms to calculate an optimal route. "Optimal" may refer to criteria such as shortest time or shortest distance, or some other user-related factors.
More specifically, the user enters his start position and desired destination into the navigation software running on the navigation device 10 using the input devices provided (e.g., touch screen 18, keyboard 16, etc.). The user then selects the manner in which the travel route is calculated: various modes are provided, such as a "fast" mode, which calculates the route very quickly but possibly not the shortest one, and a "full" mode, which looks at all possible routes and locates the shortest one but takes longer to calculate, etc. Other options are possible, e.g. a user-defined scenic route that passes the most POIs (points of interest) marked as views of outstanding beauty, or passes the most POIs possibly of interest to children, or uses the fewest junctions, etc.
Roads themselves are described, in the map database that is part of (or otherwise accessed by) the navigation software running on the navigation device 10, as lines, i.e. vectors (e.g., start point, end point, direction of a road, with an entire road being made up of many hundreds of such sections, each uniquely defined by start point/end point/direction parameters). A map is then a set of such road vectors, plus points of interest (POIs), plus road names, plus other geographic features such as park boundaries, river boundaries, etc., all of which are defined in terms of vectors. All map features (e.g., road vectors, POIs, etc.) are defined in a coordinate system that corresponds or relates to the GPS coordinate system, enabling the device position determined by the GPS system to be located on the relevant road shown in the map.
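For illustration, a road section of the kind described might be modelled as below; the field names and sample values are assumptions for the example, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    start: tuple[float, float]   # (latitude, longitude) of the start point
    end: tuple[float, float]     # (latitude, longitude) of the end point
    name: str                    # road name, e.g. for the status bar

# An entire road is a sequence of such segments; a map is then a set
# of roads plus POIs, road names and other vector-defined features.
highway = [
    RoadSegment((52.3770, 4.8970), (52.3775, 4.8980), "A10"),
    RoadSegment((52.3775, 4.8980), (52.3781, 4.8991), "A10"),
]
```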
Route calculation uses complex algorithms that are part of the navigation software. The algorithms are applied to score large numbers of potential different routes. The navigation software then evaluates them against user-defined criteria (or device defaults), e.g. a full-mode scan for a scenic route that passes museums and encounters no speed cameras. The route that best meets the defined criteria is then calculated by the processor unit 11 and stored in a database in the memory devices 12, 13, 14, 15 as a sequence of vectors, road names and actions to be done at vector end-points (e.g., after a predetermined distance along each road of the route, such as after 100 meters, turn left into street x).
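A toy illustration of scoring candidate routes against such criteria; the dictionary keys and the weights are invented for the example and are not part of the patent:

```python
def route_score(route, prefs):
    """Lower is better. `route` is a dict with assumed keys produced
    by the route calculation; weights are arbitrary for illustration."""
    score = route["travel_time_s"]
    if prefs.get("scenic"):
        score -= 60 * route["scenic_poi_count"]     # reward scenery
    if prefs.get("avoid_speed_cameras"):
        score += 600 * route["speed_camera_count"]  # penalize cameras
    return score

candidates = [
    {"travel_time_s": 1800, "scenic_poi_count": 1, "speed_camera_count": 2},
    {"travel_time_s": 2100, "scenic_poi_count": 5, "speed_camera_count": 0},
]
best = min(candidates, key=lambda r: route_score(r, {"scenic": True}))
```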
Fig. 3 depicts a schematic block diagram of a navigation device 10 according to the present invention, wherein corresponding reference characters refer to corresponding parts in fig. 1 and 2.
According to the present invention a camera 24 is provided, said camera 24 being arranged to provide a real-time feed to the processor unit 11. The camera 24 is positioned in use such that it records the road in front of the user. When positioned in a car, the camera 24 is positioned such that it records the road in front of the vehicle. The camera 24 may be integral with the navigation device 10 or may be physically separate therefrom. If separate, the camera 24 may be connected to the processor unit 11 via a cable or via a wireless connection. The camera 24 may be positioned on the vehicle roof or in front of the vehicle, for example near the headlights.
The navigation device 10 may also be provided with more than one camera 24 to allow the user to switch between different camera angles. A rear view camera may also be provided. The camera may be any type of camera, such as a digital camera or an analog camera. The image recorded by the camera 24 is displayed at the display 18.
The camera 24 may also be a camera sensitive to electromagnetic radiation outside the electromagnetic spectrum visible to the naked human eye. The camera may be an infrared camera that can be used at night.
Fig. 4 shows one example of a navigation device 10, which is positioned on the dashboard of a car 1. The navigation device 10 comprises a camera 24 directed at the road in front of the car 1. Fig. 4 further shows that the display 18 is user-facing.
According to the invention, the navigation device 10 is arranged to display real-time feeds from the cameras on the display 18 and combine or superimpose one or more navigation directions. The navigation direction may be one or more of the following: location arrow 3, route 4, arrow 5, points of interest, roads, buildings and all other navigation directions stored in the navigation device 10. This may also include the map data itself, e.g. vector data describing the roads. How this is achieved is described in more detail below.
The image provided by the camera 24 will be unsteady due to road roughness, engine-induced vehicle vibrations, etc. The navigation device 10 may therefore be provided with software that eliminates these undesirable vibrations to provide a steady image. Software that eliminates undesirable vibrations of the image provided by the camera 24 is widely used in video cameras, where it is known as a steadycam function, as is known to those skilled in the art.
The feed from the camera 24 may be further processed to improve image quality. This processing may include adjusting brightness and contrast, but may comprise any suitable filtering. Filters may, for example, be used to improve image quality in rainy conditions.
The feed from the camera 24 may be displayed on the display in real time, but may also be displayed as a still image that is updated at certain points in time (e.g., every 0.5 seconds). The appropriate time interval between successive updates may be determined in dependence on, for example, the speed of the vehicle carrying the navigation device 10 and on changes in the direction of travel (turns).
Furthermore, the navigation device may be arranged to perform zoom-in or zoom-out depending on, for example, the speed of the navigation device/vehicle. This zoom operation may be performed by sending a control signal to the camera 24 and providing instructions to perform the zoom operation. However, the zoom operation may also be performed by displaying a portion of the received camera feed at the display 18 in an enlarged manner.
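A sketch of such a speed-dependent zoom policy follows; the speed range and the linear mapping are illustrative assumptions, not taken from the patent:

```python
def zoom_for_speed(speed_kmh, min_zoom=1.0, max_zoom=3.0, top_speed=120.0):
    """Map vehicle speed to a camera zoom factor: the faster the
    vehicle, the further ahead the displayed camera view reaches."""
    s = max(0.0, min(speed_kmh, top_speed))            # clamp to range
    return min_zoom + (max_zoom - min_zoom) * s / top_speed
```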
First embodiment
Fig. 5 depicts a first embodiment of the present invention, showing a still of the image recorded by the camera 24 as displayed by the navigation device 10. As can be seen, an arrow 5 indicating a right turn has been superimposed by the processor unit 11. According to this embodiment, a user-friendly image that is easy to understand is displayed to the user. This embodiment has the advantage that no complex mathematics and data processing are required.
Instead of the navigation direction depicted in fig. 5, any of the other navigation directions described above may also be displayed, including perspective-shaped navigation directions, such as a perspective-shaped (3D) arrow.
Second embodiment
Fig. 6 shows another still form of the image recorded by the camera 24. According to this example, the navigation device 10 superimposes the route 4 with the arrow 5. The route 4 and the arrow 5 are superimposed in such a way that their positions on the display 18 correspond to the images provided by the camera 24. Fig. 6 clearly shows the display of route 4 such that route 4 corresponds to the road displayed on display 18. Furthermore, the display of the arrow 5 is such that the arrow 5 accurately indicates a right turn in the image provided by the camera 24.
It will be appreciated that the embodiment shown in fig. 5 can be readily obtained by superimposing or combining the image provided by the camera 24 with the navigation direction (e.g. arrow 5). However, in order to create the image provided in fig. 6, more complex data processing is required to match the image provided by the camera 24 with the navigation direction. This will be explained in more detail below.
In order to superimpose the navigation directions such that they have a predefined spatial relationship with respect to the corresponding parts of the camera image, the exact camera position, orientation and camera settings need to be known. If all of this information is known, the processing unit 11 can, for example, calculate the position of the road on the display 18 and superimpose the route 4 on it.
First, the position of the camera 24 needs to be determined. This may be done simply by using GPS information determined by the processing unit 11 and/or the positioning device 23. According to prior art usage, position information of the navigation device 10 and thus of the camera 24 is already available in the navigation device 10.
Second, the orientation of the camera 24 needs to be determined. This is done using an orientation sensor arranged to communicate with the processing unit 11. The orientation sensor may comprise the positioning device 23 and tilt sensors 27, 28. The tilt sensors 27, 28 may be gyroscopes.
FIG. 7 depicts a camera 24 according to an embodiment of the invention. The first rotational direction needs to be determined relative to axis C as depicted in fig. 7. Again, this may simply be done using GPS information determined by the processing unit 11 and/or the positioning device 23. By comparing the position of the navigation device 10 at successive points in time, the direction of movement of the navigation device 10 can be determined. This information is already available in the navigation device 10 according to prior art use. It is assumed that the camera 24 is facing the direction of travel of the navigation device 10. However, this is not necessarily the case, as will be explained further below.
The orientation of the camera 24 about the first rotation axis C may also be determined using a compass comprised in the navigation device 10 or in the camera 24. The compass may be an electronic compass or an analogue compass. The compass provides compass readings that are communicated to the processing unit 11. The processing unit 11 determines the orientation of the camera 24 about the first rotation axis based on the compass readings.
To further determine the orientation of the camera 24, the camera 24 may be provided with tilt sensors 27, 28, as depicted in fig. 7. The tilt sensors 27, 28 are arranged to measure the tilt of the camera 24. The first tilt sensor 27 is arranged to measure tilt in a second rotational direction, indicated by curved arrow a in fig. 7, i.e. rotation about an axis substantially perpendicular to the surface of the figure. The tilt angle in the second rotational direction determines the height of the horizon in the camera image displayed on the display 18. The effect of such rotation on the displayed camera image is schematically depicted in fig. 8 a.
The second tilt sensor 28 is arranged to measure the tilt resulting from rotation about a third rotation axis, wherein the third rotation axis is the central axis of the camera 24 depicted by the dashed line B in fig. 7. The effect of such rotation on the displayed camera image is schematically depicted in fig. 8 b.
In use, the first axis of rotation is substantially vertical, and the second and third axes of rotation are substantially perpendicular to the first axis of rotation and to each other.
The tilt values determined by the tilt sensors 27, 28 are transmitted to the processor unit 11. The tilt sensors 27 and 28 may also be formed as a single integral tilt sensor.
Furthermore, the camera settings (in particular zoom factor, camera angle, focal length, etc. of the lenses of the camera 24) may be communicated to the processor unit 11.
Based on the information available to the processor unit 11 describing the position, orientation and settings of the camera 24, the processor unit 11 determines at which positions on the display 18 the roads, crossings, points of interest, etc. known from the map data stored in the memory devices 12, 13, 14, 15 are to be displayed.
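One way to make this computation concrete is a standard pinhole-camera projection. The patent does not prescribe a model, so the axis conventions and parameters below are assumptions; it is a minimal sketch, not the patented method:

```python
import numpy as np

def rotation(yaw, pitch, roll):
    """Rotation of the camera frame: yaw about the vertical axis C,
    pitch about arrow A, roll about axis B of fig. 7 (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def project_to_display(world_pt, cam_pos, yaw, pitch, roll,
                       focal_px, cx, cy):
    """Project a map feature (metres, local ground frame) to display
    pixels (u, v); returns None if the point lies behind the camera."""
    p = rotation(yaw, pitch, roll).T @ (np.asarray(world_pt, float)
                                        - np.asarray(cam_pos, float))
    x, y, z = p
    if z <= 0.0:                       # behind the camera: not visible
        return None
    return cx + focal_px * x / z, cy + focal_px * y / z
```

The focal length in pixels and the principal point (cx, cy) correspond to the "camera settings" the text says must be communicated to the processor unit 11.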
The processor unit 11 may, based on this information, superimpose navigation directions, such as the route 4, the arrow 5, the point of interest POI, etc., on the camera image displayed by the processor unit 11 so that it coincides with the camera view. It may be useful to superimpose the navigation directions in such a way that they appear to float above the road surface or have some other predefined spatial relationship to the road surface.
Since the navigation device 10 calculates how far away any intersection or turn (or other change in direction) is, it can calculate roughly how the shape of the navigation directions shown on the display 18 should be, and where the navigation directions should be positioned to correspond to the actual location of the change in direction as shown on the feed from the camera 24.
However, errors may arise for several reasons. First, the navigation device 10 can be mounted on the vehicle dashboard in many ways. For example, when the orientation of the camera 24 about the axis C is determined by comparing the positions of the navigation device 10 at successive points in time, it is assumed that the camera 24 points straight ahead. However, if the camera 24 is not exactly aligned with the vehicle, the superimposed navigation directions may be misaligned.
As described above, where the camera 24 is provided with a built-in compass, a first rotational orientation of the camera relative to axis C may be calculated by comparing compass readings to a determined direction of travel of the navigation device 10. However, there may still be errors, resulting in a mismatch between the superimposed navigation directions and the camera feed.
Furthermore, the tilt sensors 27, 28 may only be able to measure relative tilt, not absolute tilt. This means that the navigation device 10 needs to be calibrated in order to be able to accurately locate the navigation direction on the camera image.
To compensate for these errors, the navigation device 10 may be provided with menu options that allow the user to adjust the relative position of the displayed image with respect to the displayed camera image. This adjustment may be performed by the navigation device 10 by: changing the position at which the navigation directions are displayed, and/or changing the position at which the camera images are displayed, and/or changing the orientation of the camera 24. For the last option, the camera 24 may be provided with an actuation means to change its orientation. The camera 24 may be actuated independently of the navigation device 10. Where the camera 24 is integrally formed with the navigation device 10, the actuation device may change the orientation of the navigation device 10 or the orientation of the camera 24 only with respect to the navigation device 10.
The user may simply use the arrow keys to calibrate the position of the navigation directions so that they match the camera image. For example, if the camera 24 is positioned such that it is rotated to the left about the axis C depicted in fig. 7, the navigation directions are offset to the right relative to the corresponding parts of the camera image. The user can correct this error simply by dragging the navigation directions to the left using the left arrow key. The navigation device 10 may further be arranged to provide the user with the option of adjusting the displayed rotational orientation of the superimposed navigation directions relative to the displayed camera image.
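A minimal sketch of how such arrow-key calibration could be stored and applied; the step size and key names are assumptions for the example:

```python
class OverlayCalibration:
    """Stores user-entered pixel offsets (arrow-key corrections) and
    applies them to every computed overlay position thereafter."""

    def __init__(self):
        self.dx = 0   # horizontal correction in pixels
        self.dy = 0   # vertical correction in pixels

    def nudge(self, key):
        step = 2  # pixels per key press (assumed step size)
        if key == "left":
            self.dx -= step
        elif key == "right":
            self.dx += step
        elif key == "up":
            self.dy -= step
        elif key == "down":
            self.dy += step

    def apply(self, u, v):
        """Shift a computed overlay position by the stored correction."""
        return u + self.dx, v + self.dy
```

A rotational correction (for the mismatch about axis B) could be stored and applied in the same way.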
The navigation device 10 may also be arranged to provide the user with the option of correcting a perspective mismatch, for example one caused by a different height of the camera 24. A camera 24 located on the roof of the car provides a different view of the road (a different perspective) than a camera 24 located on the dashboard of the vehicle or between its headlights. In order to make navigation directions such as 3D directions (e.g. a 3D arrow) or a vector representation of the road conform to the camera view, a perspective deformation of the navigation directions needs to be applied. This perspective deformation depends on the height of the camera 24, the camera settings, and the second rotational direction of the camera 24 in the direction of arrow A as depicted in fig. 7.
The processor unit 11 stores these entered calibration corrections and applies similar calibration corrections to all further displayed images. The processor unit 11 may also process all further changes in the measured position, direction and orientation of the camera 24, in order to ensure an accurate superposition of the navigation directions at all times. This allows accurate compensation for camera movements caused by changes in vehicle orientation, such as those due to speed bumps, sharp corners, acceleration, braking and other influences on the orientation of the camera 24.
Figure 9 depicts a flow diagram of the functionality of a navigation device 10 according to a second embodiment of the present invention. The steps shown in the flow chart may be performed by the processing unit 11. Note that all steps relating to entering destination addresses, selecting routes, etc. are omitted from this figure, as these steps are already known in the art.
In a first step 101, the navigation device 10 is switched on and the user selects a camera program. This is depicted in fig. 9 with a "start".
In a second step 102, the processing unit 11 determines the position of the navigation device 10. This is done by using input from a positioning device 23, such as a GPS device, as described above.
In a next step 103, the processing unit 11 determines the direction of travel of the navigation device 10. Also, inputs from the positioning device 23 are used for this.
Next, in step 104, the orientation and camera settings of the camera 24 are determined by the processing unit 11. Also, input from the positioning device 23 is used. The input from the tilt sensors 27, 28 is also used to determine the orientation of the camera 24.
According to step 105, the camera image is displayed on the display 18 by the processing unit 11. In step 106, the processing unit 11 superimposes a selected number of navigation directions (e.g. position arrow 3, route 4, arrow 5, points of interest, roads, map data, etc.). To do this, the position and shape of the displayed navigation directions are calculated using all the collected information. The user may calibrate this calculation by adjusting the position and/or shape of the superimposed navigation directions, if desired. This optional step is depicted by step 107.
Steps 102-107 may be repeated as often as needed or desired during use.
In addition to the directional arrow 5, other types of virtual flags may also be stored in the memory devices 12, 13, 14, 15. For example, icons may be stored relating to road names, traffic signs, speed limits, speed cameras or points of interest stored in the memory devices 12, 13, 14, 15. All of this may also be superimposed on the feed from the camera 24, with the spatial position in the displayed camera image corresponding to the real world features to which the virtual marker relates. Thus, the processing unit 11 may obtain 2D map data from the navigation software that contains location data for these real-world features and apply a geometric transformation that can make the features properly positioned when superimposed in a video feed.
When, for example, a vehicle with the navigation device 10 ascends or descends a hill, the tilt sensors 27, 28 detect a tilt in the direction of arrow A as depicted in fig. 7. However, the superimposed navigation directions should not be adjusted for this particular tilt: on a slope the road in the camera image tilts along with the camera, so the navigation directions still coincide with the camera image. This can be dealt with by providing the navigation device with map data comprising height information. Based on the map height data, the navigation device 10 calculates the tilt of the camera 24 corresponding to the orientation of the road on which the vehicle is travelling. This predicted tilt angle is compared with the tilt angle detected by the tilt sensors 27, 28, and the position of the superimposed navigation directions is adjusted by the difference between the predicted tilt angle and the detected tilt angle.
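A sketch of the described comparison, assuming height samples taken from the map data; the function and variable names are illustrative:

```python
import math

def road_pitch_from_map(height_ahead_m, height_here_m, distance_m):
    """Pitch of the road (radians) predicted from map height data:
    rise over run between the current position and a point ahead."""
    return math.atan2(height_ahead_m - height_here_m, distance_m)

def overlay_pitch_correction(measured_pitch_rad, predicted_pitch_rad):
    """Only the difference between the measured camera pitch and the
    road-induced pitch should shift the overlay: on a hill, camera
    and road tilt together, so that part needs no correction."""
    return measured_pitch_rad - predicted_pitch_rad
```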
In case the map data does not comprise height information, the vehicle may be provided with a vehicle tilt sensor 30. The vehicle tilt sensor 30 is arranged to provide vehicle tilt readings to the processing unit 11. The readings of the vehicle tilt sensor 30 are then compared with the readings of the tilt sensors 27, 28 and the differences due to undesired vibrations etc. are used to adjust the position of the superimposed navigation directions.
It will be appreciated that various types of modifications to the examples explained and shown above are conceivable.
Fig. 10 depicts an example in which the map data also includes data describing objects along the road, such as buildings 31. According to this example, the navigation directions 3, 4, 5 superimposed on the building 31 may be shown in dashed or blinking lines. This allows the user to see the map data, route 4 and arrow 5 which would otherwise be blocked from view by the building.
Third embodiment
According to the third embodiment, the navigation direction is superimposed on the camera image by using the pattern recognition technique.
In recent years, great progress has been made in the field of real-time analysis of image frames (e.g., a video feed provided by the camera 24) to identify actual objects in the feed. The literature in this field is quite extensive: reference may be made, for example, to US 5627915 (Princeton Video Image Inc.), in which video from a scene such as a sports stadium is analyzed by pattern recognition software; an operator manually indicates high-contrast areas within the venue (e.g., lines marked on the playing field; the edges of the field; billboards), and the software builds a geometric model of the entire venue from these high-contrast landmarks. The software can then analyze a real-time video feed to find these landmarks; it is then able to take a stored, computer-generated image (e.g., an advertisement for a billboard), apply a geometric transformation to it, and insert it into the video feed at a location defined with reference to the geometric model using image synthesis techniques, so that it appears to the video viewer as a completely natural part of the scene.
See also Facet Technology, US 2001/0043717; a system is disclosed that can analyze video taken from a moving vehicle to identify road signs.
In summary, pattern recognition techniques applied to real-time video analysis in order to recognize real-world features are a wide and well-developed field.
In one implementation, the navigation device 10 deploys pattern recognition software to recognize real-world features in the video feed from the camera 24 and display navigation directions (e.g., arrow 5) on the display 18 in a predefined spatial relationship with the real-world features recognized in the video feed. For example, a video feed might display the road that the navigation device 10 is currently traveling on, and the navigation direction is then a 3D direction (e.g., a 3D arrow) superimposed on the road. Road turns and other features may be represented in the form of graphics or icons and positioned to overlay the real world features to which they relate.
The processing unit 11 may be programmed to discern features that have a high visual contrast and are associated with a given road. The features may be, for example, vehicles moving in a consistent direction, or road markings (e.g., an edge marking, a centre line marking, etc.).
The navigation device 10 may, for example, be programmed with a geometric model of the road ahead: the model may be as simple as two lines. The model may simply be vector data stored to form map data, as described above.
Thus, in use, the pattern recognition software looks for visual features in the real-time video stream provided by the camera 24 that correspond to the stored geometric model (e.g., the two lines). Once it has located these features, it has in effect recognized the road ahead. This will typically require applying rapid translations and transformations to the features (e.g., the two lines) identified in the video feed to achieve a match with the stored model: the translation is an x-y translation that approximately aligns the identified features with the stored model, and the transformation includes foreshortening, corresponding to different camera heights, and changes in the relative orientation between the two lines, corresponding to different camera viewing angles and to the relative angle between the camera and the road. Equally, the transformation may be applied to the stored model to align and shape it relative to the identified features.
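As a rough illustration of the first step only (finding candidate line features in a frame), OpenCV's Canny edge detector and probabilistic Hough transform could be used; matching the detected segments to the stored two-line model, and tracking them over time, is left out of this sketch:

```python
import cv2
import numpy as np

def find_road_edges(frame_bgr):
    """Detect high-contrast line segments (candidate road edges or
    lane markings) in one video frame. Threshold values are assumed
    and would need tuning for a real camera."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    # Probabilistic Hough transform: returns segments as (x1, y1, x2, y2).
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=60, minLineLength=80, maxLineGap=20)
    return [] if lines is None else [seg[0] for seg in lines]
```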
Those skilled in the art will appreciate that it is advantageous for the pattern recognition algorithm to have the map data as an input. The pattern recognition can be performed more easily and quickly when the algorithm knows in advance the approximate pattern to be recognized, and this information is readily available from the map data.
Once the transformation is known, it is a relatively simple operation to shape a pre-stored arrow icon so that its perspective, shape and orientation correspond to those of the road in any given video frame (various kinds of geometric transformation may be suitable for this), and then to superimpose the directional arrow on the road shown in the display using conventional image synthesis. It may be useful to superimpose the arrow in such a way that it appears to float above the road surface, or to have some other predefined spatial relationship to it.
Since the navigation device 10 calculates how far any intersections or turns (or other directional changes) are, it can roughly calculate how the navigation directions shown on the display 18 should be shaped to correspond to the actual location of the change in direction shown on the video feed.
It will be appreciated that the navigation device 10 may also use a combination of the above embodiments. For example, the navigation device may use position and location measurements to roughly determine the location of the navigation directions on the display 18 and pattern recognition techniques to determine the location of the navigation directions on the display 18.
It will be appreciated that many alternatives and modifications to the embodiments mentioned above may be envisaged. For example, indications of road names, traffic signs (e.g., one-way, no entry, exit numbers, place names, etc.), speed limits, speed cameras, and points of interest stored in the device memories 12, 13, 14, 15 may also be superimposed on the video feed; the spatial location of such a "virtual sign" in the video frame may correspond to the real-world feature to which it relates. Thus, a speed limit (e.g., the text "30 mph") may be superimposed so that it appears to be overlaid on, or part of, the surface of a road having a 30 mph speed limit. Icons representing specific types of traffic signs may be superimposed on the video stream so as to appear at locations where real-world signs would beneficially appear.
In addition to the directional arrow 5, other types of virtual flags may also be stored in the memory devices 12, 13, 14, 15. For example, icons relating to road names, traffic signs, speed limits, speed cameras, bus stops, museums, house numbers or points of interest may be stored in the memory devices 12, 13, 14, 15. All of these may also be superimposed on the video feed, with the spatial location in the displayed video corresponding to the real-world features to which the virtual marker relates. Thus, the software may take 2D map data containing location data of these real-world features from the navigation software and apply a geometric transformation that correctly locates the features when superimposed in the video feed.
According to another alternative, the pattern recognition technique may also be arranged to recognize objects on the road, such as, for example, other vehicles or trucks. When such an object is recognized, the displayed route 4 may be shown as a dashed line, such as shown in fig. 11. This provides an image that is easier for the user to understand.
Fourth embodiment
According to the fourth embodiment, the feed from the camera 24 and the navigation directions, e.g. position arrow 3, route 4, arrows 5, points of interest (POIs), roads, buildings and map data (such as vector data), are not superimposed but are displayed in a combined manner on the display 18.
Such a combination may be achieved by dividing the display into a first part and a second part, wherein the first part displays the camera feed and the second part displays the navigation directions. However, the combination may also be performed temporally, i.e. the navigation device may be arranged to show the camera feed and the navigation direction in turn. This may be accomplished by presenting the camera feed for a first period of time (e.g., 2 seconds) and then presenting the navigation directions for a second period of time (e.g., 2 seconds). However, the navigation device may also provide the user with the option of switching between camera feed and navigation directions as they desire.
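The temporal combination could be as simple as the following sketch; the 2-second period follows the example above, and the string return values merely stand in for whatever rendering routine the device would call:

```python
import time

def view_to_show(period_s=2.0):
    """Alternate between the camera feed and the navigation
    directions, switching every `period_s` seconds."""
    phase = int(time.monotonic() / period_s) % 2
    return "camera feed" if phase == 0 else "navigation directions"
```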
Of course, more than one camera may be used. The user may be provided with the option to switch from a first camera feed to a second camera feed. The user may also choose to display more than one camera feed on the display 18 at the same time.
According to another alternative, the user may zoom in or out. As one zooms out, more and more of the environment of the navigation device 10 will be gradually displayed on the display 18. It will be appreciated that the user may select, for example, a helicopter view, as shown in figure 2, including the position of the navigation device 10. Such a view provides an image of the navigation device 10 (or vehicle) viewed from behind. Of course, such a view cannot be provided by a camera fixed on the navigation device 10 or on the vehicle. Thus, the navigation device 10 may provide an image as shown in fig. 12, where only a portion of the image is the camera view, around which are the map data and navigation directions.
While specific embodiments of the invention have been described above, it will be appreciated that the invention may be practiced otherwise than as described. For example, the invention may take the form of a computer program containing one or more sequences of machine-readable instructions describing a method as disclosed above, or a data storage medium (e.g. semiconductor memory, magnetic or optical disk) having such a computer program stored therein. Those skilled in the art will appreciate that any of the software components may also be formed as hardware components.
The above description is intended to be illustrative and not restrictive. Thus, it will be apparent to those skilled in the art that modifications may be made to the invention as described without departing from the scope of the claims set out below.

Claims (25)

1. A navigation device (10), the navigation device (10) being arranged to display navigation directions (3, 4, 5) on a display (18),
characterized in that: the navigation device (10) is further arranged to receive a feed from a camera (24), and the navigation device (10) is arranged to display a camera image from the feed from the camera (24) in combination with the navigation directions (3, 4, 5) on the display (18).
2. The navigation device of claim 1, wherein the camera is integrally formed with the navigation device.
3. Navigation device according to claim 1 or 2, wherein the navigation directions are one or more of: a position arrow (3), a route (4), an arrow (5), points of interest (POI), roads, buildings and map data such as vector data, stored in at least one memory unit, such as a hard disk (12), a read only memory (13), an electrically erasable programmable read only memory (14) and a random access memory (15).
4. Navigation device according to any one of the preceding claims, further arranged to superimpose the navigation directions (3, 4, 5) on the camera image such that the positions of the navigation directions (3, 4, 5) are in a predefined spatial relationship with respect to respective parts of the camera image.
5. Navigation device according to any one of the preceding claims, wherein the navigation device (10) comprises a processing unit (11), a positioning device (23) and orientation sensors (27, 28), the positioning device (23) and the orientation sensors (27, 28) being arranged to communicate with the processing unit (11), the processing unit (11) being arranged to calculate a position and orientation of the camera (24) and/or the navigation device (10) using readings from the positioning device (23) and the orientation sensors (27, 28), and to calculate the position of the navigation directions on the display (18) based on that position and orientation.
6. Navigation device according to claim 5, wherein the positioning device (23) is compatible with at least one of: the European Galileo system, any other global navigation satellite system, and location sensing technology based on ground-based beacons.
7. Navigation device according to any of claims 5-6, wherein the processing unit (11) calculates the orientation of the camera (24) with respect to a first rotational axis (C), which is substantially vertical in use, by comparing positions of the camera (24) and/or the navigation device (10) determined by the positioning device (23) at successive points in time.
8. Navigation device according to any of claims 5 to 6, wherein the navigation device (10) comprises a compass providing compass readings to the processing unit (11), the processing unit (11) being arranged to calculate an orientation of the camera (24) relative to a first rotational axis (C), which is substantially vertical in use, based on the compass readings.
9. A navigation device according to any of claims 5 to 8, wherein the orientation sensor comprises a tilt sensor (27, 28) to determine the orientation of the camera (24) relative to second and third axes of rotation, which are substantially horizontal in use.
10. Navigation device according to any of the preceding claims, wherein the processing unit (11) superimposes the navigation directions (3, 4, 5) on the camera image using a pattern recognition technique such that the positions of the navigation directions (3, 4, 5) are in a predefined spatial relationship with respective parts of the camera image.
11. The navigation device of claim 10, wherein the navigation device uses map data as input for the pattern recognition technique.
12. Navigation device according to any of the preceding claims, wherein the processing unit (11) uses camera stabilization techniques to compensate for vibrations in the camera feed.
13. Navigation device according to any one of the preceding claims, wherein the navigation device (10) is arranged to receive calibration corrections, to store these calibration corrections, and to apply the calibration corrections when combining the navigation directions (3, 4, 5) and the camera image.
14. Navigation device according to any one of the preceding claims, wherein the navigation device is arranged to receive or read in camera settings and to use the camera settings to calculate the position of the navigation directions (3, 4, 5) on the display (18).
15. Navigation device according to any one of the preceding claims, wherein the navigation device (10) is further arranged to receive feeds from more than one camera (24), and the navigation device (10) is arranged to select one of the feeds to be displayed on the display (18).
16. Navigation device according to any one of the preceding claims, wherein the camera (24) is sensitive to electromagnetic radiation outside the range of the electromagnetic spectrum visible to the naked human eye.
17. Navigation device according to claim 16, wherein the camera (24) is an infrared camera.
18. Navigation device according to any one of the preceding claims, wherein the camera (24) is arranged to zoom in and/or zoom out.
19. The navigation device of claim 18, wherein the camera is arranged to zoom in or out depending on, for example, the speed of the navigation device or vehicle.
20. A dashboard comprising a navigation device (10) according to any one of the preceding claims.
21. A vehicle comprising a navigation device (10) according to any one of the preceding claims.
22. A vehicle according to claim 21, wherein the vehicle comprises a vehicle tilt sensor (30) to determine the tilt of the vehicle and to provide a vehicle tilt reading to the navigation device (10).
23. A method of providing navigation directions, the method comprising:
displaying navigation directions (3, 4, 5) on a display (18),
characterized in that the method further comprises:
receiving a feed from a camera (24), and
displaying a combination of a camera image from the feed from the camera (24) and the navigation directions (3, 4, 5) on the display (18).
24. A computer program which, when loaded on a computer arrangement, is arranged to perform the method of claim 23.
25. A data carrier comprising a computer program according to claim 24.