WO2009002603A1 - Systems and methods for generating, storing and using electronic navigation charts - Google Patents
- Publication number
- WO2009002603A1 (PCT/US2008/061386)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- color
- image
- navigation
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/20—Arrangements for acquiring, generating, sharing or displaying traffic information
- G08G5/21—Arrangements for acquiring, generating, sharing or displaying traffic information located onboard the aircraft
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
Definitions
- the present invention relates generally to navigation charts, and more specifically to methods of generating, using, and storing navigation charts in an electronic form that is advantageous for use on multiple different platforms and in multiple different environments.
- Navigation charts are commonly used in aviation, marine, and land-based environments. Before the advent of the computer and electronic display systems, such navigation charts were exclusively produced in a paper form. After the development and widespread adoption of computers and electronic displays, it became common to publish the navigation charts in electronic formats. Such electronic formats allowed the navigation charts to be displayed on electronic displays, which reduced the need for bulky compilations of paper navigation charts, allowed virtually instantaneous access to any chart in a particular database of charts, and facilitated the updating of the charts.
- the present invention provides a method and system for rendering computerized navigation charts that overcomes the aforementioned problems with the prior art.
- the present invention is applicable to navigation charts for marine, terrestrial, and avionics environments. In all these fields, the present invention provides the capability of easily rendering navigation charts across a wide variety of different computer platforms with a reduced amount of computational power.
- the present invention allows for the display of aircraft navigation charts utilizing software classified as Level B under the DO-178B standards of RTCA.
- an electronic navigational display system includes a display, memory, data, a user interface, and a controller.
- the display is adapted to display information to a viewer.
- the memory stores an electronic image of a navigation chart that includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section.
- the electronic image of both sections is stored in a raster graphics format within the memory.
- Data is also included within the memory that specifies the locations of the first and second sections of the navigation chart within the image.
- the user interface is adapted to allow a user to select a display option in which only the first section of the navigation chart is displayed on the display.
- the controller is in communication with the user interface and is adapted to read the data and the electronic image from the memory and use the data to display the navigation chart according to the selected display option.
- an electronic navigational display system for a mobile vehicle.
- the navigational display system includes a display, a memory, a navigation system, a controller, and data stored in the memory.
- the display is adapted to display information to a user of the mobile vehicle while the user is inside the mobile vehicle.
- the memory stores an electronic image of a navigation chart in a raster graphics format, along with data corresponding to the electronic image.
- the data specifies a scale and a latitudinal and longitudinal reference for the electronic image of the navigation chart.
- the navigation system is adapted to determine a current position of the mobile vehicle, and the controller is adapted to read the electronic image from the memory and display the navigation chart on the display.
- the controller is further adapted to display the current position of the mobile vehicle as determined by the navigation system on the display in a manner in which the current position of the mobile vehicle is indicated on top of the electronic image of the navigation chart at a location that matches the vehicle's current position with respect to the electronic image of the navigation chart.
- an electronic repository of at least one navigation chart that includes a map section having a plan view of a map.
- the electronic repository includes a memory, image data, and first and second data fields.
- the image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format.
- the first data field is contained within the memory and is separate from the image data.
- the first data field specifies a scale for the map section of the image data wherein the scale allows a physical distance to be computed between a pair of pixels within the map section of the image data such that the physical distance computed between the pair of pixels can be converted to an actual distance between a pair of locations on the map corresponding to the pair of pixels.
- the second data field is contained within the memory and is separate from the image data.
- the second data field specifies a geographical reference for the map section of the image data such that a set of geographical coordinates can be determined from the geographical reference for any pixel within the map section of the image data.
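- As an illustration of how the scale and geographical reference fields could be used, the following sketch computes a ground distance from a pair of pixel locations. The structure and field names (ChartScale, pixelsPerInch, nauticalMilesPerInch) are assumptions made for the example and are not taken from the specification.

```cpp
#include <cmath>

// Hypothetical scale metadata for the map section; the actual field layout
// is described later with respect to FIGS. 4 and 5.
struct ChartScale {
    double pixelsPerInch;        // raster resolution of the stored image
    double nauticalMilesPerInch; // map scale of the plan-view section
};

// Physical (on-paper) distance between two pixels of the map section, in inches.
double paperDistanceInches(const ChartScale& s, int x1, int y1, int x2, int y2) {
    const double dx = (x2 - x1) / s.pixelsPerInch;
    const double dy = (y2 - y1) / s.pixelsPerInch;
    return std::sqrt(dx * dx + dy * dy);
}

// Actual distance between the two corresponding map locations, in nautical miles.
double groundDistanceNm(const ChartScale& s, int x1, int y1, int x2, int y2) {
    return paperDistanceInches(s, x1, y1, x2, y2) * s.nauticalMilesPerInch;
}
```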
- an electronic repository of at least one navigation chart that includes a first and a second section
- the first section includes a plan view of a map
- the second section includes text containing navigation information relating to the first section.
- the electronic repository includes a memory, image data, and first and second data fields within the memory. The first and second data fields are both separate from the image data.
- the image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format.
- the first data field identifies a location of the first section of the navigation chart within the image data
- the second data field identifies a location of the second section of the navigation chart within the image data.
- a method for converting a vector graphics file of a navigation chart into a raster graphics file wherein the navigation chart includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section.
- the method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file. Thereafter, the rendered image is converted into a plurality of pixels that each have a color value associated with them. A first set of pixels corresponding to the first section of the navigation chart and a second set of pixels corresponding to the second section of the navigation chart are both determined by using the vector graphics file of the navigation chart.
- a raster graphics file is stored in an electronic target location along with data relating to the color value of each of the plurality of pixels and data identifying the first and second sets of pixels.
- a method for converting a vector graphics file of a navigation chart into a raster graphics file. The method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file wherein the rendered image defines a plurality of object colors. The rendered image is converted into a plurality of pixels wherein at least one of the plurality of pixels has a non-object color different from the object colors.
- a color value for each of the plurality of pixels is determined, then the total number of color values are counted and compared to a predetermined threshold. If the total number of color values exceeds the predetermined threshold, the total number of colors is reduced by calculating a color distance between all of the color values, determining a frequency of a first color value and whether the first color value corresponds to an object color or a non-object color, and determining a frequency of a second color value and whether the second color value corresponds to an object color or a non-object color. Based on the calculated color distances and color frequencies, the following actions are taken: replacing the first color value with the second color value if the first color value corresponds to a non-object color and the second color value corresponds to an object color; or
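- A minimal sketch of one such color reduction pass is shown below, assuming the distinct colors have already been tallied; it merges each non-object (for example, anti-aliasing) color into the nearest object color by RGB distance. The names (ColorStats, reducePalette) and the specific distance metric are illustrative assumptions, not the exact routine of the specification.

```cpp
#include <cmath>
#include <cstdint>
#include <map>
#include <vector>

// One entry per distinct color found in the rendered image.
struct ColorStats {
    uint32_t value;    // packed 0x00RRGGBB
    size_t   count;    // frequency of use in the image
    bool     isObject; // true if the color is defined in the vector source
};

// Euclidean distance in RGB space between two packed colors.
static double colorDistance(uint32_t a, uint32_t b) {
    const int dr = int((a >> 16) & 0xFF) - int((b >> 16) & 0xFF);
    const int dg = int((a >> 8)  & 0xFF) - int((b >> 8)  & 0xFF);
    const int db = int(a & 0xFF)         - int(b & 0xFF);
    return std::sqrt(double(dr * dr + dg * dg + db * db));
}

// If the palette exceeds the threshold, replace every non-object color with
// its nearest object color. Frequency could further bias the choice when two
// candidates are nearly equidistant; that refinement is omitted for brevity.
void reducePalette(std::vector<uint32_t>& pixels,
                   const std::vector<ColorStats>& palette,
                   size_t maxColors) {
    if (palette.size() <= maxColors) return;

    std::map<uint32_t, uint32_t> remap;  // non-object color -> replacement
    for (const ColorStats& c : palette) {
        if (c.isObject) continue;
        double best = 1e300;
        uint32_t bestColor = c.value;
        for (const ColorStats& o : palette) {
            if (!o.isObject) continue;
            const double d = colorDistance(c.value, o.value);
            if (d < best) { best = d; bestColor = o.value; }
        }
        remap[c.value] = bestColor;
    }
    for (uint32_t& p : pixels) {
        const auto it = remap.find(p);
        if (it != remap.end()) p = it->second;
    }
}
```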
- a method of converting a vector graphics file of an aircraft navigation chart into a raster graphics file using a computer running on a Windows ® operating system includes loading the vector graphics file into the computer and using a GetDIBits function of the Windows operating system to determine a first set of pixels corresponding to an entire image of the aircraft navigation chart, a second set of pixels corresponding to a first portion of the aircraft navigation chart, and a third set of pixels corresponding to a second portion of the aircraft navigation chart wherein the second set of pixels includes a plurality of pixels not contained within the third set.
- the second and third sets of pixels are compared against the first set of pixels to determine if the pixels in the second and third sets are the same as the corresponding pixels in the first set. If they are not the same, any discrepancy between the pixels of the second and third sets and the pixels of the first set is flagged. If they are the same, a sufficient number of the pixels are saved in a raster graphics file to define an entire image of the navigation chart.
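- The sketch below shows the kind of GetDIBits call and pixel comparison such a process might use. It is a simplified illustration (error handling omitted, section rectangles assumed to be known); it is not the exact routine of the specification.

```cpp
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <cstdint>
#include <vector>

// Copy the pixels of a GDI bitmap into a top-down 32-bit buffer using
// GetDIBits. The same routine can be applied to a bitmap of the whole chart
// and to bitmaps rendered for individual sections, so the resulting buffers
// can be compared pixel by pixel.
std::vector<uint32_t> readPixels(HDC hdc, HBITMAP bitmap, int width, int height) {
    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;   // negative height => top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    std::vector<uint32_t> pixels(static_cast<size_t>(width) * height);
    GetDIBits(hdc, bitmap, 0, height, pixels.data(), &bmi, DIB_RGB_COLORS);
    return pixels;
}

// Compare a section's pixels against the corresponding region of the whole
// chart; a non-zero result would be flagged as a discrepancy.
size_t countDiscrepancies(const std::vector<uint32_t>& whole, int wholeWidth,
                          const std::vector<uint32_t>& section,
                          int left, int top, int secWidth, int secHeight) {
    size_t mismatches = 0;
    for (int y = 0; y < secHeight; ++y)
        for (int x = 0; x < secWidth; ++x)
            if (section[size_t(y) * secWidth + x] !=
                whole[size_t(top + y) * wholeWidth + (left + x)])
                ++mismatches;
    return mismatches;
}
```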
- the navigation charts may be aircraft navigation charts that include a section illustrating a profile view of a desired course of the aircraft.
- Data may be stored in memory identifying the location of the profile view section of the navigation chart within the raster graphics image.
- the data identifying the various sections of the navigation chart may be stored within the same electronic file as the image data, or it may be stored in a file separate from the image data.
- the aircraft navigation chart may also include information relating to flight minimums in another section, and data may be stored in memory specifying the location of the flight minimum information within the raster graphics image.
- a day and a night palette may also be stored in memory and accompany the raster graphics image of the navigation chart whereby the raster graphics image can be displayed with different colors depending upon the time of the day and/or ambient light conditions.
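- For example, the image could be stored once as palette indices, with the day or night palette selected at display time, as in this sketch; the structure and names are illustrative assumptions rather than the stored file format.

```cpp
#include <array>
#include <cstdint>
#include <vector>

// Indexed chart image with two palettes; the display picks a palette at draw
// time based on the time of day or ambient light conditions.
struct IndexedChart {
    int width = 0, height = 0;
    std::vector<uint8_t> indices;           // one palette index per pixel
    std::array<uint32_t, 256> dayPalette;   // 0x00RRGGBB entries
    std::array<uint32_t, 256> nightPalette;
};

// Expand the indexed image into RGB pixels using the selected palette.
std::vector<uint32_t> renderChart(const IndexedChart& c, bool night) {
    const auto& pal = night ? c.nightPalette : c.dayPalette;
    std::vector<uint32_t> out(c.indices.size());
    for (size_t i = 0; i < c.indices.size(); ++i)
        out[i] = pal[c.indices[i]];
    return out;
}
```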
- the size of the raster graphics image of the navigation chart can be reduced by lowering the number of color values for the pixels to a number less than or equal to a predefined threshold.
- the manner of reducing the number of color values for the pixels may involve altering the color values of selected anti-aliasing pixels such that the selected anti-aliasing pixels are assigned new color values that are the same as the color values of other pixels within the navigation chart image.
- the method and systems of the present invention provide improved electronic images of navigation charts that are more easily adapted to different computerized display platforms.
- the electronic images consume relatively small amounts of memory, can be rendered without undue computational demands, provide all the navigation information of prior art navigation charts, and can be manipulated in the same manners as the navigation chart images of the prior art.
- because the electronic images of the present invention do not require the computational and/or software resources necessary to render vector graphics images, the images of the present invention can be displayed on a wider variety of electronic devices than can be done with past images, including, but not limited to, cell phones, PDAs, wearable video displays and video glasses, portable media players such as iPods, and other similar devices.
- the reduced computational and software requirements necessary to display the charts of the present invention allow the charts to be incorporated into a client/server architecture where a client requests a particular chart and the server delivers it to the client.
- FIG. 1 is a block diagram of a navigational display system according to one aspect of the present invention
- FIG. 1A is a block diagram of a navigational display system for a mobile vehicle according to another aspect of the present invention
- FIG. 2 is an elevational view of a pair of flight deck displays that may be used in conjunction with the navigational display systems of FIGS. 1 or 1A
- FIG. 3 is an example of an aircraft navigation chart that may be used in accordance with various aspects of the present invention.
- FIG. 4 is a table of a first set of metadata inserted into either a raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
- FIG. 5 is a table of a second set of metadata inserted into either the raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
- FIG. 6 is a cell phone shown displaying a navigation chart that was read from a raster graphics file in accordance with the present invention;
- FIG. 7 is a block diagram of a chart conversion process for changing electronic navigation charts from a vector graphics file to a raster graphics format;
- FIG. 8 is a flowchart illustrating in greater detail a sequence of steps that may be followed in carrying out the chart conversion process of FIG. 7;
- FIG. 9 is a more detailed flowchart of the comparison method illustrated in FIG. 8;
- FIG. 10a is a diagram representing a generic example of a raster graphic navigation chart image stored as a plurality of pixels wherein each pixel has a thirty-two bit color value associated with it;
- FIG. 10b is a table tabulating a frequency of usage of each of the color values in the generic navigation chart example of FIG. 10a;
- FIG. 10c is an unreduced day color palette correlating an index value to all of the colors in the table of FIG. 10b that have a non-zero usage frequency;
- FIG. 10d is a reduced day color palette table illustrating a reduced set of color values produced after the color values in the palette of FIG. 10c have undergone a color reduction process;
- FIG. 11 is a flowchart of a color reduction process according to one aspect of the present invention
- FIG. 12a is a diagram illustrating in more detail an example of a color distance computation according to a color reduction process used with one aspect of the present invention
- FIG. 12b is a table arranging pairs of indexed color values in order from the smallest distance to the greatest distance
- FIG. 13a is a table illustrating a color mapping between day and night colors of the navigation chart as provided in an original vector graphics file of the navigation chart;
- FIG. 13b is the reduced day color palette table of FIG. 10d reproduced for ease of reference in conjunction with FIGS. 13c and 13d;
- FIG. 13c is a generic example of a daytime navigation chart image wherein three pixels having the same color value are highlighted;
- FIG. 13d is a generic example of a nighttime navigation chart image that corresponds to the daytime navigation chart image of FIG. 13c;
- FIG. 13e is a table illustrating a night palette for a raster graphics image of a navigation chart;
- FIG. 14 is a flowchart of a night palette creation process according to one aspect of the present invention;
- FIG. 15 is a diagram illustrating standard data blocks and data fields of a conventional bitmap computer file;
- FIG. 16 is a diagram illustrating various data blocks and data fields of a raster graphics file having an alternative format;
- FIG. 17 is a more detailed flowchart of the feedback method illustrated in FIG. 8;
- FIG. 18 is a more detailed flowchart of a bitmap-to-target (BMP2Target) process used in the feedback flowchart of FIG. 17; and
- FIG. 19 is a more detailed flowchart of a target-to-bitmap (Target2BMP) process used in the feedback flowchart of FIG. 17.
- Navigational display system 20 includes a controller 22, a memory 24, a user interface 26, and a display 28.
- Memory 24 contains one or more raster graphics files 25 that contain images of one or more navigation charts.
- navigational display system 20 is adapted to display these navigation charts on display 28 to a viewer.
- Navigational display system 20 allows a user to view an electronic image of a navigation chart in a variety of different environments.
- Navigational display system 20 may be incorporated into any known electronic device capable of displaying raster graphics image files, such as, but not limited to, a conventional computer, a cell phone, a personal digital assistant, a dashboard GPS display for an automobile, an electronic flight deck computer system of an aircraft or spacecraft, a laptop computer, an electronic navigational display for a surface or submersible marine vessel, or other types of electronic displays. Whatever device navigational display system 20 is incorporated into, it can be used to electronically display navigation charts in accordance with the principles described in more detail below.
- the display of navigation charts on navigational display system 20 can be performed during the operation of a mobile vehicle, such as an airplane, while the vehicle is moving. Alternatively, navigational display system 20 can be used to view charts from locations outside of a mobile vehicle. Regardless of where navigational display system 20 is utilized, a user can view any of a database of navigational charts stored in memory 24. User interface 26 allows a user to zoom in, zoom out, scroll up, down, left, and right, and rotate chart images while viewing the navigation charts displayed on display 28. Still further, as will be discussed in greater detail below, navigational display system 20 can automatically locate different sections of a navigation chart and display only those selected sections on display 28. Other capabilities of navigational display system 20 will be discussed further below.
- Navigational display system 20 may be modified to include a navigation system 30, such as is illustrated in FIG. 1A.
- FIG. 1A illustrates a modified version of navigational display system 20 that will be referred to as navigational display system 20'.
- Navigational display system 20' is especially useful for displaying navigational charts on a mobile vehicle, particularly while the mobile vehicle is moving.
- Navigation system 30 allows display system 20' to display the navigation charts on the display 28 in a manner in which the current position of the mobile vehicle is indicated by an icon or other symbol placed by system 20' on top of the navigational chart being displayed. This allows the operator of the mobile vehicle, which may be an aircraft, boat, or land-based vehicle, to see his or her current position with respect to the navigational chart.
- Navigational display system 20' includes all of the same components of display system 20 with the addition of navigation system 30, and all of these components operate in the same manner in both of the systems 20 and 20'. More details and features of navigational display systems 20 and 20' will be described below.
- FIG. 2 depicts an illustrative example of a pair of displays 28a and 28b that may be used in conjunction with either of navigational display systems 20 and 20'.
- Displays 28a and 28b each include a plurality of buttons 32 located adjacent a bottom edge 34 of the displays 28a and 28b.
- Buttons 32 constitute one form of user interface 26.
- Other types of user interfaces 26 may also be used in accordance with the present invention, including, but not limited to, computer mouses, touch screens, knobs, keyboards, joy sticks, voice recognition devices, and the like, as well as combinations thereof.
- Buttons 32 are selectively pressed by the operator of the display system 20 or 20' to control the information that is displayed on displays 28a and 28b. In the illustrated example of FIG. 2, buttons 32 are known as soft keys. That is, buttons 32 interact with controller 22 to change what is displayed on displays 28a and 28b based on a menu (not shown) that is displayed on displays 28a and 28b immediately above the buttons 32.
- An example of such soft keys that may be used in accordance with the present invention is disclosed in commonly assigned, co-pending PCT application serial number PCT/US2006/021390, entitled AIRCRAFT AVIONIC SYSTEM HAVING A PILOT USER INTERFACE WITH CONTEXT DEPENDENT INPUT DEVICES, filed June 2, 2006 in the United States receiving office, the complete disclosure of which is hereby incorporated herein by reference.
- User interface 26 interacts with controller 22 to cause controller 22 to display different images on either or both of displays 28a and 28b. While FIG. 2 depicts two displays 28a and 28b, it will be understood by those skilled in the art that the invention is applicable to systems having only a single display, or systems having two or more displays. Further, the type of display used in accordance with the present invention can vary widely, from conventional LCD type displays to organic light-emitting diode (OLED) displays to cathode ray tubes (CRTs) to projection displays to head-up displays (HUD) to plasma screens to any other known type of electronic display.
- Display 28a in FIG. 2 may be a primary flight display (PFD) for an aircraft while display 28b may be a multi-function display (MFD) for an aircraft.
- display system 20 can be used in environments other than mobile vehicles, and, even when display systems 20 or 20' are used on a mobile vehicle, they can be applied to other mobile vehicles besides aircraft. Further, when display systems 20 and 20' are used in conjunction with an aircraft, the present invention can be applied to different displays other than the MFD or PFD within the aircraft cockpit.
- the display 28 (or displays) used in accordance with the present invention is configured to be able to display a navigation chart useful for the particular activity the navigation chart relates to, such as flying, boating, driving, or other activities.
- Memory 24 may be any conventional type of electronic memory such as, but not limited to, RAM, ROM, flash memory, a compact disc, a DVD, a hard drive, an SD or Compact Flash card, a USB portable data stick, a floppy disk, a holographic versatile disc (HVD) or any type of electronic memory capable of being read by a computer, regardless of whether the memory is fixed within the computer or removable from it.
- Controller 22 may include one or more conventional microprocessors and may be a conventional computer, such as a PC or other known type of computer. In some applications, controller 22 may alternatively be a specialized computer or computer system specifically adapted for controlling various aspects of the overall system in which it is incorporated. For example, if display system 20 is incorporated into a personal digital assistant (PDA), controller 22 would include the processor or processors inside the PDA that perform the conventional functions of the PDA. Alternatively, if display system 20 were incorporated into a cell phone, controller 22 would include the processor(s) inside the cell phone that ran the phone's conventional software and/or firmware.
- controller 22 may be one or more of the processing components of an electronic flight deck control system that displays such information as aircraft attitude, altitude, heading, position, radio information, engine parameters, a crew alerting and warning system (CAWS) list, weather, and the like to the pilot.
- Additional environments into which navigational display system 20 can be incorporated include projection cell phones capable of projecting images onto a surface, such as, but not limited to, cell phones using the PicoProjector available from Microvision of Redmond, Washington.
- Navigational display system 20 can further be incorporated into wearable video displays and video glasses, such as, but not limited to, the iLounge™, available from Myvu Corporation of Westwood, Massachusetts, and the Lumus PD-10™, available from Lumus Ltd. of Rehovot, Israel.
- controller 22 can include one or more processors that perform a wide variety of other functions in addition to the rendering of navigation chart images.
- any controller 22 is suitable for the present invention so long as it is capable of reading the raster graphics files 25 that contain the navigation charts and displaying these charts on display 28 in response to some form of prompting which may come from user interface 26, or some other source, such as an electronic signal from a system or subsystem that monitors the stage of a particular journey.
- While the types of environments in which navigational display system 20 may be implemented can vary, as noted above, the following discussion of the types of navigation charts that may be displayed on navigational display system 20 will primarily be made with respect to aircraft navigation charts. It will be understood that this discussion is for purposes of illustration only, and is not intended to limit the scope of the invention to avionic applications.
- Navigation chart 36 is a conventional instrument approach chart for an aircraft published by Jeppesen Inc. of Englewood, Colorado.
- Navigation chart 36 includes a plurality of different sections, including a header section 38, a map plan view section 40, an aircraft profile section 42, and an aircraft minimums section 44 that specifies various minimum information for landing the aircraft.
- Navigation chart 36 is available from Jeppesen Inc. in both a paper format and an electronic format.
- navigation chart 36 is provided as a vector graphics file that includes text written in True Type fonts. As was briefly mentioned in the Background of the Invention section, the use of the vector graphics format and True Type fonts in navigation charts limits the ability of the charts to be conveniently rendered on many navigational display systems.
- Navigational display systems 20 and 20' are configured to be able to conveniently render navigational charts that are originally provided in a vector graphics format and that use True Type fonts. Navigational display systems 20 and 20' accomplish this without requiring the computational resources necessary for rendering vector graphics files, without requiring extensive re-working of the graphics display platform of a particular controller 22, and also while more easily allowing the software used to display navigation chart 36 to achieve a higher DO-178B level rating, such as Level B.
- Memory 24 of navigational display system 20 has stored in it raster graphics file 25, which contains an image of navigational chart 36.
- the stored image includes header section 38, map plan view section 40, aircraft profile view section 42, and aircraft minimums section 44.
- Memory 24 also stores metadata 27 (FIG. 1) within it that identifies which pixels in the raster graphics file correspond to each of the sections 38, 40, 42, and 44.
- Metadata 27 may be stored in a file separate from raster graphics file 25, such as illustrated in FIGS. 1 and 1A, or it may be stored within raster graphics file 25 itself.
- the term "metadata” is used herein to generally refer to data that describes other data, such as data that describes the image data of the navigation charts. It will be understood, however, that the term “data” as used herein can refer to either data or metadata.
- controller 22 may be programmed to allow a pilot to choose, via user interface 26, any one or more sections 38-44 for display on display 28.
- for example, the pilot could instruct controller 22, via user interface 26, to display only map plan view section 40 on display 28.
- controller 22 would read the metadata 27 from memory 24 that identifies which pixels in the raster graphics file 25 correspond to map plan view section 40 and display only those pixels on display 28. This would allow the pilot to more easily focus on only the map plan view section 40 of navigation chart 36.
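- A sketch of that selective display, assuming the metadata records each section as a pixel-space rectangle, is shown below; the names (SectionRect, cropSection) are illustrative and are not taken from the specification.

```cpp
#include <cstdint>
#include <vector>

// Pixel-space rectangle of a chart section, as recorded in the metadata.
struct SectionRect { int left, top, right, bottom; };

// Copy only the pixels of one section (e.g. map plan view section 40) out of
// the full chart image so that just that section can be scaled to the display.
std::vector<uint32_t> cropSection(const std::vector<uint32_t>& chart,
                                  int chartWidth,
                                  const SectionRect& r) {
    const int w = r.right - r.left, h = r.bottom - r.top;
    std::vector<uint32_t> out(static_cast<size_t>(w) * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            out[size_t(y) * w + x] =
                chart[size_t(r.top + y) * chartWidth + (r.left + x)];
    return out;
}
```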
- the pilot could select any other one of sections 38-44 for display by itself on display 28, or he or she could select any combination of two or more sections 38-44 for simultaneous display on display 28.
- the pilot can also, of course, have the entire navigational chart 36 displayed at one time on display 28.
- controller 22 and user interface 26 are also configured to allow the pilot (or other user of display system 20) to zoom in or zoom out on whatever portion of navigation chart 36 that is being displayed on display 28 (i.e. zooming in and out can be done regardless of whether the entire navigational chart 36 is being displayed, or only selected sections 38-44 of it).
- navigational display system 20' may be configured to display the aircraft's current location on top of navigational chart 36 so that a pilot can immediately see his or her location with respect to navigation chart 36, as will be described more below.
- Navigational display systems 20 and 20' can also overlay a planned flight plan on top of the navigational chart, if desired.
- Navigation system 30 of display system 20' may be any conventional navigation system, such as, but not limited to, a GPS-based navigation system or an inertial reference system.
- navigation system 30 may include one or more accelerometers, gyroscopes, magnetometers, radio beacon receivers, or any other conventional navigation equipment used on aircraft.
- Navigation system 30 determines the current location of the mobile vehicle with respect to a known reference system, such as latitude and longitude, or a GPS coordinate system, or any other reference system which can determine a position in a manner that can be correlated to navigation chart 36.
- map plan view section 40 of navigation chart 36 includes various navigation landmarks, such as a river 52, an airport 54, a VOR station 56, an intersection 58 (KILBY), and a plurality of potential obstacles 60.
- Map plan view section 40 also includes geographic references that tie the information displayed in section 40 to an external reference system.
- map plan view section 40 includes latitude markings 46 and longitude markings 48 which indicate the position of the map's contents with respect to the Earth's latitude and longitude references.
- map plan view section 40 is drawn to a known scale. This known scale, along with the latitude and longitude markings 46 and 48, may be stored as part of metadata 27. When done so, this metadata allows controller 22 to display on display 28 the current position of the aircraft (or other type of mobile vehicle on which navigational display system 20' is implemented).
- One example of the display of the mobile vehicle's current position on top of navigational chart 36 is illustrated in FIG. 2.
- Display 28b is shown displaying a map plan view section 40 of a navigation chart 36 (which is a different chart than the specific one illustrated in FIG. 3).
- An aircraft icon 50 is also shown on display 28b at a location west (to the left) of river 52 and northeast of airport 54.
- Controller 22 overlays the aircraft icon 50 on top of navigation chart 36 at a location on the navigation chart that coincides with the aircraft's current position with respect to the map plan view section of navigation chart 36.
- the controller 22 will use the metadata 27 containing the latitudinal and longitudinal references to display the aircraft icon 50 on top of plan view section 40 at the 50 degree, 10 minute north latitude and 90 degree, 32 minute west longitude position on the map. This will allow the pilot to immediately see his or her current position with respect to the items that are included within the map plan view section 40, such as river 52, airport 54, etc.
- Controller 22 updates the position of aircraft icon 50 on display 28 as the aircraft moves. This updating may take place at a rate of several times a second, although other rates may also be used. The updating is based on the information controller 22 receives from navigation system 30. Thus, in the example of FIG. 2, if the aircraft continues flying north, controller 22 will repetitively adjust the position of aircraft icon 50 upwards on display 28b while the underlying image of the map plan view section 40 remains stationary. The aircraft icon 50 will therefore move upward across display 28b in accordance with the corresponding movement of the aircraft through the sky. The visual effect presented to the pilot is thus similar to what the pilot would see if he were physically located at a distance above the aircraft and looking down at the Earth, which was represented by the map plan view 40 of the navigation chart 36.
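- The placement of aircraft icon 50 can be thought of as a simple interpolation between the chart's latitude/longitude bounds and the pixel bounds of plan view section 40, as in the following sketch. The names are illustrative assumptions, and the sketch assumes the plan view is drawn to scale on a locally linear projection.

```cpp
// Geographic bounds of the plan-view section and its pixel rectangle within
// the raster image; both would come from metadata 27.
struct GeoBounds { double latSouth, lonWest, latNorth, lonEast; };
struct PixelRect { int left, top, right, bottom; };
struct PixelPos  { int x, y; };

// Map a current position (decimal degrees, west longitude negative) to the
// pixel at which the aircraft icon should be drawn.
PixelPos positionToPixel(double lat, double lon,
                         const GeoBounds& geo, const PixelRect& px) {
    const double u = (lon - geo.lonWest)  / (geo.lonEast  - geo.lonWest);
    const double v = (geo.latNorth - lat) / (geo.latNorth - geo.latSouth);
    PixelPos p;
    p.x = px.left + static_cast<int>(u * (px.right  - px.left));
    p.y = px.top  + static_cast<int>(v * (px.bottom - px.top));
    return p;  // the icon is redrawn here each time navigation system 30 updates
}
```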
- controller 22 can react in any of a variety of different manners, including removing aircraft icon 50 from the display, issuing a warning to the pilot, searching for another navigation chart 36 that corresponds to the geographic region into which the aircraft has moved and automatically displaying such a navigation chart (if located), removing the display of navigation chart 36, indicating and updating the distance the aircraft has flown out of the range of the chart, or any other action that would be appropriate for the situation.
- aircraft icon 50 can be varied within the present invention.
- icon 50 can be adapted to reflect the visual appearance of the specific type of aircraft.
- any other suitable non-aircraft icon or indication can be used to display the current position of the aircraft on navigation chart 36.
- if navigational display system 20' is used on a mobile vehicle that is not an aircraft, aircraft icon 50 can be replaced with an icon representing the type of vehicle on which system 20' is implemented, e.g. a boat, a car, an RV, etc., or any other type of indication that provides a visual cue to the operator of the mobile vehicle of the vehicle's current location with respect to the underlying navigation chart 36.
- navigation chart 36 may include one or more insets 62 on the plan view map section 40 (FIG. 2).
- Insets 62 may display a variety of different types of information, such as an enlargement of a particular area of the map or textual information relating to a particular area of a map.
- memory 24 will store the location of each and every inset 62 in a particular navigation chart as part of metadata 27.
- this metadata may be stored within the raster graphics file 25 that contains the image data for the navigation chart 36, or it may be stored separately from the raster graphics file 25.
- metadata 27 could be stored in a memory separate from memory 24.
- controller 22 will read this information and repetitively check to see if the current location of the aircraft has moved to a position that lies over one of the insets 62. If it does, controller 22 will react in any one of a variety of different manners. In one embodiment, if the inset 62 contains an enlargement of a particular section of a map and that inset happens to be scaled with geographic references, controller 22 will shift the position of aircraft icon 50 to the geographically proper location within the inset 62. This shifting may optionally involve a change in the size of icon 50.
- controller 22 may simply remove aircraft icon 50 from display 28 until the aircraft moves to a location that no longer falls within the region encompassed by inset 62.
- controller 22 may continue to display icon 50 on display 28 at a location that coincides with the latitudinal and longitudinal marks 46 and 48 outside the inset 62, despite the fact that the location of icon 50 might not represent the aircraft's actual location with respect to the interior of inset 62.
- This continued display of icon 50 could involve a change in its color or other attribute in order to give the pilot a visual indication that the location of icon 50 is not necessarily accurate within the area defined by inset 62.
- Other variations are possible as well.
- while navigation chart 36 of FIG. 3 has been divided into four sections — header 38, map plan view 40, profile view 42, and minimums 44 — it will be understood by those skilled in the art that the present invention does not limit the specific number of sections into which any particular navigation chart 36 may be divided. Indeed, it would be possible to divide the navigation chart 36 of FIG. 3 into more or fewer sections than the four illustrated therein.
- the top row of information in the navigation chart of FIG. 3 contains various radio frequency information, including the radio frequencies for the Automatic Terminal Information Service (ATIS), the Green Bay approach, the Minneapolis Center, the Green Bay tower, etc. This entire row of radio frequency information could be considered a separate section of navigation chart 36. Or this row could be further subdivided into smaller sections.
- for each navigation chart 36, memory 24 will store the location of each section as part of metadata 27. That is, memory 24 will store in metadata 27 sufficient information to identify which pixels in the raster graphics file 25 correspond to each and every different section of the chart 36. This data will allow controller 22 to selectively display, upon prompting via user interface 26, each of the sections of chart 36 either individually or in any desired combination, as was discussed above.
- Each raster graphics file 25 contains a raster graphics image of one or more navigation charts 36.
- Each raster graphics image includes a plurality of pixels that, when combined together in the proper arrangement, create the image of the navigation chart 36.
- the specific format of the file containing the raster graphics image of the navigation chart 36 can vary within the scope of the invention.
- the raster graphics file 25 may be stored as a conventional bitmap file.
- Other types of file formats may also be used, and the invention contemplates tailoring the format of the raster graphics file 25 and accompanying metadata 27 to the specific needs and formats required by a particular navigational display system 20 or 20', or other device that may display an image of the navigation chart 36.
- metadata 27 may contain information identifying a geographic reference for a chart, a scale, the location of different map sections, and the location of different insets within a given map.
- This list of information that may be stored within metadata 27 is only an illustrative example of the types of information that may be stored in memory. Changes and additions to this list are within the scope of the invention.
- An example of one set of metadata 27 that may be stored for an aircraft navigation chart is listed in the tables of FIGS. 4 and 5.
- the metadata identified in FIGS. 4 and 5 is divided into a plurality of data fields 64. Each of the data fields 64 is identified in the leftmost column of FIGS. 4 and 5.
- the size of the data field in bytes is listed in the second column from the left, followed by a short description of the data field in the next column to the right, and an indication of the type of data field in the right-most data column.
- the data fields 64 listed in FIGS. 4 and 5 are merely illustrative of the types of data fields 64 that may be used in accordance with the present invention. In other words, the precise number and types of data fields 64 that comprise metadata 27 may vary substantially from that depicted in FIGS. 4 and 5, including additional metadata not illustrated in FIGS. 4 and 5. Further, the size of the data fields can be varied, along with the field types that define the format of the metadata in the data fields 64. The meaning of the data fields 64 of FIGS. 4 and 5 will now be described.
- the m_tiches data field (FIG. 4) specifies the size of the chart in thousandths of inches.
- the data in the m_tiches data field may be formatted in a manner referred to as a "Magnitude2d" type of field, which specifies a first set of four bytes that identifies the width of the navigation chart in thousandths of inches, and a second set of four bytes that identifies the length of the navigation chart in thousandths of inches.
- the m_whole data field (FIG. 4) identifies the location and extent of the whole navigation chart 36 in whatever coordinate system the navigation chart 36 uses.
- the metadata within the m_whole data field may be formatted in a manner referred to as a "Rect" type of field, which specifies a first set of eight bytes that identifies the coordinates of the lower left corner of the entire navigation chart 36 and a second set of eight bytes that identifies the coordinates of the upper right corner of the entire navigation chart 36.
- the m_angleToRotateHeader data field (FIG. 4) identifies what angle the image data of the navigation chart 36 will need to be rotated (if any) in order for the image to be presented on a display with the header section 38 oriented toward the top of the display.
- the m angleToRotateHeader data field is useful where some navigation charts 36 may be oriented in a landscape orientation and other ones may be oriented in a portrait orientation. Controller 22 can read the metadata in the m_angleToRotateHeader data field and use this to automatically display the navigation chart 36 in the proper orientation, thereby relieving the viewer of the task of having to re-orient the navigation chart manually through buttons 32, or some other type of user interface 26.
- the m_angleToRotateHeader data field may be formatted in a manner referred to as a "float" type of data field, which simply refers to a four byte floating point number.
- the m_isToScale data field (FIG. 4) identifies whether the navigation chart 36 is drawn to scale or not.
- the metadata 27 within the m_isToScale data field may be stored as a single bit (or byte) in which a value of one means navigation chart 36 is drawn to scale and a value of zero means navigation chart 36 is not drawn to scale, or vice versa. This format is referred to as "bool" in FIG. 4. If the m_isToScale data field indicates that the navigation chart 36 is drawn to scale, then additional metadata will be stored in memory 24, such as that identified in FIG. 5. This additional metadata will be discussed more below with respect to FIG. 5.
- the m_sizeOfMetadata data field (FIG. 4) identifies the total size of the metadata 27 that accompanies the image data of the navigation chart 36. As mentioned above, FIGS. 4 and 5 identify all of the metadata 27 that may accompany a particular navigation chart 36. In some situations, the particular fields 64 of metadata 27 that accompany a given navigation chart 36 will vary from one chart to another, such as when some charts are drawn to scale and other charts are not drawn to scale.
- the m_sizeOfMetadata field identifies the total size of whatever metadata 27 happens to accompany a particular image of a navigation chart 36.
- the m_sizeOfFilename data field (FIG. 4) identifies the size of the file name that will be used in the target system.
- the target system refers to the particular display system that will be displaying the navigation chart.
- the metadata in the m_sizeOfFilename data field may be stored as an unsigned integer ("unsigned int").
- the m_pFilename data field (FIG. 4) identifies the file name of the raster graphics file 25.
- This data field 64 allows the target raster graphics file 25 to be correlated to the name originally given to a particular navigation chart by the vendor or supplier of the vector graphics file of that navigation chart.
- the size of this data field is variable and determined by the value stored in the m_sizeOfFilename data field, discussed above.
- the metadata 27 in the m_pFilename data field may be stored as a string of unsigned characters ("unsigned char").
- FIG. 5 depicts additional metadata that may usefully be stored in memory 24 (or another memory accessible to controller 22) if the data field m_isToScale (FIG. 4) indicates that the navigation chart 36 is drawn to scale. If the navigation chart 36 is not drawn to scale, the data fields 64 of FIG. 5 may be omitted in their entirety.
- the m_header data field (FIG. 5) identifies the location and extent of header section 38 of navigation chart 36. Specifically, the m_header data field identifies which pixels in the image of navigation chart 36 correspond to header section 38.
- the metadata within this field may be stored in the "Rect" format, which defines a first set of eight bytes of data that identify the lower left coordinates of a rectangle and a second set of eight bytes of data that identify the upper right coordinates of the rectangle.
- the specific coordinates used to identify these two locations may be based on the coordinate reference system used in the m_whole data field (discussed above).
- the m_plan data field (FIG. 5) identifies which pixels in the image of navigation chart 36 correspond to the map plan view section 40.
- the metadata in the m_plan data field may be stored as two sets of eight bytes wherein the first set identifies the coordinates of the lower left corner of the rectangle of plan view section 40 and the second set identifies the coordinates of the upper right corner of the rectangle of plan view section 40.
- the m_planLatLon data field (FIG. 5) identifies the location and extent of the map plan view section 40 in latitudinal and longitudinal coordinates.
- the "LatLonRect" field type may be defined as a first set of sixteen bytes that identifies the latitude and longitude of the lower left comer of the map plan view section 40, and a second set of sixteen bytes that identifies the latitude and longitude of the upper right corner of the map plan view section 40.
- the m_isProfilePresent data field (FIG. 5) identifies whether navigation chart 36 includes a profile section 42 or not. This information may be stored as a single bit (or byte) wherein a zero value indicates that navigation chart 36 does not include a profile view section 42 and a one value indicates that chart 36 does include a profile view section, or vice versa.
- the m_profile data field (FIG. 5) identifies the location and extent of the aircraft profile view section 42 of navigation chart 36, if such a section 42 is present in chart 36.
- the metadata in the m_profile data field may be stored as a first set of eight bytes that defines the lower left coordinates of the profile view section 42 and a second set of eight bytes that defines the upper right coordinates of the profile view section 42.
- the m_minimum data field (FIG. 5) identifies the location and extent of the aircraft minimums section 44 of navigation chart 36.
- the metadata in the m_minimum data field may be stored as a first set of eight bytes that defines the lower left coordinates of the minimums section 44 and a second set of eight bytes that defines the upper right coordinates of the minimums section 44.
- the m_numberOfInsets data field (FIG. 5) identifies the number of insets 62 that are present (if any) within the plan view section 40 of navigation chart 36.
- the metadata in the m_numberOfInsets data field may be stored as a four byte unsigned integer.
- the m_pInset data field (FIG. 5) identifies the location and extent of each of the insets 62 that are contained within the plan view section 40 of navigation chart 36.
- the m_pInset data field will occupy a size that is dependent upon the actual number of insets 62 within the plan view section 40 of a given navigation chart 36.
- a first set of eight bytes may be used to identify the lower left coordinates of the inset and a second set of eight bytes may be used to identify the upper right coordinates of the inset wherein the coordinates are specified in the same coordinate reference system used in the other metadata fields (e.g. the m_whole data field).
- the m_pInsetLatLon data field (FIG. 5) also identifies the location and extent of each of the insets 62 that are contained within the plan view section 40 of navigation chart 36.
- the m_pInsetLatLon data field differs from the above-described m_pInset data field in that it defines the lower left corner and upper right corner of the inset 62 in latitudinal and longitudinal reference coordinates. It will be understood that, if the target navigational display system is configured to operate using a latitudinal and longitudinal reference system, rather than some other particularized reference system, the m_pInset data field could be omitted while retaining the m_pInsetLatLon data field.
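- One way the fields of FIGS. 4 and 5 could be represented in code is sketched below. The C++ types mirror the "Magnitude2d", "Rect", and "LatLonRect" formats described above; the exact byte layout, field ordering, and corner representation in the stored file are assumptions made for illustration only.

```cpp
#include <cstdint>
#include <vector>

struct Point2d     { int32_t x, y; };                  // eight bytes per corner (type assumed)
struct Rect        { Point2d lowerLeft, upperRight; }; // "Rect" field type
struct LatLon      { double latitude, longitude; };    // sixteen bytes per corner
struct LatLonRect  { LatLon lowerLeft, upperRight; };  // "LatLonRect" field type
struct Magnitude2d { uint32_t width, height; };        // "Magnitude2d" field type

struct ChartMetadata {
    // Fields of FIG. 4
    Magnitude2d m_tiches;               // chart size in thousandths of inches
    Rect        m_whole;                // extent of the entire chart
    float       m_angleToRotateHeader;  // rotation to put the header at the top
    bool        m_isToScale;            // the FIG. 5 fields apply only when true
    uint32_t    m_sizeOfMetadata;
    uint32_t    m_sizeOfFilename;
    std::vector<char> m_pFilename;      // variable length, m_sizeOfFilename bytes

    // Fields of FIG. 5 (present only when m_isToScale is set)
    Rect        m_header;
    Rect        m_plan;
    LatLonRect  m_planLatLon;
    bool        m_isProfilePresent;
    Rect        m_profile;
    Rect        m_minimum;
    uint32_t    m_numberOfInsets;
    std::vector<Rect>       m_pInset;        // one entry per inset
    std::vector<LatLonRect> m_pInsetLatLon;  // one entry per inset
};
```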
- the data fields 64 illustrated in FIGS. 4 and 5 are merely illustrative of the types of metadata 27 that may be stored in memory 24. Different types of data fields, different numbers of data fields, and different formats for the data fields may be used in accordance with the present invention. Further, the precise location where data fields 64 are stored in memory can also be varied within the scope of the present invention.
- the metadata 27 of data fields 64 that accompanies a particular navigation chart 36 are stored in memory 24 as part of the raster graphics file 25 that contains the image of that particular navigation chart 36. That is, the raster graphics file 25 that contains the image data for a navigation chart also includes the metadata 27 for that particular chart.
- the data fields 64 that accompany a particular chart are stored in memory 24 in a file separate from the raster graphics file 25 that contains the image data of the particular chart.
- controller 22 will read two different files when displaying a particular navigation chart on display 28: the raster graphics file 25 containing the image of the navigation chart, and a separate file containing the metadata of data fields 64 that correspond to that navigation chart.
- one or both of these two files should include information that allows the data fields 64 to be correlated to a particular raster graphics file 25.
- metadata could be stored in a memory separate from memory 24, if desired.
- the particular number, kind, and format of the data fields 64 that accompany a given navigation chart 36 may vary considerably depending upon the particular form of the navigation chart 36. While the aircraft navigation chart 36 of FIG. 3, which includes rectangular sections 38-44, has been referenced herein, the present invention is applicable to navigation charts 36 that are divided into sections having shapes other than rectangles. For example, some navigation charts 36 might include one or more circular sections, or one or more square sections, or some other type of polygon or non-polygonal shape. If it is desirable for controller 22 to be able to recognize these non-rectangular shapes (such as for display purposes), then additional data fields 64 identifying which pixels in the raster graphics file 25 of the navigation chart correspond to those variously shaped sections would be stored as metadata 27.
- these additional data fields 64 could be varied, but could include data identifying a center point and a radius for the circular sections, two corner locations for the square sections, and whatever metadata 27 that would be necessary to define the location of whatever other types of shaped sections navigation chart 36 contained.
- the specific data fields 64 that accompany a given navigation chart can therefore be tailored to the particular layout and information contained with a given navigation chart.
- navigation charts 36 may include multiple sections
- the present invention contemplates using additional data fields 64 like those described above to store information about the scale and/or geographic coordinate system of those multiple sections.
- the present invention contemplates storing, in addition to the raster graphics image data of a navigation chart, any type of further metadata 27 about the chart that may be useful for controller 22 to know about the image data for purposes of facilitating the display of the navigation chart to the viewer.
- FIG. 6 depicts another example of one of the many possible manifestations of navigational display system 20 according to the various aspects of the present invention.
- navigational display system 20 is incorporated into a conventional cell phone 63.
- Cell phone 63 includes a display area 65 and a plurality of keys 67.
- display area 65 of cell phone 63 corresponds to display 28
- keys 67 correspond to user interface 26
- the internal memory and microprocessor of cell phone 63 (not shown) correspond to memory 24 and controller 22, respectively, of display system 20.
- Display area 65 of cell phone 63 is illustrated in FIG. 6 displaying an aircraft navigation chart 36 (different from the one of FIG. 3). More specifically, display area 65 of cell phone 63 is illustrated in FIG. 6 displaying the map plan view section 40 and aircraft profile view section 42 of an aircraft navigation chart. The entire navigation chart (which would include header section 38 and minimums section 44) is not shown because the user has pressed the appropriate keys 67 to cause the controller 22 within cell phone 63 to automatically display only sections 40 and 42 of the navigation chart. Further, as can be seen, the display of sections 40 and 42 is not merely a zooming in on these sections of the map, but rather a display in which the sections surrounding sections 40 and 42 have been cut out of the displayed image.
- a user of display system 20 does not need to undergo the trial-and-error process of manually zooming and scrolling the image of the navigation chart until the appropriate section or sections are displayed. Instead, the user can press a button (or other type of user interface), and controller 22 will automatically display only the desired section at a size that fills, to the extent possible, the viewing area of the display.
- the precise keys 67 used to manipulate the image of the navigation chart can be varied within the scope of the invention. In general, it is desirable to allow keys 67 to be able to zoom in and out on the navigation chart, automatically display different sections of the chart, and scroll the image of the chart up, down, and side-to-side.
- controller 22 may include one or more conventional microprocessors programmed to read raster graphics file 25 (and the accompanying metadata 27, if separate from file 25) from memory 24 and cause the associated display 28 to display the image of the navigation chart contained within raster graphics file 25.
- the microprocessor would be programmed to allow the navigation chart image to be manipulated in the manners described herein (zooming, scrolling, selectively displaying sections, etc) based on inputs from user interface 26.
- the software necessary to carry out these functions may vary from device to device, but would be well within the ability of a person of ordinary skill in the art to devise without undue experimentation.
- Controller 22 can also be programmed to download additional navigation charts 36 from one or more databases. This downloading can take place over any computer network, including the Internet. Controller 22 can be configured in a client/server architecture where the database of navigation charts in raster graphics format acts as a server in responding to requests from controller 22. Such an arrangement would allow for the downloading of individual navigation charts in an "on-demand" time frame, i.e. charts could be downloaded right at the moment they are needed. This "on-demand" feature would greatly improve the prior methods of distributing navigation charts, particularly aircraft navigation charts, which have to be purchased in bulk subscriptions, rather than on a chart-by-chart basis.
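- As a purely illustrative sketch of the "on-demand" client/server arrangement described above, the following Python fragment requests a single chart from a hypothetical chart server; the server URL, path layout, and file extension are assumptions made for the example, not part of the disclosed system.

```python
# Hypothetical on-demand chart download; the URL scheme is an assumption.
import urllib.request

def fetch_chart(chart_id: str, server_url: str = "https://charts.example.com") -> bytes:
    """Request a single raster chart file from a chart database server."""
    url = f"{server_url}/charts/{chart_id}.bmp"
    with urllib.request.urlopen(url) as response:
        return response.read()

# Usage: raw = fetch_chart("SOME-CHART-ID"); the bytes would then be stored in memory 24.
```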
- Chart conversion method 66 begins with a vector graphics file 68 of one or more navigation charts 36 that are stored in any type of conventional memory device.
- Vector graphics file 68 is fed into a computer 70, which may be a conventional personal computer or any other type of computer capable of being programmed to carry out the functions described herein.
- the manner in which the vector graphics file 68 (or files) are transferred to computer 70 can vary widely, and could include having computer 70 read the vector graphics files 68 directly from its own internal memory, transferring the vector graphics files to computer 70 via a network (including an Internet connection), physically transporting a memory device (such as a disk, DVD, CD-Rom, flash memory device, etc) to computer 70 and coupling the memory device to computer 70 in the appropriate manner, or still other methods.
- Raster graphics file 25 contains a set of raster graphics image data that corresponds to the navigation chart. That is, raster graphics file 25 contains the color information for each of the pixels that, when combined together, create a picture or image of the navigation chart.
- the metadata 27 is the same metadata that was discussed above with respect to data fields 64, and, as was discussed previously, may vary depending upon the layout and composition of a particular navigation chart, as well as what sections of the navigation chart it may be desirable for controller 22 to be able to automatically display by themselves.
- Electronic repository 76 may be the same as memory 24, but it also includes a wider variety of devices beyond memories specifically associated with a controller, user-interface and display, such as controller 22, user-interface 26, and display 28. More specifically, electronic repository 76 may be a stand-alone memory device having no associated controller, display, or user-interface. Such stand-alone memory devices include, but are not limited to, such devices as a floppy disk, a hard drive, a DVD, a CD-Rom, a flash memory device, or similar type of devices.
- electronic repository 76 may be a memory inside of a specific device, such as a memory contained within a portable digital assistant (PDA), a portable media player, a cell phone, a computer (laptop or desktop) or any other type of known device capable of electronically storing the raster graphics file 25 and metadata 27.
- Repository 76 may also be connected to the Internet, or other local or wide area network.
- Computer 70 may be programmed to combine, for each navigation chart 36, the metadata 27 with the raster graphics file 25. If programmed in this manner, computer 70 will output a single raster graphics file 25 for each navigation chart 36.
- computer 70 may be programmed to store the raster graphics file 25 and metadata 27 separately, in which case computer 70 will generate two files for each chart 36, or two sets of files for a database of multiple charts.
- the data within electronic repository 76 may be transferred to a mobile vehicle 78, which, as noted previously, could be an air, terrestrial, or marine vehicle.
- Mobile vehicle 78 may contain navigational display systems 20 or 20', or it may contain a display system different from navigational display systems 20 or 20'.
- the manner in which the data from repository 76 is transferred to mobile vehicle 78 can vary substantially within the scope of the invention.
- files 25 and metadata 27 are transmitted wirelessly to a memory onboard mobile vehicle 78 (such as memory 24).
- repository 76 might be physically transported to mobile vehicle 78 and connected to a computer on board mobile vehicle 78, such as may occur if repository 76 takes the form of a conventional Secure Digital (SD) card, a Compact Flash card, a portable USB (Universal Serial Bus) memory drive, or some other similar type of memory device.
- the mobile vehicle can display images of the navigation charts 36 via its on-board display system.
- the on-board display system may be navigational display system 20 or 20', or it may be a different type of on-board display system.
- the navigation chart(s) 36 are stored as raster graphics files 25 with accompanying metadata 27 (either within the file itself or separate), as opposed to vector graphics files 68.
- the computational resources required by the on-board display system of the mobile vehicle to render the navigation chart 36 from the raster graphics file 25 and metadata 27 are substantially less than would be required to render the navigation chart 36 from a vector graphics file 68.
- the navigation chart can be displayed on a wider variety of on-board display systems because the on-board display systems do not need to be able to handle complex tasks, such as the rendering of True Type fonts or the processing of vector graphics files written in specialized formats.
- higher safety ratings for the software that renders the navigation chart can be more easily achieved (such as those specified in DO-178B) because the software necessary to render an image from the raster graphics file 25 and metadata 27 is simpler.
- FIG. 8 illustrates a flowchart summarizing in greater detail the series of steps computer 70 may be programmed to follow in order to carry out the chart conversion process 66 outlined in FIG. 7.
- the steps illustrated in FIG. 8 are only one specific manner in which computer 70 may be programmed to carry out various aspects of the present invention, and it will be understood that computer 70 could be programmed to convert the vector graphics files 68 to raster graphics files 25 and metadata 27 in a variety of different manners.
- Chart conversion process 66 begins with a vector graphic file 68 that contains vector graphics image data 80 of a navigation chart 36.
- the vector graphics image data 80 contains the information that defines the image of the navigation chart 36 using the vector graphics method of defining images.
- Vector graphics file 68 further includes a day palette 72 and a night palette 74. Day and night palettes 72 and 74 define the colors that are used to render the image of the navigation chart.
- Day and night palettes 72 and 74 are an optional component of vector graphics file 68.
- a navigation chart may only have a single color palette associated with it, in which case vector graphics file 68 would include only that single palette, rather than two palettes.
- It is also possible that vector graphics file 68 includes multiple palettes, such as day and night palettes, or no palettes at all.
- Day and night palettes generally refer to color palettes that are used to render a navigation chart image during different times of the day, or during other times when the ambient lighting surrounding a display, such as display 28, changes.
- the day and night palettes 72 and 74 of vector graphics file 68 define two different sets of colors, the former a set of colors appropriate for displaying during high light level conditions, such as the day time, and the latter appropriate for display during low light level conditions, such as at night.
- computer 70 renders an unverified day chart image from the vector graphics file 68.
- a “day chart” refers to a navigation chart that is rendered using the colors specified in day palette 72. As noted, these colors generally make the chart more easily viewable during the day time hours.
- a “night chart”, which is rendered at step 90, refers to a chart that is rendered using the colors specified in night palette 74, which are generally appropriate for viewing at night time. The information contained in a day chart and the corresponding night chart is the same. The only difference is the selection of colors used when displaying the chart.
- the rendering of the day chart in step 82 may be accomplished in any of a variety of known manners.
- the rendering takes place inside a memory of computer 70.
- Such an internal rendering of the day chart may be accomplished using known techniques, such as the GetDIBits function of the Microsoft Windows® operating system. Other known functions of the Windows® operating system may also be used to render the chart at step 82.
- the rendering of the day chart defines a set of pixels that, when combined in the appropriate manner, create an image replicating the image of the navigation chart. While the present invention contemplates that the day chart could be rendered in step 82 with a variety of different resolutions, one acceptable resolution is to render the day chart using 2,048 pixels along the longest side of the day chart.
- If the navigation chart is square, the rendering at step 82 will result in the creation of an image having 2,048 x 2,048 pixels. If one side of the chart is shorter than the other side, then the longer side will have 2,048 pixels and the shorter side will be divided into a smaller number of pixels corresponding to the shorter length of that side of the image.
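- A minimal sketch of how the render dimensions might be derived so that the longest side receives 2,048 pixels while preserving the chart's aspect ratio; the function name and rounding behavior are illustrative assumptions.

```python
# Sketch: scale the chart's aspect ratio so the longest side is 2,048 pixels.
def render_dimensions(chart_width: float, chart_height: float, longest: int = 2048):
    """Return (width_px, height_px) preserving the chart's aspect ratio."""
    if chart_width >= chart_height:
        return longest, max(1, round(longest * chart_height / chart_width))
    return max(1, round(longest * chart_width / chart_height)), longest

# Example: a chart twice as tall as it is wide renders at 1024 x 2048 pixels.
print(render_dimensions(8.5, 17.0))   # -> (1024, 2048)
```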
- the rendering of the day chart at step 82 may be accomplished with the assistance of a conventional graphics card, such as would be known by one of ordinary skill in the art.
- One suitable system for rendering the day chart in step 82 is the ATI Catalyst® graphics software for Microsoft Windows that is available from Advanced Micro Devices of Sunnyvale, California.
- Other graphics software may also be used within the scope of the present invention.
- the result of rendering the day chart at step 82 is the definition of a plurality of pixels. Each of these pixels has a specific color associated with it.
- the result of step 82 is the creation of pixels that are defined by a 32 bit quad RGB value.
- the 32 bit quad RGB format uses 8 bits to define a red value, 8 bits to define a green value, 8 bits to define a blue value, and 8 bits that are not used.
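- For illustration, a small Python sketch of packing and unpacking such a 32 bit quad RGB value, assuming the bit layout later described with reference to FIG. 10a (blue in bits 0-7, green in bits 8-15, red in bits 16-23, top byte unused):

```python
# Sketch of the 32 bit quad RGB layout: red in bits 16-23, green in 8-15,
# blue in 0-7, and bits 24-31 unused (layout per FIG. 10a as described below).
def pack_quad_rgb(red: int, green: int, blue: int) -> int:
    return (red & 0xFF) << 16 | (green & 0xFF) << 8 | (blue & 0xFF)

def unpack_quad_rgb(value: int):
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF  # (R, G, B)

assert unpack_quad_rgb(pack_quad_rgb(0x8F, 0x6A, 0x2D)) == (0x8F, 0x6A, 0x2D)
```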
- the number of colors in the day chart that is rendered at step 82 will likely, but not necessarily, be different than the number of colors defined in the day palette 72 of vector graphics file 68.
- the vector graphics files of aircraft navigation charts marketed by Jeppesen, Inc. of Englewood, CO contain night and day palettes 72 and 74 that each contain a maximum of 32 colors.
- the resulting day chart created at step 82 will typically have more than 32 colors.
- the various conventional graphics software that can be used within the present invention to render the day chart will typically add additional colors for anti-aliasing purposes. Some of the pixels defined in step 82 will therefore have colors that have been created for antialiasing purposes.
- anti-aliasing colors will likely be different than the original colors specified in the day palette 72 of the vector graphics file 68. As will be explained in more detail below, the present invention, in one embodiment, limits the number of anti-aliasing colors so that the generated raster graphics file 25 consumes a reduced amount of memory.
- Comparison step 84 is undertaken by computer 70 in order to verify that the rendered day chart has been properly rendered and contains no artifacts.
- Comparison step 84 produces a verified image of the daytime version of the navigation chart.
- computer 70 runs a color reduction and palettization process 86.
- the color reduction and palettization process 86 will be described in greater detail below with respect to FIGS. 10, 11, and 12.
- color reduction and palettization process 86 will reduce the number of anti-aliasing colors produced during step 82 to a predetermined threshold. Further, process 86 will create a day color palette to which each of the individual pixels will be indexed.
- the result of process 86 will be a raster graphics file that uses less memory than it otherwise would if the file were created directly from the verified image data output at step 84.
- the color reduction and palettization process 86 is thus an advantageous process, but not a critical step in chart conversion method 66.
- At step 88, computer 70 creates a night palette.
- the creation of the night palette at step 88 is dependent upon the rendering of the night chart in step 90, as will be explained further below.
- the rendering of the night chart at step 90 is performed in the same manner as the rendering of the day chart at step 82. The only difference is that different colors are used in the night color rendition than in the day color rendition.
- the rendering of the night chart at step 90 may be followed by an optional comparison step 92 which is carried out in the same manner as step 84.
- the night palette created at step 88 will define a color for each of the pixels in the day image of the navigation chart. Each of the pixels will have an associated index value that corresponds to one of the colors in the night palette.
- At step 94, computer 70 extracts the data from vector graphics file 68 that is necessary to define the metadata 27.
- this metadata 27 may include the information listed in FIGS. 4 and 5, merely a fraction of this information, or additional information beyond what is listed in FIGS. 4 and 5.
- the contents of the metadata 27 may vary depending upon the specific type of navigation chart.
- Injection step 96 combines the metadata 27 with the image data and palette data that was created at step 88.
- the metadata 27 may be stored separately from the image and palette data. If this separate storage is desired, then step 96 would be omitted and the metadata 27 would be saved into whatever memory (such as memory 24) it was desired to store it in. This can be done at step 101.
- the day image, night palette, and day palette from step 88 may also be saved in the memory at step 101 without combination with the metadata at step 96.
- step 98 creates a conventional bitmap file in which the pixels corresponding to the navigation chart are stored as image data in the conventional image data block of the bitmap file, and the metadata is stored in an optional data block that is part of the conventional definition of the bitmap file standard. This will be described in more detail below with respect to FIG. 15. Alternatively, if it is desired to convert the bitmap file into a raster graphics file having a format different than the bitmap format, this can also be done.
- Such a different format may be desirable for different types of target display systems.
- the details of converting a bitmap file into a different type of target file will be described in more detail below with respect to FIG. 17, along with an optional feedback process 100 that may be used to confirm that the raster graphics file was properly generated.
- An illustrative example of a raster graphics file (RAS file 192) in a format different from the bitmap file created at step 98 will also be described in more detail below with respect to FIG. 16.
- FIG. 9 illustrates in greater detail the process involved in comparison steps 84 and 92.
- FIG. 9 will be described with reference to comparison step 84, which corresponds to the comparison step undertaken with respect to the day chart image created at step 82. It will be understood, however, that the following description is equally applicable to comparison step 92, which is used in conjunction with the night chart image generated at step 90.
- the comparison step illustrated in FIG. 9 involves a first image capture method 104 and a second image capture method 106. Both of the image capture methods 104 and 106 result in the definition of pixels that create an image of the navigation chart 36.
- the first image capture method 104 involves defining the pixels for the entire navigation chart 36.
- the second image capture method 106 involves defining a plurality of pixels of two different parts of the navigation chart that together make up a complete image of the chart.
- second image capture method 106 may define the pixels for the top half of the navigation chart in a first step and define the pixels for the bottom half of the navigation chart in a second step.
- second image capture method 106 could involve tiling the image of the navigation chart into more than two different pieces. The individual tiles of the image would then be pieced back together to define an entire image of the navigation chart.
- the number of tiles can be varied from two to any number greater than two.
- image capture method 106 may involve capturing the top half of the image separately from the bottom half, it is also possible to capture the left half separately from the right half of the image of the navigation chart. Alternatively, still different portions of the image may be individually captured with second method 106.
- At step 108, computer 70 determines whether the results of steps 104 and 106 match. If the images do match, then computer 70 selects the image generated by either step 104 or step 106 and proceeds to the next step in chart conversion method 66 (FIG. 8) using the selected image data.
- If the images do not match, computer 70 indicates this mismatch to the operator of computer 70.
- the operator may then instruct computer 70 to re-start the steps illustrated in FIG. 8 to see if a repetition of the steps will result in a match at step 84 (or 92).
- the computer may take other actions in response to a mismatch from capture methods 104 and 106.
- The purpose of steps 104, 106, and 108 is to help ensure that the rendering steps 82 and 90 have generated a plurality of pixels that accurately represent the image of the navigation chart. While comparison steps 84 and 92 are both optional in the present invention, they add a degree of safety to the overall conversion process of the present invention. This added safety can be especially helpful when attempting to certify the methods of the present invention to meet industry standard safety levels, such as those set forth in the DO-178B or DO-200A standards. More particularly, comparison steps 84 and 92 help ensure that no artifact is introduced into the pixel data during the previous image rendering of steps 82 and 90. This is accomplished by rendering the entire image at step 104 and rendering various pieces at step 106, which are then re-combined.
- If the rendering of the image at step 104 introduces any visual artifact, such as a Microsoft Windows® logo, window, message, or any other undesirable item not part of the navigation chart, comparison step 108 will likely detect this. The artifact will appear at different locations within the images produced by steps 104 and 106, which is what allows step 108 to detect it. Thus, for example, if step 104 introduces an artifact in the lower left corner of the image of the entire navigation chart, second image capture method 106 will also produce the same artifact in the lower left corner of each of the pieces of the image that are captured during step 106.
- If step 106 captures the image by separately capturing the top half of the image and then separately capturing the bottom half of the image, each of the two halves of the image will include the same artifact in its lower left hand corner.
- When the top half and bottom half of the image captured in step 106 are combined together into a single image of the entire navigation chart, there will be two artifacts: one in the lower left hand corner of the top half, and one in the lower left hand corner of the bottom half.
- Consequently, when the entire image captured in step 106 is compared with the entire image captured at step 104 in comparison step 108, they will not match.
- the output of step 104 will have a single artifact in the lower left hand corner, while the output of step 106 will have two artifacts.
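- A simplified sketch of the comparison logic of steps 104-108 is shown below; render_region is a placeholder for whichever capture method is actually used, and the captured images are assumed to be returned as lists of pixel rows.

```python
# Sketch of steps 104-108: capture the whole chart once, capture it again as a
# top half and a bottom half, reassemble the halves, and require an exact match.
# render_region(x, y, width, height) is a placeholder returning a list of pixel rows.
def verify_rendering(render_region, width: int, height: int) -> bool:
    whole = render_region(0, 0, width, height)                      # first capture method 104
    top = render_region(0, 0, width, height // 2)                   # second capture method 106...
    bottom = render_region(0, height // 2, width, height - height // 2)
    reassembled = top + bottom                                      # ...pieced back together
    return whole == reassembled                                     # comparison step 108
```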
- FIG. 10a depicts a raster graphics day image 110 having a height 112 and a width 114. While image 110 in FIG. 10a is a blank image, image 110 would normally be an image of a navigation chart 36. For example, image 110 could be an image of the chart illustrated in FIG. 3. Alternatively, image 110 could be an image of any navigation chart 36 that is desired to be used in accordance with the methods and systems of the present invention.
- Image 110 is the image that is generated at step 82 and it may either be verified at step 84, or the step of verification may be omitted. Image 110 is fed into the color reduction and palettization process 86.
- Image 110 consists of a plurality of pixels 116.
- FIG. 10a only illustrates a fraction of the pixels 116 which comprise the entire image 110.
- the precise number of pixels 116 that can be used to define image 110 can vary within the scope of the present invention. As noted, however, one embodiment of the present invention defines 2,048 pixels along the longer edge of image 110. In the illustration of FIG. 10a, the height 112 dimension of image 110 would thus be divided into 2,048 pixels, as it is longer than width dimension 114. Other numbers of pixels can be used within the scope of the present invention.
- Each pixel 116 has a color value associated with it. While the length of this color value can be varied within the scope of the present invention, a 32 bit length will be used for purposes of discussing FIGS. 10a-10e.
- FIG. 10a illustrates a 32 bit color value 118 in which bits 0-7 define a blue value, bits 8-15 define a green value, bits 16-23 define a red value, and bits 24-31 are unused.
- Image 110 will consume an amount of memory equal to the number of pixels 116 multiplied by 32 bits (not counting the palette data). Because this may be an unacceptably large amount of memory, the size of color values 118 may be reduced via process 86 in a manner that is illustrated more clearly in FIGS. 10b-10e and 11.
- computer 70 tabulates all of the different color values 118 that result from the rendering of step 82.
- An example of one such tabulation is depicted in FIG. 10b.
- the number of colors tabulated may be greater than the number of colors originally defined in the day palette 72 of vector graphics file 68. This is because the rendering step 82 may create a number of anti-aliasing colors that are added to image 110. The total number of colors in image 110 therefore may exceed that in the original vector graphics file 68.
- Unreduced day palette 119 is a table that includes all of the color values 118 that are used in image 110 and omits all of the color values that are not used in image 110. In the example illustrated in FIG. 10c, there are 390 different colors defined in image 110 (0 through 389).
- computer 70 After computer 70 has generated unreduced day palette 119, computer 70 utilizes a color reduction process 120 illustrated in FIG. 11 to create a reduced day palette 144. At step
- computer 121 counts the total number of colors in unreduced day palette 119.
- Process or step 126 may best be understood with an example.
- Suppose, for example, that the unreduced day palette 119 contained 155 colors (rather than the 390 listed in FIG. 10c).
- In that case, at step 126, computer 70 would store an eight bit index value (which can have 256 different values) for each pixel 116 that corresponded to the correct color value for that pixel in palette 119.
- If computer 70 determines at step 122 (FIG. 11) that there are more color values in unreduced day palette 119 than the predetermined threshold number, then it moves to step 128.
- At step 128, computer 70 calculates the color distance between each and every pair of different colors in unreduced day palette 119.
- the number of color distances calculated at step 128 will be equal to (N)(N-1)/2, wherein N is the number of color values in palette 119.
- Computer 70 would therefore compute (390 x 389)/2 color distances. This is equal to 75,855 color distances.
- FIG. 12a illustrates that the distance between each color is found by squaring the difference between each of the individual color components in the color values 118.
- the color values 118 consist of a red value, a green value, and a blue value.
- the color distance calculation involves squaring the difference between the red values in a pair, squaring the difference between the green values in the pair, and squaring the difference between the blue values in the pair. These squared values are then summed together, and their square root may optionally be taken. Taking the square root would produce a true color distance, but this is not necessary because the pairs of colors will be arranged in a distance table 127 (FIG. 12b) in order of their distances, and that ordering is the same whether or not the square root is taken.
- As used herein, the term color distance will refer to the actual color distance or the square of the actual color distance, as well as any other value that correlates to the color distance in a manner that does not alter the ordering of the color distances from shortest to longest, or vice versa.
- FIG. 12a illustrates a calculation of the squared distance between the pair of color values having index entries of 158 and 200.
- the 158 color value has a hexadecimal value of 0x8F6A2D.
- the 200 color value has a hexadecimal value of 0x91 A434.
- the squared difference between the red values is first computed. In this case, the squared distance between the red values is equal to the square of the difference between hexadecimal 8F and hexadecimal 91.
- Next, the squared distance between the green values is computed. This distance is equal to the square of the difference between hexadecimal 6A and hexadecimal A4.
- Finally, the squared distance between the blue values is calculated. This squared distance is equal to the square of the difference between hexadecimal 2D and hexadecimal 34. The sum of these squared distances between the red, green, and blue values is then determined. The sum is equal to the squared distance between the color values indexed at entries 158 and 200. As noted, this distance may be left as a squared value, or a square root could be taken to determine an actual distance. In the illustration of FIGS. 12a and 12b, the square root is not taken because this requires less computation and the ordering of the results is the same as when the square root is determined.
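- The calculation of FIGS. 12a and 12b can be expressed compactly in code; the sketch below reproduces the worked example for palette entries 158 (0x8F6A2D) and 200 (0x91A434) and yields the squared distance of 3417 listed in color distance table 127.

```python
# Squared color distance of FIG. 12a for two 32 bit quad RGB values.
def squared_color_distance(rgb_a: int, rgb_b: int) -> int:
    dr = ((rgb_a >> 16) & 0xFF) - ((rgb_b >> 16) & 0xFF)   # red components
    dg = ((rgb_a >> 8) & 0xFF) - ((rgb_b >> 8) & 0xFF)     # green components
    db = (rgb_a & 0xFF) - (rgb_b & 0xFF)                   # blue components
    return dr * dr + dg * dg + db * db

print(squared_color_distance(0x8F6A2D, 0x91A434))   # prints 3417, matching FIG. 12b
```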
- computer 70 determines the distance (or distance squared) between each of the color values in unreduced palette 119. After computer 70 has made its calculation of color distance, it arranges the color distances (or color squared distance) in color distance table 127 in a manner starting from the smallest color distance (or color squared distance) to the largest distance (or color squared distance).
- FIG. 12b illustrates an example of such a color distance table.
- the color index values 158 and 200 refer to colors that are separated by a squared distance of 3417.
- the pair of color index values 388 and 389 refer to colors that are separated by a color distance squared of 36.
- the color index values 0 and 1 are separated by a distance squared of 9.
- the color pair consisting of index color values 200 and 209 has the shortest distance between its colors. Specifically, the distance between these colors is only 1. While color distance table 127 is shown in FIG. 12b as only containing six separate distance entries, as mentioned above, it would actually contain (390 x 389)/2 total entries. These additional entries have been omitted for purposes of ease of illustration.
- After computer 70 has computed the color distance (or color distance squared) between each pair of color values at step 128, it moves on to step 130 (FIG. 11), where it determines which color pair has the shortest distance (or shortest distance squared) between it. As noted, in the example of FIG. 12b the color pair with the shortest distance between it consists of colors having index values of 200 and 209. After determining the color pair with the shortest distance at step 130, computer 70 determines at step 132 (FIG. 11) whether both of the color values in that pair are object colors.
- object color refers to a color that was originally defined in the day palette 72 of vector graphics file 68.
- If both colors in the pair are object colors, computer 70 leaves those two color values unchanged and returns to step 130, where it then determines the color pair with the next shortest distance. For example, in reference to FIG. 12b, computer 70 will first determine at step 132 whether index colors 200 and 209 are both object colors or not. If both of these colors are object colors, then computer 70 would return to step 130 and determine the pair of colors with the next shortest distance between them, which in this case would be the colors with index values 0 and 1. As can be seen in FIG. 12b, index colors 0 and 1 have a color distance squared of 9. Computer 70 would then determine whether the colors with index values of 0 and 1 were both object colors at step 132. If they were, it would return to step 130 and find the color pair with the next shortest distance. This pattern continues until computer 70 eventually locates a color pair in which at least one of the colors is not an object color.
- At step 134, computer 70 determines whether both of the colors in the color pair are non-object colors. (A non-object color is a color not defined in the day palette 72 of vector graphics file 68.) If both of the colors are non-object colors, computer 70 proceeds to step 136. At step 136, computer 70 replaces the less frequently used color value in the pair with the more frequently used color value in the pair. For example, if the particular color pair consisted of color index values 0 and 1, as referenced in FIG.
- color index value 0 is used in 15,840 pixels, while color index value 1 is used for only 20 pixels.
- the 20 pixels that were previously assigned a color index value of 1 (which corresponds to the 32 bit RGB value 0x000003) would be re-assigned to the index color value 0 (which corresponds to the hexadecimal color value 0x000000).
- the total number of colors in the day palette 119 would be reduced by one.
- At step 138, computer 70 determines whether the reduced number of color values is now equal to or less than the color threshold 124. Because step 136 replaces one color value with another color value, color palette 119 now consists of one less color value than it had prior to step 136. Step 138 determines whether this reduced number of color values is equal to or less than threshold number 124. If it is, computer 70 jumps to step 126. If it is not, computer 70 returns to step 130, where it then determines the color pair having the next shortest distance. The next shortest color pair will be the next shortest pair out of those colors that still remain in palette 119.
- If computer 70 determines at step 134 (FIG. 11) that the color values in a particular pair are not both non-object colors (which, in conjunction with step 132, means that one color is an object color and one is not an object color), then it proceeds to step 140.
- At step 140, computer 70 replaces the non-object color with the object color in day palette 119.
- Computer 70 then proceeds to step 138, which determines whether the reduced number of colors in palette 119 is less than or equal to threshold number 124. The outcome of that determination dictates whether computer 70 then proceeds to step 126 or continues to repeat the cycle of steps that start at step 130.
- the only colors that are changed in process 120 are the anti-aliasing colors that were added during the chart rendering of step 82. Further, the anti-aliasing colors that are changed are those that are close to other colors in the unreduced day palette 119 and that are used relatively less frequently than the colors that are not changed. This results in a reduced visual impact on the image 110.
- the reduction process 120 of FIG. 11 therefore preserves the original colors of vector graphics file 68 while minimally impacting the anti-aliasing colors introduced at step 82.
- each of the pixels 116 can be correlated with an 8 bit color index value (FIG. 10e). This reduces the amount of memory necessary to store image 110.
- the result of step 82 may be an image 110 in which each pixel 116 is represented by a 32 bit color value 118.
- After color reduction process 120, each pixel 116 will be represented by an indexed color value 142 (FIGS. 10d-10e) that can be stored as an 8 bit value. Consequently, color reduction process 120 will reduce the data necessary to store image 110 by approximately 75 percent (from 32 bits per pixel to eight bits per pixel plus a color palette).
- each pixel 116 in the reduced image of the navigation chart has an indexed color value 142 associated with it, rather than a direct color value.
- This index color value 142 identifies an entry in reduced day palette 144, such as is illustrated in FIG. 10d.
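- A condensed, illustrative sketch of the reduction loop of FIG. 11 (steps 128 through 140) appears below. It assumes palette is a list of 32 bit RGB values, counts records how many pixels use each entry, and object_colors is the set of colors taken from the vector day palette 72; the default threshold of 256 and the accumulation of usage counts when colors are merged are assumptions not spelled out in the flowchart.

```python
from itertools import combinations

def _sq_dist(a: int, b: int) -> int:
    """Squared RGB distance between two 32 bit quad RGB color values."""
    return sum((((a >> s) & 0xFF) - ((b >> s) & 0xFF)) ** 2 for s in (16, 8, 0))

def reduce_palette(palette, counts, object_colors, threshold=256):
    """palette: list of RGB ints; counts: pixels using each entry;
    object_colors: set of RGB ints from vector day palette 72;
    threshold: threshold 124 (assumed to be 256 here)."""
    alive = set(range(len(palette)))
    remap = {}                                    # dropped index -> surviving index
    # color distance table 127: every pair of palette entries, shortest first (step 128)
    pairs = sorted(combinations(range(len(palette)), 2),
                   key=lambda p: _sq_dist(palette[p[0]], palette[p[1]]))
    for i, j in pairs:                            # step 130: take the next-shortest pair
        if len(alive) <= threshold:
            break                                 # step 138: threshold reached
        if i not in alive or j not in alive:
            continue
        i_obj, j_obj = palette[i] in object_colors, palette[j] in object_colors
        if i_obj and j_obj:
            continue                              # step 132: never merge two object colors
        if not i_obj and not j_obj:               # step 136: keep the more frequently used
            keep, drop = (i, j) if counts[i] >= counts[j] else (j, i)
        else:                                     # step 140: keep the object color
            keep, drop = (i, j) if i_obj else (j, i)
        remap[drop] = keep
        counts[keep] += counts[drop]              # accumulate usage (detail assumed here)
        alive.discard(drop)
    return alive, remap                           # surviving entries and pixel re-mapping
```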
- a process 154 for creating a raster graphics night palette is depicted in FIGS. 13 and 14.
- the original vector graphics file 68 of the navigation chart 36 includes a mapping 146 that correlates day palette 72 with night palette 74, such as is illustrated in FIG. 13a.
- Day/night mapping 146 maps the colors of the day palette 72 of the navigation chart 36 to the night colors in night palette 74.
- Day/night mapping 146 provides information enabling a user of vector graphics file 68 to render a night image. Specifically, day/night mapping 146 enables the user of vector graphics file 68 to replace the day color values when a night time rendition of the navigation chart 36 is desired. While day/night mapping 146 illustrates only 32 colors, it will be understood that the present invention is applicable to vector graphics files that include more or less than 32 colors.
- computer 70 renders a raster graphics image of a night version 152 (FIG. 13d) of the navigation chart 36 at step 90 (FIG. 8).
- Step 90 is carried out in the same manner as step 82 using the night palette 74 of vector graphics file 68 rather than the day palette 72 (which is used in step 82).
- comparison step 92 ensures that the rendered image is an accurate representation of the corresponding navigation chart.
- the night image 152 is used to create the night palette at step 88.
- While the night palette 74 of the vector graphic file 68 in the example illustrated in FIG. 13a includes only 32 colors, the raster graphic image 152 of the night chart that is rendered will likely include more than 32 colors. This is because the conventional software and/or hardware that may be used to render the image into a raster graphic image will likely add anti-aliasing colors to the raster graphic night image 152. The raster graphic night image 152 will therefore likely include more than the original 32 colors specified in the vector graphic night palette 74.
- FIG. 14 depicts a raster graphic night palette creation process 154 that is carried out by computer 70.
- Raster graphic night palette creation process 154 utilizes the reduced, raster graphics day palette 144, such as that illustrated in FIG. 1Od.
- Process 154 begins by choosing one of the index values in reduced day palette 144. While the initial index value chosen can be any of the values in palette 144, an initial value of zero will be chosen for purposes of discussion herein. This index value will be referred to as value X in FIG. 14 and the accompanying discussion. While any initial value of X may be chosen, and the order of selecting subsequent index values from palette 144 can vary in any manner, night palette creation process 154 will eventually address every index value in day palette 144. It therefore will be more convenient to describe process 154 with an index value X that starts at zero and increments to the highest value in palette 144.
- index value X identifies an entry in the reduced, raster graphics day palette
- X will be incremented from 0 all the way up to the highest index value in palette 144, which is 255. Thus, in the example of FIG. 14, X will be incremented from 0 to 255 during the night palette creation process 154.
- Night raster graphic palette creation process 154 begins at step 156, where a set of pixels D in the raster graphics day image 110 is identified; specifically, the pixels having an index value of X are identified. With respect to the example depicted in FIG. 13b, computer 70 identifies at step 156 the 15,840 pixels that have a color index value of 0, which corresponds to the color value 0x000000. At step 158, computer 70 identifies all of the pixels in the night image 152 that have the same physical location within the image as the pixels in set D. These pixels constitute a set N.
- At step 160, computer 70 determines whether the color defined by the index value X is an object color in the vector graphics day palette 72. If it is, computer 70 proceeds to step 164. If it is not, computer 70 proceeds to step 166.
- At step 166, computer 70 determines the average color value of all of the pixels in the set N.
- This average value is the average of the various color components.
- the colors are defined as shades of red, green, and blue, the red values are averaged, the green values are averaged, and the blue values are averaged.
- the average of these red, green, and blue values define an average color.
- At step 168, computer 70 sets the raster graphics night palette entry for the index value X equal to the average color value determined at step 166. Thereafter, computer 70 increments the value of X at step 170. After incrementing X at step 170, computer 70 determines at step 172 whether X is equal to the threshold color value 124 discussed previously. If it is, the raster graphic night palette creation process 154 is complete and the entire night palette 178 has been created. If it isn't, computer 70 returns to step 156 and the cycle depicted in FIG. 14 repeats until the entire night palette 178 has been created.
- If it is determined at step 160 that the color defined by index value X is an object color in the vector graphics day palette 72, computer 70 passes to step 164.
- At step 164, computer 70 determines whether any of the pixels 116 in the set N have a color value that is listed in the vector graphics night palette 74. If it is determined at step 164 that none of the pixels in set N have a color value from the vector graphics night palette 74, then computer 70 proceeds to step 166 and follows the procedures of step 166, as described previously. If computer 70 determines at step 164 that at least one of the pixels 116 in set N has a color value listed in the vector graphics night palette 74, then computer 70 proceeds to step 174.
- At step 174, computer 70 sets the raster graphics night palette entry having the index value X equal to the value in the vector graphics night palette 74 that corresponds to the vector graphics day color with the same index value X. Thereafter, computer 70 proceeds to increment X at step 170 in the manner that has been described previously.
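- A condensed, illustrative sketch of night palette creation process 154 (FIG. 14) follows; the container types, the handling of unused palette entries, and the argument names are assumptions made for the sake of the example.

```python
def build_night_palette(day_indices, night_image, day_palette,
                        object_colors, day_to_night, vector_night_colors):
    """day_indices: rows of 8 bit palette indices (the reduced day image 110);
    night_image: rows of 32 bit RGB values (night rendering 152);
    day_palette: reduced raster day palette 144, mapping index -> RGB;
    object_colors: the colors of vector day palette 72;
    day_to_night: day/night mapping 146, day RGB -> night RGB;
    vector_night_colors: the colors of vector night palette 74."""
    night_palette = {}
    for x, day_rgb in day_palette.items():                         # index value X
        # sets D and N: day pixels with index X, and the night pixels at the same locations
        n = [night_image[r][c]
             for r, row in enumerate(day_indices)
             for c, idx in enumerate(row) if idx == x]
        if day_rgb in object_colors and any(p in vector_night_colors for p in n):
            night_palette[x] = day_to_night[day_rgb]               # step 174: use mapping 146
        elif n:                                                    # steps 166/168: average set N
            avg = [round(sum((p >> s) & 0xFF for p in n) / len(n)) for s in (16, 8, 0)]
            night_palette[x] = (avg[0] << 16) | (avg[1] << 8) | avg[2]
        else:                                                      # unused entry (assumption)
            night_palette[x] = day_to_night.get(day_rgb, day_rgb)
    return night_palette
```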
- FIGS. 13a and 13b illustrate an example of the night palette creation process 154.
- the index value X has been set to 145.
- the selection of the value X equal to 145 has been made herein merely for purpose of illustration and does not connote any significance with respect to any of the other values to which X may be set.
- computer 70 identifies a set of pixels D in the raster graphic day image 110 that have a color defined by a color index value X.
- an index value of X equal to 145 identifies an RGB color value of 0x8F6A2D.
- the reduced day palette 144 indicates that there are 3 pixels having this color value in the raster graphic day image 110.
- FIG. 13c illustrates the physical location of these 3 pixels, which are labeled a1, a2, and a3. It should be noted that, although day image 110 in FIG. 13c is depicted as a blank image, the actual image 110 would be an image of a navigation chart 36. Image 110 of FIG. 13c has been left blank in order to more clearly explain the night palette creation process 154. In actual use, image 110 may consist of an image like that of the navigation chart 36 depicted in FIG. 3, or any other navigation chart.
- night palette creation process 154 identifies a set of pixels N in the night image 152 that have the same location of the corresponding pixels in the day image 110.
- the pixels b1, b2, and b3 comprise the set N.
- the pixels b1-b3 are located in the same location within night image 152 as the pixels a1-a3 are in the day image 110.
- Although FIG. 13d illustrates raster graphic night image 152 as being physically larger than raster graphic day image 110, the actual sizes of the two images are the same. The disparity in sizes depicted in FIGS. 13c and 13d reflects the fact that each pixel 116 in the night image 152 is defined by 32 bits of data, while each pixel 116 in the raster graphic day image 110 is defined by an 8 bit data field. The difference in physical size between the images 110 and 152 depicted in FIGS. 13c and 13d is therefore intended to convey this difference in data sizes, not a difference in the number of pixels.
- the coordinates of the pixel a1 in day image 110 (FIG. 13c) are the same as the coordinates of the corresponding pixel b1 in night image 152 (FIG. 13d), and likewise for the pixel pairs a2/b2 and a3/b3.
- At step 160, computer 70 determines whether the color defined by the index value X (equal to 145 in this example) is an object color. If it is, computer 70 proceeds to step 164. If not, it proceeds to step 166. At step 164, computer 70 determines whether any of the pixels in the raster graphic night image 152 have a color value that is defined in the vector graphics night palette 74. With respect to the example of FIG. 13d, computer 70 will determine at step 164 whether any of the pixels b1, b2, or b3 have a color value that is listed in vector graphics night palette 74. Depending upon the outcome of that determination, computer 70 will proceed to step 166 or step 174.
- If none of the pixels b1, b2, or b3 have a color value that is defined in vector graphics night palette 74, then computer 70 proceeds to step 166, where it averages the colors of pixels b1, b2, and b3 together. In the example of FIGS. 13a-13d, this average color value will then be set as the 145th entry in the raster graphic night color palette 178 (technically, the 146th entry, since the palette begins at 0). Computer 70 would then increment X and determine the color value for the next entry in the raster graphic night color palette 178 (which would be index value 146).
- If at least one of the pixels b1, b2, or b3 does have a color value listed in vector graphics night palette 74, then at step 174 computer 70 sets the color value at the index value of 145 in the raster graphics night palette 178 equal to the hexadecimal value 0x686969. This hexadecimal value is determined from the day/night mapping 146. As can be seen therein, the night color 0x686969 corresponds to a day color of 0x8F6A2D, which is the day color defined for the index value of 145 in the reduced raster graphics day palette 144.
- After step 174, computer 70 proceeds to increment X and continues to generate all of the entries in the raster graphics night palette 178 (FIG. 13e).
- the result of the raster graphic night palette creation process 154 is a raster graphics night palette 178 which will have the same number of index entries as the raster graphic day palette 144. This matching number of entries results because the day palette 144 is used to create raster graphic night palette 178 and one entry in the night palette 178 is created for each entry in the day palette 144.
- the present invention can be implemented using different methods to create night palettes besides the night palette creation process 154 described herein. It will also be understood that the night palette creation process could be omitted from the present invention. For example, it would be possible to save both the day image 110 and the night image 152 in a memory, such as memory 24. However, saving both the day image 110 and the night image 152 consumes extra memory. In some applications this extra consumption of memory may not be an issue, and the present invention can be practiced in those applications. In situations where it is desirable to conserve memory space, the day image 110 is saved along with the day and night palettes 144 and 178, respectively, while the night image 152 is discarded.
- After step 88, computer 70 will have created a day image 110, a reduced day palette 144, and a night palette 178.
- These three pieces of data may be combined with the metadata extracted at step 94. Alternatively, as noted elsewhere, these three pieces of data can be stored as a raster graphics file 25 separate from the metadata extracted at step 94.
- If the metadata extracted at step 94 and the data from night palette creation step 88 are to be combined together into a single raster graphics file 25, this is done at step 96.
- the manner in which step 96 combines this data into a single file can be accomplished in a variety of different ways.
- the metadata is inserted into an optional data block within a standard bitmap file.
- FIG. 15 depicts the five standard data blocks in a conventional bitmap file 180.
- Bitmap file 180 includes a file header block 182, a bitmap information header block 184, a color palette block 186, an optional data block 188, and an image data block 190.
- the file header block 182 includes five separate data fields that are identified as bfType, bfSize, bfReserved1, bfReserved2, and bfOffBits.
- the bitmap information block 184 includes 11 different data fields that are identified as biSize, biWidth, biHeight, biPlanes, biBitCount, biCompression, biSizeImage, biXPelsPerMeter, biYPelsPerMeter, biClrUsed, and biClrImportant.
- the color palette block 186 defines a color palette for the raster graphic image that is stored in the image data block 190.
- Data block 188 represents data that may optionally be stored in a bitmap file 180 in accordance with the defined format for bitmap files.
- the bitmap standard does not define the format of the data stored in block 188. Instead, data of any format can be stored in block 188.
- metadata 27 can be easily stored in optional data block 188 in any suitable format, such as, but not limited to, the formats of FIGS. 4 and 5.
- the raster graphic night palette 178 can be stored in data block 188. Injection step 96 accomplishes storage of the metadata 27 and night palette 178 in optional data block 188.
- Injection step 96 will also adjust the bfOffBits data field in the bitmap file block 182 in accordance with the size of the metadata inserted (as well as the size of the raster graphics night palette 178). More specifically, the bfOffBits data field in data block 182 defines the number of bits from itself to the beginning of the image data in block 190. Thus, the bfOffBits data field should be adjusted in accordance with the size of the metadata (and any other data) inserted into optional data block 188. In conventional bitmap files, the bfOffBits data field is an unsigned integer of 32 bits.
- the optional data block 188 can therefore take on a size of 2^32 bits minus the bits contained within bitmap information block 184 and color palette block 186.
- the bitmap file standard therefore allows ample room within optional data block 188 for the storage of metadata 27 and night palette 178.
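- The injection of metadata 27 (and night palette 178) into optional data block 188, together with the corresponding adjustment of the file header, might be sketched as follows; note that the sketch treats bfOffBits as a byte offset and bfSize as the total file size, per the standard BMP layout, and is illustrative only.

```python
import struct

def inject_optional_block(bmp: bytes, extra: bytes) -> bytes:
    """Splice 'extra' (e.g. metadata 27 and night palette 178) in front of the
    image data of a standard BMP and fix up bfSize and bfOffBits in the header."""
    bf_size, = struct.unpack_from("<I", bmp, 2)        # total file size
    bf_off_bits, = struct.unpack_from("<I", bmp, 10)   # offset to image data block 190
    out = bytearray(bmp[:bf_off_bits] + extra + bmp[bf_off_bits:])
    struct.pack_into("<I", out, 2, bf_size + len(extra))
    struct.pack_into("<I", out, 10, bf_off_bits + len(extra))
    return bytes(out)
```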
- FIG. 16 illustrates various data blocks for a target file format that has been labeled RAS.
- RAS is an arbitrary name used herein merely to illustrate one possible alternative file format to the bitmap file format.
- the RAS file 192 illustrated in FIG. 16 includes six separate data blocks.
- the six data blocks are a raster file header block 194, a metadata information block 196, a bitmap info header block 198, a day palette block 200, a night palette block 202, and an encoded image data block 204.
- Metadata block 196 stores the metadata 27 described previously.
- Day palette block 200 stores the raster graphics day palette 144.
- the night palette block 202 stores the raster graphics night palette 178.
- the encoded image data 204 stores the pixels that make up the image of the navigation chart.
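- Purely for illustration, the six blocks of the hypothetical RAS layout might be modeled as follows; the block order and field names are assumptions based on FIG. 16 as described above.

```python
# Hypothetical model of the six RAS blocks of FIG. 16; names are illustrative.
from dataclasses import dataclass

@dataclass
class RasFile:
    raster_file_header: bytes   # block 194
    metadata: bytes             # block 196 (metadata 27)
    bitmap_info_header: bytes   # block 198
    day_palette: bytes          # block 200 (reduced day palette 144)
    night_palette: bytes        # block 202 (night palette 178)
    encoded_image_data: bytes   # block 204 (encoded pixel data)

    def serialize(self) -> bytes:
        """Concatenate the blocks in the order listed above (order assumed)."""
        return b"".join((self.raster_file_header, self.metadata,
                         self.bitmap_info_header, self.day_palette,
                         self.night_palette, self.encoded_image_data))
```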
- raster graphics file is an umbrella term that refers to any type of file containing an image defined in a raster format.
- raster graphics file can refer to a bitmap file 180, a RAS file 192, or any other raster graphics file, regardless of format.
- the steps of the feedback method 100 are depicted in more detail in FIGS. 17-19. Feedback method 100 receives bitmap file 180 after injection step 96 (FIG. 8) has been completed.
- Feedback method 100 utilizes two software processes known as BMP2Target 206 and Target2BMP 208.
- the BMP2Target process 206 converts the bitmap file 180 into a target file 210, which may be the RAS file 192 or some other type of raster graphics file.
- the Target2BMP process 208 reconverts the target file 210 back into a bitmap file.
- the reconverted bitmap file is compared with the original bitmap file 180 at a comparison step 212. If there are no differences detected at comparison step 212, then the target file 210 is deemed a verified re-creation of the navigation chart 36. If there are differences detected at step 212, appropriate corrective action is undertaken.
- At step 214, a computer, which may be computer 70 or another computer, reads the bitmap file 180 and encodes the image data using the run length encoding (RLE) algorithm.
- a target buffer 215 is created at step 216 based on the size of the bitmap file 180.
- the metadata 27 is copied into the target buffer 215.
- the bitmap information 184 is copied into target buffer 215.
- the raster graphics day palette 144 is written into the target buffer 215.
- the raster graphics night palette 178 is written into the target buffer 215.
- the encoded image data from step 214 is copied into the target buffer 215 at step 228. Thereafter, all of the data in the target buffer 215 is compressed at a step 230 before being output as the raster graphics file in the target format.
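- A condensed sketch of the BMP2Target process 206 is shown below; rle_encode is a placeholder for the run length encoder of step 214, and zlib stands in for whatever compression is applied at step 230, which the disclosure does not name.

```python
# Illustrative sketch of BMP2Target process 206 (FIG. 18).
import zlib

def bmp_to_target(metadata: bytes, bitmap_info: bytes, day_palette: bytes,
                  night_palette: bytes, image_data: bytes, rle_encode) -> bytes:
    encoded = rle_encode(image_data)            # step 214: run length encode the image data
    target_buffer = bytearray()                 # step 216: create target buffer 215
    target_buffer += metadata                   # copy metadata 27 into the buffer
    target_buffer += bitmap_info                # copy bitmap information 184
    target_buffer += day_palette                # write raster graphics day palette 144
    target_buffer += night_palette              # write raster graphics night palette 178
    target_buffer += encoded                    # step 228: copy the encoded image data
    return zlib.compress(bytes(target_buffer))  # step 230: compress and output
```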
- FIG. 19 depicts a more detailed overview of the Target2BMP process 208, which is generally the reverse of the BMP2Target process 206.
- the raster graphics file 192 stored in a target format is decompressed.
- the output of the decompression step 232 is written into a target buffer 234.
- the encoded image data is decoded at step 236.
- a bitmap buffer 240 is created based on the size of the target buffer 234 and the image data that was decoded at step 236.
- a bitmap file is written into the bitmap buffer 240 based on information extracted from the target buffer 234.
- the target buffer data is read and the metadata is copied into the bitmap buffer 240.
- the bitmap information header block 198 is read from the target buffer 234 and copied into bitmap buffer 240.
- the day palette is copied at step 248 into the bitmap buffer 240 and the night palette is copied at step 250 to the bitmap buffer 240.
- the resulting bitmap buffer 240 is then compared at comparison step 212 (FIG. 17) with the original bitmap file 180 to determine if there is a match. If there is, the resulting file is a confirmed raster graphics file. If not, appropriate corrective action may be taken.
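- The overall round trip of feedback method 100 can be sketched in a few lines; bmp_to_target and target_to_bmp are placeholders for processes 206 and 208.

```python
# Sketch of feedback method 100: convert the bitmap to the target format,
# convert it back, and require a byte-for-byte match before trusting the result.
def verify_roundtrip(original_bmp: bytes, bmp_to_target, target_to_bmp) -> bool:
    target = bmp_to_target(original_bmp)     # BMP2Target process 206
    rebuilt = target_to_bmp(target)          # Target2BMP process 208
    return rebuilt == original_bmp           # comparison step 212
```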
- the feedback method 100 helps to ensure that the raster graphics file 25 that is created by chart conversion method 66 is an accurate reproduction of the original navigation chart. This reassurance offered by the feedback method 100 assists in obtaining higher safety ratings for the raster graphics file 25, (and metadata 27, if separate) as well as the software used to render it.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Abstract
A system and method for facilitating the rendering of navigational charts on electronic platforms is disclosed. The system and method allow the electronic platforms to display the navigational charts without undue burden on the computational resources of the electronic platform. The electronic platform need not be modified to handle the rendering of TrueType fonts or images stored in a vector graphics file format. The system may generate electronic files of navigation charts that accurately reproduce the image of paper navigation charts while consuming a reduced amount of memory. Color reduction algorithms and methods for generating day and night palettes help reduce the memory necessary to store the navigation chart.
Description
SYSTEMS AND METHODS FOR GENERATING, STORING AND USING ELECTRONIC NAVIGATION CHARTS
BACKGROUND OF THE INVENTION
[0001] The present invention relates generally to navigation charts, and more specifically to methods of generating, using, and storing navigation charts in an electronic form that is advantageous for use on multiple different platforms and in multiple different environments.
10002] Navigation charts are commonly used in aviation, marine, and land-based environments. Before the advent of the computer and electronic display systems, such navigation charts were exclusively produced in a paper form. After the development and widespread adoption of computers and electronic displays, it became common to publish the navigation charts in electronic formats. Such electronic formats allowed the navigation charts to be displayed on electronic displays, which reduced the need for bulky compilations of paper navigation charts, allowed virtually instantaneous access to any chart in a particular database of charts, and facilitated the updating of the charts.
[0003] Despite the advantages of electronic navigation charts, they have not been implemented as broadly as possible because of various issues, including safety considerations, compatibility issues with different electronic platforms, and ease of integration obstacles. This is partly due to the fact that the companies and/or governmental units that compile and publish the navigation charts may only publish the electronic navigation charts in one specific electronic format, and that particular electronic format may not be capable of being read by different platforms, or may need to be extensively modified in order to be used in the desired target environment.
[0004] As one example, electronic navigation charts for aircraft are commonly sold commercially in a vector graphics file format. While this type of format allows the image of the navigation chart to be scaled without loss of clarity, this electronic format requires a significant amount of computation to render the file into a viewable image. Such computation is often undesirable because there are many other computational tasks required of the processor or processors in the computerized flight deck system, and devoting significant amounts of time to the rendering of vector graphics images diverts computational resources away from these other important tasks.
[0005] Still further, in some computerized flight deck systems, it may not be possible to display certain types of vector graphics files. For example, some vector graphic files of
aircraft navigation charts utilize True Type fonts. In order to display these types of fonts, the software of the computerized flight deck must be specifically designed to accept and properly process the True Type fonts. However, many computerized flight decks are not so configured. Computerized flight decks commonly utilize graphic-rendering software that follows the OpenGL (Open Graphics Library) standard, which was originally developed by Silicon Graphics. Computerized avionic systems configured in accordance with the OpenGL standard, however, are not capable of displaying True Type fonts without expending significant programming time developing a specialized software package that sits atop the basic OpenGL platform. f 0006] There are also safety concerns that arise with the display of navigation charts, particularly aircraft navigation charts displayed in the cockpit of airplanes. The Radio Technical Commission for Aeronautics (RTCA) promulgates certain safety standards for avionics software, including one standard known as DO-178B, which is entitled "Software Considerations in Airborne Systems and Equipments Certification." This safety standard ranks avionics software according to five letter grades A-E, with A being the safest and E being the least safe. In the past, the display of aircraft navigation charts in vector graphics form has often utilized software that was classified as Level C under the DO-178B standards, which, as noted, is not the highest safety classification under these standards.
[0007] Another issue with past electronic images of navigation charts has been the preservation of the navigation chart's look when converting the electronic image to different forms. In some situations, the publisher of the navigation charts may desire to have the electronic image of the navigation chart appear virtually identical regardless of the different electronic formats for the navigation chart. At the same time, it is often desirable to store the electronic images of the navigation charts in a manner that uses as little memory as possible. Known techniques for reducing the file size of electronic images, however, can degrade the image so that it no longer appears identical to the original paper image, or the original electronic file from which it was derived. Prior art techniques for storing electronic images of navigation charts have not adequately achieved an acceptable balance between preserving the look of an electronic image and consuming reduced amounts of memory.
[0008] Thus, there are various disadvantages with the prior art electronic images of navigation charts that limit their usefulness.
SUMMARY OF THE INVENTION
[0009] The present invention provides a method and system for rendering computerized navigation charts that overcomes the aforementioned problems with the prior art. The present
invention is applicable to navigation charts for marine, terrestrial, and avionics environments. In all these fields, the present invention provides the capability of easily rendering navigation charts across a wide variety of different computer platforms with a reduced amount of computational power. In the avionics field, the present invention allows for the display of aircraft navigation charts utilizing software classified as Level B under the DO-178B standards of RTCA.
[0010] According to one aspect of the present invention, an electronic navigational display system is provided that includes a display, memory, data, a user interface, and a controller. The display is adapted to display information to a viewer. The memory stores an electronic image of a navigation chart that includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section. The electronic images of both sections are stored in a raster graphics format within the memory. Data is also included within the memory that specifies the locations of the first and second sections of the navigation chart within the image. The user interface is adapted to allow a user to select a display option in which only the first section of the navigation chart is displayed on the display. The controller is in communication with the user interface and is adapted to read the data and the electronic image from the memory and use the data to display the navigation chart according to the selected display option.
[0011] According to another aspect of the present invention, an electronic navigational display system for a mobile vehicle is provided. The navigational display system includes a display, a memory, a navigation system, a controller, and data stored in the memory. The display is adapted to display information to a user of the mobile vehicle while the user is inside the mobile vehicle. The memory stores an electronic image of a navigation chart in a raster graphics format, along with data corresponding to the electronic image. The data specifies a scale and a latitudinal and longitudinal reference for the electronic image of the navigation chart. The navigation system is adapted to determine a current position of the mobile vehicle, and the controller is adapted to read the electronic image from the memory and display the navigation chart on the display. The controller is further adapted to display the current position of the mobile vehicle as determined by the navigation system on the display in a manner in which the current position of the mobile vehicle is indicated on top of the electronic image of the navigation chart at a location that matches the vehicle's current position with respect to the electronic image of the navigation chart.
[0012] In accordance with another aspect of the present invention, an electronic repository of at least one navigation chart that includes a map section having a plan view of a map is
provided. The electronic repository includes a memory, image data, and first and second data fields. The image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format. The first data field is contained within the memory and is separate from the image data. The first data field specifies a scale for the map section of the image data wherein the scale allows a physical distance to be computed between a pair of pixels within the map section of the image data such that the physical distance computed between the pair of pixels can be converted to an actual distance between a pair of locations on the map corresponding to the pair of pixels. The second data field is contained within the memory and is separate from the image data. The second data field specifies a geographical reference for the map section of the image data such that a set of geographical coordinates can be determined from the geographical reference for any pixel within the map section of the image data.
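For illustration only, the following sketch shows how such scale metadata might be applied to compute an approximate ground distance between two pixels of the map section. The structure name, field names, and units (pixels per chart inch, chart inches per nautical mile) are assumptions made for this example and are not dictated by the repository format described above.

```cpp
#include <cmath>

// Hypothetical scale description for the map plan view section.
// dotsPerInch: raster resolution of the stored image (pixels per chart inch).
// chartInchesPerNauticalMile: the published scale of the plan view.
struct MapScale {
    double dotsPerInch;
    double chartInchesPerNauticalMile;
};

// Distance in pixels between two pixel coordinates.
double pixelDistance(int x1, int y1, int x2, int y2) {
    const double dx = x2 - x1;
    const double dy = y2 - y1;
    return std::sqrt(dx * dx + dy * dy);
}

// Convert a pixel separation into an approximate real-world distance
// (nautical miles) using the scale metadata that accompanies the image.
double groundDistanceNm(const MapScale& s, int x1, int y1, int x2, int y2) {
    const double chartInches = pixelDistance(x1, y1, x2, y2) / s.dotsPerInch;
    return chartInches / s.chartInchesPerNauticalMile;
}
```

As a numerical check under these assumed units, a 300 dots-per-inch image drawn at one chart inch per five nautical miles (0.2 chart inches per nautical mile) gives roughly ten nautical miles for two pixels that are 600 pixels apart.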
[0013] According to another aspect of the present invention, an electronic repository of at least one navigation chart that includes a first and a second section is provided wherein the first section includes a plan view of a map and the second section includes text containing navigation information relating to the first section. The electronic repository includes a memory, image data, and first and second data fields within the memory. The first and second data fields are both separate from the image data. The image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format. The first data field identifies a location of the first section of the navigation chart within the image data, and the second data field identifies a location of the second section of the navigation chart within the image data.
[0014] According to still another aspect of the present invention, a method for converting a vector graphics file of a navigation chart into a raster graphics file is provided wherein the navigation chart includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section. The method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file. Thereafter, the rendered image is converted into a plurality of pixels that each have a color value associated with them. A first set of pixels corresponding to the first section of the navigation chart and a second set of pixels corresponding to the second section of the navigation chart are both determined by using the vector graphics file of the navigation chart. A raster graphics file is stored in an electronic target location along with data relating to the color value of each of the plurality of pixels and data identifying the first and second sets of pixels.
[0015] According to still another aspect of the present invention, a method is provided for converting a vector graphics file of a navigation chart into a raster graphics file. The method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file wherein the rendered image defines a plurality of object colors. The rendered image is converted into a plurality of pixels wherein at least one of the plurality of pixels has a non-object color different from the object colors. A color value for each of the plurality of pixels is determined, and then the total number of color values is counted and compared to a predetermined threshold. If the total number of color values exceeds the predetermined threshold, the total number of colors is reduced by calculating a color distance between all of the color values, determining a frequency of a first color value and whether the first color value corresponds to an object color or a non-object color, and determining a frequency of a second color value and whether the second color value corresponds to an object color or a non-object color. Based on the calculated color distances and color frequencies, the following actions are taken:
(i) replacing the first color value with the second color value if the first color value corresponds to a non-object color and the second color value corresponds to an object color; or
(ii) replacing the first color value with the second color value if the first color value corresponds to a non-object color and the second color value corresponds to a non-object color having a greater frequency than the first color value; or
(iii) replacing the second color value with the first color value if the second color value corresponds to a non-object color and the first color value corresponds to a non-object color having a greater frequency than the second color value; or
(iv) leaving the first and second color values unchanged if both the first and second color values correspond to object colors.
After any of steps (i)-(iv) have been performed, the remaining color values are stored in a raster graphics file.
[0016] According to yet another aspect of the present invention, a method of converting a vector graphics file of an aircraft navigation chart into a raster graphics file using a computer running a Windows® operating system is provided. The method includes loading the vector graphics file into the computer and using a GetDIBits function of the Windows operating system to determine a first set of pixels corresponding to an entire image of the aircraft navigation chart, a second set of pixels corresponding to a first portion of the aircraft navigation chart, and a third set of pixels corresponding to a second portion of the aircraft
navigation chart wherein the second set of pixels includes a plurality of pixels not contained within the third set. Thereafter, the second and third sets of pixels are compared against the first set of pixels to determine if the pixels in the second and third sets are the same as the corresponding pixels in the first set. If they are not the same, any discrepancy between the pixels of the second and third sets and the pixels of the first set is flagged. If they are the same, a sufficient number of the pixels are saved in a raster graphics file to define an entire image of the navigation chart.
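The comparison step described in the preceding paragraph can be illustrated, in simplified form, by the sketch below. It assumes the whole-chart and section pixel buffers have already been captured into memory as 32-bit values (for example, via the GetDIBits call mentioned above) and that each section records its placement within the whole image; the structure and function names are hypothetical.

```cpp
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

// A captured region of the rendered chart: its placement within the whole
// image plus its pixels in row-major order (32 bits per pixel).
struct PixelRegion {
    int x, y, width, height;            // placement within the whole image
    std::vector<std::uint32_t> pixels;  // width * height entries
};

// Compare a section capture against the capture of the entire chart and
// report any pixel that differs. Returns the number of discrepancies so the
// caller can flag the chart for review when the count is non-zero.
std::size_t flagDiscrepancies(const PixelRegion& whole, const PixelRegion& section) {
    std::size_t mismatches = 0;
    for (int row = 0; row < section.height; ++row) {
        for (int col = 0; col < section.width; ++col) {
            const std::uint32_t sectionPixel =
                section.pixels[static_cast<std::size_t>(row) * section.width + col];
            const int wholeRow = section.y + row;
            const int wholeCol = section.x + col;
            const std::uint32_t wholePixel =
                whole.pixels[static_cast<std::size_t>(wholeRow) * whole.width + wholeCol];
            if (sectionPixel != wholePixel) {
                std::cerr << "Mismatch at (" << wholeCol << ", " << wholeRow << ")\n";
                ++mismatches;
            }
        }
    }
    return mismatches;
}
```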
[0017] In other aspects of the invention, the navigation charts may be aircraft navigation charts that include a section illustrating a profile view of a desired course of the aircraft. Data may be stored in memory identifying the location of the profile view section of the navigation chart within the raster graphics image. The data identifying the various sections of the navigation chart may be stored within the same electronic file as the image data, or it may be stored in a file separate from the image data. The aircraft navigation chart may also include information relating to flight minimums in another section, and data may be stored in memory specifying the location of the flight minimum information within the raster graphics image. A day and a night palette may also be stored in memory and accompany the raster graphics image of the navigation chart whereby the raster graphics image can be displayed with different colors depending upon the time of the day and/or ambient light conditions. The size of the raster graphics image of the navigation chart can be reduced by lowering the number of color values for the pixels to a number less than or equal to a predefined threshold. The manner of reducing the number of color values for the pixels may involve altering the color values of selected anti-aliasing pixels such that the selected anti-aliasing pixels are assigned new color values that are the same as the color values of other pixels within the navigation chart image.
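For illustration, the color-consolidation rules (i)-(iv) summarized earlier can be sketched roughly as follows. The palette representation, the Euclidean color-distance metric, the tie-breaking on equal frequencies, and the merge-until-threshold loop are assumptions of this sketch and may differ from the process described in connection with FIG. 11.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

struct PaletteEntry {
    std::uint32_t rgb;            // packed 0x00RRGGBB colour value
    std::size_t   frequency;      // number of pixels using this colour
    bool          isObjectColor;  // true if defined by the source vector file
    bool          removed = false;
    std::uint32_t replacedBy = 0; // meaningful only when removed == true
};

// Euclidean distance between two packed RGB colours.
double colorDistance(std::uint32_t a, std::uint32_t b) {
    const int dr = int((a >> 16) & 0xFF) - int((b >> 16) & 0xFF);
    const int dg = int((a >> 8) & 0xFF) - int((b >> 8) & 0xFF);
    const int db = int(a & 0xFF) - int(b & 0xFF);
    return std::sqrt(double(dr * dr + dg * dg + db * db));
}

// Repeatedly merge the closest surviving pair of colours according to rules
// (i)-(iv) until the palette fits within maxColors.  Pixel data would then be
// remapped by following each removed entry's replacedBy value.
void reducePalette(std::vector<PaletteEntry>& palette, std::size_t maxColors) {
    auto liveCount = [&] {
        return std::count_if(palette.begin(), palette.end(),
                             [](const PaletteEntry& e) { return !e.removed; });
    };
    while (static_cast<std::size_t>(liveCount()) > maxColors) {
        std::size_t bestA = 0, bestB = 0;
        double bestDist = -1.0;
        for (std::size_t i = 0; i < palette.size(); ++i) {
            for (std::size_t j = i + 1; j < palette.size(); ++j) {
                if (palette[i].removed || palette[j].removed) continue;
                if (palette[i].isObjectColor && palette[j].isObjectColor) continue; // rule (iv)
                const double d = colorDistance(palette[i].rgb, palette[j].rgb);
                if (bestDist < 0.0 || d < bestDist) {
                    bestDist = d; bestA = i; bestB = j;
                }
            }
        }
        if (bestDist < 0.0) break;  // only object colours remain; cannot reduce further
        PaletteEntry& a = palette[bestA];
        PaletteEntry& b = palette[bestB];
        auto merge = [](PaletteEntry& victim, PaletteEntry& keeper) {
            victim.removed = true;
            victim.replacedBy = keeper.rgb;
            keeper.frequency += victim.frequency;
        };
        if (!a.isObjectColor && b.isObjectColor)      merge(a, b); // rule (i)
        else if (a.isObjectColor && !b.isObjectColor) merge(b, a); // rule (i), roles swapped
        else if (b.frequency >= a.frequency)          merge(a, b); // rule (ii)
        else                                          merge(b, a); // rule (iii)
    }
}
```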
[0018] The method and systems of the present invention provide improved electronic images of navigation charts that are more easily adapted to different computerized display platforms. The electronic images consume relatively small amounts of memory, can be rendered without undue computational demands, provide all the navigation information of prior art navigation charts, and can be manipulated in the same manners as the navigation chart images of the prior art. Further, because the electronic images of the present invention do not require the computational and/or software requirements necessary to render vector graphic images, the images of the present invention can be displayed on a wider variety of electronic devices than can be done with past images, including, but not limited to, cell phones, PDAs, wearable video displays and video glasses, portable media players like iPods, and other similar devices.
Still further, the reduced computational and software requirements necessary to display the charts of the present invention allow the charts to be incorporated into a client/server architecture where a client requests a particular chart and the server delivers it to the client.
Thus, not only can a variety of electronic devices easily display the navigation chart images of the present invention, they can also download additional navigation chart images from a database. Such downloading may occur over any type of computer network, including the
Internet. These and other advantages of the present invention will be apparent to one skilled in the art in light of the following written description and the accompanying drawings.
DESCRIPTION OF THE DRAWINGS
[0019] FIG. 1 is a block diagram of a navigational display system according to one aspect of the present invention;
[0020] FIG. 1A is a block diagram of a navigational display system for a mobile vehicle according to another aspect of the present invention;
[0021] FIG. 2 is an elevational view of a pair of flight deck displays that may be used in conjunction with the navigational display systems of FIGS. 1 or 1A;
[0022] FIG. 3 is an example of an aircraft navigation chart that may be used in accordance with various aspects of the present invention;
[0023] FIG. 4 is a table of a first set of metadata inserted into either a raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
[0024] FIG. 5 is a table of a second set of metadata inserted into either the raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
[0025] FIG. 6 is a cell phone shown displaying a navigation chart that was read from a raster graphics file in accordance with the present invention;
[0026] FIG. 7 is a block diagram of a chart conversion process for changing electronic navigation charts from a vector graphics file to a raster graphics format;
[0027] FIG. 8 is a flowchart illustrating in greater detail a sequence of steps that may be followed in carrying out the chart conversion process of FIG. 7;
[0028] FIG. 9 is a more detailed flowchart of the comparison method illustrated in FIG. 8;
[0029] FIG. 10a is a diagram representing a generic example of a raster graphic navigation chart image stored as a plurality of pixels wherein each pixel has a thirty-two bit color value associated with it;
[0030] FIG. 10b is a table tabulating a frequency of usage of each of the color values in the generic navigation chart example of FIG. 10a;
[0031] FIG. 10c is an unreduced day color palette correlating an index value to all of the colors in the table of FIG. 10b that have a non-zero usage frequency;
[0032] FIG. 10d is a reduced day color palette table illustrating a reduced set of color values produced after the color values in the palette of FIG. 10c have undergone a color reduction process;
[0033] FIG. 10e is a diagram representing the generic example of the navigation chart image of FIG. 10a wherein each of the plurality of pixels has an eight bit indexed color value associated with it;
[0034] FIG. 11 is a flowchart of a color reduction process according to one aspect of the present invention;
[0035] FIG. 12a is a diagram illustrating in more detail an example of a color distance computation according to a color reduction process used with one aspect of the present invention;
[0036] FIG. 12b is a table arranging pairs of indexed color values in order from the smallest distance to the greatest distance;
[0037] FIG. 13a is a table illustrating a color mapping between day and night colors of the navigation chart as provided in an original vector graphics file of the navigation chart;
[0038] FIG. 13b is the reduced day color palette table of FIG. 10d reproduced for ease of reference in conjunction with FIGS. 13c and 13d;
[0039] FIG. 13c is a generic example of a daytime navigation chart image wherein three pixels having the same color value are highlighted;
[0040] FIG. 13d is a generic example of a nighttime navigation chart image that corresponds to the daytime navigation chart image of FIG. 13c;
[0041] FIG. 13e is a table illustrating a night palette for a raster graphics image of a navigation chart;
[0042] FIG. 14 is a flowchart of a night palette creation process according to one aspect of the present invention;
[0043] FIG. 15 is a diagram illustrating standard data blocks and data fields of a conventional bitmap computer file;
[0044] FIG. 16 is a diagram illustrating various data blocks and data fields of a raster graphics file having an alternative format;
[0045] FIG. 17 is a more detailed flowchart of the feedback method illustrated in FIG. 8;
[0046] FIG. 18 is a more detailed flowchart of a bitmap-to-target (BMP2Target) process used in the feedback flowchart of FIG. 17; and
[0047] FIG. 19 is a more detailed flowchart of a target-to-bitmap (Target2BMP) process used in the feedback flowchart of FIG. 17.
DETAILED DESCRIPTION OF THE INVENTION
[0048] The present invention will now be described with reference to the accompanying drawings wherein the reference numerals appearing in the following written description correspond to like-numbered elements in the several drawings. A navigational display system 20 according to one aspect of the present invention is illustrated in FIG. 1. Navigational display system 20 includes a controller 22, a memory 24, a user interface 26, and a display 28. Memory 24 contains one or more raster graphics files 25 that contain images of one or more navigation charts. In general, navigational display system 20 is adapted to display these navigation charts on display 28 to a viewer. Navigational display system 20 allows a user to view an electronic image of a navigation chart in a variety of different environments. Navigational display system 20 may be incorporated into any known electronic device capable of displaying raster graphic image files, such as, but not limited to, a conventional computer, a cell phone, a personal digital assistant, a dashboard GPS display for an automobile, an electronic flight deck computer system of an aircraft or spacecraft, a laptop computer, an electronic navigational display for a surface or submersible marine vessel, or other types of electronic displays. Whatever device navigational display system 20 is incorporated into, the system can usefully display navigation charts electronically in accordance with the principles described in more detail below.
[0049] The display of navigation charts on navigational display system 20 can be performed during the operation of a mobile vehicle, such as an airplane, while the vehicle is moving. Alternatively, navigational display system 20 can be used to view charts from locations outside of a mobile vehicle. Regardless of where navigational display system 20 is utilized, a user can view any chart in a database of navigational charts stored in memory 24. User interface 26 allows a user to zoom in, zoom out, scroll up, down, left, and right, and rotate chart images while viewing the navigation charts displayed on display 28. Still further, as will be discussed in greater detail below, navigational display system 20 can automatically locate different sections of a navigation chart and display only those selected sections on display 28. Other capabilities of navigational display system 20 will be discussed further below.
[0050] Navigational display system 20 may be modified to include a navigation system 30, such as is illustrated in FIG. 1A. FIG. 1A illustrates a modified version of navigational display system 20 that will be referred to as navigational display system 20'. Navigational display system 20' is especially useful for displaying navigational charts on a mobile vehicle, particularly while the mobile vehicle is moving. Navigation system 30 allows display system 20' to display the navigation charts on the display 28 in a manner in which the current position of the mobile vehicle is indicated by an icon or other symbol placed by system 20' on top of the navigational chart being displayed. This allows the operator of the mobile vehicle, which may be an aircraft, boat, or land-based vehicle, to see his or her current position with respect to the navigational chart. Navigational display system 20' includes all of the same components of display system 20 with the addition of navigation system 30, and all of these components operate in the same manner in both of the systems 20 and 20'. More details and features of navigational display systems 20 and 20' will be described below.
[0051] FIG. 2 depicts an illustrative example of a pair of displays 28a and 28b that may be used in conjunction with either of navigational display systems 20 and 20'. Displays 28a and 28b each include a plurality of buttons 32 located adjacent a bottom edge 34 of the displays 28a and 28b. Buttons 32 constitute one form of user interface 26. Other types of user interfaces 26 may also be used in accordance with the present invention, including, but not limited to, computer mouses, touch screens, knobs, keyboards, joy sticks, voice recognition devices, and the like, as well as combinations thereof. Buttons 32 are selectively pressed by the operator of the display system 20 or 20' to control the information that is displayed on displays 28a and 28b. In the illustrated example of FIG. 2, buttons 32 are known as soft keys. That is, buttons 32 interact with controller 22 to change what is displayed on displays 28a and 28b based on a menu (not shown) that is displayed on displays 28a and 28b immediately above the buttons 32. An example of such soft keys that may be used in accordance with the present invention is disclosed in commonly assigned, co-pending PCT application serial number PCT/US2006/021390, entitled AIRCRAFT AVIONIC SYSTEM HAVING A PILOT USER INTERFACE WITH CONTEXT DEPENDENT INPUT DEVICES, filed June 2, 2006 in the United States receiving office, the complete disclosure of which is hereby incorporated herein by reference.
[0052] User interface 26, whether comprised of buttons 32 or some other type of structures, interacts with controller 22 to cause controller 22 to display different images on either or both of displays 28a and 28b. While FIG. 2 depicts two displays 28a and 28b, it will be understood by those skilled in the art that the invention is applicable to systems having only a single display, or systems having two or more displays. Further, the type of display used in accordance with the present invention can vary widely, from conventional LCD type displays
to organic light-emitting diode (OLED) displays to cathode ray tubes (CRTs) to projection displays to head-up displays (HUD) to plasma screens to any other known type of electronic display.
[0053] Display 28a in FIG. 2 may be a primary flight display (PFD) for an aircraft while display 28b may be a multi-function display (MFD) for an aircraft. As noted above, however, display system 20 can be used in environments other than mobile vehicles, and, even when display systems 20 or 20' are used on a mobile vehicle, they can be applied to other mobile vehicles besides aircraft. Further, when display systems 20 and 20' are used in conjunction with an aircraft, the present invention can be applied to different displays other than the MFD or PFD within the aircraft cockpit. Regardless of the particular environment, the display 28 (or displays) used in accordance with the present invention is configured to be able to display a navigation chart useful for the particular activity the navigation chart relates to, such as flying, boating, driving, or other activities. The navigation chart displayed on display 28 is read from memory 24 as a raster graphics file 25. Memory 24 may be any conventional type of electronic memory such as, but not limited to, RAM, ROM, flash memory, a compact disc, a DVD, a hard drive, an SD or Compact Flash card, a USB portable data stick, a floppy disk, a holographic versatile disc (HVD) or any type of electronic memory capable of being read by a computer, regardless of whether the memory is fixed within the computer or removable from it.
[0054] Controller 22 may include one or more conventional microprocessors and may be a conventional computer, such as a PC or other known type of computer. In some applications, controller 22 may alternatively be a specialized computer or computer system specifically adapted for controlling various aspects of the overall system in which it is incorporated. For example, if display system 20 is incorporated into a personal digital assistant (PDA), controller 22 would include the processor or processors inside the PDA that perform the conventional functions of the PDA. Alternatively, if display system 20 were incorporated into a cell phone, controller 22 would include the processor(s) inside the cell phone that ran the phone's conventional software and/or firmware. As yet another alternative, if navigational display system 20 or 20' is implemented on an aircraft, controller 22 may be one or more of the processing components of an electronic flight deck control system that displays such information as aircraft attitude, altitude, heading, position, radio information, engine parameters, a crew alerting and warning system (CAWS) list, weather, and the like to the pilot.
[0055] Additional environments into which navigational display system 20 can be incorporated include projection cell phones capable of projecting images onto a surface, such as, but not limited to, cell phones using the PicoProjector available from Microvision of Redmond, Washington. Navigational display system 20 can further be incorporated into wearable video displays and video glasses, such as, but not limited to, the iLounge™, available from Myvu Corporation of Westwood, Massachusetts, and the Lumus PD-10™, available from Lumus Ltd. of Rehovot, Israel.
[0056] In general, controller 22 can include one or more processors that perform a wide variety of other functions in addition to the rendering of navigation chart images. In fact, any controller 22 is suitable for the present invention so long as it is capable of reading the raster graphics files 25 that contain the navigation charts and displaying these charts on display 28 in response to some form of prompting which may come from user interface 26, or some other source, such as an electronic signal from a system or subsystem that monitors the stage of a particular journey.
[0057] While the types of environments in which navigational display system 20 may be implemented can vary, as noted above, the following discussion of the types of navigation charts that may be displayed on navigational display system 20 will primarily be made with respect to aircraft navigation charts. It will be understood that this discussion is for purposes of illustration only, and is not intended to limit the scope of the invention to avionic applications.
[0058] An example of a navigation chart 36 that may be stored electronically in memory 24 and displayed on display 28 is depicted in FIG. 3. Navigation chart 36 is a conventional instrument approach chart for an aircraft published by Jeppesen Inc. of Englewood, Colorado. Navigation chart 36 includes a plurality of different sections, including a header section 38, a map plan view section 40, an aircraft profile section 42, and an aircraft minimums section 44 that specifies various minimum information for landing the aircraft. Navigation chart 36 is available from Jeppesen Inc. in both a paper format and an electronic format. In the electronic format, navigation chart 36 is provided as a vector graphics file that includes text written in True Type fonts. As was briefly mentioned in the Background of the Invention section, the use of the vector graphics format and True Type fonts in navigation charts limits the ability of the charts to be conveniently rendered on many navigational display systems.
[0059] Navigational display systems 20 and 20', however, are configured to be able to conveniently render navigational charts that are originally provided in a vector graphics format and that use True Type fonts. Navigational display systems 20 and 20' accomplish
this without requiring the computational resources necessary for rendering vector graphics files, without requiring extensive re-working of the graphics display platform of a particular controller 22, and also while more easily allowing the software used to display navigation chart 36 to achieve a higher DO-178B level rating, such as a Level B.
[0060] Memory 24 of navigational display system 20 has stored in it raster graphics file 25, which contains an image of navigational chart 36. The stored image includes header section 38, map plan view section 40, aircraft profile view section 42, and aircraft minimums section 44. Memory 24 also stores metadata 27 (FIG. 1) within it that identifies which pixels in the raster graphics file correspond to each of the sections 38, 40, 42, and 44. Metadata 27 may be stored in a file separate from raster graphics file 25, such as illustrated in FIGS. 1 and 1A, or it may be stored within raster graphics file 25 itself. The term "metadata" is used herein to generally refer to data that describes other data, such as data that describes the image data of the navigation charts. It will be understood, however, that the term "data" as used herein can refer to either data or metadata.
[0061] The metadata 27 that identifies which pixels correspond to each of sections 38-44 allows controller 22 to selectively display portions of navigation chart 36 on display 28. More specifically, controller 22 may be programmed to allow a pilot to choose, via user interface 26, any one or more sections 38-44 for display on display 28. Thus, for example, a pilot could instruct controller 22, via user interface 26, to display only map plan view section 40 on display 28. In that case, controller 22 would read the metadata 27 from memory 24 that identifies which pixels in the raster graphics file 25 correspond to map plan view section 40 and display only those pixels on display 28. This would allow the pilot to more easily focus on only the map plan view section 40 of navigation chart 36. Alternatively, the pilot could select any other one of sections 38-44 for display by itself on display 28, or he or she could select any combination of two or more sections 38-44 for simultaneous display on display 28. The pilot can also, of course, have the entire navigational chart 36 displayed at one time on display 28.
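A minimal sketch of how controller 22 might use such section metadata to extract only the plan-view pixels for display is given below. The lower-left/upper-right rectangle convention and all of the names used are illustrative assumptions rather than a definitive implementation.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Axis-aligned rectangle in pixel coordinates, following the lower-left /
// upper-right convention used by the "Rect"-style metadata fields.
struct SectionRect {
    int left, bottom, right, top;
};

struct RasterImage {
    int width = 0, height = 0;
    std::vector<std::uint32_t> pixels;  // row-major, width * height entries
};

// Copy only the pixels belonging to one chart section (for example, the map
// plan view) so that just that section can be sent to the display.
RasterImage extractSection(const RasterImage& chart, const SectionRect& r) {
    RasterImage out;
    out.width  = r.right - r.left + 1;
    out.height = r.top - r.bottom + 1;
    out.pixels.reserve(static_cast<std::size_t>(out.width) * out.height);
    for (int y = r.bottom; y <= r.top; ++y) {
        for (int x = r.left; x <= r.right; ++x) {
            out.pixels.push_back(
                chart.pixels[static_cast<std::size_t>(y) * chart.width + x]);
        }
    }
    return out;
}
```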
[0062] In addition to selecting sections of navigational chart 36 for displaying on display 28, controller 22 and user interface 26 are also configured to allow the pilot (or other user of display system 20) to zoom in or zoom out on whatever portion of navigation chart 36 that is being displayed on display 28 (i.e. zooming in and out can be done regardless of whether the entire navigational chart 36 is being displayed, or only selected sections 38-44 of it). Still further, navigational display system 20' may be configured to display the aircraft's current location on top of navigational chart 36 so that a pilot can immediately see his or her location
with respect to navigation chart 36, as will be described more below. Navigational display systems 20 and 20' can also overlay a flight plan on top of the navigational chart, if desired.
[0063] Navigation system 30 of display system 20' may be any conventional navigation system, such as, but not limited to, a GPS-based navigation system or an inertial reference system. When display system 20' is used on an aircraft, navigation system 30 may include one or more accelerometers, gyroscopes, magnetometers, radio beacon receivers, or any other conventional navigation equipment used on aircraft. Navigation system 30 determines the current location of the mobile vehicle with respect to a known reference system, such as latitude and longitude, or a GPS coordinate system, or any other reference system which can determine a position in a manner that can be correlated to navigation chart 36.
[0064] As can be seen in FIG. 3, map plan view section 40 of navigation chart 36 includes various navigation landmarks, such as a river 52, an airport 54, a VOR station 56, an intersection 58 (KILBY), and a plurality of potential obstacles 60. Map plan view section 40 also includes geographic references that tie the information displayed in section 40 to an external reference system. Specifically, map plan view section 40 includes latitude markings 46 and longitude markings 48 which indicate the position of the map's contents with respect to the Earth's latitude and longitude references. Further, map plan view section 40 is drawn to a known scale. This known scale, along with the latitude and longitude markings 46 and 48, may be stored as part of metadata 27. When so stored, this metadata allows controller 22 to display on display 28 the current position of the aircraft (or other type of mobile vehicle on which navigational display system 20' is implemented).
[0065] One example of the display of the mobile vehicle's current position on top of navigational chart 36 is illustrated in FIG. 2. Display 28b is shown displaying a map plan view section 40 of a navigation chart 36 (which is a different chart than the specific one illustrated in FIG. 3). An aircraft icon 50 is also shown on display 28b at a location west (to the left) of river 52 and northeast of airport 54. Controller 22 overlays the aircraft icon 50 on top of navigation chart 36 at a location on the navigation chart that coincides with the aircraft's current position with respect to the map plan view section of navigation chart 36. In other words, if the aircraft is currently at 50 degrees, 10 minutes north latitude and 90 degrees, 32 minutes west longitude, the controller 22 will use the metadata 27 containing the latitudinal and longitudinal references to display the aircraft icon 50 on top of plan view section 40 at the 50 degree, 10 minute north latitude and 90 degree, 32 minute west longitude position on the map. This will allow the pilot to immediately see his or her current position
with respect to the items that are included within the map plan view section 40, such as river 52, airport 54, etc.
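One simplified way to perform this placement is sketched below: a linear interpolation between the plan view's geographic corner coordinates (such as those carried in the m_planLatLon field) and its pixel rectangle. A real chart projection may not be exactly linear, so this is an approximation; the names and the interpolation itself are assumptions of this example.

```cpp
// Hypothetical geo-referencing sketch: map a position reported by the
// navigation system onto the plan-view pixels of the chart image.  A simple
// linear interpolation is assumed here; a production system would honour the
// chart's actual projection.
struct GeoBounds {
    double southLat, westLon;   // lower-left corner of the plan view
    double northLat, eastLon;   // upper-right corner of the plan view
};

struct PixelRect {
    int left, bottom, right, top;  // plan-view section within the image
};

struct PixelPos { int x; int y; };

PixelPos positionToPixel(const GeoBounds& geo, const PixelRect& px,
                         double latDeg, double lonDeg) {
    // Fractional position within the geographic extent of the plan view.
    const double u = (lonDeg - geo.westLon) / (geo.eastLon - geo.westLon);
    const double v = (latDeg - geo.southLat) / (geo.northLat - geo.southLat);
    PixelPos p;
    p.x = px.left   + static_cast<int>(u * (px.right - px.left));
    p.y = px.bottom + static_cast<int>(v * (px.top - px.bottom));
    return p;
}
```

For the position quoted above, the call would be positionToPixel(geo, px, 50.0 + 10.0/60.0, -(90.0 + 32.0/60.0)), with west longitude expressed as a negative value under this sketch's sign convention.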
[0066] Controller 22 updates the position of aircraft icon 50 on display 28 as the aircraft moves. This updating may take place at a rate of several times a second, although other rates may also be used. The updating is based on the information controller 22 receives from navigation system 30. Thus, in the example of FIG. 2, if the aircraft continues flying north, controller 22 will repetitively adjust the position of aircraft icon 50 upwards on display 28b while the underlying image of the map plan view section 40 remains stationary. The aircraft icon 50 will therefore move upward across display 28b in accordance with the corresponding movement of the aircraft through the sky. The visual effect presented to the pilot is thus similar to what the pilot would see if he were physically located at a distance above the aircraft and looking down at the Earth, which was represented by the map plan view 40 of the navigation chart 36.
[0067] If the aircraft flies out of the range of the geographic region depicted on plan view map section 40 of navigation chart 36, then controller 22 can react in any of a variety of different manners, including removing aircraft icon 50 from the display, issuing a warning to the pilot, searching for another navigation chart 36 that corresponds to the geographic region into which the aircraft has moved and automatically displaying such a navigation chart (if located), removing the display of navigation chart 36, indicating and updating the distance the aircraft has flown out of the range of the chart, or any other action that would be appropriate for the situation.
[0068] It will be understood that aircraft icon 50 can be varied within the present invention.
Because the aircraft can be a conventional fixed-wing aircraft, a helicopter, an autogyro, a gyrodyne, a powered lift, a glider, or a lighter-than-air balloon or airship, icon 50 can be adapted to reflect the visual appearance of the specific type of aircraft. Alternatively, any other suitable non-aircraft icon or indication can be used to display the current position of the aircraft on navigation chart 36. Still further, if navigational display system 20' is used on a mobile vehicle that is not an aircraft, aircraft icon 50 can be replaced with an icon representing the type of vehicle on which system 20' is implemented, e.g. a boat, a car, an RV, etc., or any other type of indication that provides a visual cue to the operator of the mobile vehicle of the vehicle's current location with respect to the underlying navigation chart 36.
[0069] In some instances, navigation chart 36 may include one or more insets 62 on the plan view map section 40 (FIG. 2). Insets 62 may display a variety of different types of
information, such as an enlargement of a particular area of the map or textual information relating to a particular area of a map. Whatever the contents of insets 62, memory 24 will store the location of each and every inset 62 in a particular navigation chart as part of metadata 27. As noted, this metadata may be stored within the raster graphics file 25 that contains the image data for the navigation chart 36, or it may be stored separately from the raster graphics file 25. Alternatively, metadata 27 could be stored in a memory separate from memory 24. Regardless of its location, controller 22 will read this information and repetitively check to see if the current location of the aircraft has moved to a position that lies over one of the insets 62. If it does, controller 22 will react in any one of a variety of different manners.
[0070] In one embodiment, if the inset 62 contains an enlargement of a particular section of a map and that inset happens to be scaled with geographic references, controller 22 will shift the position of aircraft icon 50 to the geographically proper location within the inset 62. This shifting may optionally involve a change in the size of icon 50. Alternatively, if the area within inset 62 is not scaled, controller 22 may simply remove aircraft icon 50 from display 28 until the aircraft moves to a location that no longer falls within the region encompassed by inset 62. As yet another alternative, controller 22 may continue to display icon 50 on display 28 at a location that coincides with the latitudinal and longitudinal marks 46 and 48 outside the inset 62, despite the fact that the location of icon 50 might not represent the aircraft's actual location with respect to the interior of inset 62. This continued display of icon 50 could involve a change in its color or other attribute in order to give the pilot a visual indication that the location of icon 50 is not necessarily accurate within the area defined by inset 62. Other variations are possible as well.
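The repetitive inset check described above might look, in greatly simplified form, like the sketch below. The metadata names, the rectangular hit test, and the reduction of the possible reactions to a small enumeration are assumptions made only for illustration.

```cpp
#include <vector>

// Geographic extent of one inset, as might be recorded in metadata such as
// the m_pInsetLatLon entries (names assumed for illustration).
struct InsetGeoBounds {
    double southLat, westLon, northLat, eastLon;
    bool geoReferenced;  // true if the inset contents are drawn to scale
};

enum class InsetAction {
    None,           // position is outside every inset
    RelocateIcon,   // inset is geo-referenced: shift the icon into the inset
    HideOrDimIcon   // inset is not geo-referenced: hide or visually flag the icon
};

// Decide how to treat the ownship icon for the current position.
InsetAction checkInsets(const std::vector<InsetGeoBounds>& insets,
                        double latDeg, double lonDeg) {
    for (const InsetGeoBounds& inset : insets) {
        const bool inside =
            latDeg >= inset.southLat && latDeg <= inset.northLat &&
            lonDeg >= inset.westLon && lonDeg <= inset.eastLon;
        if (inside) {
            return inset.geoReferenced ? InsetAction::RelocateIcon
                                       : InsetAction::HideOrDimIcon;
        }
    }
    return InsetAction::None;
}
```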
[0071] While navigation chart 36 of FIG. 3 has been divided into four sections — header 38, map plan view 40, profile view 42, and minimums 44 — it will be understood by those skilled in the art that the present invention does not limit the specific number of sections into which any particular navigation chart 36 may be divided. Indeed, it would be possible to divide the navigation chart 36 of FIG. 3 into more or fewer sections than the four illustrated therein. For example, the top row of information in the navigation chart of FIG. 3 contains various radio frequency information, including the radio frequencies for the Automatic Terminal Information Service (ATIS), the Green Bay approach, the Minneapolis Center, the Green Bay tower, etc. This entire row of radio frequency information could be considered a separate section of navigation chart 36. Or this row could be further subdivided into smaller sections. Other rows, sections, or parts of navigation chart 36 could also be considered separate
sections. Regardless of the precise manner in which navigation chart 36 is broken up into sections, memory 24 will store the location of each section as part of metadata 27. That is, memory 24 will store in metadata 27 sufficient information to identify which pixels in the raster graphics file 25 correspond to each and every different section of the chart 36. This data will allow controller 22 to selectively display, upon prompting via user interface 26, each of the sections of chart 36 either individually or in any desired combination, as was discussed above.
[0072] Each raster graphics file 25 contains a raster graphics image of one or more navigation charts 36. Each raster graphics image includes a plurality of pixels that, when combined together in the proper arrangement, create the image of the navigation chart 36. The specific format of the file containing the raster graphics image of the navigation chart 36 can vary within the scope of the invention. As one possible choice, raster graphics image 25 may be stored as a conventional bitmap file. Other types of file formats may also be used, and the invention contemplates tailoring the format of the raster graphics file 25 and accompanying metadata 27 to the specific needs and formats required by a particular navigational display system 20 or 20', or other device that may display an image of the navigation chart 36.
[0073] As noted above, metadata 27 may contain information identifying a geographic reference for a chart, a scale, the location of different map sections, and the location of different insets within a given map. This list of information that may be stored within metadata 27 is only an illustrative example of the types of information that may be stored in memory. Changes and additions to this list are within the scope of the invention. An example of one set of metadata 27 that may be stored for an aircraft navigation chart is listed in the tables of FIGS. 4 and 5. The metadata identified in FIGS. 4 and 5 is divided into a plurality of data fields 64. Each of the data fields 64 is identified in the leftmost column of FIGS. 4 and 5. The size of the data field in bytes is listed in the second column from the left, followed by a short description of the data field in the next column to the right, and an indication of the type of data field in the right-most data column. The data fields 64 listed in FIGS. 4 and 5 are merely illustrative of the types of data fields 64 that may be used in accordance with the present invention. In other words, the precise number and types of data fields 64 that comprise metadata 27 may vary substantially from that depicted in FIGS. 4 and 5, including additional metadata not illustrated in FIGS. 4 and 5. Further, the size of the data fields can be varied, along with the field types that define the format of the metadata in the data fields 64.
[0074] The meaning of the data fields 64 of FIGS. 4 and 5 will now be described. The m_tiches data field (FIG. 4) specifies the size of the chart in thousandths of inches. The data in the m_tiches data field may be formatted in a manner referred to as a "Magnitude2d" type of field, which specifies a first set of four bytes that identifies the width of the navigation chart in thousandths of inches, and a second set of four bytes that identifies the length of the navigation chart in thousandths of inches.
[0075] The m_whole data field (FIG. 4) identifies the location and extent of the whole navigation chart 36 in whatever coordinate system the navigation chart 36 uses. The metadata within the m_whole data field may be formatted in a manner referred to as a "Rect" type of field, which specifies a first set of eight bytes that identifies the coordinates of the lower left corner of the entire navigation chart 36 and a second set of eight bytes that identifies the coordinates of the upper right corner of the entire navigation chart 36.
[0076] The m_angleToRotateHeader data field (FIG. 4) identifies what angle the image data of the navigation chart 36 will need to be rotated (if any) in order for the image to be presented on a display with the header section 38 oriented toward the top of the display. The m_angleToRotateHeader data field is useful where some navigation charts 36 may be oriented in a landscape orientation and other ones may be oriented in a portrait orientation. Controller 22 can read the metadata in the m_angleToRotateHeader data field and use this to automatically display the navigation chart 36 in the proper orientation, thereby relieving the viewer of the task of having to re-orient the navigation chart manually through buttons 32, or some other type of user interface 26. The m_angleToRotateHeader data field may be formatted in a manner referred to as a "float" type of data field, which simply refers to a four byte floating point number.
[0077] The m_isToScale data field (FIG. 4) identifies whether the navigation chart 36 is drawn to scale or not. The metadata 27 within the m_isToScale data field may be stored as a single bit (or byte) in which a value of one means navigation chart 36 is drawn to scale and a value of zero means navigation chart 36 is not drawn to scale, or vice versa. This format is referred to as "bool" in FIG. 4. If the m_isToScale data field indicates that the navigation chart 36 is drawn to scale, then additional metadata will be stored in memory 24, such as that identified in FIG. 5. This additional metadata will be discussed more below with respect to FIG. 5.
[0078] The m_sizeOfMetadata data field (FIG. 4) identifies the total size of the metadata 27 that accompanies the image data of the navigation chart 36. As mentioned above, FIGS. 4 and 5 identify all of the metadata 27 that may accompany a particular navigation chart 36. In
some situations, the particular fields 64 of metadata 27 that accompany a given navigation chart 36 will vary from one chart to another, such as when some charts are drawn to scale and other charts are not drawn to scale. The m_sizeOfMetadata field identifies the total size of whatever metadata 27 happens to accompany a particular image of a navigation chart 36.
[0079] The m_sizeOfFilename data field (FIG. 4) identifies the size of the file name that will be used in the target system. The target system refers to the particular display system that will be displaying the navigation chart. The metadata in the m_sizeOfFilename data field may be stored as an unsigned integer ("unsigned int").
[0080] The m_pFilename data field (FIG. 4) identifies the file name of the raster graphics file
25 that contains the image data of the navigation chart 36. This data field 64 allows the target raster graphics file 25 to be correlated to the name originally given to a particular navigation chart by the vendor or supplier of the vector graphics file of that navigation chart. The size of this data field is variable and determined by the value stored in the m_sizeOfFilename data field, discussed above. The metadata 27 in the m_pFilename data field may be stored as a string of unsigned characters ("unsigned char").
[0081] FIG. 5 depicts additional metadata that may usefully be stored in memory 24 (or another memory accessible to controller 22) if the data field m_isToScale (FIG. 4) indicates that the navigation chart 36 is drawn to scale. If the navigation chart 36 is not drawn to scale, the data fields 64 of FIG. 5 may be omitted in their entirety. The m_header data field (FIG. 5) identifies the location and extent of header section 38 of navigation chart 36. Specifically, the m_header data field identifies which pixels in the image of navigation chart 36 correspond to header section 38. The metadata within this field may be stored in the "Rect" format, which defines a first set of eight bytes of data that identify the lower left coordinates of a rectangle and a second set of eight bytes of data that identify the upper right coordinates of the rectangle. The specific coordinates used to identify these two locations may be based on the coordinate reference system used in the m_whole data field (discussed above).
[0082] The m_plan data field (FIG. 5) identifies which pixels in the image of navigation chart 36 correspond to the map plan view section 40. As with the m_header data field, the metadata in the m_plan data field may be stored as two sets of eight bytes wherein the first set identifies the coordinates of the lower left corner of the rectangle of plan view section 40 and the second set identifies the coordinates of the upper right corner of the rectangle of plan view section 40.
[0083] The m_planLatLon data field (FIG. 5) identifies the location and extent of the map plan view section 40 in latitudinal and longitudinal coordinates. The "LatLonRect" field type
may be defined as a first set of sixteen bytes that identifies the latitude and longitude of the lower left corner of the map plan view section 40, and a second set of sixteen bytes that identifies the latitude and longitude of the upper right corner of the map plan view section 40.
[0084] The m_isProfilePresent data field (FIG. 5) identifies whether navigation chart 36 includes a profile section 42 or not. This information may be stored as a single bit (or byte) wherein a zero value indicates that navigation chart 36 does not include a profile view section 42 and a one value indicates that chart 36 does include a profile view section, or vice versa.
[0085] The m_profile data field (FIG. 5) identifies the location and extent of the aircraft profile view section 42 of navigation chart 36, if such a section 42 is present in chart 36. The metadata in the m_profile data field may be stored as a first set of eight bytes that defines the lower left coordinates of the profile view section 42 and a second set of eight bytes that defines the upper right coordinates of the profile view section 42.
[0086] The m_minimum data field (FIG. 5) identifies the location and extent of the aircraft minimum section 44 of navigation chart 36. The metadata in the m_minimum data field may be stored as a first set of eight bytes that defines the lower left coordinates of the minimum section 44 and a second set of eight bytes that defines the upper right coordinates of the minimum section 44.
[0087] The m_numberOfInsets data field (FIG. 5) identifies the number of insets 62 that are present (if any) within the plan view section 40 of navigation chart 36. The metadata in the m_numberOfInsets data field may be stored as a four byte unsigned integer.
[0088] The m_pInset data field (FIG. 5) identifies the location and extent of each of the insets
62 that are contained within the plan view section 40 of navigation chart 36. The metadata in the m_pInset data field will occupy a size that is dependent upon the actual number of insets 62 within the plan view section 40 of a given navigation chart 36. For each inset 62 within plan view section 40, a first set of eight bytes may be used to identify the lower left coordinates of the inset and a second set of eight bytes may be used to identify the upper right coordinates of the inset wherein the coordinates are specified in the same coordinate reference system used in the other metadata fields (e.g. the m_whole data field).
[0089] The m_pInsetLatLon data field (FIG. 5) also identifies the location and extent of each of the insets 62 that are contained within the plan view section 40 of navigation chart 36. The m_pInsetLatLon data field differs from the above-described m_pInset data field in that it defines the lower left corner and upper right corner of the inset 62 in latitudinal and longitudinal reference coordinates. It will be understood that, if the target navigational display system is configured to operate using a latitudinal and longitudinal reference system,
rather than some other particularized reference system, the m_pInset data field could be omitted while retaining the m_pInsetLatLon data field.
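Purely as an illustration of the tables of FIGS. 4 and 5, the data fields described above might be represented in memory roughly as follows. The C++ types are inferred from the stated sizes and field types; the actual on-disk layout, byte order, padding, and any fields not reproduced here would be governed by the real file specification, so this is a sketch rather than a definitive definition.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Two four-byte magnitudes (width, length) in thousandths of inches.
struct Magnitude2d {
    std::uint32_t widthThousandthsInch;
    std::uint32_t lengthThousandthsInch;
};

// "Rect" style field: eight bytes per corner (two four-byte coordinates),
// lower-left corner followed by upper-right corner.
struct Rect {
    std::int32_t lowerLeftX,  lowerLeftY;
    std::int32_t upperRightX, upperRightY;
};

// "LatLonRect" style field: sixteen bytes per corner (latitude, longitude).
struct LatLonRect {
    double lowerLeftLat,  lowerLeftLon;
    double upperRightLat, upperRightLon;
};

// Fields of FIG. 4, present for every chart.
struct ChartMetadataHeader {
    Magnitude2d   m_tiches;               // physical chart size
    Rect          m_whole;                // extent of the whole chart
    float         m_angleToRotateHeader;  // rotation to put the header on top
    bool          m_isToScale;            // true if the chart is drawn to scale
    std::uint32_t m_sizeOfMetadata;       // total size of the accompanying metadata
    std::uint32_t m_sizeOfFilename;       // length of m_pFilename
    std::string   m_pFilename;            // original source file name
};

// Fields of FIG. 5, present only when m_isToScale indicates a to-scale chart.
struct ScaledChartMetadata {
    Rect          m_header;            // pixels of the header section
    Rect          m_plan;              // pixels of the map plan view section
    LatLonRect    m_planLatLon;        // geographic extent of the plan view
    bool          m_isProfilePresent;  // whether a profile section exists
    Rect          m_profile;           // pixels of the profile view section
    Rect          m_minimum;           // pixels of the minimums section
    std::uint32_t m_numberOfInsets;    // number of insets in the plan view
    std::vector<Rect>       m_pInset;        // pixel extent of each inset
    std::vector<LatLonRect> m_pInsetLatLon;  // geographic extent of each inset
};
```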
[0090] As mentioned above, the data fields 64 illustrated in FIGS. 4 and 5 are merely illustrative of the types of metadata 27 that may be stored in memory 24. Different types of data fields, different numbers of data fields, and different formats for the data fields may be used in accordance with the present invention. Further, the precise location where data fields 64 are stored in memory can also be varied within the scope of the present invention. In one embodiment, the metadata 27 of data fields 64 that accompanies a particular navigation chart 36 is stored in memory 24 as part of the raster graphics file 25 that contains the image of that particular navigation chart 36. That is, the raster graphics file 25 that contains the image data for a navigation chart also includes the metadata 27 for that particular chart. In another embodiment, the data fields 64 that accompany a particular chart are stored in memory 24 in a file separate from the raster graphics file 25 that contains the image data of the particular chart. In this latter case, controller 22 will read two different files when displaying a particular navigation chart on display 28: the raster graphics file 25 containing the image of the navigation chart, and a separate file containing the metadata of data fields 64 that correspond to that navigation chart. Further, in this latter case, one or both of these two files should include information that allows the data fields 64 to be correlated to a particular raster graphics file 25. Alternatively, metadata could be stored in a memory separate from memory 24, if desired.
[0091] It will be understood that the particular number, kind, and format of the data fields 64 that accompany a given navigation chart 36 may vary considerably depending upon the particular form of the navigation chart 36. While the aircraft navigation chart 36 of FIG. 3, which includes rectangular sections 38-44, has been referenced herein, the present invention is applicable to navigation charts 36 that are divided into sections having shapes other than rectangles. For example, some navigation charts 36 might include one or more circular sections, or one or more square sections, or some other type of polygon or non-polygonal shape. If it is desirable for controller 22 to be able to recognize these non-rectangular shapes (such as for display purposes), then additional data fields 64 identifying which pixels in the raster graphics file 25 of the navigation chart correspond to those variously shaped sections would be stored as metadata 27. The format of these additional data fields 64 could be varied, but could include data identifying a center point and a radius for the circular sections, two corner locations for the square sections, and whatever metadata 27 that would be necessary to define the location of whatever other types of shaped sections navigation chart
36 contained. The specific data fields 64 that accompany a given navigation chart can therefore be tailored to the particular layout and information contained with a given navigation chart.
[0092] It is also possible that some navigation charts 36 may include multiple sections
(regardless of shape) that depict objects that are drawn to scale or that are referenced to a geographical coordinate system, such as latitude and longitude. In such a case, the present invention contemplates using additional data fields 64 like those described above to store information about the scale and/or geographic coordinate system of those multiple sections. In general, the present invention contemplates storing, in addition to the raster graphics image data of a navigation chart, any type of further metadata 27 about the chart that may be useful for controller 22 to know about the image data for purposes of facilitating the display of the navigation chart to the viewer.
[0093] FIG. 6 depicts another example of one of the many possible manifestations of navigational display system 20 according to the various aspects of the present invention. In FIG. 6, navigational display system 20 is incorporated into a conventional cell phone 63. Cell phone 63 includes a display area 65 and a plurality of keys 67. In the manifestation of FIG. 6, display area 65 of cell phone 63 corresponds to display 28, keys 67 correspond to user interface 26, and the internal memory and microprocessor of cell phone 63 (not shown) correspond to memory 24 and controller 22, respectively, of display system 20.
[0094] Display area 65 of cell phone 63 is illustrated in FIG. 6 displaying an aircraft navigation chart 36 (different from the one of FIG. 3). More specifically, display area 65 of cell phone 63 is illustrated in FIG. 6 displaying the map plan view section 40 and aircraft profile view section 42 of an aircraft navigation chart. The entire navigation chart (which would include header section 38 and minimums section 44) is not shown because the user has pressed the appropriate keys 67 to cause the controller 22 within cell phone 63 to automatically display only sections 40 and 42 of the navigation chart. Further, as can be seen, the display of sections 40 and 42 is not merely a zooming in on these sections of the map, but rather a display in which the sections surrounding sections 40 and 42 have been cut out of the displayed image. Thus, a user of display system 20 within cell phone 63 does not need to undergo the trial-and-error process of manually zooming and scrolling the image of the navigation chart until the appropriate section or sections are displayed. Instead, the user can press a button (or other type of user interface), and controller 22 will automatically display only the desired section at a size that fills, to the extent possible, the viewing area of the display. The precise keys 67 used to manipulate the image of the navigation chart can be
varied within the scope of the invention. In general, it is desirable to allow keys 67 to be able to zoom in and out on the navigation chart, automatically display different sections of the chart, and scroll the image of the chart up, down, and side-to-side.
[0095] As was noted previously, the device in which navigation system 20 (or 20') can be incorporated can vary substantially within the present invention, from aircraft flight deck computer systems to cell phones, personal digital assistants, and marine and land-based vehicle display systems, among still others. Whatever the specific structure, controller 22 may include one or more conventional microprocessors programmed to read raster graphics file 25 (and the accompanying metadata 27, if separate from file 25) from memory 24 and cause the associated display 28 to display the image of the navigation chart contained within raster graphics file 25. Further, the microprocessor would be programmed to allow the navigation chart image to be manipulated in the manners described herein (zooming, scrolling, selectively displaying sections, etc.) based on inputs from user interface 26. The software necessary to carry out these functions may vary from device to device, but would be well within the ability of a person of ordinary skill in the art to devise without undue experimentation.
[0096] Controller 22 can also be programmed to download additional navigation charts 36 from one or more databases. This downloading can take place over any computer network, including the Internet. Controller 22 can be configured in a client/server architecture where the database of navigation charts in raster graphics format acts as a server responding to requests from controller 22. Such an arrangement would allow for the downloading of individual navigation charts in an "on-demand" time frame, i.e., charts could be downloaded right at the moment they are needed. This "on-demand" feature would greatly improve upon prior methods of distributing navigation charts, particularly aircraft navigation charts, which have had to be purchased in bulk subscriptions rather than on a chart-by-chart basis.
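As a minimal, non-limiting sketch of such an on-demand request, the Python fragment below fetches a single chart file over HTTP. The server address, URL scheme, and file naming are hypothetical assumptions; the invention does not specify a particular protocol or chart server.

```python
import urllib.request

def download_chart(chart_id: str, server: str = "https://charts.example.com") -> bytes:
    """Request one raster graphics chart file from a hypothetical chart server."""
    url = f"{server}/charts/{chart_id}.bmp"   # URL scheme is an assumption for illustration
    with urllib.request.urlopen(url) as response:
        return response.read()

# Example: fetch a chart at the moment it is needed and store it in local memory.
data = download_chart("KXYZ-ILS-01")
with open("KXYZ-ILS-01.bmp", "wb") as f:
    f.write(data)
```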
[0097] Having now described the various display systems of the present invention for displaying navigation charts, the following description will turn to several other aspects of the present invention, including methods for generating the raster graphics files 25 and the formats and contents of the raster graphics files 25 and metadata 27. Turning specifically to FIG. 7, a chart conversion method 66 according to one aspect of the present invention is illustrated. Chart conversion method 66 begins with a vector graphics file 68 of one or more navigation charts 36 that are stored in any type of conventional memory device. Vector graphics file 68 is fed into a computer 70, which may be a conventional personal computer or any other type of computer capable of being programmed to carry out the functions described
herein. The manner in which the vector graphics file 68 (or files) are transferred to computer 70 can vary widely, and could include having computer 70 read the vector graphics files 68 directly from its own internal memory, transferring the vector graphics files to computer 70 via a network (including an Internet connection), physically transporting a memory device (such as a disk, DVD, CD-Rom, flash memory device, etc) to computer 70 and coupling the memory device to computer 70 in the appropriate manner, or still other methods.
[0098] Once computer 70 has received one or more vector graphics files 68 of navigation charts 36, computer 70 converts those charts into corresponding raster graphics files 25 (and metadata 27, which may or may not be contained within the file 25 itself). Raster graphics file 25 contains a set of raster graphics image data that corresponds to the navigation chart. That is, raster graphics file 25 contains the color information for each of the pixels that, when combined together, create a picture or image of the navigation chart. The metadata 27 is the same metadata that was discussed above with respect to data fields 64, and, as was discussed previously, may vary depending upon the layout and composition of a particular navigation chart, as well as what sections of the navigation chart it may be desirable for controller 22 to be able to automatically display by themselves.
[0099] After computer 70 has generated files 25 and metadata 27 from a given vector graphics file 68 of a navigation chart 36, the files 25 and metadata 27 are stored in an electronic repository 76. Electronic repository 76 may be the same as memory 24, but it also encompasses a wider variety of devices beyond memories specifically associated with a controller, user interface, and display (such as controller 22, user interface 26, and display 28). More specifically, electronic repository 76 may be a stand-alone memory device having no associated controller, display, or user interface. Such stand-alone memory devices include, but are not limited to, a floppy disk, a hard drive, a DVD, a CD-Rom, a flash memory device, or similar types of devices. Alternatively, electronic repository 76 may be a memory inside of a specific device, such as a memory contained within a portable digital assistant (PDA), a portable media player, a cell phone, a computer (laptop or desktop), or any other type of known device capable of electronically storing the raster graphics file 25 and metadata 27. Repository 76 may also be connected to the Internet or another local or wide area network. Computer 70 may be programmed to combine, for each navigation chart 36, the metadata 27 with the raster graphics file 25. If programmed in this manner, computer 70 will output a single raster graphics file 25 for each navigation chart 36. Alternatively, computer 70 may be programmed to store the raster graphics file 25 and metadata 27 separately, in
which case computer 70 will generate two files for each chart 36, or two sets of files for a database of multiple charts.
[00100] After computer 70 has converted as many vector graphics navigation charts into raster graphics files 25 and metadata 27 as is desired, the data within electronic repository 76 may be transferred to a mobile vehicle 78, which, as noted previously, could be an air, terrestrial, or marine vehicle. Mobile vehicle 78 may contain navigational display systems 20 or 20', or it may contain a display system different from navigational display systems 20 or 20'. The manner in which the data from repository 76 is transferred to mobile vehicle 78 can vary substantially within the scope of the invention. In some cases, files 25 and metadata 27 are transmitted wirelessly to a memory onboard mobile vehicle 78 (such as memory 24). In other cases, repository 76 might be physically transported to mobile vehicle 78 and connected to a computer on board mobile vehicle 78, such as may occur if repository 76 takes the form of a conventional Secure Digital (SD) card, a Compact Flash card, a portable USB (Universal Serial Bus) memory drive, or some other similar type of memory device. Other ways of transferring the data of repository 76 to the mobile vehicle are also possible.
[00101] Once the metadata 27 and raster graphic files 25 are transferred to the mobile vehicle, the mobile vehicle can display images of the navigation charts 36 via its on-board display system. The on-board display system, as mentioned above, may be navigational display system 20 or 20', or it may be a different type of on-board display system. Whatever the specific form of the display system on-board the mobile vehicle, there are several advantages that arise from having the navigation chart(s) 36 stored as raster graphic files 25 with accompanying metadata 27 (either within the file itself or separate), as opposed to vector graphic files 68. First, the computational resources required by the on-board display system of the mobile vehicle to render the navigation chart 36 from the raster graphics file 25 and metadata 27 are substantially less than would be required to render the navigation chart 36 from a vector graphics file 68. Second, the navigation chart can be displayed on a wider variety of on-board display systems because the on-board display systems do not need to be able to handle complex tasks, such as the rendering of TrueType fonts or the processing of vector graphics files written in specialized formats. And third, higher safety ratings for the software that renders the navigation chart can be more easily achieved (such as those specified in DO-178B) because the software necessary to render an image from the raster graphics file 25 and metadata 27 is simpler.
[00102] FIG. 8 illustrates a flowchart summarizing in greater detail the series of steps computer 70 may be programmed to follow in order to carry out the chart conversion process
66 outlined in FIG. 7. The steps illustrated in FIG. 8 are only one specific manner in which computer 70 may be programmed to carry out various aspects of the present invention, and it will be understood that computer 70 could be programmed to convert the vector graphics files 68 to raster graphics files 25 and metadata 27 in a variety of different manners.
[00103] Chart conversion process 66 (FIG. 8) begins with a vector graphic file 68 that contains vector graphics image data 80 of a navigation chart 36. The vector graphics image data 80 contains the information that defines the image of the navigation chart 36 using the vector graphics method of defining images. Vector graphics file 68 further includes a day palette 72 and a night palette 74. Day and night palettes 72 and 74 define the colors that are used to render the image of the navigation chart. Day and night palettes 72 and 74 are an optional component of vector graphics file 68. In some applications, a navigation chart may only have a single color palette associated with it, in which case vector graphics file 68 would include only that single palette, rather than two palettes. In other applications, such as that illustrated in FIG. 8, vector graphics file 68 includes multiple palettes, such as the day and night palettes; in still other applications, it may include no palettes at all.
[00104] Day and night palettes generally refer to color palettes that are used to render a navigation chart image during different times of the day, or during other times when the ambient lighting surrounding a display, such as display 28, changes. When an operator of a mobile vehicle is riding in the mobile vehicle at night time, the operator's eyes will often become accustomed to the low levels of light outside the vehicle. When the operator switches from looking out the window of the vehicle to items inside the vehicle, such as display 28, it is generally desirable to present the information on display 28 using color intensities that are less bright and less disruptive to the eyes of the operator. In other words, when the operator's eyes become accustomed to the low light conditions outside the mobile vehicle, it is desirable not to disrupt that visual adjustment by presenting brightly colored images to the operator. The day and night palettes 72 and 74 of vector graphics file 68 define two different sets of colors, the former a set of colors appropriate for displaying during high light level conditions, such as the day time, and the latter appropriate for display during low light level conditions, such as at night.
[00105] At step 82, computer 70 renders an unverified day chart image from the vector graphics file 68. A "day chart" refers to a navigation chart that is rendered using the colors specified in day palette 72. As noted, these colors generally make the chart more easily viewable during the day time hours. A "night chart", which is rendered at step 90, refers to a chart that is rendered using the colors specified in night palette 74, which are generally
appropriate for viewing at night time. The information contained in a day chart and the corresponding night chart is the same. The only difference is the selection of colors used when displaying the chart.
[00106] The rendering of the day chart in step 82 may be accomplished in any of a variety of known manners. In one embodiment, the rendering takes place inside a memory of computer 70. Such an internal rendering of the day chart may be accomplished using known techniques, such as the GetDIBits function of the Microsoft Windows® operating system. Other known functions of the Windows® operating system may also be used to render the chart at step 82. The rendering of the day chart defines a set of pixels that, when combined in the appropriate manner, create an image replicating the image of the navigation chart. While the present invention contemplates that the day chart could be rendered in step 82 with a variety of different resolutions, one acceptable resolution is to render the day chart using 2,048 pixels along the longest side of the day chart. If the day chart has a square shape, then the rendered image at step 82 will result in the creation of an image having 2,048 x 2,048 pixels. If one side of the chart is shorter than the other side, then the longer side will have 2,048 pixels and the shorter side would be divided into a smaller number of pixels corresponding to the shorter length of that side of the image.
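The sizing rule just described can be expressed compactly. The following Python sketch scales a chart's dimensions so that the longer side is rendered with 2,048 pixels; the exact rounding behavior is an assumption, not something specified above.

```python
def render_dimensions(chart_width: float, chart_height: float, longest: int = 2048) -> tuple[int, int]:
    """Return pixel dimensions with 2,048 pixels along the longer side and the
    shorter side scaled proportionally (rounding is an implementation choice)."""
    if chart_width >= chart_height:
        return longest, round(longest * chart_height / chart_width)
    return round(longest * chart_width / chart_height), longest

# A chart twice as tall as it is wide renders as 1024 x 2048 pixels;
# a square chart renders as 2048 x 2048 pixels.
print(render_dimensions(10.0, 20.0))   # (1024, 2048)
print(render_dimensions(10.0, 10.0))   # (2048, 2048)
```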
[00107] The rendering of the day chart at step 82 may be accomplished with the assistance of a conventional graphics card, such as would be known by one of ordinary skill in the art. One suitable system for rendering the day chart in step 82 is the ATI Catalyst® graphics software for Microsoft Windows that is available from Advanced Micro Devices of Sunnyvale, California. Other graphics software may also be used within the scope of the present invention.
[00108] As noted, the result of rendering the day chart at step 82 is the definition of a plurality of pixels. Each of these pixels has a specific color associated with it. In one embodiment, the result of step 82 is the creation of pixels that are defined by a 32 bit quad RGB value. As is known to those skilled in the art, the 32 bit quad RGB format uses 8 bits to define a red value, 8 bits to define a green value, 8 bits to define a blue value, and 8 bits that are not used. In this format, there are therefore 256 possible color values for the color red, 256 values for the color green, and 256 values for the color blue. In combination, there are therefore more than 16 million colors that can be defined by the 32 bit quad RGB format (256 x 256 x 256=16,777,216).
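The 32 bit quad RGB layout described above can be illustrated in a few lines of Python. This is a sketch of the general format only, not of any particular rendering library's output.

```python
def pack_rgbquad(red: int, green: int, blue: int) -> int:
    """Pack 8-bit red, green, and blue components into a 32-bit quad RGB value;
    the top 8 bits are left unused."""
    return (red << 16) | (green << 8) | blue

def unpack_rgbquad(value: int) -> tuple[int, int, int]:
    """Recover the 8-bit red, green, and blue components from a 32-bit value."""
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF

print(hex(pack_rgbquad(0x8F, 0x6A, 0x2D)))   # 0x8f6a2d
print(unpack_rgbquad(0x8F6A2D))              # (143, 106, 45)
print(256 * 256 * 256)                       # 16777216 possible colors
```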
[00109] It should be noted that the number of colors in the day chart that is rendered at step 82 will likely, but not necessarily, be different than the number of colors defined in the day
palette 72 of vector graphics file 68. For example, it is known that the vector graphics files of aircraft navigation charts marketed by Jeppesen, Inc. of Englewood, CO, contain night and day palettes 72 and 74 that each contain a maximum of 32 colors. The resulting day chart created at step 82, however, will typically have more than 32 colors. The reason for this is that the various conventional graphics software that can be used within the present invention to render the day chart will typically add additional colors for anti-aliasing purposes. Some of the pixels defined in step 82 will therefore have colors that have been created for anti-aliasing purposes. These anti-aliasing colors will likely be different than the original colors specified in the day palette 72 of the vector graphics file 68. As will be explained in more detail below, the present invention, in one embodiment, limits the number of anti-aliasing colors so that the generated raster graphics file 25 consumes a reduced amount of memory.
[00110] After the day chart has been rendered at step 82 (FIG. 8), an optional comparison step
84 is undertaken by computer 70 in order to verify that the rendered day chart has been properly rendered and contains no artifacts. The detailed description of comparison step 84 will be provided below with reference to FIG. 9. Comparison step 84 produces a verified image of the daytime version of the navigation chart. After the rendered chart has undergone comparison step 84, computer 70 runs a color reduction and palettization process 86. The color reduction and palettization process 86 will be described in greater detail below with respect to FIGS. 10, 11, and 12. In general, color reduction and palettization process 86 will reduce the number of anti-aliasing colors produced during step 82 to a predetermined threshold. Further, process 86 will create a day color palette to which each of the individual pixels will be indexed. The result of process 86 will be a raster graphics file that uses less memory than it otherwise would if the file were created directly from the verified image data output at step 84. The color reduction and palettization process 86 is thus an advantageous process, but not a critical step in chart conversion method 66.
[00111] At step 88, computer 70 creates a night palette. The creation of the night palette at step 88 is dependent upon the rendering of the night chart in step 90, as will be explained further below. The rendering of the night chart at step 90 is performed in the same manner as the rendering of the day chart at step 82. The only difference is that different colors are used in the night color rendition than in the day color rendition. The rendering of the night chart at step 90 may be followed by an optional comparison step 92 which is carried out in the same manner as step 84. The night palette created at step 88 will define a color for each of the pixels in the day image of the navigation chart. Each of the pixels will have an associated index value that corresponds to one of the colors in the night palette.
[00112] At step 94, computer 70 extracts the data from vector graphics file 68 that is necessary to define the metadata 27. As was discussed above, this metadata 27 may include the information listed in FIGS. 4 and 5, merely a fraction of this information, or additional information beyond what is listed in FIGS. 4 and 5. As was also mentioned previously, the contents of the metadata 27 may vary depending upon the specific type of navigation chart.
[00113] After the metadata has been extracted at step 94, it may be combined with the output of step 88 at an injection step 96. Injection step 96 combines the metadata 27 with the image data and palette data that was created at step 88. As was noted previously, the metadata 27 may be stored separately from the image and palette data. If this separate storage is desired, then step 96 would be omitted and the metadata 27 would be saved into whatever memory (such as memory 24) it was desired to store it in. This can be done at step 101. The day image, night palette, and day palette from step 88 may also be saved in the memory at step 101 without combination with the metadata at step 96.
[00114] After the metadata 27 is injected into the image and palette data generated at step 88, computer 70 creates a raster graphics file 25 at step 98. The created raster graphics file may take on a variety of different forms in accordance with the present invention. In one embodiment, step 98 creates a conventional bitmap file in which the pixels corresponding to the navigation chart are stored as image data in the conventional image data block of the bitmap file, and the metadata is stored in an optional data block that is part of the conventional definition of the bitmap file standard. This will be described in more detail below with respect to FIG. 15. Alternatively, if it is desired to convert the bitmap file into a raster graphics file having a format different than the bitmap format, this can also be done. Such a different format may be desirable for different types of target display systems. The details of converting a bitmap file into a different type of target file will be described in more detail below with respect to FIG. 17 along with an optional feedback process 100 that may be used to confirm the raster graphics file was properly generated. An illustrative example of a raster graphics file 25 in a format different than the bitmap file created at step 98 will also be described in more detail below with respect to FIG. 16.
[00115] FIG. 9 illustrates in greater detail the process involved in comparison steps 84 and 92.
For purposes of description herein, FIG. 9 will be described with reference to comparison step 84, which corresponds to the comparison step undertaken with respect to the day chart image created at step 82. It will be understood, however, that the following description is equally applicable to comparison step 92, which is used in conjunction with the night chart image generated at step 90. The comparison step illustrated in FIG. 9 involves a first image
capture method 104 and a second image capture method 106. Both of the image capture methods 104 and 106 result in the definition of pixels that create an image of the navigation chart 36.
[00116] In one embodiment, the first image capture method 104 involves defining the pixels for the entire navigation chart 36. In that same embodiment, the second image capture method 106 involves defining a plurality of pixels of two different parts of the navigation chart that together make up a complete image of the chart. For example, second image capture method 106 may define the pixels for the top half of the navigation chart in a first step and define the pixels for the bottom half of the navigation chart in a second step. Alternatively, second image capture method 106 could involve tiling the image of the navigation chart into more than two different pieces. The individual tiles of the image would then be pieced back together to define an entire image of the navigation chart. The number of tiles can be varied from two to any number greater than two. Further, the specific portions of the image that are captured can be varied as desired. For example, while it was just mentioned that image capture method 106 may involve capturing the top half of the image separately from the bottom half, it is also possible to capture the left half separately from the right half of the image of the navigation chart. Alternatively, still different portions of the image may be individually captured with second method 106.
[00117] Regardless of the specific manner in which the navigation chart is broken into pieces and individually rendered via second method 106, these pieces are combined back together to define an entire image of the navigation chart. The entire image that is pieced together at step 106 is then compared at step 108 (FIG. 9) with the output of the first image capture method 104. If the image of the navigation chart has been correctly captured by computer 70, then the results of steps 104 and 106 should be identical. Comparison step 108 determines whether the results of steps 104 and 106 match. If the images do match, then computer 70 selects the image generated by either step 104 or 106 and proceeds to the next step in chart conversion method 66 (FIG. 8) using the selected image data. If the images from steps 104 and 106 do not match, then computer 70 indicates this mismatch to the operator of computer 70. The operator may then instruct computer 70 to re-start the steps illustrated in FIG. 8 to see if a repetition of the steps will result in a match at step 84 (or 92). Alternatively, the computer may take other actions in response to a mismatch from capture methods 104 and 106.
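Assuming the two rendered outputs are available as pixel arrays, the comparison of the whole-image capture against the reassembled tiled capture reduces to an element-by-element equality check, as in the sketch below. The two-tile (top/bottom) layout and the array shapes are assumptions made for illustration.

```python
import numpy as np

def verify_capture(whole: np.ndarray, top_half: np.ndarray, bottom_half: np.ndarray) -> bool:
    """Reassemble the two tiles captured by the second method and compare the
    result against the single capture of the entire chart."""
    reassembled = np.vstack([top_half, bottom_half])
    return whole.shape == reassembled.shape and bool(np.array_equal(whole, reassembled))

# Hypothetical 2048 x 2048 captures; a mismatch would be reported to the operator.
whole = np.zeros((2048, 2048), dtype=np.uint32)
top, bottom = whole[:1024].copy(), whole[1024:].copy()
print(verify_capture(whole, top, bottom))   # True when no artifact was introduced
```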
[00118] The purpose of steps 104, 106, and 108 is to help ensure that the rendering steps 82 and 90 have generated a plurality of pixels that accurately represent the image of the
navigation chart. While comparison steps 84 and 92 are both optional in the present invention, they add a degree of safety to the overall conversion process of the present invention. This added safety can be especially helpful when attempting to certify the methods of the present invention to meet industry standard safety levels, such as those set forth in the DO-178B or DO-200A standards. More particularly, comparison steps 84 and 92 help ensure that no artifact is introduced into the pixel data during the previous image rendering of steps 82 and 90. This is accomplished by rendering the entire image at step 104 and various pieces at step 106 which are then re-combined. If the rendering of the image at step 104 introduces any visual artifact, such as a Microsoft Windows® logo, window, message, or any other undesirable item not part of the navigation chart, comparison step 108 will likely detect this. Comparison step 108 will detect this because the artifact will appear in different locations in the outputs of steps 104 and 106. Thus, for example, if step 104 introduces an artifact in the lower left corner of the image of the entire navigation chart, second image capture method 106 will also produce the same artifact in the lower left corner of each of the pieces of the image that are captured during step 106. If step 106 captures the image by separately capturing the top half of the image and then separately capturing the bottom half of the image, each of the two halves of the image will include the same artifact in the lower left hand corner. Thus, when the top half and bottom half of the image captured in step 106 are combined together into a single image of the entire navigation chart, there will be two artifacts, one in the lower left hand corner of the top half, and one in the lower left hand corner of the bottom half. When the entire image captured in step 106 is compared with the entire image captured at step 104 in comparison step 108, they will not match. The output of step 104 will have a single artifact in the lower left hand corner, while the output of step 106 will have two artifacts. This mismatch will be reported to the operator via computer 70.
[00119] The details of color reduction and palettization process 86 (FIG. 8) will now be described in greater detail with respect to FIGS. 10a-10e. FIG. 10a depicts a raster graphics day image 110 having a height 112 and a width 114. While image 110 in FIG. 10a is a blank image, image 110 would normally be an image of a navigation chart 36. For example, image 110 could be an image of the chart illustrated in FIG. 3. Alternatively, image 110 could be an image of any navigation chart 36 desirably used in accordance with the methods and systems of the present invention. Image 110 is the image that is generated at step 82 and it may either be verified at step 84, or the step of verification may be omitted. Image 110 is fed into the color reduction and palettization process 86.
[00120] Image 110 consists of a plurality of pixels 116. For clarity, FIG. 10a only illustrates a fraction of the pixels 116 which comprise the entire image 110. The precise number of pixels 116 that can be used to define image 110 can vary within the scope of the present invention. As noted, however, one embodiment of the present invention defines 2,048 pixels along the longer edge of image 110. In the illustration of FIG. 10a, the height 112 dimension of image 110 would thus be divided into 2,048 pixels as it is longer than width dimension 114. Other numbers of pixels can be used within the scope of the present invention.
[00121] Each pixel 116 has a color value associated with it. While the length of this color value can be varied within the scope of the present invention, a 32 bit length will be used for purposes of discussing FIGS. 10a-10e. FIG. 10a illustrates a 32 bit color value 118 in which bits 0-7 define a blue value, bits 8-15 define a green value, bits 16-23 define a red value, and bits 24-31 are unused. Image 110 will consume an amount of memory equal to the number of pixels 116 multiplied by 32 bits (not counting the palette data). Because this may be an unacceptably large amount of memory, the size of color values 118 may be reduced via process 86 in a manner that is illustrated more clearly in FIGS. 10b-10e and 11.
[00122] After raster day image 110 has been rendered, computer 70 tabulates all the different color values 118 that result. An example of one such tabulation is depicted in FIG. 10b. As was noted above, the number of colors tabulated may be greater than the number of colors originally defined in the day palette 72 of vector graphics file 68. This is because the rendering step 82 may create a number of anti-aliasing colors that are added to image 110. The total number of colors in image 110 therefore may exceed that in the original vector graphics file 68.
[00123] After computer 70 has tabulated all of the colors in image 110, computer 70 generates an unreduced day color palette 119, an example of which is illustrated in FIG. 10c. Unreduced day palette 119 is a table that includes all of the color values 118 that are used in image 110 and that omits all of the color values that are not used in image 110. In the example illustrated in FIG. 10c, there are 390 different colors defined in image 110 (0 through 389).
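Tabulating the colors actually used and building the unreduced palette can be sketched as follows. The pixel data is assumed to be a flat sequence of 32-bit color values, and the palette ordering shown is an implementation choice rather than anything required above.

```python
from collections import Counter

def build_unreduced_palette(pixels: list[int]) -> tuple[list[int], Counter]:
    """Count how often each 32-bit color value is used and list only the colors
    that actually appear in the image."""
    counts = Counter(pixels)
    palette = sorted(counts)          # stable ordering; the exact order is a choice
    return palette, counts

# Toy image using the pixel counts discussed in the FIG. 10c example.
pixels = [0x000000] * 15840 + [0x000003] * 20 + [0x8F6A2D] * 3
palette, counts = build_unreduced_palette(pixels)
print(len(palette))                   # 3 distinct colors in this toy image
print(counts[0x000000])               # 15840 pixels use color 0x000000
```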
[00124] After computer 70 has generated unreduced day palette 119, computer 70 utilizes a color reduction process 120 illustrated in FIG. 11 to create a reduced day palette 144. At step
121, computer 70 counts the total number of colors in unreduced day palette 119. At step 122, computer 70 compares the total number of colors in unreduced day palette 119 with a predetermined threshold number 124. While the present invention can be utilized with any suitable threshold number 124, the following discussion will utilize a threshold value of 256. The number 256 will be used because it can be represented by an eight bit data field (2^8 = 256). Other values, however, can be used. If the total number of colors in unreduced palette 119 is less than or equal to the threshold number 124, then computer 70 will proceed to step 126 where it will begin an indexing process that replaces each of the color values 118 with an index value that identifies a color entry in the unreduced day palette 119.
[00125] Process or step 126 may best be understood with an example. Suppose, for example, that the unreduced day palette 119 contained 155 colors (rather than the 390 listed in FIG. 10c). Rather than storing the 32 bit RGB quad hex value for each pixel 116, computer 70 would store an eight bit index value (which could have 256 different values) for each pixel 116 that corresponded to the correct color value for that pixel in palette 119. Thus, in the example of FIG. 10c, all pixels with an RGB color value of 0x8F6A2D (which requires 32 bits of data to store) would be replaced by the eight bit index value of 10011110 (decimal 158), which is the index entry in palette 119 corresponding to the color value 0x8F6A2D. Similar substitutions would be made for all of the rest of the pixels 116 in the image 110.
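The indexing step can be sketched in a few lines of Python: each 32-bit color value is replaced by the 8-bit position of that color in the palette. The toy palette below is an assumption for illustration.

```python
def index_pixels(pixels: list[int], palette: list[int]) -> bytes:
    """Replace each 32-bit color value with its 8-bit index into the palette,
    reducing storage from four bytes per pixel to one byte per pixel."""
    index_of = {color: i for i, color in enumerate(palette)}
    return bytes(index_of[p] for p in pixels)

# In the FIG. 10c example, 0x8F6A2D sits at palette entry 158, so each such pixel
# would become the single byte 158 (binary 10011110). In this toy palette it is entry 2.
palette = [0x000000, 0x000003, 0x8F6A2D]
indexed = index_pixels([0x8F6A2D, 0x000000, 0x8F6A2D], palette)
print(list(indexed))                  # [2, 0, 2]
```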
[00126] If computer 70 determines at step 122 (FIG. 11) that there are more color values in unreduced day palette 119 than the predetermined threshold number, then it moves to step 128. At step 128, computer 70 calculates the color distance between each and every pair of different colors in unreduced day palette 119. The number of color distances calculated at step 128 will be equal to (N)(N-1)/2 wherein N is the number of color values in palette 119. For example, in the chart illustrated in FIG. 10c, there are 390 different color values. Computer 70 would therefore compute (390 x 389)/2 color distances. This is equal to 75,855 color distances.
[00127] The manner in which the color distance is calculated is illustrated in more detail in
FIG. 12a. FIG. 12a illustrates that the distance between each pair of colors is found by squaring the difference between each of the individual color components in the color values 118 and summing the results. In the example illustrated in FIGS. 10 and 11, the color values 118 consist of a red value, a green value, and a blue value. The color distance calculation involves squaring the difference between the red values in a pair, squaring the difference between the green values in the pair, and squaring the difference between the blue values in the pair. These squared values are then summed together and their square root may optionally be taken. Taking the square root would produce a true color distance, but this is not necessary because the pairs of colors will be arranged in a distance table 127 (FIG. 12b) from the shortest distance to the longest distance and this arrangement will be the same regardless of whether the color pairs are arranged by distance or the square of the distance value. Accordingly, the term "color
distance" as used herein will refer to the actual color distance or the square of the actual color distance, as well as any other values that correlate to the color distance in a manner that does not alter the order of the color distance from shortest to longest, or vice versa.
[00128] FIG. 12a illustrates a calculation of the squared distance between the pair of color values having index entries of 158 and 200. Specifically, the 158 color value has a hexadecimal value of 0x8F6A2D. The 200 color value has a hexadecimal value of 0x91A434. To compute the squared distance between these two color values, the squared difference between the red values is first computed. In this case, the squared distance between the red values is equal to the square of the difference between hexadecimal 8F and hexadecimal 91. Next, the squared distance between the green values is computed. This distance is equal to the square of the difference between hexadecimal 6A and A4. Thereafter, the squared distance between the blue values is calculated. This squared distance is equal to the square of the difference between hexadecimal 2D and 34. The sum of these squared distances between the red, green, and blue values is then determined. The sum is equal to the squared distance between the color values indexed at entries 158 and 200. As noted, this distance may be left as a squared value or a square root could be taken to determine an actual distance. In the illustration of FIGS. 12a and 12b, the square root is not taken because this requires less computation and the results are the same as when the square root is determined.
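The squared color distance of FIG. 12a can be computed directly from the packed 32-bit color values, as in this short sketch.

```python
def squared_color_distance(c1: int, c2: int) -> int:
    """Sum of squared differences of the red, green, and blue components of two
    32-bit color values; the square root is deliberately not taken."""
    dr = ((c1 >> 16) & 0xFF) - ((c2 >> 16) & 0xFF)
    dg = ((c1 >> 8) & 0xFF) - ((c2 >> 8) & 0xFF)
    db = (c1 & 0xFF) - (c2 & 0xFF)
    return dr * dr + dg * dg + db * db

# The pair from FIG. 12a: (0x8F - 0x91)^2 + (0x6A - 0xA4)^2 + (0x2D - 0x34)^2
print(squared_color_distance(0x8F6A2D, 0x91A434))   # 3417
```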
[00129] As noted above, computer 70 determines the distance (or distance squared) between each of the color values in unreduced palette 119. After computer 70 has made its calculation of color distance, it arranges the color distances (or color squared distance) in color distance table 127 in a manner starting from the smallest color distance (or color squared distance) to the largest distance (or color squared distance). FIG. 12b illustrates an example of such a color distance table.
[00130] As can be seen in FIG. 12b, the color index values 158 and 200 refer to colors that are separated by a squared distance of 3417. Similarly, the pair of color index values 388 and 389 refer to colors that are separated by a color distance squared of 36. The color index values 0 and 1 are separated by a distance squared of 9. The color pair consisting of index color values 200 and 209 has the shortest distance between its colors. Specifically, the distance between these colors is only 1. While color distance table 127 is shown in FIG. 12b as only containing six separate distance entries, as mentioned above, it would actually contain (390 x 389)/2 total entries. These additional entries have been omitted for purposes of ease of illustration.
[00131] After computer 70 has computed the color distance (or color distance squared) between each pair of color values at step 128, it moves on to step 130 (FIG. 11) where it determines which color pair has the shortest distance (or shortest distance squared) between its colors. As noted, in the example of FIG. 12b the color pair with the shortest distance between its colors consists of colors having index values of 200 and 209. After determining the color pair with the shortest distance at step 130, computer 70 determines at step 132 (FIG. 11) whether both of the color values in that pair are object colors. The term "object color" refers to a color that was originally defined in the day palette 72 of vector graphics file 68. If both of the color values in the color pair are object colors, computer 70 leaves those two color values unchanged and returns to step 130 where it then determines the color pair with the next shortest distance. For example, in reference to FIG. 12b, computer 70 will first determine at step 132 whether index colors 200 and 209 are both object colors or not. If both of these colors are object colors, then computer 70 would return to step 130 and determine the pair of colors with the next shortest distance between them, which in this case would be the colors with index values 0 and 1. As can be seen in FIG. 12b, index colors 0 and 1 have a color distance squared of 9. Computer 70 would then determine whether the colors with index values of 0 and 1 were both object colors at step 132. If they were, it would return to step 130 and find the color pair with the next shortest distance. This pattern would continue until computer 70 eventually located a color pair in which at least one of the colors was not an object color.
[00132] When computer 70 locates such a color pair, it proceeds to step 134 (FIG. 11). At step 134, computer 70 determines whether both of the colors in the color pair are non-object colors. (A non-object color is a color not defined in the day palette 72 of vector graphics file 68). If both of the colors are non-object colors, computer 70 will proceed to step 136. At step 136 computer 70 replaces the less frequently used color value in the pair with the more frequently used color value in the pair. For example, if the particular color pair consisted of color index values 0 and 1, as referenced in FIG. 10c, and both of these colors were non-object colors, computer 70 would replace color index value 1 with color index value 0 because color index value 0 is far more frequently used than color index value 1. Specifically, color index value 0 is used in 15,840 pixels, while color index value 1 is used for only 20 pixels. The 20 pixels that were previously assigned a color index value of 1 (which corresponds to the 32 bit RGB value 0x000003) would be re-assigned to the index color value 0 (which corresponds to the hexadecimal color value 0x000000). Thereafter, there would be 15,860 pixels with a color index value of 0 and no pixels with a color index
value of 1. Further, the total number of colors in the day palette 119 would be reduced by one.
[00133] After computer 70 has made the appropriate color replacement at step 136, it proceeds to step 138 where it determines whether the reduced number of color values is now equal to or less than the color threshold 124. Because step 136 replaces one color value with another color value, color palette 119 now consists of one fewer color value than it had prior to step 136. Step 138 determines whether this reduced number of color values is equal to or less than threshold number 124. If it is, computer 70 jumps to step 126. If it is not, computer 70 returns to step 130 where it then determines the color pair having the next shortest distance. The next shortest color pair will be the next shortest pair out of those colors that still remain in palette 119. That is, any color values that have been re-assigned to a new value will not be considered when determining the next shortest color pair because the color distance in table 127 would no longer be accurate for those re-assigned color values. After the next color pair of unchanged colors having the shortest distance is identified, computer 70 proceeds through steps 132-134 in the same manner as has been described, as appropriate.
[00134] If computer 70 determines at step 134 (FIG. 11) that both of the color values in a particular pair are not non-object colors (which, in conjunction with step 132, means that one color is an object color and one is not an object color), then it proceeds to step 140. At step 140, computer 70 replaces the non-object color with the object color in day palette 119. Thereafter, computer 70 proceeds to step 138 where, as described above, it determines whether the reduced number of colors in palette 119 is less than or equal to threshold number 124. The outcome of that determination dictates whether computer 70 then proceeds to step 126 or continues to repeat the cycle of steps that start at step 130.
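A minimal sketch of the reduction loop of FIG. 11 (steps 130-140) is given below, assuming the per-color pixel counts and the set of object colors are already available as simple Python structures. It precomputes the distance table once and skips any color that has already been merged away, which approximates the "next shortest unchanged pair" behavior described above; it is an illustration under those assumptions, not the only way the steps could be implemented.

```python
from itertools import combinations

def sq_dist(c1: int, c2: int) -> int:
    """Squared RGB distance between two 32-bit color values."""
    return sum((((c1 >> s) & 0xFF) - ((c2 >> s) & 0xFF)) ** 2 for s in (16, 8, 0))

def reduce_palette(counts: dict[int, int], object_colors: set[int], threshold: int = 256) -> dict[int, int]:
    """Merge the closest, least-used non-object colors into their partners until
    no more than `threshold` colors remain; object colors are never changed.
    Returns a map from each removed color to the color that replaced it."""
    remap: dict[int, int] = {}
    # Step 128: all pairwise distances, shortest first (the distance table of FIG. 12b).
    pairs = sorted(combinations(list(counts), 2), key=lambda p: sq_dist(*p))
    for a, b in pairs:
        if len(counts) <= threshold:                   # step 138: reduction complete
            break
        if a not in counts or b not in counts:         # one of the pair was already merged
            continue
        if a in object_colors and b in object_colors:  # step 132: leave object colors alone
            continue
        if a in object_colors or b in object_colors:   # step 140: keep the object color
            keep, drop = (a, b) if a in object_colors else (b, a)
        else:                                          # step 136: keep the more frequent color
            keep, drop = (a, b) if counts[a] >= counts[b] else (b, a)
        counts[keep] += counts.pop(drop)               # re-assign the dropped color's pixels
        remap[drop] = keep
    return remap

# Toy example: the anti-aliasing color 0x000003 (20 pixels) is merged into 0x000000 (15,840 pixels).
counts = {0x000000: 15840, 0x000003: 20, 0x8F6A2D: 3}
print(reduce_palette(counts, object_colors={0x000000, 0x8F6A2D}, threshold=2))   # {3: 0}
```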
[00135] As should be apparent from the foregoing discussion of FIG. 11, computer 70 will eventually change a sufficient number of colors in the unreduced day palette 119 so that it will have a total number of colors that is equal to the threshold number 124. When this occurs, unreduced day palette 119 will become the reduced day palette 144 (FIG. 10d). The reduction process 120 of FIG. 11 will always produce a reduced color palette 144 so long as the threshold number of colors 124 is greater than or equal to the original number of colors in the day palette 72 of vector graphics file 68. Further, as was noted above, the resulting day palette 144 from process 120 will retain all of the original colors in day palette 72 of vector graphics file 68. The only colors that are changed in process 120 are the anti-aliasing colors that were added during the chart rendering of step 82. Further, the anti-aliasing colors that are changed are those that are close to other colors in the unreduced day palette 119 and that
are used relatively less frequently than the colors that are not changed. This results in a reduced visual impact on the image 110. The reduction process 120 of FIG. 11 therefore preserves the original colors of vector graphics file 68 while minimally impacting the anti-aliasing colors introduced at step 82.
[00136] If the threshold number 124 is chosen to be 256, each of the pixels 116 can be correlated with an 8 bit color index value (FIG. 10e). This reduces the amount of memory necessary to store image 110. As was noted previously with respect to FIG. 10a, the result of step 82 (FIG. 8) may be an image 110 in which each pixel 116 is represented by a 32 bit color value 118. After color reduction process 120 is complete, each pixel 116 will be represented by an indexed color value 142 (FIGS. 10d-10e) that can be stored as an 8 bit value. Consequently, color reduction process 120 will reduce the data necessary to store image 110 by approximately 75 percent (32 bits per pixel to eight bits per pixel plus a color palette). After this reduction in size, each pixel 116 in the reduced image of the navigation chart has an indexed color value 142 associated with it, rather than a direct color value. This indexed color value 142 identifies an entry in reduced day palette 144, such as is illustrated in FIG. 10d.
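The approximate savings can be checked with a quick calculation for a hypothetical 2,048 x 2,048 image and a 256-entry, 32-bit-per-entry palette (these sizes are assumptions taken from the examples above):

```python
pixels = 2048 * 2048
unreduced = pixels * 32                      # 32 bits per pixel, no palette
reduced = pixels * 8 + 256 * 32              # 8-bit indices plus a 256-entry, 32-bit palette
print(round(100 * (1 - reduced / unreduced)))   # 75, i.e. roughly a 75 percent reduction
```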
[00137] A process 154 for creating a raster graphics night palette is depicted in FIGS. 13 and
14. In one embodiment, the original vector graphics file 68 of the navigation chart 36 includes a mapping 146 that correlates day palette 72 with night palette 74, such as is illustrated in FIG. 13a. Day/night mapping 146 maps the colors of the day palette 72 of the navigation chart 36 to the night colors in night palette 74. Day/night mapping 146 provides information enabling a user of vector graphics file 68 to render a night image. Specifically, day/night mapping 146 enables the user of vector graphics file 68 to replace the day color values with the corresponding night color values when a night time rendition of the navigation chart 36 is desired. While day/night mapping 146 illustrates only 32 colors, it will be understood that the present invention is applicable to vector graphics files that include more or fewer than 32 colors.
[00138] As was briefly mentioned above, computer 70 renders a raster graphics image of a night version 152 (FIG. 13d) of the navigation chart 36 at step 90 (FIG. 8). Step 90 is carried out in the same manner as step 82 using the night palette 74 of vector graphics file 68 rather than the day palette 72 (which is used in step 82). After the night chart is rendered, comparison step 92 ensures that the rendered image is an accurate representation of the corresponding navigation chart. Thereafter, the night image 152 is used to create the night palette at step 88.
[00139] While the night palette 74 of the vector graphic file 68 in the example illustrated in
FIG. 13a includes only 32 colors, the raster graphic image 152 of the night chart that is rendered will likely include more than 32 colors. This is because the conventional software and/or hardware that may be used to render the image into a raster graphic image will likely add anti-aliasing colors to the raster graphic night image 152. The raster graphic night image 152 will therefore likely include more than the original 32 colors specified in the vector graphic night palette 74.
[00140] FIG. 14 depicts a raster graphic night palette creation process 154 that is carried out by computer 70. Raster graphic night palette creation process 154 utilizes the reduced, raster graphics day palette 144, such as that illustrated in FIG. 1Od. Process 154 begins by choosing one of the index values in reduced day palette 144. While the initial index value chosen can be any of the values in palette 144, an initial value of zero will be chosen for purposes of discussion herein. This index value will be referred to as value X in FIG. 14 and the accompanying discussion. While any initial value of X may be chosen, and the order of selecting subsequent index values from palette 144 can vary in any manner, night palette creation process 154 will eventually address every index value in day palette 144. It therefore will be more convenient to describe process 154 with an index value X that starts at zero and increments to the highest value in palette 144.
[00141] As noted, index value X identifies an entry in the reduced, raster graphics day palette
144. In the table of FIG. 13b X will be incremented from 0 all the way up to the highest index value in palette 144, which is 255. Thus, in the example of FIG. 14, X will be incremented from 0 to 255 during the night palette creation process 154.
[00142] Night raster graphic palette creation process 154 (FIG. 14) begins at step 156 where a set of pixels D in the raster graphics day image 110 is identified. Specifically, the pixels having an index value of X are identified. With respect to the example depicted in FIG. 13b, computer 70 identifies at step 156 the 15,840 pixels that have the color value 0x000000 (index value 0). At step 158, computer 70 identifies all of the pixels in the night image 152 that have the same physical location within the image as the pixels in set D. These pixels constitute a set N. Thus, for example, if set D consisted of one pixel in the upper left hand corner of raster graphic day image 110, then set N would consist of a single pixel in the upper left hand corner of raster graphic night image 152. Because the raster graphic day image 110 has the same number of pixels as the raster graphic night image 152, any given pixel in the day image 110 will have a corresponding pixel in the night image 152 located in the same location.
[00143] After the night palette creation process 154 completes step 156 and step 158, computer 70 determines at step 160 whether the color defined by the index value X is an object color in the vector graphics day palette 72. If it is, computer 70 proceeds to step 164. If it is not, computer 70 proceeds to step 166. At step 166, computer 70 determines the average color value of all of the pixels in the set N. This average value is the average of the various color components. Thus, if the colors are defined as shades of red, green, and blue, the red values are averaged, the green values are averaged, and the blue values are averaged. The average of these red, green, and blue values define an average color.
[00144] From step 166, computer 70 passes to step 168. At step 168, computer 70 sets the raster graphics night palette entry for the index value X equal to the average color value determined at step 166. Thereafter, computer 70 increments the value of X at step 170. After incrementing X at step 170, computer 70 determines at step 172 whether X is equal to the threshold color value 124 discussed previously. If it is, the raster graphic night palette creation process 154 is complete and the entire night palette 178 has been created. If it isn't, computer 70 returns to step 156 and the cycle depicted in FIG. 14 repeats itself until the entire night palette 178 has been created.
[00145] If it is determined at step 160 that the color defined by index value X is an object color in the vector graphics day palette 72, computer 70 passes to step 164. At step 164, computer 70 determines whether any of the pixels 116 in the set N have a color value that is listed in the vector graphics night palette 74. If it is determined at step 164 that none of the pixels in set N have a color value from this vector graphic night palette 74, then computer 70 proceeds to step 166 and follows the procedures of step 166, as has been described previously. If computer 70 determines at step 164 that at least one of the pixels 116 in set N has a color value listed in the vector graphics night palette 74, then computer 70 proceeds to step 174. At step 174, computer 70 sets the raster graphics night palette entry having the index value X equal to the value in the vector graphics night palette 74 that corresponds to the vector graphics day color with the same index value X. Thereafter, computer 70 proceeds to increment X at step 170 in a manner that has been described previously.
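A sketch of night palette creation process 154 is given below, assuming the reduced day palette, the indexed day image, the full-color night image, the object-color set, and day/night mapping 146 are all available as simple Python structures. The handling of an unused palette entry and the helper names are assumptions for illustration only.

```python
def _unpack(c):
    return (c >> 16) & 0xFF, (c >> 8) & 0xFF, c & 0xFF

def _pack(r, g, b):
    return (r << 16) | (g << 8) | b

def create_night_palette(day_palette, day_indices, night_pixels,
                         object_colors, vg_night_colors, day_to_night):
    """Build one 32-bit night color for every entry in the reduced day palette 144."""
    night_palette = []
    for x, day_color in enumerate(day_palette):                       # index value X (step 156)
        n = [night_pixels[i] for i, idx in enumerate(day_indices) if idx == x]   # set N (step 158)
        if day_color in object_colors and any(c in vg_night_colors for c in n):
            night_palette.append(day_to_night[day_color])             # step 174: use mapping 146
        elif n:
            r, g, b = (sum(ch) // len(n) for ch in zip(*map(_unpack, n)))
            night_palette.append(_pack(r, g, b))                      # steps 166/168: average set N
        else:
            night_palette.append(day_color)                           # unused entry (an assumption)
    return night_palette

# Toy example: one pixel, one palette entry that is an object color mapped to a night color.
palette = create_night_palette(
    day_palette=[0x8F6A2D], day_indices=[0], night_pixels=[0x686969],
    object_colors={0x8F6A2D}, vg_night_colors={0x686969},
    day_to_night={0x8F6A2D: 0x686969})
print([hex(c) for c in palette])    # ['0x686969']
```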
[00146] FIGS. 13a and 13b provide an illustrative example of the night palette creation process 154. In the example of FIGS. 13a-13b, the index value X has been set to 145. The selection of the value X equal to 145 has been made herein merely for purposes of illustration and does not connote any significance with respect to any of the other values to which X may be set. As noted, in step 156 of night palette creation process 154, computer 70 identifies a set of pixels D in the raster graphic day image 110 that have a color defined by a color index
value X. Using the example of FIG. 13b, it can be seen that an index value of X equal to 145 identifies an RGB color value of 0x8F6A2D. Further, the reduced day palette 144 indicates that there are 3 pixels having this color value in the raster graphic day image 110. FIG. 13c illustrates the physical location of these 3 pixels, which are labeled a1, a2, and a3. It should be noted that, although day image 110 in FIG. 13c is depicted as a blank image, the actual image 110 would be an image of a navigation chart 36. Image 110 of FIG. 13c has been left blank in order to more clearly explain the night palette creation process 154. In actual use, image 110 may consist of an image like that of the navigation chart 36 depicted in FIG. 3, or any other navigation chart.
[00147] At step 158, night palette creation process 154 identifies a set of pixels N in the night image 152 that have the same locations as the corresponding pixels in the day image 110. In the example of FIG. 13d, the pixels b1, b2, and b3 comprise the set N. As can be seen from the comparison of FIG. 13c and FIG. 13d, the pixels b1-b3 are located in the same location within night image 152 as the pixels a1-a3 are in the day image 110. While FIG. 13d illustrates raster graphic night image 152 as being physically larger than raster graphic day image 110, the actual sizes of the two images are the same. The disparity in sizes depicted in FIGS. 13c and 13d is merely intended to indicate that the pixels 116 in raster graphic night image 152 are defined by a larger set of data than they are in raster graphic day image 110. Specifically, in the examples of FIGS. 13c and 13d, each pixel 116 in the night image 152 is defined by 32 bits of data. In contrast, each pixel 116 in the raster graphic day image 110 is defined by an 8 bit data field. The difference in physical size between the images 110 and 152 depicted in FIGS. 13c and 13d is intended to convey this difference in data sizes, not a difference in the number of pixels. Thus, the coordinates of the pixel a1 in day image 110 (FIG. 13c) will be the same as the coordinates of pixel b1 in image 152 (FIG. 13d). Similarly, the coordinates of pixel a2 in image 110 would be the same as the coordinates for pixel b2 in image 152. The same is true for pixels a3 and b3.
[00148] At step 160, computer 70 determines whether the color value defined by index value
X (equal to 145 in this example) is an object color. If it is, computer 70 proceeds to step 164. If not, it proceeds to step 166. At step 164, computer 70 determines whether any of the pixels in the raster graphic night image 152 have a color value that is defined in vector graphics night palette 74. With respect to the example of FIG. 13d, computer 70 will determine at step 164 whether any of the pixels b1, b2, or b3 have a color value that is listed in vector graphics night palette 74. Depending upon the outcome of that determination, computer 70 will proceed to step 166 or step 174. In the example of FIG. 13d, if none of the pixels b1, b2, or
b3 have a color value that is defined in vector graphic night palette 74, then computer 70 proceeds to step 166, where it averages the colors of pixels b1, b2, and b3 together. In the example of FIGS. 13a-13d, this average color value will then be stored as the 145th entry in the raster graphic night color palette 178 (technically, the 146th entry since the palette begins at 0). Computer 70 would then increment X and determine the color value for the next entry in the raster graphic night color palette 178 (which would be index value 146).
[00149] If computer 70 determines at step 164 that at least one of the pixels b1, b2, and b3 (FIG. 13d) has a color value that is listed in vector graphics night palette 74, then computer 70 proceeds to step 174. At step 174, computer 70 sets the color value at the index value of 145 in the raster graphics night palette 178 equal to the hexadecimal value 0x686969. This hexadecimal value is determined from the day/night mapping 146. As can be seen therein, the night color 0x686969 corresponds to a day color of 0x8F6A2D. This day color is the day color defined for the index value of 145 in the reduced raster graphics day palette 144. After step 174 has been completed, computer 70 proceeds to increment X and continues to generate all of the entries in the raster graphics night palette 178 (FIG. 13e). The result of the raster graphic night palette creation process 154 is a raster graphics night palette 178 which will have the same number of index entries as the raster graphic day palette 144. This matching number of entries results because the day palette 144 is used to create raster graphic night palette 178 and one entry in the night palette 178 is created for each entry in the day palette 144.
[00150] It will be understood that the present invention can be implemented using different methods to create night palettes besides the night palette creation process 154 described herein. It will also be understood that the night palette creation process could be omitted from the present invention. For example, it would be possible to save both the day image 110 and the night image 152 in a memory, such as memory 24. However, saving both the day image 110 and the night image 152 consumes extra memory. In some applications, this extra consumption of memory may not be an issue, and the present invention can be practiced in such applications. In those situations where it is desirable to conserve memory space, the day image 110 is saved along with the day and night palettes 144 and 178, respectively, while the night image 152 is discarded.
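As a rough, purely illustrative calculation of the memory saving (the chart dimensions below are hypothetical, and the four-byte palette-entry size is an assumption), keeping the 8-bit indexed day image 110 plus the day and night palettes is far smaller than keeping both the 8-bit day image and the 32-bit night image 152:

```python
width, height = 1600, 2400           # hypothetical chart dimensions, in pixels
day_image   = width * height * 1     # 8-bit indexed day image 110, bytes
night_image = width * height * 4     # 32-bit night image 152, bytes
palettes    = 2 * 256 * 4            # day palette 144 + night palette 178, bytes

both_images         = day_image + night_image    # 19,200,000 bytes (~19.2 MB)
image_plus_palettes = day_image + palettes       #  3,842,048 bytes (~3.84 MB)
print(both_images, image_plus_palettes)          # 19200000 3842048
```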
[00151] After the night palette has been created at step 88 (FIG. 8), computer 70 will have created a day image 110, a reduced day palette 144, and a night palette 178. These three pieces of data may be combined with the metadata extracted at step 94. Alternatively, as
noted elsewhere, these three pieces of data can be stored as a raster graphics file 25 separate from the metadata extracted at step 94.
[00152] If the metadata extracted at step 94 and the data from night palette creation step 88 are to be combined together into a single raster graphics file 25, this is done at step 96. The manner in which step 96 combines this data into a single file can be accomplished in a variety of different ways. In one example, the metadata is inserted into an optional data block within a standard bitmap file. FIG. 15 depicts the five standard data blocks in a conventional bitmap file 180. Bitmap file 180 includes a file header block 182, a bitmap information header block 184, a color palette block 186, an optional data block 188, and an image data block 190. The file header block 182 includes five separate data fields that are identified as bfType, bfSize, bfReserved1, bfReserved2, and bfOffBits. The bitmap information header block 184 includes 11 different data fields that are identified as biSize, biWidth, biHeight, biPlanes, biBitCount, biCompression, biSizeImage, biXPelsPerMeter, biYPelsPerMeter, biClrUsed, and biClrImportant. The color palette block 186 defines a color palette for the raster graphic image that is stored in the image data block 190.
[00153] Data block 188 represents data that may optionally be stored in a bitmap file 180 in accordance with the defined format for bitmap files. The bitmap standard does not define the format of the data stored in block 188. Instead, data of any format can be stored in block 188. Thus, metadata 27 can easily be stored therein in any suitable format, such as, but not limited to, the formats of FIGS. 4 and 5. Further, in addition to metadata 27, the raster graphic night palette 178 can be stored in data block 188. Injection step 96 accomplishes storage of the metadata 27 and night palette 178 in optional data block 188.
[00154] Injection step 96 will also adjust the bfOffBits data field in the file header block 182 in accordance with the size of the metadata inserted (as well as the size of the raster graphics night palette 178). More specifically, the bfOffBits data field in block 182 defines the number of bits from itself to the beginning of the image data in block 190. Thus, the bfOffBits data field should be adjusted in accordance with the size of the metadata (and any other data) inserted into optional data block 188. In conventional bitmap files, the bfOffBits data field is an unsigned integer of 32 bits. The optional data block 188 can therefore take on a size of 2^32 bits minus the bits contained within bitmap information header block 184 and color palette block 186. The bitmap file standard therefore allows ample room within optional data block 188 for the storage of metadata 27 and night palette 178.
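The following minimal sketch, not taken from the patent, illustrates the bookkeeping performed by injection step 96 on a palettized bitmap: extra bytes (metadata 27 and night palette 178, assumed to be already serialized) are spliced in ahead of the pixel data and the offset field in the file header is patched. For accuracy, note that the on-disk BMP format records this offset (bfOffBits, at byte 10) and the total file size (bfSize, at byte 2) in bytes.

```python
import struct

def inject_optional_block(bmp_bytes: bytes, extra: bytes) -> bytes:
    """Sketch of injection step 96: insert metadata/night-palette bytes between
    the color palette and the pixel data of an 8-bit BMP, then patch the
    pixel-data offset in the file header."""
    # bfOffBits: unsigned 32-bit little-endian offset from the start of the file
    (off_bits,) = struct.unpack_from("<I", bmp_bytes, 10)

    header_and_palette = bmp_bytes[:off_bits]   # blocks 182, 184, 186 (and any 188)
    image_data = bmp_bytes[off_bits:]           # block 190

    out = bytearray(header_and_palette + extra + image_data)
    struct.pack_into("<I", out, 10, off_bits + len(extra))  # patch bfOffBits
    struct.pack_into("<I", out, 2, len(out))                # patch bfSize (total size)
    return bytes(out)
```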
[00155] If it is desirable to create a raster graphics file 25 of a navigation chart in a file format other than a bitmap format, such a conversion can be carried out in accordance with the
present invention. The desired format of the final raster graphic file 25 will be referred to herein as the "target format." In accordance with the present invention, the target format can take on any raster graphics file format that contains an image of a navigation chart. One example of such a type of format is illustrated in FIG. 16. FIG. 16 illustrates various data blocks for a target file format that has been labeled RAS. The term "RAS" is an arbitrary name used herein merely to illustrate one possible alternative file format to the bitmap file format. The RAS file 192 depicted in FIG. 16 includes six separate data blocks. The six data blocks are a raster file header block 194, a metadata information block 196, a bitmap info header block 198, a day palette block 200, a night palette block 202, and an encoded image data block 204. The information contained in each of the blocks is described in FIG. 16. Metadata block 196 stores the metadata 27 described previously. Day palette block 200 stores the raster graphics day palette 144. The night palette block 202 stores the raster graphics night palette 178. The encoded image data 204 stores the pixels that make up the image of the navigation chart.
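Because the description names the six blocks of the RAS example but not their byte-level layout, the grouping below is only a hypothetical illustration of how those blocks might be held together in code; the Python types are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RasFile:
    """Hypothetical in-memory grouping of the six RAS blocks of FIG. 16."""
    header: bytes                               # raster file header block 194
    metadata: dict                              # metadata information block 196 (metadata 27)
    bitmap_info: dict                           # bitmap info header block 198
    day_palette: List[Tuple[int, int, int]]     # day palette block 200 (palette 144)
    night_palette: List[Tuple[int, int, int]]   # night palette block 202 (palette 178)
    encoded_image: bytes                        # encoded image data block 204 (RLE pixel data)
```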
[00156] Regardless of the specific target file format, it may also be desirable in accordance with the present invention to compress the final raster graphics file that is created by the chart conversion method 66. Such compression may utilize any known compression technique, such as a run-length encoding (RLE) technique. Such compression may be carried out with respect to the image data. The entire raster file can also be compressed using the ZLIB compression technique to further compress the file.
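As a hedged illustration of this two-stage compression, the sketch below applies a simple count/value run-length encoder to the 8-bit image indices and then ZLIB compression over the assembled file. The patent does not specify which RLE variant is used, so this particular scheme is an assumption.

```python
import zlib

def rle_encode(data: bytes) -> bytes:
    """Simple count/value run-length encoder for 8-bit palette indices.
    (Illustrative only; not necessarily the RLE variant used in the patent.)"""
    out = bytearray()
    i = 0
    while i < len(data):
        run_value = data[i]
        run_len = 1
        while i + run_len < len(data) and run_len < 255 and data[i + run_len] == run_value:
            run_len += 1
        out += bytes((run_len, run_value))     # (count, value) pair
        i += run_len
    return bytes(out)

def compress_raster_file(image_indices: bytes, other_blocks: bytes) -> bytes:
    """Two-stage compression sketch: RLE on the image data, then ZLIB over
    the whole assembled file."""
    return zlib.compress(other_blocks + rle_encode(image_indices))
```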
[00157] After a raster graphics file has been created in the target format, it may be desirable to utilize a feedback method 100 (FIG. 17) in order to confirm that the raster graphics file 25 accurately reproduces the navigation chart 36. It should be noted that the phrase "raster graphics file" is an umbrella term that refers to any type of file containing an image defined in a raster format. Thus, "raster graphics file" 25 can refer to bitmap file 180, a RAS file 192, or any other raster graphics file, regardless of format. The steps of the feedback method 100 are depicted in more detail in FIGS. 17-19. Feedback method 100 receives bitmap file 180 after injection step 96 (FIG. 8) has been completed. Feedback method 100 utilizes two software processes known as BMP2Target 206 and Target2BMP 208. The BMP2Target process 206 converts the bitmap file 180 into a target file 210, which may be the RAS file 192 or some other type of raster graphics file. After the target file 210 has been created, the Target2BMP process 208 reconverts the target file 210 back into a bitmap file. The reconverted bitmap file is compared with the original bitmap file 180 at a comparison step 212. If there are no differences detected at comparison step 212, then the target file 210 is
deemed a verified re-creation of the navigation chart 36. If there are differences detected at step 212, appropriate corrective action is undertaken.
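A minimal sketch of the round-trip check performed by feedback method 100 is given below; bmp_to_target and target_to_bmp are hypothetical stand-ins for the BMP2Target 206 and Target2BMP 208 processes, whose actual interfaces are not given here.

```python
def verify_round_trip(original_bmp: bytes, bmp_to_target, target_to_bmp) -> bool:
    """Sketch of feedback method 100: convert to the target format, convert
    back to a bitmap, and compare against the original."""
    target_file = bmp_to_target(original_bmp)     # BMP2Target 206
    reconverted = target_to_bmp(target_file)      # Target2BMP 208
    return reconverted == original_bmp            # comparison step 212
```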
[00158] The detailed manner in which the BMP2Target process 206 operates is depicted in
FIG. 18. At step 214, a computer, which may be computer 70 or another computer, reads the bitmap file 180 and encodes the image data using the run length encoding (RLE) algorithm. A target buffer 215 is created at step 216 based on the size of the bitmap file 180. At step 220, the metadata 27 is copied into the target buffer 215. At step 222, the bitmap information 184 is copied into target buffer 215. At step 224, the raster graphics day palette 144 is written into the target buffer 215. At step 226, the raster graphics night palette 178 is written into the target buffer 215. The encoded image data from step 214 is copied into the target buffer 215 at step 228. Thereafter, all of the data in the target buffer 215 is compressed at step 230 before being output as the raster graphics file in the target format.
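The buffer-assembly order of steps 216-230 can be sketched as follows, assuming each block has already been serialized to bytes. The function and parameter names are hypothetical, and ZLIB is used here for the whole-buffer compression of step 230 only because paragraph [00156] mentions it as one option.

```python
import zlib

def bmp_to_target(metadata: bytes, bitmap_info: bytes,
                  day_palette: bytes, night_palette: bytes,
                  encoded_image: bytes) -> bytes:
    """Sketch of BMP2Target 206 (steps 214-230): assemble the target buffer
    in block order and compress the whole buffer."""
    target_buffer = bytearray()          # step 216: create the target buffer
    target_buffer += metadata            # step 220: metadata 27
    target_buffer += bitmap_info         # step 222: bitmap information 184
    target_buffer += day_palette         # step 224: day palette 144
    target_buffer += night_palette       # step 226: night palette 178
    target_buffer += encoded_image       # step 228: RLE-encoded image data (step 214)
    return zlib.compress(bytes(target_buffer))   # step 230: compress and output
```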
[00159] FIG. 19 depicts a more detailed overview of the Target2BMP process 208, which is generally the reverse of the BMP2Target process 206. At step 232, the raster graphics file 192 stored in a target format is decompressed. The output of the decompression step 232 is written into a target buffer 234. From target buffer 234, the encoded image data is decoded at step 236. At step 238, a bitmap buffer 240 is created based on the size of the target buffer 234 and the image data that was decoded at step 236. At step 242, a bitmap file is written into the bitmap buffer 240 based on information extracted from the target buffer 234. At step 244, the target buffer data is read and the metadata is copied into the bitmap buffer 240. At step 246, the bitmap information header block 198 is read from the target buffer 234 and copied into bitmap buffer 240. The day palette is copied at step 248 into the bitmap buffer 240 and the night palette is copied at step 250 to the bitmap buffer 240. The resulting bitmap buffer 240 is then compared at comparison step 212 (FIG. 17) with the original bitmap file 180 to determine if there is a match. If there is, the resulting file is a confirmed raster graphics file. If not, appropriate corrective action may be taken. The feedback method 100 helps to ensure that the raster graphics file 25 that is created by chart conversion method 66 is an accurate reproduction of the original navigation chart. This reassurance offered by the feedback method 100 assists in obtaining higher safety ratings for the raster graphics file 25 (and metadata 27, if separate), as well as the software used to render it.
[00160] While the foregoing description of the chart conversion method 66 has taken place with reference to pixels that define an RGB value, it will be understood that the present invention can be adapted to pixels that define color values using other methods, such as Hue,
Saturation, and Lightness (HSL) values. Still other types of pixel data can be used in accordance with the present invention. While the present invention has been described herein in terms of the embodiments discussed in the above specification, it will be understood by those skilled in the art that the present invention is not limited to these particular embodiments, but includes any and all modifications that are within the spirit and scope of the present invention as defined in the appended claims.
Claims
1. An electronic navigational display system comprising: a display adapted to display information to a viewer of the display; a memory for storing an electronic image of a navigation chart that includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section, said electronic image of both said first and second sections of said navigation chart being stored in a raster graphics format within said memory; data stored in said memory corresponding to said electronic image, said data specifying a location of said first section of said navigation chart within said image, and said data specifying a location of said second section of said navigation chart within said image; a user interface adapted to allow a user to select a display option in which said display displays said first section of said navigation chart on said display without said second section of said navigation chart; and a controller in communication with said user interface and adapted to read said data and said electronic image from said memory and use said data to display said navigation chart according to the selected display option.
2. The system of claim 1 wherein said display system is incorporated into an aircraft flight control system.
3. The system of claim 2 wherein said navigation chart further includes a third section having a profile view of a desired flight path for the aircraft, said electronic image of said third section of said navigation chart is stored in a raster graphics format within said memory, said data specifies a location of said third section of said navigation chart within said image, and said user interface is adapted to give a user a second display option in which said controller uses said data to display only said third section of said navigation chart.
4. The system of claim 3 wherein said navigation chart further includes a fourth section specifying aircraft minimums for landing the aircraft, said electronic image of said fourth section of said navigation chart is stored in a raster graphics format within said memory, said data specifies a location of said fourth section of said navigation chart within said image, and said user interface is adapted to give a user a third display option in which said controller uses said data to display only said fourth section of said navigation chart.
5. The system of claim 4 further including a navigation system adapted to determine a current position of the aircraft wherein: said data further specifies a scale and a latitudinal and longitudinal reference for at least said first section of said electronic image of said chart; and said controller uses said navigation system, said scale, and said latitudinal and longitudinal reference to display the current position of the aircraft on said display in a manner in which the current position of the aircraft is indicated on top of said first section of said electronic image of said chart at a location that matches the aircraft's current position with respect to the first section of the electronic image of the navigation chart.
6. The system of claim 1 wherein said memory further includes: electronic images of a plurality of navigation charts wherein each said image is stored in a raster graphics format within said memory and includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section; and data stored in said memory corresponding to each of said electronic images of said plurality of navigation charts, said data specifying a location of each of said first sections of said plurality of navigation charts within the navigation chart's corresponding image, and said data specifying a location of each of said second sections of said plurality of navigation charts within the navigation chart's corresponding image.
7. The system of claim 1 wherein said electronic image of said navigation chart and said data are stored together in a common electronic file.
8. An electronic navigational display system for a mobile vehicle comprising: a display adapted to display information to a user of the mobile vehicle while the user is inside the mobile vehicle; a memory for storing an electronic image of a navigation chart, said electronic image being stored in a raster graphics format within said memory; data stored in said memory corresponding to said electronic image, said data specifying a scale for said electronic image of said navigation chart, and said data specifying a latitudinal and longitudinal reference for said electronic image of said navigation chart; a navigation system adapted to determine a current position of the mobile vehicle; and a controller adapted to read said electronic image from said memory and display said electronic image of said navigation chart on said display, said controller further adapted to display on said display the current position of the mobile vehicle as determined by said navigation system in a manner in which the current position of the mobile vehicle is indicated on top of said electronic image of said navigation chart at a location that matches the vehicle's current position with respect to the electronic image of the navigation chart.
9. The system of claim 8 wherein said data is stored as part of an electronic file that also contains said image.
10. The system of claim 8 wherein said data is stored separately from an electronic file that contains said image.
11. The system of claim 8 wherein the mobile vehicle is an aircraft.
12. The system of claim 11 wherein said navigation chart is an approach chart that includes a plan view section and a profile view section, and said data includes a first data field specifying a location of said plan view section within said electronic image of said navigation chart, and a second data field specifying a location of said profile view section within said electronic image of said navigation chart.
13. The system of claim 12 wherein said controller is further adapted to read said first and second data fields and display either said plan view section without said profile view section, or said profile view section without said plan view section.
14. The system of claim 11 wherein said navigation chart includes a top and said data includes a data field specifying how much said electronic image of said navigation chart must be rotated in order to display said image on said display with the top of said chart oriented toward a top of said display.
15. The system of claim 11 wherein said memory further includes a plurality of electronic images of navigation charts and each of said images is stored in a raster graphics format.
16. The system of claim 15 wherein said plurality of images of navigation charts include images of landing charts, approach charts, and departure charts.
17. The system of claim 8 wherein said electronic image of said navigation chart includes a section containing a plan view of a map and said section also includes at least one inset, said data further specifying a location of said at least one inset on said navigation chart.
18. The system of claim 17 wherein said controller stops displaying the current position of said mobile vehicle when the current position of said mobile vehicle coincides with said at least one inset.
19. The system of claim 8 wherein said controller does not use a Microsoft Windows® operating system.
20. The system of claim 8 wherein said memory stores a first color palette for the navigation chart and a second color palette for the navigation chart, said first color palette including colors intended to be displayed on said display during high ambient light conditions, and said second color palette including colors intended to be displayed on said display during low ambient light conditions.
21. An electronic repository of at least one navigation chart that includes a map section having a plan view of a map, said electronic repository comprising: a memory; image data of the at least one navigation chart, said image data being saved in said memory as a plurality of pixels in a raster graphics format; a first data field within said memory and separate from said image data, said first data field specifying a scale for the map section of the image data, said scale adapted to allow a physical distance to be computed between a pair of pixels within the map section of the image data wherein the physical distance computed between the pair of pixels can be converted to an actual distance between a pair of locations on the map corresponding to the pair of pixels; and a second data field within said memory and separate from said image data, said second data field specifying a geographical reference for said map section of said image data such that a set of geographical coordinates can be determined from said geographical reference for any pixel within said map section of said image data.
22. The electronic repository of claim 21 wherein said geographical reference for said map is a latitudinal and longitudinal reference allowing a latitude and a longitude to be determined from any pixel within said map section of said image data.
23. The electronic repository of claim 22 wherein said navigation chart is a navigation chart for an aircraft.
24. The electronic repository of claim 22 wherein said image data, said first data field, and said second data field are all stored in said memory as a single electronic file.
25. The electronic repository of claim 24 wherein said single electronic file is a bitmap file containing a plurality of blocks of data, and said first and second data fields are stored in a block of data between a color palette block of data and a block of data containing said image data.
26. The electronic repository of claim 23 wherein said navigation chart includes at least first and second sections and said memory further includes: a third data field separate from said image data, said third data field identifying a location of said first section of said navigation chart within said image data; and a fourth data field separate from said image data, said fourth data field identifying a location of said second section of said navigation chart within said image data.
27. The electronic repository of claim 26 wherein said memory is contained within one of a computer, a computer disk, a portable flash memory storage device, a CD, a DVD, a cell phone, a personal digital assistant, and a portable media player.
28. An electronic repository of at least one navigation chart that includes at least first and second sections wherein said first section includes a plan view of a map and said second section includes text containing navigation information relating to said first section, said electronic repository comprising: a memory; image data of the at least one navigation chart, said image data being saved in said memory as a plurality of pixels in a raster graphics format; a first data field within said memory and separate from said image data, said first data field identifying a location of said first section of said navigation chart within said image data; and a second data field within said memory and separate from said image data, said second data field identifying a location of said second section of said navigation chart within said image data.
29. The electronic repository of claim 28 wherein said navigation chart is a navigation chart for an aircraft.
30. The electronic repository of claim 29 wherein said navigation chart includes a third section having a profile view of a desired flight path for the aircraft, and said memory includes a third data field identifying a location of said third section of said navigation chart within said image data.
31. The electronic repository of claim 30 wherein said navigation chart includes a fourth section specifying aircraft minimums for landing the aircraft, and said memory includes a fourth data field identifying a location of said fourth section of said navigation chart within said image data.
32. The electronic repository of claim 31 wherein said navigation chart includes a top and said memory includes a fifth data field identifying an amount of rotation, if any, of said image data necessary to cause said image data to be displayed with the top of said navigation chart facing upward on a display.
33. The electronic repository of claim 28 wherein said memory is contained within one of a computer, a computer disk, a portable flash memory storage device, a CD, a DVD, a cell phone, a personal digital assistant, and a portable media player.
34. The electronic repository of claim 32 further including a sixth data field within said memory and separate from said image data, said sixth data field specifying a scale for the map section of the image data, said scale adapted to allow a physical distance to be computed between a pair of pixels within the map section of the image data wherein the physical distance computed between the pair of pixels can be converted to an actual distance between a pair of locations on the map corresponding to the pair of pixels.
35. The electronic repository of claim 34 further including a seventh data field within said memory and separate from said image data, said seventh data field specifying a geographical coordinate reference of said image data of said first section of said navigation chart such that a set of geographic coordinates can be determined from said geographical coordinate reference for any pixel of said image data within said first section of said navigation chart.
36. The electronic repository of claim 35 wherein said geographical coordinate reference is a latitudinal and longitudinal reference.
37. The electronic repository of claim 35 wherein said geographical coordinate reference is a global positioning system reference.
38. The electronic repository of claim 30 wherein said memory includes image data for a plurality of navigation charts and separate first, second, and third data fields for each of the charts in said plurality of navigation charts.
39. The electronic repository of claim 38 wherein said image data and said first, second, and third data fields of each navigation chart are stored together in a single electronic file.
40. The electronic repository of claim 39 wherein said single electronic file is a bitmap file containing a plurality of blocks of data, and said first, second, and third data fields are stored in a block of data located between a color palette block of data and a block of data containing said image data.
41. The electronic repository of claim 29 wherein said image data includes an eight bit index number for each of said pixels, and said index number identifies an entry in a color palette.
42. The electronic repository of claim 41 wherein said memory further stores a day color palette and a night color palette and said index number can be matched to entries in either said day color palette or said night color palette.
43. A method of converting a vector graphics file of a navigation chart into a raster graphics file wherein the navigation chart includes a first section having a plan view of a map and a second section having text containing navigation information relating to said first section, said method comprising: loading said vector graphics file into a computer; converting an image of the navigation chart defined by said vector graphics file into a plurality of pixels; determining a color value for each of said plurality of pixels; determining from said vector graphics file a first set of said pixels corresponding to said first section of said navigation chart; determining from said vector graphics file a second set of said pixels corresponding to said second section of said chart; storing a raster graphics file in an electronic target location, said raster graphics file including data relating to said color value of each of said plurality of pixels; and storing data identifying said first and second sets of pixels in said electronic target location.
44. The method of claim 43 wherein said navigation chart is a navigation chart for an aircraft.
45. The method of claim 44 wherein the converting of the image of the navigation chart into a plurality of pixels includes rendering the image into a memory inside of the computer.
46. The method of claim 43 further including: determining from the vector graphics file a scale for the plurality of pixels; saving said scale in the raster graphics file; determining from the vector graphics file a latitudinal and longitudinal reference point for said plurality of pixels; and saving said latitudinal and longitudinal reference point in said electronic target location.
47. The method of claim 44 further including determining from the vector graphics file a preferred orientation of said plurality of pixels and saving said preferred orientation of said plurality of pixels in said electronic target location.
48. The method of claim 44 wherein said navigation chart further includes a third section having a profile view of a desired flight path for the aircraft, and said method further includes determining from the vector graphics file a third set of pixels corresponding to said third section of said chart, and storing data in said electronic target location identifying said third set of pixels.
49. The method of claim 48 wherein said navigation chart further includes a fourth section specifying aircraft minimums for landing the aircraft, and said method further includes determining from the vector graphics file a fourth set of pixels corresponding to said fourth section of said chart, and storing data in said electronic target location identifying said fourth set of pixels.
50. The method of claim 43 further including: counting a total number of said color values; comparing the total number of said color values to a predetermined threshold; and if said total number of color values exceeds said predetermined threshold, changing a sufficient number of said color values such that a reduced total of color values are generated wherein said reduced total of color values does not exceed said predetermined threshold, and storing in said raster graphics file data related to all the color values in said reduced total.
51. The method of claim 50 wherein the rendering of the chart includes the generation of a plurality of anti-aliasing colors in said rendered image, and said changing of a sufficient number of said color values includes changing only said anti-aliasing colors.
52. The method of claim 51 wherein said changing of a sufficient number of said color values includes: computing a color distance between a pair of said color values wherein said pair of color values includes a first color value and a second color value; and changing said first color value to said second color value based at least partially on said color distance.
53. The method of claim 52 wherein said changing of said first color value to said second color value is also based at least partially on a frequency of said first color value in the rendered image of said navigation chart.
54. The method of claim 44 wherein said storing of data identifying said first and second sets of pixels includes storing the data identifying said first and second data sets of pixels in said raster graphics file within said electronic target location.
55. The method of claim 44 further including: generating a color index containing an entry for every different one of said color values used in said plurality of pixels; correlating each one of said plurality of pixels to one of the entries in said color index; and storing in said raster graphics file one of said entries for each of said plurality of pixels.
56. The method of claim 55 further including: generating a night color index different from said color index, said night color index containing a night entry for every different one of said entries in said color index; correlating each one of said night entries to one of said entries in said color index; and storing in said raster graphics file said night color index.
57. A method of converting a vector graphics file of a navigation chart into a raster graphics file comprising: loading said vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file, said rendered image defining a plurality of object colors; converting the rendered image of the navigation chart into a plurality of pixels wherein at least one of the plurality of pixels has a non-object color different from said plurality of object colors; determining a color value for each of said plurality of pixels; counting a total number of said color values; comparing the total number of said color values to a predetermined threshold, and, if said total number of color values exceeds said predetermined threshold, reducing said total number of color values by performing the following:
(a) calculating a color distance between all of said color values;
(b) determining a frequency of a first color value and whether said first color value corresponds to an object color or a non-object color;
(c) determining a frequency of a second color value and whether said second color value corresponds to an object color or a non-object color; and
(i) replacing said first color value with said second color value if said first color value corresponds to a non-object color and said second color value corresponds to an object color; or
(ii) replacing said first color value with said second color value if said first color value corresponds to a non-object color and said second color value corresponds to a non-object color having a greater frequency than said first color value; or
(iii) replacing said second color value with said first color value if said second color value corresponds to a non-object color and said first color value corresponds to a non-object color having a greater frequency than said second color value; or
(iv) leaving said first and second color values unchanged if both said first and second color values correspond to object colors; and
(d) storing in a raster graphics file whatever color values remain after any of substeps (i)-(iv) have been performed.
58. The method of claim 57 further including repeating steps (b) through (c) using different first and second color values until the total number of said color values is equal to or less than said predetermined threshold.
59. The method of claim 58 wherein said predetermined threshold is 256 or less.
60. The method of claim 58 wherein said color values each include first, second, and third components, said first component identifying a degree of red, said second component identifying a degree of green, and said third component identifying a degree of blue.
61. The method of claim 60 wherein the navigation chart includes a first section having a plan view of a map and a second section having text containing navigation information relating to said first section, said method further including: determining from said vector graphics file a first set of said pixels corresponding to said first section of said navigation chart; determining from said vector graphics file a second set of said pixels corresponding to said second section of said chart; and storing data identifying said first and second sets of pixels in said raster graphics file.
62. The method of claim 60 further including: generating a color index containing an entry for every different one of said color values used in said plurality of pixels; correlating each one of said plurality of pixels to one of the entries in said color index; and storing in said raster graphics file one of said entries for each of said plurality of pixels.
63. The method of claim 62 further including: generating a night color index different from said color index, said night color index containing a night entry for every different one of said entries in said color index; correlating each one of said night entries to one of said entries in said color index; and storing in said raster graphics file said night color index.
64. A method of converting a vector graphics file of an aircraft navigation chart into a raster graphics file using a computer running on a Windows operating system, said method comprising: loading the vector graphics file into the computer; using a GetDIBits function of the Windows operating system to determine a first set of pixels corresponding to an entire image of said aircraft navigation chart; using the GetDIBits function of the Windows operating system to determine a second set of pixels corresponding to a first portion of said aircraft navigation chart; using the GetDIBits function of the Windows operating system to determine a third set of pixels corresponding to a second portion of said aircraft navigation chart, said second set of pixels including a plurality of pixels not contained within said third set of pixels; comparing the second and third sets of pixels against the first set of pixels to determine if the pixels in the second and third sets are the same as corresponding pixels in said first set; flagging any discrepancy between the pixels of the second and third sets and the pixels of the first set; and saving a sufficient number of said pixels in a raster graphics file to define an entire image of said aircraft navigation chart.
65. The method of claim 64 further including reading from the vector graphics file a scale for the aircraft navigation chart and storing scale data in a memory common with said raster graphics file, said scale data identifying a scale for the raster graphics file of the navigation chart.
66. The method of claim 65 further including reading from the vector graphics file a location of a plan view map section of the navigation chart and storing plan view location data in the memory common with said raster graphics file, said plan view location data identifying the pixels of said raster graphics file that correspond to said plan view map section of the navigation chart.
67. The method of claim 66 further including reading from the vector graphics file a geographic reference for the navigation chart and storing reference data in the memory common with said raster graphics file, said reference data identifying a correlation between a geographic reference and the pixels of said raster graphics file.
68. The method of claim 67 wherein said scale data, said plan view location data, and said reference data are all stored within the raster graphics file.
69. The method of claim 64 further including reading from said vector graphics file a day palette having a first number of colors, generating a new day palette having a second number of colors greater than said first number of colors, and storing said new day palette in a memory common with said raster graphics file.
70. The method of claim 69 further including reading from said vector graphics file a night palette having a third number of colors, generating a new night palette having a fourth number of colors greater than said third number of colors, and storing said new night palette in the memory in common with the raster graphics file.
71. The method of claim 70 wherein the generating of said new night palette includes: using a GetDIBits function of the Windows operating system to determine a set of night pixels corresponding to an entire night image of said aircraft navigation chart; comparing a plurality of pixels in said first set to a plurality of pixels in said set of night pixels; and changing a color of at least one pixel in said set of night pixels.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US94595707P | 2007-06-25 | 2007-06-25 | |
| US60/945,957 | 2007-06-25 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009002603A1 (en) | 2008-12-31 |
Family
ID=40185978
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2008/061386 Ceased WO2009002603A1 (en) | 2007-06-25 | 2008-04-24 | Systems and methods for generating, storing and using electronic navigation charts |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2009002603A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6199015B1 (en) * | 1996-10-10 | 2001-03-06 | Ames Maps, L.L.C. | Map-based navigation system with overlays |
| US20070053513A1 (en) * | 1999-10-05 | 2007-03-08 | Hoffberg Steven M | Intelligent electronic appliance system and method |
| US20070139411A1 (en) * | 2002-03-15 | 2007-06-21 | Bjorn Jawerth | Methods and systems for downloading and viewing maps |
Cited By (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8809600B2 (en) | 2009-01-15 | 2014-08-19 | Harald Kohnz | Process for the production of lower alcohols by olefin hydration |
| FR3007545A1 (en) * | 2013-06-21 | 2014-12-26 | Thales Sa | SYSTEM METHOD AND COMPUTER PROGRAM FOR PROVIDING A MACHINE MAN INTERFACE DATA RELATING TO AN ASPECT OF THE OPERATION OF AN AIRCRAFT |
| US9626873B2 (en) | 2013-06-21 | 2017-04-18 | Thales | Method, system and computer program for providing, on a human-machine interface, data relating to an aspect of the operation of an aircraft |
| RU2646355C2 (en) * | 2013-11-22 | 2018-03-02 | Хуавэй Текнолоджиз Ко., Лтд. | Solution for improved coding of screen content |
| US10291827B2 (en) | 2013-11-22 | 2019-05-14 | Futurewei Technologies, Inc. | Advanced screen content coding solution |
| US10638143B2 (en) | 2014-03-21 | 2020-04-28 | Futurewei Technologies, Inc. | Advanced screen content coding with improved color table and index map coding methods |
| US10091512B2 (en) | 2014-05-23 | 2018-10-02 | Futurewei Technologies, Inc. | Advanced screen content coding with improved palette table and index map coding methods |
| EP3219371A1 (en) | 2016-03-14 | 2017-09-20 | Ineos Solvents Germany GmbH | Column tray for the reactive distillation of heteroazeotropes and process using the tray |
| WO2018222416A1 (en) * | 2017-06-01 | 2018-12-06 | Qualcomm Incorporated | Adjusting color palettes used for displaying images on a display device based on ambient light levels |
| US10446114B2 (en) | 2017-06-01 | 2019-10-15 | Qualcomm Incorporated | Adjusting color palettes used for displaying images on a display device based on ambient light levels |
| WO2019133526A1 (en) * | 2017-12-29 | 2019-07-04 | Lyft, Inc. | Optimizing transportation networks through dynamic user interfaces |
| US10852903B2 (en) | 2017-12-29 | 2020-12-01 | Lyft, Inc. | Optimizing transportation networks through dynamic user interfaces |
| US11422667B2 (en) | 2017-12-29 | 2022-08-23 | Lyft, Inc. | Optimizing transportation networks through dynamic user interfaces |
| US11709575B2 (en) | 2017-12-29 | 2023-07-25 | Lyft, Inc. | Optimizing transportation networks through dynamic user interfaces |
| US12321561B2 (en) | 2017-12-29 | 2025-06-03 | Lyft, Inc. | Optimizing transportation networks through dynamic user interfaces |
| CN111861856A (en) * | 2019-04-30 | 2020-10-30 | 霍尼韦尔国际公司 | System and method for rendering dynamic data on a cockpit display |
| EP4002326A1 (en) * | 2020-11-11 | 2022-05-25 | Honeywell International Inc. | System and method for dynamically augmenting raster charts displayed on a cockpit display |
| US12148314B2 (en) | 2021-01-07 | 2024-11-19 | Honeywell International Inc. | System and method for dynamically augmenting raster charts displayed on a cockpit display |
| KR102562491B1 (en) * | 2022-09-13 | 2023-08-02 | 한국해양과학기술원 | Method system for designing electronic navigation chart of raster type and methodfor utilizing thereof |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2009002603A1 (en) | Systems and methods for generating, storing and using electronic navigation charts | |
| US6892118B1 (en) | Pictographic mode awareness display for aircraft | |
| US7495601B2 (en) | Declutter of graphical TCAS targets to improve situational awareness | |
| US8310378B2 (en) | Method and apparatus for displaying prioritized photo realistic features on a synthetic vision system | |
| US9020681B2 (en) | Display of navigation limits on an onboard display element of a vehicle | |
| US8094188B1 (en) | System, apparatus, and method for enhancing the image presented on an aircraft display unit through location highlighters | |
| US20120123680A1 (en) | System and method for electronic moving map and aeronautical context display | |
| US20170030735A1 (en) | Onboard aircraft systems and methods to identify moving landing platforms | |
| US20100082184A1 (en) | Displaying air traffic symbology based on relative importance | |
| US8976042B1 (en) | Image combining system, device, and method of multiple vision sources | |
| US10019835B1 (en) | Digital map rendering method | |
| EP3444570A1 (en) | Aircraft systems and methods for unusual attitude recovery | |
| US7418318B2 (en) | Method and HUD system for displaying unusual attitude | |
| US20040122589A1 (en) | Electronic map display declutter | |
| US8344911B1 (en) | System, module, and method for generating non-linearly spaced graduations for a symbolic linear scale | |
| US10163185B1 (en) | Systems and methods for user driven avionics graphics | |
| US7908045B1 (en) | System and method for presenting an image of terrain on an aircraft display unit | |
| US9540116B1 (en) | Attitude indicator generating and presenting system, device, and method | |
| US8723696B1 (en) | Location information generation system, device, and method | |
| US20160340054A1 (en) | Method for displaying an image of a scene outside of an aircraft in an augmented reality context | |
| US11047707B2 (en) | Visualization method of the attitude of an aircraft, associated computer program product and visualization system | |
| JP5709355B2 (en) | Method and system for generating an unroute visible terrain display | |
| US11748923B2 (en) | System and method for providing more readable font characters in size adjusting avionics charts | |
| US8571728B1 (en) | Systems and methods for embedding aircraft attitude data and detecting inconsistent aircraft attitude information | |
| US20120245759A1 (en) | Device for displaying critical and non-critical information, and aircraft including such a device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08746751; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 08746751; Country of ref document: EP; Kind code of ref document: A1 |