
WO2009002603A1 - Systems and methods for generating, storing and using electronic navigation charts - Google Patents


Info

Publication number
WO2009002603A1
WO2009002603A1 (PCT/US2008/061386)
Authority
WO
WIPO (PCT)
Prior art keywords
data
color
image
navigation
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2008/061386
Other languages
English (en)
Inventor
Brian S. Zingg
Zhihong Zhou
Dereck B. Clark
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
L3 Aviation Products Inc
Original Assignee
L3 Communications Avionics Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by L3 Communications Avionics Systems Inc filed Critical L3 Communications Avionics Systems Inc
Publication of WO2009002603A1
Current legal status: Ceased


Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G5/00 - Traffic control systems for aircraft
    • G08G5/20 - Arrangements for acquiring, generating, sharing or displaying traffic information
    • G08G5/21 - Arrangements for acquiring, generating, sharing or displaying traffic information located onboard the aircraft
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagrams
    • G09B29/10 - Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 - Map spot or coordinate position indicators; Map reading aids using electronic means

Definitions

  • the present invention relates generally to navigation charts, and more specifically to methods of generating, using, and storing navigation charts in an electronic form that is advantageous for use on multiple different platforms and in multiple different environments.
  • Navigation charts are commonly used in aviation, marine, and land-based environments. Before the advent of the computer and electronic display systems, such navigation charts were exclusively produced in a paper form. After the development and widespread adoption of computers and electronic displays, it became common to publish the navigation charts in electronic formats. Such electronic formats allowed the navigation charts to be displayed on electronic displays, which reduced the need for bulky compilations of paper navigation charts, allowed virtually instantaneous access to any chart in a particular database of charts, and facilitated the updating of the charts.
  • the present invention provides a method and system for rendering computerized navigation charts that overcomes the aforementioned problems with the prior art.
  • the present invention is applicable to navigation charts for marine, terrestrial, and avionics environments. In all these fields, the present invention provides the capability of easily rendering navigation charts across a wide variety of different computer platforms with a reduced amount of computational power.
  • the present invention allows for the display of aircraft navigation charts utilizing software classified as Level B under the DO-178B standards of RTCA.
  • an electronic navigational display system includes a display, memory, data, a user interface, and a controller.
  • the display is adapted to display information to a viewer.
  • the memory stores an electronic image of a navigation chart that includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section.
  • the electronic images of both sections are stored in a raster graphics format within the memory.
  • Data is also included within the memory that specifies the locations of the first and second sections of the navigation chart within the image.
  • the user interface is adapted to allow a user to select a display option in which only the first section of the navigation chart is displayed on the display.
  • the controller is in communication with the user interface and is adapted to read the data and the electronic image from the memory and use the data to display the navigation chart according to the selected display option.
  • an electronic navigational display system for a mobile vehicle.
  • the navigational display system includes a display, a memory, a navigation system, a controller, and data stored in the memory.
  • the display is adapted to display information to a user of the mobile vehicle while the user is inside the mobile vehicle.
  • the memory stores an electronic image of a navigation chart in a raster graphics format, along with data corresponding to the electronic image.
  • the data specifies a scale and a latitudinal and longitudinal reference for the electronic image of the navigation chart.
  • the navigation system is adapted to determine a current position of the mobile vehicle, and the controller is adapted to read the electronic image from the memory and display the navigation chart on the display.
  • the controller is further adapted to display the current position of the mobile vehicle as determined by the navigation system on the display in a manner in which the current position of the mobile vehicle is indicated on top of the electronic image of the navigation chart at a location that matches the vehicle's current position with respect to the electronic image of the navigation chart.
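Placing the own-ship symbol described above amounts to inverting the chart's geographical reference: the vehicle's latitude/longitude reported by the navigation system is mapped to a pixel position on the raster image. A minimal sketch, assuming a linear mapping defined by a reference pixel and per-pixel degree scales (the parameter names and the linearity assumption are illustrative, not taken from the patent):

```python
# Sketch: placing an own-ship symbol over a geo-referenced raster chart.
# The reference-pixel / per-pixel-scale parameterization is an assumption
# for illustration; an actual chart may use a different projection.

def lat_lon_to_pixel(lat, lon, ref_px, ref_lat_lon,
                     deg_per_px_lat, deg_per_px_lon):
    rx, ry = ref_px
    lat0, lon0 = ref_lat_lon
    x = rx + (lon - lon0) / deg_per_px_lon
    y = ry - (lat - lat0) / deg_per_px_lat  # screen y grows downward
    return round(x), round(y)
```

The controller would recompute (x, y) and redraw the symbol each time the navigation system reports a new position fix.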
  • an electronic repository of at least one navigation chart that includes a map section having a plan view of a map.
  • the electronic repository includes a memory, image data, and first and second data fields.
  • the image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format.
  • the first data field is contained within the memory and is separate from the image data.
  • the first data field specifies a scale for the map section of the image data wherein the scale allows a physical distance to be computed between a pair of pixels within the map section of the image data such that the physical distance computed between the pair of pixels can be converted to an actual distance between a pair of locations on the map corresponding to the pair of pixels.
  • the second data field is contained within the memory and is separate from the image data.
  • the second data field specifies a geographical reference for the map section of the image data such that a set of geographical coordinates can be determined from the geographical reference for any pixel within the map section of the image data.
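Taken together, the two data fields support computations like the following sketch: converting a map-section pixel to geographical coordinates, and deriving a physical distance between two pixels. The class layout, field names, and flat-earth distance approximation are illustrative assumptions, not details from the patent:

```python
import math
from dataclasses import dataclass

# Sketch: the "first data field" (scale) and "second data field"
# (geographical reference) applied to map-section pixels. All names
# and the linear-projection assumption are illustrative.

@dataclass
class MapGeoReference:
    ref_px: tuple          # a known pixel (x, y) in the map section
    ref_lat_lon: tuple     # its (latitude, longitude) in degrees
    deg_per_px_lat: float  # degrees of latitude per vertical pixel
    deg_per_px_lon: float  # degrees of longitude per horizontal pixel

    def pixel_to_lat_lon(self, x, y):
        rx, ry = self.ref_px
        lat0, lon0 = self.ref_lat_lon
        lat = lat0 - (y - ry) * self.deg_per_px_lat  # screen y grows downward
        lon = lon0 + (x - rx) * self.deg_per_px_lon
        return lat, lon

    def distance_nm(self, p1, p2):
        # Physical distance between two pixels via the chart scale, using a
        # flat-earth approximation adequate at approach-chart scales.
        lat1, lon1 = self.pixel_to_lat_lon(*p1)
        lat2, lon2 = self.pixel_to_lat_lon(*p2)
        dlat_nm = (lat2 - lat1) * 60.0  # 1 degree of latitude is about 60 NM
        dlon_nm = ((lon2 - lon1) * 60.0
                   * math.cos(math.radians((lat1 + lat2) / 2)))
        return math.hypot(dlat_nm, dlon_nm)
```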
  • an electronic repository of at least one navigation chart that includes a first and a second section.
  • the first section includes a plan view of a map.
  • the second section includes text containing navigation information relating to the first section.
  • the electronic repository includes a memory, image data, and first and second data fields within the memory. The first and second data fields are both separate from the image data.
  • the image data contains an image of the navigation chart that is stored in the memory as a plurality of pixels in a raster graphics format.
  • the first data field identifies a location of the first section of the navigation chart within the image data.
  • the second data field identifies a location of the second section of the navigation chart within the image data.
  • a method for converting a vector graphics file of a navigation chart into a raster graphics file wherein the navigation chart includes a first section having a plan view of a map and a second section having text containing navigation information relating to the first section.
  • the method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file. Thereafter, the rendered image is converted into a plurality of pixels that each have a color value associated with them. A first set of pixels corresponding to the first section of the navigation chart and a second set of pixels corresponding to the second section of the navigation chart are both determined by using the vector graphics file of the navigation chart.
  • a raster graphics file is stored in an electronic target location along with data relating to the color value of each of the plurality of pixels and data identifying the first and second sets of pixels.
  • a method for converting a vector graphics file of a navigation chart into a raster graphics file. The method includes loading the vector graphics file into a computer and rendering an image of the navigation chart from the vector graphics file wherein the rendered image defines a plurality of object colors. The rendered image is converted into a plurality of pixels wherein at least one of the plurality of pixels has a non-object color different from the object colors.
  • a color value for each of the plurality of pixels is determined, then the total number of color values is counted and compared to a predetermined threshold. If the total number of color values exceeds the predetermined threshold, the total number of colors is reduced by calculating a color distance between all of the color values, determining a frequency of a first color value and whether the first color value corresponds to an object color or a non-object color, and determining a frequency of a second color value and whether the second color value corresponds to an object color or a non-object color. Based on the calculated color distances and color frequencies, one of the following actions is taken: replacing the first color value with the second color value if the first color value corresponds to a non-object color and the second color value corresponds to an object color; or replacing the second color value with the first color value if the second color value corresponds to a non-object color and the first color value corresponds to an object color.
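The reduction loop described above can be sketched as follows: count the colors in use, and while the count exceeds the threshold, merge the closest pair of colors, preferring to replace a non-object (e.g. anti-aliasing) color with an object color. The Euclidean RGB distance and the exact merge/tie-breaking policy shown are illustrative assumptions; the patent text does not pin the process to this particular metric:

```python
from collections import Counter
from itertools import combinations

def color_distance(c1, c2):
    # Euclidean distance in RGB space (an illustrative choice of metric).
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def reduce_colors(pixels, object_colors, threshold):
    """Merge colors until at most `threshold` distinct values remain."""
    pixels = list(pixels)
    while len(set(pixels)) > threshold:
        freq = Counter(pixels)
        # Find the closest pair of distinct colors currently in use.
        c1, c2 = min(combinations(freq, 2), key=lambda p: color_distance(*p))
        # Prefer to keep an object color; otherwise keep the more frequent one.
        if c1 in object_colors and c2 not in object_colors:
            keep, drop = c1, c2
        elif c2 in object_colors and c1 not in object_colors:
            keep, drop = c2, c1
        else:
            keep, drop = (c1, c2) if freq[c1] >= freq[c2] else (c2, c1)
        pixels = [keep if p == drop else p for p in pixels]
    return pixels
```

With this policy, anti-aliasing shades that sit close to a line or text color collapse into that object color, shrinking the palette without visibly altering the chart.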
  • a method of converting a vector graphics file of an aircraft navigation chart into a raster graphics file using a computer running a Windows® operating system includes loading the vector graphics file into the computer and using a GetDIBits function of the Windows operating system to determine a first set of pixels corresponding to an entire image of the aircraft navigation chart, a second set of pixels corresponding to a first portion of the aircraft navigation chart, and a third set of pixels corresponding to a second portion of the aircraft navigation chart wherein the second set of pixels includes a plurality of pixels not contained within the third set.
  • the second and third sets of pixels are compared against the first set of pixels to determine if the pixels in the second and third sets are the same as the corresponding pixels in the first set. If they are not the same, any discrepancy between the pixels of the second and third sets and the pixels of the first set is flagged. If they are the same, a sufficient number of the pixels are saved in a raster graphics file to define an entire image of the navigation chart.
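A hedged sketch of this verification pass, with each pixel set represented as a dictionary keyed by coordinate (a data layout chosen here purely for illustration, not the patent's):

```python
# Sketch: check the section pixel sets against the full-image pixel set
# and flag any coordinate whose color does not match.

def verify_sections(full_image, *sections):
    """full_image and each section map (x, y) -> color value."""
    discrepancies = []
    for section in sections:
        for xy, color in section.items():
            if full_image.get(xy) != color:
                discrepancies.append(xy)
    return discrepancies  # an empty list means the sets agree
```

Only when the returned list is empty would the conversion proceed to save the pixels into the raster graphics file.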
  • the navigation charts may be aircraft navigation charts that include a section illustrating a profile view of a desired course of the aircraft.
  • Data may be stored in memory identifying the location of the profile view section of the navigation chart within the raster graphics image.
  • the data identifying the various sections of the navigation chart may be stored within the same electronic file as the image data, or it may be stored in a file separate from the image data.
  • the aircraft navigation chart may also include information relating to flight minimums in another section, and data may be stored in memory specifying the location of the flight minimum information within the raster graphics image.
  • a day and a night palette may also be stored in memory and accompany the raster graphics image of the navigation chart whereby the raster graphics image can be displayed with different colors depending upon the time of the day and/or ambient light conditions.
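Storing the pixels once as palette indices lets a single image serve both lighting conditions: only the palette is swapped at render time. The palettes below are illustrative stand-ins for a chart's day and night color tables, not values from the patent:

```python
# Sketch: one indexed-color image, two palettes. Swapping the palette
# re-colors the whole chart without touching the pixel data.

DAY_PALETTE   = {0: (255, 255, 255), 1: (0, 0, 0),   2: (0, 0, 255)}
NIGHT_PALETTE = {0: (0, 0, 0),       1: (0, 255, 0), 2: (80, 80, 255)}

def render(indexed_pixels, night_mode=False):
    palette = NIGHT_PALETTE if night_mode else DAY_PALETTE
    return [palette[i] for i in indexed_pixels]
```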
  • the size of the raster graphics image of the navigation chart can be reduced by lowering the number of color values for the pixels to a number less than or equal to a predefined threshold.
  • the manner of reducing the number of color values for the pixels may involve altering the color values of selected anti-aliasing pixels such that the selected anti-aliasing pixels are assigned new color values that are the same as the color values of other pixels within the navigation chart image.
  • the method and systems of the present invention provide improved electronic images of navigation charts that are more easily adapted to different computerized display platforms.
  • the electronic images consume relatively small amounts of memory, can be rendered without undue computational demands, provide all the navigation information of prior art navigation charts, and can be manipulated in the same manners as the navigation chart images of the prior art.
  • because the electronic images of the present invention do not impose the computational and/or software requirements necessary to render vector graphics images, the images of the present invention can be displayed on a wider variety of electronic devices than past images, including, but not limited to, cell phones, PDAs, wearable video displays and video glasses, portable media players such as iPods, and other similar devices.
  • the reduced computational and software requirements necessary to display the charts of the present invention allow the charts to be incorporated into a client/server architecture where a client requests a particular chart and the server delivers it to the client.
  • FIG. 1 is a block diagram of a navigational display system according to one aspect of the present invention
  • FIG. 1A is a block diagram of a navigational display system for a mobile vehicle according to another aspect of the present invention
  • FIG. 2 is an elevational view of a pair of flight deck displays that may be used in conjunction with the navigational display systems of FIGS. 1 or 1A
  • FIG. 3 is an example of an aircraft navigation chart that may be used in accordance with various aspects of the present invention.
  • FIG. 4 is a table of a first set of metadata inserted into either a raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
  • FIG. 5 is a table of a second set of metadata inserted into either the raster graphics file containing an image of the navigation chart or a related file that accompanies the raster graphics file;
  • FIG. 6 is a cell phone shown displaying a navigation chart that was read from a raster graphics file in accordance with the present invention;
  • FIG. 7 is a block diagram of a chart conversion process for changing electronic navigation charts from a vector graphics file to a raster graphics format;
  • FIG. 8 is a flowchart illustrating in greater detail a sequence of steps that may be followed in carrying out the chart conversion process of FIG. 7;
  • FIG. 9 is a more detailed flowchart of the comparison method illustrated in FIG. 8;
  • FIG. 10a is a diagram representing a generic example of a raster graphic navigation chart image stored as a plurality of pixels wherein each pixel has a thirty-two bit color value associated with it;
  • FIG. 10b is a table tabulating a frequency of usage of each of the color values in the generic navigation chart example of FIG. 10a;
  • FIG. 10c is an unreduced day color palette correlating an index value to all of the colors in the table of FIG. 10b that have a non-zero usage frequency;
  • FIG. 10d is a reduced day color palette table illustrating a reduced set of color values produced after the color values in the palette of FIG. 10c have undergone a color reduction process;
  • FIG. 11 is a flowchart of a color reduction process according to one aspect of the present invention
  • FIG. 12a is a diagram illustrating in more detail an example of a color distance computation according to a color reduction process used with one aspect of the present invention
  • FIG. 12b is a table arranging pairs of indexed color values in order from the smallest distance to the greatest distance
  • FIG. 13a is a table illustrating a color mapping between day and night colors of the navigation chart as provided in an original vector graphics file of the navigation chart;
  • FIG. 13b is the reduced day color palette table of FIG. 10d reproduced for ease of reference in conjunction with FIGS. 13c and 13d;
  • FIG. 13c is a generic example of a daytime navigation chart image wherein three pixels having the same color value are highlighted;
  • FIG. 13d is a generic example of a nighttime navigation chart image that corresponds to the daytime navigation chart image of FIG. 13c;
  • FIG. 13e is a table illustrating a night palette for a raster graphics image of a navigation chart;
  • FIG. 14 is a flowchart of a night palette creation process according to one aspect of the present invention;
  • FIG. 15 is a diagram illustrating standard data blocks and data fields of a conventional bitmap computer file;
  • FIG. 16 is a diagram illustrating various data blocks and data fields of a raster graphics file having an alternative format;
  • FIG. 17 is a more detailed flowchart of the feedback method illustrated in FIG. 8;
  • FIG. 18 is a more detailed flowchart of a bitmap-to-target (BMP2Target) process used in the feedback flowchart of FIG. 17; and
  • FIG. 19 is a more detailed flowchart of a target-to-bitmap (Target2BMP) process used in the feedback flowchart of FIG. 17.
  • Navigational display system 20 includes a controller 22, a memory 24, a user interface 26, and a display 28.
  • Memory 24 contains one or more raster graphics files 25 that contain images of one or more navigation charts.
  • navigational display system 20 is adapted to display these navigation charts on display 28 to a viewer.
  • Navigational display system 20 allows a user to view an electronic image of a navigation chart in a variety of different environments.
  • Navigational display system 20 may be incorporated into any known electronic device capable of displaying raster graphic image files, such as, but not limited to, a conventional computer, a cell phone, a personal digital assistant, a dashboard GPS display for an automobile, an electronic flight deck computer system of an aircraft or spacecraft, a laptop computer, an electronic navigational display for a surface or submersible marine vessel, or other types of electronic displays. Whatever device navigational display system 20 is incorporated into, it can be used to electronically display navigation charts in accordance with the principles described in more detail below.
  • the display of navigation charts on navigational display system 20 can be performed during the operation of a mobile vehicle, such as an airplane, while the vehicle is moving. Alternatively, navigational display system 20 can be used to view charts from locations outside of a mobile vehicle. Regardless of where navigational display system 20 is utilized, a user can view any of a database of navigational charts stored in memory 24. User interface 26 allows a user to zoom in, zoom out, scroll up, down, left, and right, and rotate chart images while viewing the navigation charts displayed on display 28. Still further, as will be discussed in greater detail below, navigational display system 20 can automatically locate different sections of a navigation chart and display only those selected sections on display 28. Other capabilities of navigational display system 20 will be discussed further below.
  • Navigational display system 20 may be modified to include a navigation system 30, such as is illustrated in FIG. 1A.
  • FIG. 1A illustrates a modified version of navigational display system 20 that will be referred to as navigational display system 20'.
  • Navigational display system 20' is especially useful for displaying navigational charts on a mobile vehicle, particularly while the mobile vehicle is moving.
  • Navigation system 30 allows display system 20' to display the navigation charts on the display 28 in a manner in which the current position of the mobile vehicle is indicated by an icon or other symbol placed by system 20' on top of the navigational chart being displayed. This allows the operator of the mobile vehicle, which may be an aircraft, boat, or land-based vehicle, to see his or her current position with respect to the navigational chart.
  • Navigational display system 20' includes all of the same components of display system 20 with the addition of navigation system 30, and all of these components operate in the same manner in both of the systems 20 and 20'. More details and features of navigational display systems 20 and 20' will be described below.
  • FIG. 2 depicts an illustrative example of a pair of displays 28a and 28b that may be used in conjunction with either of navigational display systems 20 and 20'.
  • Displays 28a and 28b each include a plurality of buttons 32 located adjacent a bottom edge 34 of the displays 28a and 28b.
  • Buttons 32 constitute one form of user interface 26.
  • Other types of user interfaces 26 may also be used in accordance with the present invention, including, but not limited to, computer mice, touch screens, knobs, keyboards, joysticks, voice recognition devices, and the like, as well as combinations thereof.
  • Buttons 32 are selectively pressed by the operator of the display system 20 or 20' to control the information that is displayed on displays 28a and 28b. In the illustrated example of FIG. 2, buttons 32 are known as soft keys. That is, buttons 32 interact with controller 22 to change what is displayed on displays 28a and 28b based on a menu (not shown) that is displayed on displays 28a and 28b immediately above the buttons 32.
  • An example of such soft keys that may be used in accordance with the present invention is disclosed in commonly assigned, co-pending PCT application serial number PCT/US2006/021390, entitled AIRCRAFT AVIONIC SYSTEM HAVING A PILOT USER INTERFACE WITH CONTEXT DEPENDENT INPUT DEVICES, filed June 2, 2006 in the United States receiving office, the complete disclosure of which is hereby incorporated herein by reference.
  • User interface 26 interacts with controller 22 to cause controller 22 to display different images on either or both of displays 28a and 28b. While FIG. 2 depicts two displays 28a and 28b, it will be understood by those skilled in the art that the invention is applicable to systems having only a single display, or systems having two or more displays. Further, the type of display used in accordance with the present invention can vary widely, from conventional LCD type displays to organic light-emitting diode (OLED) displays to cathode ray tubes (CRTs) to projection displays to head-up displays (HUD) to plasma screens to any other known type of electronic display.
  • Display 28a in FIG. 2 may be a primary flight display (PFD) for an aircraft while display 28b may be a multi-function display (MFD) for an aircraft.
  • display system 20 can be used in environments other than mobile vehicles, and, even when display systems 20 or 20' are used on a mobile vehicle, they can be applied to other mobile vehicles besides aircraft. Further, when display systems 20 and 20' are used in conjunction with an aircraft, the present invention can be applied to different displays other than the MFD or PFD within the aircraft cockpit.
  • the display 28 (or displays) used in accordance with the present invention is configured to be able to display a navigation chart useful for the particular activity the navigation chart relates to, such as flying, boating, driving, or other activities.
  • Memory 24 may be any conventional type of electronic memory such as, but not limited to, RAM, ROM, flash memory, a compact disc, a DVD, a hard drive, an SD or Compact Flash card, a USB portable data stick, a floppy disk, a holographic versatile disc (HVD), or any type of electronic memory capable of being read by a computer, regardless of whether the memory is fixed within the computer or removable from it.
  • Controller 22 may include one or more conventional microprocessors and may be a conventional computer, such as a PC or other known type of computer. In some applications, controller 22 may alternatively be a specialized computer or computer system specifically adapted for controlling various aspects of the overall system in which it is incorporated. For example, if display system 20 is incorporated into a personal digital assistant (PDA), controller 22 would include the processor or processors inside the PDA that perform the conventional functions of the PDA. Alternatively, if display system 20 were incorporated into a cell phone, controller 22 would include the processor(s) inside the cell phone that ran the phone's conventional software and/or firmware.
  • controller 22 may be one or more of the processing components of an electronic flight deck control system that displays such information as aircraft attitude, altitude, heading, position, radio information, engine parameters, a crew alerting and warning system (CAWS) list, weather, and the like to the pilot.
  • Additional environments into which navigational display system 20 can be incorporated include projection cell phones capable of projecting images onto a surface, such as, but not limited to, cell phones using the PicoProjector available from Microvision of Redmond, Washington.
  • Navigational display system 20 can further be incorporated into wearable video displays and video glasses, such as, but not limited to, the iLounge™, available from Myvu Corporation of Westwood, Massachusetts, and the Lumus PD-10™, available from Lumus Ltd. of Rehovot, Israel.
  • controller 22 can include one or more processors that perform a wide variety of other functions in addition to the rendering of navigation chart images.
  • any controller 22 is suitable for the present invention so long as it is capable of reading the raster graphics files 25 that contain the navigation charts and displaying these charts on display 28 in response to some form of prompting which may come from user interface 26, or some other source, such as an electronic signal from a system or subsystem that monitors the stage of a particular journey.
  • While the types of environments in which navigational display system 20 may be implemented can vary, as noted above, the following discussion of the types of navigation charts that may be displayed on navigational display system 20 will primarily be made with respect to aircraft navigation charts. It will be understood that this discussion is for purposes of illustration only, and is not intended to limit the scope of the invention to avionic applications.
  • Navigation chart 36 is a conventional instrument approach chart for an aircraft published by Jeppesen Inc. of Englewood, Colorado.
  • Navigation chart 36 includes a plurality of different sections, including a header section 38, a map plan view section 40, an aircraft profile section 42, and an aircraft minimums section 44 that specifies various minimum information for landing the aircraft.
  • Navigation chart 36 is available from Jeppesen Inc. in both a paper format and an electronic format.
  • navigation chart 36 is provided as a vector graphics file that includes text written in TrueType fonts. As was briefly mentioned in the Background of the Invention section, the use of the vector graphics format and TrueType fonts in navigation charts limits the ability of the charts to be conveniently rendered on many navigational display systems.
  • Navigational display systems 20 and 20' are configured to be able to conveniently render navigational charts that are originally provided in a vector graphics format and that use TrueType fonts. Navigational display systems 20 and 20' accomplish this without requiring the computational resources necessary for rendering vector graphics files, without requiring extensive re-working of the graphics display platform of a particular controller 22, and also while more easily allowing the software used to display navigation chart 36 to achieve a higher DO-178B level rating, such as Level B.
  • Memory 24 of navigational display system 20 has stored in it raster graphics file 25, which contains an image of navigational chart 36.
  • the stored image includes header section 38, map plan view section 40, aircraft profile view section 42, and aircraft minimums section 44.
  • Memory 24 also stores metadata 27 (FIG. 1) within it that identifies which pixels in the raster graphics file correspond to each of the sections 38, 40, 42, and 44.
  • Metadata 27 may be stored in a file separate from raster graphics file 25, such as illustrated in FIGS. 1 and 1A, or it may be stored within raster graphics file 25 itself.
  • the term "metadata" is used herein to generally refer to data that describes other data, such as data that describes the image data of the navigation charts. It will be understood, however, that the term "data" as used herein can refer to either data or metadata.
  • controller 22 may be programmed to allow a pilot to choose, via user interface 26, any one or more sections 38-44 for display on display 28.
  • the pilot could instruct controller 22, via user interface 26, to display only map plan view section 40 on display 28.
  • controller 22 would read the metadata 27 from memory 24 that identifies which pixels in the raster graphics file 25 correspond to map plan view section 40 and display only those pixels on display 28. This would allow the pilot to more easily focus on only the map plan view section 40 of navigation chart 36.
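The lookup described above can be sketched as follows, assuming (for illustration only) that metadata 27 records each section as a pixel bounding box over the raster image; the section names and box convention are hypothetical:

```python
# Sketch: using section metadata to extract one chart section for display.
# Boxes are (left, top, right, bottom) in image pixels; values are
# illustrative placeholders, not taken from an actual chart.

SECTION_BOXES = {
    "header": (0, 0, 600, 80),
    "plan_view": (0, 80, 600, 480),
    "profile": (0, 480, 600, 560),
    "minimums": (0, 560, 600, 640),
}

def crop_section(pixels, width, box):
    # 'pixels' is a flat, row-major list of color values for the full image.
    left, top, right, bottom = box
    return [pixels[y * width + x]
            for y in range(top, bottom)
            for x in range(left, right)]
```

The controller would pass the box for "plan_view" to show only map plan view section 40, or the whole image bounds to show the entire chart.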
  • the pilot could select any other one of sections 38-44 for display by itself on display 28, or he or she could select any combination of two or more sections 38-44 for simultaneous display on display 28.
  • the pilot can also, of course, have the entire navigational chart 36 displayed at one time on display 28.
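The per-section display described above amounts to cropping the raster image to the pixel rectangle recorded in metadata 27. A minimal sketch, assuming a toy nested-list image and an illustrative (left, top, right, bottom) rectangle format not prescribed by the patent:

```python
# Sketch of selecting one chart section from a raster image using the
# pixel-rectangle metadata described for controller 22. The rectangle
# format (left, top, right, bottom) and function name are illustrative
# assumptions, not taken from the patent.

def crop_section(pixels, section_rect):
    """Return only the pixels belonging to one chart section.

    pixels: 2D list of rows (row 0 is the top of the image).
    section_rect: (left, top, right, bottom) bounds; right/bottom exclusive.
    """
    left, top, right, bottom = section_rect
    return [row[left:right] for row in pixels[top:bottom]]

# Toy 4x4 "image" in which each pixel is labeled by its (row, col).
image = [[(r, c) for c in range(4)] for r in range(4)]

# Metadata might say one section occupies columns 1-2 of rows 2-3.
section = crop_section(image, (1, 2, 3, 4))
```

Displaying only the map plan view section 40 would then reduce to rendering the cropped pixel block, with no vector rasterization involved.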
  • controller 22 and user interface 26 are also configured to allow the pilot (or other user of display system 20) to zoom in or zoom out on whatever portion of navigation chart 36 that is being displayed on display 28 (i.e. zooming in and out can be done regardless of whether the entire navigational chart 36 is being displayed, or only selected sections 38-44 of it).
  • navigational display system 20' may be configured to display the aircraft's current location on top of navigational chart 36 so that a pilot can immediately see his or her location with respect to navigation chart 36, as will be described more below.
  • Navigational display systems 20 and 20' can also overlay a planned flight plan on top of the navigational chart, if desired,
  • Navigation system 30 of display system 20' may be any conventional navigation system, such as, but not limited to, a GPS-based navigation system or an inertial reference system.
  • navigation system 30 may include one or more accelerometers, gyroscopes, magnetometers, radio beacon receivers, or any other conventional navigation equipment used on aircraft.
  • Navigation system 30 determines the current location of the mobile vehicle with respect to a known reference system, such as latitude and longitude, or a GPS coordinate system, or any other reference system which can determine a position in a manner that can be correlated to navigation chart 36.
  • map plan view section 40 of navigation chart 36 includes various navigation landmarks, such as a river 52, an airport 54, a VOR station 56, an intersection 58 (KILBY), and a plurality of potential obstacles 60.
  • Map plan view section 40 also includes geographic references that tie the information displayed in section 40 to an external reference system.
  • map plan view section 40 includes latitude markings 46 and longitude markings 48 which indicate the position of the map's contents with respect to the Earth's latitude and longitude references.
  • map plan view section 40 is drawn to a known scale. This known scale, along with the latitude and longitude markings 46 and 48, may be stored as part of metadata 27. When done so, this metadata allows controller 22 to display on display 28 the current position of the aircraft (or other type of mobile vehicle on which navigational display system 20' is implemented).
  • FIG. 2 One example of the display of the mobile vehicle's current position on top of navigational chart 36 is illustrated in FIG. 2.
  • Display 28b is shown displaying a map plan view section 40 of a navigation chart 36 (which is a different chart than the specific one illustrated in FIG. 3).
  • An aircraft icon 50 is also shown on display 28b at a location west (to the left) of river 52 and northeast of airport 54.
  • Controller 22 overlays the aircraft icon 50 on top of navigation chart 36 at a location on the navigation chart that coincides with the aircraft's current position with respect to the map plan view section of navigation chart 36.
  • the controller 22 will use the metadata 27 containing the latitudinal and longitudinal references to display the aircraft icon 50 on top of plan view section 40 at the 50 degree, 10 minute north latitude and 90 degree, 32 minute west longitude position on the map. This will allow the pilot to immediately see his or her current position with respect to the items that are included within the map plan view section 40, such as river 52, airport 54, etc.
  • Controller 22 updates the position of aircraft icon 50 on display 28 as the aircraft moves. This updating may take place at a rate of several times a second, although other rates may also be used. The updating is based on the information controller 22 receives from navigation system 30. Thus, in the example of FIG. 2, if the aircraft continues flying north, controller 22 will repetitively adjust the position of aircraft icon 50 upwards on display 28b while the underlying image of the map plan view section 40 remains stationary. The aircraft icon 50 will therefore move upward across display 28b in accordance with the corresponding movement of the aircraft through the sky. The visual effect presented to the pilot is thus similar to what the pilot would see if he were physically located at a distance above the aircraft and looking down at the Earth, which was represented by the map plan view 40 of the navigation chart 36.
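For a to-scale chart, placing aircraft icon 50 reduces to a linear mapping from latitude and longitude to a pixel position within the plan-view image. The sketch below assumes such a simple linear, unprojected chart; the function name and argument layout are illustrative, not from the patent:

```python
def latlon_to_pixel(lat, lon, chart_bounds, image_size):
    """Map an aircraft position to a pixel in the plan-view image.

    chart_bounds: (lat_south, lon_west, lat_north, lon_east) of the plan view,
                  with west longitudes expressed as negative degrees.
    image_size:   (width, height) of the plan-view image in pixels.
    Assumes a linear, to-scale chart; real charts may require
    projection-aware math.
    """
    lat_s, lon_w, lat_n, lon_e = chart_bounds
    width, height = image_size
    x = (lon - lon_w) / (lon_e - lon_w) * width
    y = (lat_n - lat) / (lat_n - lat_s) * height  # row 0 is the top of the image
    return round(x), round(y)

# A hypothetical chart spanning 50deg00'N-50deg20'N and 91deg00'W-90deg00'W,
# drawn at 600x400 pixels; the aircraft sits at 50deg10'N, 90deg32'W.
x, y = latlon_to_pixel(50 + 10 / 60, -(90 + 32 / 60),
                       (50.0, -91.0, 50 + 20 / 60, -90.0), (600, 400))
```

Controller 22 would re-run such a mapping each time navigation system 30 reports a new position, redrawing icon 50 at the resulting pixel while the underlying image stays fixed.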
  • controller 22 can react in any of a variety of different manners, including removing aircraft icon 50 from the display, issuing a warning to the pilot, searching for another navigation chart 36 that corresponds to the geographic region into which the aircraft has moved and automatically displaying such a navigation chart (if located), removing the display of navigation chart 36, indicating and updating the distance the aircraft has flown out of the range of the chart, or any other action that would be appropriate for the situation.
  • aircraft icon 50 can be varied within the present invention.
  • icon 50 can be adapted to reflect the visual appearance of the specific type of aircraft.
  • any other suitable non-aircraft icon or indication can be used to display the current position of the aircraft on navigation chart 36.
  • navigational display system 20' is used on a mobile vehicle that is not an aircraft, aircraft icon 50 can be replaced with an icon representing the type of vehicle on which system 20' is implemented, e.g. a boat, a car, an RV, etc., or any other type of indication that provides a visual cue to the operator of the mobile vehicle of the vehicle's current location with respect to the underlying navigation chart 36.
  • navigation chart 36 may include one or more insets 62 on the plan view map section 40 (FIG. 2).
  • Insets 62 may display a variety of different types of information, such as an enlargement of a particular area of the map or textual information relating to a particular area of a map.
  • memory 24 will store the location of each and every inset 62 in a particular navigation chart as part of metadata 27.
  • this metadata may be stored within the raster graphics file 25 that contains the image data for the navigation chart 36, or it may be stored separately from the raster graphics file 25.
  • metadata 27 could be stored in a memory separate from memory 24.
  • controller 22 will read this information and repetitively check to see if the current location of the aircraft has moved to a position that lies over one of the insets 62. If it has, controller 22 will react in any one of a variety of different manners. In one embodiment, if the inset 62 contains an enlargement of a particular section of a map and that inset happens to be scaled with geographic references, controller 22 will shift the position of aircraft icon 50 to the geographically proper location within the inset 62. This shifting may optionally involve a change in the size of icon 50.
  • controller 22 may simply remove aircraft icon from display 28 until the aircraft moves to a location that no longer falls within the region encompassed by inset 62.
  • controller 22 may continue to display icon 50 on display 28 at a location that coincides with the latitudinal and longitudinal marks 46 and 48 outside the inset 62, despite the fact that the location of icon 50 might not represent the aircraft's actual location with respect to the interior of inset 62.
  • This continued display of icon 50 could involve a change in its color or other attribute in order to give the pilot a visual indication that the location of icon 50 is not necessarily accurate within the area defined by inset 62.
  • Other variations are possible as well.
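The inset handling above starts with a hit test: deciding whether the aircraft's position falls inside any inset 62 before drawing icon 50. A minimal sketch, assuming each inset is represented as a (lat_south, lon_west, lat_north, lon_east) rectangle (an illustrative choice, not the patent's format):

```python
# Sketch of the inset hit test controller 22 might perform before
# choosing how to draw (or hide, or recolor) aircraft icon 50.

def inset_containing(lat, lon, insets):
    """Return the index of the first inset containing the position, else None."""
    for i, (lat_s, lon_w, lat_n, lon_e) in enumerate(insets):
        if lat_s <= lat <= lat_n and lon_w <= lon <= lon_e:
            return i
    return None

insets = [(50.00, -91.0, 50.10, -90.8),   # hypothetical inset rectangles
          (50.20, -90.5, 50.30, -90.3)]
hit = inset_containing(50.25, -90.4, insets)    # inside the second inset
miss = inset_containing(50.15, -90.6, insets)   # over neither inset
```

When the test returns an index, controller 22 could then apply any of the reactions described above: relocating, hiding, or recoloring icon 50.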
  • navigation chart 36 of FIG. 3 has been divided into four sections — header 38, map plan view 40, profile view 42, and minimums 44 — it will be understood by those skilled in the art that the present invention does not limit the specific number of sections into which any particular navigation chart 36 may be divided. Indeed, it would be possible to divide the navigation chart 36 of FIG. 3 into more or fewer sections than the four illustrated therein.
  • the top row of information in the navigation chart of FIG. 3 contains various radio frequency information, including the radio frequencies for the Automatic Terminal Information Service (ATIS), the Green Bay approach, the Minneapolis Center, the Green Bay tower, etc. This entire row of radio frequency information could be considered a separate section of navigation chart 36. Or this row could be further subdivided into smaller sections.
  • for each navigation chart 36, memory 24 will store the location of each section as part of metadata 27. That is, memory 24 will store in metadata 27 sufficient information to identify which pixels in the raster graphics file 25 correspond to each and every different section of the chart 36. This data will allow controller 22 to selectively display, upon prompting via user interface 26, each of the sections of chart 36 either individually or in any desired combination, as was discussed above.
  • Each raster graphics file 25 contains a raster graphics image of one or more navigation charts 36.
  • Each raster graphics image includes a plurality of pixels that, when combined together in the proper arrangement, create the image of the navigation chart 36.
  • the specific format of the file containing the raster graphics image of the navigation chart 36 can vary within the scope of the invention.
  • raster graphics file 25 may be stored as a conventional bitmap file.
  • Other types of file formats may also be used, and the invention contemplates tailoring the format of the raster graphics file 25 and accompanying metadata 27 to the specific needs and formats required by a particular navigational display system 20 or 20', or other device that may display an image of the navigation chart 36.
  • metadata 27 may contain information identifying a geographic reference for a chart, a scale, the location of different map sections, and the location of different insets within a given map.
  • This list of information that may be stored within metadata 27 is only an illustrative example of the types of information that may be stored in memory. Changes and additions to this list are within the scope of the invention.
  • An example of one set of metadata 27 that may be stored for an aircraft navigation chart is listed in the tables of FIGS. 4 and 5.
  • the metadata identified in FIGS. 4 and 5 is divided into a plurality of data fields 64. Each of the data fields 64 is identified in the leftmost column of FIGS. 4 and 5.
  • the size of the data field in bytes is listed in the second column from the left, followed by a short description of the data field in the next column to the right, and an indication of the type of data field in the right-most data column.
  • the data fields 64 listed in FIGS. 4 and 5 are merely illustrative of the types of data fields 64 that may be used in accordance with the present invention. In other words, the precise number and types of data fields 64 that comprise metadata 27 may vary substantially from that depicted in FIGS. 4 and 5, including additional metadata not illustrated in FIGS. 4 and 5. Further, the size of the data fields can be varied, along with the field types that define the format of the metadata in the data fields 64. The meaning of the data fields 64 of FIGS. 4 and 5 will now be described.
  • the m_tiches data field (FIG. 4) specifies the size of the chart in thousandths of inches.
  • the data in the m_tiches data field may be formatted in a manner referred to as a "Magnitude2d" type of field, which specifies a first set of four bytes that identifies the width of the navigation chart in thousandths of inches, and a second set of four bytes that identifies the length of the navigation chart in thousandths of inches.
  • the m_whole data field (FIG. 4) identifies the location and extent of the whole navigation chart 36 in whatever coordinate system the navigation chart 36 uses.
  • the metadata within the m_whole data field may be formatted in a manner referred to as a "Rect" type of field, which specifies a first set of eight bytes that identifies the coordinates of the lower left corner of the entire navigation chart 36 and a second set of eight bytes that identifies the coordinates of the upper right corner of the entire navigation chart 36.
  • the m_angleToRotateHeader data field (FIG. 4) identifies what angle the image data of the navigation chart 36 will need to be rotated (if any) in order for the image to be presented on a display with the header section 38 oriented toward the top of the display.
  • the m angleToRotateHeader data field is useful where some navigation charts 36 may be oriented in a landscape orientation and other ones may be oriented in a portrait orientation. Controller 22 can read the metadata in the m_angleToRotateHeader data field and use this to automatically display the navigation chart 36 in the proper orientation, thereby relieving the viewer of the task of having to re-orient the navigation chart manually through buttons 32, or some other type of user interface 26.
  • the m_angleToRotateHeader data field may be formatted in a manner referred to as a "float" type of data field, which simply refers to a four-byte floating point number.
  • the m_isToScale data field (FIG. 4) identifies whether the navigation chart 36 is drawn to scale or not.
  • the metadata 27 within the m_isToScale data field may be stored as a single bit (or byte) in which a value of one means navigation chart 36 is drawn to scale and a value of zero means navigation chart 36 is not drawn to scale, or vice versa. This format is referred to as "bool" in FIG. 4. If the m_isToScale data field indicates that the navigation chart 36 is drawn to scale, then additional metadata will be stored in memory 24, such as that identified in FIG. 5. This additional metadata will be discussed more below with respect to FIG. 5.
  • the m_sizeOfMetadata data field (FIG. 4) identifies the total size of the metadata 27 that accompanies the image data of the navigation chart 36. As mentioned above, FIGS. 4 and 5 identify all of the metadata 27 that may accompany a particular navigation chart 36. In some situations, the particular fields 64 of metadata 27 that accompany a given navigation chart 36 will vary from one chart to another, such as when some charts are drawn to scale and other charts are not drawn to scale.
  • the m_sizeOfMetadata field identifies the total size of whatever metadata 27 happens to accompany a particular image of a navigation chart 36.
  • the m_sizeOfFilename data field (FIG. 4) identifies the size of the file name that will be used in the target system.
  • the target system refers to the particular display system that will be displaying the navigation chart.
  • the metadata in the m_sizeOfFilename data field may be stored as an unsigned integer ("unsigned int").
  • the m_pFilename data field (FIG. 4) identifies the file name of the raster graphics file 25.
  • This data field 64 allows the target raster graphics file 25 to be correlated to the name originally given to a particular navigation chart by the vendor or supplier of the vector graphics file of that navigation chart.
  • the size of this data field is variable and determined by the value stored in the m_sizeOfFilename data field, discussed above.
  • the metadata 27 in the m_pFilename data field may be stored as a string of unsigned characters ("unsigned char").
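The fixed-size FIG. 4 fields can be pictured as a packed binary record. The sketch below round-trips a few of them with Python's struct module; the byte order, field order, and exact encodings are assumptions for illustration, since the patent specifies only field sizes and type names:

```python
import struct

# Layout sketch (assumed): m_tiches as a Magnitude2d (two 4-byte ints),
# m_whole as a Rect (four 4-byte ints), m_angleToRotateHeader as a 4-byte
# float, and m_isToScale as a single byte. "<" selects little-endian with
# no padding between fields.
HEADER_FMT = "<2i4if?"

def pack_header(size_mils, whole_rect, rotate_deg, is_to_scale):
    w, h = size_mils               # chart size in thousandths of inches
    x0, y0, x1, y1 = whole_rect    # lower-left and upper-right chart coordinates
    return struct.pack(HEADER_FMT, w, h, x0, y0, x1, y1, rotate_deg, is_to_scale)

def unpack_header(blob):
    w, h, x0, y0, x1, y1, rot, scaled = struct.unpack(HEADER_FMT, blob)
    return {"m_tiches": (w, h), "m_whole": (x0, y0, x1, y1),
            "m_angleToRotateHeader": rot, "m_isToScale": scaled}

# Hypothetical 8.5in x 11in chart, rotated 90 degrees, drawn to scale.
blob = pack_header((8500, 11000), (0, 0, 8500, 11000), 90.0, True)
meta = unpack_header(blob)
```

A reader in the target display system would perform the unpack step once when loading a chart, then consult the resulting dictionary when displaying or rotating the image.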
  • FIG. 5 depicts additional metadata that may usefully be stored in memory 24 (or another memory accessible to controller 22) if the data field m isToScale (FIG. 4) indicates that the navigation chart 36 is drawn to scale. If the navigation chart 36 is not drawn to scale, the data fields 64 of FIG. 5 may be omitted in their entirety.
  • the m_header data field (FIG. 5) identifies the location and extent of header section 38 of navigation chart 36. Specifically, the m_header data field identifies which pixels in the image of navigation chart 36 correspond to header section 38.
  • the metadata within this field may be stored in the "Rect" format, which defines a first set of eight bytes of data that identify the lower left coordinates of a rectangle and a second set of eight bytes of data that identify the upper right coordinates of the rectangle.
  • the specific coordinates used to identify these two locations may be based on the coordinate reference system used in the m_whole data field (discussed above).
  • the m_plan data field (FIG. 5) identifies which pixels in the image of navigation chart 36 correspond to the map plan view section 40.
  • the metadata in the m_plan data field may be stored as two sets of eight bytes wherein the first set identifies the coordinates of the lower left corner of the rectangle of plan view section 40 and the second set identifies the coordinates of the upper right corner of the rectangle of plan view section 40.
  • the m_planLatLon data field (FIG. 5) identifies the location and extent of the map plan view section 40 in latitudinal and longitudinal coordinates.
  • the "LatLonRect" field type may be defined as a first set of sixteen bytes that identifies the latitude and longitude of the lower left corner of the map plan view section 40, and a second set of sixteen bytes that identifies the latitude and longitude of the upper right corner of the map plan view section 40.
  • the m_isProfilePresent data field (FIG. 5) identifies whether navigation chart 36 includes a profile section 42 or not. This information may be stored as a single bit (or byte) wherein a zero value indicates that navigation chart 36 does not include a profile view section 42 and a one value indicates that chart 36 does include a profile view section, or vice versa.
  • the m_profile data field (FIG. 5) identifies the location and extent of the aircraft profile view section 42 of navigation chart 36, if such a section 42 is present in chart 36.
  • the metadata in the m_profile data field may be stored as a first set of eight bytes that defines the lower left coordinates of the profile view section 42 and a second set of eight bytes that defines the upper right coordinates of the profile view section 42.
  • the m_minimum data field (FIG. 5) identifies the location and extent of the aircraft minimums section 44 of navigation chart 36.
  • the metadata in the m_minimum data field may be stored as a first set of eight bytes that defines the lower left coordinates of the minimums section 44 and a second set of eight bytes that defines the upper right coordinates of the minimums section 44.
  • the m_numberOfInsets data field (FIG. 5) identifies the number of insets 62 that are present (if any) within the plan view section 40 of navigation chart 36.
  • the metadata in the m_numberOfInsets data field may be stored as a four-byte unsigned integer.
  • the m_pInset data field (FIG. 5) identifies the location and extent of each of the insets 62 that are contained within the plan view section 40 of navigation chart 36.
  • the m_pInset data field will occupy a size that is dependent upon the actual number of insets 62 within the plan view section 40 of a given navigation chart 36.
  • a first set of eight bytes may be used to identify the lower left coordinates of the inset and a second set of eight bytes may be used to identify the upper right coordinates of the inset wherein the coordinates are specified in the same coordinate reference system used in the other metadata fields (e.g. the m_whole data field).
  • the m_pInsetLatLon data field (FIG. 5) also identifies the location and extent of each of the insets 62 that are contained within the plan view section 40 of navigation chart 36.
  • the m_pInsetLatLon data field differs from the above-described m_pInset data field in that it defines the lower left corner and upper right corner of the inset 62 in latitudinal and longitudinal reference coordinates. It will be understood that, if the target navigational display system is configured to operate using a latitudinal and longitudinal reference system, rather than some other particularized reference system, the m_pInset data field could be omitted while retaining the m_pInsetLatLon data field.
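The variable-length inset metadata — a count followed by one rectangle per inset — can be sketched the same way. Reading each sixteen-byte corner as two 8-byte doubles (latitude, longitude) is an assumption consistent with the stated field sizes, not a layout the patent mandates:

```python
import struct

def pack_insets(insets):
    """Serialize a count followed by one lat/lon rectangle per inset.

    Each inset is (lat_south, lon_west, lat_north, lon_east): two corners of
    sixteen bytes each, read here as pairs of 8-byte doubles.
    """
    blob = struct.pack("<I", len(insets))          # the inset count
    for rect in insets:
        blob += struct.pack("<4d", *rect)          # one lat/lon rectangle
    return blob

def unpack_insets(blob):
    (count,) = struct.unpack_from("<I", blob, 0)
    insets, offset = [], 4
    for _ in range(count):
        insets.append(struct.unpack_from("<4d", blob, offset))
        offset += 32                               # 4 doubles per rectangle
    return insets

rects = [(50.0, -91.0, 50.1, -90.8), (50.2, -90.5, 50.3, -90.3)]
restored = unpack_insets(pack_insets(rects))
```

Doubles round-trip exactly through this encoding, so the restored rectangles match the originals bit for bit.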
  • the data fields 64 illustrated in FIGS. 4 and 5 are merely illustrative of the types of metadata 27 that may be stored in memory 24. Different types of data fields, different numbers of data fields, and different formats for the data fields may be used in accordance with the present invention. Further, the precise location where data fields 64 are stored in memory can also be varied within the scope of the present invention.
  • the metadata 27 of data fields 64 that accompanies a particular navigation chart 36 are stored in memory 24 as part of the raster graphics file 25 that contains the image of that particular navigation chart 36. That is, the raster graphics file 25 that contains the image data for a navigation chart also includes the metadata 27 for that particular chart.
  • the data fields 64 that accompany a particular chart are stored in memory 24 in a file separate from the raster graphics file 25 that contains the image data of the particular chart.
  • controller 22 will read two different files when displaying a particular navigation chart on display 28: the raster graphics file 25 containing the image of the navigation chart, and a separate file containing the metadata of data fields 64 that correspond to that navigation chart.
  • one or both of these two files should include information that allows the data fields 64 to be correlated to a particular raster graphics file 25.
  • metadata could be stored in a memory separate from memory 24, if desired.
  • the particular number, kind, and format of the data fields 64 that accompany a given navigation chart 36 may vary considerably depending upon the particular form of the navigation chart 36. While the aircraft navigation chart 36 of FIG. 3, which includes rectangular sections 38-44, has been referenced herein, the present invention is applicable to navigation charts 36 that are divided into sections having shapes other than rectangles. For example, some navigation charts 36 might include one or more circular sections, or one or more square sections, or some other type of polygon or non-polygonal shape. If it is desirable for controller 22 to be able to recognize these non-rectangular shapes (such as for display purposes), then additional data fields 64 identifying which pixels in the raster graphics file 25 of the navigation chart correspond to those variously shaped sections would be stored as metadata 27.
  • these additional data fields 64 could be varied, but could include data identifying a center point and a radius for the circular sections, two corner locations for the square sections, and whatever metadata 27 that would be necessary to define the location of whatever other types of shaped sections navigation chart 36 contained.
  • the specific data fields 64 that accompany a given navigation chart can therefore be tailored to the particular layout and information contained with a given navigation chart.
  • navigation charts 36 may include multiple sections
  • the present invention contemplates using additional data fields 64 like those described above to store information about the scale and/or geographic coordinate system of those multiple sections.
  • the present invention contemplates storing, in addition to the raster graphics image data of a navigation chart, any type of further metadata 27 about the chart that may be useful for controller 22 to know about the image data for purposes of facilitating the display of the navigation chart to the viewer.
  • FIG. 6 depicts another example of one of the many possible manifestations of navigational display system 20 according to the various aspects of the present invention.
  • navigational display system 20 is incorporated into a conventional cell phone 63.
  • Cell phone 63 includes a display area 65 and a plurality of keys 67.
  • display area 65 of cell phone 63 corresponds to display 28
  • keys 67 correspond to user interface 26
  • the internal memory and microprocessor of cell phone 63 (not shown) correspond to memory 24 and controller 22, respectively, of display system 20.
  • Display area 65 of cell phone 63 is illustrated in FIG. 6 displaying an aircraft navigation chart 36 (different from the one of FIG. 3). More specifically, display area 65 of cell phone 63 is illustrated in FIG. 6 displaying the map plan view section 40 and aircraft profile view section 42 of an aircraft navigation chart. The entire navigation chart (which would include header section 38 and minimums section 44) is not shown because the user has pressed the appropriate keys 67 to cause the controller 22 within cell phone 63 to automatically display only sections 40 and 42 of the navigation chart. Further, as can be seen, the display of sections 40 and 42 is not merely a zooming in on these sections of the map, but rather a display in which the sections surrounding sections 40 and 42 have been cut out of the displayed image.
  • a user of display system 20 does not need to undergo the trial-and-error process of manually zooming and scrolling the image of the navigation chart until the appropriate section or sections are displayed. Instead, the user can press a button (or other type of user interface), and controller 22 will automatically display only the desired section at a size that fills, to the extent possible, the viewing area of the display.
  • the precise keys 67 used to manipulate the image of the navigation chart can be varied within the scope of the invention. In general, it is desirable to allow keys 67 to be able to zoom in and out on the navigation chart, automatically display different sections of the chart, and scroll the image of the chart up, down, and side-to-side.
  • controller 22 may include one or more conventional microprocessors programmed to read raster graphics file 25 (and the accompanying metadata 27, if separate from file 25) from memory 24 and cause the associated display 28 to display the image of the navigation chart contained within raster graphics file 25.
  • the microprocessor would be programmed to allow the navigation chart image to be manipulated in the manners described herein (zooming, scrolling, selectively displaying sections, etc) based on inputs from user interface 26.
  • the software necessary to carry out these functions may vary from device to device, but would be well within the ability of a person of ordinary skill in the art to devise without undue experimentation.
  • Controller 22 can also be programmed to download additional navigation charts 36 from one or more databases. This downloading can take place over any computer network, including the Internet. Controller 22 can be configured in a client/server architecture where the database of navigation charts in raster graphics format acts as a server in responding to requests from controller 22. Such an arrangement would allow for the downloading of individual navigation charts in an "on-demand" time frame, i.e. charts could be downloaded right at the moment they are needed. This "on-demand" feature would greatly improve the prior methods of distributing navigation charts, particularly aircraft navigation charts, which have to be purchased in bulk subscriptions, rather than on a chart-by-chart basis.
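The on-demand distribution idea can be sketched as a small client-side cache in front of a chart server. The chart identifiers and the dictionary standing in for the server are hypothetical; a real system would issue a network request on a cache miss:

```python
# Simulated chart server: maps chart identifiers to raster file bytes.
# Both the identifiers and the in-memory "server" are illustrative.
CHART_SERVER = {
    "KGRB-ILS-RWY36": b"<raster bytes>",
    "KMSP-VOR-RWY12": b"<raster bytes>",
}

def fetch_chart(chart_id, cache):
    """Return chart bytes, 'downloading' only the charts actually needed."""
    if chart_id not in cache:
        if chart_id not in CHART_SERVER:
            raise KeyError(f"chart {chart_id} not available")
        cache[chart_id] = CHART_SERVER[chart_id]   # stands in for the download
    return cache[chart_id]

cache = {}
chart = fetch_chart("KGRB-ILS-RWY36", cache)       # fetched on demand
```

The cache grows only with the charts a pilot actually requests, which is the contrast the text draws with bulk subscription distribution.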
  • Chart conversion method 66 begins with a vector graphics file 68 of one or more navigation charts 36 that are stored in any type of conventional memory device.
  • Vector graphics file 68 is fed into a computer 70, which may be a conventional personal computer or any other type of computer capable of being programmed to carry out the functions described herein.
  • the manner in which the vector graphics file 68 (or files) are transferred to computer 70 can vary widely, and could include having computer 70 read the vector graphics files 68 directly from its own internal memory, transferring the vector graphics files to computer 70 via a network (including an Internet connection), physically transporting a memory device (such as a disk, DVD, CD-Rom, flash memory device, etc) to computer 70 and coupling the memory device to computer 70 in the appropriate manner, or still other methods.
  • Raster graphics file 25 contains a set of raster graphics image data that corresponds to the navigation chart. That is, raster graphics file 25 contains the color information for each of the pixels that, when combined together, create a picture or image of the navigation chart.
  • the metadata 27 is the same metadata that was discussed above with respect to data fields 64, and, as was discussed previously, may vary depending upon the layout and composition of a particular navigation chart, as well as what sections of the navigation chart it may be desirable for controller 22 to be able to automatically display by themselves.
  • Electronic repository 76 may be the same as memory 24, but it also includes a wider variety of devices beyond memories specifically associated with a controller, user-interface and display, such as controller 22, user-interface 26, and display 28. More specifically, electronic repository 76 may be a stand-alone memory device having no associated controller, display, or user-interface. Such stand-alone memory devices include, but are not limited to, such devices as a floppy disk, a hard drive, a DVD, a CD-Rom, a flash memory device, or similar type of devices.
  • electronic repository 76 may be a memory inside of a specific device, such as a memory contained within a portable digital assistant (PDA), a portable media player, a cell phone, a computer (laptop or desktop) or any other type of known device capable of electronically storing the raster graphics file 25 and metadata 27.
  • Repository 76 may also be connected to the Internet, or other local or wide area network.
  • Computer 70 may be programmed to combine, for each navigation chart 36, the metadata 27 with the raster graphics file 25. If programmed in this manner, computer 70 will output a single raster graphics file 25 for each navigation chart 36.
  • computer 70 may be programmed to store the raster graphics file 25 and metadata 27 separately, in which case computer 70 will generate two files for each chart 36, or two sets of files for database of multiple charts.
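The single-file option can be sketched by appending the metadata block after the image bytes, with the metadata length stored as a trailer so a reader can locate the block from the end of the file. This particular layout is an assumption for illustration; the patent leaves the combined format open:

```python
import struct

def combine(image_bytes, metadata_bytes):
    """Append the metadata block and a 4-byte length trailer to the image."""
    return image_bytes + metadata_bytes + struct.pack("<I", len(metadata_bytes))

def split(combined):
    """Recover (image_bytes, metadata_bytes) from a combined file."""
    (meta_len,) = struct.unpack("<I", combined[-4:])
    return combined[:-4 - meta_len], combined[-4 - meta_len:-4]

blob = combine(b"<image data>", b"<metadata>")
image, metadata = split(blob)
```

The two-file alternative simply skips the combine step and writes the metadata to its own file, at the cost of needing correlation information between the pair, as the text notes.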
  • the data within electronic repository 76 may be transferred to a mobile vehicle 78, which, as noted previously, could be an air, terrestrial, or marine vehicle.
  • Mobile vehicle 78 may contain navigational display systems 20 or 20', or it may contain a display system different from navigational display systems 20 or 20'.
  • the manner in which the data from repository 76 is transferred to mobile vehicle 78 can vary substantially within the scope of the invention.
  • files 25 and metadata 27 are transmitted wirelessly to a memory onboard mobile vehicle 78 (such as memory 24).
  • repository 76 might be physically transported to mobile vehicle 78 and connected to a computer on board mobile vehicle 78, such as may occur if repository 76 takes the form of a conventional Secure Digital (SD) card, a Compact Flash card, a portable USB (Universal Serial Bus) memory drive, or some other similar type of memory device.
  • the mobile vehicle can display images of the navigation charts 36 via its on-board display system.
  • the on-board display system may be navigational display system 20 or 20', or it may be a different type of on-board display system.
  • The navigation chart(s) 36 are stored as raster graphics files 25 with accompanying metadata 27 (either within the file itself or separate), as opposed to vector graphics files 68.
  • the computational resources required by the on-board display system of the mobile vehicle to render the navigation chart 36 from the raster graphics file 25 and metadata 27 are substantially less than would be required to render the navigation chart 36 from a vector graphics file 68.
  • the navigation chart can be displayed on a wider variety of on-board display systems because the on-board display systems do not need to be able to handle complex tasks, such as the rendering of TrueType fonts or the processing of vector graphics files written in specialized formats.
  • higher safety ratings for the software that renders the navigation chart can be more easily achieved (such as those specified in DO-178B) because the software necessary to render an image from the raster graphics file 25 and metadata 27 is simpler.
  • FIG. 8 illustrates a flowchart summarizing in greater detail the series of steps computer 70 may be programmed to follow in order to carry out the chart conversion process 66 outlined in FIG. 7.
  • the steps illustrated in FIG. 8 are only one specific manner in which computer 70 may be programmed to carry out various aspects of the present invention, and it will be understood that computer 70 could be programmed to convert the vector graphics files 68 to raster graphics files 25 and metadata 27 in a variety of different manners.
  • Chart conversion process 66 begins with a vector graphic file 68 that contains vector graphics image data 80 of a navigation chart 36.
  • the vector graphics image data 80 contains the information that defines the image of the navigation chart 36 using the vector graphics method of defining images.
  • Vector graphics file 68 further includes a day palette 72 and a night palette 74. Day and night palettes 72 and 74 define the colors that are used to render the image of the navigation chart.
  • Day and night palettes 72 and 74 are an optional component of vector graphics file 68.
  • a navigation chart may only have a single color palette associated with it, in which case vector graphics file 68 would include only that single palette, rather than two palettes.
  • Alternatively, vector graphics file 68 may include multiple palettes, such as day and night palettes, or no palettes at all.
  • Day and night palettes generally refer to color palettes that are used to render a navigation chart image during different times of the day, or during other times when the ambient lighting surrounding a display, such as display 28, changes.
  • the day and night palettes 72 and 74 of vector graphics file 68 define two different sets of colors, the former a set of colors appropriate for displaying during high light level conditions, such as the day time, and the latter appropriate for display during low light level conditions, such as at night.
  • computer 70 renders an unverified day chart image from the vector graphics file 68.
  • a “day chart” refers to a navigation chart that is rendered using the colors specified in day palette 72. As noted, these colors generally make the chart more easily viewable during the day time hours.
  • a “night chart”, which is rendered at step 90, refers to a chart that is rendered using the colors specified in night palette 74, which are generally appropriate for viewing at night time. The information contained in a day chart and the corresponding night chart is the same. The only difference is the selection of colors used when displaying the chart.
  • the rendering of the day chart in step 82 may be accomplished in any of a variety of known manners.
  • the rendering takes place inside a memory of computer 70.
  • Such an internal rendering of the day chart may be accomplished using known techniques, such as the GetDIBits function of the Microsoft Windows® operating system. Other known functions of the Windows® operating system may also be used to render the chart at step 82.
  • the rendering of the day chart defines a set of pixels that, when combined in the appropriate manner, create an image replicating the image of the navigation chart. While the present invention contemplates that the day chart could be rendered in step 82 with a variety of different resolutions, one acceptable resolution is to render the day chart using 2,048 pixels along the longest side of the day chart.
  • If the navigation chart is square, the rendering at step 82 will result in the creation of an image having 2,048 x 2,048 pixels. If one side of the chart is shorter than the other side, then the longer side will have 2,048 pixels and the shorter side will be divided into a smaller number of pixels corresponding to the shorter length of that side of the image.
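The proportional sizing described above can be sketched as a small helper function (a hypothetical illustration; the `target_dimensions` name is an assumption, while the 2,048-pixel default comes from the example resolution discussed in the text):

```python
def target_dimensions(chart_width, chart_height, longest_side=2048):
    """Return pixel dimensions for a rendered chart: the chart's longest
    side receives `longest_side` pixels, and the shorter side is divided
    into a proportionally smaller number of pixels."""
    scale = longest_side / max(chart_width, chart_height)
    return round(chart_width * scale), round(chart_height * scale)
```

For a square chart this yields 2,048 x 2,048 pixels; a chart twice as wide as it is tall would render at 2,048 x 1,024.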
  • the rendering of the day chart at step 82 may be accomplished with the assistance of a conventional graphics card, such as would be known by one of ordinary skill in the art.
  • One suitable system for rendering the day chart in step 82 is the ATI Catalyst® graphics software for Microsoft Windows that is available from Advanced Micro Devices of Sunnyvale, California.
  • Other graphics software may also be used within the scope of the present invention.
  • the result of rendering the day chart at step 82 is the definition of a plurality of pixels. Each of these pixels has a specific color associated with it.
  • the result of step 82 is the creation of pixels that are defined by a 32 bit quad RGB value.
  • the 32 bit quad RGB format uses 8 bits to define a red value, 8 bits to define a green value, 8 bits to define a blue value, and 8 bits that are not used.
  • the number of colors in the day chart that is rendered at step 82 will likely, but not necessarily, be different than the number of colors defined in the day palette 72 of vector graphics file 68.
  • the vector graphics files of aircraft navigation charts marketed by Jeppesen, Inc. of Englewood, CO contain night and day palettes 72 and 74 that each contain a maximum of 32 colors.
  • the resulting day chart created at step 82 will typically have more than 32 colors.
  • the various conventional graphics software that can be used within the present invention to render the day chart will typically add additional colors for anti-aliasing purposes. Some of the pixels defined in step 82 will therefore have colors that have been created for anti-aliasing purposes.
  • anti-aliasing colors will likely be different than the original colors specified in the day palette 72 of the vector graphics file 68. As will be explained in more detail below, the present invention, in one embodiment, limits the number of anti-aliasing colors so that the generated raster graphics file 25 consumes a reduced amount of memory.
  • Comparison step 84 is undertaken by computer 70 in order to verify that the rendered day chart has been properly rendered and contains no artifacts.
  • Comparison step 84 produces a verified image of the daytime version of the navigation chart.
  • computer 70 runs a color reduction and palettization process 86.
  • the color reduction and palettization process 86 will be described in greater detail below with respect to FIGS. 10, 11, and 12.
  • color reduction and palettization process 86 will reduce the number of anti-aliasing colors produced during step 82 to a predetermined threshold. Further, process 86 will create a day color palette to which each of the individual pixels will be indexed.
  • The result of process 86 will be a raster graphics file that uses less memory than it otherwise would if the file were created directly from the verified image data output at step 84.
  • the color reduction and palettization process 86 is thus an advantageous process, but not a critical step in chart conversion method 66.
  • At step 88, computer 70 creates a night palette.
  • the creation of the night palette at step 88 is dependent upon the rendering of the night chart in step 90, as will be explained further below.
  • the rendering of the night chart at step 90 is performed in the same manner as the rendering of the day chart at step 82. The only difference is that different colors are used in the night color rendition than in the day color rendition.
  • the rendering of the night chart at step 90 may be followed by an optional comparison step 92 which is carried out in the same manner as step 84.
  • the night palette created at step 88 will define a color for each of the pixels in the day image of the navigation chart. Each of the pixels will have an associated index value that corresponds to one of the colors in the night palette.
  • At step 94, computer 70 extracts the data from vector graphics file 68 that is necessary to define the metadata 27.
  • this metadata 27 may include the information listed in FIGS. 4 and 5, merely a fraction of this information, or additional information beyond what is listed in FIGS. 4 and 5.
  • the contents of the metadata 27 may vary depending upon the specific type of navigation chart.
  • Injection step 96 combines the metadata 27 with the image data and palette data that was created at step 88.
  • the metadata 27 may be stored separately from the image and palette data. If this separate storage is desired, then step 96 would be omitted and the metadata 27 would be saved into whatever memory (such as memory 24) it was desired to store it in. This can be done at step 101.
  • the day image, night palette, and day palette from step 88 may also be saved in the memory at step 101 without combination with the metadata at step 96.
  • Step 98 creates a conventional bitmap file in which the pixels corresponding to the navigation chart are stored as image data in the conventional image data block of the bitmap file, and the metadata is stored in an optional data block that is part of the conventional definition of the bitmap file standard. This will be described in more detail below with respect to FIG. 15. Alternatively, if it is desired to convert the bitmap file into a raster graphics file having a format different than the bitmap format, this can also be done.
  • Such a different format may be desirable for different types of target display systems.
  • the details of converting a bitmap file into a different type of target file will be described in more detail below with respect to FIG. 17, along with an optional feedback process 100 that may be used to confirm that the raster graphics file was properly generated.
  • An illustrative example of a raster graphics file 25 in a format different than the bitmap file of step 98 will also be described in more detail below with respect to FIG. 16.
  • FIG. 9 illustrates in greater detail the process involved in comparison steps 84 and 92.
  • FIG. 9 will be described with reference to comparison step 84, which corresponds to the comparison step undertaken with respect to the day chart image created at step 82. It will be understood, however, that the following description is equally applicable to comparison step 92, which is used in conjunction with the night chart image generated at step 90.
  • the comparison step illustrated in FIG. 9 involves a first image capture method 104 and a second image capture method 106. Both of the image capture methods 104 and 106 result in the definition of pixels that create an image of the navigation chart 36.
  • the first image capture method 104 involves defining the pixels for the entire navigation chart 36.
  • the second image capture method 106 involves defining a plurality of pixels of two different parts of the navigation chart that together make up a complete image of the chart.
  • second image capture method 106 may define the pixels for the top half of the navigation chart in a first step and define the pixels for the bottom half of the navigation chart in a second step.
  • second image capture method 106 could involve tiling the image of the navigation chart into more than two different pieces. The individual tiles of the image would then be pieced back together to define an entire image of the navigation chart.
  • the number of tiles can be varied from two to any number greater than two.
  • image capture method 106 may involve capturing the top half of the image separately from the bottom half, it is also possible to capture the left half separately from the right half of the image of the navigation chart. Alternatively, still different portions of the image may be individually captured with second method 106.
  • Step 108 determines whether the results of steps 104 and 106 match. If the images do match, then computer 70 selects the image generated by either step 104 or step 106 and proceeds to the next step in chart conversion method 66 (FIG. 8) using the selected image data.
  • If the images do not match, computer 70 indicates this mismatch to the operator of computer 70.
  • the operator may then instruct computer 70 to re-start the steps illustrated in FIG. 8 to see if a repetition of the steps will result in a match at step 84 (or 92).
  • the computer may take other actions in response to a mismatch from capture methods 104 and 106.
  • The purpose of steps 104, 106, and 108 is to help ensure that the rendering steps 82 and 90 have generated a plurality of pixels that accurately represent the image of the navigation chart. While comparison steps 84 and 92 are both optional in the present invention, they add a degree of safety to the overall conversion process of the present invention. This added safety can be especially helpful when attempting to certify the methods of the present invention to meet industry standard safety levels, such as those set forth in the DO-178B or DO-200A standards. More particularly, comparison steps 84 and 92 help ensure that no artifact is introduced into the pixel data during the previous image rendering of steps 82 and 90. This is accomplished by rendering the entire image at step 104 and various pieces at step 106 which are then re-combined.
  • If the rendering of the image at step 104 introduces any visual artifact, such as a Microsoft Windows® logo, window, message, or any other undesirable item not part of the navigation chart, comparison step 108 will likely detect this, because the artifact will end up at different locations in the outputs of steps 104 and 106. Thus, for example, if step 104 introduces an artifact in the lower left corner of the image of the entire navigation chart, second image capture method 106 will also produce the same artifact in the lower left corner of each of the pieces of the image that are captured during step 106.
  • If step 106 captures the image by separately capturing the top half of the image and then separately capturing the bottom half of the image, each of the two halves of the image will include the same artifact in the lower left hand corner.
  • When the top half and bottom half of the image captured in step 106 are combined together into a single image of the entire navigation chart, there will be two artifacts, one in the lower left hand corner of the top half, and one in the lower left hand corner of the bottom half.
  • When the entire image captured in step 106 is compared with the entire image captured at step 104 in comparison step 108, they will not match.
  • the output of step 104 will have a single artifact in the lower left hand corner, while the output of step 106 will have two artifacts.
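The dual-capture verification above can be illustrated with a toy sketch (hypothetical code; the `render_region` callback stands in for whatever platform capture call is actually used, and an image is represented as a list of pixel rows):

```python
def captures_match(render_region, height):
    """First capture method: grab the whole image in one pass.  Second
    method: grab the top and bottom halves separately and reassemble
    them.  An artifact stamped at a fixed position by the capture itself
    lands once in the full capture but once per tile in the reassembled
    capture, so the two results differ and the mismatch is detected."""
    full = render_region(0, height)
    tiled = render_region(0, height // 2) + render_region(height // 2, height)
    return full == tiled

# A clean renderer produces the same pixels regardless of tiling ...
def clean_renderer(y0, y1):
    return [[0, 0, 0, 0] for _ in range(y0, y1)]

# ... while a faulty one stamps an artifact in the lower-left corner
# of every capture it produces.
def artifact_renderer(y0, y1):
    rows = [[0, 0, 0, 0] for _ in range(y0, y1)]
    rows[-1][0] = 9
    return rows
```

Here `captures_match(clean_renderer, 4)` returns True, while `captures_match(artifact_renderer, 4)` returns False, mirroring the single-artifact versus two-artifact mismatch described in the text.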
  • FIG. 10a depicts a raster graphics day image 110 having a height 112 and a width 114. While image 110 in FIG. 10a is a blank image, image 110 would normally be an image of a navigation chart 36. For example, image 110 could be an image of the chart illustrated in FIG. 3. Alternatively, image 110 could be an image of any navigation chart 36 used in accordance with the methods and systems of the present invention.
  • Image 110 is the image that is generated at step 82 and it may either be verified at step 84, or the step of verification may be omitted. Image 110 is fed into the color reduction and palettization process 86.
  • Image 110 consists of a plurality of pixels 116.
  • FIG. 10a only illustrates a fraction of the pixels 116 which comprise the entire image 110.
  • the precise number of pixels 116 that can be used to define image 110 can vary within the scope of the present invention. As noted, however, one embodiment of the present invention defines 2,048 pixels along the longer edge of image 110. In the illustration of FIG. 10a, the height 112 dimension of image 110 would thus be divided into 2,048 pixels as it is longer than width dimension 114. Other numbers of pixels can be used within the scope of the present invention.
  • Each pixel 116 has a color value associated with it. While the length of this color value can be varied within the scope of the present invention, a 32 bit length will be used for purposes of discussing FIGS. 10a-10e.
  • FIG. 10a illustrates a 32 bit color value 118 in which bits 0-7 define a blue value, bits 8-15 define a green value, bits 16-23 define a red value, and bits 24-31 are unused.
  • Image 110 will consume an amount of memory equal to the number of pixels 116 multiplied by 32 bits (not counting the palette data). Because this may be an unacceptably large amount of memory, the size of color values 118 may be reduced via process 86 in a manner that is illustrated more clearly in FIGS. 10b-10e and 11.
  • After rendering image 110 at step 82, computer 70 tabulates all of the different color values 118 that result.
  • An example of one such tabulation is depicted in FIG. 10b.
  • the number of colors tabulated may be greater than the number of colors originally defined in the day palette 72 of vector graphics file 68. This is because the rendering step 82 may create a number of anti-aliasing colors that are added to image 110. The total number of colors in image 110 therefore may exceed that in the original vector graphics file 68.
  • Unreduced day palette 119 is a table that includes all of the color values 118 that are used in image 110 and that omits all of the color values that are not used in image 110. In the example illustrated in FIG. 10c, there are 390 different colors defined in image 110 (0 through 389).
  • computer 70 After computer 70 has generated unreduced day palette 119, computer 70 utilizes a color reduction process 120 illustrated in FIG. 11 to create a reduced day palette 144. At step
  • At step 121, computer 70 counts the total number of colors in unreduced day palette 119.
  • Process or step 126 may best be understood with an example.
  • Suppose, for example, that the unreduced day palette 119 contained 155 colors (rather than the 390 listed in FIG. 10c).
  • In that case, computer 70 would store an eight bit index value (which could have 256 different values) for each pixel 116 that corresponded to the correct color value for that pixel in palette 119.
  • If computer 70 determines at step 122 (FIG. 11) that there are more color values in unreduced day palette 119 than the predetermined threshold number, then it moves to step 128.
  • computer 70 calculates the color distance between each and every pair of different colors in unreduced day palette 119.
  • the number of color distances calculated at step 128 will be equal to (N)(N-1)/2, where N is the number of color values in palette 119.
  • Computer 70 would therefore compute (390 x 389)/2 color distances. This is equal to 75,855 color distances.
  • FIG. 12a illustrates that the distance between each color is found by squaring the difference between each of the individual color components in the color values 118.
  • the color values 118 consist of a red value, a green value, and a blue value.
  • the color distance calculation involves squaring the difference between the red values in a pair, squaring the difference between the green values in the pair, and squaring the difference between the blue values in the pair. These squared values are then summed together and their square root may optionally be taken. Taking the square root would produce a true color distance, but this is not necessary because the pairs of colors will be arranged in a distance table 127 (FIG. 12b) in order of increasing distance, and squaring does not alter that order.
  • As used herein, the term "color distance" will refer to the actual color distance or the square of the actual color distance, as well as any other values that correlate to the color distance in a manner that does not alter the order of the color distances from shortest to longest, or vice versa.
  • FIG. 12a illustrates a calculation of the squared distance between the pair of color values having index entries of 158 and 200.
  • the 158 color value has a hexadecimal value of 0x8F6A2D.
  • the 200 color value has a hexadecimal value of 0x91A434.
  • the squared difference between the red values is first computed. In this case, the squared distance between the red values is equal to the square of the difference between hexadecimal 8F and hexadecimal 91.
  • the squared distance between the green values is computed. This distance is equal to the square of the difference between hexadecimal 6A and hexadecimal A4.
  • the squared distance between the blue values is calculated. This squared distance is equal to the square of the difference between hexadecimal 2D and hexadecimal 34. The sum of these squared distances between the red, green, and blue values is then determined. The sum is equal to the squared distance between the color values indexed at entries 158 and 200. As noted, this distance may be left as a squared value or a square root could be taken to determine an actual distance. In the illustration of FIGS. 12a and 12b, the square root is not taken because this requires less computation and the ordering of the results is the same as when the square root is determined.
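The worked example of FIG. 12a can be reproduced with a short routine (a sketch; the function name is an assumption, and color values are taken to be 24-bit 0xRRGGBB integers as in the figure):

```python
def squared_color_distance(c1, c2):
    """Sum of the squared differences of the red, green, and blue
    components of two 0xRRGGBB color values.  The square root is
    omitted, since it does not change the ordering of distances."""
    total = 0
    for shift in (16, 8, 0):  # red, green, blue bytes
        diff = ((c1 >> shift) & 0xFF) - ((c2 >> shift) & 0xFF)
        total += diff * diff
    return total
```

For the pair above, `squared_color_distance(0x8F6A2D, 0x91A434)` returns 3417, matching the distance table entry for index pair 158 and 200 in FIG. 12b.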
  • computer 70 determines the distance (or distance squared) between each of the color values in unreduced palette 119. After computer 70 has made its calculation of color distance, it arranges the color distances (or color squared distance) in color distance table 127 in a manner starting from the smallest color distance (or color squared distance) to the largest distance (or color squared distance).
  • FIG. 12b illustrates an example of such a color distance table.
  • the color index values 158 and 200 refer to colors that are separated by a squared distance of 3417.
  • the pair of color index values 388 and 389 refer to colors that are separated by a color distance squared of 36.
  • the color index values 0 and 1 are separated by a distance squared of 9.
  • the color pair consisting of index color values 200 and 209 has the shortest distance between its colors. Specifically, the distance between these colors is only 1. While color distance table 127 is shown in FIG. 12b as only containing six separate distance entries, as mentioned above, it would actually contain (390 x 389)/2 total entries. These additional entries have been omitted for purposes of ease of illustration.
  • After computer 70 has computed the color distance (or color distance squared) between each pair of color values at step 128, it moves on to step 130 (FIG. 11) where it determines which color pair has the shortest distance (or shortest distance squared) between it. As noted, in the example of FIG. 12b the color pair with the shortest distance between it consists of colors having index values of 200 and 209. After determining the color pair with the shortest distance at step 130, computer 70 determines at step 132 (FIG. 11) whether both of the color values in that pair are object colors.
  • An "object color" refers to a color that was originally defined in the day palette 72 of vector graphics file 68.
  • If both of the color values in the pair are object colors, computer 70 leaves those two color values unchanged and returns to step 130, where it then determines the color pair with the next shortest distance. For example, in reference to FIG. 12b, computer 70 will first determine at step 132 whether index colors 200 and 209 are both object colors or not. If both of these colors are object colors, then computer 70 would return to step 130 and determine the pair of colors with the next shortest distance between them, which in this case would be the colors with index values 0 and 1. As can be seen in FIG. 12b, index colors 0 and 1 have a color distance squared of 9. Computer 70 would then determine whether the colors with index values of 0 and 1 were both object colors at step 132. If they were, it would return to step 130 and find the color pair with the next shortest distance. This pattern would continue until computer 70 eventually located a color pair in which at least one of the colors was not an object color.
  • At step 134, computer 70 determines whether both of the colors in the color pair are non-object colors. (A non-object color is a color not defined in the day palette 72 of vector graphics file 68.) If both of the colors are non-object colors, computer 70 will proceed to step 136. At step 136, computer 70 replaces the less frequently used color value in the pair with the more frequently used color value in the pair. For example, if the particular color pair consisted of color index values 0 and 1, as referenced in FIG. 10b, color index value 0 is used in 15,840 pixels, while color index value 1 is used for only 20 pixels.
  • the 20 pixels that were previously assigned a color index value of 1 (which corresponds to the 32 bit RGB value 0x000003) would be re-assigned to the index color value 0 (which corresponds to the hexadecimal color value 0x000000).
  • the total number of colors in the day palette 119 would be reduced by one.
  • Step 138 determines whether the reduced number of color values is now equal to or less than the color threshold 124. Because step 136 replaces one color value with another color value, color palette 119 now consists of one less color value than it had prior to step 136. Step 138 determines whether this reduced number of color values is equal to or less than threshold number 124. If it is, computer 70 jumps to step 126. If it is not, computer 70 returns to step 130 where it then determines the color pair having the next shortest distance. The next shortest color pair will be the next shortest pair out of those colors that still remain in palette 119.
  • If computer 70 determines at step 134 (FIG. 11) that both of the color values in a particular pair are not non-object colors (which, in conjunction with step 132, means that one color is an object color and one is not an object color), then it proceeds to step 140.
  • At step 140, computer 70 replaces the non-object color with the object color in day palette 119.
  • Step 138 then determines whether the reduced number of colors in palette 119 is less than or equal to threshold number 124. The outcome of that determination dictates whether computer 70 then proceeds to step 126 or continues to repeat the cycle of steps that start at step 130.
  • the only colors that are changed in process 120 are the anti-aliasing colors that were added during the chart rendering of step 82. Further, the anti-aliasing colors that are changed are those that are close to other colors in the unreduced day palette 119 and that are used relatively less frequently than the colors that are not changed. This results in a reduced visual impact on the image 110.
  • the reduction process 120 of FIG. 11 therefore preserves the original colors of vector graphics file 68 while minimally impacting the anti-aliasing colors introduced at step 82.
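The merge loop of steps 128 through 140 can be summarized in Python (a sketch under assumed data structures, not the patented implementation: `palette` maps index to a 0xRRGGBB value, `counts` maps index to pixel usage, and `object_colors` holds the original vector-palette colors; all helper names are illustrative):

```python
from itertools import combinations

def color_distance_sq(c1, c2):
    # Squared RGB distance, as in FIG. 12a (square root omitted).
    return sum((((c1 >> s) & 0xFF) - ((c2 >> s) & 0xFF)) ** 2
               for s in (16, 8, 0))

def reduce_palette(palette, counts, object_colors, threshold):
    """Repeatedly merge the closest color pair in which at least one
    color is a non-object (anti-aliasing) color, until the palette is
    within the threshold.  Returns a map: old index -> surviving index."""
    remap = {i: i for i in palette}
    live = set(palette)
    while len(live) > threshold:
        pairs = [(color_distance_sq(palette[a], palette[b]), a, b)
                 for a, b in combinations(sorted(live), 2)
                 if palette[a] not in object_colors
                 or palette[b] not in object_colors]
        if not pairs:                      # only object colors remain
            break
        _, a, b = min(pairs)               # step 130: shortest distance
        if palette[a] in object_colors:    # step 140: keep the object color
            keep, drop = a, b
        elif palette[b] in object_colors:
            keep, drop = b, a
        elif counts[a] >= counts[b]:       # step 136: keep the more
            keep, drop = a, b              # frequently used color
        else:
            keep, drop = b, a
        counts[keep] += counts.pop(drop)
        live.remove(drop)
        for old in remap:                  # re-point merged pixels
            if remap[old] == drop:
                remap[old] = keep
    return remap
```

Using the example above, merging non-object color 0x000003 (20 pixels) into object color 0x000000 (15,840 pixels) re-points those 20 pixels to index 0 and shrinks the palette by one.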
  • each of the pixels 116 can be correlated with an 8 bit color index value (FIG. 10e). This reduces the amount of memory necessary to store image 110.
  • the result of step 82 may be an image 110 in which each pixel 116 is represented by a 32 bit color value 118.
  • each pixel 116 will be represented by an indexed color value 142 (FIGS. 10d-10e) that can be stored as an 8 bit value. Consequently, color reduction process 120 will reduce the data necessary to store image 110 by approximately 75 percent (32 bits per pixel to eight bits per pixel plus a color palette).
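The approximately 75-percent saving can be checked with simple arithmetic (an illustrative calculation; it assumes a 2,048 x 2,048 image, 8-bit indices, and 4-byte palette entries, and the function name is hypothetical):

```python
def image_storage_bytes(num_pixels, palette_size=256, palette_entry_bytes=4):
    """Compare direct 32-bit-per-pixel storage with 8-bit indexed
    storage (one byte per pixel plus the palette itself)."""
    direct = num_pixels * 4
    indexed = num_pixels * 1 + palette_size * palette_entry_bytes
    return direct, indexed

direct, indexed = image_storage_bytes(2048 * 2048)
# indexed storage comes to just over one quarter of direct storage
```

The palette overhead (here 1,024 bytes for 256 entries) is negligible next to the per-pixel saving, which is why the text quotes roughly 75 percent.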
  • each pixel 116 in the reduced image of the navigation chart has an indexed color value 142 associated with it, rather than a direct color value.
  • This index color value 142 identifies an entry in reduced day palette 144, such as is illustrated in FIG. 10d.
  • a process 154 for creating a raster graphics night palette is depicted in FIGS. 13 and 14.
  • the original vector graphics file 68 of the navigation chart 36 includes a mapping 146 that correlates day palette 72 with night palette 74, such as is illustrated in FIG. 13a.
  • Day/night mapping 146 maps the colors of the day palette 72 of the navigation chart 36 to the night colors in night palette 74.
  • Day/night mapping 146 provides information enabling a user of vector graphics file 68 to render a night image. Specifically, day/night mapping 146 enables the user of vector graphics file 68 to replace the day color values when a night time rendition of the navigation chart 36 is desired. While day/night mapping 146 illustrates only 32 colors, it will be understood that the present invention is applicable to vector graphics files that include more or less than 32 colors.
  • computer 70 renders a raster graphics image of a night version 152 (FIG. 13d) of the navigation chart 36 at step 90 (FIG. 8).
  • Step 90 is carried out in the same manner as step 82 using the night palette 74 of vector graphics file 68 rather than the day palette 72 (which is used in step 82).
  • comparison step 92 ensures that the rendered image is an accurate representation of the corresponding navigation chart.
  • the night image 152 is used to create the night palette at step 88.
  • While the night palette 74 of the vector graphics file 68 in the example illustrated in FIG. 13a includes only 32 colors, the raster graphics image 152 of the night chart that is rendered will likely include more than 32 colors. This is because the conventional software and/or hardware that may be used to render the image into a raster graphics image will likely add anti-aliasing colors to the raster graphics night image 152. The raster graphics night image 152 will therefore likely include more than the original 32 colors specified in the vector graphics night palette 74.
  • FIG. 14 depicts a raster graphic night palette creation process 154 that is carried out by computer 70.
  • Raster graphics night palette creation process 154 utilizes the reduced, raster graphics day palette 144, such as that illustrated in FIG. 10d.
  • Process 154 begins by choosing one of the index values in reduced day palette 144. While the initial index value chosen can be any of the values in palette 144, an initial value of zero will be chosen for purposes of discussion herein. This index value will be referred to as value X in FIG. 14 and the accompanying discussion. While any initial value of X may be chosen, and the order of selecting subsequent index values from palette 144 can vary in any manner, night palette creation process 154 will eventually address every index value in day palette 144. It therefore will be more convenient to describe process 154 with an index value X that starts at zero and increments to the highest value in palette 144.
  • index value X identifies an entry in the reduced, raster graphics day palette 144.
  • X will be incremented from 0 all the way up to the highest index value in palette 144, which is 255. Thus, in the example of FIG. 14, X will be incremented from 0 to 255 during the night palette creation process 154.
  • Night raster graphic palette creation process 154 begins at step 156, where a set of pixels D in the raster graphics day image 110 is identified. Specifically, the pixels having an index value of X are identified. With respect to the example depicted in FIG. 13b, computer 70 identifies at step 156 the 15,840 pixels that have the color value 0x000000. At step 158, computer 70 identifies all of the pixels in the night image 152 that have the same physical location within the image as the pixels in set D. These pixels constitute a set N.
  • At step 160, computer 70 determines whether the color defined by the index value X is an object color in the vector graphics day palette 72. If it is, computer 70 proceeds to step 164. If it is not, computer 70 proceeds to step 166.
  • At step 166, computer 70 determines the average color value of all of the pixels in the set N.
  • This average value is computed separately for each color component.
  • Because the colors are defined as shades of red, green, and blue, the red values are averaged, the green values are averaged, and the blue values are averaged.
  • The averages of these red, green, and blue values define an average color.
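As a minimal sketch of the per-channel averaging performed at step 166 (written in Python; the pixel values shown are hypothetical, chosen only for illustration):

```python
def average_color(pixels):
    """Average a list of (r, g, b) tuples channel by channel."""
    n = len(pixels)
    return tuple(sum(p[ch] for p in pixels) // n for ch in range(3))

# Three hypothetical night-image pixels that share one day-palette index:
night_pixels = [(0x10, 0x20, 0x30), (0x20, 0x30, 0x40), (0x30, 0x40, 0x50)]
print(average_color(night_pixels))  # (32, 48, 64), i.e. 0x203040
```

Each channel is averaged independently, so the result is a single representative color for the whole pixel set.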
  • At step 168, computer 70 sets the raster graphics night palette entry for the index value X equal to the average color value determined at step 166. Thereafter, computer 70 increments the value of X at step 170. After incrementing X at step 170, computer 70 determines at step 172 whether X is equal to the threshold color value 124 discussed previously. If it is, the raster graphic night palette creation process 154 is complete and the entire night palette 178 has been created. If it isn't, computer 70 returns to step 156 and the cycle depicted in FIG. 14 repeats until the entire night palette 178 has been created.
  • If it is determined at step 160 that the color defined by index value X is an object color in the vector graphics day palette 72, computer 70 proceeds to step 164.
  • At step 164, computer 70 determines whether any of the pixels 116 in the set N have a color value that is listed in the vector graphics night palette 74. If it is determined at step 164 that none of the pixels in set N have a color value from the vector graphic night palette 74, then computer 70 proceeds to step 166 and follows the procedures of step 166, as has been described previously. If computer 70 determines at step 164 that at least one of the pixels 116 in set N has a color value listed in the vector graphics night palette 74, then computer 70 proceeds to step 174.
  • At step 174, computer 70 sets the raster graphics night palette entry having the index value X equal to the value in the vector graphics night palette 74 that corresponds to the vector graphics day color with the same index value X. Thereafter, computer 70 proceeds to increment X at step 170 in the manner that has been described previously.
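The loop of steps 156-174 can be sketched end to end as follows. This is an illustrative Python rendering, not the patent's implementation: the data representations (a 2-D list of palette indices for the day image, a 2-D list of RGB tuples for the night image, a set of object colors, and a day-to-night color mapping) are assumptions made for the sketch.

```python
def build_night_palette(day_img, night_img, day_palette,
                        object_colors, vec_night_map):
    """Sketch of night palette creation process 154.

    day_img       : 2-D list of palette indices (8-bit day image 110)
    night_img     : 2-D list of (r, g, b) tuples (rendered night image 152)
    day_palette   : list of (r, g, b) day colors (reduced day palette 144)
    object_colors : set of day colors that are object colors (step 160)
    vec_night_map : day color -> night color (vector palettes 72 / 74)
    """
    night_palette = []
    for x, day_color in enumerate(day_palette):          # steps 156/170/172
        # Set N: night pixels at the locations of day pixels with index x.
        set_n = [night_img[r][c]
                 for r, row in enumerate(day_img)
                 for c, idx in enumerate(row) if idx == x]
        entry = None
        if day_color in object_colors:                   # step 160 -> 164
            candidate = vec_night_map.get(day_color)
            if candidate in set_n:                       # step 164 -> 174
                entry = candidate
        if entry is None:                                # step 166: average
            n = max(len(set_n), 1)
            entry = tuple(sum(p[ch] for p in set_n) // n for ch in range(3))
        night_palette.append(entry)                      # step 168 / 174
    return night_palette
```

Note that, by construction, the result has exactly one entry per day-palette entry, which matches the observation later in the text that the night palette 178 and day palette 144 have the same number of index entries.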
  • FIGS. 13a and 13b provide an illustrative example of the night palette creation process 154.
  • In this example, the index value X has been set to 145.
  • The selection of the value 145 for X has been made herein merely for purposes of illustration and does not connote any significance with respect to any of the other values to which X may be set.
  • Computer 70 identifies a set of pixels D in the raster graphic day image 110 that have the color defined by the color index value X.
  • An index value of X equal to 145 identifies an RGB color value of 0x8F6A2D.
  • The reduced day palette 144 indicates that there are 3 pixels having this color value in the raster graphic day image 110.
  • FIG. 13c illustrates the physical location of these 3 pixels, which are labeled a1, a2, and a3. It should be noted that, although day image 110 in FIG. 13c is depicted as a blank image, the actual image 110 would be an image of a navigation chart 36. Image 110 of FIG. 13c has been left blank in order to more clearly explain the night palette creation process 154. In actual use, image 110 may consist of an image like that of the navigation chart 36 depicted in FIG. 3, or any other navigation chart.
  • Night palette creation process 154 identifies a set of pixels N in the night image 152 that have the same locations as the corresponding pixels in the day image 110.
  • In FIG. 13d, the pixels b1, b2, and b3 comprise the set N.
  • The pixels b1-b3 are located in the same locations within night image 152 as the pixels a1-a3 are in the day image 110.
  • Although FIG. 13d illustrates raster graphic night image 152 as being physically larger than raster graphic day image 110, the actual sizes of the two images are the same.
  • Each pixel 116 in the night image 152 is defined by 32 bits of data.
  • In contrast, each pixel 116 in the raster graphic day image 110 is defined by an 8-bit data field.
  • The difference in physical size between the images 110 and 152 depicted in FIGS. 13c and 13d is intended to convey this difference in data sizes, not a difference in the number of pixels.
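To make the storage difference concrete: for a chart image of 1024 x 768 pixels (hypothetical dimensions, not taken from the patent), an 8-bit indexed image needs one byte per pixel while a 32-bit direct-color image needs four:

```python
width, height = 1024, 768            # hypothetical chart dimensions
day_bytes = width * height * 1       # one 8-bit palette index per pixel
night_bytes = width * height * 4     # one 32-bit color value per pixel
print(day_bytes, night_bytes)        # 786432 3145728
```

This four-to-one ratio is why discarding the 32-bit night image 152 and keeping only the 8-bit day image 110 plus two small palettes conserves memory.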
  • The coordinates of the pixel a1 in day image 110 (FIG. 13c) are the same as the coordinates of the pixel b1 in night image 152 (FIG. 13d).
  • At step 160, computer 70 determines whether the color defined by index value X (equal to 145 in this example) is an object color. If it is, computer 70 proceeds to step 164. If not, it proceeds to step 166. At step 164, computer 70 determines whether any of the pixels in the raster graphic night image 152 have a color value that is defined in the vector graphics night palette 74. With respect to the example of FIG. 13d, computer 70 will determine at step 164 whether any of the pixels b1, b2, or b3 have a color value that is listed in vector graphics night palette 74. Depending upon the outcome of that determination, computer 70 will proceed to step 166 or step 174.
  • In the example of FIGS. 13a-13d, if none of the pixels b1, b2, or b3 have a color value that is defined in vector graphic night palette 74, then computer 70 proceeds to step 166, where it averages the colors of pixels b1, b2, and b3 together. This average color value will then be set as the 145th entry in the raster graphic night color palette 178 (technically, the 146th entry, since the palette indices begin at 0). Computer 70 would then increment X and determine the color value for the next entry in the raster graphic night color palette 178 (which would be index value 146).
  • At step 174, computer 70 sets the color value at the index value of 145 in the raster graphics night palette 178 equal to the hexadecimal value 0x686969. This hexadecimal value is determined from the day/night mapping 146. As can be seen therein, the night color 0x686969 corresponds to a day color of 0x8F6A2D. This day color is the day color defined for the index value of 145 in the reduced raster graphics day palette 144.
  • After step 174, computer 70 proceeds to increment X and continues to generate all of the entries in the raster graphics night palette 178 (FIG. 13e).
  • The result of the raster graphic night palette creation process 154 is a raster graphics night palette 178, which will have the same number of index entries as the raster graphic day palette 144. This matching number of entries results because the day palette 144 is used to create the raster graphic night palette 178, and one entry in the night palette 178 is created for each entry in the day palette 144.
  • The present invention can be implemented using different methods to create night palettes besides the night palette creation process 154 described herein. It will also be understood that the night palette creation process could be omitted from the present invention. For example, it would be possible to save both the day image 110 and the night image 152 in a memory, such as memory 24. However, saving both the day image 110 and the night image 152 consumes extra memory. In some applications this may not be an issue, and the present invention can be practiced in those applications. In situations where it is desirable to conserve memory space, the day image 110 is saved along with the day and night palettes 144 and 178, respectively, while the night image 152 is discarded.
  • Upon completion of step 88, computer 70 will have created a day image 110, a reduced day palette 144, and a night palette 178.
  • These three pieces of data may be combined with the metadata extracted at step 94. Alternatively, as noted elsewhere, these three pieces of data can be stored as a raster graphics file 25 separate from the metadata extracted at step 94.
  • If the metadata extracted at step 94 and the data from night palette creation step 88 are to be combined together into a single raster graphics file 25, this is done at step 96.
  • Step 96 can combine this data into a single file in a variety of different ways.
  • In one approach, the metadata is inserted into an optional data block within a standard bitmap file.
  • FIG. 15 depicts the five standard data blocks in a conventional bitmap file 180.
  • Bitmap file 180 includes a file header block 182, a bitmap information header block 184, a color palette block 186, an optional data block 188, and an image data block 190.
  • The file header block 182 includes five separate data fields that are identified as bfType, bfSize, bfReserved1, bfReserved2, and bfOffBits.
  • The bitmap information block 184 includes 11 different data fields that are identified as biSize, biWidth, biHeight, biPlanes, biBitCount, biCompression, biSizeImage, biXPelsPerMeter, biYPelsPerMeter, biClrUsed, and biClrImportant.
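These two blocks follow the standard Windows bitmap layout: a 14-byte BITMAPFILEHEADER followed by a 40-byte BITMAPINFOHEADER. A sketch of parsing them with Python's struct module (field layout per the standard bitmap format, not specific to the patent):

```python
import struct

def read_bmp_headers(data):
    """Parse the file header block 182 and bitmap information block 184."""
    # BITMAPFILEHEADER: bfType, bfSize, bfReserved1, bfReserved2, bfOffBits
    bf_type, bf_size, _, _, bf_off_bits = struct.unpack_from("<2sIHHI", data, 0)
    # BITMAPINFOHEADER: biSize .. biClrImportant (40 bytes at offset 14)
    (bi_size, bi_width, bi_height, _planes, bi_bit_count, _compression,
     _size_image, _x_ppm, _y_ppm, bi_clr_used,
     _clr_important) = struct.unpack_from("<IiiHHIIiiII", data, 14)
    return {"bfType": bf_type, "bfSize": bf_size, "bfOffBits": bf_off_bits,
            "biWidth": bi_width, "biHeight": bi_height,
            "biBitCount": bi_bit_count, "biClrUsed": bi_clr_used}
```

For an 8-bit chart image like day image 110, biBitCount would be 8 and biClrUsed would report the palette size (up to 256 entries).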
  • The color palette block 186 defines a color palette for the raster graphic image that is stored in the image data block 190.
  • Data block 188 represents data that may optionally be stored in a bitmap file 180 in accordance with the defined format for bitmap files.
  • The bitmap standard does not define the format of the data stored in block 188. Instead, data of any format can be stored in block 188.
  • Metadata 27 can therefore be easily stored in block 188 in any suitable format, such as, but not limited to, the formats of FIGS. 4 and 5.
  • The raster graphic night palette 178 can also be stored in data block 188. Injection step 96 accomplishes storage of the metadata 27 and night palette 178 in optional data block 188.
  • Injection step 96 will also adjust the bfOffBits data field in the file header block 182 in accordance with the size of the metadata inserted (as well as the size of the raster graphics night palette 178). More specifically, the bfOffBits data field in data block 182 defines the offset, in bytes, from the beginning of the file to the beginning of the image data in block 190. Thus, the bfOffBits data field should be adjusted in accordance with the size of the metadata (and any other data) inserted into optional data block 188. In conventional bitmap files, the bfOffBits data field is an unsigned 32-bit integer.
  • The optional data block 188 can therefore take on a size of up to 2^32 bytes, minus the bytes contained within bitmap information block 184 and color palette block 186.
  • The bitmap file standard therefore allows ample room within optional data block 188 for the storage of metadata 27 and night palette 178.
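A sketch of the offset bookkeeping performed at injection step 96. The fixed sizes follow the standard bitmap layout; the optional-block size stands in for whatever metadata 27 and night palette 178 occupy:

```python
FILE_HEADER_BYTES = 14      # BITMAPFILEHEADER
INFO_HEADER_BYTES = 40      # BITMAPINFOHEADER
PALETTE_ENTRY_BYTES = 4     # one BGRA quad per palette entry

def adjusted_off_bits(palette_entries, optional_block_bytes):
    """bfOffBits is the byte offset from the start of the file to the
    image data, so it grows by exactly the number of bytes injected."""
    return (FILE_HEADER_BYTES + INFO_HEADER_BYTES
            + palette_entries * PALETTE_ENTRY_BYTES + optional_block_bytes)

# 256-entry palette, nothing injected: pixels start at 14 + 40 + 1024 = 1078.
print(adjusted_off_bits(256, 0))     # 1078
```

If, say, 500 bytes of metadata and night-palette data were injected into block 188, bfOffBits would simply grow by 500.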
  • FIG. 16 illustrates various data blocks for a target file format that has been labeled RAS.
  • RAS is an arbitrary name used herein merely to illustrate one possible alternative file format to the bitmap file format.
  • The RAS file 192 of FIG. 16 includes six separate data blocks.
  • the six data blocks are a raster file header block 194, a metadata information block 196, a bitmap info header block 198, a day palette block 200, a night palette block 202, and an encoded image data block 204.
  • Metadata block 196 stores the metadata 27 described previously.
  • Day palette block 200 stores the raster graphics day palette 144.
  • The night palette block 202 stores the raster graphics night palette 178.
  • The encoded image data block 204 stores the pixels that make up the image of the navigation chart.
  • The term "raster graphics file" is an umbrella term that refers to any type of file containing an image defined in a raster format.
  • A raster graphics file can therefore be bitmap file 180, a RAS file 192, or any other raster graphics file, regardless of format.
  • The steps of the feedback method 100 are depicted in more detail in FIGS. 17-19. Feedback method 100 receives bitmap file 180 after injection step 96 (FIG. 8) has been completed.
  • Feedback method 100 utilizes two software processes known as BMP2Target 206 and Target2BMP 208.
  • The BMP2Target process 206 converts the bitmap file 180 into a target file 210, which may be the RAS file 192 or some other type of raster graphics file.
  • The Target2BMP process 208 reconverts the target file 210 back into a bitmap file.
  • The reconverted bitmap file is compared with the original bitmap file 180 at a comparison step 212. If there are no differences detected at comparison step 212, then the target file 210 is deemed a verified re-creation of the navigation chart 36. If there are differences detected at step 212, appropriate corrective action is undertaken.
  • At step 214, a computer, which may be computer 70 or another computer, reads the bitmap file 180 and encodes the image data using a run-length encoding (RLE) algorithm.
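The patent does not spell out which RLE variant is used; a minimal byte-oriented (count, value) encoder illustrates the idea (this is an assumption for illustration, not the exact encoding of the BMP2Target process):

```python
def rle_encode(data: bytes) -> bytes:
    """Encode runs of equal bytes as (count, value) pairs, count <= 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes((run, data[i]))
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Invert rle_encode: expand each (count, value) pair."""
    out = bytearray()
    for i in range(0, len(data), 2):
        out += bytes([data[i + 1]]) * data[i]
    return bytes(out)
```

Indexed chart imagery tends to compress well under RLE because large areas of a chart share a single palette index.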
  • A target buffer 215 is created at step 216 based on the size of the bitmap file 180.
  • The metadata 27 is then copied into the target buffer 215.
  • The bitmap information 184 is copied into target buffer 215.
  • The raster graphics day palette 144 is written into the target buffer 215.
  • The raster graphics night palette 178 is written into the target buffer 215.
  • The encoded image data from step 214 is copied into the target buffer 215 at step 228. Thereafter, all of the data in the target buffer 215 is compressed at step 230 before being output as the raster graphics file in the target format.
  • FIG. 19 depicts a more detailed overview of the Target2BMP process 208, which is generally the reverse of the BMP2Target process 206.
  • The raster graphics file 192 stored in the target format is decompressed.
  • The output of the decompression step 232 is written into a target buffer 234.
  • The encoded image data is decoded at step 236.
  • A bitmap buffer 240 is created based on the size of the target buffer 234 and the image data that was decoded at step 236.
  • A bitmap file is written into the bitmap buffer 240 based on information extracted from the target buffer 234.
  • The target buffer data is read and the metadata is copied into the bitmap buffer 240.
  • The bitmap information header block 198 is read from the target buffer 234 and copied into bitmap buffer 240.
  • The day palette is copied at step 248 into the bitmap buffer 240, and the night palette is copied at step 250 into the bitmap buffer 240.
  • The resulting bitmap buffer 240 is then compared at comparison step 212 (FIG. 17) with the original bitmap file 180 to determine if there is a match. If there is, the resulting file is a confirmed raster graphics file. If not, appropriate corrective action may be taken.
  • The feedback method 100 helps to ensure that the raster graphics file 25 that is created by chart conversion method 66 is an accurate reproduction of the original navigation chart. This reassurance offered by the feedback method 100 assists in obtaining higher safety ratings for the raster graphics file 25 (and metadata 27, if separate), as well as the software used to render it.
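The convert-reconvert-compare round trip can be sketched as follows. Here zlib stands in for the unspecified compression at step 230 (an assumption for illustration), and the buffer contents are hypothetical; the point is that the comparison at step 212 only passes if conversion is exactly invertible:

```python
import zlib

def bmp2target(bitmap_bytes: bytes) -> bytes:
    """BMP2Target sketch: compress the assembled target buffer (step 230)."""
    return zlib.compress(bitmap_bytes)

def target2bmp(target_bytes: bytes) -> bytes:
    """Target2BMP sketch: decompress back into a bitmap buffer (step 232)."""
    return zlib.decompress(target_bytes)

def verify_round_trip(bitmap_bytes: bytes) -> bool:
    """Comparison step 212: the reconverted file must match the original."""
    return target2bmp(bmp2target(bitmap_bytes)) == bitmap_bytes

original = bytes(range(256)) * 16        # stand-in for bitmap file 180
print(verify_round_trip(original))       # True
```

Because every transformation in the pipeline is lossless, a byte-for-byte match at step 212 certifies the target file as a faithful re-creation of the chart.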


Abstract

The invention concerns a system and method for facilitating the rendering of navigation charts on electronic platforms. The system and method allow electronic platforms to display the navigation charts without placing an undue burden on the electronic platform's computing resources. The electronic platform need not be modified to handle the rendering of TrueType fonts or of images stored in a vector file format. The system can generate electronic navigation chart files that accurately reproduce the image of paper navigation charts while consuming a reduced amount of memory. Color reduction algorithms and methods for generating day and night palettes help reduce the memory needed to store the navigation chart.
PCT/US2008/061386 2007-06-25 2008-04-24 Systèmes et procédés pour générer, stocker et utiliser des cartes de navigation électroniques Ceased WO2009002603A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US94595707P 2007-06-25 2007-06-25
US60/945,957 2007-06-25

Publications (1)

Publication Number Publication Date
WO2009002603A1 true WO2009002603A1 (fr) 2008-12-31

Family

ID=40185978

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/061386 Ceased WO2009002603A1 (fr) 2007-06-25 2008-04-24 Systèmes et procédés pour générer, stocker et utiliser des cartes de navigation électroniques

Country Status (1)

Country Link
WO (1) WO2009002603A1 (fr)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6199015B1 (en) * 1996-10-10 2001-03-06 Ames Maps, L.L.C. Map-based navigation system with overlays
US20070053513A1 (en) * 1999-10-05 2007-03-08 Hoffberg Steven M Intelligent electronic appliance system and method
US20070139411A1 (en) * 2002-03-15 2007-06-21 Bjorn Jawerth Methods and systems for downloading and viewing maps


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8809600B2 (en) 2009-01-15 2014-08-19 Harald Kohnz Process for the production of lower alcohols by olefin hydration
FR3007545A1 (fr) * 2013-06-21 2014-12-26 Thales Sa Procede systeme et programme d ordinateur pour fournir sur une interface homme machine les donnees relatives a un aspect du fonctionnement d un aeronef
US9626873B2 (en) 2013-06-21 2017-04-18 Thales Method, system and computer program for providing, on a human-machine interface, data relating to an aspect of the operation of an aircraft
RU2646355C2 (ru) * 2013-11-22 2018-03-02 Хуавэй Текнолоджиз Ко., Лтд. Решение для усовершенствованного кодирования содержимого экрана
US10291827B2 (en) 2013-11-22 2019-05-14 Futurewei Technologies, Inc. Advanced screen content coding solution
US10638143B2 (en) 2014-03-21 2020-04-28 Futurewei Technologies, Inc. Advanced screen content coding with improved color table and index map coding methods
US10091512B2 (en) 2014-05-23 2018-10-02 Futurewei Technologies, Inc. Advanced screen content coding with improved palette table and index map coding methods
EP3219371A1 (fr) 2016-03-14 2017-09-20 Ineos Solvents Germany GmbH Plateau de colonne de distillation réactive d'hétéroazéotropes et procédé utilisant ledit plateau
WO2018222416A1 (fr) * 2017-06-01 2018-12-06 Qualcomm Incorporated Réglage de palettes de couleurs utilisées pour afficher des images sur un dispositif d'affichage sur la base de niveaux de lumière ambiante
US10446114B2 (en) 2017-06-01 2019-10-15 Qualcomm Incorporated Adjusting color palettes used for displaying images on a display device based on ambient light levels
WO2019133526A1 (fr) * 2017-12-29 2019-07-04 Lyft, Inc. Optimisation de réseaux de transport par l'intermédiaire d'interfaces utilisateur dynamiques
US10852903B2 (en) 2017-12-29 2020-12-01 Lyft, Inc. Optimizing transportation networks through dynamic user interfaces
US11422667B2 (en) 2017-12-29 2022-08-23 Lyft, Inc. Optimizing transportation networks through dynamic user interfaces
US11709575B2 (en) 2017-12-29 2023-07-25 Lyft, Inc. Optimizing transportation networks through dynamic user interfaces
US12321561B2 (en) 2017-12-29 2025-06-03 Lyft, Inc. Optimizing transportation networks through dynamic user interfaces
CN111861856A (zh) * 2019-04-30 2020-10-30 霍尼韦尔国际公司 在驾驶舱显示器上渲染动态数据的系统和方法
EP4002326A1 (fr) * 2020-11-11 2022-05-25 Honeywell International Inc. Système et système d'augmentation dynamique des cartes matricielles affichées sur un affichage de poste de pilotage
US12148314B2 (en) 2021-01-07 2024-11-19 Honeywell International Inc. System and method for dynamically augmenting raster charts displayed on a cockpit display
KR102562491B1 (ko) * 2022-09-13 2023-08-02 한국해양과학기술원 래스터형 전자 해도 설계 방법 및 시스템과 이의 활용 방법


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08746751

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08746751

Country of ref document: EP

Kind code of ref document: A1