US20040169653A1 - Bird's-eye view forming method, map display apparatus and navigation system


Info

Publication number
US20040169653A1
Authority
US
United States
Prior art keywords: map, data, display, bird, projection
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/795,241
Inventor
Yoshinori Endo
Toshio Fujiwara
Hiroyuki Satake
Hiroshi Shojima
Norimasa Kishi
Masaki Watanabe
Motoki Hirano
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority claimed from JP09553595A (JP3474022B2)
Application filed by Individual
Priority to US10/795,241
Publication of US20040169653A1
Status: Abandoned

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3667 - Display of a road map
    • G01C21/367 - Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3626 - Details of the output of route guidance instructions
    • G01C21/3635 - Guidance using 3D or perspective road maps
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G01C21/3667 - Display of a road map
    • G01C21/3673 - Labelling using text of road map data items, e.g. road names, POI names
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 - Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 - Map spot or coordinate position indicators; Map reading aids using electronic means

Definitions

  • the present invention relates to a navigation system for detecting the position of a moving body to inform a user of the current position of the moving body, and a map display apparatus using the navigation system.
  • navigation systems have recently become widely known, and these navigation systems are used while mounted on moving bodies such as vehicles, ships, etc.
  • when such a navigation system is mounted on a moving body, it performs arithmetic and logic processing on information provided from various sensors to detect the current position of the moving body, and displays the current position on a screen of the navigation system.
  • This type of navigation system includes a position detector for detecting the absolute position of the moving body, a storage device for storing map data which include two-dimensional vector data obtained by projecting roads and points on the ground, such as structures, etc., onto a mesh-segmented plane using a universal transverse Mercator projection method, and character data associated with the vector data, an input device for receiving external instructions (commands), and a display device for reading out desired vector data from the mesh-segmented map stored in the storage device in accordance with an instruction input from the input device, performing data conversion processing on the data, and then displaying the map on the display.
  • the data conversion processing includes shift conversion processing for changing the display position on a map, scale conversion (enlarging/reducing) processing for displaying a map on any scale, and rotational coordinate-transformation processing for changing a display direction on the map.
  • the conventional navigation system as described above adopts a plan-map display mode in which the ground is drawn in the vertical orthogonal projection style. Therefore, in order to simultaneously display two spots which are far away from each other, the reduction scale of the map must necessarily be increased, and thus detailed information cannot be displayed on the screen.
  • An object of the present invention is to provide a bird's-eye view forming method for displaying a map of a plane (the ground) on the basis of a bird's-eye view, i.e., a projection of the ground which is obtained by projecting the ground onto any plane set between the ground and a view point when the ground is viewed from the view point located at any height above the ground, a map display apparatus using the bird's-eye view forming method, and a navigation system using the map display apparatus.
  • a bird's-eye view forming method for forming, from map data, drawing (display) data for a map which is to be represented by a bird's-eye view is characterized in that coordinate data contained in the map data are perspectively converted to drawing (display) data based on a bird's-eye view on a desired projection plane with a view point being set to a desired position (i.e., the coordinate data of the map data on a vertical orthogonal projection plane are perspectively converted to drawing (display) data on a desired projection plane based on a bird's-eye view).
  • the bird's-eye view forming method preferably includes a step of receiving an input of the position of the view point, and a step of determining the projection plane so that the drawing (display) positions of two predetermined spots (for example, a current position and a destination) which are obtained through a perspectively projecting transformation on the basis of the coordinates of the two spots in the map data and the position of the view point, are set to predetermined positions.
  • the perspectively projecting conversion is defined as such a data conversion that data on a plan view are converted to data on a bird's-eye view through a perspective projection operation. Therefore, by using the perspectively projecting conversion, a plan-view map is converted to a bird's-eye view map.
  • hereinafter, the perspectively projecting conversion is referred to simply as “perspective projection”.
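  • for concreteness, the conversion can be sketched in the following form. This is a minimal reconstruction, assuming a view point V = (Tx, Ty, Tz) above the map plane z = 0, a projection plane normal to the line of sight at distance d from the view point, and a projection angle θ between the two planes; the symbols d, x_c, y_c, z_c are introduced here only for illustration and are not the patent's notation:

$$
\begin{aligned}
x_c &= x - T_x, \qquad
y_c = (y - T_y)\cos\theta - T_z\sin\theta, \qquad
z_c = (y - T_y)\sin\theta + T_z\cos\theta,\\[2pt]
(x_s,\; y_s) &= \left(d\,\frac{x_c}{z_c},\; d\,\frac{y_c}{z_c}\right),
\end{aligned}
$$

  • here (x, y) is a point of the map data on the ground and (x_s, y_s) is its drawing (display) position on the projection plane. At θ = 0° the formula degenerates to a uniformly scaled plan view; as θ approaches 90°, the scale difference between near-distance and far-distance regions grows, matching the behavior described below.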
  • the bird's-eye view forming method may include a step of receiving an input of a scale, and a step of determining the position of the view point and the projection plane so that the drawing (display) positions of the two predetermined spots (for example, a current position and a destination) which are obtained by performing the perspective projection on the basis of the coordinates of the two spots in the map data and the input scale, are set to predetermined positions, and a scale of a drawn (displayed) map is equal to the input scale.
  • the bird's-eye view forming method may include a step of receiving an input of a projection angle which is defined as an angle at which a plane defined for the map data and the projection plane intersect each other, and a step of determining the projection plane on the basis of the input projection angle and the set view point position.
  • According to further aspects of the present invention, there are provided a map display apparatus for displaying a bird's-eye view by using the bird's-eye view forming method as described above, and a navigation system for displaying a map by using the map display apparatus.
  • map drawing means includes coordinate transforming means, and the coordinate transforming means performs the perspective projection (conversion) on the map data to convert a plan-view map to a bird's-eye view map and to display the converted map on a screen. Therefore, according to the present invention, a user can obtain a bird's-eye view display which is easy to see and in which the user can easily recognize the positional relationship of spots displayed on the map. According to the bird's-eye view forming method of the present invention, the view point can be freely set, so that it can meet users' needs.
  • a navigation system which is excellent in operational performance and convenient for a user can be obtained if it is designed to overcome the following first to seventh problems.
  • preferably, the plan view (plan map) display mode and the bird's-eye view display mode are freely switchable. With this switching operation, both the plan view display and the bird's-eye view display can be made in accordance with a user's requirement, and thus the convenience is enhanced.
  • a second problem occurs, namely, that it is difficult for the user to grasp the positional relationship between two different types of view when the display is switched between the plan view display and the bird's-eye view display. This is because the position of the same spot on the screen varies significantly due to the variation of the view point with respect to the map when the display switching operation is performed; further, spots which have not been displayed until the switching time newly appear, while spots which have been displayed disappear.
  • since the present invention is designed so that the display is freely switchable between the plan view display and the bird's-eye view display, it is further preferable that the intersecting angle between the plane containing the map data and the plane onto which the map is projected is gradually increased or reduced in time series. That is, according to the present invention, it is preferable that the view point is smoothly shifted during the conversion between the plan view display and the bird's-eye view display. With this operation, the shift between the plan view and the bird's-eye view is performed smoothly, and thus the user can easily recognize the positional relationship between spots displayed on the plan-view map and the same spots displayed on the bird's-eye view map.
  • the view point from which to obtain the bird's-eye view may be fixed to a specific position which is set to be away from the current position at a fixed distance and in a fixed direction at all times. Furthermore, the height of the view point may be varied in accordance with an external operation. If the view point is fixed to the specific position as described above, the bird's-eye view is displayed such that the current position is fixed to a point on the screen at all times, so that the user can easily grasp the current position. Furthermore, if the position of the view point can be set to a user's desired position, the convenience can be further enhanced.
  • an instruction regarding the positions of two spots may be accepted, and the view point and the intersection angle between the map-projected plane (bird's-eye view) and the plane containing the map data (plan view) may be determined so that the two spots are displayed at predetermined positions on the screen, whereby the bird's-eye view display is performed.
  • the positional relationship between the two spots can be easily recognized. Further, even when the positions of the two spots vary, these are displayed on the same frame at all times, so that the user can grasp the positional relationship between the two spots with no complicated operation.
  • line backgrounds such as roads, railroad lines, etc., plane backgrounds such as rivers, green zones, etc., and character strings may be perspectively converted and drawn on the bird's-eye view map.
  • when the character strings are perspectively converted, the shape of a character becomes smaller and more deformed as it gets farther from the view point, and is enlarged when it is in the vicinity of the view point; the character strings may therefore be illegible in these cases (fifth problem).
  • the coordinate transforming means of the present invention does not perform the perspective projection on a character image.
  • characters contained in the bird's-eye view are displayed at the same size, so that the character strings are easily legible.
  • drawing (display) judgment means of the present invention displays the character data more preferentially (i.e., increases the display priority rank of the character data).
  • character data having a higher display priority rank may be displayed while being superposed on character data having a lower display priority rank, so that the character string can be prevented from being deficient and it can be made legible.
  • the display height or the distance from the view point is set as a criterion for determining the display priority rank.
  • a character string which is displayed at a lower display height (which is the height from the bottom side of the display frame) on the screen may be displayed more preferentially (i.e., so that the character string is not covered by the other character strings, but is superposed on the other character strings).
  • the character data near to the current position are displayed while being prevented from being covered by the other character strings and thus missed.
  • the deficiency (missing) of the character information (string) near to the current position which the user has a greater need for can be prevented, and the character string can be made legible.
  • the data amount of line data, plane background data and character data to be drawn per unit area is increased in a region which is far away from the view point (hereinafter referred to as “far-distance region”, and this region is a region having a small angle of depression). Accordingly, when the character data of these data are drawn while superposed on the other data, the line data and the plane background data are covered by the character data in the far-distance region, so that it may be difficult to check the shapes of roads, etc. in the far-distance region (seventh problem).
  • the drawing judgment means of the present invention preferably eliminates, from targets to be drawn (displayed), those character strings which are obtained through the perspective projection of the map and which are located at positions higher than a predetermined display height from the bottom side of the display frame.
  • alternatively, the drawing judgement means may eliminate those character strings which are located at positions farther than a fixed distance from the view point.
  • with these operations, the character strings are not displayed (drawn) in a region which is located above the predetermined height or more than the predetermined distance away, that is, in a region where the data amount to be drawn per unit area is considerably increased due to the reduction in angle of depression in the bird's-eye view display mode. Therefore, the road display, the plane background display, etc. are not covered by the character string display, so that the recognition of roads in far-distance regions is not impaired even in the bird's-eye view display mode.
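  • the display-height priority and the far-region elimination described above can be sketched as follows. This is an illustrative Python fragment with invented names (CharString, screen_y, max_display_height), not the patent's implementation:

```python
# Hypothetical sketch of the drawing-judgment step for character strings in a
# bird's-eye view. Strings above a height limit are eliminated; the survivors
# are drawn bottom-up so that strings lower on the screen (nearer the view
# point and the current position) are painted last and stay on top.
from dataclasses import dataclass

@dataclass
class CharString:
    text: str
    screen_x: float
    screen_y: float  # height from the bottom side of the display frame

def judge_character_strings(strings, max_display_height):
    # Eliminate strings in the far-distance region (small depression angle),
    # where drawn data per unit area would otherwise hide roads and backgrounds.
    visible = [s for s in strings if s.screen_y <= max_display_height]
    # Higher priority = lower display height; sort so low strings draw last.
    return sorted(visible, key=lambda s: s.screen_y, reverse=True)

if __name__ == "__main__":
    labels = [CharString("Tokyo Sta.", 120, 30),
              CharString("Ginza", 90, 180),
              CharString("Ueno", 60, 260)]
    for s in judge_character_strings(labels, max_display_height=200):
        print(s.text)  # "Ginza" then "Tokyo Sta."; "Ueno" is eliminated
```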
  • FIG. 1 is a diagram showing an example of a bird's-eye view display drawn (displayed) according to the present invention
  • FIG. 2 is a diagram showing a navigation system of an embodiment according to the present invention.
  • FIG. 3 is a block diagram showing a hardware construction of a processing unit
  • FIG. 4 is a functional block diagram showing the processing unit
  • FIG. 5 is a functional block diagram showing map drawing means
  • FIG. 6 is a diagram showing a perspective projection processing of a map
  • FIG. 7 is a diagram showing a series of coordinate transforming steps for bird's-eye view display
  • FIG. 8 is a flowchart for a processing flow of the coordinate transforming means
  • FIGS. 9A, 9B and 9C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display
  • FIGS. 10A, 10B and 10C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display
  • FIG. 11 is a flowchart for a flow of the perspective projection processing
  • FIG. 12 is a flowchart showing a processing flow of drawing (display) judgment means
  • FIGS. 13A and 13B are diagrams showing an effect of a rearrangement operation of character strings
  • FIGS. 14A and 14B are diagrams showing an effect of a drawing (display) character string selecting operation
  • FIGS. 15A, 15B and 15C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display
  • FIGS. 16A, 16B and 16C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display.
  • FIG. 17 is a perspective view showing the outlook of the navigation system of the embodiment.
  • in this embodiment, the navigation system is mounted on a vehicle such as a car, but the same effect can be obtained when the navigation system is mounted on other moving bodies such as a ship, a train, etc.
  • FIG. 1 shows a bird's-eye view which is displayed by a bird's-eye view map display apparatus installed in a navigation system of an embodiment according to the present invention.
  • the bird's-eye view map display apparatus of this embodiment forms, as a projection drawing of two-dimensional map data (represented by reference numeral 101 in FIG. 1), a bird's-eye view showing a landscape as seen through a bird's eye, and displays it on the display screen of a display.
  • the bird's-eye view is defined as a view of a landscape which is obtained as if it is viewed through the eyes of a bird flying at a specific position which is high above the ground.
  • reference numeral 102 represents a bird's-eye view.
  • a route 104 which is emphatically displayed to indicate a travel course (the emphasis is made with a bold line as shown in FIG. 1, but it may be made using flashing display or discoloration)
  • reference numeral 105 represents a mark (symbol) indicating the current position.
  • arrows (indicated by solid lines and dotted lines) represent a projection from the two-dimensional map data 101 to the bird's-eye view 102 .
  • FIG. 17 is a perspective view showing the outlook of the navigation system for a vehicle according to the embodiment.
  • the navigation system of this embodiment is installed in a housing 69 , and the housing 69 is equipped with a display screen of the display 2 , a scroll key unit 66 , a scale changing key unit 67 , a projection angle changing key unit 68 , a touch panel 70 provided on the display screen of the display 2 , etc.
  • the scroll key unit 66 is responsive to a scroll instruction of an image displayed on the display screen, and has four direction-instruction keys 660 which are responsive to shift-up, shift-down, shift-to-right, shift-to-left instructions, respectively.
  • the scale changing key unit 67 is responsive to an instruction of changing the scale of a map displayed on the screen, and has two scale instructing keys 67 a and 67 b which are responsive to scale-up and scale-down instructions, respectively.
  • the projection angle changing key unit 68 is responsive to an instruction to change the angle of a projection plane of a bird's-eye view to be displayed on the screen, and has two angle instruction keys 68 a and 68 b which are responsive to projection-angle increasing and decreasing instructions, respectively.
  • the touch panel 70 serves as an input means which detects a touch (contact) to the surface thereof to output a touched position on the surface thereof.
  • the keys 66 to 68 may be designed as a software key which is triggered through the touch to a predetermined region on the touch panel 70 .
  • the navigation system for a vehicle includes a processing unit 1 , a display 2 which is connected to the processing unit 1 through a signal line S 1 , a map storage device 3 which is connected to the processing unit 1 through a signal line S 2 , a voice input/output device 4 which is connected to the processing unit 1 through a signal line S 3 , an input device 5 which is connected to the processing unit 1 through a signal line S 4 , a wheel speed sensor 6 which is connected to the processing unit 1 through a signal line S 5 , an attitude (geomagnetic) sensor 7 which is connected to the processing unit 1 through a signal line S 6 , a gyro 8 which is connected to the processing unit 1 through a signal line S 7 , a GPS (Global Positioning System) receiver 9 which is connected to the processing unit 1 through a signal line S 8 , and a traffic information receiver 10 which is connected to the processing unit 1 through a signal line S 9 .
  • the processing unit 1 is a central unit for performing various processing. For example, it detects the current position on the basis of information output from the various sensors 6 to 9 , and reads desired map information from the map storage device 3 on the basis of the obtained current position information. In addition, it graphically develops map data to display the map data while overlapped with a current-position mark, and selects the optimum road which connects the current position and a destination instructed by the user to inform the user of the optimum course with voice or graphic display.
  • the display 2 serves to display graphic information generated in the processing unit 1 , and it comprises a CRT or a liquid crystal display. The signal line S 1 typically carries RGB (Red, Green, Blue) signals or an NTSC (National Television System Committee) signal.
  • the map storage device 3 includes a large-capacity storage medium such as a CD-ROM (Compact Disk-Read Only Memory) or IC (Integrated Circuit) card, and it performs a read-out operation of reading out data held in the large-capacity storage medium in response to an instruction of the processing unit 1 to supply the data to the processing unit 1 , and a write-in operation of writing data supplied from the processing unit 1 into the large-capacity storage medium.
  • the voice input/output device 4 serves to convert a message produced in the processing unit 1 to a voice message to a user, and it also serves to receive a voice input and recognize the content thereof and to transmit the information to the processing unit 1 .
  • the input device 5 serves to receive an instruction input externally, and it includes the scroll key unit 66 , the scale changing key unit 67 , the projection angle changing key unit 68 and the touch panel 70 in this embodiment.
  • the input device 5 of the present invention is not limited to the above construction, and another input means such as a joystick, a keyboard, a mouse, a pen input device or the like may be used.
  • the navigation system of this embodiment has various sensors such as a wheel speed sensor 6 for measuring the distance on the basis of the product of the circumferential length of the wheels of a vehicle and the detected number of rotations of the wheels, and measuring a turn angle of the vehicle on the basis of the difference in number of rotations between the paired wheels, an attitude (geomagnetic) sensor 7 for detecting the earth's magnetic field to measure a direction in which the vehicle is oriented, a gyro 8 such as an optical fiber gyro, a vibrational gyro or the like for measuring a rotational angle of the vehicle, and a GPS receiver 9 for receiving signals from three or more GPS satellites to measure the distance between the vehicle and each of the GPS satellites, and variation of the distance, thereby measuring the current position, the travel direction and the travel azimuth (bearing) of the vehicle.
  • the sensors of the present invention are not limited to the above sensors. For example, in a navigation system mounted in a ship, a Doppler sonar is used
  • the navigation system of this embodiment has a traffic information receiver 10 to receive signals from beacon transmitters or FM (Frequency Modulation) broadcasting stations which transmit traffic information such as information on traffic jams, road repairs and suspension of traffic, and information on parking areas.
  • FIG. 3 shows the hardware construction of the processing unit 1 as described above.
  • the processing unit 1 includes a CPU (Central Processing Unit) 21 for performing numerical operations and a control operation of the devices 22 to 31 , a RAM (Random Access Memory) 22 for holding maps and processing data, a ROM (Read Only Memory) 23 for holding programs, a DMA (Direct Memory Access) 24 for performing data transmission between memories and between memories and the respective devices at high speed, a drawing controller 25 for performing a graphics drawing operation when developing vector data to an image at high speed and performing a display control, a VRAM (Video Random Access Memory) 26 for storing graphic image data, a color palette 27 for converting the image data to RGB signals, an A/D (Analog/Digital) converter 28 for converting analog signals to digital signals, an SCI (Serial Communication Interface) 29 for converting serial signals to parallel signals which are synchronized with a bus, an I/O (Input/Output) device 30 for synchronizing serial signals with parallel signals and then sending the synchronized signals onto the bus, and a counter 31 for counting pulse signals.
  • the processing unit 1 includes user's operation analyzing means 41 , route calculating means 42 , route guiding means 43 , map drawing means 44 , current position calculating means 45 , map match processing means 46 , data read-in processing means 47 , menu drawing means 48 , and graphics processing means 49 as shown in FIG. 4.
  • each of these means 41 to 49 is implemented through execution of instructions stored in the ROM 23 by the CPU 21 .
  • the present invention is not limited to the above implementation, and each of the means may be implemented using a hardware construction such as a dedicated circuit or the like.
  • the current position calculating means 45 serves to time-integrate distance data and angle data which are obtained by integrating distance pulse data measured by the wheel speed sensor 6 and angular velocity data measured by the gyro 8 , to calculate the current position (X′, Y′) of a vehicle after travel on the basis of an initial position (X, Y).
  • the current position calculating means 45 corrects the absolute azimuth of the vehicle travel direction on the basis of the azimuth data obtained by the attitude sensor 7 and the angle data obtained by the gyro 8 so that an angle by which the vehicle has been rotated is matched with a vehicle travel azimuth.
  • the current position calculating means 45 also performs correction processing so that the accumulated errors are canceled on the basis of positional data obtained by the GPS receiver 9 at a predetermined time interval (every one second in this embodiment) to output the corrected data as current position information.
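  • the integration and correction described above can be sketched as follows. This is an illustrative Python fragment; the function names and the blending gain are assumptions, while the periodic GPS correction follows the text:

```python
import math

def dead_reckon(x, y, heading, distance, turn_angle):
    """One update of the current position calculating means 45: distance from
    the wheel speed sensor 6, turn angle from the gyro 8 (with the absolute
    azimuth corrected by the attitude sensor 7)."""
    heading += turn_angle
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading

def gps_correct(x, y, gps_x, gps_y, gain=0.5):
    # Performed at a predetermined interval (every second in this embodiment):
    # pull the dead-reckoned position toward the GPS fix so that accumulated
    # sensor errors are canceled. The gain value here is illustrative.
    return x + gain * (gps_x - x), y + gain * (gps_y - y)
```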
  • the current position information thus obtained still contains minute errors due to the sensors. Therefore, in order to enhance the positional precision of the navigation system, map match processing is performed by the map match processing means 46 .
  • in this processing, data of roads contained in a map around the current position, which are read in by the data read-in processing means 47 , are compared with a travel locus obtained by the current position calculating means 45 to match the current position onto the road having the highest correlation in shape.
  • the current position can be matched to the travel road through the map match processing, so that the current position information can be output with high precision.
  • the user's operation analyzing means 41 analyzes a user's demand or instruction which is input through the input device 5 by the user, and controls the units 42 to 48 to perform the processing corresponding to the demand (instruction). For example, when a demand for route guidance to a destination is input, the user's operation analyzing means 41 instructs the map drawing means 44 to display a map for setting the destination, then instructs the route calculating means 42 to determine a route extending from the current position to the destination, and instructs the route guiding means 43 to supply the route guidance information to the user.
  • the route calculating means 42 searches the nodes connecting two specified points by using the Dijkstra method or the like to obtain a route having the highest priority.
  • the route calculating means 42 has plural criteria for the priority order, and it uses the criterion indicated by a user's instruction to determine the route. In this embodiment, in accordance with the instruction, there can be obtained a route providing the shortest distance between the two points, a route along which the vehicle can reach the destination in the shortest time, a route whose cost is lowest, or the like. A sketch of such a criterion-parameterized search is given below.
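  • an illustrative Python sketch of route calculation with pluggable priority criteria; the graph format and cost functions are assumptions, not the patent's data structures:

```python
import heapq

def dijkstra(graph, start, goal, cost):
    """graph: {node: [(neighbor, edge_attrs), ...]};
    cost: function edge_attrs -> nonnegative number, i.e. the criterion the
    user selected (distance, travel time, toll, ...)."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, attrs in graph.get(u, []):
            nd = d + cost(attrs)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if goal != start and goal not in prev:
        return None  # unreachable
    route, node = [], goal
    while node != start:  # reconstruct the guide route as a list of nodes
        route.append(node)
        node = prev[node]
    route.append(start)
    return route[::-1]

# Selecting the criterion: shortest distance vs. shortest travel time.
shortest = lambda e: e["length_m"]
fastest = lambda e: e["length_m"] / e["speed_mps"]
```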
  • the route guide means 43 compares link information of a guide route obtained by the route calculating means 42 with the current position information obtained by the current position calculating means 45 and the map match processing means 46 . Before passing through a crossing or the like, it outputs direction-instructing information to the voice input/output device 4 indicating whether the vehicle should go straight, or whether the vehicle should turn to the right or left, whereby the information is output as a voice message, or a direction in which the vehicle should travel is drawn on a map displayed on the screen of the display 2 .
  • the data read-in processing unit 47 operates to prepare for the read-in operation of map data of a desired region from the map storage device 3 .
  • the map drawing means 44 receives from the data read-in processing means 47 the map data around a position for which the display is instructed, and it operates to transmit to the graphics processing means 49 a command for drawing specified objects on a specified reduction-scale with a specified direction set to an up-direction.
  • the menu drawing means 48 receives a command output from the user's operation analyzing means 41 , and transmits to the graphics processing means 49 a command for drawing demanded various menus.
  • the graphics processing means 49 receives drawing commands generated by the map drawing means 44 and the menu drawing means 48 to develop an image in the VRAM 26 .
  • the map drawing means 44 includes initial data clip means 61 , coordinate transforming means 62 , drawing judgment means 63 , data clip means 64 and drawing command issuing means 65 .
  • the initial data clip means 61 performs clip processing to select road data, plane and line background data and character data on a region, which are required for subsequent processing, from each mesh of map data read out from the map storage device 3 by the data read-in processing means 47 , and supplies the selection result to the coordinate transforming means 62 .
  • the initial data clip means 61 is actuated at a time interval (period) for renewing the drawn map (for example, every 0.5 seconds), or it is actuated in accordance with an instruction from the user's operation analyzing means 41 .
  • the coordinate transforming means 62 performs processing for transforming each coordinate value of the map data obtained in the clip processing, such as enlargement/reduction processing, rotation processing and projection processing of the map data.
  • the coordinate transforming means 62 is actuated in accordance with an instruction from the initial data clip means 61 or the user's operation analyzing means 41 .
  • the drawing judgement means 63 operates to select those data which are contained in the map data obtained by the coordinate transforming means 62 and are actually required to be drawn. For example, when the reduction scale is large, the drawing judgment means 63 operates to omit narrow roads and omissible place names because of the substantial increase in the amount of data to be drawn (a sketch of such a rule follows below). With this operation, the processing speed for the drawing can be prevented from being significantly reduced.
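  • a hypothetical sketch of such a drawing-judgment rule; the thresholds and attribute names are invented for illustration:

```python
def select_drawables(features, scale_denominator):
    """Omit narrow roads and omissible place names when the reduction scale
    is large (wide-area display)."""
    wide_area = scale_denominator > 50_000   # assumed threshold
    out = []
    for f in features:
        if f["kind"] == "road" and wide_area and f["width_class"] < 2:
            continue  # omit narrow roads on small-scale maps
        if f["kind"] == "name" and wide_area and not f["important"]:
            continue  # omit omissible place names
        out.append(f)
    return out
```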
  • the drawing judgment means 63 is actuated in accordance with an instruction from the coordinate transforming means 62 or the user's operation analyzing means 41 .
  • the data clip means 64 operates to select the map data on the drawing region from the map data obtained by the drawing judgment means 63 through the clip processing.
  • the same algorithms as the initial data clip means may be used by the data clip means 64 .
  • the data clip means 64 is actuated in accordance with an instruction from the drawing judgment means 63 or the user's operation analyzing means 41 .
  • the data clip means 64 may be omitted.
  • the drawing command issuing means 65 operates to issue to the graphics processing unit 49 commands for drawing lines, polygons, characters, etc. and commands for setting patterns to drawn roads, planes, lines, background data, character data, etc. with indicated colors or patterns.
  • the drawing command issuing means 65 is actuated in accordance with an instruction from the data clip means 64 .
  • the initial data clip means 61 is actuated at a time interval (period) for renewing a drawn map (for example, every 0.5 seconds), and it transmits the data obtained in the clip processing to the coordinate transforming means 62 .
  • the drawing operation is performed every 0.5 seconds except for a case where an instruction is supplied from an external operation.
  • a printed map or a conventional navigation system provides a plan-view map display in which the ground is viewed from an infinite point.
  • the plan-view map display has an advantage that the scale is fixed over any position on the same display frame, so that the user can easily get the sense of distance.
  • when two points are required to be displayed on the same display frame, it is necessary to perform an adjustment operation for optimizing the scale.
  • only limited information is displayed because the information amount to be displayed at the same time is restricted by the size and precision of the display. This problem can be avoided by using the bird's-eye view display.
  • the bird's-eye view display can be achieved by the perspective projection processing in which two-dimensional or three-dimensional map information 53 of a plane A is projected onto a plane B intersecting the plane A at an angle θ, to thereby obtain projection information 54 .
  • a point 55 of coordinate (Tx, Ty, Tz) is set as a view point, and a map 53 (represented by a rectangle having vertexes a, b, c and d) on the plane A (illustrated by a rectangle 51 ) is projected onto the plane B (illustrated by a rectangle 52 ) which intersects the plane A at an angle θ, to obtain a projection 54 (represented by a rectangle having vertexes a′, b′, c′ and d′).
  • the points a, b, c and d of the map 53 are projected to the points a′, b′, c′ and d′.
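  • in code, the projection of FIG. 6 can be sketched as follows. This is a minimal Python version assuming the projection plane B is normal to the line of sight and the map plane A is z = 0; the parameter d (the distance from the view point to the projection plane) is an assumption of this sketch:

```python
import math

def perspective(x, y, tx, ty, tz, theta, d=1.0):
    """Project a ground point (x, y, 0) onto plane B, viewed from the view
    point 55 at (tx, ty, tz), with theta = angle between planes A and B."""
    xc = x - tx
    yc = (y - ty) * math.cos(theta) - tz * math.sin(theta)
    zc = (y - ty) * math.sin(theta) + tz * math.cos(theta)  # depth from view point
    if zc <= 0:
        return None  # behind the view point; clipped
    return d * xc / zc, d * yc / zc

# theta = 0 degenerates to a plan view (uniform scale); theta near 90 degrees
# maximizes the scale difference between near- and far-distance regions.
```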
  • the view point 55 , which is the origin of a visual-point coordinate system in the projection processing, is hereinafter referred to simply as the “view point”. Further, in this specification, the term “view point” does not mean a view point corresponding to the position of the user's eyes.
  • the bird's-eye view display can be realized in the conventional navigation system with no additional new map data, by adding means for performing the perspective projection processing as described above.
  • the perspective projection processing is performed by the coordinate transforming means 62 . Further, it is preferable to use various means of implementation as described later, when the bird's-eye view display is performed.
  • the coordinate transforming means 62 sets the position of the view point, and determines a viewing direction from the view point and a projection angle (the angle θ at which the projection plane B intersects the plane A in the case of FIG. 6) (step 1 ). In this step, a region which is to be displayed in the bird's-eye view mode is determined. When a bird's-eye view is displayed on a rectangular display frame (screen), a near-distance region which is close to the view point is narrowed, and a far-distance region which is far away from the view point is broadened. Therefore, the map data to be finally drawn become a trapezoidal region 72 in a map mesh 71 .
  • the coordinate transforming means 62 uses the initial data clip means 61 to extract map data of a circumscribed rectangular region 73 of the trapezoidal region 72 to be actually drawn, from map mesh data 71 containing the region to be displayed in the bird's-eye view mode (step 2 ).
  • the coordinate transforming means 62 enlarges or reduces the extracted data, and then subjects the data to affine transformation to erect the trapezoidal region. Further, it performs data conversion on each coordinate value of the map data by using the perspective projection processing (step 3 ).
  • in step 3 , the drawing target region, which was the trapezoid 72 in step 2 , is converted to the rectangular region 74 , and the circumscribed rectangle 73 of the trapezoid 72 is coordinate-transformed to a rectangle 75 which is circumscribed around the rectangle 74 . It is not necessary to draw the regions of the rectangle 75 other than the drawing target region 74 . Therefore, the coordinate transforming means 62 uses the data clip means 64 to perform the clip processing on the regions other than the rectangular region 74 which is the drawing target, thereby removing these regions (step 4 ).
  • the map data thus obtained are transmitted to the drawing command issuing means 65 .
  • the drawing command issuing means 65 generates a drawing command on the basis of the transmitted map data and supplies the drawing command to the graphics processing unit 49 to prepare drawing data.
  • the graphics processing means 49 prepares the drawing data and stores the data into the VRAM 26 , and instructs the display 2 to display the data.
  • the display 2 displays the drawing data held in the VRAM 26 on the screen thereof. With the above operation, the bird's-eye view 102 shown in FIG. 1 is displayed on the screen of the display 2 .
  • the bird's-eye view display facilitates recognition of the directional and positional relationship between any two points.
  • the bird's-eye view display is suitable for use in a case where detailed information on the periphery of one point is required whereas rough information on the peripheries of the other points is sufficient.
  • the coordinate of the specific point may be predetermined.
  • the position may be set by determining a coordinate which satisfies a predetermined condition, or the position of the point may be input externally.
  • the current position is set to one of the above two points.
  • the position information of the current position is obtained by the current position calculating means 45 or the map match processing means 46 . Therefore, when the current position is indicated to any one of a detailed information display position and a rough information display position through the input device 5 , the map drawing means 44 uses the coordinate of the current position which is obtained by the current position calculating means 45 or the map match processing means 46 .
  • this embodiment adopts the processing shown in FIG. 8 in the coordinate transformation.
  • the coordinate transforming means 62 performs an enlarging or reducing operation for a region to be processed (step 1010 ) if the map is required to be enlarged or reduced (step 1000 ), and also performs a rotational transformation processing (step 1030 ) if the map is required to be rotated (step 1020 ).
  • the rotational transformation processing (step 1030 ) contains a rotational angle calculating operation (step 1031 ) and an affine transforming operation (step 1032 ).
  • the coordinate transforming means 62 judges whether the bird's-eye view display is performed (step 1040 ).
  • the coordinate transforming means 62 judges that the perspective projection (conversion) is not performed in the bird's-eye display judgement (step 1040 ), and finishes the coordinate transformation processing without performing the perspective projection processing (step 1050 ).
  • the plan view map is displayed on the screen of the display 2 because the perspective projection processing is not performed.
  • accordingly, when any point on the map is searched for, the map is displayed not in the bird's-eye view display mode but in the plan-view display mode, so that the search target point can be rapidly found.
  • the coordinate transforming means 62 which receives these instructions through the user's operation analyzing means 41 judges, in the bird's-eye view display judgment in step 1040 , that the perspective projection processing should be performed, and thus it performs the perspective projection processing (step 1050 ).
  • the perspective projection processing contains a projection angle calculation processing (step 1051 ), a projection-plane position calculation processing (step 1052 ) and a perspective projection operation (step 1053 ).
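  • the flow of FIG. 8 can be summarized by the following illustrative Python fragment, which operates on a list of (x, y) coordinate values; the step numbers are kept as comments and the function names are invented for this sketch:

```python
import math

def scale_pts(pts, k):                        # step 1010: enlargement/reduction
    return [(k * x, k * y) for x, y in pts]

def rotate_pts(pts, phi):                     # step 1032: affine (rotational) transformation
    c, s = math.cos(phi), math.sin(phi)
    return [(c * x - s * y, s * x + c * y) for x, y in pts]

def perspective_pts(pts, theta, tx, ty, tz):  # step 1053: see the earlier sketch
    c, s = math.cos(theta), math.sin(theta)
    return [((x - tx) / ((y - ty) * s + tz * c),
             ((y - ty) * c - tz * s) / ((y - ty) * s + tz * c))
            for x, y in pts]

def coordinate_transform(pts, k=None, phi=None, birds_eye=None):
    if k is not None:                         # steps 1000/1010
        pts = scale_pts(pts, k)
    if phi is not None:                       # steps 1020/1030
        pts = rotate_pts(pts, phi)
    if birds_eye is not None:                 # steps 1040/1050: (theta, Tx, Ty, Tz)
        pts = perspective_pts(pts, *birds_eye)
    return pts
```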
  • the map data which are notified to the drawing command issuing means 65 are converted to data for the bird's-eye view, and thus the map data which are prepared by the graphics processing means 49 and stored in the VRAM 26 also become data for the bird's-eye view. Accordingly, the map displayed on the display 2 is changed from the plan-view map to the bird's-eye view map.
  • the map can be freely changed from the bird's-eye view map to the plan-view map or from the plan-view map to the bird's-eye view map. Therefore, according to this embodiment, an easily-understandable map display can be provided.
  • the projection angle θ is gradually increased from 0° with the lapse of time (every 0.5 seconds in this embodiment) during the projection angle θ calculation (step 1051 ) in the perspective projection processing (step 1050 ), and this increase of the projection angle is stopped when the projection angle reaches a target projection angle.
  • the increment unit of the projection angle is preferably set to a constant value.
  • the plan-view display is smoothly shifted to the bird's-eye view display by performing the perspective projection processing (step 1053 ) while increasing the projection angle with variation of time, so that the user can easily grasp the positional relationship between the places (spots) in the plan-view map and the bird's-eye view map.
  • the same effect can be obtained by gradually reducing the projection angle from an initially-set projection angle to 0° in time series.
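  • an illustrative sketch of this smooth shift between the plan view and the bird's-eye view: the projection angle is stepped once per redraw period (0.5 seconds here) until the target is reached. The 5° step mirrors the key-driven angle changes described later; using it for this transition is an assumption of the sketch:

```python
import math

def angle_schedule(start_deg, target_deg, step_deg=5.0):
    """Yield the projection angle (in radians) for each 0.5 s redraw tick."""
    theta = start_deg
    sign = 1.0 if target_deg >= start_deg else -1.0
    while (target_deg - theta) * sign > 0:
        yield math.radians(theta)
        theta += sign * min(step_deg, abs(target_deg - theta))
    yield math.radians(target_deg)

# Shifting from plan view (0 deg) to a typical bird's-eye angle (40 deg):
#   for theta in angle_schedule(0, 40): redraw(theta)
# The reverse schedule, angle_schedule(40, 0), returns to the plan view.
```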
  • FIG. 9A represents a visual field (corresponding to a range to be drawn and displayed in the bird's-eye view display mode) in a map mesh 91 , and the view point position and the projection angle to obtain a bird's-eye view are shown in FIG. 9B.
  • in FIG. 9C, the current position and the vehicle travel direction in the bird's-eye view display are shown. Arrows indicated by dotted lines in FIGS. 9A and 9C represent the vehicle travel direction.
  • the coordinate transforming means 62 first performs a desired enlargement/reduction processing (steps 1000 and 1010 ), then judges that rotation is needed (step 1020 ) and determines an intersection angle φ between the vector of the travel direction and the bottom side of the map mesh as shown in FIG. 9A (step 1031 ). Further, the coordinate transforming means 62 performs the affine transformation on the map data to rotate the data by the angle φ (step 1032 ).
  • the coordinate transforming means 62 performs the processing of determining the projection angle and the visual position (steps 1051 and 1052 ).
  • if the projection angle θ is set to a value close to 0°, the difference in scale between a near-distance point and a far-distance point from the view point is reduced. On the other hand, if the projection angle θ is set to a value close to 90°, the difference in scale between the near-distance point and the far-distance point is increased. In this embodiment, the projection angle θ is normally set to about 30° to 45°.
  • the coordinate transforming means 62 changes the projection angle θ in accordance with an angle changing instruction (input through the projection angle changing key unit 68 ) which is supplied from the user's operation analyzing means 41 . That is, upon detecting that the angle-increasing instruction key 68 a has been pushed down, the user's operation analyzing means 41 instructs the coordinate transforming means 62 to increase the projection angle, and it repeats this instruction for every 0.5 seconds that the key 68 a continues to be pushed down.
  • in response to each instruction, the coordinate transforming means 62 increases the projection angle θ by 5° (that is, the projection angle rises by 5°).
  • the same action is taken for the projection angle reducing instruction. That is, for every 0.5 seconds that the angle-reducing instruction key 68 b continues to be pushed down, the user's operation analyzing means 41 instructs the coordinate transforming means 62 to reduce the projection angle, and in response to this instruction the coordinate transforming means 62 reduces the projection angle θ by 5° (that is, the projection angle falls by 5°).
  • the user can freely set a map region to be displayed in the bird's-eye view display mode. Accordingly, when the increase of the projection angle is instructed, those regions which are farther away from the current position are displayed on the map because the projection angle θ increases. On the other hand, when the reduction of the projection angle is instructed, those regions which are in the neighborhood of the current position are displayed on the map because the projection angle θ decreases.
  • the coordinate transforming means 62 determines the position of the projection plane (Tx, Ty, Tz) so that a differential value (Δx, Δy, Δz) obtained by subtracting the position of the projection plane (Tx, Ty, Tz) from the current position (x, y, z) is equal to a fixed value at all times (step 1052 ). Further, the coordinate transforming means 62 sets, as an absolute quantity, Δx to zero and Δz to a small value (when the map is displayed on a small reduction scale) or a large value (when it is displayed on a large reduction scale). Normally, Δz may be selected so that the scale of the plan-view display is coincident with the scale of a point around the center of the bird's-eye view display.
  • the scale of the map is preferably changeable in accordance with a user's demand.
  • the coordinate transforming means 62 changes the scale of the map to be displayed in accordance with a scale changing instruction from the user's operation analyzing means 41 (which is input by the scale changing key unit 67 ). That is, upon detecting that the enlargement instruction key 67 a has been pushed, the user's operation analyzing means 41 outputs the enlargement instruction to the coordinate transforming means 62 , and it repeats the instruction for every 0.5 seconds that the enlargement instruction key 67 a continues to be pushed down.
  • the coordinate transforming means 62 increases Δz by a predetermined value in step 1052 .
  • the same action is made for the reduction instruction.
  • the user's operation analyzing means 41 instructs the scale reduction to the coordinate transforming means 62 for every 0.5 seconds that the reduction instruction key 67 b continues to be pushed down, and in response to this instruction the coordinate transforming means 62 reduces Δz by a predetermined value.
  • Δy may be set to a negative value as shown in FIGS. 9A to 9C; however, it may be set to a positive value as shown in FIGS. 15A to 15C.
  • the bird's-eye view can be displayed with no problem both when the view point position is at the front side of the current position and when it is at the rear side of the current position.
  • FIGS. 15A to 15C are diagrams for obtaining a bird's-eye view of the same range as shown in FIGS. 9A to 9C, in which the view point is set to a different position. In FIG. 15A, reference numeral 152 represents a visual field (a range to be drawn and displayed) in a map mesh 151 , and the view point position and the projection angle for obtaining a bird's-eye view are shown in FIG. 15B. Further, the current position and the vehicle travel direction in the obtained bird's-eye view are shown in FIG. 15C. Arrows indicated by dotted lines in FIGS. 15A and 15C represent the vehicle travel direction.
  • the coordinate transforming means 62 determines the position of the view point (specifically, Δy) in accordance with the view point position (input by the touch panel 70 ) indicated by the user's operation analyzing means 41 . That is, upon detecting that the touch panel 70 has been touched, the user's operation analyzing means 41 indicates the touch position to the coordinate transforming means 62 . In response to this indication of the position information, the coordinate transforming means 62 sets Δy to such a value that the view point coincides with the indicated position.
  • Δy can be set to any value within a broad range containing positive and negative values in accordance with the user's instruction. Accordingly, the position to be displayed in detail can be set more flexibly.
  • the coordinate transforming means 62 performs the perspective projection processing on each coordinate value of the map data by using the projection angle θ and the position of the projection plane (Tx, Ty, Tz) thus obtained (step 1053 ).
  • the graphics processing means 49 performs the drawing processing with the obtained map data, whereby the map display is performed on the screen such that the travel direction is set to the up-direction at all times and the current position is displayed at the same point on the screen in the bird's-eye view display mode, as shown in FIG. 9C and FIG. 15C.
  • the current position is displayed at the central lower side of the screen as shown in FIGS. 9C, 10C, 15C and 16C.
  • indication of a destination can be accepted through a touch on the touch panel 70 . Further, when the destination is indicated, the destination is drawn so as to be contained in the visual field (the range to be drawn on the display frame). At this time, the coordinate transforming means 62 determines the rotational angle φ so that the current position is located at the central lower side and the destination is located at the central upper side (FIGS. 10C and 16C).
  • FIGS. 10A to 10C are diagrams showing a case of obtaining a bird's-eye view display in which both the current position and the destination are drawn on the same frame (field of view).
  • FIGS. 16A to 16C are diagrams showing a case of obtaining a bird's-eye view display of the same range as shown in FIGS. 10A to 10C, in which the view point is set to a different position.
  • a field of view (a range to be drawn and displayed) 162 is shown in a map mesh 161 in FIGS. 10A and 16A, the view point position and the projection angle for obtaining the bird's-eye view are shown in FIGS. 10B and 16B, and the current position and the position of the destination in the obtained bird's-eye view display are shown in FIGS. 10C and 16C.
  • Arrows indicated by dotted lines represent the direction from the current position to the destination.
  • the current position is displayed at the central lower side of the screen, and the destination is displayed at the central upper side of the screen.
  • the same processing is performed in a case where the display positions of any two points are selectively set to the central lower and upper sides of the screen in accordance with an instruction of the display positions of the two points.
  • the coordinate transforming means 62 calculates an intersection angle φ between the bottom side of the map mesh and a line perpendicular to the line connecting the current position and the destination, as shown in FIGS. 10A and 16A, in step 1031 , and performs the affine transformation by the angle φ on each coordinate value of the map data to be drawn in step 1032 .
  • the coordinate transforming means 62 shifts to the processing for determining the projection angle θ and the view point position (steps 1051 and 1052 ).
  • the initial value of the projection angle θ is a predetermined value in the range of 30° to 40°, and it is changed in accordance with an input operation of the projection angle changing key 68 .
  • the initial position of the projection plane is determined so that the differential value obtained by subtracting the position coordinate of the projection plane from the coordinate of the current position is equal to a predetermined value, and it is changed in accordance with an input operation of the reduction-scale instructing key 67 .
  • the rotation of the coordinates is performed by using the transforming equations (2) and (3), and the affine transformation is performed by using the transforming equation (1).
  • the parameters Ty and Tz, which indicate the position of the projection plane, are calculated by substituting a suitable value, for example zero, into Tx, substituting the position coordinates and the display positions of the current position and the destination, and then solving the resulting linear equation system, as sketched below.
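  • with the projection written as in the earlier sketches, each screen-position constraint becomes linear in Ty and Tz after cross-multiplication, so the two constraints form a 2x2 linear system. The following Python fragment is an illustrative reconstruction under those assumptions, not the patent's equations (1) to (3):

```python
import math

def solve_projection_plane(y1, sy1, y2, sy2, theta, d=1.0):
    """Return (Ty, Tz) such that a spot with (rotated) map coordinate y_i is
    drawn at vertical screen position sy_i, with Tx already fixed (e.g. zero).
    Derived from sy*((y - Ty)*sin + Tz*cos) = d*((y - Ty)*cos - Tz*sin)."""
    c, s = math.cos(theta), math.sin(theta)
    a1, b1 = d * c - sy1 * s, sy1 * c + d * s   # coefficients of Ty and Tz
    a2, b2 = d * c - sy2 * s, sy2 * c + d * s
    det = a1 * b2 - a2 * b1
    ty = (a1 * y1 * b2 - a2 * y2 * b1) / det
    tz = a1 * a2 * (y2 - y1) / det
    return ty, tz

# Example: current position low on the screen (sy = -0.5), destination high
# (sy = +0.5), projection angle 35 degrees, spots 10 km apart.
print(solve_projection_plane(0.0, -0.5, 10_000.0, 0.5, math.radians(35)))
```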
  • the coordinate transforming means 62 performs the perspective projection processing (step 1053 ). That is, the coordinate transforming means 62 performs the perspective projection on each coordinate value of the map data on the basis of the projection angle θ and the position of the projection plane thus obtained.
  • the graphics processing means 49 performs the drawing processing with the obtained map data, whereby the current position and the destination can be displayed on the same display frame. Further, if the above processing is performed every time the current position varies, the current position and the destination can be displayed at the same positions on the same display frame, even when the current position varies with time. Further, if the judgment of the step 1040 is set to judge the necessity of the perspective projection processing at all times during the bird's-eye view display, the display of the screen is changed at all times in accordance with the time variation.
  • On the other hand, if the judgment of the step 1040 is set to judge the necessity of the perspective projection processing only when the current position has shifted by a predetermined distance or more, only the mark 105 indicating the current position on the same bird's-eye view map is shifted until the current position has moved by the predetermined distance, and the map display is then varied to an enlarged map display as the current position shifts by the predetermined distance and approaches the destination.
  • The display mode is thus set to the bird's-eye view display to facilitate the understanding of the positional relationship between the current position and the destination when the two are far away from each other, and it is set to a display mode near to the plan-view display when the current position approaches the destination.
  • Alternatively, the projection angle θ may be gradually increased to a larger value as the distance between the current position and the destination becomes shorter. In this case, when the current position is far away from the destination, the display mode is set to the plan-view display so that the overall positional relationship can be easily understood, and as the current position approaches the destination, the visual-point position is gradually heightened to enhance the user's sense of distance. A sketch of such an angle schedule follows.
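  • As one illustration of such a schedule, the following minimal Python sketch interpolates the projection angle linearly between a plan-view angle and a bird's-eye angle as the remaining distance shrinks. The distance thresholds and angle limits are illustrative assumptions, not values taken from this embodiment.

```python
import math

def projection_angle(distance_to_destination,
                     d_near=500.0, d_far=10_000.0,   # illustrative thresholds
                     theta_plan=0.0,                  # plan view (flat)
                     theta_bird=math.radians(40)):    # oblique bird's-eye view
    # Far from the destination: theta stays near theta_plan (plan view).
    # Approaching the destination: theta grows toward theta_bird.
    d = min(max(distance_to_destination, d_near), d_far)
    t = (d_far - d) / (d_far - d_near)
    return theta_plan + t * (theta_bird - theta_plan)
```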
  • Next, the details of the perspective projection processing (step 1053) of the coordinate transforming means 62 will be described with reference to FIG. 11.
  • First, the coordinate transforming means 62 judges whether the input map data are plane data (step 1100). If the input data are plane data, the coordinate transforming means 62 performs the perspective projection (step 1101) on each node coordinate value of the given plane data by using the parameters obtained in the projection angle θ calculation (step 1051) and the visual-point position calculation (step 1052). The perspective projection processing of the node (step 1101) is repeated until the processing on all input nodes is completed (step 1102).
  • Next, the coordinate transforming means 62 subjects line data to the same processing as the plane data, performing the perspective projection on the coordinate value of each node (steps 1103 to 1105).
  • Finally, the coordinate transforming means 62 judges whether the input map data are character data (step 1106). If the input data are judged to be character data, the coordinate transforming means 62 performs the perspective projection on the coordinate values of the drawing start points of the given character strings by using the projection angle θ and the coordinate value of the visual-point position which are obtained in the steps 1051 and 1052. In this embodiment, no perspective projection is performed on a character image, as sketched below. The perspective projection processing of the node in step 1107 is repeated until the processing on all the input nodes is completed (step 1108).
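  • The split between node data and character data can be summarized in a short sketch; the data shapes and the project callable are illustrative assumptions. Every vertex of plane and line data is projected, while for a character string only its drawing start point is projected and the glyph image is left untouched.

```python
def project_map_data(planes, lines, labels, project):
    # Plane data (steps 1100-1102) and line data (steps 1103-1105):
    # every node coordinate is perspectively projected.
    planes_p = [[project(x, y) for (x, y) in polygon] for polygon in planes]
    lines_p = [[project(x, y) for (x, y) in polyline] for polyline in lines]
    # Character data (steps 1106-1108): only the drawing start point is
    # projected; the character image itself is not, so every label stays
    # erect and keeps the same size anywhere on the bird's-eye view.
    labels_p = [(text, project(x, y)) for (text, x, y) in labels]
    return planes_p, lines_p, labels_p
```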
  • The drawing judgment means 63 serves to judge whether each data item contained in the map data should be drawn.
  • First, the drawing judgment means 63 judges whether the input map data contain plane data (step 1200). If plane data are judged to be contained, the drawing judgment means 63 performs a predetermined plane data drawing judgment processing (step 1201).
  • The plane data drawing judgment processing (step 1201) which is carried out in this case serves, for example, to judge an attribute which is preset for each plane data item and to select the desired plane data.
  • The drawing judgment means 63 subsequently judges whether the input map data contain line data such as roads, line backgrounds, etc. (step 1202). If line data are judged to be contained, the drawing judgment means 63 performs a predetermined line data drawing judgment processing (step 1203).
  • The line data drawing judgment (step 1203) which is carried out in this case serves, for example, to judge an attribute which is preset for each line data item and to select the desired line data.
  • The drawing judgment means 63 subsequently judges whether the input map data contain character data (step 1204). If character data are judged to be contained, the drawing judgment means 63 performs rearrangement processing on the coordinate values of the drawing start points of the character strings to rearrange the character strings from the far-distance side to the near-distance side (step 1205).
  • Next, the drawing judgment means 63 compares the height y of the drawing start point of each of the rearranged character strings with a predetermined height h on the display frame (step 1206).
  • In this embodiment, the height h is set to the position corresponding to 2/3 of the frame height from the bottom side of the display frame. That is, the height h is set to a predetermined fixed value. This is because, when the height from the bottom side is set to 2/3 or more, the angle of depression is extremely small and the density of nodes to be drawn is excessively high.
  • This reference value (the height h) may instead be set in accordance with the height of the view point or the like.
  • Alternatively, the distance from the view point may be used in place of the height from the bottom side of the display frame. For example, assuming the distance between the view point and the destination to be 1, only those character strings which are located at a distance of 2/3 or less from the view point are drawn, and those character strings which are located at a distance longer than 2/3 are eliminated from the drawing targets.
  • If the height y is equal to or lower than the height h, the drawing judgment means 63 sets the character string as a drawing target. On the other hand, if the height y is judged to be higher than the height h, the drawing judgment means 63 eliminates the character string from the drawing targets (step 1207). A compact sketch of these steps follows.
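  • The sketch below covers steps 1205 to 1207; it assumes a label carries its text, its screen x coordinate and its screen height y measured from the bottom of the display frame (names are illustrative).

```python
from collections import namedtuple

Label = namedtuple("Label", "text x y")  # y: height above the frame bottom

def select_labels(labels, frame_height, h_ratio=2.0 / 3.0):
    # Step 1205: rearrange from the far-distance side (high on the frame)
    # to the near-distance side, so near labels are drawn last and on top.
    ordered = sorted(labels, key=lambda lab: lab.y, reverse=True)
    # Steps 1206-1207: keep a label only if its drawing start point lies at
    # or below the fixed height h (2/3 of the frame in this embodiment).
    h = frame_height * h_ratio
    return [lab for lab in ordered if lab.y <= h]
```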
  • The graphics processing means 49 performs the drawing operation by using the map data which are selected by the drawing judgment means 63 as described above, so that the bird's-eye view display can be prevented from becoming too intricate, and a clear and lucid display can be obtained.
  • In the above example, the drawing of a character string is determined on the basis of only the height of its display position.
  • Alternatively, the following method may be used. That is, a display priority rank (order) is preset for each character string, and a range of the display ranks of the character strings to be displayed is set in accordance with the display height (or the distance from the view point) of the character strings. In this case, only the character strings to which display ranks contained in the range are allocated are displayed.
  • For example, the upper limit of the display ranks of the character strings to be displayed is set to a fixed rank irrespective of the height of the character string, while the lower limit of the display ranks is set to a higher rank in accordance with the height of the character string on the display frame.
  • Accordingly, in an area close to the upper side of the display frame, only character strings having higher display priority ranks are displayed.
  • In an area close to the bottom side of the display frame, not only character strings having higher display ranks but also character strings having lower display ranks are displayed. Accordingly, when the current position is set close to the bottom side of the display frame, more detailed character information is obtained for areas closer to the current position, while for areas farther away from the current position only important information is displayed. This display mode is therefore very convenient for users, as sketched below.
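  • A hedged sketch of one possible rank-band rule follows; the linear tightening of the admissible rank is an assumption for illustration, since the text above only states that the lower limit rises with the display height.

```python
def select_by_rank(labels, frame_height, worst_rank=9):
    # rank 0 = most important.  The upper limit (best rank shown) is fixed,
    # while the admissible worst rank shrinks as a label sits higher on the
    # frame: near the bottom everything is shown, near the top only the
    # most important character strings survive.
    kept = []
    for lab in labels:  # each label is assumed to carry .rank and .y fields
        limit = worst_rank * (1.0 - lab.y / frame_height)
        if lab.rank <= limit:
            kept.append(lab)
    return kept
```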
  • The attribute may be used as another criterion for the display priority ranking. In this case, only characters having predetermined attributes may be displayed in each region.
  • The steps 1205 to 1207 may be omitted, but these steps are preferably executed. The effect of these steps will be described with reference to FIGS. 13A, 13B, 14A and 14B.
  • FIG. 13A shows a bird's-eye view map obtained when the drawing operation is carried out without performing the rearrangement processing and the drawing character string selecting processing of steps 1205 to 1207.
  • FIGS. 13B and 14A show a bird's-eye view map obtained when the drawing operation is carried out with the rearrangement processing of step 1205, but without the drawing character string selecting processing of steps 1206 and 1207.
  • FIG. 14B shows a bird's-eye view map obtained according to this embodiment, that is, a bird's-eye view map obtained when the processing of steps 1205 to 1207 is performed.
  • In FIGS. 13A to 14B, figures, etc. which are drawn on the basis of plane data and line data are omitted from the illustration.
  • As described above, the bird's-eye view display apparatus and the navigation system according to this embodiment can display a map which is expressed in a bird's-eye view display mode. Accordingly, the user can obtain a clear map display in which the positional relationship can be easily recognized. In addition to this effect, the following effects are also expected to be obtained by this embodiment.
  • In the switching operation between the plan-view display mode and the bird's-eye view display mode, the intersection angle between the plane containing the map data and the plane onto which the map is projected is gradually increased or reduced in time series. That is, the view point for obtaining the bird's-eye view is smoothly shifted. Accordingly, since the shift between the plan-view map and the bird's-eye view map is performed smoothly, the user can easily recognize the positional correlation between the spots displayed on the plan-view map and the spots displayed on the bird's-eye view map.
  • The view point for obtaining the bird's-eye view can be fixed to a position which is away from the current position at a fixed distance and in a fixed direction at all times. Further, the height of the view point can be varied in accordance with an input operation received externally. If the view point is fixed, the bird's-eye view is displayed such that the current position is fixed to a specific position on the screen at all times, so that the user can easily grasp the current position. Further, the view point can be set to any position desired by the user.
  • When the positions of two points are specified, the view point and the intersection angle between the plane containing the map data and the plane onto which the map is projected are determined so that the two points are displayed at the specified positions on the screen, and the bird's-eye view display is performed on the basis of the determination result. Accordingly, the user can easily recognize the positional relationship between the two points. Further, even when the distance between the two points varies, the two points are displayed on the same display frame at all times, so that the user can easily grasp the positional relationship between the two points with no complicated operation.
  • The coordinate transforming means of this embodiment performs no perspective projection processing on the character image. Accordingly, all characters contained in the bird's-eye view are displayed at the same size so as to be erect on the map, so that the user can easily read these characters.
  • The drawing judgment means of this embodiment sorts character data strings in accordance with the distance from the character data position to the view point on the map data. That is, as a character string approaches the view point, it is treated so as to have a higher display priority rank. Accordingly, the character data close to the view point can be prevented from being covered and missed because of the other character data, so that character information close to the view point can be easily identified.
  • Further, the drawing judgment means of this embodiment eliminates from the drawing targets those character strings whose display positions are higher than a predetermined height from the bottom side of the display frame.
  • Alternatively, the drawing judgment means may eliminate from the drawing targets those character strings which are defined at positions a predetermined distance or more away from the view point.
  • Accordingly, no character string is drawn in an area which is at the predetermined height or more, or which is the predetermined distance or more away from the view point, that is, in an area in which the data amount to be drawn per unit area is considerably increased due to the reduction of the angle of depression in the bird's-eye view display. Therefore, the road display, the plane background display, etc. in this area are not covered by the character string display, so that the user can recognize far-distance roads, etc. without obstruction.
  • As described above, the map can be displayed in the bird's-eye view display mode. Accordingly, the user can obtain a map display in which the positional relationship between spots displayed on the map can be easily recognized.

Abstract

In a navigation system using a bird's-eye view display mode, map data on a plan view map are subjected to a perspective projection conversion to obtain drawing data on a bird's-eye view map. In this case, an input of the position of a view point is accepted, and a projection plane for a bird's-eye view is determined on the basis of the coordinates of a current position and a destination and the position of the view point so that the display positions of the two points which have been subjected to perspective-projection conversion are coincident with predetermined positions. Alternatively, an input of a scale is accepted, and the position of the view point and the projection plane are determined on the basis of the coordinates of the two points and the scale so that the display positions of the two points after the perspective projection conversion are coincident with predetermined positions and the drawing scale is coincident with the input scale. Or, as a further alternative, an input of the projection angle is accepted, and the projection plane is determined on the basis of the projection angle and the position of the view point.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to a navigation system for detecting the position of a moving body to inform a user of the current position of the moving body, and a map display apparatus using the navigation system. [0002]
  • 2. Description of Related Art [0003]
  • Various types of navigation systems have been recently known, and these navigation systems are used while mounted on moving bodies such as vehicles, ships, etc. When such a navigation system is mounted on a moving body, it performs arithmetic and logic processing on information provided from various sensors to detect the current position of the moving body and display the current position on a screen of the navigation system. [0004]
  • This type of navigation system includes a position detector for detecting the absolute position of the moving body; a storage device for storing map data which include two-dimensional vector data, obtained by projecting roads and points on the ground, such as structures, etc., onto a mesh-segmented plane using the universal transverse Mercator projection method, and character data associated with the vector data; an input device for receiving external instructions (commands); and a display device for reading out desired vector data from the mesh-segmented map stored in the storage device in accordance with an instruction input from the input device, performing data conversion processing on the data, and then displaying the map on the display. [0005]
  • The data conversion processing includes shift conversion processing for changing the display position on a map, scale conversion (enlarging/reducing) processing for displaying a map on any scale, and rotational coordinate-transformation processing for changing the display direction of the map. With these processing operations, a plan view map is obtained by drawing the ground through an orthogonal projection in the vertical direction. [0006]
  • As described above, when a map is displayed on the screen, the conventional navigation system adopts a plan-map display mode in which the ground is drawn in the vertical orthogonal projection style. Therefore, in order to simultaneously display two spots which are far away from each other, the reduction scale of the map must necessarily be increased, and thus detailed information cannot be displayed on the screen. [0007]
  • SUMMARY OF THE INVENTION
  • An object of the present invention is to provide a bird's-eye view forming method for displaying a map of a plane (ground) on the basis of a bird's-eye view corresponding to a projected plan of the plane (ground) which is obtained by projecting the plane (ground) on any plane between the plane (ground) and a view point when the plane (ground) is viewed from the view point located at any height from the plane (ground), a map display apparatus using the bird's-eye view forming method, and a navigation system using the map display apparatus. [0008]
  • In order to attain the above object, a bird's-eye view forming method for forming, from map data, drawing (display) data for a map which is to be represented by a bird's-eye view, is characterized in that coordinate data contained in the map data are perspectively converted to drawing (display) data based on a bird's-eye view on a desired projection plane with a view point being set to a desired position (i.e., the coordinate data of the map data on a vertical orthogonal projection plane are perspectively converted to drawing (display) data on a desired projection plane based on a bird's-eye view). [0009]
  • The bird's-eye view forming method preferably includes a step of receiving an input of the position of the view point, and a step of determining the projection plane so that the drawing (display) positions of two predetermined spots (for example, a current position and a destination) which are obtained through a perspectively projecting transformation on the basis of the coordinates of the two spots in the map data and the position of the view point, are set to predetermined positions. (Here, the perspectively projecting conversion is defined as such a data conversion that data on a plan view are converted to data on a bird's-eye view through a perspective projection operation. Therefore, by using the perspectively projecting conversion, a plan-view map is converted to a bird's-eye view map. In the following description, the perspectively projecting conversion is merely referred to as “perspective projection”). [0010]
  • Alternatively, the bird's-eye view forming method may include a step of receiving an input of a scale, and a step of determining the position of the view point and the projection plane so that the drawing (display) positions of the two predetermined spots (for example, a current position and a destination) which are obtained by performing the perspective projection on the basis of the coordinates of the two spots in the map data and the input scale, are set to predetermined positions, and a scale of a drawn (displayed) map is equal to the input scale. [0011]
  • Further, the bird's-eye view forming method may include a step of receiving an input of a projection angle which is defined as an angle at which a plane defined for the map data and the projection plane intersect each other, and a step of determining the projection plane on the basis of the input projection angle and the set view point position. [0012]
  • Still further, according to the present invention, there are provided a map display apparatus for displaying a bird's-eye view by using the bird's-eye view forming method as described above, and a navigation system for displaying a map by using the map display apparatus. [0013]
  • According to the bird's-eye view display apparatus and the navigation system, map drawing means includes coordinate transforming means, and the coordinate transforming means performs the perspective projection (conversion) on the map data to convert a plan-view map to a bird's-eye view map, and to display the converted map on a screen. Therefore, according to the present invention, a user can obtain a bird's-eye view display which is easy to see and in which the user can easily recognize the positional relationship of spots displayed on the map. According to the bird's-eye view forming method of the present invention, the view point can be freely set, so that it can meet user's need. [0014]
  • According to the present invention, a navigation system which is excellent in operational performance and convenient for a user can be obtained if it is designed to overcome the following first to seventh problems. [0015]
  • Firstly, in the case where the processing speed of scroll processing in a bird's-eye view display mode is lower than that in a plan display mode, the operational performance may be degraded when the scroll processing is frequently used to search for a desired spot on the bird's-eye view display (first problem). [0016]
  • Therefore, it is preferable in the present invention that the plan view (plan map) display mode and the bird's-eye view display mode are freely switchable. With this switching operation, both the plan view display and the bird's-eye view display can be made in accordance with a user's requirement, and thus the convenience is enhanced. [0017]
  • In the above case, a second problem occurs, namely, that it is difficult for the user to grasp the positional relationship between the two different types of view when the display is switched between the plan view display and the bird's-eye view display. This is because the position of the same spot displayed on the screen varies significantly due to the variation of the view point with respect to the map when the display switching operation is performed, and further, spots which were not displayed before the switching appear on the screen while spots which were displayed disappear. [0018]
  • Therefore, if the present invention is designed so that the display is freely switchable between the plan view display and the bird's-eye view display, it is further preferable that the intersecting angle between the plane containing the map data and the plane onto which the map is projected is gradually increased or reduced in time series. That is, according to the present invention, it is preferable that the view point is smoothly shifted during the conversion between the plan view display and the bird's-eye view display. With this operation, the shift between the plan view and the bird's-eye view is smoothly performed, and thus the user can easily recognize the positional relationship between spots displayed on the plan view map and the same spots displayed on the bird's-eye view map. [0019]
  • Thirdly, when the map is displayed in the bird's-eye view display mode, if the view point serving as a parameter of the perspective projection is fixed to one position, the mark indicating the current position which is displayed on the screen moves on the map as the current position moves. In this case, if the marked current position deviates from the displayed map area, the mark of the current position is no longer displayed (third problem). [0020]
  • Therefore, according to the present invention, the view point from which to obtain the bird's-eye view may be fixed to a specific position which is set to be away from the current position at a fixed distance and in a fixed direction at all times. Furthermore, the height of the view point may be varied in accordance with an external operation. If the view point is fixed to the specific position as described above, the bird's-eye view is displayed such that the current position is fixed to a point on the screen at all times, so that the user can easily grasp the current position. Furthermore, if the position of the view point can be set to a user's desired position, the convenience can be further enhanced. [0021]
  • Fourthly, when two spots are displayed on the same frame, it is sufficient in the conventional plan view display to renew the scale in accordance with the distance between the two points. However, in the bird's-eye view display of the map, the map cannot be optimally displayed by merely renewing the scale (fourth problem). [0022]
  • Therefore, according to the present invention, an instruction regarding the positions of two spots (for example, the current position and the destination) may be accepted, and the view point and the intersection angle between the map-projected plane (bird's-eye view) and the plane containing the map data (plan view) may be determined so that the two spots are displayed at predetermined positions on the screen, whereby the bird's-eye view display is performed. With this operation, the positional relationship between the two spots can be easily recognized. Further, even when the positions of the two spots vary, these are displayed on the same frame at all times, so that the user can grasp the positional relationship between the two spots with no complicated operation. [0023]
  • According to the present invention, in the bird's-eye view display mode, line backgrounds such as roads, railroad lines, etc., plane backgrounds such as rivers, green zones, etc., and character strings may be perspectively converted and drawn on the bird's-eye view map. However, if the character strings are perspectively converted, the shape of a character is smaller and is more deformed as it becomes farther away from the view point. On the other hand, if it is in the vicinity of the view point, it is enlarged. Thus the character strings may be illegible in these cases (fifth problem). [0024]
  • Therefore, it is preferable that the coordinate transforming means of the present invention does not perform the perspective projection on a character image. In this case, characters contained in the bird's-eye view are displayed at the same size, so that the character strings are easily legible. [0025]
  • Further, when a lot of character data are displayed, character strings are displayed while overlapping with one another. Particularly in the bird's-eye view, an angle of depression with respect to the view point is smaller at a far-distance point from the view point and thus the reduction scale is large at that point, so that the data amount of character data to be displayed per unit area is increased. Accordingly, the character data (strings) are liable to be overlapped with each other at the far-distance point (sixth problem). [0026]
  • Therefore, it is preferable that as the position of character data contained in the map data becomes nearer to the view point, drawing (display) judgment means of the present invention displays the character data more preferentially (i.e., increases the display priority rank of the character data). With this operation, character data having a higher display priority rank may be displayed while being superposed on character data having a lower display priority rank, so that the character string can be prevented from being deficient and it can be made legible. [0027]
  • When the current position is located in the vicinity of the bottom side of the display frame or the view point, it is preferable that the display height or the distance from the view point is set as a criterion for determining the display priority rank. Further, when two or more character strings are displayed while overlapping with one another, a character string which is displayed at a lower display height (which is the height from the bottom side of the display frame) on the screen may be displayed more preferentially (i.e., so that the character string is not covered by the other character strings, but is superposed on the other character strings). In this case, the character data near to the current position are displayed while being prevented from being covered by the other character strings and thus missed. With this display operation, the deficiency (missing) of the character information (string) near to the current position which the user has a greater need for can be prevented, and the character string can be made legible. [0028]
  • As described above, when the bird's-eye view display is performed, the data amount of line data, plane background data and character data to be drawn per unit area is increased in a region which is far away from the view point (hereinafter referred to as “far-distance region”, and this region is a region having a small angle of depression). Accordingly, when the character data of these data are drawn while superposed on the other data, the line data and the plane background data are covered by the character data in the far-distance region, so that it may be difficult to check the shapes of roads, etc. in the far-distance region (seventh problem). [0029]
  • Therefore, the drawing judgment means of the present invention preferably eliminates, from the targets to be drawn (displayed), those character strings which are obtained through the perspective projection of the map and which are located at positions higher than a predetermined display height from the bottom side of the display frame. Alternatively, the drawing judgment means may eliminate those character strings which are defined at positions farther than a fixed distance from the view point. As described above, the character strings are not displayed (drawn) in a region which is located above the predetermined height or is more than a predetermined distance away, that is, in a region where the data amount to be drawn per unit area is considerably increased due to the reduction in the angle of depression in the bird's-eye view display mode. Therefore, a road display, a plane background display, etc. are not covered by the character string display, so that the recognition of roads in the far-distance regions can be prevented from being impaired even in the bird's-eye view display mode. [0030]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing an example of a bird's-eye view display drawn (displayed) according to the present invention; [0031]
  • FIG. 2 is a diagram showing a navigation system of an embodiment according to the present invention; [0032]
  • FIG. 3 is a block diagram showing a hardware construction of a processing unit; [0033]
  • FIG. 4 is a functional block diagram showing the processing unit; [0034]
  • FIG. 5 is a functional block diagram showing map drawing means; [0035]
  • FIG. 6 is a diagram showing a perspective projection processing of a map; [0036]
  • FIG. 7 is a diagram showing a series of coordinate transforming steps for bird's-eye view display; [0037]
  • FIG. 8 is a flowchart for a processing flow of the coordinate transforming means; [0038]
  • FIGS. 9A, 9B and 9C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display; [0039]
  • FIGS. 10A, 10B and 10C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display; [0040]
  • FIG. 11 is a flowchart for a flow of the perspective projection processing; [0041]
  • FIG. 12 is a flowchart showing a processing flow of drawing (display) judgment means; [0042]
  • FIGS. 13A and 13B are diagrams showing an effect of a rearrangement operation of character strings; [0043]
  • FIGS. 14A and 14B are diagrams showing an effect of a drawing (display) character string selecting operation; [0044]
  • FIGS. 15A, 15B and 15C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display; [0045]
  • FIGS. 16A, 16B and 16C are diagrams showing a setting method of a view point and a projection plane for bird's-eye view display; and [0046]
  • FIG. 17 is a perspective view showing the outlook of the navigation system of the embodiment. [0047]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A preferred embodiment according to the present invention will be described with reference to the accompanying drawings. In the following embodiment, the navigation system is mounted on a vehicle such as a car or the like, but the same effect can be obtained when the navigation system is mounted on other moving bodies such as a ship, a train, etc. [0048]
  • FIG. 1 shows a bird's-eye view which is displayed by a bird's-eye view map display apparatus installed in a navigation system of an embodiment according to the present invention. The bird's-eye view map display apparatus of this embodiment forms, as a projection drawing of two-dimensional map data (represented by reference numeral 101 in FIG. 1), a bird's-eye view showing a landscape as seen through a bird's eye, and displays it on the display screen of a display. Here, the bird's-eye view is defined as a view of a landscape obtained as if it were viewed through the eyes of a bird flying at a specific position high above the ground. [0049]
  • In FIG. 1, reference numeral 102 represents a bird's-eye view. The bird's-eye view 102 contains a route 104 which is emphatically displayed to indicate a travel course (the emphasis is made with a bold line as shown in FIG. 1, but it may also be made using a flashing display or discoloration), and reference numeral 105 represents a mark (symbol) indicating the current position. In FIG. 1, the arrows (indicated by solid lines and dotted lines) represent the projection from the two-dimensional map data 101 to the bird's-eye view 102. [0050]
  • FIG. 17 is a perspective view showing the outlook of the navigation system for a vehicle according to the embodiment. The navigation system of this embodiment is installed in a housing 69, and the housing 69 is equipped with the display screen of the display 2, a scroll key unit 66, a scale changing key unit 67, a projection angle changing key unit 68, a touch panel 70 provided on the display screen of the display 2, etc. [0051]
  • The scroll key unit 66 is responsive to a scroll instruction for an image displayed on the display screen, and has four direction-instruction keys 660 which are responsive to shift-up, shift-down, shift-to-right and shift-to-left instructions, respectively. The scale changing key unit 67 is responsive to an instruction for changing the scale of a map displayed on the screen, and has two scale instructing keys 67a and 67b which are responsive to scale-up and scale-down instructions, respectively. The projection angle changing key unit 68 is responsive to an instruction to change the angle of the projection plane of a bird's-eye view to be displayed on the screen, and has two angle instruction keys 68a and 68b which are responsive to projection-angle increasing and decreasing instructions, respectively. The touch panel 70 serves as an input means which detects a touch (contact) on its surface and outputs the touched position. The keys 66 to 68 may be designed as software keys which are triggered through a touch to a predetermined region on the touch panel 70. [0052]
  • As shown in FIG. 2, the navigation system for a vehicle according to this embodiment includes a processing unit 1, a display 2 which is connected to the processing unit 1 through a signal line S1, a map storage device 3 which is connected to the processing unit 1 through a signal line S2, a voice input/output device 4 which is connected to the processing unit 1 through a signal line S3, an input device 5 which is connected to the processing unit 1 through a signal line S4, a wheel speed sensor 6 which is connected to the processing unit 1 through a signal line S5, an attitude (geomagnetic) sensor 7 which is connected to the processing unit 1 through a signal line S6, a gyro 8 which is connected to the processing unit 1 through a signal line S7, a GPS (Global Positioning System) receiver 9 which is connected to the processing unit 1 through a signal line S8, and a traffic information receiver 10 which is connected to the processing unit 1 through a signal line S9. Each of the signal lines S1 to S9 may be either a wired type or a wireless type, insofar as it can transmit signals. In this embodiment, the wired type is used. [0053]
  • The processing unit 1 is a central unit for performing various kinds of processing. For example, it detects the current position on the basis of information output from the various sensors 6 to 9, and reads desired map information from the map storage device 3 on the basis of the obtained current position information. In addition, it graphically develops the map data to display them overlapped with a current-position mark, and selects the optimum road connecting the current position and a destination instructed by the user, to inform the user of the optimum course with voice or a graphic display. [0054]
  • The display 2 serves to display the graphic information generated in the processing unit 1, and it comprises a CRT or a liquid crystal display. RGB (Red, Green, Blue) signals or NTSC (National Television System Committee) signals are normally transmitted along the signal line S1 between the processing unit 1 and the display 2, as in a normal system. [0055]
  • The map storage device 3 includes a large-capacity storage medium such as a CD-ROM (Compact Disk-Read Only Memory) or an IC (Integrated Circuit) card, and it performs a read-out operation of reading out data held in the large-capacity storage medium in response to an instruction of the processing unit 1 and supplying the data to the processing unit 1, and a write-in operation of writing data supplied from the processing unit 1 into the large-capacity storage medium. [0056]
  • The voice input/output device 4 serves to convert a message produced in the processing unit 1 to a voice message for the user, and it also serves to receive a voice input, recognize its content and transmit the information to the processing unit 1. [0057]
  • The input device 5 serves to receive instructions input externally, and in this embodiment it includes the scroll key unit 66, the scale changing key unit 67, the projection angle changing key unit 68 and the touch panel 70. The input device 5 of the present invention is not limited to the above construction, and another input means such as a joystick, a keyboard, a mouse, a pen input device or the like may be used. [0058]
  • The navigation system of this embodiment has various sensors: a wheel speed sensor 6 for measuring the distance on the basis of the product of the circumferential length of the wheels of a vehicle and the detected number of rotations of the wheels, and for measuring a turn angle of the vehicle on the basis of the difference in the number of rotations between the paired wheels; an attitude (geomagnetic) sensor 7 for detecting the earth's magnetic field to measure the direction in which the vehicle is oriented; a gyro 8, such as an optical fiber gyro or a vibrational gyro, for measuring the rotational angle of the vehicle; and a GPS receiver 9 for receiving signals from three or more GPS satellites to measure the distance between the vehicle and each of the GPS satellites, and the variation of that distance, thereby measuring the current position, the travel direction and the travel azimuth (bearing) of the vehicle. The sensors of the present invention are not limited to the above sensors. For example, in a navigation system mounted in a ship, a Doppler sonar is used to detect the speed of the ship, and any sensor may be suitably selected in accordance with the intended use. [0059]
  • Further, the navigation system of this embodiment has a traffic information receiver 10 for receiving signals from beacon transmitters or FM (Frequency Modulation) broadcasting stations which transmit traffic information such as information on traffic jams, road repairs and suspension of traffic, and information on parking areas. [0060]
  • FIG. 3 shows the hardware construction of the processing unit 1 described above. The processing unit 1 includes a CPU (Central Processing Unit) 21 for performing numerical operations and controlling the devices 22 to 31, a RAM (Random Access Memory) 22 for holding maps and processing data, a ROM (Read Only Memory) 23 for holding programs, a DMA (Direct Memory Access) 24 for performing data transmission at high speed between memories, and between memories and the respective devices, a drawing controller 25 for performing graphics drawing at high speed when developing vector data into an image, and for performing display control, a VRAM (Video Random Access Memory) 26 for storing graphic image data, a color palette 27 for converting the image data to RGB signals, an A/D (Analog/Digital) converter 28 for converting analog signals to digital signals, an SCI (Serial Communication Interface) 29 for converting serial signals to parallel signals synchronized with a bus, an I/O (Input/Output) device 30 for synchronizing serial signals with parallel signals and then sending the synchronized signals onto the bus, and a counter 31 for integrating pulse signals. These devices 21 to 31 are connected to one another through the bus. [0061]
  • Next, the functional construction of the processing unit 1 will be described with reference to FIG. 4. [0062]
  • As shown in FIG. 4, the processing unit 1 includes user's operation analyzing means 41, route calculating means 42, route guiding means 43, map drawing means 44, current position calculating means 45, map match processing means 46, data read-in processing means 47, menu drawing means 48, and graphics processing means 49. In this embodiment, each of these means 41 to 49 is implemented through the execution, by the CPU 21, of instructions stored in the ROM 23. However, the present invention is not limited to this implementation, and each of the means may be implemented using a hardware construction such as a dedicated circuit or the like. [0063]
  • The current position calculating means 45 serves to time-integrate the distance data and angle data which are obtained by integrating the distance pulse data measured by the wheel speed sensor 6 and the angular velocity data measured by the gyro 8, to calculate the current position (X′, Y′) of the vehicle after travel on the basis of an initial position (X, Y). Here, the current position calculating means 45 corrects the absolute azimuth of the vehicle travel direction on the basis of the azimuth data obtained by the attitude sensor 7 and the angle data obtained by the gyro 8, so that the angle by which the vehicle has turned matches the vehicle travel azimuth. When the data obtained by the sensors are integrated as described above, the errors of the sensors accumulate. Therefore, the current position calculating means 45 also performs correction processing at a predetermined time interval (every one second in this embodiment) so that the accumulated errors are cancelled on the basis of the positional data obtained by the GPS receiver 9, and it outputs the corrected data as the current position information. [0064]
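  • The integration described above can be sketched minimally as follows; the sensor scaling, the attitude-sensor blending and the periodic GPS correction are omitted, and all names are illustrative assumptions.

```python
import math

def dead_reckon(x, y, heading, pulses, dist_per_pulse, yaw_rate, dt):
    # Wheel-speed pulses give the distance travelled in this interval;
    # the gyro's angular velocity gives the change of heading.
    d = pulses * dist_per_pulse
    heading += yaw_rate * dt
    # Accumulate onto the previous fix (X, Y) to obtain (X', Y').
    return x + d * math.cos(heading), y + d * math.sin(heading), heading
```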
  • As described above, the current position information thus obtained still contains minute errors due to the sensors. Therefore, in order to enhance the positional precision of the navigation system, map match processing is performed by the map match processing means 46. In this processing, the data of roads contained in the map around the current position, which are read in by the data read-in processing means 47, are compared with the travel locus obtained by the current position calculating means 45, to match the current position to the road which has the highest correlation in shape. In many cases, the current position can be matched to the travel road through the map match processing, so that the current position information can be output with high precision. [0065]
  • The user's operation analyzing means 41 analyzes a user's demand or instruction which is input through the input device 5, and controls the units 42 to 48 so as to perform the processing corresponding to the demand (instruction). For example, when a demand for route guidance to a destination is input, the user's operation analyzing means 41 instructs the map drawing means 44 to display a map for setting the destination, then instructs the route calculating means 42 to determine a route extending from the current position to the destination, and instructs the route guiding means 43 to supply the route guidance information to the user. [0066]
  • The route calculating means 42 searches for a route connecting two specified points by using the Dijkstra method or the like, to obtain the route having the highest priority. The route calculating means 42 has plural criteria for the priority order, and it uses the criterion indicated by the user's instruction to determine the route. In this embodiment, in accordance with the instruction, there can be obtained a route providing the shortest distance between the two points, a route along which the vehicle can reach the destination in the shortest time, a route whose cost is lowest, or the like. [0067]
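  • For illustration, a minimal Dijkstra search over a link graph, with the priority criterion supplied as a cost function; the graph representation and all names are assumptions, not this embodiment's data structures.

```python
import heapq

def best_route(graph, start, goal, cost):
    # graph: node -> iterable of (neighbor, link) pairs.  cost(link) encodes
    # the active criterion (distance, expected time, toll, ...), so changing
    # the user's priority merely swaps the cost function.
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, link in graph.get(node, ()):
            nd = d + cost(link)
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor], prev[neighbor] = nd, node
                heapq.heappush(heap, (nd, neighbor))
    path = [goal]  # walk the predecessor chain back (assumes goal reachable)
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```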
  • The route guiding means 43 compares the link information of the guide route obtained by the route calculating means 42 with the current position information obtained by the current position calculating means 45 and the map match processing means 46. Before the vehicle passes through a crossing or the like, it outputs direction-instructing information to the voice input/output device 4 indicating whether the vehicle should go straight or turn to the right or left, so that the information is output as a voice message, or a direction in which the vehicle should travel is drawn on the map displayed on the screen of the display 2. [0068]
  • The data read-in processing means 47 operates to prepare for the read-in operation of the map data of a desired region from the map storage device 3. The map drawing means 44 receives from the data read-in processing means 47 the map data around the position for which display is instructed, and it operates to transmit to the graphics processing means 49 a command for drawing the specified objects on the specified reduction scale with the specified direction set as the up-direction. [0069]
  • The menu drawing means 48 receives a command output from the user's operation analyzing means 41, and transmits to the graphics processing means 49 a command for drawing the various menus demanded. [0070]
  • The graphics processing means 49 receives the drawing commands generated by the map drawing means 44 and the menu drawing means 48, and develops an image in the VRAM 26. [0071]
  • Next, the function of the map drawing means 44 will be described with reference to FIG. 5. As shown in FIG. 5, the map drawing means 44 includes initial data clip means 61, coordinate transforming means 62, drawing judgment means 63, data clip means 64 and drawing command issuing means 65. [0072]
  • The initial data clip means 61 performs clip processing to select the road data, plane and line background data and character data of a region which are required for subsequent processing, from each mesh of the map data read out from the map storage device 3 by the data read-in processing means 47, and supplies the selection result to the coordinate transforming means 62. In this embodiment, the initial data clip means 61 is actuated at the time interval (period) for renewing the drawn map (for example, every 0.5 seconds), or it is actuated in accordance with an instruction from the user's operation analyzing means 41. [0073]
  • As algorithms for the clip processing, the Cohen-Sutherland line clip algorithm is used for the road data and the line data, and the Sutherland-Hodgman polygon clip algorithm is used for the plane data (Foley, van Dam, Feiner, Hughes: Computer Graphics, Addison-Wesley Publishing Company, pp. 111-127). With this processing, the amount of data which will subsequently be subjected to the coordinate transformation or the drawing processing can be reduced, so that a higher processing speed can be expected. [0074]
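  • For reference, a straightforward implementation of the Cohen-Sutherland line clip named above (a standard textbook algorithm, not code taken from this embodiment):

```python
INSIDE, LEFT, RIGHT, BOTTOM, TOP = 0, 1, 2, 4, 8  # region outcodes

def _outcode(x, y, xmin, ymin, xmax, ymax):
    code = INSIDE
    if x < xmin:
        code |= LEFT
    elif x > xmax:
        code |= RIGHT
    if y < ymin:
        code |= BOTTOM
    elif y > ymax:
        code |= TOP
    return code

def clip_line(x0, y0, x1, y1, xmin, ymin, xmax, ymax):
    """Clip segment (x0,y0)-(x1,y1) to the rectangle; None if fully outside."""
    c0 = _outcode(x0, y0, xmin, ymin, xmax, ymax)
    c1 = _outcode(x1, y1, xmin, ymin, xmax, ymax)
    while True:
        if not (c0 | c1):   # both endpoints inside: trivially accept
            return x0, y0, x1, y1
        if c0 & c1:         # both share an outside zone: trivially reject
            return None
        c = c0 or c1        # pick an endpoint that lies outside
        if c & TOP:
            x = x0 + (x1 - x0) * (ymax - y0) / (y1 - y0); y = ymax
        elif c & BOTTOM:
            x = x0 + (x1 - x0) * (ymin - y0) / (y1 - y0); y = ymin
        elif c & RIGHT:
            y = y0 + (y1 - y0) * (xmax - x0) / (x1 - x0); x = xmax
        else:               # LEFT
            y = y0 + (y1 - y0) * (xmin - x0) / (x1 - x0); x = xmin
        if c == c0:
            x0, y0, c0 = x, y, _outcode(x, y, xmin, ymin, xmax, ymax)
        else:
            x1, y1, c1 = x, y, _outcode(x, y, xmin, ymin, xmax, ymax)
```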
  • The coordinate transforming means 62 performs processing for transforming each coordinate value of the map data obtained in the clip processing, such as enlargement/reduction processing, rotation processing and projection processing of the map data. The coordinate transforming means 62 is actuated in accordance with an instruction from the initial data clip means 61 or the user's operation analyzing means 41. [0075]
  • The drawing judgment means 63 operates to select those data which are contained in the map data obtained by the coordinate transforming means 62 and are actually required to be drawn. For example, when the reduction scale is large, the drawing judgment means 63 operates to omit narrow roads or omissible place names because of the substantial increase in the amount of data to be drawn. With this operation, the drawing processing speed can be prevented from being significantly reduced. The drawing judgment means 63 is actuated in accordance with an instruction from the coordinate transforming means 62 or the user's operation analyzing means 41. [0076]
  • The data clip means 64 operates to select, through the clip processing, the map data of the drawing region from the map data obtained by the drawing judgment means 63. The same algorithms as those of the initial data clip means may be used by the data clip means 64. The data clip means 64 is actuated in accordance with an instruction from the drawing judgment means 63 or the user's operation analyzing means 41. The data clip means 64 may be omitted. [0077]
  • The drawing command issuing means 65 operates to issue to the graphics processing means 49 commands for drawing lines, polygons, characters, etc., and commands for setting patterns, so as to draw the roads, planes, lines, background data, character data, etc. with the indicated colors or patterns. The drawing command issuing means 65 is actuated in accordance with an instruction from the data clip means 64. [0078]
  • In this embodiment, the initial data clip means 61 is actuated at the time interval (period) for renewing a drawn map (for example, every 0.5 seconds), and it transmits the data obtained in the clip processing to the coordinate transforming means 62. Accordingly, in this embodiment the drawing operation is performed every 0.5 seconds, except when an instruction is supplied from an external operation. [0079]
  • Next, the outline of the bird's-eye view display will be described with reference to FIG. 6. [0080]
  • When a map is displayed, a printed map table or a conventional navigation system provides a plan-view map display in which the ground is viewed from an infinite point. The plan-view map display has an advantage that the scale is fixed over any position on the same display frame, so that the user can easily get the sense of distance. However, when two points are required to be displayed on the same display frame, it is necessary to perform an adjustment operation of optimizing the scale. In addition, when the two points are far away from each other, only limited information is displayed because the information amount to be displayed at the same time is restricted by the size and precision of the display. This problem can be avoided by using the bird's-eye view display. [0081]
  • The bird's-eye view display can be achieved by the perspective projection processing in which two-dimensional or three-dimensional map information 53 of a plane A is projected onto a plane B intersecting the plane A at an angle θ, to thereby obtain projection information 54. [0082]
  • In the schematic diagram of FIG. 6, a point 55 of coordinates (Tx, Ty, Tz) is set as the view point, and a map 53 (represented by a rectangle having vertexes a, b, c and d) on the plane A (illustrated by a rectangle 51) is projected onto the plane B (illustrated by a rectangle 52) which intersects the plane A at an angle θ, to obtain a projection 54 (represented by a rectangle having vertexes a′, b′, c′ and d′). The points a, b, c and d of the map 53 are projected to the points a′, b′, c′ and d′, respectively. In this specification, the view point 55, which is the origin of the visual-point coordinate system in the projection processing, is merely referred to as the “view point”. Further, in this specification, the term “view point” does not mean a view point corresponding to the position of the user's eyes. [0083]
  • In the bird's-eye view display, information close to the view point 55 (for example, the line ab) is enlarged, and information far away from the view point (for example, the line cd) is reduced. Accordingly, when two points are displayed on the same display frame, they are displayed so that the point for which more detailed information is desired is located near the view point, while the other point is located far away from it. In this case, the positional relationship (interval distance, etc.) between the two points can be easily recognized, and a large amount of information on the periphery of the view point can be provided to the user. [0084]
  • According to this embodiment, since two-dimensional map data are usable as the map information for the bird's-eye view display, the bird's-eye view display can be realized in a conventional navigation system with no additional new map data, merely by adding means for performing the perspective projection processing as described above. In this embodiment, the perspective projection processing is performed by the coordinate transforming means 62. Further, various means of implementation, as described later, are preferably used when the bird's-eye view display is performed. [0085]
  • First, a basic method of realizing the bird's-eye view display will be described with reference to FIG. 7. [0086]
  • The coordinate transforming means 62 sets the position of the view point, and determines a viewing direction from the view point and a projection angle (the angle θ at which the projection plane B intersects the plane A in the case of FIG. 6) (step 1). In this step, the region which is to be displayed in the bird's-eye view mode is determined. When a bird's-eye view is displayed on a rectangular display frame (screen), a near-distance region close to the view point is narrowed, and a far-distance region far away from the view point is broadened. Therefore, the map data to be finally drawn correspond to a trapezoidal region 72 in a map mesh 71. [0087]
  • Next, the coordinate transforming means 62 uses the initial data clip means 61 to extract the map data of a circumscribed rectangular region 73 of the trapezoidal region 72 to be actually drawn, from the map mesh data 71 containing the region to be displayed in the bird's-eye view mode (step 2). [0088]
  • Subsequently, the coordinate transforming means 62 enlarges or reduces the extracted data, and then subjects the data to the affine transformation so as to erect the trapezoidal region. Further, it performs data conversion on each coordinate value of the map data by using the perspective projection processing (step 3). At this time, the coordinate transformation based on the affine transformation is represented by the following equation (1), where φ represents the rotational angle of the map, (x, y) represents the coordinate value of the map data before the affine transformation, and (x′, y′) represents the coordinate value of the map data after the affine transformation: [0089]
  • x′=x·cos φ−y·sin φ, y′=x·sin φ+y·cos φ   (1)
  • The coordinate transformation in the perspective projection processing is represented by the following equations (2) and (3), where (Tx, Ty, Tz) represents the position coordinate of the view point, θ represents the intersection angle between the planes A and B, (x, y) represents the coordinate value of the map data before the transformation, and (x′, y′) represents the coordinate value of the map data after the transformation: [0090]
  • x′=(Tx+x)/(Tz+y·sin θ)   (2)
  • y′=(Ty+y·cos θ)/(Tz+y·sin θ)   (3)
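  • Equations (1) to (3) translate directly into code. The following minimal sketch is a plain transcription of them; the function names and the example values are illustrative.

```python
import math

def rotate(x, y, phi):
    # Equation (1): rotate the map coordinates by the angle phi.
    return (x * math.cos(phi) - y * math.sin(phi),
            x * math.sin(phi) + y * math.cos(phi))

def perspective(x, y, theta, tx, ty, tz):
    # Equations (2) and (3): project a point of the map plane A onto the
    # projection plane B intersecting A at the angle theta, as seen from
    # the view point (Tx, Ty, Tz).
    s = tz + y * math.sin(theta)
    return (tx + x) / s, (ty + y * math.cos(theta)) / s

# Example: rotate one map node by 30 degrees, then project it.
x, y = rotate(1200.0, 3400.0, math.radians(30))
sx, sy = perspective(x, y, math.radians(35), 0.0, 1945.0, 7780.0)
```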
  • In step 3, the drawing target region, which was the trapezoid 72 in step 2, is converted to the rectangular region 74, and the circumscribed rectangle 73 of the trapezoid 72 is coordinate-transformed to a rectangle 75 which is circumscribed around the rectangle 74. It is not necessary to draw the regions other than the drawing target region 74 in the rectangle 75. Therefore, the coordinate transforming means 62 uses the data clip means 64 to perform the clip processing on the regions other than the rectangular region 74 which is the drawing target, thereby removing these regions (step 4). [0091]
  • The map data thus obtained are transmitted to the drawing command issuing means 65. The drawing command issuing means 65 generates a drawing command on the basis of the transmitted map data and supplies the drawing command to the graphics processing means 49. The graphics processing means 49 prepares the drawing data, stores the data into the VRAM 26, and instructs the display 2 to display the data. The display 2 displays the drawing data held in the VRAM 26 on the screen thereof. With the above operation, the bird's-eye view 102 shown in FIG. 1 is displayed on the screen of the display 2. [0092]
  • The bird's-eye view display facilitates recognition of the directional and positional relationship between any two points. In addition, the bird's-eye view display is suitable for use in a case where detailed information on the periphery of one point is required whereas rough information on the peripheries of the other points is sufficient. As described above, when detailed information or rough information on the periphery of a specific point is supplied, the coordinate of the specific point may be predetermined. Alternatively, the position may be set by determining a coordinate which satisfies a predetermined condition, or the position of the point may be input externally. [0093]
  • According to the navigation system, in many cases the current position is set to one of the above two points. The position information of the current position is obtained by the current position calculating means 45 or the map match processing means 46. Therefore, when the current position is indicated as either the detailed information display position or the rough information display position through the input device 5, the map drawing means 44 uses the coordinate of the current position which is obtained by the current position calculating means 45 or the map match processing means 46. [0094]
  • However, when an instruction of a position on the map displayed on the screen is received and the instructed position is set as either the detailed information display position or the rough information display position, in many cases the user performs a scroll display on the map to search for the position for which the detailed information or rough information is displayed. The scroll speed in the bird's-eye view display is low because the overall drawing operation must be repeated every time the region to be displayed is changed. Accordingly, when the scroll display is used very frequently, the bird's-eye view display mode is inconvenient. [0095]
  • In order to solve this problem, this embodiment adopts the processing shown in FIG. 8 in the coordinate transformation. [0096]
  • First, the coordinate transforming means 62 performs an enlarging or reducing operation for a region to be processed (step 1010) if the map is required to be enlarged or reduced (step 1000), and also performs a rotational transformation processing (step 1030) if the map is required to be rotated (step 1020). The rotational transformation processing (step 1030) contains a rotational angle calculating operation (step 1031) and an affine transforming operation (step 1032). [0097]
  • Subsequently, the coordinate transforming means 62 judges whether the bird's-eye view display is to be performed (step 1040). Here, for example when an instruction to search for any point on the map is input through the input device 5, or when an instruction for display of the plan-view map is input through the input device 5, the coordinate transforming means 62 judges in the bird's-eye view display judgment (step 1040) that the perspective projection is not to be performed, and finishes the coordinate transformation processing without performing the perspective projection processing (step 1050). In this case, the plan-view map is displayed on the screen of the display 2 because the perspective projection processing is not performed. [0098]
  • According to this embodiment, when any point on the map is searched for, the map is not displayed in the bird's-eye view display mode, but displayed in the plan-view display mode, so that the search target point can be rapidly searched for. [0099]
  • Further, for example when an instruction for displaying the bird's-eye view is input through the input device 5, or when any point is instructed as a destination through the input device 5 during the plan-view display, the coordinate transforming means 62 which receives these instructions through the user's operation analyzing means 41 judges, in the bird's-eye view display judgment in step 1040, that the perspective projection processing should be performed, and thus it performs the perspective projection processing (step 1050). [0100]
  • The perspective projection processing (step 1050) contains a projection angle calculation processing (step 1051), a projection-plane position calculation processing (step 1052) and a perspective projection operation (step 1053). With this processing, the map data which are notified to the drawing command issuing means 65 are converted to data for the bird's-eye view, and thus the map data which are prepared by the graphics processing means 49 and stored in the VRAM 26 also become data for the bird's-eye view. Accordingly, the map displayed on the display 2 is changed from the plan-view map to the bird's-eye view map. [0101]
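  Reusing the helpers from the earlier sketch, the decision flow of FIG. 8 can be summarized roughly as follows; the keyword arguments are hypothetical stand-ins for the instructions that the user's operation analyzing means 41 delivers.

```python
def coordinate_transform(points, scale=None, phi=None,
                         birds_eye=False, theta=None, plane=None):
    """Control flow of FIG. 8: optional enlargement/reduction (steps
    1000/1010), optional rotation (steps 1020/1030), then the bird's-eye
    view judgment (step 1040) deciding whether the perspective projection
    (step 1050) runs; skipping it leaves a plan-view map."""
    if scale is not None:                 # steps 1000 and 1010
        points = [(x * scale, y * scale) for x, y in points]
    if phi is not None:                   # steps 1020 and 1030
        points = affine_rotate(points, phi)
    if birds_eye:                         # steps 1040 and 1050
        tx, ty, tz = plane
        points = perspective_project(points, tx, ty, tz, theta)
    return points
```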
  • As described above, the map can be freely changed from the bird's-eye view map to the plan-view map or from the plan-view map to the bird's-eye view map. Therefore, according to this embodiment, an easily-understandable map display can be provided. [0102]
  • Further, when the display frame is switched from the plan-view frame to the bird's-eye view frame in the above processing, or when the switching from the plan-view map to the bird's-eye view map or from the bird's-eye view map to the plan-view map is instructed through the input device 5 or the like, it would be difficult for the user to recognize the map if the switching operation between the plan-view map and the bird's-eye view map were rapidly performed, because the display style of the map varies greatly on the screen. Therefore, it is preferable that the shift between the plan-view map and the bird's-eye view map is performed gradually. [0103]
  • In this embodiment, when the display frame switching operation from the plan-view map to the bird's-eye view map is required, the projection angle θ is gradually increased from 0° with a time lapse (every 0.5 seconds in this embodiment) during the projection angle θ calculation (step 1051) in the perspective projection processing (step 1050), and the increase of the projection angle is continued until the projection angle reaches a target projection angle. In this case, the increment unit of the projection angle is preferably set to a constant value. [0104]
  • As described above, the plan-view display is smoothly shifted to the bird's-eye view display by performing the perspective projection processing (step 1053) while increasing the projection angle with the lapse of time, so that the user can easily grasp the positional relationship between the places (spots) in the plan-view map and the bird's-eye view map. When the switching operation from the bird's-eye view display to the plan-view display is performed, the same effect can be obtained by gradually reducing the projection angle from an initially-set projection angle to 0° in time series. [0105]
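  A minimal sketch of this gradual transition, assuming one redraw per 0.5-second tick and an illustrative 5° increment (the text requires only that the increment be constant):

```python
import math

def transition_angles(target_deg, step_deg=5.0):
    """Yield projection angles for a smooth plan-view to bird's-eye view
    shift: theta grows from 0 deg by a constant increment per redraw tick
    until the target projection angle is reached."""
    theta = 0.0
    while theta < target_deg:
        theta = min(theta + step_deg, target_deg)
        yield math.radians(theta)

# One frame would be drawn per 0.5 s tick; iterating the sequence in
# reverse gives the bird's-eye view to plan-view shift back to 0 deg.
for theta in transition_angles(40.0):
    pass  # redraw here, e.g. perspective_project(..., theta=theta)
```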
  • Next, a method of determining the intersection angle θ (projection angle) between the map and the projection plane, which is a perspective projection parameter in the bird's-eye view display, and the coordinate of the origin (Tx, Ty, Tz) of a view point coordinate system containing the projection plane, which is viewed from an object coordinate system containing the plane of the map, that is, the position of the projection plane, will be described with reference to FIGS. 8 and 9A-9C. [0106]
  • It is generally desired for the navigation system to display the place at which the user's vehicle runs, that is, the periphery of the current position of the vehicle, in more detail. Therefore, the case where the bird's-eye view is displayed such that the current position is located at the central lower side of the screen as shown in FIG. 9C will first be described. In this case, the area in front of the current position of the vehicle in the vehicle travel direction and the road map along which the vehicle has travelled until now are displayed in the bird's-eye view display mode. Reference numeral 92 in FIG. 9A represents a visual field (corresponding to a range to be drawn and displayed in the bird's-eye view display mode) in a map mesh 91, and the view point position and the projection angle to obtain a bird's-eye view are shown in FIG. 9B. In FIG. 9C, the current position and the vehicle travel direction in the bird's-eye view display are shown. Arrows indicated by dotted lines in FIGS. 9A and 9C represent the vehicle travel direction. [0107]
  • In order to achieve the bird's-eye view display as shown in FIG. 9C, the coordinate transforming means 62 first performs a desired enlargement/reduction processing (steps 1000 and 1010), and then judges that the rotation is needed (step 1020) and determines an intersection angle φ between the vector of the travel direction and the bottom side of the map mesh as shown in FIG. 9A (step 1031). Further, the coordinate transforming means 62 performs the affine transformation on the map data to rotate the data by the angle φ (step 1032). [0108]
  • Since it is judged in step 1040 that the bird's-eye view display is to be performed, the coordinate transforming means 62 performs the processing of determining the projection angle and the view point position (steps 1051 and 1052). [0109]
  • If the projection angle θ is set to a value close to 0°, the difference in scale between a near-distance point and a far-distance point from the view point is reduced. On the other hand, if the projection angle θ is set to a value close to 90°, the difference in scale between the near-distance point and the far-distance point is increased. In this embodiment, the projection angle θ is normally set to about 30° to 45°. [0110]
  • In this embodiment, the coordinate transforming means 62 changes the projection angle θ in accordance with an angle changing instruction (input through the projection angle changing key 68) which is supplied from the user's operation analyzing means 41. That is, upon detecting that the angle-increasing instruction key 68a has been pushed down, the user's operation analyzing means 41 instructs the coordinate transforming means 62 to increase the projection angle. Further, for every 0.5 seconds that the angle-increasing instruction key 68a is detected to continue to be pushed down after the first projection angle increasing instruction, the user's operation analyzing means 41 again instructs the coordinate transforming means 62 to increase the projection angle. In response to each projection angle increasing instruction, the coordinate transforming means 62 increases the projection angle θ by 5° (that is, the projection angle rises by 5°). The same action is taken for the projection angle reducing instruction. That is, for every 0.5 seconds that the angle-reducing instruction key 68b continues to be pushed down, the user's operation analyzing means 41 instructs the coordinate transforming means 62 to reduce the projection angle, and in response to this instruction the coordinate transforming means 62 reduces the projection angle θ by 5° (that is, the projection angle falls by 5°). [0111]
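  The key handling above might be sketched as follows; the clamp range is an assumption (the text fixes only the 5° step and the 0.5-second repeat period), and the key identifiers are illustrative.

```python
def adjust_projection_angle(theta_deg, key, held_seconds,
                            step_deg=5.0, repeat_s=0.5,
                            lo=0.0, hi=80.0):
    """Apply the projection angle changing key 68: +5 deg per 0.5 s the
    increase key 68a is held, -5 deg per 0.5 s the decrease key 68b is
    held, clamped to an assumed valid range."""
    repeats = 1 + int(held_seconds / repeat_s)  # first press plus repeats
    if key == "68a":    # increase: farther regions come into view
        theta_deg += step_deg * repeats
    elif key == "68b":  # decrease: the neighborhood fills the screen
        theta_deg -= step_deg * repeats
    return max(lo, min(hi, theta_deg))
```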
  • With the above operation, the user can freely set the map region to be displayed in the bird's-eye view display mode. Accordingly, when the increase of the projection angle is instructed, regions which are farther away from the current position are displayed on the map because the projection angle θ increases. On the other hand, when the reduction of the projection angle is instructed, regions in the neighborhood of the current position are displayed on the map because the projection angle θ decreases. [0112]
  • Subsequently, the coordinate transforming means 62 determines the position of the projection plane (Tx, Ty, Tz) so that the differential value (Δx, Δy, Δz) obtained by subtracting the position of the projection plane (Tx, Ty, Tz) from the current position (x, y, z) is equal to a fixed value at all times (step 1052). Further, the coordinate transforming means 62 sets, as absolute quantities, Δx to zero and Δz to a small value (when the map is displayed at a small scale) or a large value (when the map is displayed at a large scale), in accordance with the scale of the map. Normally, Δz may be selected so that the scale of the plan-view display is coincident with the scale of a point around the center of the bird's-eye view display. [0113]
  • The scale of the map is preferably changeable in accordance with the user's demand. In this embodiment, the coordinate transforming means 62 changes the scale of the map to be displayed in accordance with a scale changing instruction from the user's operation analyzing means 41 (which is input by the scale changing key 67). That is, upon detecting that the enlargement instruction key 67a has been pushed down, the user's operation analyzing means 41 outputs the enlargement instruction to the coordinate transforming means 62. Further, it instructs the enlargement to the coordinate transforming means 62 for every 0.5 seconds that the enlargement instruction key 67a is detected to continue to be pushed down after the first enlargement instruction. In response to the enlargement instruction, the coordinate transforming means 62 increases Δz by a predetermined value in step 1052. The same action is taken for the reduction instruction. The user's operation analyzing means 41 instructs the scale reduction to the coordinate transforming means 62 for every 0.5 seconds that the reduction instruction key 67b continues to be pushed down, and in response to this instruction the coordinate transforming means 62 reduces Δz by a predetermined value. [0114]
  • In this embodiment, Δy may be set to a negative value as shown in FIGS. 9A to 9C; however, it may also be set to a positive value as shown in FIGS. 15A to 15C. Further, according to this embodiment, the bird's-eye view can be displayed with no problem both when the view point position is at the front side of the current position and when it is at the rear side of the current position. FIGS. 15A to 15C are diagrams for obtaining a bird's-eye view of the same range as shown in FIGS. 9A to 9C, in which the view point is set to a different position. In FIG. 15A, reference numeral 152 represents a visual field (a range to be drawn and displayed) in a map mesh 151, and the view point position and the projection angle for obtaining a bird's-eye view are shown in FIG. 15B. Further, the current position and the vehicle travel direction in the obtained bird's-eye view are shown in FIG. 15C. Arrows indicated by dotted lines in FIGS. 15A and 15C represent the vehicle travel direction. [0115]
  • In this embodiment, the coordinate transforming means 62 determines the position of the view point (specifically, Δy) in accordance with the view point position (input by the touch sensor 70) instructed by the user's operation analyzing means 41. That is, upon detection of the touch sensor 70 being touched, the user's operation analyzing means 41 indicates the touch position to the coordinate transforming means 62. In response to this indication of the position information, the coordinate transforming means 62 sets Δy to such a value that the view point is coincident with the indicated position. [0116]
  • As described above, according to this embodiment, Δy can be set to any value within a broad range containing positive and negative values in accordance with the user's instruction. Accordingly, the position to be displayed in detail can be set more flexibly. [0117]
  • Finally, the coordinate transforming means 62 performs the perspective projection processing on each coordinate value of the map data by using the projection angle θ and the position of the projection plane (Tx, Ty, Tz) thus obtained (step 1053). Thereafter, the graphics processing means 49 performs the drawing processing with the obtained map data, whereby the map display is performed on the screen such that the travel direction is set to the up-direction at all times and the current position is displayed at the same point on the screen in the bird's-eye view display mode, as shown in FIG. 9C and FIG. 15C. [0118]
  • As described above, in the navigation system, it is generally desired that the point at which the user travels at present, that is, the periphery of the current position, is displayed in more detail. Therefore, in the navigation system of this embodiment, the current position is displayed at the central lower side of the screen as shown in FIGS. 9C, 10C, 15C and 16C. [0119]
  • According to the navigation system of this embodiment, an indication of a destination can be accepted through touching of the touch panel 70. Further, when the destination is indicated, the destination is drawn so as to be contained in the visual field (the range to be drawn on the display frame). At this time, the coordinate transforming means 62 determines the rotational angle φ so that the current position is located at the central lower side, and the destination is located at the central upper side (as shown in FIGS. 10C and 16C). [0120]
  • A bird's-eye view display method in the above case will be described with reference to FIG. 8, FIGS. 10A to 10C and FIGS. 16A to 16C. FIGS. 10A to 10C are diagrams showing a case of obtaining a bird's-eye view display in which both the current position and the destination are drawn on the same frame (field of view), and FIGS. 16A to 16C are diagrams showing a case of obtaining a bird's-eye view display of the same range as shown in FIGS. 10A to 10C, in which the view point is set to a different position. A field of view (range to be drawn and displayed) 162 is shown in a map mesh 161 in FIGS. 10A and 16A, the view point position and the projection angle for obtaining the bird's-eye view are shown in FIGS. 10B and 16B, and the current position and the position of the destination in the obtained bird's-eye view display are shown in FIGS. 10C and 16C. Arrows indicated by dotted lines represent the direction from the current position to the destination. [0121]
  • In the following description, the current position is displayed at the central lower side of the screen, and the destination is displayed at the central upper side of the screen. The same processing is performed in a case where the display positions of any two points are selectively set to the central lower and upper sides of the screen in accordance with an instruction of the display positions of the two points. [0122]
  • In order to achieve the bird's-eye view displays shown in FIGS. 10C and 16C, the coordinate transforming means 62 calculates the intersection angle φ between the bottom side of the map mesh and a line perpendicular to a line connecting the current position and the destination as shown in FIGS. 10A and 16A in step 1031, and performs the affine transformation on each coordinate value of the map data to be drawn by the angle φ in step 1032. [0123]
  • Since it is judged in step 1040 that the bird's-eye view display is performed, the coordinate transforming means 62 shifts to the processing for determining the projection angle θ and the view point position (steps 1051 and 1052). As described above, the initial value of the projection angle θ is a predetermined value in the range of 30° to 40°, and it is changed in accordance with an input operation of the projection angle changing key 68. Further, as described above, the initial position of the projection plane is determined so that the differential value obtained by subtracting the position coordinate of the projection plane from the coordinate of the current position is equal to a predetermined value, and it is changed in accordance with an input operation of the scale changing key 67. Further, the perspective projection of the coordinates is performed by using the transforming equations (2) and (3), and the affine transformation is performed by using the transforming equation (1). Further, the parameters Ty and Tz which indicate the position of the projection plane are calculated by substituting a suitable value, for example zero, into Tx, substituting the position coordinates and the display positions of the current position and the destination, and then solving a linear equation system, as sketched below. [0124]
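  A sketch of that linear system, assuming the rotation of step 1032 has already aligned the current position and the destination along the vertical axis of the screen so that only their y ordinates matter, and Tx is fixed to zero as suggested above; the function name and argument layout are illustrative.

```python
import math

def solve_plane_position(y_cur, y_dst, s_cur, s_dst, theta):
    """Solve for (Ty, Tz) so that the current position and the destination
    land on prescribed screen ordinates.  Rearranging equation (3),
        y' * (Tz + y*sin(theta)) = Ty + y*cos(theta)
    gives the linear equation  Ty - y'*Tz = y'*y*sin(theta) - y*cos(theta)
    for each point; two points give a 2x2 system (the two screen
    ordinates s_cur and s_dst must differ)."""
    s, c = math.sin(theta), math.cos(theta)
    # Rows for the unknowns (Ty, Tz) and the right-hand sides.
    a1, b1, r1 = 1.0, -s_cur, s_cur * y_cur * s - y_cur * c
    a2, b2, r2 = 1.0, -s_dst, s_dst * y_dst * s - y_dst * c
    det = a1 * b2 - a2 * b1
    ty = (r1 * b2 - r2 * b1) / det   # Cramer's rule
    tz = (a1 * r2 - a2 * r1) / det
    return ty, tz
```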
  • Next, the coordinate transforming means 62 performs the perspective projection processing (step 1053). That is, the coordinate transforming means 62 performs the perspective projection on each coordinate value of the map data on the basis of the projection angle θ and the position of the projection plane thus obtained. [0125]
  • The graphics processing means 49 performs the drawing processing with the obtained map data, whereby the current position and the destination can be displayed on the same display frame. Further, if the above processing is performed every time the current position varies, the current position and the destination can be displayed at the same positions on the same display frame even when the current position varies with time. Further, if the judgment of step 1040 is set to judge that the perspective projection processing is necessary at all times during the bird's-eye view display, the display of the screen is changed at all times in accordance with the time variation. Further, if the judgment of step 1040 is set to judge that the perspective projection processing is necessary only when the current position has shifted by a predetermined distance or more, only the mark 105 indicating the current position on the same bird's-eye view map is shifted until the current position has shifted by the predetermined distance, and the map display is changed to an enlarged map display as the current position shifts by the predetermined distance and approaches the destination. [0126]
  • If the above processing is performed every time the current position varies, and the current position and the position of the destination to be displayed on the screen are set to fixed positions, the map display can be gradually enlarged as the current position approaches the destination. [0127]
  • Further, if the projection angle θ is set to be reduced to a smaller value as the distance between the current position and the destination becomes shorter, the display mode is set to the bird's-eye view display to facilitate the understanding of the positional relationship between the current position and the destination when the current position and the destination are far away from each other, and is set to a display mode near to the plan-view display when the current position approaches the destination. Conversely, the projection angle θ may be gradually increased to a larger value as the distance between the current position and the destination becomes shorter. In this case, when the current position is far away from the destination, the display mode is set to the plan-view display so that the overall positional relationship can be easily understood. As the current position approaches the destination, the view point position is gradually heightened to enhance the user's sense of distance. [0128]
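  A sketch of the first variant, assuming a simple linear interpolation between two end-point angles; the text fixes only the monotone relationship between the remaining distance and the projection angle.

```python
def projection_angle_for_distance(dist, far_dist,
                                  theta_far=45.0, theta_near=0.0):
    """Shrink the projection angle toward the plan view as the vehicle
    closes in on the destination; swapping theta_far and theta_near
    yields the opposite behaviour described in the text."""
    t = max(0.0, min(1.0, dist / far_dist))
    return theta_near + t * (theta_far - theta_near)
```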
  • Next, a method of drawing character data in the bird's-eye view display will be described. First, the details of the perspective projection processing (step 1053) of the coordinate transforming means 62 will be described with reference to FIG. 11. [0129]
  • The coordinate transforming means 62 judges whether the input map data are plane data (step 1100). If the input data are plane data, the coordinate transforming means 62 performs the perspective projection (step 1101) on each node coordinate value of the given plane data by using the parameters obtained by the projection angle θ calculation (step 1051) and the view point position calculation (step 1052). The perspective projection processing of the node (step 1101) is repeated until the processing on all input nodes is completed (step 1102). [0130]
  • Subsequently, the coordinate transforming means 62 subjects line data to the same processing as the plane data, to perform the perspective projection on the coordinate value of each node (steps 1103 to 1105). [0131]
  • When the processing on the line data is completed, the coordinate transforming means 62 judges whether the input map data are character data (step 1106). If the input data are judged to be character data, the coordinate transforming means 62 performs the perspective projection on the coordinate values of the drawing start points of the given character strings (step 1107) by using the projection angle θ and the coordinate value of the view point position which are obtained in steps 1051 and 1052. In this embodiment, no perspective projection is performed on a character image. The perspective projection processing of the node in step 1107 is repeated until the processing on all the input nodes is completed (step 1108). [0132]
  • When the graphics drawing is performed with the perspective projection result thus obtained, such a bird's-eye view map as the map 103 shown in FIG. 1 is obtained. According to this embodiment, since no perspective projection is carried out on the character image in step 1107 as described above, all the characters are drawn at the same size so as to be erect on the map, as shown by the map 103 of FIG. 1. [0133]
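  A sketch of this label handling, assuming character strings arrive as (text, x, y) records: only the drawing start point passes through equations (2) and (3), and the glyphs themselves are later rasterized without any perspective distortion.

```python
import math

def project_label_anchors(labels, tx, ty, tz, theta):
    """Step 1107: perspective-project only the drawing start point of
    each character string; the character image is left untouched so
    every label is drawn erect and at the same size."""
    s, c = math.sin(theta), math.cos(theta)
    out = []
    for text, x, y in labels:
        w = tz + y * s                    # common perspective divisor
        out.append((text, (tx + x) / w, (ty + y * c) / w))
    return out

# Each (text, x', y') record is then handed to an ordinary 2D
# text-drawing routine.
```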
  • Next, the processing of the drawing judgment means 63 will be described with reference to FIG. 12. The drawing judgment means 63 serves to judge whether each data item contained in the map data should be drawn. [0134]
  • First, the drawing judgment means 63 judges whether the input map data contain plane data (step 1200). If the plane data are judged to be contained, the drawing judgment means 63 performs a predetermined plane data drawing judgment processing (step 1201). The plane data drawing judgment processing (step 1201) which is carried out in this case serves to judge an attribute which is preset for each plane data and select desired plane data, for example. [0135]
  • When the processing on the plane data is completed, the drawing judgment means 63 subsequently judges whether the input map data contain line data such as roads, line backgrounds, etc. (step 1202). If the line data are judged to be contained, the drawing judgment means 63 performs a predetermined line data drawing judgment processing (step 1203). The line data drawing judgment (step 1203) which is carried out in this case serves to judge an attribute which is preset for each line data and select desired line data, for example. [0136]
  • When the processing on the line data is completed, the drawing judgment means 63 subsequently judges whether the input map data contain character data (step 1204). If the character data are judged to be contained, the drawing judgment means 63 performs rearrangement processing on the coordinate values of the drawing start points of the character strings to rearrange the character strings from the far-distance side to the near-distance side (step 1205). [0137]
  • Subsequently, the drawing judgment means 63 compares the height y of the drawing start point of each of the rearranged character strings with a predetermined height h on the display frame (step 1206). In this embodiment, when the height of the display frame is set to “1”, the height h is set to the position corresponding to “⅔” from the bottom side of the display frame. That is, the height h is set to a predetermined fixed value. This is because when the height from the bottom side is set to ⅔ or more, the angle of depression is extremely small, and the density of nodes to be drawn is excessively high. However, this reference value (height h) may be set in accordance with the height of the view point or the like. [0138]
  • As a judgment criterion for the drawing of the character strings, the distance from the view point may be used in place of the height from the bottom side of the display frame. For example, assuming the distance between the view point and the destination to be “1”, only those character strings which are located at a distance of “⅔” or less from the view point are drawn, and those character strings which are located at a distance longer than “⅔” are eliminated from the drawing targets. [0139]
  • If the height y of the drawing start point of a character string is judged to be lower than the height h as a position judgment result, the drawing judgment means 63 sets the character string as a drawing target. On the other hand, if the height y is judged to be higher than the height h, the drawing judgment means 63 eliminates the character string from the drawing targets (step 1207). In this embodiment, the graphics processing means 49 performs the drawing operation by using the map data which are selected by the drawing judgment means 63 as described above, so that the bird's-eye view display can be prevented from being made too intricate, and a clear and lucid display can be obtained. [0140]
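  Steps 1205 to 1207 might be sketched as follows, again assuming (text, x, y) records whose y ordinate is the projected height from the bottom side of the display frame.

```python
def select_labels(projected_labels, frame_height=1.0, h_ratio=2.0 / 3.0):
    """Steps 1205-1207: drop labels whose start point lies above 2/3 of
    the frame height, where the angle of depression is small and the node
    density high, and order the rest from the far side to the near side
    so that near labels are drawn last and end up on top."""
    h = frame_height * h_ratio
    kept = [rec for rec in projected_labels if rec[2] <= h]
    # A larger projected y means farther from the view point.
    kept.sort(key=lambda rec: rec[2], reverse=True)
    return kept
```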
  • Further, according to this embodiment, the drawing of a character string is determined on the basis of only the height of the display position thereof. In place of this method of determination, the following method may be used. That is, a display priority rank (order) is preset for each character string, and a range of the display ranks of character strings to be displayed is set in accordance with the display height (or distance from the view point) of the character strings. In this case, only the character strings to which the display ranks contained in the range are allocated are displayed. For example, the upper limit of the display ranks of the character strings to be displayed is set to a fixed rank irrespective of the height of the character string, but, the lower limit of the display ranks of the character strings is set to a higher rank in accordance with the height of the character string on the display frame. In this case, in an area close to the upper side of the display frame, only character strings having higher display priority ranks are displayed. On the other hand, in an area close to the bottom side of the display frame, not only character strings having higher display ranks, but also character strings having lower display ranks are displayed. Accordingly, when the current position is set to a position close to the bottom side of the display frame, more detailed character information can be obtained for an area as the area becomes closer to the current position. If the area is farther away from the current position, only important information is displayed. Accordingly, this display mode is very convenient for users. [0141]
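  A sketch of this priority-rank variant; the linear ramp of the lower rank limit, the max_rank value, and the convention that rank 1 is the highest priority are all assumptions, since the text fixes only that the allowed rank range narrows toward the top of the frame.

```python
def rank_limit_for_height(y, frame_height=1.0, max_rank=10):
    """Lowest-priority rank still drawn at height y: near the bottom of
    the frame every rank is allowed, near the top only rank-1 labels
    survive (rank 1 = highest display priority)."""
    frac = max(0.0, min(1.0, y / frame_height))
    return max(1, round(max_rank * (1.0 - frac)))

def filter_by_rank(labels_with_rank, frame_height=1.0):
    """Keep a label, assumed (text, x, y, rank), only if its preset rank
    is within the range allowed for its display height."""
    return [rec for rec in labels_with_rank
            if rec[3] <= rank_limit_for_height(rec[2], frame_height)]
```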
  • Further, when an attribute (font of display character, meaning of character string, or the like) is previously allocated to each character string, the attribute may be used as another criterion for the display priority ranking. In this case, only characters having predetermined attributes may be displayed in each region. [0142]
  • If a character string is defined for a predetermined point serving as a destination, only the character string may be displayed. [0143]
  • The steps 1205 to 1207 may be omitted, but these steps are preferably executed. The effect of these steps will be described with reference to FIGS. 13A, 13B, 14A and 14B. [0144]
  • FIG. 13A shows a bird's-eye view map obtained when the drawing operation is carried out without performing the rearrangement processing and the drawing character string selecting processing of steps 1205 to 1207. FIGS. 13B and 14A show a bird's-eye view map obtained when the drawing operation is carried out together with the rearrangement processing of step 1205, but without the drawing character string selecting processing of steps 1206 and 1207. FIG. 14B shows a bird's-eye view map obtained according to this embodiment, that is, a bird's-eye view map obtained when the processing of steps 1205 to 1207 is performed. In FIGS. 13A to 14B, figures, etc. which are drawn on the basis of plane data and line data are omitted from the illustration. [0145]
  • When the bird's-eye view is displayed without performing the rearrangement processing of step 1205, character strings are displayed merely in a stored order of the character strings, irrespective of the distance to the view point, as shown in FIG. 13A. Therefore, a character string which is far away from the view point (a far-distance character string) may be displayed while superposed on a character string which is close to the view point (a near-distance character string). [0146]
  • On the other hand, in the case where the rearrangement processing of the character strings of step 1205 is performed, if a far-distance character string overlaps a near-distance character string, the near-distance character string is displayed superposed on the far-distance character string. Such a bird's-eye view display mode is very preferable because information for an area becomes easier to read as the area approaches the current position. [0147]
  • Further, in the case where the bird's-eye view is displayed without performing the drawing character string selecting processing of steps 1206 and 1207, as a character string becomes farther away from the view point, the angle of depression of the character string with respect to the view point becomes smaller, so that the character string is reduced and compressed more and more, as shown in FIG. 14A. Therefore, the amount of line, plane and character data to be drawn per unit area is increased. The character data among these data are drawn superposed on the other data, so that the line and plane background data are hidden by the character data. Therefore, it is very difficult to read plane information such as remote roads, rivers, green zones, etc. from the display frame in the above bird's-eye view display. [0148]
  • On the other hand, according to this embodiment, since the drawing character string selecting processing of steps 1206 and 1207 is performed, character data in a range (the character drawing range) extending from the bottom side of the display frame to the height h are drawn while the other character data are not drawn, as shown in FIG. 14B. Accordingly, in a bird's-eye view display obtained according to this embodiment, the plane information such as roads, rivers, green zones, etc. which are remote from the view point can be easily read from the display frame. [0149]
  • The bird's-eye view display apparatus and the navigation system according to this embodiment can display a map which is expressed in a bird's-eye view display mode. Accordingly, the user can obtain a clear map display in which the positional relationship can be easily recognized. In addition to this effect, the following effects are also expected to be obtained by this embodiment. [0150]
  • First, according to this embodiment, it is easy to switch between the plan-view display and the bird's-eye view display. Accordingly, both the plan view and the bird's-eye view can be displayed in accordance with the user's demand. [0151]
  • Secondly, according to this embodiment, when the bird's-eye view display and the plan-view display are switched, the intersection angle between a plane containing map data and a plane to which a map is projected is gradually increased or reduced in time series. That is, in the switching operation between the plan-view display mode and the bird's-eye view display mode, the view point to obtain the bird's-eye view is smoothly shifted. Accordingly, the user can easily recognize the positional correlation between the spots displayed on the plan-view map and the spots displayed on the bird's-eye view map because the shift between the plan-view map and the bird's-eye view map is performed smoothly. [0152]
  • Thirdly, according to this embodiment, the view point to obtain the bird's-eye view can be fixed to a position which is away from the current position at a fixed distance and in a fixed direction at all times. Further, the height of the view point can be varied in accordance with an input operation received externally. If the view point is fixed, the bird's-eye view is displayed such that the current position is fixed to a specific position on the screen at all times, so that the user can easily grasp the current position. Further, the view point can be set to any position desired by the user. [0153]
  • Fourthly, according to this embodiment, when the positions of two points (for example, a current position and a destination) are specified, the view point and the intersection angle between the plane containing the map data and the plane to which the map is projected are determined so that the two points are displayed at the specified positions on the screen, and the bird's-eye view display is performed on the basis of the above determination result. Accordingly, the user can easily recognize the positional relationship between the two points. Further, even when the distance between the two points varies, the two points are displayed on the same display frame at all times, so that the user can easily grasp the positional relationship between the two points with no complicated operation. [0154]
  • Fifthly, the coordinate transforming means of this embodiment performs no perspective projection processing on the character image. Accordingly, all characters contained in the bird's-eye view are displayed at the same size so as to be erect on a map, so that the user can easily read these characters. [0155]
  • Sixthly, the drawing judgment means of this embodiment sorts character data strings in accordance with the distance of the character data position to the view point on the map data. That is, as a character string approaches the view point, the character string is treated so as to have a higher display priority rank. Accordingly, the character data close to the view point can be prevented from being covered and missed by the other character data, so that character information close to the view point can be easily identified. [0156]
  • Seventhly, when a character string obtained through the perspective projection processing is located at a position higher than a predetermined height on the screen, the drawing judgment means of this embodiment eliminates the character string from the drawing targets. The drawing judgment means may also eliminate from the drawing targets those character strings which are defined at positions away from the view point by a predetermined distance or more. As described above, no character string is drawn in an area which is at the predetermined height or more, or away from the view point by the predetermined distance or more, that is, in an area in which the data amount to be drawn per unit area is considerably increased due to the reduction of the angle of depression in the bird's-eye view display. Therefore, the road display, the plane background display, etc. in this area are not covered by the character string display, so that the user can recognize far-distance roads, etc. with no obstruction. [0157]
  • According to the bird's-eye view display apparatus and the navigation system of the present invention, the map can be displayed in the bird's-eye view display mode. Accordingly, the user can obtain a map display in which the positional relationship between spots displayed on a map can be easily recognized. [0158]

Claims (37)

What is claimed is:
1. A map display apparatus comprising:
an image display device;
map storage means for storing map data to display maps; and
map drawing means for preparing drawing data of a map using said map data read out from said map storage means and displaying an image of the map on said image display device on the basis of the drawing data, wherein said map drawing means includes coordinate transforming means for transforming coordinate data contained in the map data, and said coordinate transforming means includes perspective projection means for perspectively projecting the coordinate data contained in the map data onto a predetermined projection plane with a view point at a desired position to thereby prepare the drawing data of a bird's-eye view.
2. The map display apparatus as claimed in claim 1, further comprising an input device, wherein said coordinate transforming means further includes means for accepting an input of the position of the view point through said input device, and means for determining the projection plane on the basis of predetermined two points contained in the map data and the input view point position so that the drawing positions of the two points after the perspectively projecting operation are coincident with predetermined positions.
3. The map display apparatus as claimed in claim 1, further comprising an input device, and wherein said coordinate transforming means further includes means for accepting an input of a scale through said input device, and means for determining the position of the view point and the projection plane on the basis of the coordinates of two predetermined points contained in the map data and the input scale so that the drawing positions of the two points after the perspectively projecting operation are coincident with predetermined positions, and the scale of a drawn map on the projection plane is coincident with the input scale.
4. The map display apparatus as claimed in claim 1, further comprising an input device, and wherein said coordinate transforming means further includes means for accepting, through said input device, an input of a projection angle which is an intersection angle between the projection plane and a plane on which the map data are defined, and means for determining the projection plane on the basis of the input projection angle and the position of the view point.
5. The map display apparatus as claimed in claim 1, further comprising means for preparing drawing data of a plan view without performing the perspective projection processing, and drawing determining means for determining one of the bird's-eye view and the plan view for which the drawing data should be prepared.
6. The map display apparatus as claimed in claim 5, further comprising an input device, and wherein said coordinate transforming means further includes drawing selection accepting means for accepting, through said input device, an instruction indicating one of the bird's-eye view and the plan view for which the drawing data are prepared, and wherein said drawing determining means determines the drawing data to be prepared in accordance with the accepted instruction of said drawing selection accepting means.
7. The map display apparatus as claimed in claim 5, wherein said map drawing means further includes projection angle changing means for instructing said coordinate transforming means to prepare drawing data of at least one second bird's-eye view to be obtained through the perspective projection using a second projection angle which is smaller than a first projection angle corresponding to the projection angle to obtain a first bird's-eye view, and instructing said image display device to display a map image with the drawing data of the second bird's-eye view between the first bird's-eye view display obtained through the perspective projection using the first projection angle and the plan view display, in a case where a display target for which the drawing data are to be prepared is changed from one of the first bird's-eye view and the plan view to the other.
8. The map display apparatus as claimed in claim 7, wherein said projection angle changing means includes means for preparing drawing data of two or more said second bird's-eye views by using two or more said second projection angles which are different from each other, wherein when the display target is changed from the bird's-eye view to the plan view, said projection angle changing means successively displays the two or more second bird's-eye views in a projection-angle order from a larger projection angle to a smaller projection angle, and when the display target is changed from the plan view to the bird's-eye view, said projection angle changing means successively displays the two or more second bird's-eye views in a projection-angle order from a smaller projection angle to a larger projection angle.
9. The map display apparatus as claimed in claim 6, wherein said map drawing means further includes means for accepting an indication of any point contained in the map data through said input device, and wherein said drawing determining means determines the drawing data of the plan view as the drawing data to be prepared when accepting the indication of the point.
10. The map display apparatus as claimed in claim 1, wherein the map data contain, as character string data, vector data to recognize the position of a character string, and image data to recognize an image of the character string, and said coordinate transforming means performs no perspective projection on the image data of the map data.
11. The map display apparatus as claimed in claim 1, wherein said map drawing means further includes drawing judgment means for determining whether data of at least one node contained in the drawing data should be set as one of a number of drawing targets, and eliminating said data of a node which is not set as a drawing target from the drawing targets, and wherein said map drawing means displays a map image on said image display device with the drawing data which have been processed by said drawing judgment means.
12. The map display apparatus as claimed in claim 11, wherein when the data of the node are data of a character string, the judgment of said drawing judgment means is based on a judgment as to whether the display height of the character string from the bottom side of a display frame after the perspective projection is equal to or less than a predetermined reference value on the display frame.
13. The map display apparatus as claimed in claim 12, wherein the reference value is equal to “⅔” from the bottom side of the display frame when the display height (size) of the display frame is set to “1”.
14. The map display apparatus as claimed in claim 11, wherein when the data of the node are data of a character string, the judgment of said drawing judgment means is based on a judgment as to whether the distance between the visual-point position and the position at which the character string is defined in the map data is equal to or less than a predetermined reference value.
15. The map display apparatus as claimed in claim 11, wherein when the data of the node are data of a character string, the judgment of said drawing judgment means is based on a judgment as to whether the distance between a predetermined point contained in the map and the position at which the character string is defined in the map data is equal to or less than a predetermined reference value.
16. The map display apparatus as claimed in claim 11, wherein an attribute is beforehand allocated to at least a part of each node, and said drawing judgment means sets, as a drawing target, data of a node having an attribute which meets a preset attribute allocated to each display area on the display frame, and eliminates from the drawing targets data of a node having an attribute which does not meet the preset attribute.
17. The map display apparatus as claimed in claim 1, wherein said coordinate transforming means changes the projection angle in accordance with the distance between two predetermined points contained in the map data.
18. The map display apparatus as claimed in claim 17, wherein said coordinate transforming means reduces the projection angle to a smaller value as the distance between the two points becomes shorter.
19. The map display apparatus as claimed in claim 1, wherein the map data contain data of character strings, and said map drawing means further includes means for setting a display priority rank for the character strings, and character string rearranging means for displaying a character string having the highest display priority rank of character strings when the display positions of said character strings are overlapped with one another.
20. The map display apparatus as claimed in claim 19, wherein the display priority rank of a character string is set in accordance with the display height of the character string on the display frame from the bottom side of the display frame.
21. The map display apparatus as claimed in claim 20, wherein the display priority rank of a character string is set to be higher as the display height of the character string on the display frame from the bottom side of the display frame becomes lower.
22. The map display apparatus as claimed in claim 19, wherein the display priority rank of a character string is set in accordance with the distance between a predetermined point contained in the map data and a point at which the character string is defined.
23. The map display apparatus as claimed in claim 22, wherein the display priority rank is set to be higher as the distance between the predetermined point and the point at which the character string is defined becomes shorter.
24. A map display apparatus comprising an image display device, map storing means for storing map data to display a map, and map drawing means for preparing drawing data of said map by using said map data read out from said map storing means and displaying a map image on said image display device on the basis of the drawing data, wherein the map data contain data of character strings, and wherein said map drawing means includes means for setting display priority ranks of the character strings, and character rearranging means for displaying a character string having the highest display priority rank of overlapped character strings when the display positions of said character strings are overlapped with one another on a display map.
25. A navigation system comprising an image display device, map storing means for storing map data to display a map, map drawing means for preparing drawing data of said map by using said map data read out from said map storing means and displaying a map image on said image display device on the basis of the drawing data, and current position recognizing means for recognizing a current position, wherein said map drawing means includes coordinate transforming means for transforming coordinate data contained in the map data, and said coordinate transforming means includes perspective projection means for perspectively projecting the coordinate data contained in the map data onto a predetermined projection plane with a view point at a desired position, to thereby prepare the drawing data of a bird's-eye view.
26. The navigation system as claimed in claim 25, further comprising an input device, wherein said coordinate transforming means further includes means for accepting an input of the position of the view point through said input device, and means for determining the projection plane on the basis of predetermined two points contained in the map data and the input view point position so that the drawing positions of the two points after the perspectively projecting operation are coincident with predetermined positions.
27. The navigation system as claimed in claim 26, wherein one of said two points is the current position which is recognized by said current position recognizing means.
28. The navigation system as claimed in claim 25, further comprising an input device, wherein said coordinate transforming means further includes means for accepting an input of a scale through said input device, and means for determining the position of the view point and the projection plane on the basis of the coordinates of predetermined two points contained in the map data and the input scale so that the drawing positions of the two points after the perspectively projecting operation are coincident with predetermined positions, and the scale of a drawn map on the projection plane is coincident with the input scale.
29. The navigation system as claimed in claim 28, wherein one of the two points is the current position which is recognized by said current position recognizing means.
30. The navigation system as claimed in claim 25, further comprising an input device, wherein said coordinate transforming means changes the projection angle in accordance with the distance between the current position recognized by said current position recognizing means and a destination which is indicated through said input device.
31. The navigation system as claimed in claim 25, wherein the map data contain data of a character string, and said map drawing means further includes drawing judgment means for judging whether said data of said character string contained in the drawing data should be set as one of a number of drawing targets, on the basis of the distance between a point at which the character string is defined in the map data and a point on the map data which corresponds to the current position recognized by said current position recognizing means, and eliminating said data of the character string which are judged not to be set as a drawing target from the drawing targets, and wherein said map drawing means displays a map image on said image display device on the basis of the drawing data which have been processed by said drawing judgment means.
32. The navigation system as claimed in claim 25, wherein the map data contain data of a character string, and said map drawing means further includes means for setting a display priority rank for each said character string, and character string rearranging means for displaying said character string having the highest display priority rank of overlapped character strings when the display positions of character strings are overlapped with one another on the display frame, and wherein the display priority rank of said character string is set in accordance with the distance between a point on the map which corresponds to the current position recognized by said current position recognizing means and the point at which the character string is defined.
33. The navigation system as claimed in claim 25, wherein said coordinate transforming means further includes means for determining the view point or the projection angle and the projection plane so that a height of a display position of the current position recognized by said current position recognizing means from a bottom side of a display frame is equal to or less than “⅓” when the height of the display frame is set to “1”.
34. A bird's-eye view forming method for preparing drawing data of a map represented in a bird's-eye view display mode with map data, characterized in that coordinate data contained in the map data are perspectively projected onto a predetermined projection plane with a view point at a desired position to thereby prepare the drawing data of a bird's-eye view.
35. The bird's-eye view forming method as claimed in claim 34, comprising the steps of:
accepting an input of the position of the view point; and
determining the projection plane on the basis of the coordinates of predetermined two points contained in the map data and the input position of the view point so that the drawing positions of the two points after the perspective projection are coincident to predetermined positions.
36. The method as claimed in claim 34, comprising the steps of:
accepting an input of a scale; and
determining the position of the view point and the projection plane on the basis of the coordinates of predetermined two points contained in the map data and the input scale so that the drawing positions of the two points after the perspective projection are coincident with predetermined positions, and the scale of a drawn map on the projection plane is coincident with the input scale.
37. The method as claimed in claim 34, comprising the steps of:
accepting an input of a projection angle which is an intersection angle between the projection plane and a plane on which the map data are defined; and
determining the projection plane on the basis of the input projection angle and a position of the view point.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/795,241 US20040169653A1 (en) 1995-04-20 2004-03-09 Bird's-eye view forming method, map display apparatus and navigation system

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP09553595A JP3474022B2 (en) 1995-04-20 1995-04-20 Map display device, map display method, arithmetic processing unit for map display device, and navigation system
JP7-95535 1995-04-20
US08/632,791 US6141014A (en) 1995-04-20 1996-04-17 Bird's-eye view forming method, map display apparatus and navigation system
US09/497,932 US6346942B1 (en) 1995-04-20 2000-02-04 Bird's-eye view forming method, map display apparatus and navigation system
US09/853,547 US6654014B2 (en) 1995-04-20 2001-05-11 Bird's-eye view forming method, map display apparatus and navigation system
US10/458,164 US6760027B2 (en) 1995-04-20 2003-06-09 Bird's-eye view forming method, map display apparatus and navigation system
US10/795,241 US20040169653A1 (en) 1995-04-20 2004-03-09 Bird's-eye view forming method, map display apparatus and navigation system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/458,164 Continuation US6760027B2 (en) 1995-04-20 2003-06-09 Bird's-eye view forming method, map display apparatus and navigation system

Publications (1)

Publication Number Publication Date
US20040169653A1 true US20040169653A1 (en) 2004-09-02

Family

ID=26436756

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/853,547 Expired - Lifetime US6654014B2 (en) 1995-04-20 2001-05-11 Bird's-eye view forming method, map display apparatus and navigation system
US10/458,164 Expired - Fee Related US6760027B2 (en) 1995-04-20 2003-06-09 Bird's-eye view forming method, map display apparatus and navigation system
US10/795,241 Abandoned US20040169653A1 (en) 1995-04-20 2004-03-09 Bird's-eye view forming method, map display apparatus and navigation system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/853,547 Expired - Lifetime US6654014B2 (en) 1995-04-20 2001-05-11 Bird's-eye view forming method, map display apparatus and navigation system
US10/458,164 Expired - Fee Related US6760027B2 (en) 1995-04-20 2003-06-09 Bird's-eye view forming method, map display apparatus and navigation system

Country Status (1)

Country Link
US (3) US6654014B2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040153238A1 (en) * 2003-01-22 2004-08-05 Increment P Corporation Data structure of feature guidance information, recording medium storing feature guidance information, navigation device, navigation system, navigation method, navigation program and recording medium storing the navigation program
US20040212627A1 (en) * 2002-10-10 2004-10-28 Xanavi Informatics Corporation Map data transmitting method, map data transmitting apparatus, information device and map data transmitting system
US20050137789A1 (en) * 2003-12-23 2005-06-23 Honda Motor Co., Ltd. Prioritized delivery of navigation information
US20050137798A1 (en) * 2003-12-23 2005-06-23 Honda Motor Co., Ltd. System and method for transferring navigation information using different coordinate systems
US20050159881A1 (en) * 2003-12-23 2005-07-21 Honda Motor Co., Ltd. System and method for managing navigation information
US20050159880A1 (en) * 2003-12-23 2005-07-21 Honda Motor Co., Ltd. Smart storage and transmission of navigation information
US20050261830A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Method for modifying navigation information
US20050261827A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. System and method for displaying information
US20060132482A1 (en) * 2004-11-12 2006-06-22 Oh Byong M Method for inter-scene transitions
US20060284879A1 (en) * 2004-05-13 2006-12-21 Sony Corporation Animation generating apparatus, animation generating method, and animation generating program
US20070076920A1 (en) * 2005-10-04 2007-04-05 Microsoft Corporation Street side maps and paths
US20070288162A1 (en) * 2004-05-19 2007-12-13 Honda Motor Company Ltd. System and Method for Off Route Processing
WO2008094644A1 (en) * 2007-01-31 2008-08-07 Hewlett-Packard Development Company, L.P. Method and apparatus for moving content to mobile devices
US20090207170A1 (en) * 2006-02-22 2009-08-20 Navitime Japan Co., Ltd. Map display system, map display method for map display system, map display device, and program
US20090304853A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed for attracting finches and other small birds
US20090304899A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed that attracts less blackbirds and other undesirable birds
US20100164702A1 (en) * 2008-12-26 2010-07-01 Kabushiki Kaisha Toshiba Automotive display system and display method
US20100225756A1 (en) * 2009-03-06 2010-09-09 Sony Corporation Navigation apparatus and navigation method
US20110063432A1 (en) * 2000-10-06 2011-03-17 Enrico Di Bernardo System and method for creating, storing and utilizing images of a geographic location
US20110188760A1 (en) * 2010-02-03 2011-08-04 Oculus Info Inc. System and Method for Creating and Displaying Map Projections related to Real-Time Images
US20110249030A1 (en) * 2009-11-30 2011-10-13 Pioneer Corporation Map display device, map display method, map display program, and computer-readable recording medium
US20110316885A1 (en) * 2010-06-23 2011-12-29 Samsung Electronics Co., Ltd. Method and apparatus for displaying image including position information
US20120038623A1 (en) * 2008-05-29 2012-02-16 Ewoud Van Raamsdonk Generating a map display image
US10508926B2 (en) 2012-06-05 2019-12-17 Apple Inc. Providing navigation instructions while device is in locked mode
US10718625B2 (en) 2012-06-05 2020-07-21 Apple Inc. Voice instructions during navigation
US11055912B2 (en) 2012-06-05 2021-07-06 Apple Inc. Problem reporting in maps

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6989832B2 (en) * 2000-01-21 2006-01-24 Sony Computer Entertainment Inc. Entertainment apparatus, storage medium and object display method
DE60112223T2 (en) 2000-03-14 2006-05-24 Aisin AW Co., Ltd., Anjo ROAD MAP DISPLAY AND RECORDING MEDIUM
JP3561267B2 (en) * 2000-06-27 2004-09-02 株式会社ケイビーエムジェイ Information providing system, information providing method, and storage medium
AU2002255568B8 (en) 2001-02-20 2014-01-09 Adidas Ag Modular personal network systems and methods
JP4054589B2 (en) * 2001-05-28 2008-02-27 キヤノン株式会社 Graphic processing apparatus and method
JP2003067721A (en) * 2001-08-24 2003-03-07 Pioneer Electronic Corp Map image display system and method
JP3805231B2 (en) * 2001-10-26 2006-08-02 キヤノン株式会社 Image display apparatus and method, and storage medium
WO2003054781A1 (en) * 2001-12-21 2003-07-03 Siemens Aktiengesellschaft Device for detecting and displaying movements
GB2384313A (en) * 2002-01-18 2003-07-23 Qinetiq Ltd An attitude sensor
US7187377B1 (en) * 2002-06-28 2007-03-06 Microsoft Corporation Three-dimensional virtual tour method and system
US7480512B2 (en) * 2004-01-16 2009-01-20 Bones In Motion, Inc. Wireless device, program products and methods of using a wireless device to deliver services
US20040243307A1 (en) * 2003-06-02 2004-12-02 Pieter Geelen Personal GPS navigation device
US20050031169A1 (en) * 2003-08-09 2005-02-10 Alan Shulman Birds eye view virtual imaging for real time composited wide field of view
US7692646B2 (en) * 2003-12-19 2010-04-06 Koninklijke Philips Electronics N.V. Method of and scaling unit for scaling a three-dimensional model
US20060267803A1 (en) * 2005-05-26 2006-11-30 Tele Atlas North America, Inc. Non-perspective variable-scale map displays
US7711478B2 (en) * 2005-06-21 2010-05-04 Mappick Technologies, Llc Navigation system and method
US8670925B2 (en) * 2005-06-21 2014-03-11 Calabrese Holdings L.L.C. Navigation system and method
US9726513B2 (en) 2005-06-21 2017-08-08 Nytell Software LLC Navigation system and method
US20070016372A1 (en) * 2005-07-14 2007-01-18 Gm Global Technology Operations, Inc. Remote Perspective Vehicle Environment Observation System
JP4841897B2 (en) * 2005-08-29 2011-12-21 アルパイン株式会社 Navigation device
JP4682809B2 (en) * 2005-11-04 2011-05-11 株式会社デンソー Parking assistance system
US8594933B2 (en) * 2006-02-09 2013-11-26 Sap Ag Transmission of sensor data based on geographical navigation data
JP2007280212A (en) * 2006-04-10 2007-10-25 Sony Corp Display control device, display control method and display control program
US20090271745A1 (en) * 2006-07-20 2009-10-29 Navitime Japan Co., Ltd. Map display system, map display device, map display method, and map distribution server
TWI309709B (en) * 2006-11-15 2009-05-11 Primax Electronics Ltd Storage media containing electronic map file, electronic map display system utilizing electronic map file and related method thereof
JP2008209208A (en) * 2007-02-26 2008-09-11 Denso Corp Car navigation device
JP2009129001A (en) * 2007-11-20 2009-06-11 Sanyo Electric Co Ltd Operation support system, vehicle, and method for estimating three-dimensional object area
JP5116514B2 (en) * 2008-03-11 2013-01-09 キヤノン株式会社 Imaging apparatus and display control method
US8624902B2 (en) 2010-02-04 2014-01-07 Microsoft Corporation Transitioning between top-down maps and local navigation of reconstructed 3-D scenes
US20110187704A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Generating and displaying top-down maps of reconstructed 3-d scenes
US8773424B2 (en) * 2010-02-04 2014-07-08 Microsoft Corporation User interfaces for interacting with top-down maps of reconstructed 3-D scenes
GB2483490A (en) * 2010-09-10 2012-03-14 Cheng Uei Prec Ind Co Ltd Guiding module for projecting guiding information
US9965140B2 (en) * 2011-12-26 2018-05-08 TrackThings LLC Method and apparatus of a marking objects in images displayed on a portable unit
JP5921233B2 (en) * 2012-02-06 2016-05-24 キヤノン株式会社 Image management apparatus, control method therefor, and program
US8884234B2 (en) * 2012-03-16 2014-11-11 Raytheon Company Portable directional device for locating neutron emitting sources
US9262868B2 (en) * 2012-09-19 2016-02-16 Google Inc. Method for transforming mapping data associated with different view planes into an arbitrary view plane
JP6701734B2 (en) * 2014-01-30 2020-05-27 日本ゼオン株式会社 Crosslinkable nitrile rubber composition and rubber crosslinked product
US20170115749A1 (en) * 2014-10-26 2017-04-27 Chian Chiu Li Systems And Methods For Presenting Map And Other Information Based On Pointing Direction
WO2019069366A1 (en) * 2017-10-03 2019-04-11 株式会社Stroly Information processing device, information system, information processing method, and program
CN112561789A (en) * 2020-12-23 2021-03-26 中国科学院长春光学精密机械与物理研究所 Irregular image processing method

Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4737916A (en) * 1985-04-30 1988-04-12 Nippondenso Co., Ltd. Electronic map display system
US4835532A (en) * 1982-07-30 1989-05-30 Honeywell Inc. Nonaliasing real-time spatial transform image processing system
US4899285A (en) * 1986-06-26 1990-02-06 Nissan Motor Company, Limited System and method for measuring a position of a moving object with a hybrid navigation apparatus
US4940972A (en) * 1987-02-10 1990-07-10 Societe D'applications Generales D'electricite Et De Mecanique (S A G E M) Method of representing a perspective image of a terrain and a system for implementing same
US4952922A (en) * 1985-07-18 1990-08-28 Hughes Aircraft Company Predictive look ahead memory management for computer image generation in simulators
US5015188A (en) * 1988-05-03 1991-05-14 The United States Of America As Represented By The Secretary Of The Air Force Three dimensional tactical element situation (3DTES) display
US5088054A (en) * 1988-05-09 1992-02-11 Paris Ii Earl A Computer graphics hidden surface removal system
US5115398A (en) * 1989-07-04 1992-05-19 U.S. Philips Corp. Method of displaying navigation data for a vehicle in an image of the vehicle environment, a navigation system for performing the method, and a vehicle comprising a navigation system
US5155683A (en) * 1991-04-11 1992-10-13 Wadiatur Rahim Vehicle remote guidance with path control
US5161886A (en) * 1989-01-11 1992-11-10 U.S. Philips Corp. Method for the perspective display of a part of a topographic map, and device suitable for performing such a method
US5398188A (en) * 1991-12-09 1995-03-14 Mitsubishi Denki Kabushiki Kaisha Navigation apparatus and method for a vehicle for displaying positional information of the same
US5434591A (en) * 1989-12-15 1995-07-18 Hitachi, Ltd. Scrolling method and apparatus in which data being displayed is altered during scrolling
US5448696A (en) * 1990-11-30 1995-09-05 Hitachi, Ltd. Map information system capable of displaying layout information
US5507485A (en) * 1994-04-28 1996-04-16 Roblor Marketing Group, Inc. Golf computer and golf replay device
US5528735A (en) * 1993-03-23 1996-06-18 Silicon Graphics Inc. Method and apparatus for displaying data within a three-dimensional information landscape
US5555354A (en) * 1993-03-23 1996-09-10 Silicon Graphics Inc. Method and apparatus for navigation within three-dimensional information landscape
US5566280A (en) * 1993-09-20 1996-10-15 Kabushiki Kaisha Toshiba 3D dynamic image production system with automatic viewpoint setting
US5574648A (en) * 1990-10-09 1996-11-12 Pilley; Harold R. Airport control/management system using GNSS-based methods and equipment for the control of surface and airborne traffic
US5618179A (en) * 1992-05-22 1997-04-08 Atari Games Corpooration Driver training system and method with performance data feedback
US5687307A (en) * 1993-09-21 1997-11-11 Canon Kabushiki Kaisha Computer graphic animation in which texture animation is independently performed on a plurality of objects in three-dimensional space
US5732385A (en) * 1994-04-15 1998-03-24 Nissan Motor Co., Ltd. Vehicle navigation system displaying bird-eye view of different visual points and different contraction scale ratios depending upon vehicle travel conditions
US5742924A (en) * 1994-12-02 1998-04-21 Nissan Motor Co., Ltd. Apparatus and method for navigating mobile body using road map displayed in form of bird's eye view
US5748109A (en) * 1993-12-27 1998-05-05 Nissan Motor Co., Ltd. Apparatus and method for navigating vehicle to destination using display unit
US5793310A (en) * 1994-02-04 1998-08-11 Nissan Motor Co., Ltd. Portable or vehicular navigating apparatus and method capable of displaying bird's eye view
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US5862498A (en) * 1994-11-11 1999-01-19 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US5884217A (en) * 1994-11-14 1999-03-16 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US5904725A (en) * 1995-04-25 1999-05-18 Matsushita Electric Industrial Co., Ltd. Local positioning apparatus
US5913918A (en) * 1995-06-13 1999-06-22 Matsushita Electric Industrial Co., Ltd. Automotive navigation apparatus and recording medium storing program therefor
US5917436A (en) * 1995-04-20 1999-06-29 Hitachi, Ltd. Map display apparatus
US5964810A (en) * 1995-06-09 1999-10-12 Xanavi Informatics Corporation Map display apparatus
US5974876A (en) * 1990-05-02 1999-11-02 Pioneer Electronic Corporation Map information displaying apparatus, navigation apparatus and program storage device readable by the navigation apparatus
US6020890A (en) * 1996-08-12 2000-02-01 Fujitsu Limited Two-dimensional image display device for three-dimensional computer graphics model
US6141014A (en) * 1995-04-20 2000-10-31 Hitachi, Ltd. Bird's-eye view forming method, map display apparatus and navigation system
US6166748A (en) * 1995-11-22 2000-12-26 Nintendo Co., Ltd. Interface for a high performance low cost video game system with coprocessor providing high speed efficient 3D graphics and digital audio signal processing
US6169552B1 (en) * 1996-04-16 2001-01-02 Xanavi Informatics Corporation Map display device, navigation device and map display method
US6252602B1 (en) * 1996-10-08 2001-06-26 Sharp Kabushiki Kaisha Information processing apparatus
US6411298B1 (en) * 1996-06-25 2002-06-25 Hitachi Medical Corporation Method and apparatus for determining visual point and direction of line of sight in three-dimensional image construction method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01219883A (en) 1988-02-29 1989-09-01 Meitetsuku:Kk Display device for on-vehicle navigation system
JP2606723B2 (en) 1988-04-15 1997-05-07 有限会社ヴェルク・ジャパン Map display device for navigation
JPH07111618B2 (en) 1988-06-15 1995-11-29 株式会社日立製作所 Scroll screen display method and device
JPH05203457A (en) 1992-01-29 1993-08-10 Nec Home Electron Ltd Route guidance apparatus
JP3263115B2 (en) 1992-03-17 2002-03-04 株式会社東芝 Image display device
JP2759934B2 (en) 1992-09-07 1998-05-28 澁谷工業株式会社 Dust removal device in gas laser device
KR0124403B1 (en) 1994-09-28 1997-11-28 배순훈 Audio dubbing apparatus for cd/vcr

Patent Citations (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4835532A (en) * 1982-07-30 1989-05-30 Honeywell Inc. Nonaliasing real-time spatial transform image processing system
US4737916A (en) * 1985-04-30 1988-04-12 Nippondenso Co., Ltd. Electronic map display system
US4952922A (en) * 1985-07-18 1990-08-28 Hughes Aircraft Company Predictive look ahead memory management for computer image generation in simulators
US4899285A (en) * 1986-06-26 1990-02-06 Nissan Motor Company, Limited System and method for measuring a position of a moving object with a hybrid navigation apparatus
US4940972A (en) * 1987-02-10 1990-07-10 Societe D'applications Generales D'electricite Et De Mecanique (S A G E M) Method of representing a perspective image of a terrain and a system for implementing same
US5015188A (en) * 1988-05-03 1991-05-14 The United States Of America As Represented By The Secretary Of The Air Force Three dimensional tactical element situation (3DTES) display
US5088054A (en) * 1988-05-09 1992-02-11 Paris Ii Earl A Computer graphics hidden surface removal system
US5161886A (en) * 1989-01-11 1992-11-10 U.S. Philips Corp. Method for the perspective display of a part of a topographic map, and device suitable for performing such a method
US5161886C1 (en) * 1989-01-11 2001-10-30 Philips Corp Method for the perspective display of a part of a topographic map and device suitable for performing such a method
US5115398A (en) * 1989-07-04 1992-05-19 U.S. Philips Corp. Method of displaying navigation data for a vehicle in an image of the vehicle environment, a navigation system for performing the method, and a vehicle comprising a navigation system
US5434591A (en) * 1989-12-15 1995-07-18 Hitachi, Ltd. Scrolling method and apparatus in which data being displayed is altered during scrolling
US5974876A (en) * 1990-05-02 1999-11-02 Pioneer Electronic Corporation Map information displaying apparatus, navigation apparatus and program storage device readable by the navigation apparatus
US5574648A (en) * 1990-10-09 1996-11-12 Pilley; Harold R. Airport control/management system using GNSS-based methods and equipment for the control of surface and airborne traffic
US5448696A (en) * 1990-11-30 1995-09-05 Hitachi, Ltd. Map information system capable of displaying layout information
US5155683A (en) * 1991-04-11 1992-10-13 Wadiatur Rahim Vehicle remote guidance with path control
US5398188A (en) * 1991-12-09 1995-03-14 Mitsubishi Denki Kabushiki Kaisha Navigation apparatus and method for a vehicle for displaying positional information of the same
US5618179A (en) * 1992-05-22 1997-04-08 Atari Games Corporation Driver training system and method with performance data feedback
US5555354A (en) * 1993-03-23 1996-09-10 Silicon Graphics Inc. Method and apparatus for navigation within three-dimensional information landscape
US5671381A (en) * 1993-03-23 1997-09-23 Silicon Graphics, Inc. Method and apparatus for displaying data within a three-dimensional information landscape
US5528735A (en) * 1993-03-23 1996-06-18 Silicon Graphics Inc. Method and apparatus for displaying data within a three-dimensional information landscape
US5566280A (en) * 1993-09-20 1996-10-15 Kabushiki Kaisha Toshiba 3D dynamic image production system with automatic viewpoint setting
US5687307A (en) * 1993-09-21 1997-11-11 Canon Kabushiki Kaisha Computer graphic animation in which texture animation is independently performed on a plurality of objects in three-dimensional space
US5748109A (en) * 1993-12-27 1998-05-05 Nissan Motor Co., Ltd. Apparatus and method for navigating vehicle to destination using display unit
US5793310A (en) * 1994-02-04 1998-08-11 Nissan Motor Co., Ltd. Portable or vehicular navigating apparatus and method capable of displaying bird's eye view
US6011494A (en) * 1994-02-04 2000-01-04 Nissan Motor Co., Ltd. Portable or vehicular navigating apparatus and method capable of displaying bird's eye view
US5732385A (en) * 1994-04-15 1998-03-24 Nissan Motor Co., Ltd. Vehicle navigation system displaying bird-eye view of different visual points and different contraction scale ratios depending upon vehicle travel conditions
US5507485A (en) * 1994-04-28 1996-04-16 Roblor Marketing Group, Inc. Golf computer and golf replay device
US5862498A (en) * 1994-11-11 1999-01-19 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US5884217A (en) * 1994-11-14 1999-03-16 Xanavi Informatics Corporation Map display apparatus for motor vehicle
US5742924A (en) * 1994-12-02 1998-04-21 Nissan Motor Co., Ltd. Apparatus and method for navigating mobile body using road map displayed in form of bird's eye view
US5850352A (en) * 1995-03-31 1998-12-15 The Regents Of The University Of California Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images
US6141014A (en) * 1995-04-20 2000-10-31 Hitachi, Ltd. Bird's-eye view forming method, map display apparatus and navigation system
US5917436A (en) * 1995-04-20 1999-06-29 Hitachi, Ltd. Map display apparatus
US6603407B2 (en) * 1995-04-20 2003-08-05 Hitachi, Ltd. Map display apparatus
US6278383B1 (en) * 1995-04-20 2001-08-21 Hitachi, Ltd. Map display apparatus
US5904725A (en) * 1995-04-25 1999-05-18 Matsushita Electric Industrial Co., Ltd. Local positioning apparatus
US5964810A (en) * 1995-06-09 1999-10-12 Xanavi Informatics Corporation Map display apparatus
US5913918A (en) * 1995-06-13 1999-06-22 Matsushita Electric Industrial Co., Ltd. Automotive navigation apparatus and recording medium storing program therefor
US6166748A (en) * 1995-11-22 2000-12-26 Nintendo Co., Ltd. Interface for a high performance low cost video game system with coprocessor providing high speed efficient 3D graphics and digital audio signal processing
US6169552B1 (en) * 1996-04-16 2001-01-02 Xanavi Informatics Corporation Map display device, navigation device and map display method
US6411298B1 (en) * 1996-06-25 2002-06-25 Hitachi Medical Corporation Method and apparatus for determining visual point and direction of line of sight in three-dimensional image construction method
US6020890A (en) * 1996-08-12 2000-02-01 Fujitsu Limited Two-dimensional image display device for three-dimensional computer graphics model
US6252602B1 (en) * 1996-10-08 2001-06-26 Sharp Kabushiki Kaisha Information processing apparatus

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8818138B2 (en) 2000-10-06 2014-08-26 Enrico Di Bernardo System and method for creating, storing and utilizing images of a geographical location
US10473465B2 (en) 2000-10-06 2019-11-12 Vederi, Llc System and method for creating, storing and utilizing images of a geographical location
US9644968B2 (en) 2000-10-06 2017-05-09 Vederi, Llc System and method for creating, storing and utilizing images of a geographical location
US20110063432A1 (en) * 2000-10-06 2011-03-17 Enrico Di Bernardo System and method for creating, storing and utilizing images of a geographic location
US8213749B2 (en) 2000-10-06 2012-07-03 Vederi, LLC System and method for creating, storing and utilizing images of a geographic location
US20040212627A1 (en) * 2002-10-10 2004-10-28 Xanavi Informatics Corporation Map data transmitting method, map data transmitting apparatus, information device and map data transmitting system
US7158149B2 (en) * 2002-10-10 2007-01-02 Xanavi Informatics Corporation Map data transmitting method, map data transmitting apparatus, information device and map data transmitting system
US7451040B2 (en) * 2003-01-22 2008-11-11 Increment P Corporation Data structure of feature guidance information, recording medium storing feature guidance information, navigation device, navigation system, navigation method, navigation program and recording medium storing the navigation program
US20040153238A1 (en) * 2003-01-22 2004-08-05 Increment P Corporation Data structure of feature guidance information, recording medium storing feature guidance information, navigation device, navigation system, navigation method, navigation program and recording medium storing the navigation program
US7146271B2 (en) 2003-12-23 2006-12-05 Honda Motor Co., Ltd. System and method for managing navigation information
US20050159880A1 (en) * 2003-12-23 2005-07-21 Honda Motor Co., Ltd. Smart storage and transmission of navigation information
US20050159881A1 (en) * 2003-12-23 2005-07-21 Honda Motor Co., Ltd. System and method for managing navigation information
US7184888B2 (en) 2003-12-23 2007-02-27 Honda Motor Co., Ltd. System and method for transferring navigation information using different coordinate systems
US20050137798A1 (en) * 2003-12-23 2005-06-23 Honda Motor Co., Ltd. System and method for transferring navigation information using different coordinate systems
US7263438B2 (en) 2003-12-23 2007-08-28 Honda Motor Co., Ltd. Smart storage and transmission of navigation information
US7512484B2 (en) 2003-12-23 2009-03-31 Honda Motor Co., Ltd. Smart storage and transmission of navigation information
US7480561B2 (en) 2003-12-23 2009-01-20 Honda Motor Co., Ltd. Prioritized delivery of navigation information
US20080021642A1 (en) * 2003-12-23 2008-01-24 Hideo Furukawa Smart Storage and Transmission of Navigation Information
US20050137789A1 (en) * 2003-12-23 2005-06-23 Honda Motor Co., Ltd. Prioritized delivery of navigation information
US20060284879A1 (en) * 2004-05-13 2006-12-21 Sony Corporation Animation generating apparatus, animation generating method, and animation generating program
US7612777B2 (en) * 2004-05-13 2009-11-03 Sony Corporation Animation generating apparatus, animation generating method, and animation generating program
WO2005116583A3 (en) * 2004-05-19 2006-05-04 Honda Motor Co Ltd System and method for displaying information
US20100082820A1 (en) * 2004-05-19 2010-04-01 Honda Motor Co., Ltd. System and Method for Off Route Processing
US7292936B2 (en) * 2004-05-19 2007-11-06 Honda Motor Co., Ltd. System and method for displaying information
US20050261830A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Method for modifying navigation information
US7206696B2 (en) 2004-05-19 2007-04-17 Honda Motor Co., Ltd. Method for modifying navigation information
US20050261827A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. System and method for displaying information
US20070288162A1 (en) * 2004-05-19 2007-12-13 Honda Motor Company Ltd. System and Method for Off Route Processing
US7660667B2 (en) 2004-05-19 2010-02-09 Honda Motor Co., Ltd. System and method for off route processing
US20060132482A1 (en) * 2004-11-12 2006-06-22 Oh Byong M Method for inter-scene transitions
US10032306B2 (en) 2004-11-12 2018-07-24 Everyscape, Inc. Method for inter-scene transitions
US10304233B2 (en) 2004-11-12 2019-05-28 Everyscape, Inc. Method for inter-scene transitions
US20070076920A1 (en) * 2005-10-04 2007-04-05 Microsoft Corporation Street side maps and paths
US7840032B2 (en) * 2005-10-04 2010-11-23 Microsoft Corporation Street-side maps and paths
US20090207170A1 (en) * 2006-02-22 2009-08-20 Navitime Japan Co., Ltd. Map display system, map display method for map display system, map display device, and program
US20100106408A1 (en) * 2007-01-31 2010-04-29 Timothy Kindberg Method and apparatus for moving content to mobile devices
WO2008094644A1 (en) * 2007-01-31 2008-08-07 Hewlett-Packard Development Company, L.P. Method and apparatus for moving content to mobile devices
US8170784B2 (en) 2007-01-31 2012-05-01 Hewlett-Packard Development Company, L.P. Method and apparatus for moving content to mobile devices
US20120038623A1 (en) * 2008-05-29 2012-02-16 Ewoud Van Raamsdonk Generating a map display image
US9852709B2 (en) * 2008-05-29 2017-12-26 Tomtom Navigation B.V. Generating a map display image
US20090304900A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed for attracting finches and other small desirable birds
US20090304853A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed for attracting finches and other small birds
US20090304899A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed that attracts less blackbirds and other undesirable birds
US20090304898A1 (en) * 2008-06-09 2009-12-10 Oms Investments, Inc. Bird feed that attracts fewer undesirable birds
US8212662B2 (en) * 2008-12-26 2012-07-03 Kabushiki Kaisha Toshiba Automotive display system and display method
US20100164702A1 (en) * 2008-12-26 2010-07-01 Kabushiki Kaisha Toshiba Automotive display system and display method
US9404762B2 (en) * 2009-03-06 2016-08-02 Sony Corporation Navigation apparatus and navigation method
US10378913B2 (en) 2009-03-06 2019-08-13 Sony Corporation Navigation apparatus and navigation method
US20100225756A1 (en) * 2009-03-06 2010-09-09 Sony Corporation Navigation apparatus and navigation method
US20110249030A1 (en) * 2009-11-30 2011-10-13 Pioneer Corporation Map display device, map display method, map display program, and computer-readable recording medium
US8922592B2 (en) * 2009-11-30 2014-12-30 Pioneer Corporation Map display device, map display method, map display program, and computer-readable recording medium
US9047699B2 (en) * 2010-02-03 2015-06-02 Uncharted Software Inc. System and method for creating and displaying map projections related to real-time images
US20140104315A1 (en) * 2010-02-03 2014-04-17 Oculus Info Inc. System and method for creating and displaying map projections related to real-time images
US8436872B2 (en) * 2010-02-03 2013-05-07 Oculus Info Inc. System and method for creating and displaying map projections related to real-time images
US20110188760A1 (en) * 2010-02-03 2011-08-04 Oculus Info Inc. System and Method for Creating and Displaying Map Projections related to Real-Time Images
US20110316885A1 (en) * 2010-06-23 2011-12-29 Samsung Electronics Co., Ltd. Method and apparatus for displaying image including position information
US10508926B2 (en) 2012-06-05 2019-12-17 Apple Inc. Providing navigation instructions while device is in locked mode
US10718625B2 (en) 2012-06-05 2020-07-21 Apple Inc. Voice instructions during navigation
US10732003B2 (en) 2012-06-05 2020-08-04 Apple Inc. Voice instructions during navigation
US10911872B2 (en) 2012-06-05 2021-02-02 Apple Inc. Context-aware voice guidance
US11055912B2 (en) 2012-06-05 2021-07-06 Apple Inc. Problem reporting in maps
US11082773B2 (en) 2012-06-05 2021-08-03 Apple Inc. Context-aware voice guidance
US11290820B2 (en) 2012-06-05 2022-03-29 Apple Inc. Voice instructions during navigation
US11727641B2 (en) 2012-06-05 2023-08-15 Apple Inc. Problem reporting in maps

Also Published As

Publication number Publication date
US20010022585A1 (en) 2001-09-20
US6760027B2 (en) 2004-07-06
US20030208316A1 (en) 2003-11-06
US6654014B2 (en) 2003-11-25

Similar Documents

Publication Publication Date Title
US6760027B2 (en) Bird's-eye view forming method, map display apparatus and navigation system
US6346942B1 (en) Bird's-eye view forming method, map display apparatus and navigation system
KR100268071B1 (en) Map display method and apparatus and navigation apparatus therewith
JP3568621B2 (en) Map display device
JP3419648B2 (en) Navigation device
US6421604B1 (en) Map display apparatus for motor vehicle
JP3474053B2 (en) Map display method for navigation device and navigation device
EP0953826A2 (en) Apparatus for displaying characters and symbols on map for use in navigation system
EP0881466B1 (en) Navigation device
JP3360425B2 (en) Vehicle navigation system
JP3848351B2 (en) Navigation device
JP3408645B2 (en) Road map display control method and road map display device
JP2003030687A (en) Map display device
JP3386599B2 (en) Map display method and map display device
JP2903981B2 (en) Route guidance device for vehicles
JP3097377B2 (en) Route guidance device for vehicles
JP3386604B2 (en) Map display method and map display device
JP3428747B2 (en) Road map display control method and road map display control device
KR100482227B1 (en) Bird's-eye view creation method, map display device and navigation system
JP3386800B2 (en) Map display method and map display device
JP3415619B1 (en) Map display method and map display device
JP3386600B2 (en) Map display method and map display device
JP3411566B1 (en) Map display method

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION