
HK1114169A - Navigation device and method of scrolling map data displayed on a navigation device - Google Patents

Navigation device and method of scrolling map data displayed on a navigation device

Info

Publication number
HK1114169A
HK1114169A (application HK08109761.0A)
Authority
HK
Hong Kong
Prior art keywords
navigation device
map data
camera
processing unit
movement
Prior art date
Application number
HK08109761.0A
Other languages
Chinese (zh)
Inventor
英格丽德‧哈尔特斯
特卡琴科‧塞尔海伊
Original Assignee
通腾科技股份有限公司
Filing date
Publication date
Application filed by 通腾科技股份有限公司
Publication of HK1114169A

Abstract

The present invention relates to a navigation device (10, 10'), comprising a processing unit (11) and a display (18). The processing unit (11) is arranged to display map data on the display (18). The navigation device (10, 10') further comprises a camera (24) arranged to provide a camera feed to the processing unit (11). The processing unit (11) is further arranged to: receive a camera feed from the camera (24), detect a movement of the navigation device (10, 10') based on the received camera feed, and scroll the displayed map data in response to the detected movement.

Description

Navigation device and method of scrolling map data displayed on a navigation device
Technical Field
The invention relates to a navigation device comprising a processing unit and a display, the processing unit being arranged to display map data on the display.
Furthermore, the invention relates to a method of scrolling map data displayed on a navigation device.
Background
Global Positioning System (GPS) based prior art navigation devices are well known and widely used as in-vehicle navigation systems. Such GPS-based navigation devices involve a computing device that forms a functional connection with an external (or internal) GPS receiver and is capable of determining its global position. Additionally, the computing device is capable of determining a route between a start address and a destination address, which may be input by a user of the computing device. The computing device is typically enabled by software for calculating a "best" or "optimal" route between the start address location and the destination address location from a map database. The "best" or "optimal" route is determined based on predetermined criteria and need not necessarily be the fastest or shortest route.
The navigation device may typically be mounted on the dashboard of a vehicle, but may also form part of an on-board computer of a vehicle or car radio. The navigation device may also be (part of) a handheld system like a PDA.
Using position information derived from the GPS receiver, the computing device can determine its position at regular intervals and can display the current position of the vehicle to the user. The navigation device may also include a memory device for storing map data and a display for displaying selected portions of the map data.
Further, it may provide instructions how to navigate the determined route by appropriate navigation directions, which are displayed on a display and/or generated as audible signals from a speaker (e.g., "turn left in 100 meters"). Graphics depicting the behavior to be performed (e.g., a left arrow indicating a left turn ahead) may be displayed in the status bar, and may also be superimposed over the appropriate junctions/turnings, etc. in the map itself.
It is known to enable an in-vehicle navigation system to allow a driver to initiate a recalculation of a route while driving a car along the route calculated by the navigation system. This is useful when the vehicle is faced with construction work or severe traffic jams.
It is also known to enable a user to select the type of route calculation algorithm deployed by a navigation device, for example a "normal" mode or a "fast" mode (which calculates a route in the shortest time but explores fewer alternative routes than the normal mode).
It is also known to allow routes to be calculated with user-defined criteria; for example, the user may prefer a scenic route to be calculated by the device. The device software will then calculate various routes and favor those that include the greatest number of points of interest (known as POIs) marked as, for example, landscapes along their route.
According to the prior art, navigation devices are arranged to display a map on a display, for example to show a planned route to a user. This helps the user orient himself. The navigation device may also be used merely as a map display device without the option of planning a route or displaying the current location. However, typically only a portion of the map is displayed. If the user wishes to see a portion of the map that is outside the display (e.g., on the left side (or west) of the display), he/she needs to scroll the map to the right. This may be done by using (virtual) buttons provided at the navigation device or by using a mouse. Scrolling may also be accomplished by moving a pen (stylus or pointer) or finger across the touch screen.
Disclosure of Invention
It is an object of the present invention to provide an alternative method of scrolling a map displayed by a navigation device. To achieve this object, the invention provides a navigation device as defined in the preamble, characterized in that the navigation device further comprises a camera arranged to provide a camera feed to the processing unit, the processing unit being further arranged to:
- receive a camera feed from the camera,
- detect a movement of the navigation device based on the received camera feed, and
- scroll the displayed map data in response to the detected movement.
This provides an alternative method of scrolling a displayed map in a desired direction without the use of (virtual) buttons or a mouse. Furthermore, the scrolling manner corresponds to the user's intuition as if he/she were viewing the map using a magnifying glass, wherein the display acts as a magnifying glass and the map extends over an area larger than the display of the navigation device.
According to an embodiment of the invention, the processing unit is arranged to detect movement of the navigation device from the camera feed using pattern recognition techniques. Using pattern recognition techniques is an easy and reliable way to measure camera movement. These pattern recognition techniques are known to those skilled in the art.
According to an embodiment of the invention, the processing unit is arranged to detect a direction from the detected movement and to scroll the displayed map data in a direction opposite to the detected direction. The scroll direction may be determined using detection of movement of the navigation device, providing an easy and intuitive way of scrolling in a desired direction.
According to an embodiment of the invention, the processing unit is arranged to detect a distance from the detected movement and scroll the displayed map data by a scroll distance corresponding to the detected distance. The user can easily control the scroll distance of the map by relating the scroll distance of the map to the distance the navigation device moves.
According to an embodiment of the invention, the scroll distance is adjusted based on a sensitivity coefficient, which determines the amount of scrolling for a given detected movement of the navigation device.
According to an embodiment of the invention, the sensitivity coefficient is adjustable. This provides the user with the option of changing the sensitivity of the system according to his/her needs.
According to an embodiment of the invention, the sensitivity coefficient further depends on the focal distance of the camera. The camera may be provided with an autofocus function and may provide information about the selected focal distance to the processing unit. The distance to the objects in the camera feed affects the amount of detected movement: when the camera views a near object, the detected movement will be larger than when the camera views a far object. To compensate for this effect, the focal distance of the camera may be used to adjust the sensitivity coefficient.
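By way of illustration only, the mapping from detected movement to scroll distance with this compensation could be sketched as follows. This is a hypothetical Python sketch, not part of the claimed subject matter; the linear compensation model, the reference distance and all names are illustrative assumptions.

```python
def scroll_distance(moved_px, base_sensitivity, focal_distance_m, ref_distance_m=0.5):
    """Convert a detected movement (in pixels) into a map scroll distance.

    A nearer object makes the same physical movement of the device appear
    larger in the camera feed, so the sensitivity is scaled by the ratio of
    the reported focal distance to a reference distance. The linear model
    and the 0.5 m reference are illustrative assumptions.
    """
    compensation = focal_distance_m / ref_distance_m
    return moved_px * base_sensitivity * compensation
```

Under this sketch, doubling the focal distance doubles the multiplier, compensating for the smaller apparent movement of far objects.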
According to an embodiment of the invention, the processing unit is arranged to detect rotation from the detected movement and scroll the displayed map data by rotating the displayed map data in a direction opposite to the direction of the detected movement. This provides an easy and intuitive way of rotating the map view.
According to an embodiment of the present invention, the processing unit is arranged to detect movement in the direction of the optical axis of the camera and scroll the displayed map data by performing a zoom operation corresponding to the detected movement. This provides an easy way of zooming in or out.
According to an embodiment of the invention, the navigation device further comprises a memory device to store map data.
According to an embodiment of the invention, the navigation device further comprises a positioning device arranged to provide information to the processing unit to determine a current position of the navigation device, and the navigation device is arranged to be in a first mode, wherein the navigation device is arranged to display map data, or in a second mode, wherein the navigation device is arranged to determine the current position using the positioning device, plan a route and guide a user through the planned route. This integrates a map display device with a navigation device that has the option of planning a route and guiding the user through the planned route.
According to an embodiment of the invention, the navigation device is arranged to switch from the first mode to the second mode, or vice versa, based on information from the positioning device.
According to an embodiment of the invention, the information from the positioning device is one of the following: destination address, velocity, acceleration. These are indications of whether the user prefers to use the device in the first mode or the second mode. For example, when a high velocity and/or high acceleration is detected, the user may wish to use the navigation device in the second mode. When it is detected that the destination has been reached, the user may no longer wish to use the navigation device in the second mode, so the navigation device may automatically switch to the first mode.
According to an embodiment of the invention, the navigation device is arranged to switch to the first mode if no information is provided by the positioning device. In this case, the navigation device is most likely indoors, so the user does not wish to use the navigation device in the second mode.
According to an embodiment of the invention, the navigation device is arranged to switch from the first mode to the second mode, or vice versa, based on an input of the user, e.g. by pressing an appropriate button connected to the processing unit.
According to an embodiment of the invention, the navigation device is further provided with internal sensor devices, such as accelerometers and/or gyroscopes, to further improve the accuracy of the detected movement of the navigation device based on the received camera feed. These internal sensor devices may be used to provide additional information to the navigation device to further improve the accuracy of the detected movement.
According to another aspect, the invention relates to a method of scrolling map data displayed on a navigation device, the navigation device comprising a display, the method comprising:
displaying the map data on the display,
characterized in that the navigation device further comprises a camera, the method further comprising:
receiving a camera feed from the camera,
detecting a movement of the navigation device based on the received camera feed, and
scrolling the displayed map data in response to the detected movement.
According to another aspect, the invention relates to a computer program arranged to perform the above method when loaded on a computer arrangement.
According to a further aspect, the invention relates to a data carrier comprising a computer program according to the above.
Drawings
Embodiments of the invention will now be described, by way of example only, with reference to the accompanying schematic drawings in which corresponding reference symbols indicate corresponding parts, and in which:
Figure 1 schematically depicts a block diagram of a navigation device,
Figure 2 schematically depicts a view of a navigation device,
Figure 3 schematically depicts a side view of a navigation device according to an embodiment of the invention,
Figures 4a, 4b and 4c schematically depict a front view, a rear view and a side view respectively of a navigation device according to an embodiment of the invention,
Figures 5a, 5b and 5c schematically depict a navigation device according to an embodiment of the invention, and
Figure 6 shows a flow diagram according to an embodiment of the invention.
Detailed Description
Fig. 1 shows a schematic block diagram of an embodiment of a navigation device 10 comprising a processor unit 11 performing arithmetic operations. The processor unit 11 is arranged to communicate with memory units storing instructions and data, such as a hard disk 12, a Read Only Memory (ROM) 13, an Electrically Erasable Programmable Read Only Memory (EEPROM) 14 and a Random Access Memory (RAM) 15. The memory units may include map data. This map data may be two-dimensional map data (latitude and longitude), but may also include a third dimension (altitude). The map data may further comprise additional information, such as information about gas stations and points of interest. The map data may also include information about the shape of buildings and objects along the road.
The processor unit 11 may also be arranged to communicate with one or more input devices, such as a keyboard 16 and a mouse 17. The keyboard 16 may, for example, be a virtual keyboard provided on the display 18 (being a touch screen). The processor unit 11 may further be arranged to communicate with one or more output devices, such as a display 18, a speaker 29, and one or more reading units 19 to read, for example, floppy disks 20 or CD ROMs 21. The display 18 may be a conventional computer display (e.g., an LCD) or may be a projection-type display, such as a heads-up display used to project instrument data onto an automobile windscreen or windshield. The display 18 may also be a display arranged to act as a touch screen, which allows a user to input instructions and/or information by touching the display 18 with his finger.
The speaker 29 may also be formed as part of the navigation device 10. In the case where the navigation device 10 is used as an in-vehicle navigation device, the navigation device 10 may use speakers of a car radio, a plug-in computer, or the like.
The processor unit 11 may further be arranged to communicate with a positioning device 23, such as a GPS receiver, which provides information about the position of the navigation device 10. According to this embodiment, the positioning device 23 is a GPS based positioning device 23. It will be appreciated, however, that the navigation device 10 may implement any type of location sensing technology and is not limited to GPS. Thus, it may be implemented using other types of GNSS (Global navigation satellite System), such as the European Galileo System. Again, it is not limited to satellite-based position/velocity systems, but may equally be deployed using ground-based beacons or any other type of system that enables a device to determine its geographic location.
It should be understood, however, that more and/or other memory units, input devices, and read devices known to those skilled in the art may be provided. In addition, one or more of these may, if desired, be located physically remote from the processor unit 11. The processor unit 11 is shown as one block; however, it may comprise several processing units running in parallel or controlled by one main processor, possibly located remotely from each other, as known to the person skilled in the art.
The navigation device 10 is shown as a computer system, but it may be any signal processing system with analog and/or digital and/or software techniques arranged to perform the functions discussed herein. It will be appreciated that although the navigation device 10 is shown in figure 1 as multiple components, the navigation device 10 may be formed as a single device.
The navigation device 10 may use navigation software, such as that known as Navigator from TomTom B.V. Navigator software can run on a touch-screen (i.e., stylus-controlled) Pocket PC-powered PDA device (e.g., a Compaq iPAQ), as well as on devices with an integral GPS receiver 23. The combined PDA and GPS receiver system is designed for use as an in-vehicle navigation system. The invention may also be implemented with any other arrangement of navigation device 10, such as a device with an integral GPS receiver/computer/display, or a device designed for non-vehicle use (e.g., for walkers) or for vehicles other than automobiles (e.g., aircraft).
Fig. 2 depicts a navigation device 10 as described above.
Navigator software, when running on the navigation device 10, causes the navigation device 10 to display a normal navigation mode screen at the display 18, as shown in fig. 2. This view may provide driving instructions using a combination of text, symbols, voice guidance, and moving maps. The main user interface elements are as follows: the 3D map occupies most of the screen. It should be noted that the map may also be shown as a 2D map.
The map shows the location of the navigation device 10 and its immediate surroundings, rotated in such a way that the direction of movement of the navigation device 10 is always "up". Running along the lower quarter of the screen may be a status bar 2. The current location of the navigation device 10 (as determined by the navigation device 10 itself using conventional GPS location finding methods) and its orientation (as inferred from its direction of travel) are depicted by the location arrow 3. The route 4 calculated by the device (using route calculation algorithms stored in the memory devices 12, 13, 14, 15 and applied to the map data stored in the memory devices 12, 13, 14, 15) is shown as a shaded (or highlighted) path. On the route 4, all major actions (e.g., turns, crossroads, detours, etc.) are schematically depicted by arrows 5 overlaying the route 4. The status bar 2 may also contain a schematic icon 6 on its left side depicting the next action, here a right turn. The status bar 2 also shows the distance to the next action (i.e., the right turn), here 50 meters, which is extracted from a database of the entire route calculated by the device, i.e., a list of all roads and related actions defining the route to be taken. The status bar 2 also shows the name of the current road 8, the estimated time before arrival 9 (here 35 minutes), the estimated actual arrival time 25 (4:50 pm) and the distance to the destination 26 (31.6 km). The status bar 2 may further show additional information, such as GPS signal strength in the form of a mobile-phone-type signal strength indicator.
As already mentioned above, the navigation device may comprise an input device, such as a touch screen, which enables a user to invoke a navigation menu (not shown). From this menu, other navigation functions may be initiated or controlled. Allowing navigation functions to be selected from a menu screen that is itself very easy to invoke (e.g., only one step apart from the map display to the menu screen) greatly simplifies user interaction and makes it faster and easier. The navigation menu contains options for the user to enter a destination.
The actual physical structure of the navigation device 10 itself may be substantially indistinguishable from any conventional handheld computer, except for the integral GPS receiver 23 or a GPS data feed from an external GPS receiver. Thus, the memory devices 12, 13, 14, 15 store the route calculation algorithms, map database and user interface software; the processor unit 11 interprets and processes user input (e.g., using the touch screen to enter start and destination addresses and all other control inputs) and deploys the route calculation algorithms to calculate an optimal route. "Optimal" may refer to criteria such as shortest time or shortest distance, or some other user-related factors.
More specifically, the user enters his start position and desired destination into the navigation software running on the navigation device 10 using the provided input devices (e.g., touch screen 18, keyboard 16, etc.). The user then selects the manner in which the travel route is calculated: various modes are provided, such as a "fast" mode, which calculates routes very quickly but may not find the shortest route, or a "complete" mode, which looks at all possible routes and locates the shortest one but takes longer to calculate, etc. Other options are possible, in which the user defines a scenic route, for example passing the most points of interest (POIs) marked as views of outstanding beauty, or passing the most POIs likely to be of interest to children, or using the fewest junctions, etc.
Roads themselves are described in the map database that is part of (or otherwise accessed by) the navigation software running on the navigation device 10 as lines, i.e. vectors (e.g. start point, end point, road direction, an entire road being made up of hundreds of such sections, each uniquely defined by start point/end point direction parameters). A map is then a set of such road vectors, plus points of interest (POIs), plus road names, plus other geographic features such as park boundaries, river boundaries, etc., all of which are defined in terms of vectors. All map features (e.g., road vectors, POIs, etc.) are defined with a coordinate system that corresponds or correlates to the GPS coordinate system, enabling the device location determined by the GPS system to be located on the relevant road shown in the map.
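Purely as an illustration of such a vector map and of matching a GPS fix to a road, consider the following Python sketch. It is not part of the claimed subject matter; the segment fields and the crude midpoint-based matching are illustrative assumptions, far simpler than real map matching.

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    # One road vector: start and end points in the same (lat, lon)
    # coordinate system as the GPS fixes. Field names are illustrative;
    # a whole road is an ordered list of such segments.
    start: tuple
    end: tuple
    name: str

def nearest_segment(position, segments):
    """Match a GPS position to the closest road segment.

    Uses squared distance to each segment's midpoint, a crude stand-in
    for real map matching, only to illustrate how a vector map lets a
    GPS fix be located on a road shown in the map.
    """
    def midpoint_dist(seg):
        mx = (seg.start[0] + seg.end[0]) / 2
        my = (seg.start[1] + seg.end[1]) / 2
        return (position[0] - mx) ** 2 + (position[1] - my) ** 2
    return min(segments, key=midpoint_dist)
```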
Route calculation uses complex algorithms that are part of the navigation software. The algorithms are applied to obtain a large number of different possible routes. The navigation software then evaluates them against user-defined criteria (or device defaults), such as a full-mode scan with a scenic route, passing museums, and avoiding speed cameras. The route that best meets the predetermined criteria is then calculated by the processor unit 11 and stored in a database in the memory devices 12, 13, 14, 15 as a sequence of vectors, road names and actions to be taken at vector end-points (e.g., at a predetermined distance along each road of the route: after 100 meters, turn left into street X).
The navigation device 10 is provided with a camera 24, as shown in fig. 1. Fig. 3 schematically depicts a side view of the navigation device 10 showing the camera 24 being integrally formed with the navigation device 10. Fig. 3 also schematically shows a display 18. The camera 24 is arranged to generate a camera feed and to transmit this camera feed to the processor unit 11, as shown in fig. 1. The processor unit 11 is arranged to analyze the received camera feed, as will be described below.
Fig. 4a and 4b show a front view and a rear view, respectively, of an alternative navigation device 10'. Fig. 4a shows: a display 18 that displays a part of the map data; and a button arrangement 30 comprising one or more integrally formed buttons. Fig. 4b shows that the camera 24 is provided at the rear side of the navigation device 10'. Finally, fig. 4c schematically shows a side view of the navigation device 10'.
The alternative navigation device 10 'described with reference to figures 4a, 4b and 4c is primarily a map display device which has no options to determine the current location of the navigation device 10', plan a route and guide the user to a destination address. Thus, the navigation device 10 'is arranged to operate only in a first mode, in which the navigation device 10' is used only as a map display device, whereas the navigation device 10 is arranged to operate in a first mode and a second mode, in which the second mode is to determine a current position, plan a route and guide a user through a planned route.
It will be appreciated that both the navigation device 10 described with reference to figures 1, 2 and 3 and the navigation device 10' described with reference to figures 4a, 4b and 4c may be used in conjunction with the present invention.
According to the present invention, the navigation device 10, 10 'is arranged to detect movement of the navigation device 10, 10' by analyzing images recorded by the camera 24. This can be done by using simple pattern recognition techniques known from the prior art.
For example, pattern recognition techniques are known that are able to distinguish certain characterizing features (edges, corners) in a camera feed and follow the features as the camera 24 (or the item being photographed) moves. By doing so, the movement of the camera 24 can be detected. To do so, the memory devices 12, 13, 14, 15 may store program instructions that instruct the processor unit 11 to perform pattern recognition techniques to detect movement of the camera 24, and thus the navigation device 10, 10'.
The pattern recognition technique preferably follows a plurality of features in the camera feed. The more features are followed, the more reliably the movement of the navigation device 10, 10' can be detected. When multiple features are followed, the processor unit 11 may determine that the camera 24 is moving only if most of the features (e.g., more than 75%) are engaged in similar movement. This prevents movement of a photographed object from being erroneously detected as movement of the camera: movement of a photographed object typically results in movement of only a relatively small number of features.
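By way of illustration only, this majority-agreement check on tracked feature displacements could be sketched as follows. This hypothetical Python sketch is not part of the claimed subject matter; the function names, the pixel tolerance and the representation of the 75% threshold are illustrative assumptions.

```python
from statistics import median

def detect_translation(displacements, agreement=0.75, tol=5.0):
    """Estimate camera translation from per-feature displacement vectors.

    `displacements` is a list of (dx, dy) pixel displacements of tracked
    features between two consecutive camera frames. Returns the median
    displacement if at least the `agreement` fraction of features move
    consistently (within `tol` pixels of the dominant motion), otherwise
    None, so that a moving photographed object is not mistaken for
    movement of the camera itself.
    """
    if not displacements:
        return None
    mx = median(d[0] for d in displacements)
    my = median(d[1] for d in displacements)
    # Count features whose motion is close to the dominant (median) motion.
    consistent = sum(
        1 for dx, dy in displacements
        if abs(dx - mx) <= tol and abs(dy - my) <= tol
    )
    if consistent / len(displacements) >= agreement:
        return (mx, my)
    return None  # most likely a moving object, not camera movement
```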
The camera 24 is preferably able to focus on objects relatively close to the navigation device 10, 10 'to allow use of the scrolling option according to the invention when the navigation device 10, 10' is placed on a table or the like.
The displayed map is scrolled based on the detected camera movement. For example, when the camera 24 is detected moving to the left, the map may be scrolled to the right. Of course, the map data may also be scrolled to the left based on the detected leftward movement of the navigation device 10, but scrolling in the direction opposite to the detected movement better corresponds to the intuition of the user, as will be further explained with reference to fig. 5c.
This is further shown in figs. 5a and 5b. Fig. 5a depicts a navigation device 10' similar to fig. 4a. Fig. 5b depicts the navigation device 10' of fig. 5a after being moved to the upper left along arrow A compared to its position in the real world in fig. 5a. The navigation device 10' detects this movement by analyzing the camera feed received from the camera 24 using pattern recognition techniques as described above. This also applies to the navigation device 10 depicted in figs. 1, 2 and 3.
Of course, a coefficient of sensitivity may be applied to determine the scroll distance based on the detected movement of the navigation device 10, 10'. This sensitivity coefficient may be adjusted by the user, for example via a menu available on the navigation device 10, 10'.
When a movement of the navigation device 10, 10' is detected, the processor unit 11 is arranged to scroll the displayed map data in a respective direction, in this case in the opposite direction, i.e. lower right. The distance of scrolling the map (the scroll distance) depends on the distance of the detected movement of the navigation device 10, 10'. It will be appreciated that the distance of movement of the feature being followed in the camera feed depends not only on the distance the navigation device 10, 10 'is moved, but also on the distance between the navigation device 10, 10' and the object being photographed. Thus, the coefficient of sensitivity may also depend on the distance between the camera 24 and the item being photographed. This distance may be determined by the processing unit 11 or the camera 24 by determining an appropriate focal distance.
The above creates, when the navigation device 10, 10' is moved, the illusion that the navigation device 10, 10' is a magnifying glass used to view a map extending beyond the boundaries of the display 18. As the navigation device 10 moves, different parts of the map are displayed. This is shown schematically in fig. 5c. Fig. 5c shows the navigation device 10, 10' surrounded by virtual map data 31. When the navigation device 10, 10' moves to the right, the virtual map data 31 to the right of the navigation device 10, 10' is displayed. In other words: it is as if the navigation device 10, 10' were moving over an infinite map.
Fig. 6 schematically depicts a flow chart of the steps performed by the processor unit 11 when performing the above described scrolling method.
In a first step 51, the processor unit 11 receives a camera feed from the camera 24. In practice, the camera feed may be a continuous signal that is continuously received by the processor unit 11.
In a second step 52, the processor unit 11 detects the direction of movement of the navigation device 10, 10' based on the received camera feed. This can be done by using all types of analysis techniques, such as the pattern recognition techniques described above.
In a third step 53, the processor unit 11 detects the movement distance of the navigation device 10, 10' based on the movement detected from the received camera feed.
Again, this can be accomplished by using all types of analysis techniques, such as described above.
In a fourth step 54, the detected distance may be multiplied by a sensitivity coefficient. This coefficient may be adjusted by the user, or may depend on the distance between the camera 24 and the item being photographed, or on the scale of the displayed map.
Finally, in a fifth step 55, the displayed map data is scrolled in a direction corresponding to the detected direction (e.g., the opposite direction). The scroll distance may depend on the detected movement distance of the navigation device and the coefficient of sensitivity.
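Purely by way of illustration, steps 51 to 55 could be combined into a single loop iteration roughly as follows. This hypothetical Python sketch is not part of the claimed subject matter; `track` and `map_view` are assumed placeholders standing in for the pattern-recognition step and for the display, respectively.

```python
def scroll_step(prev_frame, frame, sensitivity, track, map_view):
    """One iteration of the scrolling loop (steps 51-55).

    `track(prev_frame, frame)` is assumed to return the dominant (dx, dy)
    feature displacement in pixels, or None if no consistent camera
    movement was detected. `map_view` is assumed to expose a
    `scroll(dx, dy)` method. Both are illustrative assumptions.
    """
    # Steps 51-53: receive the feed and detect direction and distance.
    movement = track(prev_frame, frame)
    if movement is None:
        return
    dx, dy = movement
    # Step 54: apply the sensitivity coefficient.
    # Step 55: scroll opposite to the detected movement.
    map_view.scroll(-dx * sensitivity, -dy * sensitivity)
```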
It will be appreciated that the processor unit 11 in fact performs a loop, constantly receiving and analysing the camera feed and constantly scrolling the displayed map data accordingly.
The above only relates to the lateral movement of the navigation devices 10, 10' and the map data. However, it will be appreciated that the above may also be used for other types of movement, such as performing rotational movement and performing zoom operations.
Using the pattern recognition techniques mentioned above, rotational movement of the navigation device 10, 10' may be detected. This can be done by identifying characterizing features (edges, corners) in the camera feed and following these features as described above. Both the direction of rotation and the angle of rotation can be detected by the processor unit 11 based on the camera feed. Next, the displayed map data is scrolled (rotated) in the opposite direction over a similar angle.
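As an illustration only, the rotation angle could be recovered from matched feature positions in two consecutive frames as follows. This hypothetical Python sketch is not part of the claimed subject matter; it assumes a pure in-plane rotation and recovers the angle from summed cross and dot products about the centroids, which is the least-squares rotation fit.

```python
from math import atan2, degrees

def estimate_rotation(points_before, points_after):
    """Estimate the camera's in-plane rotation angle in degrees.

    Both inputs are lists of matched (x, y) feature positions in two
    consecutive frames. The points are centred on their centroids, and
    the rotation angle is atan2 of the summed cross product over the
    summed dot product (positive = counter-clockwise).
    """
    n = len(points_before)
    cbx = sum(p[0] for p in points_before) / n
    cby = sum(p[1] for p in points_before) / n
    cax = sum(p[0] for p in points_after) / n
    cay = sum(p[1] for p in points_after) / n
    cross = dot = 0.0
    for (xb, yb), (xa, ya) in zip(points_before, points_after):
        xb, yb = xb - cbx, yb - cby  # centre on the "before" centroid
        xa, ya = xa - cax, ya - cay  # centre on the "after" centroid
        cross += xb * ya - yb * xa
        dot += xb * xa + yb * ya
    return degrees(atan2(cross, dot))
```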
Using the pattern recognition techniques mentioned above, movement of the navigation device 10, 10' in the direction of the optical axis of the camera 24 may be detected. This can be done by identifying characterizing features (edges, corners) in the camera feed and following these features. The direction and amount of movement may be detected by the processor unit 11 based on the camera feed. When a movement along the optical axis of the camera 24 in the direction in which the camera 24 faces is detected, the processor unit 11 may magnify the displayed map data, i.e. zoom in. When the opposite movement is detected (thus along the optical axis of the camera 24 but opposite to the direction in which the camera 24 faces), the processor unit 11 may scale down the displayed map data, i.e. zoom out.
The amount of zooming in or out depends on how far the navigation device 10, 10' is moved along the optical axis. Again, a sensitivity coefficient may be applied to adjust the zoom speed as desired by the user. Of course, the zoom may also be reversed: when a movement along the optical axis in the direction the camera 24 faces is detected, the processing unit 11 may instead reduce the displayed map data, i.e. zoom out.
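As an illustration of the zoom behaviour described above (a sketch only; the linear mapping, the default sensitivity value and the sign convention are assumptions):

```python
def zoom_factor(axial_movement, sensitivity=0.1, reverse=False):
    """Map a movement of the device along the camera's optical axis to
    a zoom factor. Positive axial_movement means the device moved in
    the direction the camera faces; a factor above 1 enlarges the map
    (zoom in), a factor below 1 reduces it (zoom out). `reverse`
    inverts the behaviour, as the text allows."""
    if reverse:
        axial_movement = -axial_movement
    return 1.0 + sensitivity * axial_movement
```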
It will be appreciated that the invention as described above is applicable not only to navigation devices 10, 10' arranged to display map data in a two-dimensional manner (2D), but also to navigation devices arranged to display map data in a (quasi-) stereoscopic or three-dimensional manner (3D). When a (quasi-) stereoscopic or three-dimensional mode is used, the map is represented as it would be seen when looking down, at a certain viewing angle, from a viewpoint above the earth's surface represented by the map data.
In case the navigation device displays map data in a stereoscopic or three-dimensional manner (3D), a scroll operation different from the two-dimensional case may be performed based on the detected movement. For example, a movement of the navigation device 10, 10' in the direction of the optical axis may not result in a zoom operation, but in a scrolling of the map data in a direction that gives the user the feeling of "flying" over the surface of the earth represented by the map data. Further, a rotational movement may cause a change in the direction in which the map data are viewed from the viewpoint. Thus, in case a counter-clockwise rotational movement of the navigation device 10, 10' is detected, the viewing direction may change from north to west (thus scrolling the map data in a clockwise direction).
Pattern recognition techniques can detect all these types of movement by tracking features and analysing their motion. If, for example, most or all features move in the same direction, a lateral movement (left, right, up, down, or a combination thereof) may be detected. If, for example, features in the upper part of the camera feed mostly move to the left, features in the lower part move to the right, features in the left part move downwards and features in the right part move upwards, a (counter-clockwise) rotational movement may be detected. Furthermore, if most features appear to move away from a centre point, a movement along the optical axis of the camera 24 in the direction the camera 24 faces may be detected.
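The classification just described could be sketched as follows. The (position, displacement) feature representation, the threshold scheme and the decision order are illustrative assumptions, not taken from the patent:

```python
import math

def classify_motion(flows, threshold=0.8):
    """Classify camera motion from feature flow vectors.

    `flows` is a list of ((x, y), (dx, dy)) pairs: a feature position
    relative to the image centre and its frame-to-frame displacement.
    Returns 'lateral', 'rotation', 'zoom' or 'none'."""
    if not flows:
        return 'none'
    n = len(flows)
    # Mean displacement: large if most features move the same way (lateral).
    mx = sum(dx for _, (dx, dy) in flows) / n
    my = sum(dy for _, (dx, dy) in flows) / n
    radial = 0.0      # positive if features move away from the centre (zoom)
    tangential = 0.0  # signed cross product: circular motion (rotation)
    avg_mag = 0.0
    for (x, y), (dx, dy) in flows:
        r = math.hypot(x, y) or 1.0
        radial += (x * dx + y * dy) / r
        tangential += (x * dy - y * dx) / r
        avg_mag += math.hypot(dx, dy)
    radial /= n
    tangential /= n
    avg_mag /= n
    if avg_mag == 0:
        return 'none'
    if math.hypot(mx, my) > threshold * avg_mag:
        return 'lateral'
    if abs(tangential) > threshold * avg_mag:
        return 'rotation'
    if abs(radial) > threshold * avg_mag:
        return 'zoom'
    return 'none'
```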
The accuracy of the movement detected based on the camera feed may be further improved by using an internal sensor device 28 (see fig. 1), such as an accelerometer, a gyroscope, etc. The internal sensor device 28 is arranged to detect movements, accelerations and rotations of the navigation device 10, 10'. The internal sensor device 28 may be connected to the processing unit 11 to transmit its readings to the processing unit 11.
The readings of the internal sensor device may be used to improve the accuracy of the movement of the navigation device 10 detected based on the camera feed. This may improve the quality of the motion/angle detection based on the camera feed, or be used to adjust the speed and/or direction of the map scrolling (panning/rotating/zooming).
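One simple way to combine the two sources is a weighted average, sketched below. The fixed weighting and the per-axis displacement representation are assumptions made for illustration; a real device might instead use e.g. a Kalman filter:

```python
def fuse_displacement(camera_est, sensor_est, camera_weight=0.7):
    """Blend a camera-derived displacement estimate with one derived
    from the internal sensor device (accelerometer/gyroscope), giving
    more weight to whichever source is trusted more. Both estimates
    are (dx, dy) tuples in the same units."""
    w = camera_weight
    return tuple(w * c + (1.0 - w) * s
                 for c, s in zip(camera_est, sensor_est))
```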
The above-described way of scrolling a map based on movements of the navigation device 10, 10' detected by analysing the camera feed may be used in combination with prior art scrolling options, e.g. using (virtual) buttons or a mouse. When used in combination, the user may initiate, complete or correct a scrolling movement performed based on the camera feed.
The term navigation device 10, 10' as used herein not only refers to navigation devices arranged to determine a current position or an optimal route to a certain destination, but also covers all devices that assist a user in navigating or orientating him/herself by displaying (part of) a map, such as the devices depicted in figs. 4a, 4b and 4c.
In the case where the present invention is used in conjunction with a navigation device 10 that is provided with functionality to determine a current location, plan a route, and/or navigate according to such a planned route, the navigation device 10 needs to be provided with an option to switch from a first mode, in which the navigation device 10 is used only as a map display device, to a second mode, in which the navigation device 10 is used to guide a user through the planned route.
The switching from the first mode to the second mode and vice versa may be done manually by the user, for example using appropriate buttons, but may also be done automatically by the navigation device 10. Such buttons may also be virtual buttons.
The button is arranged to provide a signal to the processing unit 11 indicating a switch mode, or indeed a switch to the first or second mode.
For example, when the navigation device 10 is arranged to receive GPS signals using the positioning device 23, it can detect the movement and the speed of movement of the navigation device 10. When the detected speed (velocity) and/or acceleration is relatively high, e.g. relative to a predetermined threshold, the navigation device 10 may automatically switch (from the first mode) to the second mode. In that case, the user is likely travelling and focused on a map view based on the current position of the navigation device 10, and does not need to scroll the map other than by the travelling of the navigation device 10 itself.
In case the positioning device 23 of the navigation device 10 does not receive any GPS signal, the system may switch to the first mode. In that case, the navigation device 10 is most likely indoors and the user is probably not travelling. The navigation device may use the last valid GPS position received by the positioning device 23 to determine which part of the map to display on the display 18. The navigation device 10 may also use the internal sensor device 28, such as an accelerometer, gyroscope, etc., to determine its position more accurately.
The navigation device 10 may also switch to the first mode when the desired destination (address) has been reached according to the positioning device 23.
Furthermore, if the positioning device 23 of the navigation device 10 does not receive a valid GPS signal, but a relatively large movement is detected by the internal sensor device 28 (e.g. accelerometer, gyroscope, etc.) and/or the camera 24, the navigation device 10 may switch (from the second mode) to the first mode.
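The switching heuristics above could be combined into a single policy. The sketch below is one plausible reading; the threshold values and units are hypothetical, and it assumes the first mode is the map-scrolling mode and the second the route-guidance mode, as defined earlier:

```python
def select_mode(gps_valid, speed, device_motion,
                speed_threshold=5.0, motion_threshold=0.5):
    """Choose the operating mode from positioning and sensor readings.
    Returns 'first' (map display with camera-based scrolling) or
    'second' (position determination and route guidance). All
    thresholds and units are hypothetical."""
    if not gps_valid:
        # No GPS fix: probably indoors, so let the user browse the map.
        return 'first'
    if speed > speed_threshold:
        # The user is likely travelling: follow the current position.
        return 'second'
    if device_motion > motion_threshold:
        # The device itself is being moved about: a scrolling gesture.
        return 'first'
    return 'second'
```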
While specific embodiments of the invention have been described above, it will be appreciated that the invention may be practiced otherwise than as described. For example, the invention may take the form of a computer program containing one or more sequences of machine-readable instructions describing a method as disclosed above, or a data storage medium (e.g. semiconductor memory, magnetic or optical disk) having such a computer program stored therein. Those skilled in the art will appreciate that all software components may also be constructed using hardware components.
The above description is intended to be illustrative and not restrictive. Thus, it will be apparent to those skilled in the art that modifications may be made to the invention as described without departing from the scope of the claims set out below.

Claims (19)

1. A navigation device (10, 10 ') comprising a processing unit (11) and a display (18), the processing unit (11) being arranged to display map data on the display (18), characterized in that the navigation device (10, 10') further comprises a camera (24) arranged to provide a camera feed to the processing unit (11), the processing unit (11) being further arranged to:
receive a camera feed from the camera (24),
detect a movement of the navigation device (10, 10') based on the received camera feed, and
scroll the displayed map data in response to the detected movement.
2. A navigation device (10, 10 ') according to claim 1, wherein the processing unit (11) is arranged to detect the movement of the navigation device (10, 10') from the camera feed using pattern recognition techniques.
3. Navigation device (10, 10') according to any one of the preceding claims, wherein the processing unit (11) is arranged to detect a direction of the detected movement and scroll the displayed map data in a direction opposite to the detected direction.
4. A navigation device (10, 10') according to any of claims 1-3, wherein the processing unit (11) is arranged to detect a distance of the detected movement and scroll the displayed map data by a scroll distance corresponding to the detected distance.
5. The navigation device (10, 10') of claim 4, wherein the scroll distance is adjusted based on a coefficient of sensitivity.
6. Navigation device (10, 10') according to claim 5, wherein the coefficient of sensitivity is adjustable.
7. Navigation device (10, 10') according to any of claims 5-6, wherein the sensitivity factor further depends on a focal length of the camera (24).
8. Navigation device (10, 10') according to any one of the preceding claims, wherein the processing unit (11) is arranged to detect a rotation of the detected movement and scroll the displayed map data by rotating the displayed map data in a direction opposite to the direction of the detected movement.
9. Navigation device (10, 10') according to any one of the preceding claims, wherein the processing unit (11) is arranged to detect a movement in the direction of the optical axis of the camera (24) and to scroll the displayed map data by performing a zoom operation corresponding to the detected movement.
10. Navigation device (10, 10 ') according to any of the preceding claims, wherein the navigation device (10, 10') further comprises a memory device (12, 13, 14, 15) to store map data.
11. Navigation device (10) according to any one of the preceding claims, wherein the navigation device (10) further comprises a positioning device (23) arranged to provide the processing unit (11) with information to determine a current position of the navigation device (10), and the navigation device (10) is arranged to be in a first mode, wherein the navigation device (10) is arranged to display map data, or in a second mode, wherein the navigation device (10) is arranged to use the positioning device (23) to determine a current position, plan a route and guide the user through a planned route.
12. Navigation device (10) according to claim 11, wherein the navigation device (10) is arranged to switch from the first mode to the second mode, or vice versa, based on information from the positioning device (23).
13. Navigation device (10) according to claim 12, wherein the information from the positioning device (23) is one of the following: destination address, velocity, acceleration.
14. Navigation device (10) according to any of claims 11-13, wherein the navigation device (10) is arranged to switch to the second mode if no information is provided by the positioning device (23).
15. Navigation device (10) according to claim 11, wherein the navigation device (10) is arranged to switch from the first mode to the second mode, or vice versa, based on an input by a user, e.g. by pressing an appropriate button connected to the processing unit (11).
16. Navigation device (10, 10 ') according to any of the preceding claims, wherein the navigation device (10, 10 ') is further provided with internal sensor devices (28), such as accelerometers and/or gyroscopes, to further improve the accuracy of the detected movement of the navigation device (10, 10 ') based on the received camera feed.
17. A method of scrolling map data displayed on a navigation device (10), the navigation device (10, 10') comprising a display (18), the method comprising:
displaying map data on the display (18),
characterized in that the navigation device (10, 10') further comprises a camera (24), the method further comprising:
receiving a camera feed from the camera (24),
detecting movement of the navigation device (10, 10') based on the received camera feed,
scrolling the displayed map data in response to the detected movement.
18. A computer program arranged to perform the method of claim 17 when loaded on a computer arrangement.
19. A data carrier comprising a computer program according to claim 18.