
WO2011003467A1 - Touchscreen input on a multi-view display screen - Google Patents

Touchscreen input on a multi-view display screen

Info

Publication number
WO2011003467A1
WO2011003467A1 (application no. PCT/EP2009/058833)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
screen
enabled
image
icon
Prior art date
Application number
PCT/EP2009/058833
Other languages
French (fr)
Inventor
Peter Van Hulten
Original Assignee
Tomtom International B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tomtom International B.V. filed Critical Tomtom International B.V.
Priority to PCT/EP2009/058833 priority Critical patent/WO2011003467A1/en
Publication of WO2011003467A1 publication Critical patent/WO2011003467A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K 35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K 35/22 Display screens
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/65 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60K 35/654 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive the user being the driver
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K 35/65 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60K 35/656 Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive the user being a passenger
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C 21/34 Route searching; Route guidance
    • G01C 21/36 Input/output arrangements for on-board computers
    • G01C 21/3664 Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K 2360/141 Activation of instrument input devices by approaching fingers or pens
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 37/00 Dashboards

Definitions

  • Alternatively, the source modules 38 may themselves communicate with each other to incorporate equivalent display icons in the individual image signals 36, and to implement the resources described below for managing touch-screen input. At least a part of this functionality may also be implemented or managed by an operating system that defines an execution environment for the source modules, if they are software-based.
  • The functionality of this embodiment enables touch-screen input for a multi-view display to be organised efficiently, and in a manner that is clear and intuitive for plural viewers, without requiring substantial detection hardware to detect which viewer has pressed the screen.
  • Fig. 5 illustrates an example view of the Global Positioning System (GPS), usable by navigation devices.
  • Known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
  • The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, although it can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal allows the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner (the conventional range relations are sketched after this list). The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
  • The GPS system is denoted generally by reference numeral 100 in Fig. 5.
  • A plurality of satellites 120 are in orbit about the earth 124.
  • The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous.
  • A GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
  • The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock.
  • Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120.
  • The GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
  • Figure 6 is an illustrative representation of electronic components of a navigation device 200 usable as the navigation module 38a according to a preferred embodiment of the present invention, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
  • The navigation device 200 includes a processor 210 connected to receive inputs from an input device 220 (e.g. from the touch-screen input device 26 via the display controller 30) and connected to a display signal or image generator 240 for the screen 10.
  • The navigation device may include an output device 260, for example an audible output device (e.g. a loudspeaker).
  • While output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that input device 220 can include a microphone and software for receiving input voice commands as well.
  • The processor 210 is operably coupled to a memory resource 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200.
  • The memory resource 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM), and a non-volatile memory, for example a digital memory, such as a flash memory.
  • The external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example.
  • The connection to I/O device 280 can further be a wired or wireless connection to any other external device, such as a car stereo unit for hands-free operation and/or for voice-activated operation, for connection to an earpiece or headphones, and/or for connection to a mobile phone, wherein the mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network, and/or to establish a connection to a server via the internet or some other network.
  • Fig. 6 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 can be a GPS antenna/receiver for example.
  • The antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but the antenna and receiver may be separately located components, and the antenna may be a GPS patch antenna or a helical antenna, for example.
  • The electronic components shown in Fig. 6 are powered by power sources (not shown) in a conventional manner.
  • Different configurations of the components shown in Fig. 6 are considered to be within the scope of the present application.
  • The components shown in Fig. 6 may be in communication with one another via wired and/or wireless connections and the like.
  • The scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
  • The navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (such as a digital connection via known Bluetooth technology, for example). Thereafter, through its network service provider, the mobile device can establish a network connection (through the internet, for example) with a server 302. As such, a "mobile" network connection is established between the navigation device 200 (which can be, and often is, mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time" or at least very "up to date" gateway for information.
  • Establishing the network connection between the mobile device (via a service provider) and another device such as the server 302, using the internet (such as the World Wide Web) for example, can be done in a known manner. This can include the use of the TCP/IP layered protocol, for example.
  • The mobile device can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
  • An internet connection may be utilised, achieved via a data connection through a mobile phone or mobile phone technology within the navigation device 200, for example.
  • In this way, an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (a GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method of connecting to the internet).
  • The navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the Data Protocol Standard for the GSM standard, for example.
  • The navigation device 200 may include its own mobile phone technology within the navigation device 200 itself (including an antenna, for example, or optionally using the internal antenna of the navigation device 200).
  • The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g. Subscriber Identity Module or SIM card), complete with the necessary mobile phone technology and/or an antenna, for example.
  • Mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
  • For a Bluetooth-enabled navigation device to work correctly with the ever-changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer-specific settings may be stored on the navigation device 200, for example. The data stored for this information can be updated.
  • In Fig. 7, the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements.
  • The server 302 and the navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via a mobile device, a direct connection via personal computer via the internet, etc.).
  • The server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312.
  • The processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and receive information to and from the navigation device 200 via communications channel 318.
  • The signals sent and received may include data, communication, and/or other propagated signals.
  • The transmitter 308 and receiver 310 may be selected or designed according to the communications requirement and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
  • Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314.
  • The mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.
  • The navigation device 200 is adapted to communicate with the server 302 through communications channel 318, and includes processor, memory, etc. as previously described with regard to Fig. 6, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200, and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
  • Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200.
  • One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200.
  • Another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
  • The communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302.
  • Both the server 302 and navigation device 200 include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
  • The communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technologies. For example, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
  • The communication channel 318 includes telephone and computer networks. Furthermore, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, the communication channel 318 can accommodate satellite communication.
  • The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for a given communication technology.
  • The signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc.
  • Both digital and analogue signals can be transmitted through the communication channel 318.
  • These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
  • The server 302 includes a remote server accessible by the navigation device 200.
  • The server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
  • The server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200.
  • A personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200.
  • Alternatively, a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
  • The navigation device 200 may be provided with information from the server 302 via information downloads, which may be updated periodically, automatically or upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and the navigation device 200, via a wireless mobile connection device and TCP/IP connection for example.
  • The processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, the processor 210 of the navigation device 200 can also handle much processing and calculation, often independently of a connection to the server 302.
  • The multi-view display screen 10 and the navigation device 200 may be installed permanently within a vehicle to be used only with that vehicle.
  • Alternatively, one or both of the screen 10 and the navigation device 200 may be a portable unit.
  • In one arrangement, the screen 10 and the navigation device are installed together in a portable housing (not shown).
  • The navigation device may utilise any kind of position-sensing technology as an alternative to (or indeed in addition to) GPS.
  • For example, the navigation device may utilise other global navigation satellite systems, such as the European Galileo system. Equally, it is not limited to satellite-based systems, but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location.
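
For readers who want the arithmetic behind the three- and four-satellite remarks above (referenced from the GPS paragraph in this list), the range relations can be written as follows. This is the conventional GNSS formulation, given here as standard background rather than as part of the application; it also makes the receiver clock bias explicit, which the text above glosses over.

```latex
% Pseudorange from satellite i at known position (x_i, y_i, z_i) to a receiver
% at unknown position (x, y, z) with unknown clock bias \delta t:
\rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\delta t ,
\qquad i = 1, \dots, n .
```

With four satellites (n = 4) this gives four equations in the four unknowns x, y, z and δt, yielding the three-dimensional fix described above; with three satellites one of the unknowns must be constrained (for example by assuming a known altitude), which corresponds to the two-dimensional case.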

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

Apparatus comprising: a multi-view display screen (10) operable to display overlapping plural images (18) viewable on the same display screen, a viewable image being determined according to viewing position; a touch-screen input device (26) for detecting user touches of the screen (10) representing user input; a display controller (30-34) for designating a selectable one of the plural images (18a, 18b) as enabled for touch-screen input, and for generating at least one graphical display icon for indicating the image enabled for touch-screen input. The icons may include a first icon displayed in the image that is enabled for touch-screen input, and a second icon displayed in one or more other images not enabled for touch-screen input. The controller is configured to change the image designated for input in response to detection of a user touching the position of the icon in the image.

Description

TOUCHSCREEN INPUT ON A MULTI-VIEW DISPLAY SCREEN
Field of the Invention
This invention relates to organising touch-screen input on a multi-view display screen. The invention may be especially useful for in-vehicle applications, but is not limited exclusively to such use. Additionally or alternatively, the invention may be especially useful for presenting navigation and/or route-planning information.
Background to the Invention
A multi-view display screen is a type of display which generates a plurality of images on substantially the same screen area, the image viewed depending on the viewing position with respect to the display device. For example, the image viewed may depend on the viewing angle with respect to a display screen. An example of such a display device is described in US-A-7365707. The display screen is especially suitable for in-vehicle use, the same screen presenting two different display images to people in two different seating positions of the vehicle, while occupying only the same amount of space as a traditional single-view screen.
Co-pending, unpublished patent application PCT/EP2008/063391 describes a multi-dimensional display for in-car navigation.
The present invention seeks to enhance the versatility of a multi-view display screen.
Summary of the Invention
Aspects of the invention are defined in the claims.
In one aspect, the preferred embodiment provides a multi-view display screen provided with a touch-screen input device for detecting one or more touch positions at which a user touches the display screen for inputting user selections.
The inventors have identified a hitherto unappreciated problem: handling touch-screen input signals for a multi-view type of display screen is considerably more complicated than for a single-view display screen. When a touch input is detected for a multi-view screen, it may be unknown which of the plural different viewers is attempting to input information, and how that information should be handled or routed.
In a preferred embodiment, the invention provides a technique for organising the touch-screen input by designating a selectable one of plural viewable images as being enabled for touch-screen input. Preferably, a first display icon presented in, or overlaid on, the enabled image indicates to a viewer of the respective image that touch-screen input is enabled for this image (or for the source or application generating the image). Additionally or alternatively, it is preferred that a second display icon is presented in or overlaid on one or more other ones of the plural viewable images to indicate that these other images are not enabled for touch-screen input.
The technique preferably further includes changing which of the plural views is designated as enabled for touch-screen input, in response to detection that a user has pressed a displayed icon. If there are two possible images, the designation of which image is enabled toggles between the two. If there are more than two images, the designation may advance an index number identifying the respective image.
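As a minimal illustration of this paragraph (the author's own sketch is not given; the function name and zero-based view indices here are assumptions), the two-image toggle and the many-image index advance collapse into one modular-arithmetic step:

```python
def next_enabled_view(current_view: int, view_count: int) -> int:
    """Return the view index to designate as enabled for touch-screen input
    after the displayed icon has been pressed.

    With two views this simply toggles between 0 and 1; with more views it
    advances through them in turn, wrapping back to the first view.
    """
    if view_count < 2:
        return current_view  # nothing to switch to
    return (current_view + 1) % view_count


# Example: with two views the designation alternates 0 -> 1 -> 0 -> ...,
# and with three views it cycles 0 -> 1 -> 2 -> 0.
assert next_enabled_view(0, 2) == 1
assert next_enabled_view(1, 2) == 0
assert next_enabled_view(2, 3) == 0
```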
The technique preferably further includes detecting touch-screen input information not corresponding to the position of the displayed icon(s). Such information is routed to the module or application that is the source of the enabled image.
Features and advantages of the invention in its various aspects and embodiments include at least one selected from: (i) enhancing the versatility of a multi-view display screen beyond merely displaying image signals; (ii) efficiently organising touch-screen input for a multi-view display screen, without requiring additional hardware to detect which of plural possible viewers is reaching to touch the screen; (iii) presenting display icons to inform viewers whether a respective image is enabled for touch-screen input; (iv) permitting selection of a desired viewable image for designation as enabled for touch-screen input; (v) providing simple and intuitive operation that avoids the conflict between generating multiple images viewable on the same screen while having only a single touch-screen available for all viewers.
Further features and advantages are set out hereafter, and further details and features of each of these embodiments are defined in the accompanying dependent claims and elsewhere in the following detailed description. Protection is claimed for any novel feature or idea described herein and/or illustrated in the drawings, whether or not emphasis has been placed thereon.
Brief Description of the Drawings
Various aspects of the teachings of the present invention, and arrangements embodying those teachings, will hereafter be described by way of illustrative example with reference to the accompanying drawings, in which:
Fig. 1 is a schematic block diagram of a multi-view display screen with a touchscreen input device, installed in-vehicle;
Fig. 2 is a schematic view of an example of a first image viewable from a first viewing position;
Fig. 3 is a schematic view of an example of a second image viewable from a second viewing position;
Fig. 4 is a schematic flow diagram showing information flow and processing in response to detection of input via the touch-screen;
Fig. 5 is a schematic illustration of a Global Positioning System (GPS);
Fig. 6 is a schematic illustration of electronic components arranged to provide a navigation device; and
Fig. 7 is a schematic illustration of the manner in which a navigation device may receive information over a wireless communication channel.
Detailed Description of Preferred Embodiments
A preferred embodiment of the invention is now described with particular reference to a multi-view display screen used in-vehicle. However, the teachings of the invention are not limited to in-vehicle use, and may be used in any environment where touch-sensitive input is desired for multi-view display applications.
Referring to Fig. 1 , a multi-view display screen 10 is shown in a vehicle 12. The display screen 10 is typically placed forward of, and between, two seating positions, for example, a driver position 14 and a passenger position 16. The display screen 10 may be installed within a console unit of the vehicle 12, or otherwise mounted permanently or temporarily. The multi-view display screen 10 is configured to generate a plurality of images 18a, 18b, simultaneously on the same screen, each being viewable distinctly from the other(s). The image viewed depends on the viewing position (e.g. viewing angle) with respect to the screen 10. In the present embodiment, a person sitting in a first viewing position (e.g. at the driver position 14) and viewing the screen 10 from a first direction sees the first image 18a but generally not the second 18b. A person sitting in a second viewing position (e.g. at the passenger position 16) and viewing the screen 10 from a second direction sees the second image 18b but generally not the first 18a. In the present embodiment, the display screen 10 is configured to generate two distinctive images 18a, 18b, but other screens 10 may generate three or more distinct images each depending on viewing position. The multi-view display screen 10 enables two individuals to observe different display images. For example, a driver typically desires knowledge-based information, such as vehicle, weather or navigation information. A passenger may instead desire entertainment-based information, such as movies, games or internet browsing. The display screen 10 enables a single screen to provide such information, and without an image intended for the passenger distracting the driver.
Fig. 2 illustrates an example of a first display image 18a generated by a vehicle navigation module 38a (described in more detail below). Fig. 3 illustrates an example of a second display image 18b generated by a games or media module 38b (not shown in detail).
Different constructions of multi-view display screen 10 are envisaged. In one form, the screen 10 generally comprises a display panel 22, and an optical separator 24. The display panel 22 may be of a liquid crystal or light-emitting diode type, providing a pixel-based display resolution typically N-times the required resolution of each image, where N is the number of distinct views. The optical separator 24 determines which pixels are viewable according to the viewing position. The optical separator 24 typically includes optical elements for directing and/or blocking light from each pixel, or each pixel group. Typical optical elements include small prism elements, small lenses or small optical masks. Examples of multi-view display screens suitable for use with the present invention are described in, for example, the aforementioned US 7365707, the entire content of which is incorporated herein by reference.
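The application does not specify how the panel's pixels are shared between the views; purely to make the "N-times resolution" point concrete, the following sketch assumes a simple column-interleaved layout for N = 2, as commonly used with parallax-barrier separators (that layout, and the function below, are assumptions rather than part of the application):

```python
def interleave_two_views(view_a, view_b):
    """Compose two equally sized images into a single panel buffer.

    Each image is a list of rows, each row a list of pixel values.  Even panel
    columns carry view A, odd columns carry view B, so the panel needs twice
    the horizontal resolution of either view (N = 2 in the text above); the
    optical separator then steers alternate columns towards different viewers.
    """
    if len(view_a) != len(view_b) or any(
        len(row_a) != len(row_b) for row_a, row_b in zip(view_a, view_b)
    ):
        raise ValueError("both views must have identical dimensions")
    panel = []
    for row_a, row_b in zip(view_a, view_b):
        panel_row = []
        for pixel_a, pixel_b in zip(row_a, row_b):
            panel_row.append(pixel_a)  # column visible from viewing position A
            panel_row.append(pixel_b)  # column visible from viewing position B
        panel.append(panel_row)
    return panel
```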
The screen 10 is controlled by a display controller 30, including a display driver or generator 32. The display generator receives respective image source signals 36a, 36b, one for each image 18a, 18b, to be displayed. The input signals 36 are typically supplied by respective source modules 38. The modules 38 may typically include one or more of: a vehicle navigation module 38a, a media player and/or games module 38b, a vehicle information module 38c, an internet browser module 38d, a mobile telephone communications module 38e. The modules 38 may be implemented by software executed on suitable hardware, or by dedicated module circuitry. Common hardware may optionally be used for plural modules. Additionally or alternatively, one or more of the modules 38 may be implemented as distinct units.
A feature of the present embodiment is that the display screen 10 is provided with a touch-screen input device 26 for enabling a person to make selections or input commands by touching the screen 10. If the screen 10 includes an optical separator 24, the touch-screen input device 26 may be disposed in front of the optical separator 24, or the two may be at least partly integrated together. The touch-screen input device 26 may, for example, detect touches by means of capacitance sensing, or resistance sensing, or light-interruption, or inductance sensing, or processing of images captured by camera, or any combination of two or more of these. The touchscreen input device 26 may optionally be configured to detect multiple touches (e.g. to identify plural positions at which the screen 10 is touched simultaneously by several fingers). The touch-screen input device 26 is preferably able to detect and generate an output indicative of at least one position at which the screen is touched.
Handling touch-sensitive input signals for a multi-view display screen 10 is more complicated than for a single-view display screen. When a touch input is detected for a multi-view screen, it may be unknown whether a first person is making a selection while viewing the first image 18a, or a second person is making a different selection while viewing the different second image 18b. While the multi-view display 10 can efficiently generate plural images on the same screen, it still has only a single screen surface available for touch inputs by either user. It is therefore complicated to route the input signals to the appropriate source module 38. In the preferred embodiment, this is addressed by designating a selectable one of the images 18 (or one of the graphics input signals 36 or the respective source module 38) as being active or enabled for touch-screen input. The one or more other images (or other graphics input signals or other source modules) are designated as non-enabled. The identity of the enabled image is stored as enabled-image information 40 in a register of the display controller 30. The identity of the enabled image is depicted graphically on the screen 10. For example, one or more graphics icons 42, 44 may be overlaid on, or included in, at least one of the images 18 to indicate the image enabled for touch-screen input. Preferably an icon is generated for each image for indicating the enabled/non-enabled state of the respective image. A first icon 42 (see Fig. 2) appearing in an image 18a indicates that the respective image is designated as active or enabled for touch-screen input, or rather that touch-screen inputs will be routed to the respective source module for that image. A viewer seeing the first icon 42 in the image is made aware that he or she may make selections or enter commands using the touch-screen input device 26. A second icon 44 (see Fig. 3) appearing in an image 18b indicates that the respective image is not active for touch-screen inputs. A viewer seeing the second icon 44 in the image is made aware that he or she should not in general use the touch-screen (except to acquire active status, as described below).
The first and second icons 42 and 44 may be similar, but incorporate differences that can readily be understood to indicate "enabled" or "disabled" according to conventional user-interface practice. The examples illustrated in Figs. 2 and 3 include a tick (e.g. coloured green) depicting "enabled" in the first icon 42, and a cross (e.g. coloured red) depicting "disabled" in the second icon 44, the icons themselves being similar in the form of a hand representing touch-sensitive input. Other distinctions may include, for example, an icon being "greyed-out" or made partly transparent, in order to indicate a disabled status. In the preferred embodiment, the icons are generated by the display generator 32 of the display controller 30. The display generator 32 includes an icon resource 32a responsive to the enabled-image information 40 for generating the appropriate graphics icons. The display generator 32 overlays the icons on the respective image signals 36a and 36b, so that the respective icons 42, 44 appear in the displayed images 18a and 18b. The icons 42, 44 may be displayed permanently, or the icons may be displayed intermittently so as not to permanently occupy space in the displayed images 18. The icons 42, 44 may be arranged near or at the same position in each display image 18a, 18b. Alternatively, the icons 42, 44 may be arranged at different positions, provided that this does not conflict with other input areas of the screen that may be used by any of the source modules 38.
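A small sketch of how the icon resource 32a might consume the enabled-image information 40 follows (an illustration only; the asset names and the per-view list representation are assumptions, not taken from the application):

```python
ENABLED_ICON = "hand_green_tick"    # cf. first icon 42: touch input enabled
DISABLED_ICON = "hand_red_cross"    # cf. second icon 44: touch input not enabled


def icons_for_views(enabled_view: int, view_count: int) -> list:
    """Return, for each view index, the overlay icon the display generator 32
    should draw, driven entirely by the stored enabled-image information 40."""
    return [
        ENABLED_ICON if view == enabled_view else DISABLED_ICON
        for view in range(view_count)
    ]


# Example: with image 18a (view 0) enabled, view 0 gets the 'enabled' icon and
# view 1 gets the 'disabled' icon, matching Figs. 2 and 3.
assert icons_for_views(0, 2) == [ENABLED_ICON, DISABLED_ICON]
```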
The driver circuitry 30 further comprises a touch-screen input processor 34 responsive to signals from the touch-sensitive input device. In the present embodiment, the processor 34 is configured with one or both of: (a) a selection resource, to control which image 18 is designated as enabled for touch-screen input, and to arbitrate between requests for changing the image designated as enabled; and
(b) a routing resource, to route received touch-screen input information to the source module 38 that corresponds to the image 18 designated to be enabled for touch-screen input.
Fig. 4 illustrates processing steps by the processor 34 for processing touchscreen inputs received from the touch-sensitive input device 26, and implementing the above resources. At step 50, the touch-position is analysed to determine whether the touch corresponds to the position of icon 42 or 44 in any of the images. This is most straightforward if the icons 42 and 44 are displayed at the same position within each image 18. If the touch position does not correspond to the icon position, processing proceeds to step 52 at which the identity of the enabled image 18 is determined from the stored information 40. At step 54, the received touch-sensitive input information is routed to the respective source module 38 that corresponds to the enabled image. In the situation depicted in Figs. 2 and 3, the active image is 18a, and the touch-sensitive input information would be routed to the navigation module 38a.
If at step 50 it is determined that the touch position does correspond to the position of one or more icons 42, 44, this is interpreted as a request to change which of the images 18 is designated as enabled for touch-screen input. At step 56, the identity of the enabled image 18 is determined from the stored information 40. At step 58, the identity is changed, preferably according to a predetermined sequence. If there are only two images 18a, 18b in the multi-view display screen 10, the sequence is to toggle the designated enabled image between the two available images, one after the other. If there are three or more images 18 in the multi-view display screen 10, an example sequence is to increment or decrement an index number representing the images, so that the designated image enabled for touch-sensitive input changes from one image to the next in turn. Once the index number reaches a limit value, the next increment or decrement returns the index value to an initial value, thereby restarting the sequence. At step 58, new information 40 is stored, thereby causing the display generator 32 to update the display of icons 42, 44 in the different display images 18.
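For completeness, the Fig. 4 flow can also be expressed as pseudocode. The Python sketch below is illustrative only (the function and parameter names are hypothetical) and corresponds to steps 50 to 58: a touch at the icon position advances the enabled-image designation through the sequence, while any other touch is routed to the source module of the currently enabled image.

    # Hypothetical sketch of the Fig. 4 processing steps performed by processor 34.
    def handle_touch(touch_pos, icon_hit_test, image_ids, state, source_modules):
        # state is a one-entry dict standing in for the stored information 40,
        # e.g. {"enabled": "18a"}; source_modules maps image identifier to module.
        if icon_hit_test(touch_pos):                       # step 50: touch at icon 42/44 position?
            current = state["enabled"]                     # step 56: read stored information 40
            index = image_ids.index(current)
            state["enabled"] = image_ids[(index + 1) % len(image_ids)]  # step 58: next image in turn
            # Storing the new value prompts the display generator to update icons 42/44.
        else:
            enabled = state["enabled"]                     # step 52: identify enabled image
            source_modules[enabled].on_touch(touch_pos)    # step 54: route input to its source module

With only two images the modulo increment reduces to the toggling described above; with three or more images it walks through the images in turn and wraps back to the first once the end of the sequence is reached.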
The above organisation of touch-screen input routing, and designation of which image is enabled for touch-screen inputs, is autonomous and independent of the different source modules 38 supplying the individual image signals 36. The icons 42 and 44 are overlaid automatically, and the display controller 30 controls, and updates, which of the images is designated as enabled for touch-screen input. The controller 30 also controls routing of touch-screen input to the appropriate source module 38. The display controller 30 implements a resource for designating which image is enabled for touch-sensitive input, an icon resource for generating one or more graphical display icons indicating the image enabled for touch-screen input, a resource for changing the designation of which image is enabled for touch-screen input, and a resource for routing touch-screen input information to a source module associated with the image designated as enabled for touch-screen input.
In an alternative form, not shown, the source modules 38 may themselves communicate with each other to incorporate equivalent display icons in the individual image signals 36, and implement the above resources for managing touch-screen input. At least a part of this functionality/these resources may also be implemented or managed by an operating system that defines an execution environment for the source modules, if software-based.
However implemented, the functionality of the above embodiment enables touch-screen input for a multi-view display to be organised efficiently, and in a manner that is clear and intuitive for plural viewers, without requiring substantial detection hardware to detect which viewer has pressed the screen.
For the sake of completeness, there now follows a description of a navigation system that may be used as part, or all, of the navigation module 38a if implemented. Reference is also made to the navigation display described in the aforementioned PCT/EP2008/063391, the entire content of which is incorporated herein by reference.
Fig. 5 illustrates an example view of the Global Positioning System (GPS), usable by navigation devices. Such systems are known and are used for a variety of purposes. In general, GPS is a satellite-radio based navigation system capable of determining continuous position, velocity, time, and in some instances direction information for an unlimited number of users. Formerly known as NAVSTAR, the GPS incorporates a plurality of satellites which orbit the earth in extremely precise orbits. Based on these precise orbits, GPS satellites can relay their location to any number of receiving units.
The GPS system is implemented when a device, specially equipped to receive GPS data, begins scanning radio frequencies for GPS satellite signals. Upon receiving a radio signal from a GPS satellite, the device determines the precise location of that satellite via one of a plurality of different conventional methods. The device will continue scanning, in most instances, for signals until it has acquired at least three different satellite signals (noting that position is not normally determined with only two signals, but can be, using other triangulation techniques). Implementing geometric triangulation, the receiver utilizes the three known positions to determine its own two-dimensional position relative to the satellites. This can be done in a known manner. Additionally, acquiring a fourth satellite signal will allow the receiving device to calculate its three-dimensional position by the same geometrical calculation in a known manner. The position and velocity data can be updated in real time on a continuous basis by an unlimited number of users.
As shown in Figure 5, the GPS system is denoted generally by reference numeral 100. A plurality of satellites 120 are in orbit about the earth 124. The orbit of each satellite 120 is not necessarily synchronous with the orbits of other satellites 120 and, in fact, is likely asynchronous. A GPS receiver 140 is shown receiving spread spectrum GPS satellite signals 160 from the various satellites 120.
The spread spectrum signals 160, continuously transmitted from each satellite 120, utilize a highly accurate frequency standard accomplished with an extremely accurate atomic clock. Each satellite 120, as part of its data signal transmission 160, transmits a data stream indicative of that particular satellite 120. It is appreciated by those skilled in the relevant art that the GPS receiver device 140 generally acquires spread spectrum GPS satellite signals 160 from at least three satellites 120 for the GPS receiver device 140 to calculate its two-dimensional position by triangulation. Acquisition of an additional signal, resulting in signals 160 from a total of four satellites 120, permits the GPS receiver device 140 to calculate its three-dimensional position in a known manner.
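As general background only (this is the standard GNSS pseudorange formulation and is not set out in the present application), the position calculation referred to above amounts to solving, for the receiver coordinates and clock offset, a system of the form

    \rho_i = \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} + c\,\Delta t, \qquad i = 1, \ldots, N

where (x_i, y_i, z_i) is the position relayed by satellite i in its signal 160, \rho_i the measured pseudorange, (x, y, z) the receiver position, c the speed of light and \Delta t the receiver clock offset. Consistent with the description above, signals from three satellites suffice for the two-dimensional fix, and a fourth signal resolves the three-dimensional position.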
Figure 6 is an illustrative representation of electronic components of a navigation device 200 usable as the navigation module 38a according to a preferred embodiment of the present invention, in block component format. It should be noted that the block diagram of the navigation device 200 is not inclusive of all components of the navigation device, but is only representative of many example components.
The navigation device 200 includes a processor 210 connected to receive inputs from an input device 220 (e.g. from the touch-screen input device 26 via the display controller 30) and a display signal or image generator 240 for the screen 10. The navigation device may include an output device 260, for example an audible output device (e.g. a loudspeaker). As the output device 260 can produce audible information for a user of the navigation device 200, it should equally be understood that the input device 220 can include a microphone and software for receiving input voice commands as well.
In the navigation device 200, processor 210 is operably coupled to a memory resource 230 via connection 235 and is further adapted to receive/send information from/to input/output (I/O) ports 270 via connection 275, wherein the I/O port 270 is connectible to an I/O device 280 external to the navigation device 200. The memory resource 230 comprises, for example, a volatile memory, such as a Random Access Memory (RAM), and a non-volatile memory, for example a digital memory, such as a flash memory. The external I/O device 280 may include, but is not limited to, an external listening device such as an earpiece, for example. The connection to the I/O device 280 can further be a wired or wireless connection to any other external device, such as a car stereo unit for hands-free operation and/or for voice-activated operation, for connection to an earpiece or headphones, and/or for connection to a mobile phone, for example. The mobile phone connection may be used to establish a data connection between the navigation device 200 and the internet or any other network, and/or to establish a connection to a server via the internet or some other network, for example.
Fig. 6 further illustrates an operative connection between the processor 210 and an antenna/receiver 250 via connection 255, wherein the antenna/receiver 250 can be a GPS antenna/receiver for example. It will be understood that the antenna and receiver designated by reference numeral 250 are combined schematically for illustration, but that the antenna and receiver may be separately located components, and that the antenna may be a GPS patch antenna or helical antenna for example.
Further, it will be understood by one of ordinary skill in the art that the electronic components shown in Fig. 6 are powered by power sources (not shown) in a conventional manner. As will be understood by one of ordinary skill in the art, different configurations of the components shown in Fig. 6 are considered to be within the scope of the present application. For example, the components shown in Fig. 6 may be in communication with one another via wired and/or wireless connections and the like. Thus, the scope of the navigation device 200 of the present application includes a portable or handheld navigation device 200.
Referring now to Fig. 7, the navigation device 200 may establish a "mobile" or telecommunications network connection with a server 302 via a mobile device (not shown) (such as a mobile phone, PDA, and/or any device with mobile phone technology) establishing a digital connection (such as a digital connection via known Bluetooth technology for example). Thereafter, through its network service provider, the mobile device can establish a network connection (through the internet for example) with a server 302. As such, a "mobile" network connection is established between the navigation device 200 (which can be, and oftentimes is, mobile as it travels alone and/or in a vehicle) and the server 302 to provide a "real-time" or at least very "up to date" gateway for information.
The establishing of the network connection between the mobile device (via a service provider) and another device such as the server 302, using an internet (such as the World Wide Web) for example, can be done in a known manner. This can include use of TCP/IP layered protocol for example. The mobile device can utilize any number of communication standards such as CDMA, GSM, WAN, etc.
As such, an internet connection may be utilised, achieved via a data connection through a mobile phone or mobile phone technology within the navigation device 200, for example. For this connection, an internet connection between the server 302 and the navigation device 200 is established. This can be done, for example, through a mobile phone or other mobile device and a GPRS (General Packet Radio Service) connection (a GPRS connection is a high-speed data connection for mobile devices provided by telecom operators; GPRS is a method to connect to the internet).
The navigation device 200 can further complete a data connection with the mobile device, and eventually with the internet and server 302, via existing Bluetooth technology for example, in a known manner, wherein the data protocol can utilize any number of standards, such as the GSRM, the Data Protocol Standard for the GSM standard, for example.
The navigation device 200 may include its own mobile phone technology within the navigation device 200 itself (including an antenna for example, or optionally using the internal antenna of the navigation device 200). The mobile phone technology within the navigation device 200 can include internal components as specified above, and/or can include an insertable card (e.g. Subscriber Identity Module or SIM card), complete with necessary mobile phone technology and/or an antenna for example. As such, mobile phone technology within the navigation device 200 can similarly establish a network connection between the navigation device 200 and the server 302, via the internet for example, in a manner similar to that of any mobile device.
For GPRS phone settings, a Bluetooth-enabled navigation device may be used; to work correctly with the ever-changing spectrum of mobile phone models, manufacturers, etc., model/manufacturer-specific settings may be stored on the navigation device 200, for example. The data stored for this information can be updated.
In Fig. 7 the navigation device 200 is depicted as being in communication with the server 302 via a generic communications channel 318 that can be implemented by any of a number of different arrangements. The server 302 and a navigation device 200 can communicate when a connection via communications channel 318 is established between the server 302 and the navigation device 200 (noting that such a connection can be a data connection via mobile device, a direct connection via personal computer via the internet, etc.).
The server 302 includes, in addition to other components which may not be illustrated, a processor 304 operatively connected to a memory 306 and further operatively connected, via a wired or wireless connection 314, to a mass data storage device 312. The processor 304 is further operatively connected to transmitter 308 and receiver 310, to transmit and receive information to and from the navigation device 200 via communications channel 318. The signals sent and received may include data, communication, and/or other propagated signals. The transmitter 308 and receiver 310 may be selected or designed according to the communications requirements and communication technology used in the communication design for the navigation device 200. Further, it should be noted that the functions of transmitter 308 and receiver 310 may be combined into a single transceiver.
Server 302 is further connected to (or includes) a mass storage device 312, noting that the mass storage device 312 may be coupled to the server 302 via communication link 314. The mass storage device 312 contains a store of navigation data and map information, and can again be a separate device from the server 302 or can be incorporated into the server 302.
The navigation device 200 is adapted to communicate with the server 302 through communications channel 318, and includes processor, memory, etc. as previously described with regard to Fig. 6, as well as transmitter 320 and receiver 322 to send and receive signals and/or data through the communications channel 318, noting that these devices can further be used to communicate with devices other than server 302. Further, the transmitter 320 and receiver 322 are selected or designed according to communication requirements and communication technology used in the communication design for the navigation device 200 and the functions of the transmitter 320 and receiver 322 may be combined into a single transceiver.
Software stored in server memory 306 provides instructions for the processor 304 and allows the server 302 to provide services to the navigation device 200. One service provided by the server 302 involves processing requests from the navigation device 200 and transmitting navigation data from the mass data storage 312 to the navigation device 200. Another service provided by the server 302 includes processing the navigation data using various algorithms for a desired application and sending the results of these calculations to the navigation device 200.
The communication channel 318 generically represents the propagating medium or path that connects the navigation device 200 and the server 302. Both the server 302 and navigation device 200 include a transmitter for transmitting data through the communication channel and a receiver for receiving data that has been transmitted through the communication channel.
The communication channel 318 is not limited to a particular communication technology. Additionally, the communication channel 318 is not limited to a single communication technology; that is, the channel 318 may include several communication links that use a variety of technology. For example, the communication channel 318 can be adapted to provide a path for electrical, optical, and/or electromagnetic communications, etc. As such, the communication channel 318 includes, but is not limited to, one or a combination of the following: electric circuits, electrical conductors such as wires and coaxial cables, fibre optic cables, converters, radio-frequency (RF) waves, the atmosphere, empty space, etc. Furthermore, the communication channel 318 can include intermediate devices such as routers, repeaters, buffers, transmitters, and receivers, for example.
In one illustrative arrangement, the communication channel 318 includes telephone and computer networks. Furthermore, the communication channel 318 may be capable of accommodating wireless communication such as radio frequency, microwave frequency, infrared communication, etc. Additionally, the communication channel 318 can accommodate satellite communication.
The communication signals transmitted through the communication channel 318 include, but are not limited to, signals as may be required or desired for given communication technology. For example, the signals may be adapted to be used in cellular communication technology such as Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), Global System for Mobile Communications (GSM), etc. Both digital and analogue signals can be transmitted through the communication channel 318. These signals may be modulated, encrypted and/or compressed signals as may be desirable for the communication technology.
The server 302 includes a remote server accessible by the navigation device 200 via a wireless channel. The server 302 may include a network server located on a local area network (LAN), wide area network (WAN), virtual private network (VPN), etc.
The server 302 may include a personal computer such as a desktop or laptop computer, and the communication channel 318 may be a cable connected between the personal computer and the navigation device 200. Alternatively, a personal computer may be connected between the navigation device 200 and the server 302 to establish an internet connection between the server 302 and the navigation device 200. Alternatively, a mobile telephone or other handheld device may establish a wireless connection to the internet, for connecting the navigation device 200 to the server 302 via the internet.
The navigation device 200 may be provided with information from the server 302 via information downloads, which may be updated periodically and automatically or upon a user connecting the navigation device 200 to the server 302, and/or may be more dynamic upon a more constant or frequent connection being made between the server 302 and the navigation device 200 via a wireless mobile connection device and a TCP/IP connection, for example. For many dynamic calculations, the processor 304 in the server 302 may be used to handle the bulk of the processing needs; however, the processor 210 of the navigation device 200 can also handle much processing and calculation, oftentimes independently of a connection to the server 302.
The multi-view display screen 10 and the navigation device 200 (module 38a) may be installed permanently within a vehicle to be used only with that vehicle. Alternatively, one or both of the screen 10 and the navigation device 200 may be a portable unit. In one form, the screen 10 and the navigation device are installed together in a portable housing (not shown).
It will be appreciated that whilst various aspects and embodiments of the present invention have heretofore been described, the scope of the present invention is not limited to the particular arrangements set out herein and instead extends to encompass all arrangements, and modifications and alterations thereto, which fall within the scope of the appended claims.
For example, whilst embodiments described in the foregoing detailed description refer to GPS, it should be noted that the navigation device may utilise any kind of position sensing technology as an alternative to (or indeed in addition to) GPS. For example, the navigation device may utilise other global navigation satellite systems, such as the European Galileo system. Equally, it is not limited to satellite-based systems, but could readily function using ground-based beacons or any other kind of system that enables the device to determine its geographic location.
It will also be well understood by persons of ordinary skill in the art that whilst the preferred embodiment implements certain functionality by means of software, that functionality could equally be implemented solely in hardware (for example by means of one or more ASICs (application specific integrated circuits)) or indeed by a mix of hardware and software. As such, the scope of the present invention should not be interpreted as being limited only to being implemented in software. The invention nevertheless extends to protect software executable by a processor, and an information carrier carrying or containing such software.
Lastly, it should also be noted that whilst the accompanying claims set out particular combinations of features described herein, the scope of the present invention is not limited to the particular combinations hereafter claimed, but instead extends to encompass any combination of features or embodiments herein disclosed irrespective of whether or not that particular combination has been specifically enumerated in the accompanying claims at this time.

Claims

1. Apparatus comprising:
a multi-view display screen (10) operable to display overlapping plural images (18) viewable on the same display screen, a viewable image being determined according to viewing position;
a touch-screen input device (26) for detecting user touches of the screen (10) representing user input;
a display controller (30-34) for designating a selectable one of the plural images as enabled for touch-screen input, and for generating at least one graphical display icon for indicating the image enabled for touch-screen input.
2. The apparatus of claim 1, wherein the display controller comprises a resource configured to generate a first icon in the respective displayed image that is enabled for touch-screen input, the first icon being indicative that the image is enabled for touch-screen input.
3. The apparatus of claim 1, wherein the display controller comprises a resource configured to generate a second icon in one or more respective displayed images other than the image enabled for touch-screen input, the second icon indicating that said other images are not enabled for touch-screen input.
4. The apparatus of claim 1, wherein the display controller comprises a selection resource for changing the designation of which of the plural viewable images is designated as enabled for touch-screen input.
5. The apparatus of claim 4, wherein the selection resource is responsive, via the touch-screen input device, to detection of touch input on the screen at a position corresponding to a displayed icon.
6. The apparatus of claim 4, wherein the selection resource is responsive to toggle the designation of the image enabled for touch-screen input between first and second images.
7. The apparatus of claim 1, wherein the display controller comprises a resource configured to selectively route received touch-screen input information to a respective source module associated with the image that is enabled for touch-screen input.
8. A vehicle comprising installed display apparatus, the display apparatus comprising:
a multi-view display screen (10) operable to display overlapping plural images (18) viewable on the same display screen, a viewable image being determined according to viewing position;
a touch-screen input device (26) for detecting user touches of the screen (10) representing user input;
a display controller (30-34) for designating a selectable one of the plural images as enabled for touch-screen input, and for generating at least one graphical display icon for indicating the image enabled for touch-screen input.
9. The vehicle according to claim 8, further comprising a plurality of source modules (38) for providing image source signals (36) to the display apparatus, the source modules including one or more selected from: a navigation device (38a); a media player and/or games module (38b); a vehicle information module (38c); an internet browser module (38d); a mobile telephone communications module (38e).
10. A display controller (30) for a multi-view display screen (10) with a touch-screen input device (26), the display controller (30) comprising:
a resource (34, 40) for designating a selectable one of plural viewable images (18) as being enabled for touch-screen input; and
an icon resource (32) for generating at least one graphical display icon for indicating on the screen (10) the image enabled for touch-screen input.
11. The display controller of claim 10, wherein the icon resource (32) comprises at least one of:
a resource for generating a first icon in the respective displayed image that is enabled for touch-screen input, the first icon being indicative that the image is enabled for touch-screen input; and
a resource for generating a second icon in one or more respective displayed images other than the image enabled for touch-screen input, the second icon indicating that said other images are not enabled for touch-screen input.
12. The display controller of claim 10, further comprising a resource responsive to detected touch-input at a position corresponding to a displayed icon, to change which of the images is designated as being enabled for touch-screen input.
13. The display controller of claim 12, further comprising a resource configured to selectively route received touch-screen input information to a respective source module associated with the image that is enabled for touch-screen input.
14. A method for controlling a multi-view display screen (10) having a touch-screen input device (26), the multi-view display screen being operable to display overlapping plural images (18) viewable on the same display screen, a viewable image being determined according to viewing position, and the touch-screen input device (26) detecting user touches of the screen (10) representing user input; the method comprising:
operating at least one controller resource (34, 40) to designate a selectable one of plural viewable images (18) as being enabled for touch-screen input; and
operating at least one icon resource (32) to generate at least one graphical display icon for indicating on the screen (10) the image enabled for touch-screen input.
15. The method of claim 14, wherein the step of operating the at least one icon resource comprises at least one of:
generating a first icon in the respective displayed image that is enabled for touch-screen input, the first icon being indicative that the image is enabled for touch-screen input; and
generating a second icon in one or more respective displayed images other than the image enabled for touch-screen input, the second icon indicating that said other images are not enabled for touch-screen input.
16. The method of claim 14, further comprising a step of detecting touch-input at a position corresponding to a displayed icon, and responsive to said detection, changing which of the images is designated as being enabled for touch-screen input.
17. The method of claim 14, further comprising a step of selectively routing received touch-screen input information to a respective source module associated with the image that is enabled for touch-screen input.
18. A computer program which, when executed by a processor, implements a method for controlling a multi-view display screen (10) having a touch-screen input device (26), the multi-view display screen being operable to display overlapping plural images (18) viewable on the same display screen, a viewable image being determined according to viewing position, and the touch-screen input device (26) detecting user touches of the screen (10) representing user input; the method comprising:
operating at least one controller resource (34, 40) to designate a selectable one of plural viewable images (18) as being enabled for touch-screen input; and
operating at least one icon resource (32) to generate at least one graphical display icon for indicating on the screen (10) the image enabled for touch-screen input.
19. A machine-readable information carrier carrying or embodying a computer program according to claim 18.
PCT/EP2009/058833 2009-07-10 2009-07-10 Touchscreen input on a multi-view display screen WO2011003467A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2009/058833 WO2011003467A1 (en) 2009-07-10 2009-07-10 Touchscreen input on a multi-view display screen

Publications (1)

Publication Number Publication Date
WO2011003467A1 true WO2011003467A1 (en) 2011-01-13

Family

ID=41796205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/058833 WO2011003467A1 (en) 2009-07-10 2009-07-10 Touchscreen input on a multi-view display screen

Country Status (1)

Country Link
WO (1) WO2011003467A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796396A (en) * 1995-03-31 1998-08-18 Mitsubishi Electric Information Technology Center America, Inc. Multiple user/agent window control
US20050079896A1 (en) * 2003-10-14 2005-04-14 Nokia Corporation Method and apparatus for locking a mobile telephone touch screen
EP1857917A2 (en) * 2006-05-15 2007-11-21 Delphi Technologies, Inc. Multiple-view display system having user manipulation control and method
US20070297064A1 (en) * 2004-10-27 2007-12-27 Fujitsu Ten Limited Display Device
EP1988448A1 (en) * 2006-02-23 2008-11-05 Pioneer Corporation Operation input device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2776998A1 (en) * 2011-11-10 2014-09-17 Gelliner Limited Payment system and method
US20150213529A1 (en) 2011-11-10 2015-07-30 Gelliner Limited Online Purchase Processing System and Method
US10346821B2 (en) 2011-11-10 2019-07-09 Gelliner Limited Online purchase processing system and method
US10475016B2 (en) 2011-11-10 2019-11-12 Gelliner Limited Bill payment system and method
US10528935B2 (en) 2011-11-10 2020-01-07 Gelliner Limited Payment system and method
CN103970397A (en) * 2013-01-30 2014-08-06 腾讯科技(深圳)有限公司 Rotary screen interface display method and rotary screen interface display device
CN112848892A (en) * 2021-01-29 2021-05-28 东风汽车有限公司 Touch panel for vehicle, control method and control electronic equipment

Similar Documents

Publication Publication Date Title
US10310662B2 (en) Rendering across terminals
US8717198B2 (en) Communication connecting apparatus and method for detecting mobile units in a vehicle
US9909892B2 (en) Terminal and method for controlling the same
KR101502013B1 (en) Mobile terminal and method for providing location based service thereof
CN110618800A (en) Interface display method, device, equipment and storage medium
AU2007343406A1 (en) A navigation device and method for enhanced map display
KR20080098517A (en) Navigation device and method comprising a touch sensitive screen for switching menu options
KR101738001B1 (en) Mobile Terminal And Method Of Controlling The Same
WO2011054549A1 (en) Electronic device having a proximity based touch screen
CN106034173B (en) Mobile terminal and control method thereof
WO2009132676A1 (en) A navigation device and method for emphasising a map route
WO2022213733A1 (en) Method and apparatus for acquiring flight route, and computer device and readable storage medium
KR20100050958A (en) Navigation device and method for providing information using the same
WO2011003467A1 (en) Touchscreen input on a multi-view display screen
CN101726301B (en) Destination calibration method and device combined with electronic map
WO2013037852A2 (en) Navigation method and apparatus for selecting a destination
CN111754564B (en) Video display method, device, equipment and storage medium
KR101440518B1 (en) mobile communication device and method for controlling thereof
TW201017103A (en) Navigation apparatus having improved display
TWI482482B (en) Instant communication method and electronic apparatus combining electronic map and computer program product using the method
KR101617399B1 (en) Terminal, system and method providing map display service
TW201104667A (en) Touchscreen input on a multi-view display screen
AU2008365704A1 (en) Navigation device and method for determining a route of travel
KR20170064092A (en) Contents displaying method wjithin navigation map through mirroring of smart devices
CN111795697A (en) Equipment positioning method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09780441

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09780441

Country of ref document: EP

Kind code of ref document: A1