
HK1182816B - Optical tablet stylus and indoor navigation system - Google Patents


Info

Publication number
HK1182816B
HK1182816B (Application HK13110180.4A)
Authority
HK
Hong Kong
Prior art keywords
computing device
touch
mobile computing
positioning
discernable
Prior art date
Application number
HK13110180.4A
Other languages
Chinese (zh)
Other versions
HK1182816A (en)
Inventor
Charles Thacker
Andreas Nowatzyk
Original Assignee
Microsoft Technology Licensing, LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1182816A
Publication of HK1182816B

Description

Optical tablet stylus and indoor navigation system
Technical Field
The present invention relates to a three-dimensional touch-type input system for a mobile computing device and an optical navigation method.
Background
Many small and large mobile computing devices use a touch screen interface as the primary means of user input rather than a traditional keyboard. However, typical touch screen interfaces alone often lack the precision necessary to capture complex drawings and/or writing, such as cursive handwriting, annotations, sketches, or other complex or non-standard graphical input. To enable richer, higher-fidelity user input, a stylus (or other writing-type device) may be used to improve the accuracy of the touch screen interface of such mobile computing devices. A stylus can be readily used with a resistive touch screen interface, an electrostatic (or capacitive) touch screen interface, or an electromagnetic touch screen interface.
However, current touch screen interfaces, used alone or in combination with a stylus, lack the ability to enable three-dimensional (3D) user input. Therefore, conventional touch screens cannot support 3D interaction for the following purposes: playing 3D video games, utilizing 3D Computer Aided Design (CAD) programs, supplementing other 3D user interfaces, or manipulating volumetric images such as medical images derived from MRI and CAT scans, among many other potential uses. Furthermore, existing two-dimensional (2D) touch screen user interfaces employ technologies that cannot support broader uses for providing complementary mobile device applications such as position determination and indoor navigation.
Disclosure of Invention
Various embodiments disclosed herein relate to optical user input techniques including one or more 3D positioning sensors and one or more 3D position emitters to enable high-precision user input in 3D space. For several embodiments, the 3D positioning sensors may be oriented with respect to a particular input/output panel, such as a 2D or 3D slate-type display device ("slate") of a mobile computing device. These embodiments may also include the use of one or more 3D position emitters, which for some embodiments may be disposed on a stylus or other writing or pointing device. Such an embodiment is thus able to determine both the position and orientation of a 3D position emitter within 3D space using selectively distinguishable (e.g., relatively unique) signals sent from the 3D position emitter to the 3D positioning sensors, these signals serving to distinguish one 3D position emitter from another. Furthermore, the digital signal may also be used to send additional data from the 3D position emitter to the 3D positioning sensor, such additional data including the known position of the 3D position emitter.
Several embodiments disclosed herein also relate to user input devices using one or more 3D position emitters in combination with one or more 3D positioning sensors. Some such embodiments also relate to user input devices that can be added to a display device at some point after the initial manufacture of the display device. Furthermore, various embodiments disclosed herein relate to indoor navigation and other complementary applications that may use interaction between one or more 3D positioning sensors, one or more 3D position emitters and/or other complementary device components, and other mobile computing devices with similar capabilities.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Drawings
For purposes of understanding and illustrating the disclosure and its various embodiments, exemplary features and embodiments are disclosed in the drawings; they are better understood when read in conjunction with the accompanying description. It is to be understood, however, that the disclosure is not limited to the specific methods, precise arrangements, and instrumentalities disclosed. Like reference symbols in the various drawings indicate like elements. In the drawings:
FIG. 1 is a diagram of an exemplary networked computer environment, in which many embodiments disclosed herein can be used;
FIG. 2 depicts an exemplary computing device using an optical 3D tracking system that represents several embodiments disclosed herein;
FIG. 3A is a perspective view hybrid block and electrical diagram illustrating the operation of an exemplary Position Sensitive Diode (PSD) that may be used in an optical 3D tracking system representative of several embodiments disclosed herein;
FIG. 3B is a side view hybrid block and electrical diagram of FIG. 3A;
FIG. 4A is a perspective view of an exemplary mobile computing device including an optical 3D tracking system representative of several embodiments described herein;
FIG. 4B is a side view of the exemplary mobile computing device illustrated in FIG. 4A;
FIG. 5 illustrates an exemplary use of a PSD sensor and stationary IR-LED beacons for indoor navigation representative of several embodiments disclosed herein;
FIG. 6 illustrates an exemplary use of PSD sensors and dynamic beacons for indoor navigation representative of several embodiments disclosed herein;
FIG. 7A is a process flow diagram illustrating an exemplary process for determining the position and orientation of a touch-type input device for use with the various embodiments disclosed herein;
FIG. 7B is a process flow diagram illustrating an exemplary process for mobile computing device beacon navigation (i.e., determining location using beacons) representative of various embodiments disclosed herein;
FIG. 7C is a process flow diagram illustrating an exemplary process for mobile computing device laser positioning (i.e., acquiring relative local navigation/position information) representative of various embodiments disclosed herein; and
FIG. 8 illustrates an exemplary computing environment.
Detailed Description
FIG. 1 is a diagram of an exemplary networked computing environment 100 in which many embodiments disclosed herein may be used. The networked computing environment 100 includes a plurality of computing devices interconnected by one or more networks 180. One or more networks 180 allow a particular computing device to connect to and communicate with another computing device. The depicted computing devices include mobile devices 110, 120, and 140, laptop computer 130, and application server 150.
In some implementations, the computing device may include a desktop personal computer, workstation, laptop, netbook, tablet computer, PDA, cellular telephone, smart phone, any WAP-enabled device, or any other computing device capable of interfacing directly or indirectly with the network 180, such as the computing device 800 illustrated in FIG. 8. In the case of a cellular phone, tablet computer, PDA, or other wireless device, each computing device may run an HTTP client (e.g., a web browsing program) or WAP-enabled browser, which allows the user of the computing device to access information available from the server 150 or to provide information to the server 150. The computing device may also use other applications to, for example, access the server 150 or provide information to the server 150. In some implementations, the server 150 may be implemented with one or more general purpose computing systems, such as the computing device 800 illustrated in FIG. 8. Although only one server 150 is shown, this is not meant to be limiting, and multiple servers may be implemented.
In some implementations, the computing devices may include other computing devices not shown. In some embodiments, there may be more or fewer computing devices than shown in FIG. 1. The one or more networks 180 may include secure networks such as enterprise private networks, unsecure networks such as wireless open networks, Local Area Networks (LANs), Wide Area Networks (WANs), and the internet. Each of the one or more networks 180 may include hubs, bridges, routers, switches, and wired transmission media such as a wired network or direct-wired connection.
A server, such as application server 150, may allow clients to download information (e.g., text, audio, image, and video files) from the server or perform search queries regarding particular information stored on the server. In general, a "server" may comprise a hardware device that serves as a host in a client-server relationship or a software process that shares resources with or performs work for one or more clients. Communication between computing devices in a client-server relationship may be initiated by a client sending a request to a server for access to a particular resource or to perform a particular job. The server may then perform the requested action and send a response back to the client.
The networked computing environment 100 may provide a cloud computing environment for one or more computing devices. Cloud computing refers to internet-based computing in which shared resources, software, and/or information are provided to one or more computing devices on demand via the internet (or other global network). The term "cloud" is used as a metaphor for the internet based on a cloud in a computer network diagram, where a cloud is used to depict the internet as an abstraction of the underlying infrastructure it represents.
One embodiment of the mobile device 140 includes a touch-based input system 148 (shown here as a stylus system), a display 149, a network interface 145, a processor 146, and a memory 147, all in communication with one another. The display 149 may display digital images and/or video. The display 149 may include a touch screen user interface. The touch-based input system 148 may determine the location and orientation of a touch-based input device, such as the stylus 141, with respect to the surface of the display 149. The network interface 145 allows the mobile device 140 to connect to one or more networks 180. The network interface 145 may include a wireless network interface, a modem, and/or a wired network interface. Processor 146 allows mobile device 140 to execute computer readable instructions stored in memory 147 to perform the processes discussed herein.
Touch-type input devices were originally developed for use as digitizers on non-display surfaces, but were later adapted for use with dedicated display devices to enable a user to interact directly with the output displayed on such display devices (rather than indirectly via a mouse, keyboard, or similar input device). The most common touch-type input devices are fingers, styluses, or writing instruments, and the term "stylus" as used herein is intended to include a wide variety of touch-type input devices without limitation. To achieve position-location resolution, accuracy, and speed, touch-type input systems (or "stylus systems" as used herein) typically include display devices having resistive, electrostatic, or electromagnetic touchscreens.
A resistive touch screen is a touch screen that detects a change in resistance occurring between a stylus and the touch screen, and thus senses the position at which the stylus is in contact with the surface of the touch screen. Resistive touch screen panels include a strong but flexible surface (e.g., an output display) laminated with a pair of thin flexible layers separated by a narrow gap. The two layers respectively include conductive horizontal and vertical lines that form intersections corresponding to precise locations directly below the surface. When a touch-type input device (e.g., a stylus) applies pressure to (i.e., depresses) a point on the surface of the touch screen, the two conductive layers make contact beneath the point of contact, thereby enabling the location of the touch to be determined.
An electrostatic (or "capacitive") touch screen is a touch screen that detects changes in capacitance that occur between a conductive touch-type input device and the touch screen. Electrostatic touch screens sense where a touch-type input device has been in contact with (or has become very close to) the surface of the touch screen. Typically, electrostatic touchscreens use capacitive coupling between conductive transparent traces (trace) present in front of the display to sense touch-type input devices.
Electromagnetic touchscreens are the most commonly used touchscreens today and are often implemented with a two-dimensional (2D) sensor array built into the back of the display panel. Using a grid of horizontal and vertical wire loops that make up the 2D sensor array, the positioning of a touch-type input device, particularly a dedicated stylus, is detected by electromagnetic coupling between sequentially activated loops in the sensor array and a resonant inductor/capacitor (LC) circuit located within the stylus. Furthermore, while the physical spacing between the wire loops (sometimes referred to as the "loop antenna array") prevents any single wire loop alone from providing sufficient resolution for determining position location, the wire loops collectively provide a stable reference measurement of the stylus, which can then be used to interpolate the actual position location to an accuracy of 0.1 mm in both the X and Y directions at an update rate of 133 samples per second.
However, one disadvantage of electromagnetic touchscreens is that the 2D sensor array (comprising the wire loops) must be built directly into the touchscreen, and therefore the 2D sensor array cannot be added to a standard display device at a later time. Standard displays are also difficult to convert to resistive or electrostatic touchscreens. Furthermore, none of these technologies supports full 3D input; they are largely limited to 2D input.
Rather, the various embodiments disclosed herein use a fundamentally different approach based on optical user input technology. More specifically, various embodiments disclosed herein are directed to optical 3D tracking systems that operate in conjunction with 2D or 3D display devices to receive 3D spatial input from a touch-type input device. For these various embodiments, the positioning of a touch-type input device (e.g., a stylus) is detected with a 3D positioning sensor system that covers a large volume of space in front of the display device. For several such embodiments, the 3D positioning sensor system may comprise (a) two or more 2D positioning sensors operating together from known relative positions to derive the 3D position and orientation of the detected stylus, and (b) at least one 3D position emitter having one or more positioning emitters. For some such embodiments, the 2D positioning sensors may comprise Position Sensitive Diodes (PSDs) and the positioning emitters may comprise one or more infrared light emitting diodes (IR-LEDs).
For several embodiments, the optical user input technology includes one or more 3D positioning sensors and one or more 3D position emitters. Similarly, a touch-based input device (such as a stylus) for various such embodiments may include a 3D position emitter embedded in the stylus, the 3D position emitter having three or more IR-LEDs to allow both position and orientation determination.
In some embodiments, the optical 3D tracking system may also capture (in addition to capturing the position of the stylus) the pose (or orientation) of the stylus to provide at least one supplemental user input based on a particular orientation (including changes in orientation), that is, to enable richer interaction (i.e., not just "touch") with the display device when operating in a 3D mode and even when operating in a 2D mode. For example, when used in a graphic-based writing or drawing application, rotating the stylus may provide control of the brush orientation in the graphic application, or may provide direct control of the line width to mimic the edge-thickening (edge-thickening) characteristic of a calligraphy pen.
FIG. 2 depicts an example computing device 200, such as the mobile computing device 140 of FIG. 1, using an optical 3D tracking system that represents several embodiments disclosed herein. The computing device 200 may be mobile or non-mobile, and the techniques described herein may be used for both mobile and non-mobile computing devices. Computing device 200 includes a display device 210 and physical control buttons 222. For some embodiments, the display device 210 may include a touch screen, however in alternative embodiments, the display device may be replaced with a non-display surface. Display device 210 includes a status region 212 that provides information regarding signal strength, time, and battery life associated with computing device 200.
A touch-type input device, such as stylus 250, may be used to provide input information to computing device 200 by directly touching display device 210 with stylus tip 252 or by positioning stylus 250 above the surface of display device 210. The example computing device 200 may also include two or more 2D positioning sensors 260 and 262. For some embodiments, each 2D positioning sensor 260 and 262 may further include a Position Sensitive Diode (PSD) positioned in the focal plane of a wide-angle lens to cover a 3D space over (or on) the display device 210, for example a 120-degree conical volume, such that the coverage volume of each PSD substantially overlaps the 3D space over the display device. Similarly, stylus 250 for various such embodiments may include one or more IR-LEDs 272 that together make up positioning emitter 270.
The PSDs are capable of sensing the angle of incidence of light emitted from IR-LEDs, such as those mounted on a stylus in a fixed and predetermined arrangement, or those used as stationary or dynamic reference beacons (discussed later herein). For a stylus whose IR-LED arrangement is known to the computing device, the relative positioning of each IR-LED may be determined via triangulation when the light of that IR-LED is observed by at least two PSD sensors (a sketch of this triangulation step is given below). Furthermore, since the relative position of any pair of IR-LEDs on the stylus is known to the computing device, measuring the positioning of any three or more IR-LEDs on the stylus body enables calculation of both the positioning and orientation of the stylus itself, including the stylus tip, which defines the particular point in space intended as the input. Since some IR-LED emitters may be obscured at any given time (e.g., by a user's finger holding the stylus), various stylus embodiments may use more than three IR-LEDs to increase the likelihood that at least three of the stylus's IR-LEDs are unobscured at any given time and can be observed by the PSD sensors.
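The triangulation step can be illustrated with a short numerical sketch. The following Python snippet is illustrative only; the sensor positions, measured angles, and helper names (direction_vector, triangulate) are assumptions rather than details from the patent. It estimates a single IR-LED's 3D position as the point of closest approach of the two rays defined by the 2D directions reported by two PSD sensors.

```python
import numpy as np

def direction_vector(azimuth_deg, elevation_deg):
    """Convert the two angles a PSD reports (horizontal, vertical) into a unit ray direction."""
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([np.cos(el) * np.sin(az), np.sin(el), np.cos(el) * np.cos(az)])

def triangulate(p1, d1, p2, d2):
    """Return the midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b            # approaches 0 when the rays are (nearly) parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))

# Two PSD sensors near opposite corners of the slate (assumed positions, metres).
sensor_a, sensor_b = np.array([0.0, 0.0, 0.0]), np.array([0.25, 0.0, 0.0])
# Angles each sensor reports for one IR-LED (assumed measurements, degrees).
led_position = triangulate(sensor_a, direction_vector(20.0, 35.0),
                           sensor_b, direction_vector(-15.0, 33.0))
print(led_position)   # estimated 3D position of the IR-LED in the slate's frame
```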
The PSD is an analog device with very high resolution, limited only by the accuracy with which its photocurrents can be measured. As such, PSD sensors have demonstrated resolution of approximately one part per million. Furthermore, the PSD does not require perfect focusing of the light being measured; it only requires that the light from the emitter fall completely within the boundaries of the sensing surface. The PSD is also fast, so position acquisition rates of over 10,000 samples per second are possible. Furthermore, the PSD is a rather simple device that can be cost-effectively manufactured in large quantities. For the various embodiments described herein, the PSD sensor may be positioned in the focal plane of a wide-angle lens to cover a 3D space, for example a 120-degree conical volume, for detecting IR-LED signals.
FIG. 3A is a perspective view hybrid block and electrical diagram illustrating the operation of an exemplary PSD 300 that may be used in an optical 3D tracking system representative of several embodiments disclosed herein, and FIG. 3B is a side view hybrid block and electrical diagram of FIG. 3A. In the figures, a PSD (shown here as a linear PSD, such as those developed by OSI Optoelectronics AS in Norway) includes a silicon photodetector PIN diode ("photodiode") 310, which features a heavily doped P-type semiconductor "cathode" 312 (labeled "P" for positive), a broad, lightly doped, near-intrinsic semiconductor ("intrinsic region") 314 (labeled "I" for intrinsic), and a heavily doped N-type semiconductor "anode" 316 (labeled "N" for negative). The cathode 312 and anode 316 have two contacts each, 312a, 312b and 316a, 316b respectively, covering opposite sides of the photodiode 310 as shown. Focusing the light 330 on the surface 310' of the photodiode 310 causes a current 318 to flow from the cathode 312 to the anode 316. In this configuration, the resistance (and thus the share of the photocurrent) between the four contacts 312a, 312b, 316a, and 316b varies with the positioning of the light 330 and can be measured, for example, by the arrangement of ohmmeters 340. By measuring how the photocurrent is distributed across the arrangement of ohmmeters 340, the centroid of the light spot on the sensor surface can be determined, as illustrated in the sketch below.
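To make the centroid computation concrete, the following minimal sketch (the 1D position formula, device length, and current values are assumptions for illustration, not taken from the patent) recovers the spot position on a linear PSD from the photocurrents collected at its two end contacts; the position is proportional to the normalized current difference.

```python
def psd_spot_position(i_near, i_far, length_mm):
    """Estimate the centroid of a light spot on a 1D (linear) PSD.

    i_near, i_far: photocurrents measured at the two end contacts (any consistent unit).
    length_mm:     active length of the PSD.
    Returns the spot position in mm, measured from the centre of the device
    (positive toward the "far" contact).
    """
    total = i_near + i_far
    if total == 0:
        raise ValueError("no light detected")
    return 0.5 * length_mm * (i_far - i_near) / total

# Example: a 10 mm PSD where the far contact collects twice the current of the near one.
print(psd_spot_position(1.0, 2.0, 10.0))   # about 1.67 mm toward the far contact
```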
Some embodiments disclosed herein may employ means that enable the PSD sensor to sequentially acquire the location of each IR-LED emitter by causing one IR-LED to emit its light at a time. However, several other embodiments may use alternative means of sensing the positioning of multiple IR-LED emitters simultaneously, for example based on a direct sequence spread spectrum signal, where all IR-LEDs emit light simultaneously and each IR-LED utilizes a different, uniquely distinguishable code sequence that enables the digital signals of the IR-LEDs to be distinguished from one another by the PSD sensor (a device known to be extremely linear). This type of spread spectrum detection by the PSD sensor also provides the benefit of filtering out any ambient IR light that reaches the sensor (due to the lack of any uniquely discernable coding characteristics in ambient IR light). In addition, PSD sensors for various such embodiments may include filters to prevent non-IR light from reaching the photodiodes.
In view of the foregoing, certain embodiments disclosed herein relate to a 3D touch-type input system for a mobile computing device, the system including at least: (a) three IR position emitters fixedly coupled to the touch-type input device in a first known configuration for emitting three discernable IR signals; (b) two positioning sensors fixedly coupled to the mobile computing device in a second known configuration for sensing three discernable IR signals emitted from the three IR position emitters, wherein the sensing includes determining two 2D directions (six 2D directions in total, two for each IR position emitter) for each of the three IR position emitters relative to each of the two positioning sensors based on the three discernable IR signals; and (c) a processor for determining a location and orientation of the touch-type input device relative to the mobile computing device using the (e.g., six) 2D directions, the first known configuration, and the second known configuration.
FIG. 4A is a perspective view of an exemplary mobile computing device including an optical 3D tracking system 400 that represents several embodiments described herein. FIG. 4B is a side view of the exemplary mobile computing device illustrated in FIG. 4A. In the figures, a display device in the form of a slate 410 includes two PSD sensors 412 and 414 configured to receive spread spectrum signals 424 (illustrated with dashed lines in FIG. 4A) from a plurality of IR-LEDs 422 mounted on a pointing/writing device, here a stylus 420 (and its tip 420') for use with the slate 410.
As illustrated, PSD sensors 412 and 414 may be mounted proximate to two corners of the slate (e.g., the display device of a tablet computer). Further, as shown in FIG. 4B, the wide-angle lens of each PSD sensor, such as PSD sensor 412, may be positioned such that its field of view 430 (here shown as dashed lines forming an angle of 120 degrees) covers a 3D volume of space 432 in front of the surface of the slate 410.
For several such embodiments, the spread spectrum digital signals 424 emitted by the IR-LEDs 422 and received by the PSD sensors 412 and 414 may be processed by one or more Digital Signal Processors (DSPs) (not shown) that determine the relative angle of each such IR-LED 422 (i.e., each that is unobstructed and detected) within its field of view 430. This information is used to triangulate the position of each IR-LED and, based on the relative positions of at least three such IR-LEDs 422 at any given time, the position and orientation of stylus 420 and its tip 420' can be determined. In other words, the position and orientation of the stylus may be calculated from the sensed emitter locations and their known arrangement relative to the stylus geometry. The optical 3D tracking system 400 provides sufficient resolution and accuracy to accurately detect input from, for example, a stylus 420 (or other touch-type input device).
In fact, the two PSD sensors 412 and 414 provide the slate 410 with something comparable to stereoscopic vision: the slight differences in perspective when sensing the three IR-LEDs 422, measured as the horizontal and vertical angles of the signals 424, define the direction from each PSD sensor 412 and 414 to each IR-LED 422 (a total of six directions for this particular example). These directions enable geometric determination of the position and orientation of the IR-LEDs 422, and thus of the stylus, with respect to the defined plane of the display device (or slate), as sketched below.
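The pose-recovery step can be sketched as a least-squares rigid fit between the LEDs' known body-frame arrangement and their triangulated positions. The snippet below uses the Kabsch algorithm, which is one standard way to solve such a fit; the patent does not specify a fitting method, so this choice, the LED layout, the fabricated pose, and the tip offset are all illustrative assumptions.

```python
import numpy as np

def stylus_pose(body_points, measured_points):
    """Least-squares rigid fit (Kabsch): find R, t mapping LED positions given in the
    stylus body frame onto their triangulated positions in the slate frame.
    Requires at least three non-collinear points."""
    B, M = np.asarray(body_points), np.asarray(measured_points)
    cb, cm = B.mean(axis=0), M.mean(axis=0)
    H = (B - cb).T @ (M - cm)                      # cross-covariance of centred point sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = cm - R @ cb
    return R, t

# Assumed LED layout on the stylus (body frame, metres).
body_leds = np.array([[0.000, 0.000, 0.00],
                      [0.030, 0.000, 0.00],
                      [0.000, 0.030, 0.00],
                      [0.015, 0.015, 0.05]])

# Fabricate "triangulated" positions by applying a known pose, then recover that pose.
angle = np.radians(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.10, 0.05, 0.08])
slate_leds = body_leds @ R_true.T + t_true

R, t = stylus_pose(body_leds, slate_leds)
tip_body = np.array([0.0, 0.0, -0.12])             # assumed tip offset in the body frame
print(R @ tip_body + t)                            # tip position in the slate frame
```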
In some embodiments, a conventional PSD-based position sensor may be used to sequentially acquire the direction to each light emitter, such that each IR-LED is illuminated in turn, the PSD determines the position of each IR-LED one at a time, and the cycle repeats continuously over the IR-LED array. However, for various other embodiments, two or more IR-LEDs in an array may emit simultaneously, using a sequencer that gives each IR-LED its own discernable pattern, that is, a relatively unique emission pattern per IR-LED. For example, each IR-LED operating at 250 Kb/sec may be assigned a pseudo-random bit sequence of 512 bits. This in turn allows the linear PSD sensor to receive a mixed signal from the simultaneously emitting IR-LEDs and then separate the relatively unique signal of each emitter based on the processing gain of the direct sequence spread spectrum transmission received from the IR-LED array (a sketch of this separation is given below). Such embodiments therefore constitute spread spectrum solutions, which have the advantage of being relatively immune to uncorrelated signals while tracking every emitter simultaneously (rather than in time sequence).
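A minimal sketch of the direct-sequence idea follows, assuming idealized, chip-synchronous reception and 512-chip pseudo-random codes (echoing the 512-bit sequences mentioned above); the per-LED signal levels, noise model, and variable names are illustrative assumptions rather than details from the patent.

```python
import numpy as np

rng = np.random.default_rng(7)
n_chips, n_leds = 512, 4

# Each IR-LED is assigned its own pseudo-random +/-1 chip sequence.
codes = rng.choice([-1.0, 1.0], size=(n_leds, n_chips))

# Per-LED signal levels as seen by one PSD channel (what we want to recover);
# the third LED is obscured and contributes nothing.
true_levels = np.array([0.8, 0.3, 0.0, 0.5])

# All LEDs emit simultaneously; the (very linear) PSD sees their sum plus ambient noise.
received = true_levels @ codes + 0.05 * rng.standard_normal(n_chips)

# Despreading: correlate the mixed signal with each LED's code to isolate its contribution.
recovered = codes @ received / n_chips
print(np.round(recovered, 2))   # close to [0.8, 0.3, 0.0, 0.5]
```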
For several embodiments, the arrangement of IR-LEDs (or "IR-LED array") may be powered by a battery. For various embodiments using a stylus with an IR-LED arrangement, where the stylus (that is to say the IR-LED) is battery powered, such embodiments may further comprise in-use detection means, such as a touch sensor, which is capable of detecting when the stylus is being used and of conserving battery power when the stylus is not in use, for example by entering a low power mode and turning off the IR-LED.
Although a stylus may use a battery for its IR-LEDs, the total required power is expected to be lower than that of an optical mouse, implying the potential for long battery life. In fact, unlike an optical mouse, the IR-LED emission need not be reflected from a surface of unknown albedo, but rather is intended to be received directly by the corresponding PSD sensor. Thus, for several such embodiments, if the emitted light is collected by a PSD sensor with an adequate acceptance angle (e.g., a minimum of 120 degrees in at least one direction), the IR-LED output power can be very low. This in turn avoids excessive light emission from the IR-LEDs, which could otherwise degrade performance of the PSD sensors by inadvertently illuminating nearby objects (such as a user's finger holding the stylus in which the IR-LEDs are disposed) and causing false signals to be detected by the PSD sensors.
In addition, some embodiments may modulate the emission of the IR-LEDs to provide a low-rate wireless digital communication channel from the stylus to the slate (e.g., tablet computer), which can be used to transmit additional information such as button press events, the positioning of a code wheel or linear slider, tip surface contact pressure data, battery status, and the like.
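One simple way such a low-rate channel could work is sketched below; this is an assumption for illustration, since the patent does not specify the modulation scheme. The idea is to flip the sign of the stylus's entire spreading sequence once per data bit, so the same correlation used for tracking also recovers the data bits.

```python
import numpy as np

rng = np.random.default_rng(3)
code = rng.choice([-1.0, 1.0], size=512)   # the stylus's assigned spreading sequence (assumed)

def modulate(bits):
    """Encode data (e.g., button state, tip pressure sample) by flipping the whole code per bit."""
    signs = np.where(np.asarray(bits) > 0, 1.0, -1.0)
    return np.concatenate([s * code for s in signs])

def demodulate(signal):
    """Correlate each 512-chip block against the code; the sign of the result is the bit."""
    blocks = signal.reshape(-1, code.size)
    return (blocks @ code > 0).astype(int)

tx = modulate([1, 0, 1, 1]) + 0.2 * rng.standard_normal(4 * 512)   # noisy optical channel
print(demodulate(tx))   # -> [1 0 1 1]
```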
An additional benefit of the various embodiments of the optical 3D tracking system described herein, and in particular of the use of PSDs and IR-LEDs, is that they are also suitable for providing indoor navigation features (or "position/location awareness"). More specifically, the PSD sensors may be used to sense the position of the slate relative to the room in which the slate is operating, using supplemental IR-LEDs that are stationary or relatively fixed in position.
For example, for certain embodiments, the PSD sensors of an optical 3D tracking system can also be used to sense infrared signals emitted from stationary IR-LED beacons in the vicinity of the device, e.g., in a room, hallway, car, or other such enclosed location. These beacons are capable of signaling their locations, as well as other aiding information, via infrared signals, which the computing device can use to calculate its own location and pose for a variety of purposes, such as indoor position location, navigation, and the like. Due to the high accuracy of the sensing system, absolute slate positioning and attitude (with sub-centimeter accuracy) within the enclosed space can be obtained. Further, the reference beacons may be stationary (to provide a fixed reference point) or dynamic (to provide a relative reference point). For some implementations, the reference beacons may announce their positioning and other aiding information through their transmitted infrared signals.
As such, certain embodiments disclosed herein relate to a system for optical navigation of a mobile computing device, the system including at least: (a) two positioning sensors fixedly coupled to the mobile computing device in a known configuration for sensing three discernable Infrared (IR) signals emitted from three IR position emitters having known positions, wherein the sensing includes determining a 2D direction of each of the three IR position emitters relative to each of the two positioning sensors (i.e., two 2D directions per IR position emitter) based on the three discernable IR signals; and (b) a processor for determining the location and orientation of the mobile computing device relative to the three IR position emitters using the (e.g., six) 2D directions, the known configuration of the two positioning sensors, and the known positions of the three IR position emitters.
FIG. 5 illustrates an exemplary use of PSD sensors 512 and stationary IR-LED beacons 514 for indoor navigation representative of several embodiments disclosed herein. Each stationary beacon 514 includes at least one IR-LED that emits a spread-spectrum digital signal 516 (illustrated in dashed lines) that is receivable by the PSD sensors 512 and decodable by a DSP (not shown) within an associated mobile computing device 510, such as a tablet computer. In some embodiments, each stationary beacon 514, which may be mounted, for example, to the ceiling 518 of a room, may transmit its relatively unique identification and its known location by modulating the infrared emission of its IR-LED with a secondary code. In some alternative embodiments, the known location of each beacon (possibly along with other additional information) may instead be published via a locally accessible communication system (e.g., a Wi-Fi system covering the area) that can be received by the computing device 510. By receiving signals from three or more stationary beacons 514, the computing device 510 may calculate its position and orientation (e.g., pose) based on the known locations of the detectable stationary beacons 514, as sketched below.
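Continuing the earlier sketches (and reusing the hypothetical direction_vector, triangulate, and stylus_pose helpers, together with the assumed sensor positions sensor_a and sensor_b defined there), beacon navigation can be illustrated as: triangulate each beacon in the device frame, then fit the rigid transform that maps those device-frame points onto the beacons' known room coordinates. All numbers below are made up for illustration.

```python
import numpy as np

# Known room positions of three beacons (metres), e.g. learned from their modulated IDs.
beacon_room = np.array([[1.0, 3.0, 2.5],
                        [4.0, 3.0, 2.5],
                        [2.5, 6.0, 2.5]])

# Angles reported by the two PSD sensors for each beacon (assumed measurements, degrees).
beacon_device = np.array([
    triangulate(sensor_a, direction_vector(az_a, el_a),
                sensor_b, direction_vector(az_b, el_b))
    for (az_a, el_a, az_b, el_b) in [(10, 60, 8, 61), (-25, 55, -27, 56), (5, 40, 3, 41)]
])

# Rotation/translation taking device-frame coordinates into room coordinates.
R_dev, t_dev = stylus_pose(beacon_device, beacon_room)
print(t_dev)   # the device's estimated position in the room; R_dev encodes its attitude (pose)
```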
Again, the two PSD sensors 512 provide the mobile computing device 510 with something comparable to stereoscopic vision: the slight differences in perspective when sensing the three beacons 514, measured as the horizontal and vertical angles of the signals 516, define the direction from each PSD sensor to each IR-LED beacon 514 (a total of six directions for this particular example). These directions enable geometric determination of the position of the mobile computing device 510, and of its orientation relative to the defined plane of its display device (or slate), with respect to the IR-LED beacons 514.
Furthermore, certain embodiments disclosed herein relate to a system for relative optical positioning, the system for each mobile computing device comprising at least: (a) two positioning sensors fixedly coupled to a mobile computing device in a known configuration for sensing a plurality of discernable IR signals emitted from a plurality of IR sources in a 3D space, wherein the sensing comprises determining, for each discernable IR signal, two 2D directions of a corresponding IR signal source relative to each of the two positioning sensors; (b) an IR laser emitter fixedly coupled to the first mobile computing device in a first known orientation for emitting discernable IR laser light that, when reflected and scattered by an interference barrier at a point of impact in 3D space, produces discernable IR signals emanating from the point of impact that can be sensed by the two positioning sensors, wherein the emitted IR laser light is modulated to provide a digital communication channel from the mobile computing device to the other mobile computing devices; and (c) a processor for estimating a position of the mobile computing device relative to the IR source from the point of impact based on the two 2D directions, the known configurations of the two positioning sensors, and the known position and known emission direction of the IR laser.
For several embodiments, the IR laser emitter may be modulated with a dedicated code sequence, similar to the IR-LEDs disclosed earlier herein. Furthermore, various embodiments may use low-power IR lasers that comply with Occupational Safety and Health Administration (OSHA) regulations relating to invisible laser emissions; invisible emissions pose an increased risk to the human eye because they do not trigger the blink reflex that protects the eye from conventional laser pointers. For low-power IR lasers, certain embodiments may ensure that the laser emission is unconditionally eye-safe by limiting the emission to 1 microwatt within the pupil aperture of the eye.
FIG. 6 illustrates an exemplary use of PSD sensors 612 and a dynamic beacon 614, comprising the above-described system, for indoor navigation representative of several embodiments disclosed herein. A first computing device 610 may include at least one dynamic beacon 614 in addition to its set of PSD sensors 612. The dynamic beacon 614, which may be implemented as an infrared laser pointer, may emit a modulated infrared signal 615 (shown in dotted lines) having a relatively unique spreading sequence. The signal 615 is then reflected at the point where it strikes a surface 617 and scattered as a reflected signal 616 (shown in dashed lines) so as to be detectable by PSD sensors such as the PSD sensors 612.
Because the dynamic beacon 614 emits its infrared laser signal 615 from a precisely known location, direction, and angle relative to the PSD sensors, the computing device 610, using its PSD sensors, is able to determine the intersection of the infrared laser emission (e.g., signal 615) with, for example, a wall (not shown) or ceiling 518 of the room in which the computing device 610 is located. This information may be used to infer the position and/or pose of the computing device based, in part, on various assumptions about the configuration of the room around the computing device (e.g., the ceiling is horizontal, the walls are vertical, etc.). In any case, since the direction of emission is known to the computing device, the location where the laser beam hits the reflective surface can be determined using only a single PSD sensor, as sketched below.
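That single-sensor determination can be sketched as intersecting two known rays: the laser's emission ray and the observation ray from one PSD sensor toward the sensed spot. The snippet below reuses the hypothetical direction_vector and triangulate helpers from the earlier sketches; the geometry values are assumptions, not data from the patent.

```python
import numpy as np

laser_origin = np.array([0.12, 0.01, 0.0])   # laser emitter position on the device (assumed)
laser_dir = direction_vector(0.0, 75.0)      # known, fixed emission direction (assumed)

psd_origin = np.array([0.0, 0.0, 0.0])       # one PSD sensor on the same device
psd_dir = direction_vector(2.5, 74.0)        # direction of the sensed reflection (assumed)

strike_point = triangulate(laser_origin, laser_dir, psd_origin, psd_dir)
print(strike_point)   # where the beam meets the ceiling/wall, in device coordinates
```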
Further, the first computing device 610 may interact with one or more similarly configured additional computing devices, such as a second computing device 620 that includes PSD sensors 622 and a dynamic beacon 624 emitting its own modulated infrared signal 625 (shown in dotted lines), which is reflected/scattered 626 (shown in dashed lines) at the point where the infrared signal 625 strikes a surface 627. In that case, each computing device 610 and 620 is able to determine its probable relative positioning with respect to the other computing devices with which it can now communicate via infrared signals. More specifically, by sharing this laser strike location information, a group of computing devices can determine their relative positions with respect to each other using only three commonly visible laser strike points (not necessarily from the same laser) that can be detected by each computing device.
Indoor navigation allows a variety of new applications for computing devices, including, for example, taking pictures at different positions and orientations relative to a beacon (stationary or dynamic), which in turn may support image integration, 3D reconstruction, 3D modeling, mapping, and other uses. It is also possible to support an interactive user interface (UI) between multiple mobile devices and their peripherals, so that the stylus of a first computing device can interact with a stationary screen and with projected images emanating from a peripheral projection display device (attached either to the first computing device or to a second computing device in communication with the first computing device). These UIs may also include special gesture inputs, such as, for example, pointing to a portion of a live presentation projected onto a large stationary screen and adding annotations.
Mobile computing devices equipped with active beacon projectors also enable applications that can capture an accurate architectural floor plan by moving through open spaces with the slate exposed and actively recording. Similarly, such devices may be used to capture precise 3D shapes of larger objects, such as sculptures, automobiles, and the like, for 3D modeling, 3D CAD, or other purposes.
The additional inter-device capabilities may also include selectively sending a message from one computing device to another computing device. For example, computing devices 610 and 620 may establish an ad hoc (peer-to-peer) coordinate system in which a projector may be used by (i.e., interacted with by) all participating computing devices, for example to point at information or to add annotations to the projected image, transforming the projected image into a shared interactive work surface. Similarly, these location features can be used for interactive gaming, 3D modeling, and several other applications.
In other embodiments, the positioning and pose of two devices relative to two or more reference beacons may be used to enable one device to interact directly with the output displayed by the other device, and vice versa. For example, a stylus from a first device may be pointed at and interact directly with a projected display of a second device (i.e., on a projection screen at a known, fixed position relative to the reference beacons and thus to both devices).
In particular, this includes the ability to exchange positioning and pose information with similarly equipped computing devices, which in turn enables new forms of user interface. For example, pointing a mobile computing device at another unit may be used as a gesture to establish communication between those devices, and only those devices.
FIG. 7A is a process flow diagram illustrating an example process 700 for determining the position and orientation of a touch-type input device for use with the various embodiments disclosed herein. At 702, the mobile computing device senses a plurality of IR signals with its IR positioning sensors and determines, for each IR positioning sensor, a 2D direction for each IR signal. At 704, the source of each IR signal (e.g., a particular IR-LED on the touch-type input device) is identified; that is, each IR signal source (e.g., IR-LED) is matched to a particular IR position emitter on the stylus, the configuration of which is already known to the mobile computing device (or, alternatively, the configuration may be transmitted by the stylus as modulated data on each IR signal). At 706, the position and orientation of the touch-type input device is determined based on the directions of the resulting IR signals and the known configurations of both the IR positioning sensors (and the differences in their readings) and the IR position emitters (which originated those IR signals).
FIG. 7B is a process flow diagram illustrating an exemplary process 730 for mobile computing device beacon navigation (i.e., determining location using beacons) representative of various embodiments disclosed herein. At 732, the mobile computing device senses IR location emissions (with its two IR positioning sensors) from at least three beacons (each having at least one IR position emitter) and determines, for each IR positioning sensor, a 2D direction for each IR signal (i.e., for each beacon). At 734, each IR signal source is matched to its location (in some embodiments by a look-up of known beacons, while in other embodiments this information may be encoded (modulated) into the IR signal itself), and these locations are matched to the relative directions determined by each IR positioning sensor. At 736, the location and orientation of the mobile computing device relative to the beacons is determined based on the directions of the resulting IR signals, the known configuration of the IR positioning sensors (and the differences in their readings), and the known locations of the beacons (which originated those IR signals).
FIG. 7C is a process flow diagram illustrating an exemplary process 760 for mobile computing device laser positioning (i.e., deriving relative local navigation/position information) representative of various embodiments disclosed herein. At 762, the mobile computing device emits laser light, for example from an IR laser device having a known position and orientation relative to its IR positioning sensors; upon collision with an interfering object, the laser light produces scattering of the device's own IR signal, which the device senses with its IR positioning sensors along with any other IR signals that may be emitted by beacons and/or by other mobile computing devices with similar capabilities. At 764, the sensed IR signals are matched to their respective sources, including the device's own laser collision point (corresponding to the location where the laser strikes the interfering object), beacons (which may transmit their location information, as well as other encoded (i.e., modulated) data, in their IR signals), and IR signals from other mobile computing devices originating from their laser collision points. Each mobile computing device encodes (modulates) positional information about its laser impact point in its continuous laser beam, and may also use the modulated laser light as a communication medium with other mobile devices. At 766, each IR signal, its direction (as determined by each IR positioning sensor), and any corresponding location information (such as location information provided by other computing devices or beacons) is processed to derive local location information relative to the received IR signals.
FIG. 8 illustrates an exemplary computing environment in which example embodiments and aspects may be implemented. The computing system environment is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality. Many other general purpose or special purpose computing system environments or configurations may be used. Examples of well known computing systems, environments, and/or configurations that may be suitable for use include, but are not limited to, Personal Computers (PCs), server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, network PCs, minicomputers, mainframe computers, embedded systems, distributed computing environments that include any of the above systems or devices, and the like.
Computer-executable instructions, such as program modules, being executed by a computer may be used. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Distributed computing environments may be used where tasks are performed by remote processing devices that are linked through a communications network or other data transmission medium. In a distributed computing environment, program modules or other data may be located in both local and remote computer storage media including memory storage devices.
With reference to FIG. 8, an exemplary system for implementing aspects described herein includes a computing device, such as computing device 800. In its most basic configuration, computing device 800 typically includes at least one processing unit 802 and memory 804. Depending on the exact configuration and type of computing device, memory 804 may be volatile (such as RAM), non-volatile (such as read-only memory (ROM), flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in fig. 8 by dashed line 806.
Computing device 800 may have additional features/functionality. For example, computing device 800 may include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. These additional memories are illustrated in FIG. 8 by removable memory 808 and non-removable memory 810.
Computing device 800 typically includes a variety of computer-readable media. Computer readable media can be any available media that can be accessed by device 800 and includes both volatile and nonvolatile media, removable and non-removable media.
Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Memory 804, removable storage 808, and non-removable storage 810 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 800. Any such computer storage media may be part of computing device 800.
Computing device 800 may include one or more communication connections 812 that allow the device to communicate with other devices. Computing device 800 may also have one or more input devices 814, such as a keyboard, a mouse, a pen, a voice input device, a touch input device, etc. One or more output devices 816, such as a display, speakers, or a printer, may also be included. All of these devices are well known in the art and need not be discussed at length here.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and apparatus of the presently disclosed subject matter, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the presently disclosed subject matter.
Although exemplary embodiments may refer to utilizing aspects of the presently disclosed subject matter in the context of one or more stand-alone computer systems, the subject matter is not so limited, but rather may be implemented in connection with any computing environment, such as a network or a distributed computing environment. Further, aspects of the presently disclosed subject matter may be implemented in or across a plurality of processing chips or devices, and storage may similarly be spread across a plurality of devices. Such devices may include, for example, personal computers, network servers, and handheld devices.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
According to an embodiment of the present invention, the following scheme is provided:
supplementary note
1. A three-dimensional, 3D touch-type input system for a mobile computing device, the system comprising:
three infrared IR position emitters fixedly coupled to the touch-type input device in a first known configuration for emitting three discernable IR signals;
two positioning sensors fixedly coupled to the mobile computing device in a second known configuration for sensing the three discernable IR signals emitted from the three IR position emitters, wherein the sensing comprises determining two-dimensional (2D) directions of each of the three IR position emitters relative to each of the two positioning sensors based on the three discernable IR signals; and
a processor to determine a location and orientation of the touch-type input device relative to the mobile computing device based on the 2D direction, the first known configuration, and the second known configuration.
2. The system of supplementary note 1, wherein the discernable IR signal is modulated to provide a digital communication channel from the touch-type input device to the mobile computing device.
3. The system according to supplementary note 1, wherein the IR position emitter includes an IR light emitting diode.
4. The system according to supplementary note 1, wherein the positioning sensor includes a positioning sensitive diode.
5. The system according to supplementary note 4, wherein the positioning sensor further comprises a wide-angle lens characterized by a 120-degree minimum aperture in at least one direction.
6. The system according to supplementary note 1, wherein the touch-type input device is a stylus-type device.
7. The system of supplementary note 1, wherein the touch-type input device is separate from the mobile computing device, battery powered, and includes power saving features for when the touch-type input device is not in use.
8. The system according to supplementary note 1, wherein the specific orientation of the touch-type input device provides at least one supplemental user input.
9. The system according to supplementary note 1, wherein the three IR position emitters include IR light emitting diodes.
10. The system according to supplementary note 1, wherein the two positioning sensors include positioning sensitive diodes.
11. The system according to supplementary note 1, wherein:
the two positioning sensors sense three discernable infrared beacon signals emitted from three IR beacons, wherein the sensing includes determining the two 2D directions of each of the three IR beacons relative to each of the two positioning sensors based on the three discernable IR beacon signals, and wherein the sensing includes demodulating the IR beacon signals to receive position data of each of the three IR beacons, and
the processor determines a location and orientation of the mobile computing device relative to the three IR beacons based on the 2D direction, the known configurations of the two positioning sensors, and known locations of the three IR beacons.
12. A system for optical navigation of a mobile computing device, the system comprising:
two positioning sensors fixedly coupled to the mobile computing device in a known configuration for sensing three discernable IR signals emitted from three infrared IR position emitters having known positions, wherein the sensing comprises determining two-dimensional (2D) directions of each of the three IR position emitters relative to each of the two positioning sensors based on the three discernable IR signals, and
a processor to determine a location and orientation of the mobile computing device relative to the three IR position emitters based on the 2D direction, the known configurations of the two positioning sensors, and the known locations of the three IR position emitters.
13. The system of supplementary note 12, wherein the discernable IR signal is modulated to provide a digital communication channel from the IR location emitter to the mobile computing device.
14. The system of supplementary note 12 wherein the location information is obtained from the digital communication channel.
15. The system of supplementary note 12, wherein the three IR position emitters comprise IR light emitting diodes.
16. The system according to supplementary note 12, wherein the two positioning sensors include positioning sensitive diodes.
17. A system for relative optical positioning of mobile computing devices, the system comprising:
two positioning sensors fixedly coupled to the mobile computing device in a known configuration for sensing a plurality of discernable IR signals emitted from a plurality of infrared IR sources in a three-dimensional 3D space, wherein the sensing comprises determining, for each discernable IR signal, two-dimensional 2D directions of a corresponding IR signal source relative to each of the two positioning sensors;
an IR laser emitter fixedly coupled to the first mobile computing device in a first known orientation for emitting discernable IR laser light that, when reflected and scattered by an interference barrier at a point of impact in the 3D space, produces discernable IR signals diverging from the point of impact that can be sensed by the two positioning sensors, wherein the emitted IR laser light is modulated to provide a digital communication channel from the mobile computing device to other mobile computing devices; and
a processor to estimate a location of the mobile computing device relative to the IR signal source from the collision point based on the two 2D directions, the known configurations of the two location sensors, and a known location and a known emission direction of the IR laser.
18. The system of supplementary note 17, wherein the mobile computing device sends a communication to a second mobile computing device via the IR laser transmitter, wherein the mobile computing device receives a communication from the second mobile computing device via the two positioning sensors.
19. The system of supplementary note 17, wherein the IR laser transmitter comprises an IR light emitting diode.
20. The system according to supplementary note 17, wherein each of the two positioning sensors comprises a positioning sensitive diode.

Claims (10)

1. A three-dimensional, 3D, touch-type input system (148) for a mobile computing device (140), the system comprising:
three infrared IR position emitters (272) fixedly coupled to the touch-type input device (250) in a first known configuration for emitting three discernable IR signals (424);
two positioning sensors (260, 262) fixedly coupled to the mobile computing device (140) in a second known configuration for sensing the three discernable IR signals (424) emitted from the three IR position emitters (272), wherein the sensing includes determining two-dimensional (2D) directions of each of the three IR position emitters (272) relative to each of the two positioning sensors (260, 262) based on the three discernable IR signals (424); and
a processor (802) for determining a position and orientation of the touch-type input device (250) relative to the mobile computing device (140) and for determining a 3D position and orientation of the touch-type input device in a 3D space above a surface of the mobile computing device based on the 2D direction, the first known configuration, and the second known configuration.
2. The system of claim 1, wherein the discernable IR signal (424) is modulated to provide a digital communication channel from the touch input device (250) to the mobile computing device (140).
3. The system of claim 1, wherein the IR position emitter (272) comprises an IR light emitting diode, and wherein the positioning sensor (260, 262) comprises a positioning sensitive diode.
4. The system of claim 1, wherein the touch-type input device (250) is a stylus-type device.
5. The system of claim 1, wherein the touch-type input device (250) is separate from the mobile computing device, battery powered, and includes power saving features for when the touch-type input device is not in use.
6. The system of claim 1, wherein a particular orientation of the touch-type input device (250) provides at least one supplemental user input.
7. The system of claim 1, wherein the three IR position emitters (272) comprise IR light emitting diodes, and wherein the two positioning sensors (260, 262) comprise positioning sensitive diodes.
8. A method (700) for optical navigation of a mobile computing device (140), the method comprising:
sensing (702) three discernable Infrared (IR) signals (424) emitted from three IR position emitters (272) having known locations, wherein the sensing (702) includes determining two-dimensional (2D) directions of each of the three IR position emitters (272) relative to each of two positioning sensors (260, 262) based on the three discernable IR signals; and
utilizing a processor (802) to determine a location and orientation of the mobile computing device (140) relative to the three IR position emitters (272) and to determine a 3D location and orientation of a touch-type input device in a 3D space above a surface of the mobile computing device based on the 2D direction, known configurations of the two positioning sensors (260, 262), and the known locations of the three IR position emitters (272).
9. The method of claim 8, wherein the discernable IR signal (424) is modulated to provide a digital communication channel from the IR location emitter (272) to the mobile computing device (140).
10. The method of claim 8, wherein each of the three IR position emitters (272) comprises an IR light emitting diode, and wherein each of the two positioning sensors (260, 262) comprises a positioning sensitive diode.
HK13110180.4A 2011-11-02 2013-09-02 Optical tablet stylus and indoor navigation system HK1182816B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/287,147 2011-11-02

Publications (2)

Publication Number Publication Date
HK1182816A HK1182816A (en) 2013-12-06
HK1182816B true HK1182816B (en) 2017-10-06

