
US20160349864A1 - Digital ultrasonic emitting base station - Google Patents


Info

Publication number
US20160349864A1
Authority
US
United States
Prior art keywords
base station
digital pen
dimensional position
dimensional
transmitters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/231,231
Inventor
Roberto Avanzi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Priority to US15/231,231
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVANZI, ROBERT
Publication of US20160349864A1
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AVANZI, ROBERTO
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S ADDRESS PREVIOUSLY RECORDED AT REEL: 046267 FRAME: 0521. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: AVANZI, ROBERTO

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • Embodiments disclosed herein are generally directed to tracking, in an acoustic tracking system, a position of a digital pen relative to a base station.
  • a user may take notes using a computer. While the user may be able to type quickly, adding graphics, arrows, tables, or other objects to her notes may be difficult using the computer. The user may perform these actions more quickly on paper than on a laptop screen or even a tablet.
  • an electronic notepad may include a clipboard that holds a piece of paper. The user may sketch on the paper while the electronic notepad digitally records the user's sketches. The user tethers the electronic notepad to a computer to transfer the data recorded by the electronic notepad to the computer. Tethering the electronic notepad to the computer in order to transfer the data, however, may be inconvenient for the user.
  • the electronic notepad system may be expensive because of the onboard processing circuitry on the electronic notepad.
  • the electronic notepad may consume a lot of power.
  • the acoustic tracking system includes a first plurality of receivers that detects first acoustic signals from a first set of transmitters disposed on a digital pen and a second plurality of receivers that detects second acoustic signals from a second set of transmitters disposed on a base station.
  • the acoustic tracking system also includes a processing component that defines, based on the second acoustic signals, a two-dimensional plane on which the base station lies, determines, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station, and projects the three-dimensional position of the digital pen onto the two-dimensional plane.
  • the acoustic tracking system further includes an application controller that records, based on the projected three-dimensional position of the digital pen onto the two-dimensional plane, the three-dimensional position of the digital pen relative to the base station, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • an example method of recording an object based on movement of a digital pen relative to a base station includes detecting, by a first plurality of receivers coupled to a computing device, first acoustic signals transmitted from a first set of transmitters disposed on a digital pen. The method also includes detecting, by a second plurality of receivers coupled to the computing device, second acoustic signals transmitted from a second set of transmitters disposed on a base station. The method further includes defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies.
  • the method also includes determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station.
  • the method further includes projecting the three-dimensional position of a tip of the digital pen onto the two-dimensional plane.
  • the method also includes recording the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • a computer-readable medium having stored thereon computer-executable instructions for performing operations, including: detecting first acoustic signals transmitted from a first set of transmitters disposed on a digital pen; detecting second acoustic signals transmitted from a second set of transmitters disposed on a base station; defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies; determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station; projecting the three-dimensional position of the digital pen onto the two-dimensional plane; and recording the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position represents an object representative of movement of the digital pen.
  • a system for recording an object based on movement of a digital pen relative to a base station includes means for detecting first acoustic signals transmitted from a first set of transmitters disposed on a digital pen.
  • the system also includes means for detecting second acoustic signals transmitted from a second set of transmitters disposed on a base station.
  • the system further includes means for defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies.
  • the system also includes means for determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station.
  • the system further includes means for projecting the three-dimensional position of the digital pen onto the two-dimensional plane.
  • the system also includes means for recording the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • FIG. 1 is a diagram illustrating an example acoustic tracking system.
  • FIG. 2 is a diagram illustrating an acoustic tracking system, consistent with some embodiments.
  • FIG. 3 is a diagram illustrating a two-dimensional plane on which a base station 208 lies, consistent with some embodiments.
  • FIG. 4 illustrates a base station, consistent with some embodiments.
  • FIG. 5 is a flowchart illustrating a method of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • FIG. 6 is a diagram illustrating a platform capable of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • FIG. 1 is a diagram illustrating an example acoustic tracking system 100 .
  • Acoustic tracking system 100 includes a computing device 102 , a notepad 104 , and a digital pen 106 .
  • One or more transmitters that transmit acoustic signals may be disposed on digital pen 106 , and one or more receivers that detect the acoustic signals may be disposed on computing device 102 .
  • a user may write on notepad 104 using digital pen 106 .
  • digital pen 106 may transmit acoustic signals that are detected by computing device 102 .
  • computing device 102 may determine a position of digital pen 106 and record what the user is writing on notepad 104 .
  • the user wrote “ab” on notepad 104 , which is shown on a display of computing device 102 .
  • the objects drawn by the user after the movement of notepad 104 may not be properly aligned with what is displayed on computing device 102 .
  • An object may refer to text, graphics, or other symbols.
  • the “c” drawn by the user after notepad 104 was moved is not aligned with “ab” displayed on computing device 102 .
  • a possible solution to this problem may be to provide a fold cover for computing device 102 such that when the user opens the fold cover, notepad 104 is on one side and computing device 102 is on the other side of the fold cover. It may be inconvenient for the user, however, to carry the fold cover around and also to use notepad 104 in the fold cover. Additionally, the expense of purchasing a fold cover may add to the cost of using acoustic tracking system 100 .
  • FIG. 2 is a diagram illustrating an acoustic tracking system 200 , consistent with some embodiments.
  • Acoustic tracking system 200 may be used with devices such as smartphones, tablets, laptops, desktops, and personal digital assistants (PDAs).
  • One example of an acoustic signal-based position tracking system is a digital pen having one or more acoustic signal transmitters and a base station having one or more acoustic signal transmitters, where the acoustic signal transmitters are in communication with one or more receivers coupled to a computing device.
  • the digital pen and base station may interact with the computing device by transmitting acoustic signals, as will be discussed further below.
  • acoustic tracking system 200 includes a computing device 202 , a notepad 204 , a digital pen 206 , and a base station 208 .
  • Digital pen 206 includes a set of transmitters 210 including transmitter 210 A and transmitter 210 B that transmit acoustic signals.
  • Transmitter 210 A may be located near or at the tip of the digital pen, and transmitter 210 B may be located along a length of the digital pen.
  • Transmitter 210 A may be located within a proximity to a tip of the digital pen. For example, transmitter 210 A may be located within 0.5 millimeters of the tip of the digital pen.
  • a transmitter may also be referred to as an emitter.
  • Although two transmitters are illustrated as being disposed on digital pen 206 , other embodiments having more than two transmitters are within the scope of this disclosure.
  • more than two transmitters may be disposed on the digital pen if power, design complexity, and system robustness allow for it. More interference from adjacent transmitters on the digital pen may arise, however, depending on the signal pattern design. Orthogonal sequences with ideal correlation properties may be used for the transmitter pattern design. Further, a higher quantity of transmitters may violate the assumption of zero-mean range measurement noise and result in higher noise and less position tracking accuracy.
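As an illustration (not part of the disclosure), the "orthogonal sequences with ideal correlation properties" mentioned above can be sketched with Walsh-Hadamard codes, one well-known family of mutually orthogonal sequences. The choice of Hadamard codes, the code length of 8, and the function names below are assumptions for the example, not details from the patent.

```python
# Hypothetical sketch: assign (near-)orthogonal signature sequences to each
# ultrasonic transmitter so a receiver can tell them apart by correlation.
# Rows of a Sylvester-construction Hadamard matrix are mutually orthogonal.

def hadamard(n):
    """Build an n x n Hadamard matrix (n must be a power of two)."""
    h = [[1]]
    while len(h) < n:
        # Sylvester doubling: [[H, H], [H, -H]]
        h = ([row + row for row in h] +
             [row + [-x for x in row] for row in h])
    return h

def correlate(a, b):
    """Inner product of two equal-length sequences (zero-lag correlation)."""
    return sum(x * y for x, y in zip(a, b))

codes = hadamard(8)                    # 8 signatures; one per transmitter
same = correlate(codes[1], codes[1])   # strong self-correlation: 8
cross = correlate(codes[1], codes[2])  # orthogonal pair: 0
```

A receiver correlating an incoming sample stream against each code would see a large peak only for the code that was actually transmitted, which is one way the interference concern above can be mitigated.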
  • Base station 208 includes a set of transmitters 220 including transmitters 220 A- 220 D that transmit acoustic signals. Although four transmitters are illustrated as being disposed on base station 208 , other embodiments having two or more transmitters are within the scope of this disclosure.
  • a user may attach base station 208 to notepad 204 and write on notepad 204 using digital pen 206 . If the user moves notepad 204 , base station 208 moves along with it.
  • Both digital pen 206 and base station 208 emit acoustic signals that are received by computing device 202 .
  • Computing device 202 includes a set of receivers 230 for picking up the signals transmitted by set of transmitters 210 and set of transmitters 220 .
  • Set of receivers 230 may be coupled to computing device 202 and may continuously run such that they are always ready to receive input from the transmitters when computing device 202 is turned on. In another example, set of receivers 230 does not continuously run but wakes up periodically to receive input from the transmitters.
  • set of transmitters 210 and/or set of transmitters 220 may transmit a signal pattern of acoustic waves, such as an ultrasonic signal.
  • the transmitters may be any suitable ultrasonic device that includes one or more ultrasonic transducers to generate ultrasonic signals (e.g., speakers).
  • Set of receivers 230 may be any suitable acoustic receivers such as a microphone, and set of transmitters 210 and/or set of transmitters 220 may transmit ultrasonic signals to multiple microphones coupled to computing device 202 .
  • Computing device 202 may include a processing component 232 and a memory 234 .
  • processing component 232 may be one or more processors, central processing units (CPUs), image signal processors (ISPs), micro-controllers, or digital signal processors (DSPs), graphics processing units (GPUs), and audio signal processors, which may include analog and/or digital audio signal processors.
  • processing component 232 may be provided as hardware, software, or firmware, or combinations thereof in various embodiments.
  • Memory 234 may include a system memory component, which may correspond to random access memory (RAM), an internal memory component, which may correspond to read only memory (ROM), and an external or static memory, which may correspond to optical, magnetic, or solid-state memories, for example.
  • Memory 234 may correspond to a non-transitory machine-readable medium that includes, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, and/or any other medium from which processing component 232 is capable of reading.
  • processing component 232 extracts acoustic signals received by each receiver of set of receivers 230 . For each acoustic signal received at a receiver, processing component 232 may determine which transmitter transmitted the respective acoustic signal. The received acoustic signals may be compared to an expected signal by processing component 232 according to instructions stored in memory 234 and an expected signal stored in memory 234 or generated by processing component 232 , for example.
  • Memory 234 may include an application controller 235 and an electronic notepad application 236 .
  • Application 236 receives input from application controller 235 regarding the positions of base station 208 and of digital pen 206 relative to base station 208 .
  • Application controller 235 affects operation of an application based on determined positions of base station 208 and of digital pen 206 relative to base station 208 .
  • Computing device 202 is a separate computer from base station 208 and digital pen 206 and processes the acoustic signals received from set of transmitters 210 and set of transmitters 220 . It may be advantageous for computing device 202 to perform these calculations rather than digital pen 206 or base station 208 because computing device 202 may have a longer battery life compared to digital pen 206 or base station 208 . Additionally, computing device 202 may have a better performing processor compared to digital pen 206 or base station 208 . As such, computing device 202 's processing power may be leveraged and base station 208 and digital pen 206 may have minimal power requirements. It should also be understood, however, that digital pen 206 may include software that performs some of the calculations that are described as being performed by processing component 232 .
  • Base station 208 transmits acoustic signals 240 .
  • computing device 202 detects, by a plurality of receivers of set of receivers 230 , acoustic signals 240 transmitted from set of transmitters 220 disposed on base station 208 .
  • Processing component 232 may extract the acoustic signals received by the plurality of receivers and define, based on acoustic signals 240 , a two-dimensional plane on which base station 208 lies.
  • FIG. 3 is a diagram illustrating a two-dimensional plane 302 on which base station 208 lies, consistent with some embodiments.
  • FIGS. 2 and 3 will be discussed together to better explain how objects drawn by the user on notepad 204 using digital pen 206 are captured by computing device 202 .
  • base station 208 may transmit acoustic signals 240 that may be detected by computing device 202 and used by processing component 232 to define a two-dimensional plane on which base station 208 lies.
  • processing component 232 may define the two-dimensional plane by determining a location of at least three of the transmitters disposed on base station 208 .
  • the position accuracy may be dependent on transducer placement, signal sequence design, and transducer acoustics (e.g., porting and signal-to-noise ratio).
  • A fourth transmitter (e.g., transmitter 220 D) may be optional in a three-emitter configuration. The fourth transmitter, however, may provide a more accurate result and may also be used if one of the other transmitters does not have a clear line of sight with receivers coupled to computing device 202 .
  • Each transmitter (of set of transmitters 210 or set of transmitters 220 ) may transmit a different acoustic signal pattern with respect to each other.
  • the different acoustic signal patterns may help to maintain a clear line of sight between the transmitters and receivers.
  • a multiplexing technique may be used to properly control the power used by these transmitters.
  • transmitters may transmit acoustic signals using Time Division Multiple Access (TDMA).
  • a plurality of receivers of set of receivers 230 may receive at a first time slot a first ultrasonic signal from transmitter 220 A, receive at a second time slot a second ultrasonic signal from transmitter 220 B, receive at a third time slot a third ultrasonic signal from transmitter 220 C, and/or receive at a fourth time slot a fourth ultrasonic signal from transmitter 220 D.
  • the transmission of signals at different time slots may reduce the interference noise.
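The time-slot scheme above can be sketched as follows. The frame layout, slot duration, and names are illustrative assumptions, not values from the patent.

```python
# Hypothetical TDMA sketch: each base-station transmitter owns one time slot
# in a repeating frame, so a receiver can attribute an arrival to a
# transmitter from its slot index alone. Durations are made up.

SLOT_MS = 10                       # one slot per transmitter (illustrative)
TRANSMITTERS = ["220A", "220B", "220C", "220D"]
FRAME_MS = SLOT_MS * len(TRANSMITTERS)

def transmitter_for(arrival_ms):
    """Map a receive timestamp (ms) to the transmitter that owns its slot."""
    slot = int(arrival_ms % FRAME_MS) // SLOT_MS
    return TRANSMITTERS[slot]

# An arrival 25 ms into a frame falls in slot 2, i.e. transmitter 220C;
# 47 ms wraps to 7 ms of the next frame, i.e. slot 0 / transmitter 220A.
assert transmitter_for(25) == "220C"
assert transmitter_for(47) == "220A"
```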
  • transmitters may transmit acoustic signals using multiplexing techniques different from TDMA.
  • transmitters may transmit acoustic signals using Frequency Division Multiplexing (FDM).
  • a plurality of receivers of set of receivers 230 may receive at a first frequency sub-band a first ultrasonic signal from transmitter 220 A, receive at a second frequency sub-band a second ultrasonic signal from transmitter 220 B, receive at a third frequency sub-band a third ultrasonic signal from transmitter 220 C, and/or receive at a fourth frequency sub-band a fourth ultrasonic signal from transmitter 220 D.
  • transmitters 220 A- 220 D may transmit acoustic signals using Phase Division Multiplexing (PDM).
  • a plurality of receivers of set of receivers 230 may receive at a first phase of a channel a first ultrasonic signal from transmitter 220 A, receive at a second phase of the channel a second ultrasonic signal from transmitter 220 B, receive at a third phase of the channel a third ultrasonic signal from transmitter 220 C, and receive at a fourth phase of the channel a fourth ultrasonic signal from transmitter 220 D.
  • different frequencies or different durations may be used by the transmitters.
  • the acoustic signals may be emitted simultaneously if the transmitters emit at different frequencies relative to each other.
  • Processing component 232 may calculate a time difference of arrival (TDOA) for the acoustic signals received at the receivers and apply a least squares formula to the one or more calculated time differences of arrival to determine the position of at least three of transmitters 220 A, 220 B, 220 C, and 220 D.
  • processing component 232 may apply other formulas to determine a more accurate position of base station 208 .
  • a Kalman filter may be applied to determine a more accurate position of base station 208 .
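As a rough illustration of the least-squares position step, the sketch below uses range-based multilateration, a simplification of the TDOA processing above under the assumption that a range from each receiver to the transmitter has already been recovered from the acoustic signals. The receiver layout, distances, and function names are invented for the example.

```python
# Hedged sketch: subtracting one range equation from the others linearizes
# |x - r_i|^2 = d_i^2 into 2*(r_i - r_0) . x = |r_i|^2 - |r_0|^2 - d_i^2 + d_0^2,
# which is solved here by ordinary least squares (normal equations).

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def multilaterate(receivers, ranges):
    """Estimate a 3-D source position from >= 4 receivers and their ranges."""
    x0, d0 = receivers[0], ranges[0]
    rows, rhs = [], []
    for xi, di in zip(receivers[1:], ranges[1:]):
        rows.append([2 * (a - b) for a, b in zip(xi, x0)])
        rhs.append(sum(a * a for a in xi) - sum(a * a for a in x0)
                   - di * di + d0 * d0)
    # normal equations A^T A x = A^T b (least squares when > 3 rows)
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(3)]
    return solve3(AtA, Atb)

# Example: receivers at a tablet's corners, exact ranges to a known source.
rx = [(0, 0, 0), (0.2, 0, 0), (0, 0.3, 0), (0.2, 0.3, 0.05)]
src = (0.1, 0.1, 0.05)
ranges = [sum((a - b) ** 2 for a, b in zip(r, src)) ** 0.5 for r in rx]
estimate = multilaterate(rx, ranges)   # recovers src (up to float error)
```

With noisy measurements the same normal-equations step yields the least-squares fit, which a Kalman filter could then smooth over time as the bullet above suggests.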
  • Processing component 232 may then define, based on the position of at least three transmitters of transmitters 220 A, 220 B, 220 C, and 220 D, two-dimensional plane 302 on which base station 208 lies.
  • transmitter 220 A transmits an “origin” acoustic signal that is followed by acoustic signals transmitted by transmitter 220 B and then transmitter 220 C and then transmitter 220 D.
  • the orientation of base station 208 may be determined according to the position of transmitters 220 B, 220 C, and/or 220 D relative to transmitter 220 A.
  • the transmitters disposed on base station 208 may be arranged such that AB>AC, where AB lies in a vertical direction and AC lies in a horizontal direction relative to a portrait-oriented piece of paper.
  • Processing component 232 may be aware of this configuration and use this information to define the two-dimensional plane 302 on which base station 208 lies.
  • base station 208 transmits its location continuously via acoustic signals 240 .
  • base station 208 includes a motion sensor (not shown), and set of transmitters 220 is activated when the motion sensor detects that base station 208 has moved. Accordingly, it may be unnecessary for base station 208 to transmit its location continuously, thus reducing power consumption at base station 208 .
  • the motion sensor is coupled to the base station. In an example, the motion sensor is disposed on the base station.
  • base station 208 includes a button (not shown) that the user presses to instruct set of transmitters 220 to emit acoustic signals. In such an embodiment, it may be unnecessary for base station 208 to transmit its location continuously, thus reducing power consumption at base station 208 . Rather, when the user moves notepad 204 , the user may inform computing device 202 of notepad 204 's new location by pressing the button.
  • Processing component 232 may compute two-dimensional plane 302 .
  • processing component 232 determines three points, P 1 , P 2 , and P 3 .
  • Point P 1 corresponds to the bottom left corner of notepad 204 or of base station 208 , point P 2 corresponds to the bottom right corner, and point P 3 corresponds to the top left corner of the rectangle corresponding to notepad 204 or base station 208 , respectively.
  • Substituting each point into the plane equation gives $a x_i + b y_i + c z_i + d = 0$ for $i = 1, 2, 3$ (equations (1)-(3)). Equations (1)-(3) are solved up to a multiplicative factor for the four-dimensional vector $[a, b, c, d]$.
  • Let $\vec{i}$, $\vec{j}$, and $\vec{k}$ be the unit vectors corresponding to the x, y, and z axes of the three-dimensional space (see x, y, and z axes in FIG. 3 ).
  • Processing component 232 may determine the two vectors $\overrightarrow{P_1P_2}$ and $\overrightarrow{P_1P_3}$, shown in equations (4)-(5): $\overrightarrow{P_1P_2} = (x_2 - x_1)\vec{i} + (y_2 - y_1)\vec{j} + (z_2 - z_1)\vec{k}$ and $\overrightarrow{P_1P_3} = (x_3 - x_1)\vec{i} + (y_3 - y_1)\vec{j} + (z_3 - z_1)\vec{k}$.
  • Two-dimensional plane 302 is the plane that passes through the two vectors $\overrightarrow{P_1P_2}$ and $\overrightarrow{P_1P_3}$ (which lie on the same plane because they have one point in common) and may be computed by determining the plane whose normal is the vector orthogonal to $\overrightarrow{P_1P_2}$ and $\overrightarrow{P_1P_3}$, i.e., their cross product.
  • Equations (1)-(6) are example equations that may be used when three transmitters are disposed on base station 208 . As discussed, fewer than or more than three transmitters may be disposed on base station 208 . If more than three transmitters are used, more complex relations may be used.
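A minimal sketch of the plane construction described above, assuming three known transmitter positions P 1 , P 2 , P 3 : form the vectors P1P2 and P1P3, take their cross product as the plane normal (a, b, c), and recover d from P 1 . The helper names and coordinates are illustrative.

```python
# Illustrative sketch: fit the plane ax + by + cz + d = 0 through three
# points, with the cross product of P1P2 and P1P3 as the normal.

def sub(p, q):
    return [a - b for a, b in zip(p, q)]

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def plane_through(p1, p2, p3):
    """Return [a, b, c, d] with ax + by + cz + d = 0 through p1, p2, p3."""
    n = cross(sub(p2, p1), sub(p3, p1))        # plane normal (a, b, c)
    d = -sum(a * b for a, b in zip(n, p1))     # force p1 onto the plane
    return n + [d]

a, b, c, d = plane_through([0, 0, 0], [1, 0, 0], [0, 1, 0])
# the x-y plane: normal (0, 0, 1) and d = 0
```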
  • Two-dimensional plane 302 is used in conjunction with the three-dimensional position of digital pen 206 to record objects drawn by the user on notepad 204 using the digital pen.
  • computing device 202 may determine the location of digital pen 206 relative to the two-dimensional plane 302 in order to properly display the drawn objects (whether or not notepad 204 has been moved).
  • Computing device 202 works with two sets of coordinates, one set of coordinates (e.g., three-dimensional location of digital pen 206 ) which is to be interpreted relative to the other set of coordinates (e.g., two-dimensional plane 302 ).
  • the three-dimensional coordinates of digital pen 206 may be transformed into a two-dimensional coordinate system that is relative to that of notepad 204 and its spatial orientation.
  • These additional notepad relative coordinates may be made available to applications executing on computing device 202 (e.g., application 236 ) to record objects drawn by the user on notepad 204 using digital pen 206 .
  • the user may write "on-screen" by physically touching notepad 204 with digital pen 206 .
  • set of transmitters 210 emits acoustic signals 242 , and set of receivers 230 coupled to computing device 202 may detect the emitted signals.
  • Acoustic signals 242 are processed by processing component 232 to derive a three-dimensional position of the digital pen.
  • processing component 232 may track a position of each of the transmitters disposed on the digital pen and determine a three-dimensional position of a tip of the digital pen (e.g., a location of transmitter 210 A).
  • computing device 202 detects, by a plurality of receivers of set of receivers 230 , acoustic signals 242 transmitted from set of transmitters 210 disposed on digital pen 206 .
  • Processing component 232 may extract the acoustic signals received by the plurality of receivers and determine, based on acoustic signals 240 and 242 , a three-dimensional position of digital pen 206 relative to base station 208 .
  • Processing component 232 may use TDOA to determine the three-dimensional position of digital pen 206 and also to determine a three-dimensional position of a tip of digital pen 206 relative to base station 208 .
  • the position accuracy may be a key performance parameter for the digital pen and may be dependent on transducer placement, signal sequence design, and transducer acoustics (e.g., porting and signal-to-noise ratio).
  • Processing component 232 may use, for example, TDOA and apply a least squares formula to the one or more calculated time differences of arrival to determine the position(s) of transmitters 210 A and/or 210 B. Other techniques may be used to determine the three-dimensional position of digital pen 206 .
  • processing component 232 may use Time of Flight (TOF) of the transmitted pulses through a line of sight (LOS).
  • processing component 232 may determine a distance 306 between digital pen 206 and two-dimensional plane 302 .
  • two-dimensional plane 302 is defined by equation (7): $ax + by + cz + d = 0$.
  • a normal vector $\vec{n}$ 380 that is normal to two-dimensional plane 302 , and a generic vector $\vec{v}$ 382 from any point (x, y, z) on two-dimensional plane 302 (i.e., a point satisfying equation (7)) to the point P 4 , may be defined.
  • Distance 306 (the distance from point P 4 to two-dimensional plane 302 ) may be determined by projecting vector $\vec{v}$ 382 onto normal vector $\vec{n}$ 380, as shown in equation (8): $\mathrm{distance} = |\vec{v} \cdot \vec{n}| \, / \, |\vec{n}|$.
  • Distance 306 may be the distance between a point on digital pen 206 (e.g., tip of digital pen 206 or on transmitter 210 A or 210 B) and two-dimensional plane 302 . If distance 306 is within a threshold distance, digital pen 206 may be close enough to notepad 204 to determine that the user is using digital pen 206 to write on notepad 204 . In response to determining that distance 306 is within the threshold distance, processing component 232 projects a three-dimensional position of digital pen 206 onto two-dimensional plane 302 . Processing component 232 may use the three-dimensional position of a tip of digital pen 206 for the projection.
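The distance and threshold test described above can be sketched as follows. The hover threshold value and the function names are assumptions for illustration, not values from the patent.

```python
# Hedged sketch of the distance test: for the plane ax + by + cz + d = 0,
# the distance from a point (x, y, z) to the plane is
# |a x + b y + c z + d| / sqrt(a^2 + b^2 + c^2); if it is under a hover
# threshold, treat the pen as writing on the notepad.

HOVER_THRESHOLD = 0.005  # metres; illustrative assumption

def point_plane_distance(p4, plane):
    """Distance from p4 = (x, y, z) to the plane (a, b, c, d)."""
    a, b, c, d = plane
    num = abs(a * p4[0] + b * p4[1] + c * p4[2] + d)
    return num / (a * a + b * b + c * c) ** 0.5

def pen_is_writing(tip, plane, threshold=HOVER_THRESHOLD):
    return point_plane_distance(tip, plane) <= threshold

# A tip 3 mm above the x-y plane (a, b, c, d) = (0, 0, 1, 0) counts as writing.
assert pen_is_writing([0.1, 0.2, 0.003], (0, 0, 1, 0))
```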
  • processing component 232 may transform the three-dimensional position of the tip of digital pen 206 into a two-dimensional coordinate system that is relative to that of base station 208 and its spatial orientation. Accordingly, even if the location or orientation of notepad 204 changes, the user may continue writing on notepad 204 and computing device 202 may determine the position of digital pen 206 relative to the base station, allowing for proper alignment of the objects drawn on notepad 204 .
  • Processing component 232 may also determine a defined boundary corresponding to notepad 204 . In an example, processing component 232 translates the positions of transmitters 220 A- 220 D into a defined boundary. In response to determining that the location of digital pen 206 is within the defined boundary that lies on two-dimensional plane 302 , processing component 232 may transform the three-dimensional position of a tip of digital pen 206 into a two-dimensional coordinate system that is relative to that of base station 208 and its spatial orientation. The defined boundary may correspond to a position of notepad 204 and its length and width dimensions.
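The defined-boundary check might be sketched like this, assuming points P 1 , P 2 , and P 3 mark the notepad's corners and that the notepad's edges are perpendicular. All names and dimensions are illustrative.

```python
# Illustrative sketch: express the projected tip in the notepad's own
# (u, v) coordinates, with P1 as origin, P1P2 as the u axis and P1P3 as
# the v axis (assumed perpendicular), and accept it only if it falls
# within the notepad's width and height.

def sub(p, q):
    return [a - b for a, b in zip(p, q)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def notepad_coords(tip, p1, p2, p3):
    """Project the tip onto the notepad axes; returns (u, v) distances."""
    xu, yv, w = sub(p2, p1), sub(p3, p1), sub(tip, p1)
    u = dot(w, xu) / dot(xu, xu) ** 0.5   # length along P1P2
    v = dot(w, yv) / dot(yv, yv) ** 0.5   # length along P1P3
    return u, v

def inside_boundary(tip, p1, p2, p3):
    width = dot(sub(p2, p1), sub(p2, p1)) ** 0.5
    height = dot(sub(p3, p1), sub(p3, p1)) ** 0.5
    u, v = notepad_coords(tip, p1, p2, p3)
    return 0 <= u <= width and 0 <= v <= height
```

For a 0.2 m x 0.3 m notepad lying in the x-y plane, a tip at (0.1, 0.1, 0) lands inside the boundary while (0.3, 0.1, 0) lands outside it.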
  • processing component 232 determines three points, P 1 , P 2 , and P 3 .
  • Point P 1 corresponds to the bottom left corner of notepad 204 or of base station 208 , point P 2 corresponds to the bottom right corner, and point P 3 corresponds to the top left corner of the rectangle corresponding to notepad 204 or base station 208 , respectively.
  • the x-axis is ⁇ right arrow over (P 1 P 2 ) ⁇
  • the y-axis is ⁇ right arrow over (P 1 P 3 ) ⁇ .
  • Each point P i has (x, y, z)-coordinates (x i , y i , z i ).
  • Points P1, P2, and P3 may be detected and distinguished from each other because of the different frequencies transmitted by the transmitters disposed on base station 208 (e.g., "beeps" at different frequencies) or because those transmitters transmit unique patterns at different times.
  • Processing component 232 may perform the following projections:
  • the projection may be performed, for instance, with a 3×3 matrix transformation A and a three-dimensional translation vector v2, as shown in equations (11)-(13).
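Since equations (11)-(13) are not reproduced in this text, the following is a hedged sketch, under assumed names, of one way to express the projected tip position in the base station's frame: P1 as the origin, the P1-to-P2 direction as the x-axis, and the P1-to-P3 direction as the y-axis. It assumes the two axes are perpendicular, as for a rectangular notepad.

```python
# Hedged sketch (assumed names; equations (11)-(13) are not reproduced in
# this text): mapping a projected 3-D tip position into the base station's
# two-dimensional frame spanned by P1->P2 (x-axis) and P1->P3 (y-axis).

def sub(a, b):
    """Component-wise difference a - b."""
    return [x - y for x, y in zip(a, b)]

def dot(a, b):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    """Scale v to unit length."""
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v]

def plane_coordinates(tip, p1, p2, p3):
    """Return (x, y) of the tip in the base station frame.

    Assumes P1P2 and P1P3 are perpendicular, as for a rectangular
    notepad or base station.
    """
    u = normalize(sub(p2, p1))  # x-axis direction
    v = normalize(sub(p3, p1))  # y-axis direction
    rel = sub(tip, p1)          # tip relative to the origin P1
    return dot(rel, u), dot(rel, v)
```

Because the coordinates are taken relative to P1, P2, and P3, they move with the base station: if the notepad is shifted or rotated, the same pen stroke still yields the same plane coordinates.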
  • Processing component 232 may record the three-dimensional position of the tip of digital pen 206 based on the projection, where the recorded three-dimensional position of the tip represents an object that is drawn by the user on notepad 204 using digital pen 206 .
  • the three-dimensional position of the tip is represented as three numbers (e.g., the x, y, and z coordinates).
  • the object is representative of movement of the digital pen 206 across notepad 204 .
  • These coordinates may be made available to applications (e.g., e-notepad application 236 ) that are executing on computing device 202 and that record the user's writing on notepad 204 .
  • Processing component 232 may store the coordinates in memory 234 and display the object on a display 310 coupled to computing device 202 .
  • FIGS. 1-3 are merely examples, which should not unduly limit the scope of the claims.
  • base station 208 includes four processing units or "buttons" 402, 404, 406, and 408 that each contain a battery, a loudspeaker, and a control unit, and that can be glued onto a surface.
  • the processing units may be attached to three or four corners of a clipboard of the user's choice.
  • the clipboard may be any generic clipboard. In such an embodiment, the clipboard may be referred to as the base station.
  • the three or more processing units may emit their pulses at different times once moved, at different frequencies/signals (in order to allow computing device 202 to better detect their positions), or a combination of these.
  • base station 208 does not contain set of receivers 230. Rather, base station 208 interacts with computing device 202, which contains set of receivers 230. Accordingly, it may be unnecessary for base station 208 to process the positions of the processing units or to store them on the clipboard for later retrieval by computing device 202.
  • This embodiment may provide cost savings because it may be expensive to manufacture a base station 208 that processes and stores coordinates for later use. Instead, computing device 202 may process and store the coordinates, and base station 208 may use less complicated onboard circuitry than a base station that performs these actions.
  • processing component 232 may calculate a TDOA for each acoustic signal received at a plurality of receivers of set of receivers 230 .
  • Acoustic tracking systems may determine a position based on a TDOA and may do so without using a synchronization channel. In this way, it may be unnecessary to add additional hardware to acoustic tracking system 200 or to modify software based on the additional hardware.
  • Non-synchronized systems may use multiple receivers for receiving the emitted acoustical signal and calculating a Differential Time of Arrival (“DTOA”) that is a time delay measured between the multiple receivers.
  • Although TDOA is described as a way to determine the position of a transmitter, this is not intended to be limiting, and other techniques may be used.
  • an acoustic tracking system may determine the position of the transmitter based on a time of arrival (TOA) that may be synchronized. Synchronized systems may use a synchronization signal that travels faster than the speed of sound and is transmitted to the receiver for synchronizing the clocks of the transmitter and receiver. Additional modules may be placed on computing device 202 to receive the synchronization signal from the transmitters.
  • processing component 232 may calculate the TOF and may perform triangulation or other form of multilateration to determine the position of the transmitting device as a function of time.
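As a sketch of the TOF-plus-multilateration step, the fragment below converts a time of flight into a range (given synchronized clocks) and recovers a two-dimensional transmitter position from three receiver ranges by subtracting pairs of circle equations. The receiver layout and helper names are assumptions for illustration, not the application's implementation.

```python
# Hedged sketch (assumed receiver layout and names): with synchronized
# clocks, each time of flight converts to a range, and three ranges from
# known receiver positions pin down a 2-D transmitter position. Subtracting
# pairs of circle equations leaves a small linear system.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def tof_to_range(t_emit, t_arrival):
    """Convert a time of flight into a distance in meters."""
    return SPEED_OF_SOUND * (t_arrival - t_emit)

def trilaterate_2d(receivers, ranges):
    """Solve for (x, y) from three receiver positions and their ranges."""
    (x1, y1), (x2, y2), (x3, y3) = receivers
    d1, d2, d3 = ranges
    # circle_1 - circle_2 and circle_1 - circle_3 give two linear equations.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1  # nonzero when the receivers are not collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```

With more than three receivers, the same idea extends to an overdetermined system solved by least squares, which is one way the extra measurements can improve accuracy.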
  • an infrared (IR) signal may be used for synchronization due to its low cost and low power requirements.
  • Another synchronization signal that may be used is a radio wave synchronization signal.
  • using a radio wave as a synchronization signal may still require a dedicated hardware synchronization block between the radio wave circuitry and the audio processing circuitry to maintain the required synchronization.
  • generating and receiving a radio wave synchronization signal may use more power than generating and receiving an IR signal.
  • FIG. 5 is a flowchart illustrating a method 500 of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • Method 500 is not meant to be limiting and may be used in other applications.
  • Method 500 includes blocks 502 - 512 .
  • first acoustic signals transmitted from a first set of transmitters disposed on a digital pen are detected by a first plurality of receivers coupled to a computing device.
  • a first plurality of receivers of set of receivers 230 detects acoustic signals 242 transmitted from set of transmitters 210 disposed on digital pen 206 .
  • second acoustic signals transmitted from a second set of transmitters disposed on a base station are detected by a second plurality of receivers coupled to the computing device.
  • a second plurality of receivers of set of receivers 230 detects acoustic signals 240 transmitted from set of transmitters 220 disposed on base station 208 .
  • One or more receivers of the first and second pluralities of receivers may overlap.
  • a two-dimensional plane on which the base station lies is defined based on the second acoustic signals.
  • processing component 232 defines, based on acoustic signals 240, two-dimensional plane 302 on which base station 208 lies.
  • processing component 232 determines, based on acoustic signals 240 and 242 , a three-dimensional position of digital pen 206 relative to base station 208 .
  • the three-dimensional position of the digital pen is projected onto the two-dimensional plane.
  • processing component 232 projects the three-dimensional position of digital pen 206 onto two-dimensional plane 302 .
  • Processing component 232 may project the three-dimensional position of the tip of digital pen 206 onto two-dimensional plane 302 .
  • the three-dimensional position of the digital pen is recorded based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • processing component 232 records the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • Processing component 232 may record the three-dimensional position of the digital pen's tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • method 500 may include a process of extracting acoustic signals received by each receiver of set of receivers 230 . It is also understood that one or more of the blocks of method 500 described herein may be omitted, combined, or performed in a different sequence as desired.
  • FIG. 6 is a diagram illustrating a platform capable of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • Computing device 202 may run a platform 600 .
  • Platform 600 includes a user interface 602 that is in communication with a control unit 604 , e.g., control unit 604 accepts data from set of receivers 230 and controls user interface 602 .
  • User interface 602 includes display 310 , which includes a means for displaying graphics, text, and images, such as an LCD or LPD display.
  • User interface 602 may further include a keypad 610 or other input device through which the user can input information into the platform 600 . If desired, keypad 610 may be obviated by integrating a virtual keypad into display 310 . It should be understood that with some configurations of platform 600 , portions of user interface 602 may be physically separated from control unit 604 and connected to control unit 604 via cables or wirelessly, for example, in a Bluetooth headset.
  • Control unit 604 accepts and processes data from set of receivers 230 and controls the operation of the devices.
  • processing component 232 may extract acoustic signals received by set of receivers 230 and process the signals to define a two-dimensional plane on which a base station lies and to determine a three-dimensional position of a digital pen relative to the base station.
  • Processing component 232 may project the three-dimensional position of the tip onto the two-dimensional plane and record the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • Application controller 235 may use the recorded positions to affect operation of e-notepad application 236 .
  • Platform 600 may include means for detecting first acoustic signals transmitted from a first set of transmitters disposed on digital pen 206 .
  • Platform 600 may further include means for detecting second acoustic signals transmitted from a second set of transmitters disposed on base station 208 .
  • Platform 600 may further include means for defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies.
  • Platform 600 may further include means for determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station.
  • Platform 600 may further include means for projecting the three-dimensional position of the tip onto the two-dimensional plane.
  • Platform 600 may further include means for recording the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • Control unit 604 may be provided by one or more processors 620 and associated memory 622 , hardware 624 , software 626 , and firmware 628 .
  • Control unit 604 includes a means for controlling display 310 and means for controlling application controller 235 .
  • Application controller 235 may be implemented in processor 620, hardware 624, firmware 628, or software 626, e.g., computer-readable media stored in memory 622 and executed by processor 620, or a combination thereof.
  • processor 620 can, but need not necessarily include, one or more microprocessors, embedded processors, controllers, application specific integrated circuits (ASICs), digital signal processors (DSPs), graphics processing units (GPUs), and the like.
  • processor is intended to describe the functions implemented by the system rather than specific hardware.
  • memory refers to any type of computer storage medium, including long term, short term, or other memory associated with the platform, and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware 624 , firmware 628 , software 626 , or any combination thereof.
  • the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
  • the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein.
  • software codes may be stored in memory 622 and executed by the processor 620 .
  • Memory may be implemented within the processor unit or external to the processor unit.
  • software 626 may include program codes stored in memory 622 and executed by processor 620 and may be used to run the processor and to control the operation of platform 600 as described herein.
  • a program code stored in a computer-readable medium, such as memory 622 may include program code to record an object based on movement of a digital pen relative to a base station.
  • the program code stored in a computer-readable medium may additionally include program code to cause the processor to control any operation of platform 600 as described further below.
  • the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer.
  • such computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.


Abstract

An acoustic tracking system is provided that includes receivers that detect first acoustic signals from a first set of transmitters disposed on a digital pen and second acoustic signals from a second set of transmitters disposed on a base station. The acoustic tracking system also includes a processing component that defines a two-dimensional plane on which the base station lies and determines a three-dimensional position of the digital pen relative to the base station. The processing component projects the three-dimensional position of the digital pen onto the two-dimensional plane and records, based on the projected three-dimensional position, the three-dimensional position of the digital pen relative to the base station, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 14/533,535 filed Nov. 5, 2014 that in turn claims the benefit of U.S. Provisional Application No. 62/040,977, filed Aug. 22, 2014, the contents of both of which are hereby incorporated by reference in their entirety.
  • TECHNICAL FIELD
  • Embodiments disclosed herein are generally directed to tracking, in an acoustic tracking system, a position of a digital pen relative to a base station.
  • BACKGROUND
  • A user may take notes using a computer. While the user may be able to type quickly, adding graphics, arrows, tables, or other objects to her notes may be difficult using the computer. The user may perform these actions more quickly on paper than on a laptop screen or even a tablet.
  • Nowadays, users may use a digital pen and an “electronic notepad” for note taking. Electronic notepads may enable the user to easily markup notes and draw objects onto a computer screen. Electronic notepads, however, may be cumbersome to use and may not give the feel of real writing to the user. For example, in a conventional electronic notepad system, an electronic notepad may include a clipboard that holds a piece of paper. The user may sketch on the paper while the electronic notepad digitally records the user's sketches. The user tethers the electronic notepad to a computer to transfer the data recorded by the electronic notepad to the computer. Tethering the electronic notepad to the computer in order to transfer the data, however, may be inconvenient for the user.
  • Additionally, the electronic notepad system may be expensive because of the onboard processing circuitry on the electronic notepad. The electronic notepad may also consume a lot of power.
  • SUMMARY
  • Consistent with some embodiments, there is provided an example acoustic tracking system. The acoustic tracking system includes a first plurality of receivers that detects first acoustic signals from a first set of transmitters disposed on a digital pen and a second plurality of receivers that detects second acoustic signals from a second set of transmitters disposed on a base station. The acoustic tracking system also includes a processing component that defines, based on the second acoustic signals, a two-dimensional plane on which the base station lies, determines, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station, and projects the three-dimensional position of the digital pen onto the two-dimensional plane. The acoustic tracking system further includes an application controller that records, based on the projected three-dimensional position of the digital pen onto the two-dimensional plane, the three-dimensional position of the digital pen relative to the base station, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • Consistent with some embodiments, there is provided an example method of recording an object based on movement of a digital pen relative to a base station. The method includes detecting, by a first plurality of receivers coupled to a computing device, first acoustic signals transmitted from a first set of transmitters disposed on a digital pen. The method also includes detecting, by a second plurality of receivers coupled to the computing device, second acoustic signals transmitted from a second set of transmitters disposed on a base station. The method further includes defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies. The method also includes determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station. The method further includes projecting the three-dimensional position of a tip of the digital pen onto the two-dimensional plane. The method also includes recording the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • Consistent with some embodiments, there is provided a computer-readable medium having stored thereon computer-executable instructions for performing operations, including: detecting first acoustic signals transmitted from a first set of transmitters disposed on a digital pen; detecting second acoustic signals transmitted from a second set of transmitters disposed on a base station; defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies; determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station; projecting the three-dimensional position of the digital pen onto the two-dimensional plane; and recording the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position represents an object representative of movement of the digital pen.
  • Consistent with some embodiments, there is provided a system for recording an object based on movement of a digital pen relative to a base station. The system includes means for detecting first acoustic signals transmitted from a first set of transmitters disposed on a digital pen. The system also includes means for detecting second acoustic signals transmitted from a second set of transmitters disposed on a base station. The system further includes means for defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies. The system also includes means for determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station. The system further includes means for projecting the three-dimensional position of the digital pen onto the two-dimensional plane. The system also includes means for recording the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram illustrating an example acoustic tracking system.
  • FIG. 2 is a diagram illustrating an acoustic tracking system, consistent with some embodiments.
  • FIG. 3 is a diagram illustrating a two-dimensional plane on which a base station lies, consistent with some embodiments.
  • FIG. 4 illustrates a base station, consistent with some embodiments.
  • FIG. 5 is a flowchart illustrating a method of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • FIG. 6 is a diagram illustrating a platform capable of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • In the drawings, elements having the same designation have the same or similar functions.
  • DETAILED DESCRIPTION
  • In the following description specific details are set forth describing certain embodiments. It will be apparent, however, to one skilled in the art that the disclosed embodiments may be practiced without some or all of these specific details. The specific embodiments presented are meant to be illustrative, but not limiting. One skilled in the art may realize other material that, although not specifically described herein, is within the scope and spirit of this disclosure.
  • FIG. 1 is a diagram illustrating an example acoustic tracking system 100. Acoustic tracking system 100 includes a computing device 102, a notepad 104, and a digital pen 106. One or more transmitters that transmit acoustic signals may be disposed on digital pen 106, and one or more receivers that detect the acoustic signals may be disposed on computing device 102. A user may write on notepad 104 using digital pen 106. As the user writes on notepad 104 using digital pen 106, digital pen 106 may transmit acoustic signals that are detected by computing device 102. Based on the acoustic signals transmitted from digital pen 106, computing device 102 may determine a position of digital pen 106 and record what the user is writing on notepad 104.
  • In FIG. 1, the user wrote “ab” on notepad 104, which is shown on a display of computing device 102. If the user moves notepad 104 and continues writing, the objects drawn by the user after the movement of notepad 104 may not be properly aligned with what is displayed on computing device 102. An object may refer to text, graphics, or other symbols. For example, the “c” drawn by the user after notepad 104 was moved is not aligned with “ab” displayed on computing device 102. To solve this problem, it may be desirable for computing device 102 to determine a position of notepad 104 relative to the position of digital pen 106.
  • It may be difficult, however, for computing device 102 to determine the position of notepad 104 relative to the position of digital pen 106. A possible solution to this problem may be to provide a fold cover for computing device 102 such that when the user opens the fold cover, notepad 104 is on one side and computing device 102 is on the other side of the fold cover. It may be inconvenient for the user, however, to carry the fold cover around and also to use notepad 104 in the fold cover. Additionally, the expense of purchasing a fold cover may add to the expense of using acoustic tracking system 100.
  • FIG. 2 is a diagram illustrating an acoustic tracking system 200, consistent with some embodiments. Acoustic tracking system 200 may be used with devices such as smartphones, tablets, laptops, desktops, and personal digital assistants (PDAs). One example of an acoustic signal-based position tracking system is a digital pen having one or more acoustic signal transmitters and a base station having one or more acoustic signal transmitters, where the acoustic signal transmitters are in communication with one or more receivers coupled to a computing device. In such an example, the digital pen and base station may interact with the computing device by transmitting acoustic signals, as will be discussed further below.
  • In FIG. 2, acoustic tracking system 200 includes a computing device 202, a notepad 204, a digital pen 206, and a base station 208. Digital pen 206 includes a set of transmitters 210 including transmitter 210A and transmitter 210B that transmit acoustic signals. Transmitter 210A may be located near or at the tip of the digital pen, and transmitter 210B may be located along a length of the digital pen. Transmitter 210A may be located within a proximity to a tip of the digital pen. For example, transmitter 210A may be located within 0.5 millimeters of the tip of the digital pen. A transmitter may also be referred to as an emitter.
  • Although two transmitters are illustrated as being disposed on digital pen 206, other embodiments having more than two transmitters are within the scope of this disclosure. In principle, more than two transmitters may be disposed on the digital pen if power, design complexity, and system robustness allow for it. Interference from adjacent transmitters on the digital pen, however, may arise and may depend on the signal pattern design. Orthogonal sequences with ideal correlation properties may be used for the transmitter pattern design. Further, a higher quantity of transmitters may violate the assumption of zero-mean range measurement noise and result in higher noise and less position tracking accuracy.
  • Base station 208 includes a set of transmitters 220 including transmitters 220A-220D that transmit acoustic signals. Although four transmitters are illustrated as being disposed on base station 208, other embodiments having two or more transmitters are within the scope of this disclosure. A user may attach base station 208 to notepad 204 and write on notepad 204 using digital pen 206. If the user moves notepad 204, base station 208 also moves along with notepad 204.
  • Both digital pen 206 and base station 208 emit acoustic signals that are received by computing device 202. Computing device 202 includes a set of receivers 230 for picking up the signals transmitted by set of transmitters 210 and set of transmitters 220. Set of receivers 230 may be coupled to computing device 202 and may continuously run such that they are always ready to receive input from the transmitters when computing device 202 is turned on. In another example, set of receivers 230 does not continuously run but wakes up periodically to receive input from the transmitters.
  • In some embodiments, set of transmitters 210 and/or set of transmitters 220 may transmit a signal pattern of acoustic waves, such as an ultrasonic signal. The transmitters may be any suitable ultrasonic device that includes one or more ultrasonic transducers to generate ultrasonic signals (e.g., speakers). Set of receivers 230 may be any suitable acoustic receivers such as a microphone, and set of transmitters 210 and/or set of transmitters 220 may transmit ultrasonic signals to multiple microphones coupled to computing device 202.
  • Computing device 202 may include a processing component 232 and a memory 234. In some embodiments, processing component 232 may be one or more processors, central processing units (CPUs), image signal processors (ISPs), micro-controllers, digital signal processors (DSPs), graphics processing units (GPUs), or audio signal processors, which may include analog and/or digital audio signal processors. Processing component 232 may be provided as hardware, software, or firmware, or combinations thereof in various embodiments.
  • Memory 234 may include a system memory component, which may correspond to random access memory (RAM), an internal memory component, which may correspond to read only memory (ROM), and an external or static memory, which may correspond to optical, magnetic, or solid-state memories, for example. Memory 234 may correspond to a non-transitory machine-readable medium that includes, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, and/or any other medium from which processing component 232 is capable of reading.
  • In some embodiments, processing component 232 extracts acoustic signals received by each receiver of set of receivers 230. For each acoustic signal received at a receiver, processing component 232 may determine which transmitter transmitted the respective acoustic signal. The received acoustic signals may be compared to an expected signal by processing component 232 according to instructions stored in memory 234 and an expected signal stored in memory 234 or generated by processing component 232, for example.
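The comparison of a received signal to an expected signal can be sketched as a simple matched-filter search: correlate the received samples against each transmitter's expected pattern and pick the best match. The patterns, scoring metric, and names below are assumptions, not the application's implementation.

```python
# Illustrative matched-filter sketch (patterns, metric, and names are
# assumptions): decide which transmitter sent a received signal by sliding
# each expected pattern over the received samples and keeping the best
# dot-product score.

def correlation_score(received, expected):
    """Peak sliding dot product of expected against received."""
    n, m = len(received), len(expected)
    best = float("-inf")
    for lag in range(n - m + 1):
        score = sum(received[lag + i] * expected[i] for i in range(m))
        best = max(best, score)
    return best

def identify_transmitter(received, expected_patterns):
    """Return the id of the transmitter whose pattern matches best."""
    return max(expected_patterns,
               key=lambda tx: correlation_score(received, expected_patterns[tx]))
```

For example, given toy patterns for two transmitters, a received buffer containing the first pattern would be attributed to that transmitter.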
  • Memory 234 may include an application controller 235 and an electronic notepad application 236. Application 236 receives input from application controller 235 regarding the positions of base station 208 and of digital pen 206 relative to base station 208. Application controller 235 affects operation of an application based on determined positions of base station 208 and of digital pen 206 relative to base station 208.
  • Computing device 202 is a separate computer from base station 208 and digital pen 206 and processes the acoustic signals received from set of transmitters 210 and set of transmitters 220. It may be advantageous for computing device 202 to perform these calculations rather than digital pen 206 or base station 208 because computing device 202 may have a longer battery life compared to digital pen 206 or base station 208. Additionally, computing device 202 may have a better performing processor compared to digital pen 206 or base station 208. As such, computing device 202's processing power may be leveraged and base station 208 and digital pen 206 may have minimal power requirements. It should also be understood, however, that digital pen 206 may include software that performs some of the calculations that are described as being performed by processing component 232.
  • Base station 208 transmits acoustic signals 240. In some embodiments, computing device 202 detects, by a plurality of receivers of set of receivers 230, acoustic signals 240 transmitted from set of transmitters 220 disposed on base station 208. Processing component 232 may extract the acoustic signals received by the plurality of receivers and define, based on acoustic signals 240, a two-dimensional plane on which base station 208 lies.
  • FIG. 3 is a diagram illustrating a two-dimensional plane 302 on which base station 208 lies, consistent with some embodiments. FIGS. 2 and 3 will be discussed together to better explain how objects drawn by the user on notepad 204 using digital pen 206 are captured by computing device 202.
  • In some embodiments, base station 208 may transmit acoustic signals 240 that may be detected by computing device 202 and used by processing component 232 to define a two-dimensional plane on which base station 208 lies. In particular, processing component 232 may define the two-dimensional plane by determining a location of at least three of the transmitters disposed on base station 208. The position accuracy may be dependent on transducer placement, signal sequence design, and transducer acoustics (e.g., porting and signal-to-noise ratio). It should be understood that the fourth transmitter (e.g., transmitter 220D) may be optional in a three-emitter configuration. The fourth transmitter, however, may provide a more accurate result and may also be used if one of the other transmitters does not have a clear line of sight with receivers coupled to computing device 202.
  • Each transmitter (of set of transmitters 210 or set of transmitters 220) may transmit a different acoustic signal pattern. The different acoustic signal patterns may help to maintain a clear line of sight between the transmitters and receivers. In some embodiments, a multiplexing technique may be used to properly control the power used by these transmitters. In an example, transmitters may transmit acoustic signals using Time Division Multiple Access (TDMA). For example, a plurality of receivers of set of receivers 230 may receive at a first time slot a first ultrasonic signal from transmitter 220A, receive at a second time slot a second ultrasonic signal from transmitter 220B, receive at a third time slot a third ultrasonic signal from transmitter 220C, and/or receive at a fourth time slot a fourth ultrasonic signal from transmitter 220D. The transmission of signals at different time slots may reduce the interference noise.
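A round-robin TDMA schedule of the kind described above might be sketched as follows; the slot duration and transmitter labels are illustrative assumptions, not values from the disclosure:

```python
# Illustrative TDMA schedule: each transmitter owns one slot per frame.
SLOT_MS = 10  # assumed slot duration in milliseconds
TRANSMITTERS = ["220A", "220B", "220C", "220D"]

def slot_owner(t_ms):
    """Return the transmitter allowed to emit at time t_ms (round-robin)."""
    return TRANSMITTERS[int(t_ms // SLOT_MS) % len(TRANSMITTERS)]
```

Because only one transmitter emits per slot, a receiver can attribute each detected pulse to a transmitter from its arrival time alone.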
  • In some embodiments, transmitters may transmit acoustic signals using multiplexing techniques different from TDMA. In an example, transmitters may transmit acoustic signals using Frequency Division Multiplexing (FDM). For example, a plurality of receivers of set of receivers 230 may receive at a first frequency sub-band a first ultrasonic signal from transmitter 220A, receive at a second frequency sub-band a second ultrasonic signal from transmitter 220B, receive at a third frequency sub-band a third ultrasonic signal from transmitter 220C, and/or receive at a fourth frequency sub-band a fourth ultrasonic signal from transmitter 220D.
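A minimal sketch of FDM-style separation at the receiver: the 192 kHz sample rate and the sub-band assignments below are hypothetical, chosen only to illustrate attributing a received frame to the sub-band carrying the most energy.

```python
import numpy as np

FS = 192_000  # assumed sample rate in Hz (illustrative)
# Hypothetical ultrasonic sub-band (Hz) assigned to each transmitter.
SUB_BANDS = {"220A": (38_000, 40_000), "220B": (40_000, 42_000),
             "220C": (42_000, 44_000), "220D": (44_000, 46_000)}

def band_energy(samples, band):
    """Energy of the signal within one frequency sub-band."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / FS)
    lo, hi = band
    return np.sum(np.abs(spectrum[(freqs >= lo) & (freqs < hi)]) ** 2)

def strongest_transmitter(samples):
    """Attribute a received frame to the transmitter whose assigned
    sub-band carries the most energy."""
    return max(SUB_BANDS, key=lambda t: band_energy(samples, SUB_BANDS[t]))
```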
  • In another example, transmitters 220A-220D may transmit acoustic signals using Phase Division Multiplexing (PDM). For example, a plurality of receivers of set of receivers 230 may receive at a first phase of a channel a first ultrasonic signal from transmitter 220A, receive at a second phase of the channel a second ultrasonic signal from transmitter 220B, receive at a third phase of the channel a third ultrasonic signal from transmitter 220C, and receive at a fourth phase of the channel a fourth ultrasonic signal from transmitter 220D. Accordingly, different frequencies or different durations may be used by the transmitters. The acoustic signals may be emitted simultaneously if the transmitters emit at different frequencies relative to each other.
  • Processing component 232 may calculate a time difference of arrival (TDOA) for the acoustic signals received at the receivers and apply a least squares formula to the one or more calculated time differences of arrival to determine the position of at least three of transmitters 220A, 220B, 220C, and 220D. Although processing component 232 has been described as applying the least squares formula, this is not intended to be limiting. Processing component 232 may apply other formulas to determine a more accurate position of base station 208. For example, a Kalman filter may be applied to determine a more accurate position of base station 208. Processing component 232 may then define, based on the positions of at least three of transmitters 220A, 220B, 220C, and 220D, two-dimensional plane 302 on which base station 208 lies.
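The TDOA least-squares step might be sketched as an iterative Gauss-Newton solve over range differences; the speed of sound, iteration count, and convergence tolerance below are illustrative assumptions, not the patented implementation:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def tdoa_least_squares(receivers, tdoas, x0, iterations=50):
    """Estimate a transmitter position from time differences of arrival.

    receivers: (N, 3) array of receiver coordinates; receiver 0 is the
               reference against which the other arrivals are timed.
    tdoas:     (N-1,) arrival-time differences t_i - t_0 in seconds.
    x0:        initial guess for the transmitter position.
    """
    x = np.asarray(x0, dtype=float)
    range_diffs = np.asarray(tdoas) * SPEED_OF_SOUND  # meters
    for _ in range(iterations):
        ranges = np.linalg.norm(receivers - x, axis=1)
        residuals = (ranges[1:] - ranges[0]) - range_diffs
        # Each residual's gradient is a difference of unit vectors
        # pointing from the receivers toward the current estimate.
        units = (x - receivers) / ranges[:, None]
        J = units[1:] - units[0]
        step, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-9:  # converged
            break
    return x
```

Running one solve per transmitter yields the three (or four) transmitter positions from which the plane is then defined.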
  • In an example, transmitter 220A transmits an “origin” acoustic signal that is followed by acoustic signals transmitted by transmitter 220B, then transmitter 220C, and then transmitter 220D. The orientation of base station 208 may be determined according to the position of transmitters 220B, 220C, and/or 220D relative to transmitter 220A. For instance, the transmitters disposed on base station 208 may be arranged such that AB > AC, where segment AB lies in a vertical direction and segment AC lies in a horizontal direction relative to a portrait-oriented piece of paper. Processing component 232 may be aware of this configuration and use this information to define two-dimensional plane 302 on which base station 208 lies.
  • In an embodiment, base station 208 transmits its location continuously via acoustic signals 240. In another embodiment, base station 208 includes a motion sensor (not shown), and set of transmitters 220 is activated when the motion sensor detects that base station 208 has moved. Accordingly, it may be unnecessary for base station 208 to transmit its location continuously, thus reducing power consumption at base station 208. The motion sensor is coupled to the base station. In an example, the motion sensor is disposed on the base station.
  • In another embodiment, base station 208 includes a button (not shown) that the user presses to instruct set of transmitters 220 to emit acoustic signals. In such an embodiment, it may be unnecessary for base station 208 to transmit its location continuously, thus reducing power consumption at base station 208. Rather, when the user moves notepad 204, the user may inform computing device 202 of notepad 204's new location by pressing the button.
  • Processing component 232 may compute two-dimensional plane 302. In an example, processing component 232 determines three points, P1, P2, and P3. Point P1 corresponds to the bottom left corner of notepad 204 or of base station 208, point P2 corresponds to the bottom right corner of notepad 204 or of base station 208, and point P3 corresponds to the top left corner (e.g., of the rectangle) of notepad 204 or of base station 208, respectively. The equation of two-dimensional plane 302 passing through the points P1=(x1, y1, z1), P2=(x2, y2, z2), P3=(x3, y3, z3) is of the form aX+bY+cZ+d=0 and can be obtained by requiring that the values a, b, c, and d satisfy the following equations (1)-(3):

  • ax1 + by1 + cz1 + d = 0  (1),

  • ax2 + by2 + cz2 + d = 0  (2),

  • ax3 + by3 + cz3 + d = 0  (3),
  • with [a, b, c, d] not being equal to [0, 0, 0, 0], and where d = -(axi + byi + czi) for any of i=1, 2, 3. The system of equations formed by equations (1)-(3) is solved up to a multiplicative factor for the four-dimensional vector [a, b, c, d]. In an example, let i, j, and k be the unit vectors corresponding to the x, y, and z axes of the three-dimensional space (see x, y, and z axes in FIG. 3). Processing component 232 may determine the two vectors {right arrow over (P1P2)} and {right arrow over (P1P3)}, which are shown in equations (4)-(5):

  • {right arrow over (P1P2)} = (x2 - x1)i + (y2 - y1)j + (z2 - z1)k  (4),

  • {right arrow over (P1P3)} = (x3 - x1)i + (y3 - y1)j + (z3 - z1)k  (5).
  • Two-dimensional plane 302 is the plane that passes through the two vectors {right arrow over (P1P2)} and {right arrow over (P1P3)} (which lie on the same plane because they have one point in common) and may be computed by determining the plane whose normal is the vector orthogonal to vectors {right arrow over (P1P2)} and {right arrow over (P1P3)}.
  • The normal vector is the cross product of {right arrow over (P1P2)} and {right arrow over (P1P3)}, which is shown in equation (6):
  • {right arrow over (P1P2)} × {right arrow over (P1P3)} = det
      | i          j          k        |
      | x2 - x1    y2 - y1    z2 - z1  |
      | x3 - x1    y3 - y1    z3 - z1  |
    = ai + bj + ck  (6),
  • Equations (1)-(6) are example equations that may be used when three transmitters are disposed on base station 208. As discussed, fewer than or more than three transmitters may be disposed on base station 208. If more than three transmitters are disposed on base station 208, more complex relations may be used.
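The plane construction of equations (1)-(6) can be sketched directly with a cross product; this is a minimal illustration under the assumption of three non-collinear transmitter positions, not the patented implementation:

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Return (a, b, c, d) such that a*x + b*y + c*z + d = 0 passes
    through three non-collinear points, as in equations (1)-(6)."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    # Normal [a, b, c] is the cross product of the two edge vectors,
    # per equation (6).
    normal = np.cross(p2 - p1, p3 - p1)
    # Any of the three points fixes d, per equations (1)-(3).
    d = -normal.dot(p1)
    return normal[0], normal[1], normal[2], d
```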
  • Two-dimensional plane 302 is used in conjunction with the three-dimensional position of digital pen 206 to record objects drawn by the user on notepad 204 using the digital pen. In particular, computing device 202 may determine the location of digital pen 206 relative to two-dimensional plane 302 in order to properly display the drawn objects (whether or not notepad 204 has been moved). Computing device 202 works with two sets of coordinates, one set of coordinates (e.g., the three-dimensional location of digital pen 206) that is to be interpreted relative to the other set of coordinates (e.g., two-dimensional plane 302). For example, the three-dimensional coordinates of digital pen 206 may be transformed into a two-dimensional coordinate system that is relative to that of notepad 204 and its spatial orientation. These additional notepad-relative coordinates may be made available to applications executing on computing device 202 (e.g., application 236) to record objects drawn by the user on notepad 204 using digital pen 206.
  • The user may write “on-screen” by physically touching digital pen 206 with notepad 204. As the user writes on notepad 204 using digital pen 206, set of transmitters 210 emit acoustic signals 242, and set of receivers 230 coupled to computing device 202 may detect the emitted signals. Acoustic signals 242 are processed by processing component 232 to derive a three-dimensional position of the digital pen. In particular, processing component 232 may track a position of each of the transmitters disposed on the digital pen and determine a three-dimensional position of a tip of the digital pen (e.g., a location of transmitter 210A).
  • In some embodiments, computing device 202 detects, by a plurality of receivers of set of receivers 230, acoustic signals 242 transmitted from set of transmitters 210 disposed on digital pen 206. Processing component 232 may extract the acoustic signals received by the plurality of receivers and determine, based on acoustic signals 240 and 242, a three-dimensional position of digital pen 206 relative to base station 208.
  • Processing component 232 may use TDOA to determine the three-dimensional position of digital pen 206 and also to determine a three-dimensional position of a tip of digital pen 206 relative to base station 208. The position accuracy may be a key performance parameter for the digital pen and may be dependent on transducer placement, signal sequence design, and transducer acoustics (e.g., porting and signal-to-noise ratio). Processing component 232 may use, for example, TDOA and apply a least squares formula to the one or more calculated time differences of arrival to determine the position(s) of transmitters 210A and/or 210B. Other techniques may be used to determine the three-dimensional position of digital pen 206. In another example, processing component 232 may use Time of Flight (TOF) of the transmitted pulses through a line of sight (LOS).
  • Referring to FIG. 3, processing component 232 may determine a distance 306 between digital pen 206 and two-dimensional plane 302. In an example, two-dimensional plane 302 is defined by equation (7):

  • ax+by+cz+d=0  (7),
  • and the point P4=[x0, y0, z0] in space represents the position of the tip of digital pen 206. A normal vector 380 that is normal to two-dimensional plane 302 and a generic vector {right arrow over (v)} 382 from two-dimensional plane 302 to the point P4 may be defined for any point (x, y, z) on two-dimensional plane 302 (e.g., any point satisfying equation (7)). Distance 306 (the distance from point P4 to two-dimensional plane 302) may be determined by projecting vector {right arrow over (v)} 382 onto normal vector 380 as shown in equation (8):
  • D = |proj_n {right arrow over (v)}| = |n · {right arrow over (v)}| / |n|  (8),
  • which may be simplified to equation (9):
  • D = |ax0 + by0 + cz0 + d| / √(a² + b² + c²)  (9).
  • If distance 306 is smaller than a given threshold D0, point P4 is replaced with its projection, which is shown in equation (10):
  • proj_n {right arrow over (v)} = (n · {right arrow over (v)} / |n|²) n = ((ax0 + by0 + cz0 + d) / (a² + b² + c²)) [a, b, c]  (10).
  • Distance 306 may be the distance between a point on digital pen 206 (e.g., tip of digital pen 206 or on transmitter 210A or 210B) and two-dimensional plane 302. If distance 306 is within a threshold distance, digital pen 206 may be close enough to notepad 204 to determine that the user is using digital pen 206 to write on notepad 204. In response to determining that distance 306 is within the threshold distance, processing component 232 projects a three-dimensional position of digital pen 206 onto two-dimensional plane 302. Processing component 232 may use the three-dimensional position of a tip of digital pen 206 for the projection. For example, in response to determining that the location of digital pen 206 is close enough to two-dimensional plane 302 (within the threshold distance), processing component 232 may transform the three-dimensional position of the tip of digital pen 206 into a two-dimensional coordinate system that is relative to that of base station 208 and its spatial orientation. Accordingly, even if the location or orientation of notepad 204 changes, the user may continue writing on notepad 204 and computing device 202 may determine the position of digital pen 206 relative to the base station, allowing for proper alignment of the objects drawn on notepad 204.
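Equations (7)-(10) and the threshold test can be sketched as follows; the 5 mm pen-down threshold is an assumed value chosen only for illustration:

```python
import numpy as np

PEN_DOWN_THRESHOLD = 0.005  # assumed 5 mm "touching" threshold (illustrative)

def distance_to_plane(p, plane):
    """Distance from point p to plane ax + by + cz + d = 0, per equation (9)."""
    a, b, c, d = plane
    return abs(a * p[0] + b * p[1] + c * p[2] + d) / np.sqrt(a*a + b*b + c*c)

def project_onto_plane(p, plane):
    """Orthogonal projection of p onto the plane, built from equation (10)."""
    a, b, c, d = plane
    n = np.array([a, b, c], dtype=float)
    t = (n.dot(p) + d) / n.dot(n)
    return np.asarray(p, dtype=float) - t * n

def pen_sample(tip, plane):
    """Return the projected tip point when the pen is within the threshold
    distance of the plane, otherwise None (pen hovering)."""
    if distance_to_plane(tip, plane) <= PEN_DOWN_THRESHOLD:
        return project_onto_plane(tip, plane)
    return None
```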
  • Processing component 232 may also determine a defined boundary corresponding to notepad 204. In an example, processing component 232 translates the positions of transmitters 220A-220D into a defined boundary. In response to determining that the location of digital pen 206 is within the defined boundary that lies on two-dimensional plane 302, processing component 232 may transform the three-dimensional position of a tip of digital pen 206 into a two-dimensional coordinate system that is relative to that of base station 208 and its spatial orientation. The defined boundary may correspond to a position of notepad 204 and its length and width dimensions.
  • In an example, processing component 232 determines three points, P1, P2, and P3. Point P1 corresponds to the bottom left corner of notepad 204 or of base station 208, point P2 corresponds to the bottom right corner of notepad 204 or of base station 208, and point P3 corresponds to the top left corner (e.g., of the rectangle) of notepad 204 or of base station 208, respectively. In an example, the x-axis is {right arrow over (P1P2)}, and the y-axis is {right arrow over (P1P3)}. Each point Pi has (x, y, z)-coordinates (xi, yi, zi). Points P1, P2, and P3 may be detected and distinguished from each other because the transmitters disposed on base station 208 transmit at different frequencies (e.g., “beep” at different frequencies) or transmit unique patterns at different times.
  • Processing component 232 may perform the following projections:
  • Point P1 to a point in the plane where z=0 with (x, y) coordinates (0, 0) (e.g., three-dimensional coordinates [0, 0, 0]);
  • Point P2 to a point in the plane where z=0 with (x, y) coordinates (w, 0) (e.g., three-dimensional coordinates [w, 0, 0]), where w is the width of the rectangle on base station 208 or notepad 204 defined by the transmitters disposed on digital pen 206; and
  • Point P3 to a point in the plane where z=0 with (x, y) coordinates (0, h) (e.g., three-dimensional coordinates [0, h, 0]), where h is the height of the rectangle on base station 208 or notepad 204 defined by the transmitters disposed on digital pen 206. The projection may be performed, for instance, with a 3×3 matrix transformation A and a three-dimensional translation vector {right arrow over (v2)}, as shown in equations (11)-(13):

  • P1·A + v = [0, 0, 0]  (11),

  • P2·A + v = [w, 0, 0]  (12),

  • P3·A + v = [0, h, 0]  (13).
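One way to realize the mapping of equations (11)-(13), assuming a rectangular notepad so that the two edge vectors are perpendicular, is to project the pen position onto the edge directions; this is an illustrative sketch, not the patented transform:

```python
import numpy as np

def notepad_coords(point, p1, p2, p3):
    """Map a 3D point near the notepad plane to (x, y) coordinates in the
    notepad frame: p1 -> (0, 0), p2 -> (w, 0), p3 -> (0, h)."""
    p1, p2, p3, point = map(np.asarray, (p1, p2, p3, point))
    ex = p2 - p1  # x-axis along the bottom edge, length w
    ey = p3 - p1  # y-axis along the left edge, length h
    r = point - p1
    # Assuming perpendicular edges (a rectangular notepad), the notepad
    # coordinates are plain projections onto the edge directions.
    x = r.dot(ex) / np.linalg.norm(ex)
    y = r.dot(ey) / np.linalg.norm(ey)
    return x, y
```

Because the coordinates are expressed relative to the corners, they stay valid when the notepad is moved, as long as the corner positions are re-measured.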
  • Processing component 232 may record the three-dimensional position of the tip of digital pen 206 based on the projection, where the recorded three-dimensional position of the tip represents an object that is drawn by the user on notepad 204 using digital pen 206. In an example, the three-dimensional position of the tip is represented as three numbers (e.g., the x, y, and z coordinates). In particular, the object is representative of movement of the digital pen 206 across notepad 204. These coordinates may be made available to applications (e.g., e-notepad application 236) that are executing on computing device 202 and that record the user's writing on notepad 204. Processing component 232 may store the coordinates in memory 234 and display the object on a display 310 coupled to computing device 202.
  • As discussed above and further emphasized here, FIGS. 1-3 are merely examples, which should not unduly limit the scope of the claims. Other embodiments of base station 208 are within the scope of the disclosure. For example, FIG. 4 illustrates a base station 208, consistent with some embodiments. In FIG. 4, base station 208 includes four processing units or “buttons” 402, 404, 406, and 408 that each contain a battery, a loudspeaker, and a control unit, and that can be glued onto a surface. The processing units may be attached to three or four corners of a clipboard of the user's choice. The clipboard may be any generic clipboard. In such an embodiment, the clipboard may be referred to as the base station.
  • The three or more processing units may emit their pulses at different times once moved, at different frequencies/signals (to allow computing device 202 to better detect their positions), or a combination of these. In FIG. 4, base station 208 does not contain set of receivers 230. Rather, base station 208 interacts with computing device 202, which contains set of receivers 230. Accordingly, it may be unnecessary for base station 208 to process the positions of the processing units or to store them on the clipboard for later retrieval by computing device 202. This embodiment may provide cost savings because it may be expensive to manufacture a base station 208 that processes and stores coordinates for later use. Rather, computing device 202 may be used to process and store the coordinates, and base station 208 may use less complicated onboard circuitry compared to a base station that performs these actions.
  • Additionally, in some embodiments, processing component 232 may calculate a TDOA for each acoustic signal received at a plurality of receivers of set of receivers 230. Acoustic tracking systems may determine a position based on a TDOA without using a synchronization channel. In this way, it may be unnecessary to add additional hardware to acoustic tracking system 200 and to modify software based on the additional hardware. Non-synchronized systems may use multiple receivers for receiving the emitted acoustic signal and calculating a Differential Time of Arrival (“DTOA”), which is a time delay measured between the multiple receivers.
  • Additionally, although TDOA is described as a technique to determine the position of a transmitter, this is not intended to be limiting and other techniques may be used. For example, an acoustic tracking system may determine the position of the transmitter based on a Time of Arrival (TOA) in a synchronized system. Synchronized systems may use a synchronization signal that travels faster than the speed of sound and is transmitted to the receiver for synchronizing the clocks of the transmitter and receiver. Additional modules may be placed on computing device 202 to receive the synchronization signal from the transmitters.
  • Based on the received signal, processing component 232 may calculate the TOF and may perform triangulation or another form of multilateration to determine the position of the transmitting device as a function of time. In synchronized acoustic signal-based position systems, an infrared (IR) signal may be used for synchronization due to its low cost and low power requirements. Another synchronization signal that may be used is a radio wave synchronization signal. However, using a radio wave as a synchronization signal may require a dedicated hardware synchronization block between the radio wave circuitry and the audio processing circuitry to maintain the required synchronization. Moreover, generating and receiving a radio wave synchronization signal may use more power than generating and receiving an IR signal.
  • FIG. 5 is a flowchart illustrating a method 500 of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments. Method 500 is not meant to be limiting and may be used in other applications.
  • Method 500 includes blocks 502-512. In a block 502, first acoustic signals transmitted from a first set of transmitters disposed on a digital pen are detected by a first plurality of receivers coupled to a computing device. In an example, a first plurality of receivers of set of receivers 230 detects acoustic signals 242 transmitted from set of transmitters 210 disposed on digital pen 206.
  • In a block 504, second acoustic signals transmitted from a second set of transmitters disposed on a base station are detected by a second plurality of receivers coupled to the computing device. In an example, a second plurality of receivers of set of receivers 230 detects acoustic signals 240 transmitted from set of transmitters 220 disposed on base station 208. One or more receivers of the first and second pluralities of receivers may overlap.
  • In a block 506, a two-dimensional plane on which the base station lies is defined based on the second acoustic signals. In an example, processing component 232 defines, based on acoustic signals 240, two-dimensional plane 302 on which base station 208 lies. In a block 508, a three-dimensional position of the digital pen relative to the base station is determined based on the first and second acoustic signals. In an example, processing component 232 determines, based on acoustic signals 240 and 242, a three-dimensional position of digital pen 206 relative to base station 208.
  • In a block 510, the three-dimensional position of the digital pen is projected onto the two-dimensional plane. In an example, processing component 232 projects the three-dimensional position of digital pen 206 onto two-dimensional plane 302. Processing component 232 may project the three-dimensional position of the tip of digital pen 206 onto two-dimensional plane 302.
  • In a block 512, the three-dimensional position of the digital pen is recorded based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen. In an example, processing component 232 records the three-dimensional position of the digital pen based on the projecting, where the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen. Processing component 232 may record the three-dimensional position of the digital pen's tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • It is also understood that additional processes may be performed before, during, or after blocks 502-512 discussed above. For example, method 500 may include a process of extracting acoustic signals received by each receiver of set of receivers 230. It is also understood that one or more of the blocks of method 500 described herein may be omitted, combined, or performed in a different sequence as desired.
  • FIG. 6 is a diagram illustrating a platform capable of recording an object based on movement of a digital pen relative to a base station, consistent with some embodiments.
  • Computing device 202 may run a platform 600. Platform 600 includes a user interface 602 that is in communication with a control unit 604, e.g., control unit 604 accepts data from set of receivers 230 and controls user interface 602. User interface 602 includes display 310, which includes a means for displaying graphics, text, and images, such as an LCD or LPD display.
  • User interface 602 may further include a keypad 610 or other input device through which the user can input information into the platform 600. If desired, keypad 610 may be obviated by integrating a virtual keypad into display 310. It should be understood that with some configurations of platform 600, portions of user interface 602 may be physically separated from control unit 604 and connected to control unit 604 via cables or wirelessly, for example, in a Bluetooth headset.
  • Control unit 604 accepts and processes data from set of receivers 230 and controls the operation of the devices. For example, processing component 232 may extract acoustic signals received by set of receivers 230 and process the signals to define a two-dimensional plane on which a base station lies and to determine a three-dimensional position of a digital pen relative to the base station. Processing component 232 may project the three-dimensional position of the tip onto the two-dimensional plane and record the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen. Application controller 235 may use the recorded positions to affect operation of e-notepad application 236.
  • Platform 600 may include means for detecting first acoustic signals transmitted from a first set of transmitters disposed on digital pen 206. Platform 600 may further include means for detecting second acoustic signals transmitted from a second set of transmitters disposed on base station 208. Platform 600 may further include means for defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies. Platform 600 may further include means for determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station. Platform 600 may further include means for projecting the three-dimensional position of the tip onto the two-dimensional plane. Platform 600 may further include means for recording the three-dimensional position of the tip based on the projecting, where the recorded three-dimensional position of the tip represents an object representative of movement of the digital pen.
  • Control unit 604 may be provided by one or more processors 620 and associated memory 622, hardware 624, software 626, and firmware 628. Control unit 604 includes a means for controlling display 310 and means for controlling application controller 235. Application controller 235 may be implemented in processor 620, hardware 624, firmware 628, or software 626, e.g., computer readable media stored in memory 622 and executed by processor 620, or a combination thereof.
  • As discussed above and further emphasized here, FIGS. 1-6 are merely examples that should not unduly limit the scope of the claims. It will also be understood as used herein that processor 620 can, but need not necessarily include, one or more microprocessors, embedded processors, controllers, application specific integrated circuits (ASICs), digital signal processors (DSPs), graphics processing units (GPUs), and the like. The term processor is intended to describe the functions implemented by the system rather than specific hardware. Moreover, as used herein the term “memory” refers to any type of computer storage medium, including long term, short term, or other memory associated with the platform, and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware 624, firmware 628, software 626, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
  • For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in memory 622 and executed by the processor 620. Memory may be implemented within the processor unit or external to the processor unit.
  • For example, software 626 may include program codes stored in memory 622 and executed by processor 620 and may be used to run the processor and to control the operation of platform 600 as described herein. A program code stored in a computer-readable medium, such as memory 622, may include program code to record an object based on movement of a digital pen relative to a base station. The program code stored in a computer-readable medium may additionally include program code to cause the processor to control any operation of platform 600 as described further below.
  • If implemented in firmware and/or software, the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • One skilled in the art may readily devise other systems consistent with the disclosed embodiments which are intended to be within the scope of this disclosure. The foregoing disclosure is not intended to limit the present disclosure to the precise forms or particular fields of use disclosed. As such, it is contemplated that various alternate embodiments and/or modifications to the present disclosure, whether explicitly described or implied herein, are possible in light of the disclosure. Changes may be made in form and detail without departing from the scope of the present disclosure. Thus, the present disclosure is limited only by the claims.

Claims (20)

What is claimed is:
1. An acoustic tracking system, comprising:
a first plurality of receivers that detects first acoustic signals from a first set of transmitters disposed on a digital pen;
a second plurality of receivers that detects second acoustic signals from a second set of transmitters disposed on a base station, wherein the base station is coupled to a motion detector that detects movement of the base station, and in response to the movement, the base station activates the second set of transmitters to emit the second acoustic signals;
a processing component that defines, based on the second acoustic signals, a two-dimensional plane on which the base station lies, determines, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station, and projects the three-dimensional position of the digital pen onto the two-dimensional plane; and
an application controller that records, based on the projected three-dimensional position of the digital pen onto the two-dimensional plane, the three-dimensional position of the digital pen relative to the base station, wherein the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
2. The acoustic tracking system of claim 1, wherein the first plurality of receivers includes one or more microphones.
3. The acoustic tracking system of claim 1, wherein the processing component determines a first point corresponding to a bottom left corner of the base station, determines a second point corresponding to a bottom right corner of the base station, and determines a third point corresponding to a top corner of the base station, and wherein an equation of the two-dimensional plane passes through the first, second, and third points.
4. The acoustic tracking system of claim 1, wherein the base station is attached to a notepad.
5. The acoustic tracking system of claim 4, wherein the processing component defines a boundary that corresponds to the notepad and lies on the two-dimensional plane.
6. The acoustic tracking system of claim 5, wherein the processing component determines a position of one or more transmitters of the second set of transmitters disposed on the base station.
7. The acoustic tracking system of claim 6, wherein the processing component translates the positions of the second set of transmitters into the defined boundary.
8. The acoustic tracking system of claim 5, wherein the defined boundary corresponds to a position of the notepad.
9. The acoustic tracking system of claim 5, wherein the defined boundary corresponds to a length of the notepad.
10. The acoustic tracking system of claim 9, wherein the defined boundary corresponds to a width of the notepad.
11. The acoustic tracking system of claim 5, wherein the processing component determines that the three-dimensional position of the digital pen is within the defined boundary.
12. The acoustic tracking system of claim 11, wherein in response to a determination that the three-dimensional position of the digital pen is within the defined boundary, the processing component transforms the three-dimensional position of the digital pen into a two-dimensional coordinate system relative to the base station and a spatial orientation of the base station.
13. The acoustic tracking system of claim 1, wherein the three-dimensional position of the digital pen corresponds to a three-dimensional position of a tip of the digital pen.
14. A method of recording an object based on movement of a digital pen relative to a base station, comprising:
detecting, by a first plurality of receivers coupled to a computing device, first acoustic signals from a first set of transmitters disposed on a digital pen;
detecting, by a second plurality of receivers coupled to the computing device, second acoustic signals from a second set of transmitters disposed on a base station attached to a notepad;
defining, based on the second acoustic signals, a two-dimensional plane on which the base station lies;
defining a boundary that corresponds to the notepad and that lies on the two-dimensional plane;
determining, based on the first and second acoustic signals, a three-dimensional position of the digital pen relative to the base station;
in response to a determination that the three-dimensional position of the digital pen is within the defined boundary, transforming the three-dimensional position of the digital pen into a two-dimensional coordinate system relative to the base station and a spatial orientation of the base station; and
recording the three-dimensional position of the digital pen relative to the base station, wherein the recorded three-dimensional position of the digital pen represents an object representative of movement of the digital pen.
15. The method of claim 14, comprising:
projecting the three-dimensional position of the digital pen onto the two-dimensional plane, wherein the recording includes recording, based on the projected three-dimensional position of the digital pen onto the two-dimensional plane, the three-dimensional position of the digital pen relative to the base station.
16. The method of claim 14, comprising:
determining a first point corresponding to a bottom left corner of the base station;
determining a second point corresponding to a bottom right corner of the base station; and
determining a third point corresponding to a top corner of the base station, wherein an equation of the two-dimensional plane passes through the first, second, and third points.
17. The method of claim 14, comprising:
determining a position of one or more transmitters of the second set of transmitters disposed on the base station; and
translating the positions of the second set of transmitters into the defined boundary.
18. The method of claim 14, wherein the defined boundary corresponds to a position of the notepad.
19. The method of claim 14, wherein the defined boundary corresponds to a length and a width of the notepad.
20. The method of claim 14, wherein the three-dimensional position of the digital pen corresponds to a three-dimensional position of a tip of the digital pen.
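Claims 3, 15, and 16 define the writing plane through three corner points of the base station, claims 1 and 15 project the pen's three-dimensional position onto that plane, and claims 12 and 14 test the result against a notepad boundary before transforming it into a two-dimensional coordinate system. The geometry behind those steps can be sketched as follows; this is an illustrative reading, not the patented implementation, and the corner coordinates, notepad dimensions, and function names are assumptions for the example.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def norm(a):
    n = math.sqrt(dot(a, a))
    return tuple(x / n for x in a)

def plane_from_corners(bottom_left, bottom_right, top):
    """Build an orthonormal frame for the plane through three corner points:
    u runs along the bottom edge, n is the plane normal, and v completes a
    right-handed basis within the plane."""
    u = norm(sub(bottom_right, bottom_left))
    n = norm(cross(u, sub(top, bottom_left)))
    v = cross(n, u)
    return bottom_left, u, v, n

def project_to_plane(pen, frame):
    """Project a 3-D pen position onto the plane and return its 2-D
    coordinates (s, t) in the base-station frame, plus the signed height
    of the pen above the plane."""
    origin, u, v, n = frame
    r = sub(pen, origin)
    h = dot(r, n)                            # signed distance from the plane
    foot = tuple(p - h * c for p, c in zip(r, n))  # foot of the perpendicular
    return dot(foot, u), dot(foot, v), h

def within_boundary(s, t, length, width):
    """Check whether the projected point lies inside a notepad-sized
    rectangle anchored at the base station's bottom-left corner."""
    return 0.0 <= s <= length and 0.0 <= t <= width
```

Under this reading, a stroke would be recorded only while `within_boundary` holds, with the (s, t) pair serving as the two-dimensional coordinate relative to the base station's position and spatial orientation.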
US15/231,231 2014-08-22 2016-08-08 Digital ultrasonic emitting base station Abandoned US20160349864A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/231,231 US20160349864A1 (en) 2014-08-22 2016-08-08 Digital ultrasonic emitting base station

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201462040977P 2014-08-22 2014-08-22
US14/533,535 US9411440B2 (en) 2014-08-22 2014-11-05 Digital ultrasonic emitting base station
US15/231,231 US20160349864A1 (en) 2014-08-22 2016-08-08 Digital ultrasonic emitting base station

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/533,535 Continuation US9411440B2 (en) 2014-08-22 2014-11-05 Digital ultrasonic emitting base station

Publications (1)

Publication Number Publication Date
US20160349864A1 true US20160349864A1 (en) 2016-12-01

Family

ID=55348300

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/533,535 Active US9411440B2 (en) 2014-08-22 2014-11-05 Digital ultrasonic emitting base station
US15/231,231 Abandoned US20160349864A1 (en) 2014-08-22 2016-08-08 Digital ultrasonic emitting base station

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/533,535 Active US9411440B2 (en) 2014-08-22 2014-11-05 Digital ultrasonic emitting base station

Country Status (2)

Country Link
US (2) US9411440B2 (en)
WO (1) WO2016028427A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9977519B2 (en) * 2015-02-25 2018-05-22 Synaptics Incorporated Active pen with bidirectional communication
US9924527B2 (en) * 2015-05-21 2018-03-20 Sr Technologies, Inc. Multiple physical layer wi-fi radio system
US10564740B2 (en) 2016-07-21 2020-02-18 Samsung Electronics Co., Ltd. Pen device—panel interaction based on electromagnetic signals output by the pen device
CN107036770B (en) * 2017-04-18 2019-04-09 浙江理工大学 Leak detection and location method for air cooler finned tube bundles
US10564420B2 (en) * 2017-10-02 2020-02-18 International Business Machines Corporation Midair interaction with electronic pen projection computing system
US10831316B2 (en) 2018-07-26 2020-11-10 At&T Intellectual Property I, L.P. Surface interface

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU7970094A (en) * 1993-10-18 1995-05-08 Carroll Touch, Inc. Acoustic wave touch panel for use with a non-active stylus
US5889511A (en) * 1997-01-17 1999-03-30 Tritech Microelectronics International, Ltd. Method and system for noise reduction for digitizing devices
US5920524A (en) * 1997-08-06 1999-07-06 Scientific Solutions, Inc. Hydrophone arrangement and bunker for housing same
AU3986400A (en) 1999-04-22 2000-11-10 Mordechai Ben-Arie Pen input device for a computer
DE69919759T2 (en) 1999-11-08 2005-09-01 Itpen Europe Ltd. METHOD FOR DIGITIZING, WRITING AND DRAWING WITH THE POSSIBILITY OF DELETING AND / OR NOTICE
JP2002041229A (en) 2000-05-19 2002-02-08 Fujitsu Ltd Coordinate recognition device
JP3988476B2 (en) * 2001-03-23 2007-10-10 セイコーエプソン株式会社 Coordinate input device and display device
US20020192009A1 (en) 2001-06-14 2002-12-19 Tuli Raja Singh Memory pen device
WO2003005293A2 (en) * 2001-06-29 2003-01-16 Hans Rudolf Sterling Apparatus for sensing the position of a pointing object
JP4091304B2 (en) * 2002-01-07 2008-05-28 セイコーインスツル株式会社 Manufacturing method of semiconductor integrated circuit and semiconductor integrated circuit
AU2003219506B2 (en) * 2002-04-15 2009-02-05 Qualcomm Incorporated Method and system for obtaining positioning data
US8542219B2 (en) 2004-01-30 2013-09-24 Electronic Scripting Products, Inc. Processing pose data derived from the pose of an elongate object
US8325159B2 (en) * 2004-04-14 2012-12-04 Elo Touch Solutions, Inc. Acoustic touch sensor
WO2006100682A2 (en) 2005-03-23 2006-09-28 Epos Technologies Limited Method and system for digital pen assembly
US20070075968A1 (en) * 2005-09-30 2007-04-05 Hall Bernard J System and method for sensing the position of a pointing object
US20110002191A1 (en) * 2006-12-07 2011-01-06 Alion Science & Technology Active sonar apparatuses and methods
US20090078473A1 (en) 2007-09-26 2009-03-26 Digital Pen Systems Handwriting Capture For Determining Absolute Position Within A Form Layout Using Pen Position Triangulation
CN101533320B (en) * 2008-03-10 2012-04-25 神基科技股份有限公司 Proximity magnification display method and device for area image of touch display device
US20090239581A1 (en) 2008-03-24 2009-09-24 Shu Muk Lee Accelerometer-controlled mobile handheld device
WO2010028166A1 (en) * 2008-09-03 2010-03-11 Sonicmule, Inc. System and method for communication between mobile devices using digital/acoustic techniques
JPWO2011043415A1 (en) * 2009-10-07 2013-03-04 日本電気株式会社 Digital pen system and pen input method
EP2491474B1 (en) * 2009-10-23 2018-05-16 Elliptic Laboratories AS Touchless interfaces
US10133411B2 (en) * 2010-06-11 2018-11-20 Qualcomm Incorporated Auto-correction for mobile receiver with pointing technology
US20110310101A1 (en) * 2010-06-22 2011-12-22 Schlumberger Technology Corporation Pillar grid conversion
US8692785B2 (en) * 2010-09-29 2014-04-08 Byd Company Limited Method and system for detecting one or more objects
US8988398B2 (en) * 2011-02-11 2015-03-24 Microsoft Corporation Multi-touch input device with orientation sensing
US9325947B2 (en) * 2011-06-28 2016-04-26 Inview Technology Corporation High-speed event detection using a compressive-sensing hyperspectral-imaging architecture
US9575544B2 (en) * 2011-11-07 2017-02-21 Qualcomm Incorporated Ultrasound based mobile receivers in idle mode
KR20140089144A (en) * 2013-01-04 2014-07-14 삼성전자주식회사 Eletronic device for asynchronous digital pen and method recognizing it
US9201522B2 (en) * 2013-09-03 2015-12-01 Qualcomm Incorporated Acoustic position tracking system

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5303207A (en) * 1992-10-27 1994-04-12 Northeastern University Acoustic local area networks
US5579285A (en) * 1992-12-17 1996-11-26 Hubert; Thomas Method and device for the monitoring and remote control of unmanned, mobile underwater vehicles
US6157368A (en) * 1994-09-28 2000-12-05 Faeger; Jan G. Control equipment with a movable control member
US20050201584A1 (en) * 2001-03-19 2005-09-15 Smith Brian D. Helmet with a magnetostrictive actuator
US20050029878A1 (en) * 2001-03-19 2005-02-10 Graham Flanagan Magnetostrictive actuator
US20060217144A1 (en) * 2003-05-08 2006-09-28 Nokia Corporation Mobile telephone having a rotator input device
US20050110775A1 (en) * 2003-11-21 2005-05-26 Marc Zuta Graphic input device and method
US20070205866A1 (en) * 2006-02-22 2007-09-06 Ntt Docomo, Inc. Wireless tag determination system and wireless tag determination method
US20080170075A1 (en) * 2007-01-16 2008-07-17 Sony Ericsson Mobile Communications Japan, Inc. Display controller, display control method, display control program, and mobile terminal device
US20110038230A1 (en) * 2008-04-24 2011-02-17 Ixsea Underwater acoustic positioning system
US20120276872A1 (en) * 2011-04-28 2012-11-01 Nokia Corporation Method and apparatus for over-the-air provisioning
US20130181941A1 (en) * 2011-12-30 2013-07-18 Sony Mobile Communications Japan, Inc. Input processing apparatus
US20150054794A1 (en) * 2013-08-21 2015-02-26 Qualcomm Incorporated Ultrasound multi-zone hovering system
US20150079963A1 (en) * 2013-09-17 2015-03-19 Xiaomi Inc. Method and device for displaying notice information
US20150170672A1 (en) * 2013-12-13 2015-06-18 Huawei Technologies Co., Ltd. Method for Performing Voice Control Operation on Terminal and Apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200201542A1 (en) * 2017-09-04 2020-06-25 Wacom Co., Ltd. Spatial position indication system
US11630570B2 (en) * 2017-09-04 2023-04-18 Wacom Co., Ltd. Spatial position indication system
US12405723B2 (en) * 2017-09-04 2025-09-02 Wacom Co., Ltd. Spatial position indication system

Also Published As

Publication number Publication date
WO2016028427A1 (en) 2016-02-25
US9411440B2 (en) 2016-08-09
US20160054819A1 (en) 2016-02-25

Similar Documents

Publication Publication Date Title
US9411440B2 (en) Digital ultrasonic emitting base station
Wang et al. Millisonic: Pushing the limits of acoustic motion tracking
Chen et al. EchoTrack: Acoustic device-free hand tracking on smart phones
US9613262B2 (en) Object detection and tracking for providing a virtual device experience
Yun et al. Turning a mobile device into a mouse in the air
US8284951B2 (en) Enhanced audio recording for smart pen computing systems
US9189108B2 (en) Ultrasound multi-zone hovering system
EP3469460B1 (en) Tap event location with a selection apparatus
Wang et al. Faceori: Tracking head position and orientation using ultrasonic ranging on earphones
CN105492923A (en) Acoustic position tracking system
US20180299976A1 (en) Digitized writing apparatus
US20130207936A1 (en) Method and device for receiving reflectance-based input
US12067157B2 (en) Drift cancelation for portable object detection and tracking
US10739870B2 (en) Stylus for coordinate measuring
CN106598293A (en) Three-dimensional large-space multi-channel pen-based interaction system
US20250284348A1 (en) Electronic device and control method of the same
CN104244055A (en) Real-time interaction method of multimedia devices within effective space range
JP6185838B2 (en) Measuring 3D coordinates of transmitter
KR20150084756A (en) Location tracking systme using sensors equipped in smart phone and so on
Bai et al. Scribe: Simultaneous voice and handwriting interface
Chung et al. vTrack: Virtual trackpad interface using mm-level sound source localization for mobile interaction
KR20150138003A (en) Device and mathod for gernerating data of representing structure in the room
US10853958B2 (en) Method and device for acquiring depth information of object, and recording medium
Chen et al. A three-dimensional ultrasonic pen-type input device with millimeter-level accuracy for human–computer interaction
Bai et al. WhisperWand: Simultaneous voice and gesture tracking interface

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVANZI, ROBERT;REEL/FRAME:039406/0170

Effective date: 20141126

AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AVANZI, ROBERTO;REEL/FRAME:046267/0521

Effective date: 20180703

AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S ADDRESS PREVIOUSLY RECORDED AT REEL: 046267 FRAME: 0521. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:AVANZI, ROBERTO;REEL/FRAME:046690/0748

Effective date: 20180703

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE