US20180084387A1 - Determining Location Based on Measurements of Device Orientation - Google Patents
- Publication number
- US20180084387A1 (application US 15/800,911)
- Authority
- US
- United States
- Prior art keywords
- user
- rotation
- client device
- determining
- angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation by using measurements of speed or acceleration
- G01C21/12—Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/14—Dead reckoning by recording the course traversed by the object
- G01C22/00—Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
- G01C22/006—Pedometers
Definitions
- a location of a computing device can be determined using different techniques, such as techniques based on Global Positioning System (GPS) data or on data associated with a wireless access point (e.g., a cellular base station or an 802.11 access point).
- a computing device may receive a GPS signal and responsively determine its position on the face of the Earth (e.g. an absolute location).
- a computing device may receive a signal from a cellular base station or an 802.11 access point. The location of the cellular base station or 802.11 access point may be known or estimated, and based on that location, the computing device can calculate its position.
- a localization of a computing device may occur via use of data from multiple different networks.
- Many location-based services can be provided to a computing device based on determining the location of the computing device.
- a method performed, at least in part, by one or more processors of a computing device includes determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step.
- computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame.
- Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame.
- Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame.
- the example method may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
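As a concrete illustration of how a chain of frame rotations can yield a per-step heading change, the sketch below extracts the horizontal heading of a device's forward axis from a device-to-world rotation matrix and differences it between two detected steps. The matrix layout, the choice of the device y-axis as "forward", and the variable names are assumptions for illustration, not the patent's exact AGA/HPAGA construction:

```python
import math

def yaw_of_forward_axis(r_world_from_device):
    # The device's forward (y) axis expressed in world coordinates is the
    # second column of the rotation matrix; project it onto the horizontal
    # plane and measure its heading with atan2.
    fx = r_world_from_device[0][1]
    fy = r_world_from_device[1][1]
    return math.atan2(fy, fx)

def heading_change(r_step1, r_step2):
    """Change in horizontal orientation between two detected steps."""
    d = yaw_of_forward_axis(r_step2) - yaw_of_forward_axis(r_step1)
    return math.atan2(math.sin(d), math.cos(d))  # wrap to (-pi, pi]

# Device level at step 1, then yawed 90 degrees about world z at step 2:
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
r1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
r2 = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
print(round(math.degrees(heading_change(r1, r2))))  # → 90
```

Working with the horizontally projected frame, as the claims describe, keeps the heading change well defined even when the device itself is tilted.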
- Another example is directed to a non-transitory computer-readable medium having stored therein instructions that, when executed by one or more processors of a computing device, cause the computing device to perform various functions.
- the functions include determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step.
- computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame.
- Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame.
- Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame.
- the functions may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
- a further example provides a system that includes at least one processor and a computer-readable medium.
- the computer-readable medium is configured to store instructions that, when executed by the at least one processor, cause the system to perform functions.
- the functions include determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step.
- computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame.
- Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame.
- Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame.
- the functions may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
- FIG. 1 illustrates a block diagram of an example communication system according to an embodiment of the present disclosure.
- FIG. 2 illustrates a block diagram of an example computing device according to an embodiment of the present disclosure.
- FIG. 3 illustrates a block diagram of an example computing device according to another embodiment of the present disclosure.
- FIG. 4 is a flow chart of an example method for determining a location and/or movement of a device.
- FIG. 5 is a flow chart of another example method for determining and using one or more orientations associated with a device.
- FIG. 6 illustrates different reference frames for determining one or more orientations associated with a device.
- FIG. 7 is a block flow diagram that illustrates relationships between frames and rotations, in accordance with an embodiment of the present disclosure.
- FIG. 8 is a flow chart of another example method for determining and using one or more orientations associated with a client device.
- FIG. 9 is a block flow diagram that illustrates relationships between frames, rotations, and/or parameters in accordance with an embodiment of the present disclosure.
- FIG. 10 is a block diagram that conceptually illustrates an example system for determining location estimates of a device, and optionally, maps of observed data received from the device.
- a number of logs of data or traces of data are received from one or more devices.
- the data may include a variety of information collected by one or more sensors of the devices. These sensors may include a GPS, accelerometer, gyroscope, inertial measurement unit (IMU), barometer, magnetometer, and WIFI signal strength sensor, as just some examples.
- the present disclosure relates to techniques for processing orientation data in the traces of data, such as from an accelerometer, gyroscope, and/or magnetometer, to determine an orientation of the device and/or a user of the device.
- the device and/or user orientation may be determined with respect to a frame of reference relative to the earth or a world frame.
- a computing device such as a cellular phone, may use pedestrian dead reckoning calculations to localize a position of the device and to build a map of a location of the device and of a user of the device.
- dead reckoning calculations link positions of the device between two steps. For instance, a first step may be at time t, and a second step may be at time t+1. The second step is, on average, likely to be about 3 ft in front of the first step, and an angle of turn or change in orientation between the first and second steps can be determined using, for example, gyroscope data.
- An example dead reckoning determination may be performed to determine an estimation of a current position of the computing device based on a previous position of the computing device, an estimated speed over an elapsed time, and an orientation or direction of travel of the computing device.
- information indicating a previous position may be received from a server that calculates or determines the information due to communication with a computing device, or from sensors of the computing device including a GPS sensor.
- the previous position may also be derived or calculated from a number of data points such as GPS location determinations, or WIFI scans and associated WIFI mappings.
- the estimated speed can also be received from a server, or derived or calculated from position determinations over an elapsed time or based on other data over the elapsed time including outputs of a pedometer, for example.
- a speed can be determined based on the elapsed time.
- the orientation or direction of travel of the computing device may be determined from data received from a server, or from sensors on-board the computing device such as a magnetometer or compass, for example.
- Any available information may be used to infer an orientation or a direction of travel including a fusion of accelerometer, gyroscope, and optionally magnetometer data, for example.
- other available information can be used to provide further estimates (directly or indirectly) as to direction of travel, including WIFI scans received in traces that may give information as to a position and heading of a device and/or user.
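One common way to fuse gyroscope and magnetometer data into a heading estimate is a complementary filter: the gyroscope is integrated for smooth short-term tracking, while the absolute (but noisy) magnetometer heading corrects long-term drift. The function below is a minimal sketch under that assumption; the blending weight and the sin/cos wrap-around handling are illustrative choices, not the disclosure's method:

```python
import math

def complementary_heading(gyro_rates, mag_headings, dt, alpha=0.98):
    """Fuse gyroscope yaw rates (rad/s) with magnetometer headings (rad).

    The gyroscope integrates smoothly but drifts over time; the
    magnetometer is absolute but noisy. alpha weights the gyro-propagated
    estimate against the magnetometer correction.
    """
    heading = mag_headings[0]
    estimates = [heading]
    for rate, mag in zip(gyro_rates[1:], mag_headings[1:]):
        predicted = heading + rate * dt  # propagate with the gyroscope
        # Blend on the unit circle to handle angle wrap-around correctly.
        x = alpha * math.cos(predicted) + (1 - alpha) * math.cos(mag)
        y = alpha * math.sin(predicted) + (1 - alpha) * math.sin(mag)
        heading = math.atan2(y, x)
        estimates.append(heading)
    return estimates

# A steady 0.1 rad/s turn sampled at 100 Hz for one second:
true_headings = [0.1 * 0.01 * i for i in range(100)]
est = complementary_heading([0.1] * 100, true_headings, dt=0.01)
```

When the two sources agree, as in this synthetic trace, the fused estimate tracks the true heading; in practice the magnetometer term suppresses the slow gyroscope drift that pure integration would accumulate.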
- the dead reckoning calculation can be performed to determine an estimation of the current position of the computing device.
- an accelerometer of the computing device can be used as a pedometer and a gyroscope or magnetometer as an orientation or compass heading provider.
- Each step of a user of the computing device causes a position to move forward a fixed distance in an orientation direction measured by the compass.
- Accuracy may be limited by precision of the sensors, magnetic disturbances inside structures of the computing device, and unknown variables such as carrying position of the computing device and stride length of the user.
- the estimate of the current position can be determined in this manner.
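A minimal pedestrian-dead-reckoning update along the lines described above advances the position by one stride per detected step, in the heading measured at that step. The stride length and axis conventions here are illustrative assumptions:

```python
import math

def dead_reckon(start_xy, stride_m, step_headings_rad):
    """Advance one stride per detected step along that step's heading."""
    x, y = start_xy
    path = [(x, y)]
    for h in step_headings_rad:
        x += stride_m * math.cos(h)  # east component
        y += stride_m * math.sin(h)  # north component
        path.append((x, y))
    return path

# Four steps east, then four steps north, with a ~0.9 m (3 ft) stride:
path = dead_reckon((0.0, 0.0), 0.9, [0.0] * 4 + [math.pi / 2] * 4)
print(path[-1])  # roughly (3.6, 3.6)
```

Because each update compounds on the last, errors in stride length and heading accumulate, which is why the disclosure fuses dead reckoning with other constraints.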
- the present disclosure provides additional techniques for determining orientation data.
- the present disclosure relates to fusing or linking the dead reckoning calculations with other constraints, such as Wi-Fi scan data, GPS readings, and/or magnetic field data, to determine and refine a map and position of a device and/or user of the device.
- a computing device such as a mobile phone or a server device, can perform fusion or linking of various constraints, such as dead reckoning calculations and magnetic field information in a tight-coupling manner or loose-coupling manner.
- Tight-coupling calculations may provide relatively high accuracy, but may also be relatively computationally-intensive, and may be prone to convergence to an incorrect solution.
- Loose-coupling calculations are generally less computationally-intensive than tight-coupling calculations, and may be more robust but may provide lower accuracy than tight-coupling calculations.
- the present disclosure adapts loose-coupling calculations to estimate the orientation of the device and/or the user of the device with respect to the world frame.
- the device orientation may also be used to rotate or otherwise process magnetic field information according to a frame of reference of the device with respect to the world.
- the computing device may use the processed magnetic field information, and perhaps with other constraints, to determine a location of the device and/or to build a map of an area around the device.
- the computing device may present a map on a display, and show the device location on the map, or otherwise generate information and instructions for providing such a display.
- the location of the device may also be used in location-based services, such as to narrow an Internet search to an area that is nearby the client device location, to personalize “local” news and/or weather updates or information sent to the device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, and the like.
- FIG. 1 illustrates an example communication system 100 in which an example method may be implemented.
- a client device 102 may communicate with a server 104 via one or more wired and/or wireless interfaces.
- the client device 102 and the server 104 may communicate within a network.
- the client device 102 and the server 104 may each reside within a respective network.
- the client device 102 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, a tablet computing device, and the like, that is configured to transmit data 106 to and/or receive data 108 from the server 104 in accordance with the method and functions described herein.
- the client device 102 may include a user interface, a communication interface, a processor, and data storage comprising instructions executable by the processor for carrying out one or more functions relating to the data sent to, and/or received by, the server 104 .
- the user interface may include buttons, a touchscreen, a microphone, and/or any other elements for receiving inputs, as well as a speaker, one or more displays, and/or any other elements for communicating outputs.
- the server 104 may be any entity or computing device arranged to carry out the method and computing device functions described herein. Further, the server 104 may be configured to send data 108 to and/or receive data 106 from the client device 102 .
- the server 104 may include a location module 110 which may be configured to process the data 106 received from the client device 102 to determine locations (present and historical) associated with the client device 102 .
- the data 106 received by the server 104 from the client device 102 may take various forms.
- the client device 102 may provide information indicative of a location of the client device 102 , movement of the client device 102 , or inputs from a user of the client device 102 .
- the server 104 may then process the data 106 to identify a location history that matches to the received data.
- the data 108 sent to the client device 102 from the server 104 may take various forms.
- the server 104 may send to the client device 102 an indication of location, updated location history information, or information based on the locations of the device.
- FIG. 2 illustrates a schematic drawing of an example device 200 .
- the computing device takes a form of a client device 200 .
- some components illustrated in FIG. 2 may be distributed across multiple computing devices. However, for the sake of example, the components are shown and described as part of one example client device 200 .
- the client device 200 may be or include a mobile device, desktop computer, email/messaging device, tablet computer, or similar device that may be configured to perform the functions described herein.
- the client device 200 may include a device platform (not shown), which may be configured as a multi-layered Linux platform.
- the device platform may include different applications and an application framework, as well as various kernels, libraries, and runtime entities. In other examples, other formats or systems may operate the client device 200 as well.
- the client device 200 may include an interface 202 , a wireless communication component 204 , a cellular radio communication component 206 , a global positioning system (GPS) 208 , one or more sensors 210 , data storage 212 , and a processor 214 . Components illustrated in FIG. 2 may be linked or coupled together by a communication link or bus 216 .
- the client device 200 may also include hardware to enable communication within the client device 200 and between the client device 200 and another computing device, such as the server 104 of FIG. 1 .
- the hardware may include transmitters, receivers, and antennas, for example.
- the interface 202 is configured to allow the client device 200 to communicate with another computing device, such as a server.
- the interface 202 may be configured to receive input data from one or more computing devices, and may also be configured to send output data to the one or more computing devices.
- the interface 202 may also maintain and manage records of data received and sent by the client device 200 .
- records of data may be maintained and managed by other components of the client device 200 .
- the interface 202 may also include a receiver and transmitter to receive and send data.
- the interface 202 may also include a user-interface, such as a keyboard, microphone, touchscreen, etc., to receive inputs as well.
- the wireless communication component 204 may be a communication interface that is configured to facilitate wireless data communication for the client device 200 according to one or more wireless communication standards.
- the wireless communication component 204 may include a WIFI communication component that is configured to facilitate wireless data communication according to one or more IEEE 802.11 standards.
- the wireless communication component 204 may include a Bluetooth communication component that is configured to facilitate wireless data communication according to one or more Bluetooth standards. Other examples are also possible.
- the processor 214 may be configured to determine one or more geographical location estimates of the client device 200 using one or more location-determination components, such as the wireless communication component 204 , the cellular radio communication component 206 , and/or the GPS 208 . For instance, the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on a presence and/or location of one or more known wireless access points within a wireless range of the client device 200 . In one example, the wireless communication component 204 may determine the identity of one or more wireless access points (e.g., a MAC address) and measure an intensity of signals received (e.g., received signal strength indication) from each of the one or more wireless access points.
- the received signal strength indication (RSSI) from each unique wireless access point may be used to determine a distance from each wireless access point. The distances may then be compared to a database that stores information regarding where each unique wireless access point is located. Based on the distance from each wireless access point, and the known location of each of the wireless access point, a location estimate of the client device 200 may be determined.
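The distance-from-RSSI and position steps described above can be sketched as follows. The log-distance path-loss model, its parameters (`tx_power_dbm`, `path_loss_exp`), and the least-squares linearization are standard illustrative choices, not necessarily what the disclosure uses:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(anchors, dists):
    """Least-squares position from three or more access points with known
    locations, linearizing each circle equation against the first anchor."""
    (x0, y0), d0 = anchors[0], dists[0]
    a = b = c = e = f = 0.0  # accumulate the 2x2 normal equations
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        ai, bi = 2 * (xi - x0), 2 * (yi - y0)
        ci = d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2
        a += ai * ai; b += ai * bi; c += bi * bi
        e += ai * ci; f += bi * ci
    det = a * c - b * b
    return ((c * e - b * f) / det, (a * f - b * e) / det)

aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.dist(true_pos, ap) for ap in aps]
print(trilaterate(aps, dists))  # ≈ (3.0, 4.0)
```

With noisy RSSI-derived distances the linear system is solved in the least-squares sense, so extra access points beyond three improve the estimate.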
- the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on nearby cellular base stations.
- the cellular radio communication component 206 may be configured to at least identify a cell from which the client device 200 is receiving, or last received, signal from a cellular network.
- the cellular radio communication component 206 may also be configured to measure a round trip time (RTT) to a base station providing the signal, and combine this information with the identified cell to determine a location estimate.
- the cellular radio communication component 206 may be configured to use observed time difference of arrival (OTDOA) from three or more base stations to estimate the location of the client device 200 .
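An RTT-based distance estimate reduces to half the round-trip time multiplied by the speed of light, optionally subtracting the base station's turnaround delay. A hypothetical helper (the parameter names are illustrative):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def rtt_to_distance(rtt_s, turnaround_delay_s=0.0):
    """One-way distance from a round-trip time: the signal covers the
    base-station distance twice, minus the station's processing delay."""
    return SPEED_OF_LIGHT_M_S * (rtt_s - turnaround_delay_s) / 2

# A 10-microsecond round trip puts the device roughly 1.5 km away:
print(round(rtt_to_distance(10e-6)))  # → 1499
```

Combining such a distance with the identified serving cell narrows the location estimate to a ring around that base station.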
- the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on signals sent by GPS satellites above the Earth.
- the GPS 208 may be configured to estimate a location of the mobile device by precisely timing signals sent by the GPS satellites.
- the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on Bluetooth wireless signals.
- the Bluetooth signals can be compared to a map of Bluetooth devices, and a measurement probability map in which a given Bluetooth wireless signal is estimated to be received can be determined.
- Bluetooth devices may include static devices (e.g., Bluetooth Low Energy (BLE) beacons) that emit signals to nearby devices. Each Bluetooth device has a range within which its signals can be received, and that range can be used as a measurement probability map constraining the location of the device.
- the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on magnetic field signals.
- ambient magnetic fields are present in environments, and include disturbances or anomalies in the Earth's magnetic field caused by pillars, doors, or elevators in hallways, or other objects that may be ferromagnetic in nature.
- a device may measure a magnetic field. When such magnetic field measurements are present in the logs of data, the measurements can be compared to a map of magnetic field signal strength for a given location, and a measurement probability map, indicating where a given magnetic field measurement is likely to have been observed, can be determined and used as a constraint to determine a location of the client device.
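A magnetic-field measurement probability map of the kind described can be sketched as a Gaussian likelihood over mapped grid cells; the grid, field magnitudes, and noise parameter below are hypothetical:

```python
import math

def magnetic_likelihoods(field_map, measured_ut, sigma_ut=2.0):
    """Gaussian measurement probability over grid cells: cells whose mapped
    magnetic magnitude matches the measurement score highest."""
    scores = {cell: math.exp(-((mag - measured_ut) ** 2) / (2 * sigma_ut**2))
              for cell, mag in field_map.items()}
    total = sum(scores.values())
    return {cell: s / total for cell, s in scores.items()}

# Hypothetical mapped field magnitudes (microtesla) on a 2x2 indoor grid;
# the anomaly near a steel pillar raises cell (0, 1) well above background:
field_map = {(0, 0): 48.0, (0, 1): 61.0, (1, 0): 45.0, (1, 1): 52.0}
probs = magnetic_likelihoods(field_map, measured_ut=60.0)
print(max(probs, key=probs.get))  # → (0, 1)
```

In a full localization pipeline this likelihood would be one constraint among several, multiplied with WIFI, GPS, and dead-reckoning evidence.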
- the processor 214 may use a location-determination algorithm that combines location estimates determined by multiple location-determination components, such as a combination of the wireless communication component 204 , the cellular radio component 206 , and the GPS 208 .
- the sensor 210 may include one or more sensors, or may represent one or more sensors included within the client device 200 .
- Example sensors include an accelerometer, gyroscope, magnetometer, pedometer, barometer, light sensors, microphone, camera, or other location and/or context-aware sensors.
- the processor 214 may also use a location-determination algorithm that fuses data from the one or more sensors 210 .
- the processor 214 may be configured to execute a dead reckoning algorithm that uses a log of sensor data as inputs to the dead reckoning algorithm to determine an estimated trajectory or pedestrian dead reckoning of the client device and associated user.
- the data storage 212 may store program logic 218 that can be accessed and executed by the processor 214 .
- the data storage 212 may also store collected sensor data 220 that may include data collected by any of the wireless communication component 204 , the cellular radio communication component 206 , the GPS 208 , and any of sensors 210 .
- the communication link 216 is illustrated as a wired connection; however, wireless connections may also be used.
- the communication link 216 may be a wired serial bus such as a universal serial bus or a parallel bus, or a wireless connection using, e.g., short-range wireless radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), or Cellular technology, among other possibilities.
- the illustrated client device 200 in FIG. 2 includes an additional processor 222 .
- the processor 222 may be configured to control other aspects of the client device 200 including displays or outputs of the client device 200 (e.g., the processor 222 may be a GPU). Example methods described herein may be performed individually by components of the client device 200 , or in combination by one or more of the components of the client device 200 .
- portions of the client device 200 may process data and provide an output internally in the client device 200 to the processor 222 , for example.
- portions of the client device 200 may process data and provide outputs externally to other computing devices.
- FIG. 3 illustrates a schematic drawing of another example computing device.
- the computing device takes a form of a server 300 .
- some components illustrated in FIG. 3 may be distributed across multiple servers. However, for the sake of example, the components are shown and described as part of one example server 300 .
- the server 300 may be a computing device, cloud, or similar entity that may be configured to perform the functions described herein.
- the server 300 may include a communication interface 302 , a location module 304 , a processor 306 , and data storage 308 . All of the components illustrated in FIG. 3 may be linked or coupled together by a communication link or bus 310 (e.g., a wired or wireless link).
- the server 300 may also include hardware to enable communication within the server 300 and between the server 300 and another computing device (not shown).
- the hardware may include transmitters, receivers, and antennas, for example.
- the communication interface 302 may allow the server 300 to communicate with another device (not shown), such as a mobile phone, personal computer, etc.
- the communication interface 302 may be configured to receive input data from one or more computing devices, and may also be configured to send output data to the one or more computing devices.
- the communication interface 302 may also maintain and manage records of data received and sent by the server 300 .
- records of data may be maintained and managed by other components of the server 300 .
- the location module 304 may be configured to receive data from a client device and determine a geographic location of the client device. The determination may be based on outputs of an accelerometer, gyroscope, barometer, magnetometer, or other sensors of the client device, as well as based on location determinations of the client device. Further, the location module 304 may be configured to execute a dead reckoning algorithm. Using a log of sensor data as inputs to the dead reckoning algorithm, the location module 304 may determine an estimated trajectory or pedestrian dead reckoning of the client device and associated user.
- the location module 304 may also be configured to determine and store a history of sensor measurements of the client device for later reprocessing based on updated data pertaining to networks or information used to determine the locations.
- the data storage 308 may store program logic 312 that can be accessed and executed by the processor 306 .
- the data storage 308 may also include a location database 314 that can be accessed by the processor 306, for example, to retrieve information regarding wireless access points, magnetic field data, orientation data, locations of satellites in a GPS network, floor plans of a building, etc., or any other type of information useful for determining a location of a client device.
- the server 300 is illustrated with a second processor 316 , which may be an application-specific processor for input/output functionality. In other examples, functions of the processor 306 and the processor 316 may be combined into one component.
- measurements collected from various sensors of a device can be combined with information from external databases (such as known locations of WIFI access points or building floor plans) to estimate a location or movement of the device in real-time. Recording the real-time location estimate at all times (or intervals/increments of time) may also produce a location history.
- FIG. 4 is a flow diagram illustrating an example method for determining a location or movement of a device.
- computing device(s) 400 operated by users 402 or surveyors 404 , may traverse areas in an environment and output traces to a model builder 406 .
- a device operated by a user 402 may output traces passively (e.g., the device may be configured to output the trace data with no additional user input), including raw data output by sensors of the device like WIFI scans, GPS data, accelerometer data, gyroscope data, barometer readings, magnetometer data, etc.
- Each trace may be associated with a time the data was collected, and thus, for traces that include GPS data, other data in the traces also has location-specific references.
- a device operated by a surveyor 404 may have location-specific references for all traces, whether due to associated GPS data or manual input of location information.
- the model builder 406 may be a module on a computing device or server, and may be configured to generate a model of the environment based on the received traces.
- the model builder 406 may include a trace localizer and a map builder.
- the model builder 406 may access reference data or information, such as magnetic field signal strength data in the environment at specific locations in the environment, or other landmark data of the environment, such as strength of signal (RSSI) for WIFI access points.
- the model builder 406 may be configured to generate a map or path of the device based on the traces.
- the model builder 406 may utilize GPS data to determine locations of the device over time, utilize dead reckoning (based on accelerometer and gyroscope outputs) to project a path, utilize elevational data (such as based on GPS elevational data and barometer readings), and optimize the path by jointly combining each.
- the model builder 406 may further optimize the path to match magnetic field data to reference magnetic field maps to align a path that most likely resembles a path that the device traversed through the environment.
- a location provider 408 may access a model output by the model builder 406 to determine locations of other device(s) 410 based on provided passive traces as well. Within examples, the location provider 408 may return a location of the device or an estimation of movement of the device to the device 410 based on data received in the traces. The computing device may use the determined locations to present a map on a display of the device, for example, and show a device location on the map, or otherwise generate information and instructions for providing such a display.
- the location of a device may also be used in location-based services, such as to narrow an Internet search to an area that is nearby the device location, to personalize “local” news and/or weather updates or information sent to the device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, and the like.
- Traces received from devices may include a variety of measurements from multiple different sensors, and may include a variety of measurements collected over time or at various locations.
- a trace may refer to a sensor log or a collection of data output from sensors on the device over some time period and collected over a number of locations. The sensors that output data may be selected, or data to be included within the sensor log may also be selected.
- a trace of data may include all data collected by a device (using a number of sensors) over a given time frame (e.g., about 5 seconds, or perhaps about 5 minutes or any ranges therein or longer). Measurements in a trace or from trace to trace may be considered statistically independent. However, in instances in which the measurements are collected from positions/locations in close proximity or collected close in time, the measurements may have correlations.
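As an illustrative sketch (not part of the claimed subject matter), a trace or sensor log of this kind might be represented as a simple container of timestamped samples; all class and field names below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class SensorSample:
    """One timestamped reading from a single sensor (names are illustrative)."""
    timestamp_s: float          # seconds since the start of the trace
    sensor: str                 # e.g. "accel", "gyro", "magnetometer", "wifi", "gps"
    values: tuple               # sensor-specific payload, e.g. (x, y, z)

@dataclass
class Trace:
    """A sensor log collected over some time period, as described above."""
    device_id: str
    samples: list = field(default_factory=list)

    def window(self, t0, t1):
        """Return the samples collected in the half-open interval [t0, t1)."""
        return [s for s in self.samples if t0 <= s.timestamp_s < t1]

trace = Trace("device-123")
trace.samples.append(SensorSample(0.0, "accel", (0.1, 0.0, 9.8)))
trace.samples.append(SensorSample(0.5, "gyro", (0.01, 0.02, 0.0)))
trace.samples.append(SensorSample(5.1, "accel", (0.0, 0.1, 9.8)))
print(len(trace.window(0.0, 5.0)))  # 2: the two samples inside the 5-second window
```

A selection of sensors or fields, as described above, would simply filter on the `sensor` attribute before logging.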
- the traces or logs of data may be used to build a magnetic field strength map of a number of locations aligned to latitude and longitude or position coordinates. Estimates of magnetic field strengths can be made based on known locations where the magnetic field scans occurred. The reverse is also true.
- the received logs of data may also be used in simultaneous localization and mapping (SLAM) processes.
- the received logs of data can be used to determine relative paths traversed by the devices using dead reckoning, which provides estimates of AP locations and trajectory of the devices relative to each other, and such relative estimates can be aligned with more absolute positions using measurements from GPS.
- GPS generally provides accurate latitude and longitude measurements, but only in certain locations (mostly outdoors).
- maps of signals or signal strengths may also be generated based on received logs of data or accessed to localize a device.
- maps include WIFI strength of signal (RSSI) maps, Bluetooth device maps, or geographic walkway and street maps, for example.
- trustworthy measurements in an absolute frame can be accessed first to generate a first estimate of a magnetic field strength map, and new measurements and new sensor logs can be introduced to refine the estimate using the estimate as a starting point to build upon.
- a current estimate is held constant and used to determine an initial estimate for the new data.
- a SLAM optimization may be performed to jointly optimize all data without keeping anything constant. Iterations may be performed until all data has been considered.
- FIG. 5 is a block diagram of an example method in accordance with at least some embodiments described herein.
- Method 500 shown in FIG. 5 presents an embodiment of a method that, for example, could be used with the system 100 in FIG. 1 , the device 200 in FIG. 2 , the server 300 in FIG. 3 , and/or the method in FIG. 4 , for example, or may be performed by a combination of any components or processes of FIGS. 1-4 .
- Method 500 may include one or more operations, functions, or actions as illustrated by one or more of blocks 502 - 516 . Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel, and/or in a different order than those described herein. In addition, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation.
- each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process.
- the program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive.
- the computer readable medium may include a non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM).
- the computer readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example.
- the computer readable media may also be any other volatile or non-volatile storage systems.
- the computer readable medium may be considered a computer readable storage medium, a tangible storage device, or other article of manufacture, for example.
- each block in FIG. 5 may represent circuitry and/or other hardware that is wired or otherwise configured to perform the specific logical functions and processes of method 500 .
- Functions of the method 500 may be fully performed by a computing device (or components of a computing device such as one or more processors), or may be distributed across multiple computing devices and/or a server.
- the computing device may receive information from sensors of the computing device, or where the computing device is a server the information can be received from another device that collects the information.
- a computing device estimates an orientation of a client device (e.g., a phone) by computing a rotation from a coordinate frame of the client device (e.g., the device frame) to a coordinate frame of the earth (e.g., the world frame). This rotation from the device frame to the world frame is identified as R device world .
- a world frame 600 may be defined by an XYZ-coordinate frame, with the positive X-axis extending east, the positive Y-axis extending north, and the positive Z-axis extending up (e.g., directed radially away from the center of the earth).
- the world frame 600 varies depending on the location of a client device 602 and an associated user 604 on the earth.
- the world frame 600 may be considered fixed over a relatively short period of time, such as one day, because the variation of the world frame 600 based on the user's steps is likely insignificant. In other examples, the world frame may be allowed to drift over time.
- FIG. 6 shows a device frame 606 that may be defined by an XYZ-coordinate frame, with the positive X-axis extending to the right with respect to a front face of the device 602 , the positive Y-axis extending up with respect to the front face of the device, and the positive Z-axis extending perpendicularly out of the front face of the device.
- FIG. 6 also illustrates a user frame 608 that may be defined by an XYZ-coordinate frame, with the X-axis extending in front of and away from the user 604 , the Y-axis extending to the left of the user, and the Z-axis extending up from the user.
- the computing device computes the rotation R device world to relate the device frame 606 to the world frame 600 .
- the computing device may estimate the rotation R device world by fusing device sensor data.
- the device sensor data may be collected by the client device over a plurality of locations and over a time period, and may include accelerometer and gyroscope data, and in some cases, magnetometer data.
- the magnetometer data may be used to compensate for a bias effect of the gyroscope.
- the computing device may estimate the rotation R device world by performing an extended Kalman filter (EKF) method using the device sensor data.
- the computing device may estimate the rotation R device world using a game rotation vector defined by Android open source software.
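As a greatly simplified, hypothetical stand-in for EKF-style sensor fusion, a one-axis complementary filter illustrates the underlying idea: integrated gyroscope rates track fast motion while the accelerometer's gravity direction corrects slow drift. The constant `alpha` and the sample values are illustrative assumptions:

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch-angle estimate.

    gyro_rates: pitch rate (rad/s) per sample; accel_samples: (ax, az) pairs.
    The gyro integral tracks fast motion; the accelerometer's gravity
    direction corrects the slow drift (analogous to the bias compensation
    discussed above).
    """
    pitch = 0.0
    for rate, (ax, az) in zip(gyro_rates, accel_samples):
        gyro_pitch = pitch + rate * dt          # integrate angular rate
        accel_pitch = math.atan2(ax, az)        # tilt implied by gravity
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
    return pitch

# A stationary, flat device with zero gyro rate and gravity straight down
# should hold a pitch estimate of zero.
est = complementary_filter([0.0] * 50, [(0.0, 9.8)] * 50, dt=0.02)
print(abs(est) < 1e-6)  # True
```

A full EKF additionally tracks covariance and gyroscope bias states; this sketch only conveys the fusion of the two sensor streams.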
- the computing device selects data, such as device sensor data related to the rotation R device world or the rotation itself, for further processing in the present method 500 .
- the computing device selects time slices during which an average orientation of the device relative to the user does not change or changes within a predetermined range.
- the time slices are selected by processing the rotation R device world , and identifying when the orientation of “world_down” in the device frame does not change significantly between time slices, such as by less than about 35 degrees.
- the computing device may eliminate potentially unreliable data associated with large variations in the orientation of the device, which may occur when a user takes their phone out of their pocket, for example. Consequently, the computing device at block 504 may then select more reliable orientation data for further processing.
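The selection of stable time slices might be sketched as follows, where the 35-degree threshold comes from the example above and the vector representation of "world_down" per slice is an assumption:

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3-vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    cosang = max(-1.0, min(1.0, dot / (nu * nv)))
    return math.degrees(math.acos(cosang))

def select_stable_slices(world_down_per_slice, max_change_deg=35.0):
    """Keep indices of time slices whose 'world down' direction in the device
    frame changed by less than max_change_deg relative to the previous slice
    (threshold taken from the example above)."""
    kept = []
    for i in range(1, len(world_down_per_slice)):
        if angle_between(world_down_per_slice[i - 1], world_down_per_slice[i]) < max_change_deg:
            kept.append(i)
    return kept

# The 90-degree jump at index 2 (phone taken out of a pocket, say) is dropped.
downs = [(0, 0, -1), (0.05, 0, -1), (1, 0, 0), (1, 0.02, 0)]
print(select_stable_slices(downs))  # [1, 3]
```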
- the computing device estimates an average orientation of the device relative to the user by computing a rotation from an average gravity aligned (AGA) frame to the device frame. This rotation from AGA frame to the device frame is identified as a rotation R AGA device .
- the computing device may use the orientation data selected at block 504 to compute the rotation R AGA device , which may help to verify an assumption that the device is in a generally static position relative to the user.
- the computing device performs the calculation of block 506 by creating an average gravity aligned (AGA) coordinate frame that is fixed relative to the device coordinate frame, and then computing the rotation from the AGA frame to the device frame.
- the computing device defines the AGA frame such that the Z-axis, once averaged, aligns generally with the Z-axis of the world frame.
- the client device measures the average gravity, which correlates to the negative-Z-axis and provides information regarding the down direction (e.g., directed radially toward the center of the earth). The computing device may then rotate the measured average gravity 180 degrees to generally align the average gravity with the positive-Z-axis of the world frame.
- FIG. 6 illustrates an example AGA frame 610 that may be defined by an XYZ-coordinate frame, with the positive Z-axis extending upwardly similar to the Z-axis of the world frame 600 .
- the Z-axis of the AGA frame does not align precisely with the Z-axis of the world frame, because the world frame is not fixed with respect to the device and the device moves with respect to the world frame.
- the X and Y-axes of the AGA frame are orthogonal to each other and to the Z-axis, but otherwise the X and Y-axes may be defined arbitrarily.
- the computing device may use the following Equation 1 to define the AGA frame:
- AGA_z_in_device_frame = normalize(average(measured_acceleration_in_device_frame)) (1)
- an accelerometer of the client device may provide the measured_acceleration_in_device_frame data.
- the computing device may use the following Equation 2 to define the AGA frame:
- AGA_z_in_device_frame = normalize(average(world_z_in_device_frame)) (2)
- the computing device may determine the world_z_in_device_frame data from an estimate of the orientation of the device relative to the real world according to Equation 3:
- an AGA_x vector is selected to be perpendicular to the AGA_z vector, but otherwise may be selected arbitrarily by picking two non-collinear vectors, computing the cross product of each with AGA_z, and picking the normalization of the larger result.
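Equation 1 and the AGA_x selection above can be sketched as follows (a hypothetical illustration; the two candidate vectors for the cross products are assumptions):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def aga_frame(accel_samples):
    """Build the AGA frame axes in device coordinates, per Equation 1:
    z = normalize(average(measured acceleration)); x is picked perpendicular
    to z by crossing two non-collinear candidates with z and keeping the
    normalization of the larger result; y completes a right-handed frame."""
    n = len(accel_samples)
    avg = tuple(sum(s[i] for s in accel_samples) / n for i in range(3))
    z = normalize(avg)
    cand1 = cross((1.0, 0.0, 0.0), z)
    cand2 = cross((0.0, 1.0, 0.0), z)
    norm = lambda v: math.sqrt(sum(c * c for c in v))
    x = normalize(cand1 if norm(cand1) >= norm(cand2) else cand2)
    y = cross(z, x)
    return x, y, z

# Device lying flat: the average acceleration points along the device +Z axis,
# so the AGA z-axis is approximately (0, 0, 1) in device coordinates.
x, y, z = aga_frame([(0.0, 0.0, 9.8), (0.0, 0.0, 9.81)])
print(z)
```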
- the computing device may then compute the rotation R AGA device from the AGA frame to the device frame. In the present example, this rotation R AGA device remains constant for each time slice selected at block 504 .
- the computing device may perform step detection using conventional techniques to determine steps taken by a user associated with the client device.
- the computing device may perform the step detection based on accelerometer and gyroscope outputs that correspond to typical step motions.
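A minimal sketch of threshold-based step detection follows; it is far simpler than the conventional techniques referenced above, and the threshold and refractory window are illustrative values:

```python
import math

def detect_steps(accel_samples, threshold=11.0, refractory=3):
    """Very simplified step detector: flag a step whenever the acceleration
    magnitude crosses above a threshold, ignoring crossings within a short
    refractory window after the last detected step (values are illustrative)."""
    steps, last = [], -refractory
    for i, (ax, ay, az) in enumerate(accel_samples):
        mag = math.sqrt(ax*ax + ay*ay + az*az)
        if mag > threshold and i - last >= refractory:
            steps.append(i)
            last = i
    return steps

quiet = (0.0, 0.0, 9.8)
spike = (0.0, 0.0, 13.0)
samples = [quiet, spike, quiet, quiet, quiet, spike, quiet]
print(detect_steps(samples))  # [1, 5]
```

Production detectors would additionally use gyroscope outputs and frequency analysis to reject non-step motions.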
- the computing device estimates changes in orientation of the user.
- the computing device projects the AGA frame to a horizontal plane to provide a horizontally projected AGA (HPAGA) frame.
- An X-axis of the HPAGA frame may be used to represent the orientation of the user.
- the HPAGA frame corresponds to the AGA frame when the Z-axis of the AGA frame is aligned with the Z-axis of the world frame.
- FIG. 6 illustrates such an HPAGA frame 612 .
- the computing device determines a rotation from the HPAGA frame to the AGA frame. This rotation is identified as R HPAGA AGA .
- the computing device may compute R HPAGA AGA as the shortest rotation that transforms the Z-axis of the world frame into the Z-axis of the AGA frame. This rotation may vary over time, as the AGA frame moves over time.
- the computing device at block 510 also computes a rotation from the world frame to the HPAGA frame.
- This rotation may be identified as R world HPAGA , and may be computed by the chain rule, such as in Equation 4 using the AGA frame and the device frame:
- R world HPAGA = R AGA HPAGA * R device AGA * R world device (4)
- the computing device may compute the rotations in Equation 4 once each of the respective rotations are determined.
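Rotations expressed as 3x3 matrices compose by matrix multiplication, so a chain-rule product such as Equation 4 can be sketched as:

```python
def mat_mul(a, b):
    """Compose two 3x3 rotation matrices: (a @ b) applies b first, then a,
    mirroring the right-to-left chain-rule composition in Equation 4."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# A 90-degree yaw about Z composed with its inverse returns the identity,
# a quick sanity check on the chain-rule composition.
yaw90 = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
yaw_minus90 = [[0, 1, 0], [-1, 0, 0], [0, 0, 1]]
print(mat_mul(yaw90, yaw_minus90))  # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

The full Equation 4 product would chain three such multiplications in the stated order.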
- the computing device may then use the rotation R world HPAGA to determine the change in orientation of the user from one detected step to another. In one example, the computing device determines this change in orientation of the user, or delta_theta, by comparing a user_yaw value between different steps. The computing device may compute the user_yaw value according to Equation 5:
- Equation 5 includes a constant value N, because the orientation of the user with respect to the device is not known.
- the computing device determines delta_theta by comparing the user_yaw between different steps, at which time the constant values N cancel out. For instance, the computing device may determine delta_theta between a first step s 1 and a second step s 2 using Equation 6:
- delta_theta = user_yaw_step(s2) − user_yaw_step(s1) (6)
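Equation 6 can be sketched as follows; the wrap of the difference into (−pi, pi] is an implementation detail assumed here, and the constant offset N cancels in the subtraction as described:

```python
import math

def delta_theta(user_yaw_s1, user_yaw_s2):
    """Change in user heading between two steps (Equation 6), wrapped to
    (-pi, pi] so a step across the +/-pi boundary is not misread as a
    near-full turn. Any constant offset N in user_yaw cancels here."""
    d = user_yaw_s2 - user_yaw_s1
    while d <= -math.pi:
        d += 2 * math.pi
    while d > math.pi:
        d -= 2 * math.pi
    return d

N = 1.234  # unknown constant offset between user and device; it cancels
print(round(delta_theta(0.1 + N, 0.3 + N), 6))   # 0.2
print(round(delta_theta(3.1, -3.1), 6))          # 0.083185 (wraps, not -6.2)
```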
- the computing device may smooth the orientation estimate determined at block 510 to remove oscillations in orientation due to the human body zigzagging when each step is taken.
- the computing device may perform this smoothing by window averaging the orientation (user_yaw) and/or orientation changes (delta_theta) with a window size of two steps, for example.
- This example window averaging modifies the orientation estimate for each step to be the average between a present step and a previous step.
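The two-step window averaging might be sketched as follows (a plain arithmetic mean, which assumes the headings stay away from the +/−pi wrap boundary; near that boundary, the wrapped deltas would be averaged instead):

```python
def smooth_two_step(yaws):
    """Two-step window average described above: each step's heading becomes
    the mean of the present step's and the previous step's heading."""
    if not yaws:
        return []
    out = [yaws[0]]
    for prev, cur in zip(yaws, yaws[1:]):
        out.append((prev + cur) / 2.0)
    return out

# Zigzag headings from body sway while walking a straight 0.1-rad course.
print(smooth_two_step([0.2, 0.0, 0.2, 0.0]))  # [0.2, 0.1, 0.1, 0.1]
```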
- the computing device may compute the rotation from the device frame to the HPAGA frame.
- the computing device may perform this computation each time the orientation of the device is requested (such as when magnetometer measurements are generated).
- the computing device may use the chain rule of Equation 7 to determine R device HPAGA .
- R device HPAGA = R world HPAGA * R device world (7)
- this rotation encapsulates small movements in the device's orientation that are not due to changes in the heading of the user.
- the computing device may then perform a loose coupling or SLAM optimization, using as measurements, one or more of: pedestrian dead reckoning that is computed using data related to steps and changes in orientation of the user (e.g., user_yaw or delta theta values, which may be smoothed at block 512 or not), as computed herein; WIFI signals in conjunction with WIFI environment information (such as WIFI RSSI fingerprint maps, or where WIFI access points are located and the associated signal strength of the access points, or information measured by devices from other users in the area); Bluetooth low energy (BLE) or other radio-frequency signals, used similarly to Wi-Fi signals; and/or magnetic field measurements.
- the computing device uses one or more of these measurements (and/or perhaps others) in a SLAM optimization to determine a position and location of the device and user. In one example, at block 516 , the computing device also performs the optimization to help refine estimates of different rotations or parameters. In the case where a map of the environment is known, SLAM can be replaced by a localization-only algorithm.
- the computing device may use the determined location to present a map on a display, and show a device location on the map, or otherwise generate information and instructions for providing such a display.
- the location of a device may also be used in location-based services or computer applications, such as to narrow an Internet search to an area that is nearby the device location, to personalize “local” news and/or weather updates or information sent to the client device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, to direct emergency services to help locate the client device in case of emergency, and the like.
- FIG. 7 provides a block flow diagram that summarizes the frames, rotations, parameters, estimations, and/or computations described above in relation to FIG. 5 .
- a block 702 represents the computation of the rotation R device world from the device frame 606 to the world frame 600 .
- a block 704 represents the computation of the rotation R AGA device from the AGA frame 610 to the device frame 606 . As shown in FIG. 7 , the computation at block 704 may use the rotation R device world to compute the rotation R AGA device .
- FIG. 7 includes a block 706 that represents the computation of the rotation R HPAGA AGA from the HPAGA frame 612 to the AGA frame 610 .
- FIG. 7 shows that computation at block 706 may use the rotation R AGA device to compute the rotation R HPAGA AGA .
- FIG. 7 also includes a block 708 that represents the computation of the rotation R world HPAGA from the world frame 600 to the HPAGA frame 612 . As shown in FIG. 7 , the computation at block 708 may use the rotations R device world , R AGA device , and R HPAGA AGA to compute the rotation R world HPAGA using the chain rule.
- FIG. 7 includes a block 710 that represents the computation of changes in the orientation of a user (delta_theta), which also relates to a rotation R world user from the world frame 600 to the user frame 608 , and to a rotation R world HPAGA .
- FIG. 8 is a block diagram of an example method 800 that may be used to perform the optimization 516 of FIG. 5 .
- the method 800 may be implemented similarly as described above with respect to the method 500 , including performing the various blocks in a different order and/or in parallel.
- the computing device retrieves or otherwise accesses a rotation from HPAGA to the device, or R HPAGA device .
- the computing device may compute the rotation R HPAGA device using the HPAGA frame and the device frame discussed above.
- the rotation R HPAGA device is computed as the inverse of the rotation R device HPAGA , which may have been computed at block 516 of the method 500 .
- the computing device defines a parameter user_heading_in_HPAGA that represents a yaw difference between the HPAGA frame and the user frame.
- R user HPAGA is a rotation about the Z-axis by an angle of -user_heading_in_HPAGA (a rotation by the negative angle).
- the computing device may initially estimate the parameter user_heading_in_HPAGA, and later refine this parameter during optimization.
- the computing device defines a parameter HPAGA_yaw_in_world that represents a yaw difference between the HPAGA frame and the world frame.
- R HPAGA world is a rotation about the Z-axis by an angle of -HPAGA_yaw_in_world (a rotation by the negative angle).
- the computing device may initially estimate the parameter HPAGA_yaw_in_world and may refine this parameter during the optimization.
- the parameter HPAGA_yaw_in_world varies as function of time, because the HPAGA frame moves relative to the world frame.
- the yaw of the rotation between two different frames can be estimated as a parameter.
- the parameter can represent the rotation between the user and the world instead of the rotation between the HPAGA frame and the world frame. Because of the chain rule, this is mathematically equivalent.
- the computing device determines pedestrian dead reckoning data of the user of the client device.
- the computing device may determine or compute the pedestrian dead reckoning, which may be identified as R user world , according to the following Equation 8 and the different rotations discussed above:
- R user world = R device world * R AGA device * R HPAGA AGA * R user HPAGA (8)
- Equation 8 is also equivalent to Equation 9:
- the computing device may further optimize the computation of the pedestrian dead reckoning R user world by using the delta_theta values discussed above as further constraints to the computation.
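A pedestrian dead reckoning track can be sketched by integrating the per-step headings into positions; the fixed step length used below is an assumed simplification (real systems estimate it per user or per step):

```python
import math

def pedestrian_dead_reckoning(start, headings, step_length=0.7):
    """Integrate per-step headings (radians, 0 = +X east) into a 2-D track,
    one step-length displacement per detected step."""
    x, y = start
    track = [(x, y)]
    for theta in headings:
        x += step_length * math.cos(theta)
        y += step_length * math.sin(theta)
        track.append((round(x, 6), round(y, 6)))
    return track

# Four steps east, then four steps north, from the origin.
track = pedestrian_dead_reckoning((0.0, 0.0), [0.0] * 4 + [math.pi / 2] * 4)
print(track[-1])  # (2.8, 2.8)
```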
- the computing device processes measured magnetic field data to determine the magnetic field in the world frame.
- the computing device processes the magnetic field data by rotating the data into the world frame according to a rotation R device world computed by Equation 10:
- R device world = R user world * R HPAGA user * R device HPAGA (10)
- Equation 10 is also equivalent to Equation 11:
- R device world = R HPAGA world * R device HPAGA (11)
- the rotation R device world is used to rotate the measured magnetic field in order to use the 3-D components of the magnetic field data.
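Rotating a measured field vector into the world frame is a matrix-vector product, sketched here with illustrative field values:

```python
def rotate(R, v):
    """Apply a 3x3 rotation matrix R (device -> world) to a 3-D magnetometer
    reading, yielding the field components in the world frame."""
    return tuple(sum(R[i][j] * v[j] for j in range(3)) for i in range(3))

# A device yawed 90 degrees about Z measures the horizontal field along its
# own +X axis; rotating into the world frame shows it along +Y (north).
R_device_world = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
field_device = (22.0, 0.0, 40.0)   # illustrative microtesla values
print(rotate(R_device_world, field_device))  # (0.0, 22.0, 40.0)
```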
- R device HPAGA may be obtained from the chain rule of Equation 12:
- the computing device may perform an optimization using SLAM algorithms, such as GraphSLAM or FastSLAM, and/or in online localizations using other fusion algorithms, such as Kalman filters.
- the computing device uses the pedestrian dead reckoning data to fuse available GPS data, WIFI data, and/or Bluetooth scan data. The computing device may then use the resulting fused data as additional constraints in the optimization to identify different parameters, such as a location and map of the client device and other estimated rotations of the device.
- FIG. 9 provides a block flow diagram that summarizes the frames, rotations, parameters, computations, and/or estimations described above in relation to FIG. 8 .
- a block 902 represents the computation of the rotation R device HPAGA , which may have been computed before the optimization at blocks 516 or 812 .
- a block 904 represents the estimation or definition of the parameter user_heading_in_HPAGA, which represents the difference in yaw between the HPAGA frame 612 and the user frame 608 .
- a block 906 represents the estimation or definition of the parameter HPAGA_yaw_in_world, which represents the difference in yaw between the HPAGA frame 612 and the world frame 600 .
- a block 908 represents the computation of a rotation R user world from the user frame 608 to the world frame 600 .
- the rotation R user world may be obtained using a chain rule calculation through HPAGA rotations.
- the rotation R user world may be computed using Equations 8 or 9 above.
- the computation at block 908 may utilize delta-theta values that correspond to user_heading_in_HPAGA and/or HPAGA_yaw_in_world as constraints to help estimate the rotation R user world .
- the rotation R user world may then be used as the user orientation with respect to the world to determine pedestrian dead reckoning data.
- this pedestrian dead reckoning data may be used to fuse other data or constraints, such as data from GPS, WIFI, and/or Bluetooth sensors.
- a block 910 represents the computation of a rotation R world device from the world frame 600 to the device frame 606 .
- the rotation R world device may be obtained using a chain rule calculation through HPAGA rotations.
- the rotation R world device may be computed using Equations 10 or 11 above.
- the computation at block 910 may utilize delta-theta values that correspond to HPAGA_yaw_in_world and a rotation R device HPAGA as constraints to help estimate the rotation.
- the rotation R world device may then be used to fuse magnetometer data associated with a client device.
- FIG. 10 is a block diagram that conceptually illustrates an example system 1000 for determining locations. Any of the blocks in the system 1000 may be modules, processors, or other devices, or may take the form of instructions executable by processors to perform the associated function. The system 1000 may utilize the methods and processes described herein to perform one or more of the following calculations and optimizations.
- logs of data 1002 are received from devices.
- the logs of data may include GPS, RSSI, magnetometer, accelerometer, and gyroscope data with associated timestamps as collected by respective devices.
- the logs of data for which a dead reckoning and GPS location agree may be provided to a non-linear least squares optimizer 1004 , for example. Logs of data for which a dead reckoning and GPS location do not agree may be rejected as erroneous data or data with too much noise.
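The agreement test between a dead-reckoning path and a GPS path might be sketched as an RMS-distance check (the 10 m threshold is an assumption):

```python
import math

def paths_agree(dr_path, gps_path, max_rms_m=10.0):
    """Accept a trace only if the RMS distance between its dead-reckoning
    path and its GPS path is below a threshold; otherwise the trace is
    treated as erroneous or too noisy, as described above."""
    sq = [(ax - bx) ** 2 + (ay - by) ** 2
          for (ax, ay), (bx, by) in zip(dr_path, gps_path)]
    rms = math.sqrt(sum(sq) / len(sq))
    return rms < max_rms_m

dr = [(0, 0), (1, 0), (2, 0)]
print(paths_agree(dr, [(0, 1), (1, 1), (2, 2)]))      # True: small offsets
print(paths_agree(dr, [(0, 50), (1, 50), (2, 50)]))   # False: 50 m apart
```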
- the non-linear least squares optimizer 1004 may optimize paths using GPS and dead reckoning, as shown at block 1006 and as described above, using, for example, a Ceres optimizer, and then build optimal WIFI maps while keeping the paths constant, as shown at block 1008 .
- the non-linear least squares optimizer 1004 may further jointly optimize paths and WIFI maps using a SLAM optimization and output a WIFI map, as shown at block 1010 .
- Traces with unreliable GPS data may be received at a hierarchical Viterbi processor 1014 to perform a global search for most likely paths given associated WIFI scans in the traces, as shown at block 1016 .
- a path of a user trace may be determined using the Viterbi algorithm (e.g., most likely path through a graph) based on one or more of motion probabilities from dead reckoning, transition probabilities from floorplan, or emission probabilities from a WIFI model.
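A minimal Viterbi decoder over a two-cell graph illustrates combining transition (motion) and emission (WIFI observation) probabilities to find the most likely path; all states, observations, and probabilities below are illustrative:

```python
def viterbi_path(states, start_p, trans_p, emit_p, observations):
    """Most likely state sequence through a graph (Viterbi algorithm),
    combining transition probabilities with emission probabilities,
    analogous to the WIFI-based path search described above."""
    best = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    back = []
    for obs in observations[1:]:
        nxt, ptr = {}, {}
        for s in states:
            prev, p = max(((r, best[r] * trans_p[r][s]) for r in states),
                          key=lambda t: t[1])
            nxt[s], ptr[s] = p * emit_p[s][obs], prev
        best, back = nxt, back + [ptr]
    last = max(states, key=lambda s: best[s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Two hallway cells; each WIFI scan strongly indicates one cell.
states = ["cell_a", "cell_b"]
start = {"cell_a": 0.5, "cell_b": 0.5}
trans = {"cell_a": {"cell_a": 0.7, "cell_b": 0.3},
         "cell_b": {"cell_a": 0.3, "cell_b": 0.7}}
emit = {"cell_a": {"strong_ap1": 0.9, "strong_ap2": 0.1},
        "cell_b": {"strong_ap1": 0.1, "strong_ap2": 0.9}}
scans = ["strong_ap1", "strong_ap1", "strong_ap2"]
print(viterbi_path(states, start, trans, emit, scans))
# ['cell_a', 'cell_a', 'cell_b']
```

A hierarchical variant, as in block 1016, would run such a search at coarse resolution first and refine the most likely candidates.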
- the non-linear least squares optimizer 1004 may receive the output of the global search and align the dead reckoning to a Viterbi path, as shown at block 1018 , and jointly optimize all paths and WIFI maps using a SLAM optimization, as shown at block 1020 .
- the SLAM optimization is performed iteratively on growing subsets of states and constraints to determine a location of a user when data was collected based on all data collected.
- a first iteration uses subsets so that a function minimized is convex.
- Running SLAM on these subsets gives an estimate of the state subset. This estimate is used to determine the next subsets to include and the initialization to use for the next iteration. Thus, more constraints are added at each iteration, using the previous determination as the starting point and best prediction.
- the system 1000 defines a process that selects states, optimizes the states using a non-linear least squares solver, and runs SLAM algorithms to determine how to initialize the state for the next optimization iteration.
- Although examples are described as determining a WIFI signal strength map, similar or same functions may be performed to determine localization of passively collected traces for creation of other types of maps, such as magnetometer maps.
Abstract
A method implemented by one or more processors may include determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step. In one example, computing the change in orientation includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, determining a rotation between the world frame and the HPAGA frame, and determining the change in orientation by using the rotation between the world frame and the HPAGA frame. The method may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
Description
- The present application claims priority to U.S. patent application Ser. No. 14/815,500, filed Jul. 31, 2015, the contents of which are herein incorporated by reference in their entirety. U.S. patent application Ser. No. 14/815,500 claims priority to U.S. provisional patent application No. 62/032,522, filed on Aug. 1, 2014, the contents of which are herein incorporated by reference in their entirety.
- Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
- A location of a computing device can be determined using different techniques, such as techniques based on Global Positioning System (GPS) data or on data associated with a wireless access point (e.g., a cellular base station or an 802.11 access point). In one example, a computing device may receive a GPS signal and responsively determine its position on the face of the Earth (e.g., an absolute location). In another example, a computing device may receive a signal from either a cellular base station or an 802.11 access point. The cellular base station or 802.11 access point may have a known or estimated exact location. Based on the location of either the cellular base station or the 802.11 access point, the computing device can calculate its position.
- Within some instances, a localization of a computing device may occur via use of data from multiple different networks. Many location-based services can be provided to a computing device based on determining the location of the mobile computing device.
- In one example, a method performed, at least in part, by one or more processors of a computing device, includes determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step. In this example, computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame. Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame. Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame. The example method may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
- Another example is directed to a non-transitory computer-readable medium having stored therein instructions, that when executed by one or more processors of a computing device, cause the computing device to perform various functions. In one example, the functions include determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step. In this example, computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame. Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame. Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame. The functions may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
- A further example provides a system that includes at least one processor and a computer-readable medium. The computer-readable medium is configured to store instructions, that when executed by the at least one processor, cause the system to perform functions. In one example, the functions include determining a rotation between a client device frame and a world frame, determining a rotation between an average gravity aligned (AGA) frame of the client device and the client device frame, performing step detection of the client device, and determining a change in orientation from a first detected step to a second detected step. In this example, computing the change in orientation further includes determining a rotation between a horizontally projected AGA (HPAGA) frame and the AGA frame, and determining a rotation between the world frame and the HPAGA frame. Determining the rotation between the world frame and the HPAGA frame may use one or more of the rotation between the client device frame and the world frame, the rotation between the AGA frame and the client device frame, or the rotation between the HPAGA frame and the AGA frame. Computing the change in orientation may also include determining the change in orientation by using the rotation between the world frame and the HPAGA frame. The functions may also include determining, using the computed change in orientation, pedestrian dead reckoning data of the client device over a time period, and determining an output location estimate of the client device using the pedestrian dead reckoning data.
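The chain of rotations recited in these examples composes into a single rotation between the world frame and the HPAGA frame, from which a change in orientation between steps can be read. The sketch below illustrates that composition with yaw-only rotation matrices; the frame angles are hypothetical, and a real implementation would use full 3-D rotations obtained from sensor fusion:

```python
import math

def rot_z(yaw):
    """3x3 rotation about the vertical (gravity) axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def yaw_of(r):
    """Heading encoded by a rotation matrix (direction of its first column)."""
    return math.atan2(r[1][0], r[0][0])

# Hypothetical per-step rotations (radians): world->device, device->AGA, and
# AGA->HPAGA, chained into a single world->HPAGA rotation.
def world_to_hpaga(yaw_world_device, yaw_device_aga, yaw_aga_hpaga):
    r = matmul(rot_z(yaw_world_device), rot_z(yaw_device_aga))
    return matmul(r, rot_z(yaw_aga_hpaga))

step1 = world_to_hpaga(0.10, 0.02, -0.01)
step2 = world_to_hpaga(0.40, 0.02, -0.01)
change = yaw_of(step2) - yaw_of(step1)  # change in orientation between steps
print(round(change, 3))                 # 0.3 rad turn between the two steps
```

Because the device-to-AGA and AGA-to-HPAGA terms cancel between steps when the carrying position is steady, the heading change tracks the user's turn rather than how the device is held.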
- These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying figures.
-
FIG. 1 illustrates a block diagram of an example communication system according to an embodiment of the present disclosure. -
FIG. 2 illustrates a block diagram of an example computing device according to an embodiment of the present disclosure. -
FIG. 3 illustrates a block diagram of an example computing device according to another embodiment of the present disclosure. -
FIG. 4 is a flow chart of an example method for determining a location and/or movement of a device. -
FIG. 5 is a flow chart of another example method for determining and using one or more orientations associated with a device. -
FIG. 6 illustrates different reference frames for determining one or more orientations associated with a device. -
FIG. 7 is a block flow diagram that illustrates relationships between frames and rotations, in accordance with an embodiment of the present disclosure. -
FIG. 8 is a flow chart of another example method for determining and using one or more orientations associated with a client device. -
FIG. 9 is a block flow diagram that illustrates relationships between frames, rotations, and/or parameters in accordance with an embodiment of the present disclosure. -
FIG. 10 is a block diagram that conceptually illustrates an example system for determining location estimates of a device, and optionally, maps of observed data received from the device. - The following detailed description describes various features and functions of the disclosed systems and methods with reference to the accompanying figures. In the figures, similar symbols identify similar components, unless context dictates otherwise. The illustrative system and method embodiments described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
- Within examples, a number of logs of data or traces of data are received from one or more devices. The data may include a variety of information collected by one or more sensors of the devices. These sensors may include a GPS, accelerometer, gyroscope, inertial measurement unit (IMU), barometer, magnetometer, and WIFI signal strength sensor, as just some examples. The present disclosure relates to techniques for processing orientation data in the traces of data, such as from an accelerometer, gyroscope, and/or magnetometer, to determine an orientation of the device and/or a user of the device. The device and/or user orientation may be determined with respect to a frame of reference relative to the earth or a world frame.
- The present disclosure also relates to using the device/user orientation to determine a walking direction or pedestrian dead reckoning associated with the device. In one aspect, a computing device, such as a cellular phone, may use pedestrian dead reckoning calculations to localize a position of the device and to build a map of a location of the device and of a user of the device. Generally, dead reckoning calculations link positions of devices between two steps. For instance, a first step may be at time t, and a second step may be at t+1. The second step is on average likely to be around 3 ft in front of the first step, and an angle of turn or change in orientation between the first and second steps can be determined using, for example, gyroscope data.
- An example dead reckoning determination may be performed to determine an estimation of a current position of the computing device based on a previous position of the computing device, an estimated speed over an elapsed time, and an orientation or direction of travel of the computing device. Within examples, information indicating a previous position may be received from a server that calculates or determines the information due to communication with a computing device, or from sensors of the computing device including a GPS sensor. The previous position may also be derived or calculated from a number of data points such as GPS location determinations, or WIFI scans and associated WIFI mappings. The estimated speed can also be received from a server, or derived or calculated from position determinations over an elapsed time or based on other data over the elapsed time including outputs of a pedometer, for example. Using a known or estimated distance traveled (as derived or calculated from outputs of a pedometer, derived from outputs of an accelerometer inferring a step has been taken, or from other sensor data), a speed can be determined based on the elapsed time. The orientation or direction of travel of the computing device may be determined from data received from a server, or from sensors on-board the computing device such as a magnetometer or compass, for example. Any available information may be used to infer an orientation or a direction of travel including a fusion of accelerometer, gyroscope, and optionally magnetometer data, for example. In still other examples, other available information can be used to provide further estimates (directly or indirectly) as to direction of travel, including WIFI scans received in traces that may give information as to a position and heading of a device and/or user.
- The dead reckoning calculation can be performed to determine an estimation of the current position of the computing device. As an example, an accelerometer of the computing device can be used as a pedometer and a gyroscope or magnetometer as an orientation or compass heading provider. Each step of a user of the computing device causes a position to move forward a fixed distance in an orientation direction measured by the compass. Accuracy may be limited by precision of the sensors, magnetic disturbances inside structures of the computing device, and unknown variables such as carrying position of the computing device and stride length of the user. However, the estimate of the current position can be determined in this manner. The present disclosure provides additional techniques for determining orientation data.
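The pedometer-plus-heading scheme described above can be sketched as follows; the fixed 3 ft stride and the per-step headings are illustrative placeholders (a real system would estimate stride length per user and fuse gyroscope, accelerometer, and magnetometer data for heading):

```python
import math

STRIDE_FT = 3.0  # average step length from the example above; an assumption,
                 # not a measured value

def dead_reckon(start, headings_rad):
    """Advance one stride per detected step along the measured heading."""
    x, y = start
    path = [(x, y)]
    for h in headings_rad:  # one heading per detected step
        x += STRIDE_FT * math.cos(h)
        y += STRIDE_FT * math.sin(h)
        path.append((x, y))
    return path

# Four steps: two heading east (0 rad), then two heading north (pi/2 rad).
path = dead_reckon((0.0, 0.0), [0.0, 0.0, math.pi / 2, math.pi / 2])
print([(round(px, 1), round(py, 1)) for px, py in path])
# [(0.0, 0.0), (3.0, 0.0), (6.0, 0.0), (6.0, 3.0), (6.0, 6.0)]
```

Because each update compounds on the last, small heading errors grow without bound, which is why the disclosure anchors dead reckoning with GPS, WIFI, and magnetic-field constraints.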
- Further, the present disclosure relates to fusing or linking the dead reckoning calculations with other constraints, such as Wi-Fi scan data, GPS readings, and/or magnetic field data, to determine and refine a map and position of a device and/or user of the device. Generally, a computing device, such as a mobile phone or a server device, can perform fusion or linking of various constraints, such as dead reckoning calculations and magnetic field information in a tight-coupling manner or loose-coupling manner. Tight-coupling calculations may provide relatively high accuracy, but may also be relatively computationally-intensive, and may be prone to convergence to an incorrect solution. Loose-coupling calculations are generally less computationally-intensive than tight-coupling calculations, and may be more robust but may provide lower accuracy than tight-coupling calculations. In examples disclosed herein, the present disclosure adapts loose-coupling calculations to estimate the orientation of the device and/or the user of the device with respect to the world frame.
- The device orientation may also be used to rotate or otherwise process magnetic field information according to a frame of reference of the device with respect to the world. The computing device may use the processed magnetic field information, and perhaps with other constraints, to determine a location of the device and/or to build a map of an area around the device.
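One concrete way to form such a magnetic-field constraint is to score, for each candidate map cell, how well a measured field strength matches the mapped strength. The grid, field values, and Gaussian noise model below are hypothetical illustrations, not the disclosed implementation:

```python
import math

def magnetic_probability_map(field_map, measured_ut, sigma_ut=2.0):
    """Gaussian likelihood of a field-strength measurement for each grid cell."""
    probs = {}
    for cell, mapped_ut in field_map.items():
        z = (measured_ut - mapped_ut) / sigma_ut
        probs[cell] = math.exp(-0.5 * z * z)      # unnormalized likelihood
    total = sum(probs.values())
    return {cell: p / total for cell, p in probs.items()}

# Hypothetical mapped field strengths (microtesla) near a steel pillar.
field_map = {(0, 0): 48.0, (0, 1): 51.0, (1, 0): 55.0, (1, 1): 60.0}
probs = magnetic_probability_map(field_map, measured_ut=55.5)
best = max(probs, key=probs.get)
print(best)  # (1, 0): the cell whose mapped strength best matches 55.5 uT
```

The resulting per-cell probabilities can then serve as emission probabilities in a path search, or as one constraint among several in a joint optimization.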
- Further, the computing device may present a map on a display, and show the device location on the map, or otherwise generate information and instructions for providing such a display. The location of the device may also be used in location-based services, such as to narrow an Internet search to an area that is nearby the client device location, to personalize “local” news and/or weather updates or information sent to the device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, and the like.
- Referring now to the figures,
FIG. 1 illustrates an example communication system 100 in which an example method may be implemented. In FIG. 1, a client device 102 may communicate with a server 104 via one or more wired and/or wireless interfaces. The client device 102 and the server 104 may communicate within a network. Alternatively, the client device 102 and the server 104 may each reside within a respective network. - The client device 102 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, a tablet computing device, and the like, that is configured to transmit
data 106 to and/or receive data 108 from the server 104 in accordance with the method and functions described herein. The client device 102 may include a user interface, a communication interface, a processor, and data storage comprising instructions executable by the processor for carrying out one or more functions relating to the data sent to, and/or received by, the server 104. The user interface may include buttons, a touchscreen, a microphone, and/or any other elements for receiving inputs, as well as a speaker, one or more displays, and/or any other elements for communicating outputs. - The
server 104 may be any entity or computing device arranged to carry out the method and computing device functions described herein. Further, the server 104 may be configured to send data 108 to and/or receive data 106 from the client device 102. The server 104 may include a location module 110 which may be configured to process the data 106 received from the client device 102 to determine locations (present and historical) associated with the client device 102. - The
data 106 received by the server 104 from the client device 102 may take various forms. For example, the client device 102 may provide information indicative of a location of the client device 102, movement of the client device 102, or inputs from a user of the client device 102. The server 104 may then process the data 106 to identify a location history that matches the received data. - The
data 108 sent to the client device 102 from the server 104 may take various forms. For example, the server 104 may send to the client device 102 an indication of location, updated location history information, or information based on the locations of the device. -
FIG. 2 illustrates a schematic drawing of an example device 200. In FIG. 2, the computing device takes the form of a client device 200. In some examples, some components illustrated in FIG. 2 may be distributed across multiple computing devices. However, for the sake of example, the components are shown and described as part of one example client device 200. The client device 200 may be or include a mobile device, desktop computer, email/messaging device, tablet computer, or similar device that may be configured to perform the functions described herein. - In some implementations, the
client device 200 may include a device platform (not shown), which may be configured as a multi-layered Linux platform. The device platform may include different applications and an application framework, as well as various kernels, libraries, and runtime entities. In other examples, other formats or systems may operate the client device 200 as well. - The
client device 200 may include an interface 202, a wireless communication component 204, a cellular radio communication component 206, a global positioning system (GPS) 208, one or more sensors 210, data storage 212, and a processor 214. Components illustrated in FIG. 2 may be linked or coupled together by a communication link or bus 216. The client device 200 may also include hardware to enable communication within the client device 200 and between the client device 200 and another computing device, such as the server 104 of FIG. 1. - In one example, the
interface 202 is configured to allow the client device 200 to communicate with another computing device, such as a server. Thus, the interface 202 may be configured to receive input data from one or more computing devices, and may also be configured to send output data to the one or more computing devices. In some examples, the interface 202 may also maintain and manage records of data received and sent by the client device 200. In other examples, records of data may be maintained and managed by other components of the client device 200. The interface 202 may also include a receiver and transmitter to receive and send data. In other examples, the interface 202 may also include a user interface, such as a keyboard, microphone, touchscreen, etc., to receive inputs as well. - The
wireless communication component 204 may be a communication interface that is configured to facilitate wireless data communication for the client device 200 according to one or more wireless communication standards. For example, the wireless communication component 204 may include a WIFI communication component that is configured to facilitate wireless data communication according to one or more IEEE 802.11 standards. As another example, the wireless communication component 204 may include a Bluetooth communication component that is configured to facilitate wireless data communication according to one or more Bluetooth standards. Other examples are also possible. - The
processor 214 may be configured to determine one or more geographical location estimates of the client device 200 using one or more location-determination components, such as the wireless communication component 204, the cellular radio communication component 206, and/or the GPS 208. For instance, the processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on a presence and/or location of one or more known wireless access points within a wireless range of the client device 200. In one example, the wireless communication component 204 may determine the identity of one or more wireless access points (e.g., a MAC address) and measure an intensity of signals received (e.g., received signal strength indication) from each of the one or more wireless access points. The received signal strength indication (RSSI) from each unique wireless access point may be used to determine a distance from each wireless access point. The distances may then be compared to a database that stores information regarding where each unique wireless access point is located. Based on the distance from each wireless access point, and the known location of each wireless access point, a location estimate of the client device 200 may be determined. - In another instance, the
processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on nearby cellular base stations. For example, the cellular radio communication component 206 may be configured to at least identify a cell from which the client device 200 is receiving, or last received, a signal from a cellular network. The cellular radio communication component 206 may also be configured to measure a round trip time (RTT) to a base station providing the signal, and combine this information with the identified cell to determine a location estimate. In another example, the cellular communication component 206 may be configured to use observed time difference of arrival (OTDOA) from three or more base stations to estimate the location of the client device 200. - In still another instance, the
processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on signals sent by GPS satellites above the Earth. For example, the GPS 208 may be configured to estimate a location of the mobile device by precisely timing signals sent by the GPS satellites. - In other examples, the
processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on Bluetooth wireless signals. The Bluetooth signals can be compared to a map of Bluetooth devices, and a measurement probability map in which a given Bluetooth wireless signal is estimated to be received can be determined. Within examples, Bluetooth devices may include static devices (e.g., Bluetooth Low Energy (BLE) beacons) that emit signals to nearby devices. Each Bluetooth device will have a range in which signals can be emitted, and the range can be used as a measurement probability map as the constraint for locations of the device. - In still other examples, the
processor 214 may use a location-determination algorithm to determine a location of the client device 200 based on magnetic field signals. For example, ambient magnetic fields are present in environments, and include disturbances or anomalies in the Earth's magnetic field caused by pillars, doors, elevators in hallways, or other objects that may be ferromagnetic in nature. A device may measure a magnetic field, and when such magnetic field measurements are present in the logs of data, the measurements can be compared to a map of magnetic field signal strength for a given location, and a measurement probability map in which a given magnetic field signal strength corresponds to a signal strength of the magnetic field signal can be determined and used as the constraint to determine a location of the client device. - In some examples, the
processor 214 may use a location-determination algorithm that combines location estimates determined by multiple location-determination components, such as a combination of the wireless communication component 204, the cellular radio component 206, and the GPS 208. - The
sensor 210 may include one or more sensors, or may represent one or more sensors included within the client device 200. Example sensors include an accelerometer, gyroscope, magnetometer, pedometer, barometer, light sensors, microphone, camera, or other location and/or context-aware sensors. - The
processor 214 may also use a location-determination algorithm that fuses data from the one or more sensors 210. For instance, the processor 214 may be configured to execute a dead reckoning algorithm that uses a log of sensor data as inputs to the dead reckoning algorithm to determine an estimated trajectory or pedestrian dead reckoning of the client device and associated user. - The
data storage 212 may store program logic 218 that can be accessed and executed by the processor 214. The data storage 212 may also store collected sensor data 220 that may include data collected by any of the wireless communication component 204, the cellular radio communication component 206, the GPS 208, and any of the sensors 210. - The
communication link 216 is illustrated as a wired connection; however, wireless connections may also be used. For example, the communication link 216 may be a wired serial bus such as a universal serial bus or a parallel bus, or a wireless connection using, e.g., short-range wireless radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), or cellular technology, among other possibilities. - The illustrated
client device 200 in FIG. 2 includes an additional processor 222. The processor 222 may be configured to control other aspects of the client device 200 including displays or outputs of the client device 200 (e.g., the processor 222 may be a GPU). Example methods described herein may be performed individually by components of the client device 200, or in combination by one or more of the components of the client device 200. In one instance, portions of the client device 200 may process data and provide an output internally in the client device 200 to the processor 222, for example. In other instances, portions of the client device 200 may process data and provide outputs externally to other computing devices. -
FIG. 3 illustrates a schematic drawing of another example computing device. In FIG. 3, the computing device takes the form of a server 300. In some examples, some components illustrated in FIG. 3 may be distributed across multiple servers. However, for the sake of example, the components are shown and described as part of one example server 300. The server 300 may be a computing device, cloud, or similar entity that may be configured to perform the functions described herein. - The
server 300 may include a communication interface 302, a location module 304, a processor 306, and data storage 308. All of the components illustrated in FIG. 3 may be linked or coupled together by a communication link or bus 310 (e.g., a wired or wireless link). The server 300 may also include hardware to enable communication within the server 300 and between the server 300 and another computing device (not shown). The hardware may include transmitters, receivers, and antennas, for example. - The
communication interface 302 may allow the server 300 to communicate with another device (not shown), such as a mobile phone, personal computer, etc. Thus, the communication interface 302 may be configured to receive input data from one or more computing devices, and may also be configured to send output data to the one or more computing devices. In some examples, the communication interface 302 may also maintain and manage records of data received and sent by the server 300. In other examples, records of data may be maintained and managed by other components of the server 300. - The
location module 304 may be configured to receive data from a client device and determine a geographic location of the client device. The determination may be based on outputs of an accelerometer, gyroscope, barometer, magnetometer, or other sensors of the client device, as well as based on location determinations of the client device. Further, the location module 304 may be configured to execute a dead reckoning algorithm. Using a log of sensor data as inputs to the dead reckoning algorithm, the location module 304 may determine an estimated trajectory or pedestrian dead reckoning of the client device and associated user. - The
location module 304 may also be configured to determine and store a history of sensor measurements of the client device for later reprocessing based on updated data pertaining to networks or information used to determine the locations. - The
data storage 308 may store program logic 312 that can be accessed and executed by the processor 306. The data storage 308 may also include a location database 314 that can be accessed by the processor 306 as well, for example, to retrieve information regarding wireless access points, magnetic field data, orientation data, locations of satellites in a GPS network, floor plans of a building, etc., or any other type of information useful for determining a location of a client device. - The server is illustrated with a
second processor 316, which may be an application-specific processor for input/output functionality. In other examples, functions of the processor 306 and the processor 316 may be combined into one component. - Within examples, measurements collected from various sensors of a device (such as WIFI components, GPS sensors, barometers, and inertial sensors) can be combined with information from external databases (such as known locations of WIFI access points or building floor plans) to estimate a location or movement of the device in real-time. Recording the real-time location estimate at all times (or at intervals/increments of time) may also produce a location history.
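The combination of sensor measurements with known WIFI access point locations can be sketched as RSSI ranging followed by least-squares trilateration. The log-distance path-loss parameters and access point positions below are hypothetical, and real RSSI ranging is far noisier than this idealized example:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(aps, distances):
    """Least-squares position from >= 3 known AP locations and ranges.
    Linearizes the circle equations against the first AP."""
    (x0, y0), d0 = aps[0], distances[0]
    a_rows, b = [], []
    for (xi, yi), di in zip(aps[1:], distances[1:]):
        a_rows.append((2 * (xi - x0), 2 * (yi - y0)))
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations A^T A p = A^T b by Cramer's rule.
    ata = [[sum(r[i] * r[j] for r in a_rows) for j in range(2)] for i in range(2)]
    atb = [sum(r[i] * bi for r, bi in zip(a_rows, b)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    y = (atb[1] * ata[0][0] - atb[0] * ata[1][0]) / det
    return x, y

# Three APs at known locations; RSSI values consistent with a device at (5, 5).
aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rssi = [-40.0 - 20.0 * math.log10(math.hypot(5 - x, 5 - y)) for x, y in aps]
dists = [rssi_to_distance(r) for r in rssi]
print(tuple(round(v, 2) for v in trilaterate(aps, dists)))  # (5.0, 5.0)
```

In practice the transmit power and path-loss exponent vary per access point and environment, which is one reason surveyed signal-strength maps are preferred over pure range models.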
-
FIG. 4 is a flow diagram illustrating an example method for determining a location or movement of a device. Initially, computing device(s) 400, operated by users 402 or surveyors 404, may traverse areas in an environment and output traces to a model builder 406. A device operated by a user 402 may output traces passively (e.g., the device may be configured to output the trace data with no additional user input), including raw data output by sensors of the device like WIFI scans, GPS data, accelerometer data, gyroscope data, barometer readings, magnetometer data, etc. Each trace may be associated with a time the data was collected, and thus, for traces that include GPS data, other data in the traces also has location-specific references. A device operated by a surveyor 404 may have location-specific references for all traces, whether due to associated GPS data or manual input of location information. - The
model builder 406 may be a module on a computing device or server, and may be configured to generate a model of the environment based on the received traces. The model builder 406 may include a trace localizer and a map builder. The model builder 406 may access reference data or information, such as magnetic field signal strength data at specific locations in the environment, or other landmark data of the environment, such as strength of signal (RSSI) for WIFI access points. The model builder 406 may be configured to generate a map or path of the device based on the traces. In one example, the model builder 406 may utilize GPS data to determine locations of the device over time, utilize dead reckoning (based on accelerometer and gyroscope outputs) to project a path, utilize elevational data (such as GPS elevational data and barometer readings), and optimize the path by jointly combining each. The model builder 406 may further optimize the path by matching magnetic field data to reference magnetic field maps, to align a path that most likely resembles the path that the device traversed through the environment. - A
location provider 408 may access a model output by the model builder 406 to determine locations of other device(s) 410 based on provided passive traces as well. Within examples, the location provider 408 may return a location of the device or an estimation of movement of the device to the device 410 based on data received in the traces. The computing device may use the determined locations to present a map on a display of the device, for example, and show a device location on the map, or otherwise generate information and instructions for providing such a display. The location of a device may also be used in location-based services, such as to narrow an Internet search to an area that is nearby the device location, to personalize "local" news and/or weather updates or information sent to the device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, and the like. - Traces received from devices may include a variety of measurements from multiple different sensors, and may include a variety of measurements collected over time or at various locations. A trace may refer to a sensor log or a collection of data output from sensors on the device over some time period and collected over a number of locations. The sensors that output data may be selected, or data to be included within the sensor log may also be selected. In some examples, a trace of data may include all data collected by a device (using a number of sensors) over a given time frame (e.g., about 5 seconds, or perhaps about 5 minutes, or any ranges therein or longer). Measurements in a trace or from trace to trace may be considered statistically independent. However, in instances in which the measurements are collected from positions/locations in close proximity or collected close in time, the measurements may have correlations.
- The traces or logs of data may be used to build a magnetic field strength map of the number of locations aligned to latitude and longitude or position coordinates. Estimates of magnetic field strengths can be made based on known locations of where the magnetic field scans occurred, and the reverse is also true. When both are initially unknown, a simultaneous localization and mapping (SLAM) process can be performed to solve for both at the same time using the received logs of data. If one of a location of a magnetic field anomaly or locations of magnetic field scans is known, then the known data can be held constant while optimizing the other. The received logs of data can be used to determine relative paths traversed by the devices using dead reckoning, which provides estimates of access point (AP) locations and trajectories of the devices relative to each other, and such relative estimates can be aligned with more absolute positions using measurements from GPS. However, while GPS generally provides accurate latitude and longitude measurements, it does so only in certain locations (mostly outdoors).
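The hold-one-constant alternation described above can be illustrated with a deliberately simplified one-dimensional sketch. Here a single scalar field offset stands in for the map and a list of scalar positions stands in for the path; the function name and values are hypothetical and are not the embodiments' actual optimizer:

```python
def alternate_map_and_path(measured, odometry, iters=20):
    """Toy alternation for the case where map and locations are both unknown.

    The "map" is one scalar offset a; the "path" is a list of 1-D
    positions x, with measurements satisfying m_i ~ a + x_i and odometry
    giving dead-reckoned increments between successive positions.
    """
    n = len(measured)
    x = [0.0]
    for d in odometry:
        x.append(x[-1] + d)          # dead-reckoning initialization of the path
    a = 0.0
    for _ in range(iters):
        # Hold the path constant and fit the map value.
        a = sum(m - xi for m, xi in zip(measured, x)) / n
        # Hold the map constant and refit each position, blending the
        # measurement-implied position with the current estimate.
        x = [(m - a + xi) / 2.0 for m, xi in zip(measured, x)]
    return a, x
```

With consistent data the alternation reaches a fixed point immediately; with noisy data it would trade off between the two sets of unknowns in the same hold-and-optimize fashion.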
- Additional or alternative maps of signals or signal strengths may also be generated based on received logs of data or accessed to localize a device. Such maps include WIFI strength of signal (RSSI) maps, Bluetooth device maps, or geographic walkway and street maps, for example.
- Thus, within examples, trustworthy measurements in an absolute frame can be accessed first to generate a first estimate of a magnetic field strength map, and new measurements and new sensor logs can be introduced to refine the estimate, using the current estimate as a starting point to build upon. As each new piece of data is introduced, the current estimate is held constant and used to determine an initial estimate for the new data. Then, a SLAM optimization may be performed to jointly optimize all data without keeping anything constant. Iterations may be performed until all data has been considered.
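The iterative scheme just described (seed from trusted absolute-frame data, initialize each new batch against the current estimate, then jointly re-optimize) might be skeletonized as follows, where a running mean stands in for the magnetic field strength map and the 3.0 gating threshold is an arbitrary illustrative choice:

```python
def incremental_estimate(trusted, batches):
    """Grow an estimate by folding in one batch of new samples at a time.

    The running estimate here is just a mean; in the described scheme it
    would be a magnetic field strength map refined by SLAM.
    """
    values = list(trusted)            # first estimate: absolute-frame data only
    estimate = sum(values) / len(values)
    for batch in batches:
        # Hold the current estimate constant to initialize the new data,
        # e.g. discard batch samples wildly inconsistent with it.
        accepted = [v for v in batch if abs(v - estimate) < 3.0]
        values.extend(accepted)
        # Then jointly re-optimize over all data, holding nothing constant.
        estimate = sum(values) / len(values)
    return estimate
```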
-
FIG. 5 is a block diagram of an example method in accordance with at least some embodiments described herein. Method 500 shown in FIG. 5 presents an embodiment of a method that, for example, could be used with the system 100 in FIG. 1, the device 200 in FIG. 2, the server 300 in FIG. 3, and/or the method in FIG. 4, or may be performed by a combination of any components or processes of FIGS. 1-4. Method 500 may include one or more operations, functions, or actions as illustrated by one or more of blocks 502-516. Although the blocks are illustrated in a sequential order, these blocks may in some instances be performed in parallel, and/or in a different order than those described herein. In addition, the various blocks may be combined into fewer blocks, divided into additional blocks, and/or removed based upon the desired implementation. - In addition, for the
method 500 and other processes and methods disclosed herein, the flowchart shows functionality and operation of one possible implementation of present embodiments. In this regard, each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive. The computer readable medium may include a non-transitory computer readable medium, for example, such as computer-readable media that stores data for short periods of time like register memory, processor cache and Random Access Memory (RAM). The computer readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media may also be any other volatile or non-volatile storage systems. The computer readable medium may be considered a computer readable storage medium, a tangible storage device, or other article of manufacture, for example. - In addition, for the
method 500 and other processes and methods disclosed herein, each block in FIG. 5 may represent circuitry and/or other hardware that is wired or otherwise configured to perform the specific logical functions and processes of method 500. - Functions of the
method 500 may be fully performed by a computing device (or components of a computing device such as one or more processors), or may be distributed across multiple computing devices and/or a server. In some examples, the computing device may receive information from sensors of the computing device, or where the computing device is a server the information can be received from another device that collects the information. - At
block 502 of FIG. 5, a computing device, such as one or more of the client devices and/or servers discussed herein, estimates an orientation of a client device (e.g., a phone) by computing a rotation from a coordinate frame of the client device (e.g., device frame) to a coordinate frame of the earth (e.g., world frame). This rotation from the device frame to the world frame is identified as R_device^world. - Referring to
FIG. 6, for example, a world frame 600 may be defined by an XYZ-coordinate frame, with the positive X-axis extending east, the positive Y-axis extending north, and the positive Z-axis extending up (e.g., directed radially away from the center of the earth). Generally, the world frame 600 varies depending on the location of a client device 602 and an associated user 604 on the earth. However, the world frame 600 may be considered fixed over a relatively short period of time, such as one day, because the variation of the world frame 600 based on the user's steps is likely insignificant. In other examples, the world frame may be allowed to drift over time. - Further,
FIG. 6 shows a device frame 606 that may be defined by an XYZ-coordinate frame, with the positive X-axis extending to the right with respect to a front face of the device 602, the positive Y-axis extending up with respect to the front face of the device, and the positive Z-axis extending perpendicularly out of the front face of the device. FIG. 6 also illustrates a user frame 608 that may be defined by an XYZ-coordinate frame, with the X-axis extending in front of and away from the user 604, the Y-axis extending to the left of the user, and the Z-axis extending up from the user. - At
block 502, the computing device computes the rotation R_device^world to relate the device frame 606 to the world frame 600. The computing device may estimate the rotation R_device^world by fusing device sensor data. The device sensor data may be collected by the client device over a plurality of locations and over a time period, and may include accelerometer and gyroscope data, and in some cases, magnetometer data. The magnetometer data, for instance, may be used to compensate for a bias effect of the gyroscope. In one example, the computing device may estimate the rotation R_device^world by performing an extended Kalman filter (EKF) method using the device sensor data. Alternatively or in conjunction, the computing device may estimate the rotation R_device^world using a game rotation vector defined by Android open source software. - At
block 504, the computing device selects data, such as device sensor data related to the rotation R_device^world or the rotation itself, for further processing in the present method 500. In one example, at block 504, the computing device selects time slices during which an average orientation of the device relative to the user does not change or changes within a predetermined range. In one example, the time slices are selected by processing the rotation R_device^world and identifying when the orientation of "world_down" in the device frame does not change significantly between time slices, such as by less than about 35 degrees. At block 504, the computing device may eliminate potentially unreliable data associated with large variations in the orientation of the device, which may occur when a user takes their phone out of their pocket, for example. Consequently, the computing device at block 504 may then select more reliable orientation data for further processing. - At
block 506, the computing device estimates an average orientation of the device relative to the user by computing a rotation from an average gravity aligned (AGA) frame to the device frame. This rotation from the AGA frame to the device frame is identified as R_AGA^device. The computing device may use the orientation data selected at block 504 to compute the rotation R_AGA^device, which may help to verify an assumption that the device is in a generally static position relative to the user. - In one example, the computing device performs the calculation of
block 506 by creating a coordinate average gravity aligned (AGA) frame that is fixed relative to the device coordinate frame, and then computing the rotation from the AGA frame to the device frame. Generally, the computing device defines the AGA frame such that the Z-axis, once averaged, aligns generally with the Z-axis of the world frame. In one example, the client device measures the average gravity, which correlates to the negative Z-axis and provides information regarding the down direction (e.g., directed radially toward the center of the earth). The computing device may then rotate the measured average gravity 180 degrees to generally align the average gravity with the positive Z-axis of the world frame. FIG. 6 illustrates an example AGA frame 610 that may be defined by an XYZ-coordinate frame, with the positive Z-axis extending upwardly similar to the Z-axis of the world frame 600. In practice, the Z-axis of the AGA frame does not align precisely with the Z-axis of the world frame, because the world frame is not fixed with respect to the device and the device moves with respect to the world frame. The X- and Y-axes of the AGA frame are orthogonal to each other and to the Z-axis, but otherwise the X- and Y-axes may be defined arbitrarily. - For instance, at
block 506, the computing device may use the following Equation 1 to define the AGA frame:
-
AGA_z_in_device_frame = normalize(average(measured_acceleration_in_device_frame))  (1)
- In Equation 1, an accelerometer of the client device may provide the measured_acceleration_in_device_frame data. In another example, at block 506, the computing device may use the following Equation 2 to define the AGA frame:
-
AGA_z_in_device_frame = normalize(average(world_z_in_device_frame))  (2)
- In Equation 2, the computing device may determine the world_z_in_device_frame data from an estimate of the orientation of the device relative to the real world according to Equation 3:
-
R_world^device * (0, 0, 1)  (3)
- In this example, an AGA_x vector is selected to be perpendicular to the AGA_z vector, but otherwise may be selected arbitrarily by picking two non-collinear vectors, computing their cross products with AGA_z, and picking the normalization of the largest result. At block 506, once the computing device defines the AGA frame, the computing device may then compute the rotation R_AGA^device from the AGA frame to the device frame. In the present example, this rotation R_AGA^device remains constant for each time slice selected at block 504. - At
block 508, the computing device may perform step detection using conventional techniques to determine steps taken by a user associated with the client device. Generally, the computing device may perform the step detection based on accelerometer and gyroscope outputs that correspond to typical step motions. - At
block 510, for one or more (or each) step detected at block 508, the computing device estimates changes in orientation of the user. In one example, at block 510, the computing device projects the AGA frame to a horizontal plane to provide a horizontally projected AGA (HPAGA) frame. An X-axis of the HPAGA frame may be used to represent the orientation of the user. The HPAGA frame corresponds to the AGA frame when the Z-axis of the AGA frame is aligned with the Z-axis of the world frame. FIG. 6 illustrates such an HPAGA frame 612. - Further, at
block 510, the computing device determines a rotation from the HPAGA frame to the AGA frame. This rotation is identified as R_HPAGA^AGA. The computing device may compute R_HPAGA^AGA as the shortest rotation that transforms the Z-axis of the world frame into the Z-axis of the AGA frame. This rotation may vary over time, as the AGA frame moves over time. - In one example, the computing device at
block 510 also computes a rotation from the world frame to the HPAGA frame. This rotation may be identified as R_world^HPAGA, and may be computed by the chain rule, such as in Equation 4 using the AGA frame and the device frame:
-
R_world^HPAGA = R_AGA^HPAGA * R_device^AGA * R_world^device  (4)
- The computing device may compute the rotations in Equation 4 once each of the respective rotations is determined.
- The computing device may then use the rotation R_world^HPAGA to determine the change in orientation of the user from one detected step to another. In one example, the computing device determines this change in orientation of the user, or delta_theta, by comparing a user_yaw value between different steps. The computing device may compute the user_yaw value according to Equation 5:
-
user_yaw = atan2(HPAGA_x_in_world_frame.x, HPAGA_x_in_world_frame.y) + N  (5)
- Equation 5 includes a constant value N because the orientation of the user with respect to the device is not known. The computing device determines delta_theta by comparing the user_yaw between different steps, at which time the constant values N cancel out. For instance, the computing device may determine delta_theta between a first step s1 and a second step s2 using Equation 6:
-
delta_theta=user_yaw_step(s2)−user_yaw_step(s1) (6) - At
block 512, the computing device may smooth the orientation estimate determined at block 510 to remove oscillations in orientation due to the human body zigzagging as each step is taken. The computing device may perform this smoothing by window averaging the orientation (user_yaw) and/or orientation changes (delta_theta) with a window size of two steps, for example. This example window averaging modifies the orientation estimate for each step to be the average of a present step and a previous step. - In
FIG. 5, at block 514, the computing device may compute the rotation from the device frame to the HPAGA frame. The computing device may perform this computation each time the orientation of the device is requested (such as when magnetometer measurements are generated). In one example, the computing device may use the chain rule of Equation 7 to determine R_device^HPAGA.
-
R_device^HPAGA = R_world^HPAGA * R_device^world  (7)
- Generally, this rotation encapsulates small movements in the device's orientation that are not due to changes in the heading of the user. - At
block 516, the computing device may then perform a loose coupling or SLAM optimization, using as measurements one or more of: pedestrian dead reckoning that is computed using data related to steps and changes in orientation of the user (e.g., user_yaw or delta_theta values, which may or may not be smoothed at block 512), as computed herein; WIFI signals in conjunction with WIFI environment information (such as WIFI RSSI fingerprint maps, locations of WIFI access points and the associated signal strengths of the access points, or information measured by devices of other users in the area); Bluetooth low energy (BLE) or other radio-frequency signals, used similarly to WIFI signals; and/or magnetic field measurements. At block 516, the computing device uses one or more of these measurements (and/or perhaps others) in a SLAM optimization to determine a position and location of the device and user. In one example, at block 516, the computing device also performs the optimization to help refine estimates of different rotations or parameters. In the case where a map of the environment is known, SLAM can be replaced by a localization-only algorithm. - At block 516 (or after block 516), the computing device may use the determined location to present a map on a display, and show a device location on the map, or otherwise generate information and instructions for providing such a display. The location of a device may also be used in location-based services or computer applications, such as to narrow an Internet search to an area that is nearby the device location, to personalize "local" news and/or weather updates or information sent to the client device, to direct emergency calls (e.g., 911 call-services) to an appropriate or closest call handling service, to direct emergency services to help locate the client device in case of emergency, and the like.
-
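The per-step heading computations of blocks 506-512 (Equations 1, 5, and 6, plus the two-step window average) can be sketched as follows; the helper functions and sample vectors are hypothetical illustrations, not code from the embodiments:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def average(vectors):
    count = len(vectors)
    return tuple(sum(v[i] for v in vectors) / count for i in range(3))

def aga_z_in_device_frame(accel_samples):
    # Equation 1: the AGA z-axis is the normalized average of the
    # acceleration measured in the device frame.
    return normalize(average(accel_samples))

def user_yaw(hpaga_x_in_world, offset_n=0.0):
    # Equation 5: yaw from the HPAGA x-axis expressed in the world
    # frame, plus an unknown constant offset N.
    return math.atan2(hpaga_x_in_world[0], hpaga_x_in_world[1]) + offset_n

def delta_theta(yaw_step_s1, yaw_step_s2):
    # Equation 6: differencing two steps cancels the constant N.
    return yaw_step_s2 - yaw_step_s1

def smooth_two_step(yaws):
    # Block 512: window-average each step's yaw with the previous step's.
    return [yaws[0]] + [(a + b) / 2.0 for a, b in zip(yaws, yaws[1:])]
```

Note that delta_theta is unaffected by the choice of offset N, which mirrors why the unknown user-to-device orientation drops out between steps.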
FIG. 7 provides a block flow diagram that summarizes the frames, rotations, parameters, estimations, and/or computations described above in relation to FIG. 5. For instance, a block 702 represents the computation of the rotation R_device^world from the device frame 606 to the world frame 600. A block 704 represents the computation of the rotation R_AGA^device from the AGA frame 610 to the device frame 606. As shown in FIG. 7, the computation at block 704 may use the rotation R_device^world to compute the rotation R_AGA^device. - Further,
FIG. 7 includes a block 706 that represents the computation of the rotation R_HPAGA^AGA from the HPAGA frame 612 to the AGA frame 610. FIG. 7 shows that the computation at block 706 may use the rotation R_AGA^device to compute the rotation R_HPAGA^AGA. FIG. 7 also includes a block 708 that represents the computation of the rotation R_world^HPAGA from the world frame 600 to the HPAGA frame 612. As shown in FIG. 7, the computation at block 708 may use the rotations R_device^world, R_AGA^device, and R_HPAGA^AGA to compute the rotation R_world^HPAGA using the chain rule. - In addition,
FIG. 7 includes a block 710 that represents the computation of changes in the orientation of a user (delta_theta), which also relates to a rotation R_world^user from the world frame 600 to the user frame 608, and to the rotation R_world^HPAGA. As a general matter, it should be understood that a rotation R_B^C from a frame B to a frame C is the inverse of the rotation R_C^B from the frame C to the frame B.
-
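The inverse relationship just noted, and the chain-rule compositions used throughout, can be checked numerically with plain 3x3 rotation matrices (a self-contained illustration, restricted for simplicity to Z-axis rotations):

```python
import math

def rot_z(theta):
    # 3x3 rotation matrix about the Z-axis by theta radians.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    # For a rotation matrix, the transpose is the inverse, so the
    # transpose of R_C^B plays the role of R_B^C.
    return [list(row) for row in zip(*a)]

def allclose(a, b, tol=1e-9):
    return all(abs(a[i][j] - b[i][j]) < tol for i in range(3) for j in range(3))

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

Composing a rotation with its transpose yields the identity, mirroring the statement that R_B^C is the inverse of R_C^B, and composing two Z rotations adds their angles, mirroring the chain rule.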
FIG. 8 is a block diagram of an example method 800 that may be used to perform the optimization at block 516 of FIG. 5. Generally, the method 800 may be implemented similarly as described above with respect to the method 500, including performing the various blocks in a different order and/or in parallel. - In
method 800, at block 802, the computing device retrieves or otherwise accesses a rotation from the HPAGA frame to the device frame, or R_HPAGA^device. The computing device may compute the rotation R_HPAGA^device using the HPAGA frame and the device frame discussed above. In one example, the rotation R_HPAGA^device is computed as the inverse of the rotation R_device^HPAGA, which may have been computed at block 516 of the method 500. - At
block 804, the computing device defines a parameter user_heading_in_HPAGA that represents a yaw difference between the HPAGA frame and the user frame. Generally, R_user^HPAGA is a rotation about the Z-axis by an angle of -user_heading_in_HPAGA (a rotation by the negative angle). The computing device may initially estimate the parameter user_heading_in_HPAGA, and later refine this parameter during optimization. - At
block 806, the computing device defines a parameter HPAGA_yaw_in_world that represents a yaw difference between the HPAGA frame and the world frame. Generally, R_HPAGA^world is a rotation about the Z-axis by an angle of -HPAGA_yaw_in_world (a rotation by the negative angle). The computing device may initially estimate the parameter HPAGA_yaw_in_world and may refine this parameter during the optimization. In practice, the parameter HPAGA_yaw_in_world varies as a function of time, because the HPAGA frame moves relative to the world frame. In another implementation, the yaw of the rotation between two different frames can be estimated as a parameter. For example, the parameter can represent the rotation between the user and the world instead of the rotation between the HPAGA frame and the world frame. Because of the chain rule, this is mathematically equivalent. - At
block 808, the computing device determines pedestrian dead reckoning data of the user of the client device. The computing device may determine or compute the pedestrian dead reckoning, which may be identified as R_user^world, according to the following Equation 8 and the different rotations discussed above:
-
R_user^world = R_device^world * R_AGA^device * R_HPAGA^AGA * R_user^HPAGA  (8)
- Equation 8 is also equivalent to Equation 9:
-
R_user^world = R_HPAGA^world * R_user^HPAGA  (9)
- The computing device may further optimize the computation of the pedestrian dead reckoning R_user^world by using the delta_theta values discussed above as further constraints to the computation. - At
block 810, the computing device processes measured magnetic field data to determine the magnetic field in the world frame. In the present example, the computing device processes the magnetic field data by rotating the data into the world frame according to a rotation R_device^world computed by Equation 10:
-
R_device^world = R_user^world * R_HPAGA^user * R_device^HPAGA  (10)
- Equation 10 is also equivalent to Equation 11:
-
R_device^world = R_HPAGA^world * R_device^HPAGA  (11)
- In this example, the rotation R_device^world is used to rotate the measured magnetic field in order to use the 3-D components of the magnetic field data. In addition, R_device^HPAGA may be obtained from the chain rule of Equation 12:
-
R_device^HPAGA = R_AGA^HPAGA * R_device^AGA  (12)
- Other variations to the equations discussed herein are also possible depending, in part, on how the parameters are being defined and on conversions using the chain rule. - At
block 812, the computing device may perform an optimization using SLAM algorithms, such as GraphSLAM or FastSLAM, and/or in online localization using other fusion algorithms, such as Kalman filters. In one example, at block 812, the computing device uses the pedestrian dead reckoning data to fuse available GPS data, WIFI data, and/or Bluetooth scan data. The computing device may then use the resulting fused data as additional constraints in the optimization to identify different parameters, such as a location and map of the client device and other estimated rotations of the device.
-
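As a greatly simplified stand-in for the fusion at block 812 (far short of GraphSLAM, FastSLAM, or a Kalman filter), the following sketch shows how dead-reckoned step displacements and sparse absolute fixes, such as GPS or WIFI positions, can constrain each other; the gain parameter is a hypothetical blending weight:

```python
def fuse(start, steps, fixes, gain=0.5):
    """Dead-reckon 2-D positions from per-step displacement vectors,
    pulling the track toward any absolute fix available at that step.

    fixes maps step index -> (x, y); gain in (0, 1] weights the fix.
    """
    x, y = start
    track = [(x, y)]
    for i, (dx, dy) in enumerate(steps):
        x, y = x + dx, y + dy                  # pedestrian dead reckoning
        if i in fixes:                         # blend in an absolute fix
            fx, fy = fixes[i]
            x, y = x + gain * (fx - x), y + gain * (fy - y)
        track.append((x, y))
    return track
```

In a real optimizer the fixes would enter as residual constraints to be minimized jointly, rather than being blended step by step.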
FIG. 9 provides a block flow diagram that summarizes the frames, rotations, parameters, computations, and/or estimations described above in relation to FIG. 8. For instance, a block 902 represents the computation of the rotation R_device^HPAGA, which may have been computed before the optimizations at blocks 516 or 812. A block 904 represents the estimation or definition of the parameter user_heading_in_HPAGA, which represents the difference in yaw between the HPAGA frame 612 and the user frame 608. In FIG. 9, a block 906 represents the estimation or definition of the parameter HPAGA_yaw_in_world, which represents the difference in yaw between the HPAGA frame 612 and the world frame 600. - Further, a
block 908 represents the computation of a rotation R_user^world from the user frame 608 to the world frame 600. The rotation R_user^world may be obtained using a chain rule calculation through HPAGA rotations. For example, the rotation R_user^world may be computed using Equation 8 or 9 above. As illustrated, the computation at block 908 may utilize delta_theta values that correspond to user_heading_in_HPAGA and/or HPAGA_yaw_in_world as constraints to help estimate the rotation R_user^world. The rotation R_user^world may then be used as the user orientation with respect to the world to determine pedestrian dead reckoning data. In addition, this pedestrian dead reckoning data may be used to fuse other data or constraints, such as data from GPS, WIFI, and/or Bluetooth sensors. - In
FIG. 9, a block 910 represents the computation of a rotation R_world^device from the world frame 600 to the device frame 606. The rotation R_world^device may be obtained using a chain rule calculation through HPAGA rotations. For example, the rotation R_world^device may be computed using Equation 10 or 11 above. As illustrated, the computation at block 910 may utilize delta_theta values that correspond to HPAGA_yaw_in_world and a rotation R_device^HPAGA as constraints to help estimate the rotation. The rotation R_world^device may then be used to fuse magnetometer data associated with a client device.
-
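The yaw parameterization of blocks 804 and 806, in which each rotation is a Z-axis rotation by the negative of its yaw parameter, and the chain rule of Equation 9 can be illustrated numerically; the angle values below are hypothetical:

```python
import math

def rot_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Hypothetical yaw parameters, in radians.
user_heading_in_hpaga = 0.25   # block 804 parameter
hpaga_yaw_in_world = -0.6      # block 806 parameter

# Each rotation is about the Z-axis by the negative of its yaw parameter.
r_user_hpaga = rot_z(-user_heading_in_hpaga)
r_hpaga_world = rot_z(-hpaga_yaw_in_world)

# Equation 9: chain the two rotations to obtain R_user^world.
r_user_world = matmul(r_hpaga_world, r_user_hpaga)
```

Because both factors are Z-axis rotations, the product is a single Z-axis rotation whose angle is the (negated) sum of the two yaw parameters, which is why parameterizing the user-to-world yaw directly is mathematically equivalent.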
FIG. 10 is a block diagram that conceptually illustrates an example system 1000 for determining locations. Any of the blocks in the system 1000 may be modules, processors, or other devices, or may take the form of instructions executable by processors to perform the associated function. The system 1000 may utilize the methods and processes described herein to perform one or more of the following calculations and optimizations. - In the
system 1000, logs of data 1002 are received from devices. The logs of data may include GPS, RSSI, magnetometer, accelerometer, and gyroscope data with associated timestamps as collected by respective devices. The logs of data for which a dead reckoning and GPS location agree may be provided to a non-linear least squares optimizer 1004, for example. Logs of data for which a dead reckoning and GPS location do not agree may be rejected as erroneous data or data with too much noise. The non-linear least squares optimizer 1004 may optimize paths using GPS and dead reckoning, as shown at block 1006 and as described above using, for example, a Ceres optimizer, and then build optimal WIFI maps while keeping the paths constant, as shown at block 1008. The non-linear least squares optimizer 1004 may further jointly optimize paths and WIFI maps using a SLAM optimization and output a WIFI map, as shown at block 1010. - Traces with unreliable GPS data (at block 1012) may be received at a
hierarchical Viterbi processor 1014 to perform a global search for the most likely paths given associated WIFI scans in the traces, as shown at block 1016. As an example, a path of a user trace may be determined using the Viterbi algorithm (e.g., the most likely path through a graph) based on one or more of motion probabilities from dead reckoning, transition probabilities from a floorplan, or emission probabilities from a WIFI model. The non-linear least squares optimizer 1004 may receive the output of the global search and align the dead reckoning to a Viterbi path, as shown at block 1018, and jointly optimize all paths and WIFI maps using a SLAM optimization, as shown at block 1020. - The SLAM optimization is performed iteratively on growing subsets of states and constraints to determine a location of a user when data was collected, based on all data collected. A first iteration uses subsets chosen so that the function minimized is convex. Running SLAM on these subsets gives an estimate of the state subset. This estimate is used for determining the next subsets to include and the initialization to use for the next iteration. Thus, more constraints are added using a previous determination as a starting point as the best prediction. The
system 1000 defines a process that selects states, optimizes the states using a non-linear least squares solver, and runs SLAM algorithms to determine how to initialize the state for the next optimization iteration. - Although examples are described as determining a WIFI signal strength map, similar or same functions may be performed to determine localization of passively collected traces for creation of other types of maps, such as magnetometer maps.
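The "most likely path through a graph" search of block 1016 is a standard Viterbi computation; a minimal version over hypothetical location states, with start, transition, and emission probabilities standing in for the dead reckoning, floorplan, and WIFI models, could look like:

```python
def viterbi(states, start_p, trans_p, emit_p, observations):
    """Return the most probable state sequence for the observations.

    start_p[s], trans_p[s][t], and emit_p[s][o] are probabilities; in
    the system described they would come from dead reckoning, the
    floorplan, and a WIFI signal model, respectively.
    """
    # prob[s] = best probability of any path ending in state s
    prob = {s: start_p[s] * emit_p[s][observations[0]] for s in states}
    paths = {s: [s] for s in states}
    for obs in observations[1:]:
        new_prob, new_paths = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: prob[p] * trans_p[p][s])
            new_prob[s] = prob[best_prev] * trans_p[best_prev][s] * emit_p[s][obs]
            new_paths[s] = paths[best_prev] + [s]
        prob, paths = new_prob, new_paths
    return paths[max(states, key=lambda s: prob[s])]
```

For long traces a production implementation would work in log probabilities to avoid underflow; the hierarchical structure mentioned above would further prune the state space before this search runs.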
- It should be understood that arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g. machines, interfaces, functions, orders, and groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired results. Further, many of the elements that are described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location, or other structural elements described as independent structures may be combined.
- While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
Claims (20)
1. A method comprising:
receiving, by one or more processors of a computing device, a stream of sensor measurements made by a client device coupled to a user;
based on the sensor measurements, determining, by the one or more processors, a first angle of rotation between a client device coordinate frame defined with respect to the client device and a world coordinate frame defined with respect to Earth;
determining, by the one or more processors, an average gravity aligned (AGA) coordinate frame, wherein a z-axis of the AGA coordinate frame approximates a z-axis of the world coordinate frame;
determining, by the one or more processors, a second angle of rotation between the AGA coordinate frame and the client device coordinate frame;
detecting, by the one or more processors, two or more steps taken by the user while the client device is coupled to the user, wherein the step detection is based on accelerometer and gyroscope measurements made by the client device, and wherein the accelerometer and gyroscope measurements are included in the sensor measurements;
based on the first angle of rotation and the second angle of rotation, determining, by the one or more processors, a change in orientation of the user from a first detected step of the user to a second detected step of the user; and
based on the first detected step of the user, the second detected step of the user, and the change in orientation of the user, determining, by the one or more processors, a location estimate of the client device.
2. The method of claim 1, wherein the computing device is the client device.
3. The method of claim 1, wherein the first angle of rotation is based on sensor fusion of the accelerometer measurements and the gyroscope measurements.
4. The method of claim 3, wherein the first angle of rotation is based on sensor fusion of the accelerometer measurements, the gyroscope measurements, and magnetometer measurements.
5. The method of claim 1, wherein the AGA coordinate frame is defined with respect to an average of gravity measurements made by the client device, and wherein the gravity measurements are included in the sensor measurements.
6. The method of claim 1, wherein determining the first angle of rotation comprises:
determining a plurality of instances of the first angle of rotation over a period of time; and
selecting instances of the first angle of rotation from the plurality of instances such that the selected instances of the first angle of rotation are within a pre-determined angle of one another.
7. The method of claim 6, wherein determining the AGA coordinate frame comprises determining the AGA coordinate frame using the selected instances of the first angle of rotation, and not using the non-selected instances of the first angle of rotation.
8. The method of claim 1, wherein determining the change in orientation of the user from the first detected step of the user to the second detected step of the user comprises:
projecting the AGA coordinate frame to a horizontal plane to provide a horizontally projected AGA (HPAGA) coordinate frame;
determining a third angle of rotation between the AGA coordinate frame and the HPAGA coordinate frame;
based on the first angle of rotation, the second angle of rotation, and the third angle of rotation, determining a fourth angle of rotation between the HPAGA coordinate frame and the world coordinate frame; and
based on the fourth angle of rotation, calculating the change in orientation of the user from the first detected step of the user to the second detected step of the user.
9. The method of claim 8, wherein calculating the change in orientation of the user from the first detected step of the user to the second detected step of the user comprises:
based on the fourth angle of rotation, determining a yaw component of the user from the first detected step of the user to the second detected step of the user; and
smoothing the yaw component of the user.
10. The method of claim 9, wherein determining the location estimate of the client device comprises calculating the location estimate of the client device based on the yaw component of the user.
11. The method of claim 1, further comprising:
causing the client device to display the location estimate of the client device on a graphical representation of a map.
12. A non-transitory computer-readable medium having stored therein instructions, that when executed by one or more processors of a computing device, cause the computing device to perform operations comprising:
receiving a stream of sensor measurements made by a client device coupled to a user;
based on the sensor measurements, determining a first angle of rotation between a client device coordinate frame defined with respect to the client device and a world coordinate frame defined with respect to Earth;
determining an average gravity aligned (AGA) coordinate frame, wherein a z-axis of the AGA coordinate frame approximates a z-axis of the world coordinate frame;
determining a second angle of rotation between the AGA coordinate frame and the client device coordinate frame;
detecting two or more steps taken by the user while the client device is coupled to the user, wherein the step detection is based on accelerometer and gyroscope measurements made by the client device, and wherein the accelerometer and gyroscope measurements are included in the sensor measurements;
based on the first angle of rotation and the second angle of rotation, determining a change in orientation of the user from a first detected step of the user to a second detected step of the user; and
based on the first detected step of the user, the second detected step of the user, and the change in orientation of the user, determining a location estimate of the client device.
13. The non-transitory computer-readable medium of claim 12, wherein the first angle of rotation is based on sensor fusion of the accelerometer measurements and the gyroscope measurements.
14. The non-transitory computer-readable medium of claim 12, wherein the AGA coordinate frame is defined with respect to an average of gravity measurements made by the client device, and wherein the gravity measurements are included in the sensor measurements.
15. The non-transitory computer-readable medium of claim 12, wherein determining the first angle of rotation comprises:
determining a plurality of instances of the first angle of rotation over a period of time; and
selecting instances of the first angle of rotation from the plurality of instances such that the selected instances of the first angle of rotation are within a pre-determined angle of one another.
16. The non-transitory computer-readable medium of claim 15, wherein determining the AGA coordinate frame comprises determining the AGA coordinate frame using the selected instances of the first angle of rotation, and not using the non-selected instances of the first angle of rotation.
17. The non-transitory computer-readable medium of claim 12, wherein determining the change in orientation of the user from the first detected step of the user to the second detected step of the user comprises:
projecting the AGA coordinate frame to a horizontal plane to provide a horizontally projected AGA (HPAGA) coordinate frame;
determining a third angle of rotation between the AGA coordinate frame and the HPAGA coordinate frame;
based on the first angle of rotation, the second angle of rotation, and the third angle of rotation, determining a fourth angle of rotation between the HPAGA coordinate frame and the world coordinate frame; and
based on the fourth angle of rotation, calculating the change in orientation of the user from the first detected step of the user to the second detected step of the user.
18. The non-transitory computer-readable medium of claim 17, wherein calculating the change in orientation of the user from the first detected step of the user to the second detected step of the user comprises:
based on the fourth angle of rotation, determining a yaw component of the user from the first detected step of the user to the second detected step of the user; and
smoothing the yaw component of the user.
19. The non-transitory computer-readable medium of claim 18, wherein determining the location estimate of the client device comprises calculating the location estimate of the client device based on the yaw component of the user.
20. A computing device comprising:
a processor;
memory; and
program instructions, stored in the memory, that upon execution by the processor cause the computing device to perform operations comprising:
receiving a stream of sensor measurements made by a client device coupled to a user;
based on the sensor measurements, determining a first angle of rotation between a client device coordinate frame defined with respect to the client device and a world coordinate frame defined with respect to Earth;
determining an average gravity aligned (AGA) coordinate frame, wherein a z-axis of the AGA coordinate frame approximates a z-axis of the world coordinate frame;
determining a second angle of rotation between the AGA coordinate frame and the client device coordinate frame;
detecting two or more steps taken by the user while the client device is coupled to the user, wherein the step detection is based on accelerometer and gyroscope measurements made by the client device, and wherein the accelerometer and gyroscope measurements are included in the sensor measurements;
based on the first angle of rotation and the second angle of rotation, determining a change in orientation of the user from a first detected step of the user to a second detected step of the user; and
based on the first detected step of the user, the second detected step of the user, and the change in orientation of the user, determining a location estimate of the client device.
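The final step recited in the claims, determining a location estimate from detected steps and the change in user orientation, amounts to step-and-heading dead reckoning. Below is a minimal sketch; the constant stride length and the function name are assumptions (real systems typically estimate stride per user), and the per-step yaw values are presumed already smoothed and expressed in the world frame:

```python
import math

def dead_reckon(yaw_per_step, step_length=0.7, start=(0.0, 0.0)):
    """Propagate a 2-D position from per-step headings (radians, world frame).

    Each detected step advances the estimate by an assumed constant stride
    in the direction of the user's yaw at that step.
    """
    x, y = start
    track = [(x, y)]
    for yaw in yaw_per_step:
        x += step_length * math.cos(yaw)
        y += step_length * math.sin(yaw)
        track.append((x, y))
    return track
```

For example, two steps of unit stride, the first heading along the x-axis (yaw 0) and the second along the y-axis (yaw pi/2), move the estimate from (0, 0) to approximately (1, 1).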
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/800,911 US20180084387A1 (en) | 2014-08-01 | 2017-11-01 | Determining Location Based on Measurements of Device Orientation |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462032522P | 2014-08-01 | 2014-08-01 | |
| US14/815,500 US9838846B1 (en) | 2014-08-01 | 2015-07-31 | Extraction of walking direction from device orientation and reconstruction of device orientation during optimization of walking direction |
| US15/800,911 US20180084387A1 (en) | 2014-08-01 | 2017-11-01 | Determining Location Based on Measurements of Device Orientation |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/815,500 Continuation US9838846B1 (en) | 2014-08-01 | 2015-07-31 | Extraction of walking direction from device orientation and reconstruction of device orientation during optimization of walking direction |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20180084387A1 true US20180084387A1 (en) | 2018-03-22 |
Family
ID=60452166
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/815,500 Active 2035-12-22 US9838846B1 (en) | 2014-08-01 | 2015-07-31 | Extraction of walking direction from device orientation and reconstruction of device orientation during optimization of walking direction |
| US15/800,911 Abandoned US20180084387A1 (en) | 2014-08-01 | 2017-11-01 | Determining Location Based on Measurements of Device Orientation |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/815,500 Active 2035-12-22 US9838846B1 (en) | 2014-08-01 | 2015-07-31 | Extraction of walking direction from device orientation and reconstruction of device orientation during optimization of walking direction |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US9838846B1 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10578743B1 (en) * | 2018-12-27 | 2020-03-03 | Intel Corporation | System and method of location determination using multiple location inputs |
| DE102021119025A1 (en) | 2021-07-22 | 2023-01-26 | Cariad Se | Method and system for determining a position within a parking garage |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10845195B2 (en) * | 2015-07-01 | 2020-11-24 | Solitonreach, Inc. | System and method for motion based alignment of body parts |
| WO2018025531A1 (en) * | 2016-08-05 | 2018-02-08 | ソニー株式会社 | Information processing device, information processing method, and program |
| CN109282806B (en) * | 2017-07-20 | 2024-03-22 | 罗伯特·博世有限公司 | Method, apparatus and storage medium for determining pedestrian position |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7844415B1 (en) * | 2007-08-20 | 2010-11-30 | Pni Corporation | Dynamic motion compensation for orientation instrumentation |
| US20140372072A1 (en) * | 2013-04-09 | 2014-12-18 | Zhuhai Hengqin Great Aim Visible Light Communication Technology Co. Ltd. | Methods and Devices for Transmitting/Obtaining Identification Information and Positioning by Visible Light Signal |
Family Cites Families (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5559696A (en) | 1994-02-14 | 1996-09-24 | The Regents Of The University Of Michigan | Mobile robot internal position error correction system |
| JP3799323B2 (en) | 2002-11-29 | 2006-07-19 | Necインフロンティア株式会社 | Information terminal device and PC card |
| US6947734B1 (en) | 2002-12-06 | 2005-09-20 | Sprint Spectrum L.P. | Method and system for location accuracy analysis |
| KR101099151B1 (en) | 2004-10-29 | 2011-12-27 | 스카이후크 와이어리스, 인크. | Location beacon database and server, method of building location beacon database, and location based service using same |
| US7529236B2 (en) | 2005-08-15 | 2009-05-05 | Technocom Corporation | Embedded wireless location validation benchmarking systems and methods |
| US8509761B2 (en) | 2005-09-15 | 2013-08-13 | At&T Mobility Ii Llc | Location based services quality assessment |
| US20080033645A1 (en) | 2006-08-03 | 2008-02-07 | Jesse Sol Levinson | Pobabilistic methods for mapping and localization in arbitrary outdoor environments |
| US8521429B2 (en) | 2009-06-17 | 2013-08-27 | Microsoft Corporation | Accuracy assessment for location estimation systems |
| US20110250926A1 (en) | 2009-12-21 | 2011-10-13 | Qualcomm Incorporated | Dynamic antenna selection in a wireless device |
| US9749780B2 (en) | 2011-02-05 | 2017-08-29 | Apple Inc. | Method and apparatus for mobile location determination |
| WO2013037034A1 (en) | 2011-09-14 | 2013-03-21 | Trusted Positioning Inc. | Method and apparatus for navigation with nonlinear models |
| WO2013071190A1 (en) | 2011-11-11 | 2013-05-16 | Evolution Robotics, Inc. | Scaling vector field slam to large environments |
| US8972357B2 (en) | 2012-02-24 | 2015-03-03 | Placed, Inc. | System and method for data collection to validate location data |
| US8838376B2 (en) | 2012-03-30 | 2014-09-16 | Qualcomm Incorporated | Mashup of AP location and map information for WiFi based indoor positioning |
| US9357520B2 (en) | 2014-01-31 | 2016-05-31 | Google Inc. | Methods and systems for signal diffusion modeling for a discretized map of signal strength |
| US9476986B2 (en) | 2014-02-10 | 2016-10-25 | Google Inc. | Decomposition of error components between angular, forward, and sideways errors in estimated positions of a computing device |
| US10466056B2 (en) * | 2014-04-25 | 2019-11-05 | Samsung Electronics Co., Ltd. | Trajectory matching using ambient signals |
| US9459104B2 (en) | 2014-07-28 | 2016-10-04 | Google Inc. | Systems and methods for performing a multi-step process for map generation or device localizing |
| US10274346B2 (en) | 2014-07-30 | 2019-04-30 | Google Llc | Determining quality of a location-determination algorithm associated with a mobile device by processing a log of sensor data |
- 2015-07-31: US application US14/815,500 (US9838846B1), status Active
- 2017-11-01: US application US15/800,911 (US20180084387A1), status Abandoned
Also Published As
| Publication number | Publication date |
|---|---|
| US9838846B1 (en) | 2017-12-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11262213B2 (en) | Decomposition of error components between angular, forward, and sideways errors in estimated positions of a computing device | |
| US9544871B2 (en) | Determining and aligning a position of a device and a position of a wireless access point (AP) | |
| US9459104B2 (en) | Systems and methods for performing a multi-step process for map generation or device localizing | |
| US10274346B2 (en) | Determining quality of a location-determination algorithm associated with a mobile device by processing a log of sensor data | |
| Tian et al. | A resetting approach for INS and UWB sensor fusion using particle filter for pedestrian tracking | |
| US10240995B2 (en) | Construction of a surface of best GPS visibility from passive traces using SLAM for horizontal localization and GPS readings and barometer readings for elevation estimation | |
| US10694325B2 (en) | Determining position of a device in three-dimensional space and corresponding calibration techniques | |
| US9116000B2 (en) | Map-assisted sensor-based positioning of mobile devices | |
| US9419731B2 (en) | Methods and systems for determining signal strength maps for wireless access points robust to measurement counts | |
| US9357520B2 (en) | Methods and systems for signal diffusion modeling for a discretized map of signal strength | |
| US10341982B2 (en) | Technique and system of positioning a mobile terminal indoors | |
| US20180084387A1 (en) | Determining Location Based on Measurements of Device Orientation | |
| Guimarães et al. | A motion tracking solution for indoor localization using smartphones | |
| RU2696603C1 (en) | Method, apparatus and system for determining an internal location | |
| US9794754B2 (en) | Running location provider processes | |
| US9258679B1 (en) | Modifying a history of geographic locations of a computing device | |
| US20150211845A1 (en) | Methods and Systems for Applying Weights to Information From Correlated Measurements for Likelihood Formulations Based on Time or Position Density | |
| Zablocki et al. | A Framework for Multi-Mobile Orientation Models Using an Extended Kalman Filter and NFC |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LE GRAND, ETIENNE;REEL/FRAME:045186/0883 Effective date: 20160530 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |