CN106017454A - Pedestrian navigation device and method based on novel multi-sensor fusion technology - Google Patents
- Publication number
- CN106017454A (application CN201610431107.0A)
- Authority
- CN
- China
- Prior art keywords
- imu
- module
- processing unit
- raw data
- smart device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
Abstract
The invention discloses a pedestrian navigation device and method based on a novel multi-sensor fusion technology. The device comprises a handheld smart device platform, an observation processing unit and a fusion filter. The method comprises the following steps: (1) the handheld smart device uses its own hardware to acquire raw data from the IMU, magnetometer, barometer, WiFi, BLE, GNSS, etc.; (2) the observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position and velocity to the fusion filter; (3) the fusion filter uses a kinematic model as its system model and builds its observation model from the results of the observation processing unit, and the pedestrian navigation result is finally obtained through the fusion filter. The invention overcomes the drawback that navigation errors accumulate rapidly when no other aiding system is available; the IMU processing module accounts for the multiple usage modes of handheld smart devices, breaking the constraint of traditional multi-sensor fusion that the IMU must be rigidly fixed to the carrier; and the accuracy of pedestrian navigation is improved.
Description
Technical Field
The invention relates to the fields of multi-sensor fusion and pedestrian navigation, and in particular to a pedestrian navigation device and method based on a novel multi-sensor fusion technology.
Background Art
With the development of the mobile Internet, indoor and outdoor pedestrian navigation applications are booming: indoor navigation in large shopping malls, patient tracking in hospitals, foot-traffic analysis in supermarkets, and so on. Market analysis reports at home and abroad agree that pedestrian navigation is a research direction with a huge market. At the same time, portable smart devices such as smartphones, tablets and smart watches have developed at breakneck speed over the past decade and have become an indispensable part of daily life. Most of these portable devices are equipped with powerful processors, wireless transceivers, cameras, Global Navigation Satellite System (GNSS) receivers and numerous sensors. They have therefore become ideal platforms for applications involving multi-sensor fusion and pedestrian navigation.
At present, every single-technology approach to pedestrian navigation suffers from defects to some degree. Technologies based on wireless systems such as WiFi and Bluetooth suffer from large signal-strength fluctuations in harsh environments, cannot provide complete navigation information such as three-dimensional position, velocity and attitude, depend heavily on the distribution and number of transmitters, and produce position estimates that are neither continuous nor smooth. Pedestrian navigation based on micro inertial units is accurate in the short term, but its navigation error accumulates quickly. Camera-based visual positioning suffers from slow sensor calibration in complex environments, a high error rate in feature extraction, and slow computation of navigation information. Multi-sensor fusion has therefore become the mainstream approach to pedestrian navigation.
At present, existing multi-sensor fusion technology generally comprises the following steps: (1) compute the position, velocity and attitude of the tracked object from the measurements of the inertial unit (three-axis accelerometer and three-axis gyroscope) through an inertial mechanization algorithm; (2) establish the error model corresponding to the inertial mechanization algorithm and use it as the system model of the fusion filter; (3) model the other aiding systems (GPS, WiFi, Bluetooth, RFID, GNSS, etc.) as the observation model of the fusion filter; (4) estimate the system state errors through the prediction and update steps of the fusion filter; and (5) use the estimated state errors to compensate the inertial-unit errors and the position, velocity and attitude produced by the inertial mechanization algorithm, yielding the final position, velocity and attitude.
Existing multi-sensor fusion technology has two fatal shortcomings: (1) without other aiding systems, the navigation error accumulates rapidly; and (2) when the inertial unit is not rigidly fixed to the carrier, as with a mobile phone carried by a pedestrian, traditional multi-sensor fusion cannot correctly estimate the carrier's state. Existing multi-sensor fusion technology therefore cannot provide accurate pedestrian navigation information in many scenarios.
Summary of the Invention
The technical problem to be solved by the invention is to provide a pedestrian navigation device and method based on a novel multi-sensor fusion technology, so as to improve the accuracy and usability of pedestrian navigation.
To solve the above technical problem, the invention provides a pedestrian navigation device and method based on a novel multi-sensor fusion technology, comprising: a handheld smart device platform, an observation processing unit and a fusion filter. The handheld smart device uses its own hardware to acquire raw data from the Inertial Measurement Unit (IMU), magnetometer, barometer, WiFi, Bluetooth Low Energy (BLE) and Global Navigation Satellite System (GNSS). The observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position and velocity to the fusion filter. The fusion filter uses a kinematic model as its system model and builds its observation model from the results of the observation processing unit; the pedestrian navigation result is finally obtained through the fusion filter.
Preferably, the handheld smart device platform includes the IMU, magnetometer, barometer, WiFi, BLE and GNSS hardware common to existing smart devices. The IMU provides raw acceleration and angular-velocity data; the magnetometer provides raw geomagnetic data; the barometer provides raw atmospheric-pressure data; WiFi provides raw WiFi Received Signal Strength (RSS) data; BLE provides raw BLE RSS data; and the GNSS receiver provides raw GNSS data. Any other sensor of the smart device platform that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
Preferably, the observation processing unit comprises an IMU processing unit, a magnetometer processing unit, a barometer processing unit, a WiFi processing unit, a BLE processing unit and a GNSS processing unit. The IMU processing unit processes the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and passes it to the fusion filter; the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and passes it to the fusion filter; the barometer processing unit processes the raw atmospheric-pressure data provided by the barometer to obtain elevation information and passes it to the fusion filter; the WiFi processing unit processes the raw RSS data provided by WiFi to obtain WiFi position information and passes it to the fusion filter; the BLE processing unit processes the raw RSS data provided by BLE to obtain BLE position information and passes it to the fusion filter; the GNSS processing unit processes the position and velocity information provided by the GNSS receiver and passes it to the fusion filter. The observation processing unit may also include further processing units that handle other sensors of the smart device platform to obtain position or velocity information for the fusion filter.
Preferably, the fusion filter comprises a system model and an observation model. The system model uses a kinematic model to predict the position and velocity of the target and passes the prediction to the observation model. The observation model combines the predicted position and velocity with the position and velocity information derived by the observation processing unit from the IMU, magnetometer, barometer, WiFi, BLE, GNSS, etc., and updates the final position and velocity of the target.
Preferably, the IMU processing unit comprises a user-motion and device-usage mode recognition module, a heading deviation estimation module, and an improved dead-reckoning algorithm module. The mode recognition module identifies user motion modes such as standing still, walking and running, and device usage modes such as hand-held, texting, calling, navigating, in-pocket and in-backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g. the magnetometer). The heading deviation estimation module estimates the heading deviation from the recognized motion and usage modes together with the raw data provided by the IMU and other optional hardware (e.g. the magnetometer). The improved dead-reckoning algorithm module obtains the IMU position information from the heading deviation output by the heading deviation estimation module and the raw data provided by the IMU and other optional hardware (e.g. the magnetometer), and passes it to the fusion filter.
Preferably, the improved dead-reckoning algorithm module comprises an attitude determination module, a heading deviation compensation module, a step detection module, a step-length estimation module and a dead-reckoning module. The attitude determination module identifies the attitude of the handheld smart device from the raw data provided by the IMU and the optional magnetometer of the platform. The heading deviation compensation module reads the heading deviation output by the heading deviation estimation module, applies it to the pedestrian heading, and outputs the result to the dead-reckoning module. The step detection module detects the pedestrian's steps from the raw IMU data of the platform and feeds the result to the step-length estimation module. The step-length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data of the platform and feeds it to the dead-reckoning module. The dead-reckoning module computes the IMU position observation from the step-length information output by the step-length estimation module and the compensated pedestrian heading output by the heading deviation compensation module, and feeds it to the fusion filter.
Correspondingly, the invention also provides a pedestrian navigation method based on a novel multi-sensor fusion technology, comprising the following steps:
(1) the handheld smart device uses its own hardware to acquire raw data from the IMU, magnetometer, barometer, WiFi, BLE and GNSS; (2) the observation processing unit processes the raw data provided by the handheld smart device to supply observations such as position and velocity to the fusion filter; (3) the fusion filter uses a kinematic model as its system model and builds its observation model from the results of the observation processing unit; the pedestrian navigation result is finally obtained through the fusion filter.
Preferably, processing the raw data provided by the handheld smart device to supply observations such as position and velocity to the fusion filter comprises the following steps:
(1) the IMU processing unit processes the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and passes it to the fusion filter; (2) the magnetometer processing unit processes the raw geomagnetic data provided by the magnetometer to obtain geomagnetic position information and passes it to the fusion filter; (3) the barometer processing unit processes the raw atmospheric-pressure data provided by the barometer to obtain elevation information and passes it to the fusion filter; (4) the WiFi processing unit processes the raw RSS data provided by WiFi to obtain WiFi position information and passes it to the fusion filter; (5) the BLE processing unit processes the raw RSS data provided by BLE to obtain BLE position information and passes it to the fusion filter; (6) the GNSS processing unit processes the raw GNSS data provided by the GNSS chip to obtain GNSS position and velocity information and passes it to the fusion filter.
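In step (3), elevation is derived from barometric pressure. The patent does not prescribe the conversion; a common choice is the international standard atmosphere formula sketched below, where the sea-level reference pressure is an assumption that in practice would be calibrated (e.g. against a point of known height):

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """International standard atmosphere: pressure (hPa) to altitude (m).

    p0_hpa is the sea-level reference pressure; it drifts with weather,
    so absolute altitudes need calibration against a known height.
    """
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def relative_height(p_now_hpa, p_ref_hpa):
    """Height change since a reference reading -- sufficient for floor
    detection even when the absolute reference is uncalibrated."""
    return pressure_to_altitude(p_now_hpa) - pressure_to_altitude(p_ref_hpa)
```

For pedestrian navigation the relative form is usually the more robust observation, since only floor changes, not absolute altitude, are needed indoors.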
Preferably, processing the raw acceleration and angular-velocity data provided by the IMU to obtain IMU position information and passing it to the fusion filter comprises the following steps:
(1) the user-motion and device-usage mode recognition module identifies user motion modes such as standing still, walking and running, and device usage modes such as hand-held, texting, calling, navigating, in-pocket and in-backpack, from the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g. the magnetometer);
(2) the heading deviation estimation module estimates the heading deviation from the recognized user motion and device usage modes output by the mode recognition module, together with the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g. the magnetometer);
(3) the improved dead-reckoning algorithm module obtains the IMU position information from the heading deviation output by the heading deviation estimation module and the raw data provided by the IMU of the handheld smart device platform and other optional hardware (e.g. the magnetometer), and passes it to the fusion filter.
Preferably, the improved dead-reckoning algorithm module performs the following steps:
(1) the attitude determination module identifies the attitude of the handheld smart device from the raw data provided by the IMU and the optional magnetometer of the platform;
(2) the heading deviation compensation module reads the heading deviation output by the heading deviation estimation module, applies it to the pedestrian heading, and outputs the result to the dead-reckoning algorithm;
(3) the step detection module detects the pedestrian's steps from the raw IMU data of the platform and feeds the result to the step-length estimation module;
(4) the step-length estimation module estimates the pedestrian's step length from the step detection result and the raw IMU data of the platform and feeds it to the dead-reckoning module;
(5) the dead-reckoning module computes the IMU position observation from the step-length information output by the step-length estimation module and the pedestrian heading output by the heading deviation compensation module, and feeds it to the fusion filter.
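The dead-reckoning update of step (5) can be sketched as follows. The step-length model (Weinberg's acceleration-range model) and the heading convention (clockwise from north, so north is +y and east is +x) are illustrative assumptions, not choices stated by the patent, and the gain k is a user-specific placeholder:

```python
import math

def pdr_step(x, y, step_length_m, heading_rad):
    """One dead-reckoning update: advance the position by one detected
    step along the (bias-compensated) heading."""
    return (x + step_length_m * math.sin(heading_rad),
            y + step_length_m * math.cos(heading_rad))

def step_length(acc_max, acc_min, k=0.48):
    """Weinberg-style step-length estimate from the per-step vertical
    acceleration extremes; k is a user-specific gain (assumed here)."""
    return k * (acc_max - acc_min) ** 0.25
```

Each detected step yields one such position increment; the accumulated position is what the module feeds to the fusion filter as the IMU position observation.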
The beneficial effects of the invention are as follows. The invention optimizes how the IMU is used in traditional pedestrian navigation, freeing it from the system model of the fusion filter and turning it into part of the observation model, which overcomes the drawback of traditional multi-sensor fusion that navigation errors accumulate rapidly when no other aiding system is available. The IMU processing module of the invention accounts for the many ways handheld smart devices are carried in daily life, breaking the constraint of traditional multi-sensor fusion that the IMU must be rigidly fixed to the carrier. The invention therefore greatly improves the accuracy and usability of pedestrian navigation.
Brief Description of the Drawings
Fig. 1 is a schematic structural diagram of the pedestrian navigation device based on novel multi-sensor fusion of the invention.
Fig. 2 is a schematic structural diagram of the inertial-unit processing module of the invention.
Fig. 3 is a schematic diagram of the Gaussian-kernel support vector machine nonlinear classifier of the invention.
Fig. 4 is a schematic diagram of the support vector machine for user-motion and smart-device-usage mode recognition of the invention.
Fig. 5 is a schematic diagram of the improved pedestrian dead-reckoning algorithm of the invention.
Detailed Description
As shown in Fig. 1, a pedestrian navigation device based on novel multi-sensor fusion comprises: a handheld smart device platform 1, an observation processing unit 2 and a fusion filter 3. The handheld smart device 1 uses its own hardware to acquire raw data from the Inertial Measurement Unit (IMU) 11, magnetometer 12, barometer 13, WiFi 14, Bluetooth Low Energy (BLE) 15 and Global Navigation Satellite System (GNSS) 16. The observation processing unit 2 processes the raw data provided by the handheld smart device 1 to supply observations such as position and velocity to the fusion filter 3. The fusion filter 3 uses a kinematic model as its system model 31 and builds its observation model 32 from the results of the observation processing unit; the pedestrian navigation result is finally obtained through the fusion filter 3. The novel multi-sensor fusion pedestrian navigation device can be used with various handheld smart devices (including smartphones, tablets, smart watches, etc.), which may be hand-held or fixed to the pedestrian.
The novel multi-sensor fusion pedestrian navigation system shown in Fig. 1 overturns how the IMU is used in traditional pedestrian navigation, freeing it from the system model 31 of the fusion filter and turning it into part of the observation model 32, thereby overcoming the drawback of traditional multi-sensor fusion navigation devices that navigation errors accumulate rapidly when no other aiding system is available.
The handheld smart device platform 1 includes the IMU 11, magnetometer 12, barometer 13, WiFi 14, BLE 15 and GNSS 16 common to existing smart devices. The IMU 11 provides raw acceleration and angular-velocity data; the magnetometer 12 provides raw geomagnetic data; the barometer 13 provides raw atmospheric-pressure data; WiFi 14 provides raw WiFi Received Signal Strength (RSS) data; BLE 15 provides raw BLE RSS data; GNSS 16 provides raw GNSS velocity and position data. Any other sensor of the handheld smart device 1 that can provide observation information can be included in the proposed multi-sensor fusion algorithm.
The observation processing unit 2 comprises an IMU processing unit 21, a magnetometer processing unit 22, a barometer processing unit 23, a WiFi processing unit 24, a BLE processing unit 25 and a GNSS processing unit 26. The IMU processing unit 21 processes the raw acceleration and angular-velocity data provided by the IMU 11 to obtain IMU position information and passes it to the fusion filter 3; the magnetometer processing unit 22 processes the raw geomagnetic data provided by the magnetometer 12 to obtain geomagnetic position information and passes it to the fusion filter 3; the barometer processing unit 23 processes the raw atmospheric-pressure data provided by the barometer 13 to obtain elevation information and passes it to the fusion filter 3; the WiFi processing unit 24 processes the raw RSS data provided by the WiFi 14 to obtain WiFi position information and passes it to the fusion filter 3; the BLE processing unit 25 processes the raw RSS data provided by the BLE 15 to obtain BLE position information and passes it to the fusion filter 3; the GNSS processing unit 26 processes the raw data of the GNSS 16 to obtain GNSS position and velocity information and passes it to the fusion filter 3. The observation processing unit 2 may also include further processing units that handle other sensors of the handheld smart device platform 1 to obtain position or velocity information for the fusion filter 3.
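The WiFi processing unit 24 and BLE processing unit 25 both turn raw RSS into a position observation. The patent does not fix the algorithm; one common choice is sketched below, assuming a log-distance path-loss model and a weighted centroid over detected beacons at known coordinates. The 1 m reference power and path-loss exponent are environment-dependent assumptions:

```python
import math

def rss_to_distance(rss_dbm, tx_power_dbm=-40.0, path_loss_exp=2.5):
    """Estimate transmitter distance (m) via the log-distance path-loss
    model. tx_power_dbm is the RSS expected at 1 m; both parameters are
    assumed values that would be calibrated per environment."""
    return 10.0 ** ((tx_power_dbm - rss_dbm) / (10.0 * path_loss_exp))

def weighted_centroid(beacons):
    """beacons: list of (x, y, rss_dbm) for detected WiFi/BLE anchors.
    Nearer beacons (larger RSS, smaller distance) get larger weight."""
    weights = [1.0 / rss_to_distance(r) for _, _, r in beacons]
    total = sum(weights)
    x = sum(w * bx for w, (bx, _, _) in zip(weights, beacons)) / total
    y = sum(w * by for w, (_, by, _) in zip(weights, beacons)) / total
    return x, y
```

Fingerprinting against a pre-surveyed RSS database is the other widespread option; either way the output is the same kind of position observation passed to the fusion filter 3.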
The fusion filter 3 comprises a system model 31 and an observation model 32. The system model 31 uses a kinematic model to predict the position and velocity of the target and passes the prediction to the observation model 32. The observation model 32 combines the predicted position and velocity with the position and velocity information derived by the observation processing unit from the IMU 11, magnetometer 12, barometer 13, WiFi 14, BLE 15, GNSS 16, etc., and updates the final position and velocity of the target.
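A minimal sketch of such a filter, assuming a one-dimensional constant-velocity kinematic system model and position fixes as the observation (the patent does not specify the filter type or noise settings; the process noise q and measurement noise r below are placeholders):

```python
# 1-D constant-velocity Kalman filter. State is (x, v); the symmetric
# covariance is stored as (p11, p12, p22). The kinematic system model
# predicts; a position fix from any processing unit updates.
def predict(state, cov, dt, q=0.1):
    x, v = state
    p11, p12, p22 = cov
    x += v * dt                               # x' = x + v*dt
    p11 += dt * (2.0 * p12 + dt * p22) + q    # P <- F P F^T + Q
    p12 += dt * p22
    p22 += q
    return (x, v), (p11, p12, p22)

def update(state, cov, z, r=1.0):
    x, v = state
    p11, p12, p22 = cov
    s = p11 + r                               # innovation variance
    k1, k2 = p11 / s, p12 / s                 # Kalman gain
    innov = z - x
    x, v = x + k1 * innov, v + k2 * innov
    p22 -= k2 * p12                           # P <- (I - K H) P
    p11 *= (1.0 - k1)
    p12 *= (1.0 - k1)
    return (x, v), (p11, p12, p22)
```

In the device each observation source (IMU position, WiFi, BLE, GNSS, elevation) would call `update` with its own measurement noise, and the same structure extends per axis to 2-D or 3-D.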
As shown in Fig. 2, the IMU processing unit 21 comprises a user-motion and device-usage mode recognition module 211, a heading deviation estimation module 212, and an improved dead-reckoning algorithm module 213. The mode recognition module 211 identifies user motion modes such as standing still, walking and running, and device usage modes such as hand-held, texting, calling, navigating, in-pocket and in-backpack, from the raw data provided by the IMU 11 of the handheld smart device platform and other optional hardware (e.g. the magnetometer 12). The heading deviation estimation module 212 estimates the heading deviation from the recognized user motion and device usage modes output by the mode recognition module 211, together with the raw data provided by the IMU 11 and other optional hardware (e.g. the magnetometer 12). The improved dead-reckoning algorithm module 213 obtains the IMU position information from the heading deviation output by the heading deviation estimation module 212 and the raw data provided by the IMU 11 of the handheld smart device platform 1 and other optional hardware (e.g. the magnetometer 12), and passes it to the fusion filter 3.
The IMU processing unit of Fig. 2 accounts for the various motion modes of pedestrians and the various usage modes of smart devices, providing IMU data processing methods for these usage scenarios, breaking the constraint of traditional algorithms that the IMU must be rigidly fixed to the carrier, and improving the usability of the pedestrian navigation system.
The user-motion-pattern and device-usage-pattern recognition module 211 uses the sensor outputs already available on the handheld smart device 1: the IMU 11, the magnetometer 12, a ranging (proximity) sensor (optional), and a light sensor (optional). The IMU 11 and magnetometer 12 update at 50-200 Hz; the latter two sensors output scalars and update on user activity. The recognition algorithm extracts sensor statistics over windows of 1-3 seconds and makes a classification decision. Many implementations of the recognition algorithm are possible; the present invention uses a Gaussian-kernel dual-form support vector machine as an example.
As shown in FIG. 3, a support vector machine with a Gaussian kernel implicitly maps feature vectors into an infinite-dimensional linear space, matching or surpassing the performance of nonlinear classifiers such as the traditional KNN. The primal form of the $l_1$-norm soft-margin support vector machine is as follows:
Training formula (1):

$$\min_{w,b,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_i \xi_i$$
$$\text{s.t.}\quad y_i\left(w^T\phi(x_i)+b\right)\ge 1-\xi_i,\qquad \xi_i\ge 0.$$

where $x_i\in\mathbb{R}^d$ and $y_i$, $i=1,2,\dots$, are the feature vectors and class labels in the training set, $w\in\mathbb{R}^d$ is the weight vector, $C$ is a tunable regularization constant that balances over- and under-fitting of the training data, and $\phi(\cdot)$ is the feature mapping function.
Classification formula (2):

$$f(x) = w^T\phi(x) + b.$$
Since the KKT conditions are satisfied, the dual form of the $l_1$-norm soft-margin support vector machine is as follows:

Training formula (3):

$$\max_{\alpha}\ \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_j \alpha_i\alpha_j y_i y_j\,\phi(x_i)^T\phi(x_j)$$
$$\text{s.t.}\quad 0\le\alpha_i\le C,\qquad \sum_i \alpha_i y_i = 0.$$
Classification formula (4):

$$f(x) = \sum_i \alpha_i y_i\,\phi(x_i)^T\phi(x) + b.$$
The Gaussian kernel is introduced as an efficient way to evaluate the mapping and inner product, formula (5):

$$k(x,x') = \phi(x)^T\phi(x') = \exp\left(-\|x-x'\|^2/2\sigma^2\right),\qquad \sigma>0.$$
The dual form can then be rewritten as follows:

Training formula (6):

$$\max_{\alpha}\ \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_j \alpha_i\alpha_j y_i y_j\,k(x_i,x_j)$$
$$\text{s.t.}\quad 0\le\alpha_i\le C,\qquad \sum_i \alpha_i y_i = 0.$$
Classification formula (7):

$$f(x) = \sum_i \alpha_i y_i\,k(x_i,x) + b.$$
Unlike the traditional KNN classifier, the optimal dual solution $\alpha_i$, $i=1,2,\dots$, of the support vector machine has only a small number of non-zero entries, so only a small subset of the training feature vectors (the support vectors) need to be retained for online classification. For example, to realize the nonlinear classifier on randomly generated data shown in FIG. 3, KNN must store and use all 1000 original training feature vectors, whereas the support vector machine needs only 142 support vectors. This greatly reduces battery consumption and system memory requirements, making the method well suited to handheld smart device platforms.
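The sparsity argument above can be sketched numerically. The toy data set, kernel width, regularization constant, and the projected-gradient solver below are illustrative assumptions, not the patent's implementation (in particular, the bias $b$ is fixed at 0 and the equality constraint of dual (6) is dropped to keep the sketch short):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (cf. FIG. 3): label +1 inside a disk, -1 outside.
X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(np.linalg.norm(X, axis=1) < 0.6, 1.0, -1.0)

def gaussian_kernel(A, B, sigma=0.3):
    # k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)), formula (5)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Solve the box-constrained dual by projected gradient ascent
# (simplified: bias fixed at 0, equality constraint omitted).
C = 10.0
K = gaussian_kernel(X, X)
Q = K * np.outer(y, y)
alpha = np.zeros(len(X))
for _ in range(2000):
    alpha = np.clip(alpha + 0.01 * (1.0 - Q @ alpha), 0.0, C)

# Only the support vectors (alpha_i > 0) are kept for online use, formula (7).
sv = alpha > 1e-6
f = lambda Z: gaussian_kernel(Z, X[sv]) @ (alpha[sv] * y[sv])

print(sv.sum() < len(X))   # sparse: fewer stored vectors than KNN would need
print(np.sign(f(np.array([[0.0, 0.0], [0.9, 0.9]]))))  # inside vs. outside
```

Because most training points end up with $\alpha_i = 0$, only the support-vector subset participates in each online evaluation of formula (7), which is the memory and battery saving described above.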
The recognition module 211 divides pedestrian motion patterns into five classes: 1. stationary; 2. walking; 3. running; 4. cycling; 5. driving. Recognized motion patterns can be used to apply zero-velocity updates, to adjust the process-noise variance of the tracking filter, and to tune the correlation time of the Markov process in the dynamic system model. The module 211 divides device usage patterns into four classes: 1. held flat in front; 2. held upright at the ear; 3. in a backpack; 4. in an armband. Recognized device poses can be used to determine the direction of travel (via coordinate transformation) and to adjust the process-noise variance of the tracking filter.
FIG. 4 is a schematic diagram of the two-stage support vector machine for user-motion-pattern and device-usage-pattern recognition, comprising acceleration statistics 2111, angular-velocity statistics 2112, rotation- and tilt-angle statistics 2113, light and proximity statistics 2114, velocity-feedback statistics 2115, a feature normalization module 2116, a principal component analysis module 2117, a support vector machine module 2118, a first-stage user-motion-pattern classifier 2119, and a second-stage device-usage-pattern classifier 2110. The implementation steps are: offline, collect a representative data set, normalize the feature vectors, perform principal component analysis, train with formulas (5) and (6), and extract and store the support vectors; online, compute the sensor output statistics, normalize the features and extract the principal components (with the same coefficients as the training set), and apply the stored support vectors with formulas (5) and (7) in two classification stages to decide the user motion pattern and device usage pattern.
The heading-angle offset estimation module 212 of the present invention comprises several different methods. When only the IMU 11 is available, a method based on principal component analysis (PCA) is used: a characteristic of pedestrian motion is that acceleration and deceleration occur along the direction of travel, so applying PCA to the accelerometer data yields the pedestrian's direction of travel. When GNSS 16 is available, the direction of travel can be computed from the GNSS velocity. When the magnetometer 12 is available, the direction of travel can also be computed from the magnetometer 12. The heading angle of the handheld smart device itself is obtained by nine-axis or six-axis fusion. The heading-angle offset is therefore obtained by subtracting the device heading angle from the fused solution of the pedestrian's direction of travel obtained by these methods, and is output to the heading-angle offset compensation module 2132.
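The PCA idea above can be sketched as follows: the dominant axis of the horizontal acceleration samples is taken as the direction of travel (up to the usual 180-degree ambiguity). The synthetic gait signal and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic horizontal accelerations [east, north] for a pedestrian
# walking along a 30-degree heading (from north, clockwise).
heading = np.deg2rad(30.0)
along = rng.normal(0.0, 1.0, 500)    # acceleration/deceleration along travel
cross = rng.normal(0.0, 0.1, 500)    # small lateral sway
a_e = along * np.sin(heading) + cross * np.cos(heading)
a_n = along * np.cos(heading) - cross * np.sin(heading)

A = np.column_stack([a_e, a_n])
A -= A.mean(axis=0)
# First principal component = eigenvector of the sample covariance
# with the largest eigenvalue.
w, v = np.linalg.eigh(A.T @ A)
pc = v[:, -1]                                        # [east, north]
est = np.rad2deg(np.arctan2(pc[0], pc[1])) % 180.0   # fold 180-deg ambiguity
print(abs(est - 30.0) < 5.0)
```

The sign ambiguity of the principal axis (forward vs. backward) is why the module fuses this estimate with GNSS or magnetometer headings when they are available.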
FIG. 5 is a schematic diagram of the improved dead reckoning algorithm module 213, comprising an attitude determination module 2131, a heading-angle offset compensation module 2132, a step detection module 2133, a step length estimation module 2134, and a dead reckoning module 2135. The attitude determination module 2131 recognizes the attitude of the handheld smart device 1 from the raw data provided by the IMU 11 of the handheld smart device platform 1 and the optional magnetometer 12. The heading-angle offset compensation module 2132 reads the heading-angle offset output by the estimation module 212, applies it to obtain the pedestrian heading angle, and outputs the result to the dead reckoning module 2135. The step detection module 2133 detects the pedestrian's steps from the raw IMU 11 data and feeds the result to the step length estimation module 2134, which estimates the pedestrian's step length from the detected steps and the raw IMU 11 data and feeds it to the dead reckoning module 2135. The dead reckoning module 2135 computes the IMU position observation from the step length information output by module 2134 and the pedestrian heading output by module 2132, and outputs it to the fusion filter 3.
The attitude determination module 2131 recognizes the attitude of the handheld smart device 1 from the raw data provided by the IMU 11 of the handheld smart device platform 1 and the optional magnetometer 12. Depending on whether geomagnetic information is available, it selects a nine-axis or a six-axis attitude algorithm. Finally, the module 2131 outputs the heading angle of the smart device 1 to the heading-angle offset compensation module 2132.
The heading-angle offset compensation module 2132 reads the heading-angle offset output by the estimation module 212, applies it to obtain the pedestrian heading angle, and outputs the result to the dead reckoning algorithm module 2135. The calculation is:

$$\theta_p = \theta_d + \theta_{\mathrm{offset}}.\qquad(1)$$

where $\theta_p$ is the pedestrian heading angle, $\theta_d$ is the device heading angle, and $\theta_{\mathrm{offset}}$ is the heading-angle offset.
The step detection module 2133 detects the pedestrian's steps from the raw data of the IMU 11 of the handheld smart device platform 1 and feeds the step count to the step length estimation module 2134. Steps may be detected by methods such as peak detection, zero-crossing detection, correlation detection, and power-spectrum detection. Because multiple user motion patterns and device usage patterns must be handled, the step detection algorithm of the present invention applies peak detection simultaneously to the accelerometer and gyroscope data of the IMU 11.
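A minimal sketch of the peak-detection approach on the accelerometer magnitude follows; the threshold, debounce gap, and synthetic gait signal are illustrative assumptions, not calibrated values from the patent:

```python
import math

def count_steps(acc_mag, threshold=10.5, min_gap=20):
    """Count peaks above `threshold` that are at least `min_gap`
    samples apart (debounces the multiple peaks within one step)."""
    steps, last_peak = 0, -min_gap
    for i in range(1, len(acc_mag) - 1):
        is_peak = acc_mag[i - 1] < acc_mag[i] >= acc_mag[i + 1]
        if is_peak and acc_mag[i] > threshold and i - last_peak >= min_gap:
            steps += 1
            last_peak = i
    return steps

# Synthetic 100 Hz accelerometer magnitude: gravity plus a 2 Hz gait bounce.
fs, f_step = 100, 2.0
sig = [9.81 + 2.0 * math.sin(2 * math.pi * f_step * n / fs) for n in range(500)]
print(count_steps(sig))  # 5 s at 2 steps/s -> 10
```

Running the same detector on the gyroscope magnitude in parallel, as the patent describes, makes detection robust when the device pose (pocket, backpack, armband) suppresses the vertical acceleration bounce.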
The step length estimation module 2134 estimates the pedestrian's step length from the result of the step detection module 2133 and the raw data of the IMU 11 of the handheld smart device platform 1, and outputs it to the dead reckoning module 2135. Step length may be estimated by different methods such as acceleration integration, pendulum models, linear models, and empirical models. To handle multiple user motion patterns and device usage patterns, the step length estimation of the present invention adopts the following linear model:
$$s_{k-1,k} = A\,(f_{k-1}+f_k) + B\,(\sigma_{acc,k-1}+\sigma_{acc,k}) + C\qquad(2)$$

where $A$, $B$, and $C$ are constants, $f_{k-1}$ and $f_k$ are the step frequencies at times $k-1$ and $k$, and $\sigma_{acc,k-1}$ and $\sigma_{acc,k}$ are the accelerometer variances at times $k-1$ and $k$.
The dead reckoning module 2135 computes the position $[r_{e,k}\ r_{n,k}]^T$ at time $k$ from the position $[r_{e,k-1}\ r_{n,k-1}]^T$ at time $k-1$, the step length $s_{k-1,k}$ output by the step length estimation module 2134, and the heading angle $\theta_{k-1}$ output by the heading-angle offset compensation module 2132. The corresponding calculation is:

$$\begin{bmatrix} r_{e,k} \\ r_{n,k} \end{bmatrix} = \begin{bmatrix} r_{e,k-1} \\ r_{n,k-1} \end{bmatrix} + s_{k-1,k}\begin{bmatrix} \sin\theta_{k-1} \\ \cos\theta_{k-1} \end{bmatrix}\qquad(3)$$
Finally, the dead reckoning module 2135 outputs the IMU position observation to the fusion filter 3.
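One full pedestrian-dead-reckoning update, combining the linear step-length model (2) with the east/north position update, can be sketched as follows. The coefficients A, B, C are illustrative assumptions, not calibrated values from the patent:

```python
import math

A, B, C = 0.2, 0.05, 0.3   # assumed step-length model coefficients

def step_length(f_prev, f_cur, var_prev, var_cur):
    # s_{k-1,k} = A*(f_{k-1} + f_k) + B*(sigma_{k-1} + sigma_k) + C
    return A * (f_prev + f_cur) + B * (var_prev + var_cur) + C

def dr_update(r_e, r_n, s, heading_rad):
    # Heading measured from north, clockwise, in the east-north frame.
    return r_e + s * math.sin(heading_rad), r_n + s * math.cos(heading_rad)

s = step_length(1.8, 2.0, 0.4, 0.5)   # 0.2*3.8 + 0.05*0.9 + 0.3 = 1.105
r_e, r_n = dr_update(0.0, 0.0, s, math.radians(90.0))  # one step due east
print(round(s, 3), round(r_e, 3), round(r_n, 3))
```

The resulting $[r_{e,k}\ r_{n,k}]^T$ is exactly the IMU position observation that the module hands to the fusion filter.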
The fusion filter 3 includes a system model 31 and an observation model 32. Traditional multi-sensor fusion structures generally process the IMU measurements through an inertial mechanization algorithm and build the fusion-filter system model on it. Because mechanization involves many integration operations, the positioning error of such structures accumulates rapidly whenever no external aiding system is available. The present invention overcomes this defect by using the pedestrian motion model as the system model, while IMU-derived data, like the other systems, enters through the observation model.
The fusion filter 3 may be implemented as a Kalman filter (KF), an adaptive Kalman filter (AKF), an unscented Kalman filter (UKF), or a particle filter (PF). The present invention gives a KF design example; the other filters can follow the KF design. The state vector of the KF implementation of fusion filter 3 is defined as follows:
$$x = [r_e\ r_n\ r_u\ v_e\ v_n\ v_u]^T\qquad(4)$$

where $r_e$, $r_n$, and $r_u$ are the three-dimensional position (in the east-north-up frame) and $v_e$, $v_n$, and $v_u$ are the corresponding velocity components. The KF system model 31 adopts a classical kinematic model, defined as follows:
$$x_{k+1|k} = \Phi_{k,k+1}\,x_{k|k} + \omega_k\qquad(5)$$

where $x_{k+1|k}$ is the predicted state vector, $x_{k|k}$ is the prior state vector at time $k$, and $\Phi_{k,k+1}$ is the 6×6 transition matrix:

$$\Phi_{k,k+1} = \begin{bmatrix} I_{3\times3} & \Delta t\,I_{3\times3} \\ 0_{3\times3} & I_{3\times3} \end{bmatrix}\qquad(6)$$

where $\Delta t$ is the time difference between the two epochs. $\omega_k$ is the process noise with covariance matrix $Q_k$, defined as follows:

$$\omega_k = [0\ 0\ 0\ \delta v_{e,k}\ \delta v_{n,k}\ \delta v_{u,k}]^T\qquad(7)$$

where $\delta v_{e,k}$, $\delta v_{n,k}$, and $\delta v_{u,k}$ are the velocity noise components in the east-north-up frame at time $k$, modeled as random walks:

$$\delta v_{e,k} = \delta v_{e,k-1} + n_e\,\Delta t,\quad \delta v_{n,k} = \delta v_{n,k-1} + n_n\,\Delta t,\quad \delta v_{u,k} = \delta v_{u,k-1} + n_u\,\Delta t\qquad(8)$$

where $\delta v_{e,k-1}$, $\delta v_{n,k-1}$, and $\delta v_{u,k-1}$ are the velocity noise components at time $k-1$, $n_e$, $n_n$, and $n_u$ are Gaussian white noises, and $\Delta t$ is the time difference between the two epochs.
The measurement model 32 of the KF implementation of fusion filter 3 is defined as follows:

$$z_k = H_k\,x_{k|k} + \upsilon_k\qquad(9)$$

where $z_k$ is the measurement vector and $H_k$ is the design matrix. $\upsilon_k$ is the measurement noise, modeled as Gaussian white noise with covariance matrix $R_k$. Both $z_k$ and $H_k$ vary with the observation source. When the observation comes from the IMU 11, the typical $z_k$ and $H_k$ are defined as follows:
$$z_k = [r_e\ r_n]^T,\qquad H_k = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 0 \end{bmatrix}\qquad(10)$$
When the observation comes from the magnetometer 12, the typical $z_k$ and $H_k$ are defined as follows:

$$z_k = [r_e\ r_n\ r_u]^T,\qquad H_k = \begin{bmatrix} I_{3\times3} & 0_{3\times3} \end{bmatrix}\qquad(11)$$
When the observation comes from the barometer 13, the typical $z_k$ and $H_k$ are defined as follows:

$$z_k = r_u,\qquad H_k = [0\ 0\ 1\ 0\ 0\ 0]\qquad(12)$$

When the observation comes from WiFi 14, the typical $z_k$ and $H_k$ are defined as follows:
$$z_k = [r_e\ r_n\ r_u]^T,\qquad H_k = \begin{bmatrix} I_{3\times3} & 0_{3\times3} \end{bmatrix}\qquad(13)$$

When the observation comes from BLE 15, the typical $z_k$ and $H_k$ are defined as follows:
$$z_k = [r_e\ r_n\ r_u]^T,\qquad H_k = \begin{bmatrix} I_{3\times3} & 0_{3\times3} \end{bmatrix}\qquad(14)$$

When the observation comes from GNSS 16, the typical $z_k$ and $H_k$ are defined as follows:
$$z_k = [r_e\ r_n\ r_u\ v_e\ v_n\ v_u]^T,\qquad H_k = I_{6\times6}\qquad(15)$$
The KF runs in two phases: prediction and update. In the prediction phase, the state vector and covariance matrix are propagated with the system model:

$$x_{k+1|k} = \Phi_{k,k+1}\,x_{k|k},\qquad P_{k+1|k} = \Phi_{k,k+1}\,P_{k|k}\,\Phi_{k,k+1}^T + Q_k$$

In the update phase, the state vector and covariance matrix are corrected with the measurement model:

$$K_k = P_{k|k-1}H_k^T\left(H_kP_{k|k-1}H_k^T + R_k\right)^{-1}$$
$$x_{k|k} = x_{k|k-1} + K_k\left(z_k - H_kx_{k|k-1}\right),\qquad P_{k|k} = \left(I - K_kH_k\right)P_{k|k-1}$$

where $K_k$ is called the Kalman gain.
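One predict/update cycle of the 6-state constant-velocity filter, with a position-only observation (as a dead-reckoning fix would supply), can be sketched as follows. The noise levels, time step, and measurement values are illustrative assumptions:

```python
import numpy as np

dt = 1.0
I3 = np.eye(3)
Phi = np.block([[I3, dt * I3], [np.zeros((3, 3)), I3]])  # transition matrix
Q = np.diag([0, 0, 0, 0.1, 0.1, 0.1])                    # process noise
H = np.hstack([np.eye(2), np.zeros((2, 4))])             # east/north position obs.
R = 0.5 * np.eye(2)                                      # measurement noise

x = np.zeros(6)   # state [r_e r_n r_u v_e v_n v_u], start at origin, at rest
P = np.eye(6)

z = np.array([1.0, 0.0])   # observation: 1 m east, 0 m north

# Prediction phase
x = Phi @ x
P = Phi @ P @ Phi.T + Q

# Update phase
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
x = x + K @ (z - H @ x)
P = (np.eye(6) - K @ H) @ P

print(0.0 < x[0] < 1.0)   # estimate pulled toward the 1 m east measurement
```

Swapping `z`, `H`, and `R` per observation source (IMU, magnetometer, barometer, WiFi, BLE, GNSS) reproduces the loosely coupled structure of the fusion filter 3.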
Although the present invention has been illustrated and described in terms of preferred embodiments, those skilled in the art will understand that various changes and modifications can be made to the present invention without departing from the scope defined by its claims.
Claims (12)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610431107.0A CN106017454B (en) | 2016-06-16 | 2016-06-16 | A kind of pedestrian navigation device and method based on multi-sensor fusion technology |
| PCT/CN2016/087281 WO2017215024A1 (en) | 2016-06-16 | 2016-06-27 | Pedestrian navigation device and method based on novel multi-sensor fusion technology |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610431107.0A CN106017454B (en) | 2016-06-16 | 2016-06-16 | A kind of pedestrian navigation device and method based on multi-sensor fusion technology |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN106017454A true CN106017454A (en) | 2016-10-12 |
| CN106017454B CN106017454B (en) | 2018-12-14 |
Family
ID=57089015
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201610431107.0A Active CN106017454B (en) | 2016-06-16 | 2016-06-16 | A kind of pedestrian navigation device and method based on multi-sensor fusion technology |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106017454B (en) |
| WO (1) | WO2017215024A1 (en) |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106767784A (en) * | 2016-12-21 | 2017-05-31 | 上海网罗电子科技有限公司 | A kind of bluetooth trains the fire-fighting precision indoor localization method of inertial navigation |
| CN107328406A (en) * | 2017-06-28 | 2017-11-07 | 中国矿业大学(北京) | A kind of mine movable object localization method and system based on Multiple Source Sensor |
| CN107943042A (en) * | 2017-12-06 | 2018-04-20 | 东南大学 | A kind of earth magnetism fingerprint database automated construction method and device |
| CN107990901A (en) * | 2017-11-28 | 2018-05-04 | 元力云网络有限公司 | A kind of sensor-based user direction localization method |
| CN108413968A (en) * | 2018-07-10 | 2018-08-17 | 上海奥孛睿斯科技有限公司 | A kind of method and system of movement identification |
| CN110118549A (en) * | 2018-02-06 | 2019-08-13 | 刘禹岐 | A kind of Multi-source Information Fusion localization method and device |
| CN110849392A (en) * | 2019-11-15 | 2020-02-28 | 上海有个机器人有限公司 | Robot mileage counting data correction method and robot |
| CN110986941A (en) * | 2019-11-29 | 2020-04-10 | 武汉大学 | A method for estimating the installation angle of mobile phones |
| CN111174781A (en) * | 2019-12-31 | 2020-05-19 | 同济大学 | Inertial navigation positioning method based on wearable device combined target detection |
| CN111256709A (en) * | 2020-02-18 | 2020-06-09 | 北京九曜智能科技有限公司 | Vehicle dead reckoning positioning method and device based on encoder and gyroscope |
| CN111984853A (en) * | 2019-05-22 | 2020-11-24 | 北京车和家信息技术有限公司 | Test driving report generation method and cloud server |
| CN112379395A (en) * | 2020-11-24 | 2021-02-19 | 中国人民解放军海军工程大学 | Positioning navigation time service system |
| CN114966791A (en) * | 2022-05-12 | 2022-08-30 | 之江实验室 | Inertial-assisted GNSS positioning method suitable for vehicle-mounted smart phone platform |
| CN115597602A (en) * | 2022-10-20 | 2023-01-13 | Oppo广东移动通信有限公司(Cn) | Robot navigation positioning method and device, readable medium and electronic equipment |
Families Citing this family (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109682372B (en) * | 2018-12-17 | 2022-10-18 | 重庆邮电大学 | Improved PDR method combining building structure information and RFID calibration |
| CN110132257B (en) * | 2019-05-15 | 2023-03-24 | 吉林大学 | Human behavior prediction method based on multi-sensor data fusion |
| CN110427046B (en) * | 2019-07-26 | 2022-09-30 | 沈阳航空航天大学 | Three-dimensional smooth random-walking unmanned aerial vehicle cluster moving model |
| CN112747754B (en) * | 2019-10-30 | 2025-01-14 | 北京魔门塔科技有限公司 | A method, device and system for fusion of multi-sensor data |
| CN110764506B (en) * | 2019-11-05 | 2022-10-11 | 广东博智林机器人有限公司 | Course angle fusion method and device of mobile robot and mobile robot |
| CN110954109A (en) * | 2019-12-12 | 2020-04-03 | 杭州十域科技有限公司 | Indoor space positioning equipment and positioning method based on fingerprint positioning and inertial navigation system |
| CN111174780B (en) * | 2019-12-31 | 2022-03-08 | 同济大学 | Road inertial navigation positioning system for blind people |
| CN111811502B (en) * | 2020-07-10 | 2022-07-22 | 北京航空航天大学 | Motion carrier multi-source information fusion navigation method and system |
| CN112268557B (en) * | 2020-09-22 | 2024-03-05 | 宽凳(湖州)科技有限公司 | Real-time high-precision positioning method for urban scene |
| CN112556696B (en) * | 2020-12-03 | 2022-01-07 | 腾讯科技(深圳)有限公司 | Object positioning method and device, computer equipment and storage medium |
| CN112880672B (en) * | 2021-01-14 | 2024-11-26 | 武汉元生创新科技有限公司 | AI-based adaptive method and device for inertial sensor fusion strategy |
| CN113008224A (en) * | 2021-03-04 | 2021-06-22 | 国电瑞源(西安)智能研究院有限公司 | Indoor and outdoor self-adaptive navigation system and method integrating multiple sensors |
| CN113029153B (en) * | 2021-03-29 | 2024-05-28 | 浙江大学 | Multi-scenario PDR positioning method based on smartphone multi-sensor fusion and SVM classification |
| CN113229804A (en) * | 2021-05-07 | 2021-08-10 | 陕西福音假肢有限责任公司 | Magnetic field data fusion circuit and method for joint mobility |
| CN113790722B (en) * | 2021-08-20 | 2023-09-12 | 北京自动化控制设备研究所 | A pedestrian step length modeling method based on time-frequency domain feature extraction from inertial data |
| CN113655439A (en) * | 2021-08-31 | 2021-11-16 | 上海第二工业大学 | An Indoor Localization Method Based on Improved Particle Filtering |
| CN114910899B (en) * | 2022-03-28 | 2025-06-06 | 北京航空航天大学杭州创新研究院 | A relative positioning method for partner robots based on multi-information fusion |
| CN115200575B (en) * | 2022-07-13 | 2024-11-19 | 中国电子科技集团公司第五十四研究所 | Indoor positioning method based on human activity recognition assisted pedestrian dead reckoning |
| CN115560759A (en) * | 2022-09-06 | 2023-01-03 | 哈尔滨工程大学 | Underwater multi-source navigation positioning method based on seabed oil and gas pipeline detection |
| WO2024082214A1 (en) * | 2022-10-20 | 2024-04-25 | Telefonaktiebolaget Lm Ericsson (Publ) | Improved target positioning by using multiple terminal devices |
| CN117942068B (en) * | 2024-01-29 | 2024-06-28 | 连云港市第二人民医院(连云港市临床肿瘤研究所) | Lower limb pose fuzzy guiding control method and system based on micro inertial navigation |
| CN119413163B (en) * | 2025-01-06 | 2025-04-04 | 深圳市爱保护科技有限公司 | Precise positioning method, device, equipment and storage medium for inertial measurement sensor for intelligent terminal equipment |
| CN119555067B (en) * | 2025-01-26 | 2025-05-23 | 中国科学院空天信息创新研究院 | Human body trunk inertial navigation autonomous positioning method and device based on inverted pendulum model |
| CN120141463B (en) * | 2025-05-15 | 2025-07-15 | 中国人民解放军国防科技大学 | Navigation method combining satellite, inertial navigation, spectrum map and magnetometer and application |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102175463A (en) * | 2011-02-12 | 2011-09-07 | 东南大学 | Method for detecting braking property of vehicle in road test based on improved Kalman filtering |
| CN103759730A (en) * | 2014-01-16 | 2014-04-30 | 南京师范大学 | Collaborative navigation system based on navigation information bilateral fusion for pedestrian and intelligent mobile carrier and navigation method thereof |
| CN103968827A (en) * | 2014-04-09 | 2014-08-06 | 北京信息科技大学 | Wearable human body gait detection self-localization method |
| CN104613963A (en) * | 2015-01-23 | 2015-05-13 | 南京师范大学 | Pedestrian navigation system and navigation positioning method based on kinesiology model |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB201500411D0 (en) * | 2014-09-15 | 2015-02-25 | Isis Innovation | Determining the position of a mobile device in a geographical area |
| US9410979B2 (en) * | 2014-09-23 | 2016-08-09 | Fitbit, Inc. | Hybrid angular motion sensors |
| CN104931049A (en) * | 2015-06-05 | 2015-09-23 | 北京信息科技大学 | Movement classification-based pedestrian self-positioning method |
| CN105588566B (en) * | 2016-01-08 | 2019-09-13 | 重庆邮电大学 | An indoor positioning system and method based on fusion of bluetooth and MEMS |
2016
- 2016-06-16 CN CN201610431107.0A patent/CN106017454B/en active Active
- 2016-06-27 WO PCT/CN2016/087281 patent/WO2017215024A1/en not_active Ceased
Non-Patent Citations (2)
| Title |
|---|
| Zhang Peng et al., "Indoor pedestrian navigation and positioning combining PDR, WiFi fingerprinting, and magnetic field matching", Journal of Geomatics (《测绘地理信息》) * |
| Chen Xingxiu et al., "Indoor positioning by dead-reckoning inertial navigation for three-dimensional complex motion patterns", Journal of Applied Sciences (《应用科学学报》) * |
Cited By (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106767784B (en) * | 2016-12-21 | 2019-11-08 | 上海网罗电子科技有限公司 | A kind of fire-fighting precision indoor localization method of bluetooth training inertial navigation |
| CN106767784A (en) * | 2016-12-21 | 2017-05-31 | 上海网罗电子科技有限公司 | A kind of bluetooth trains the fire-fighting precision indoor localization method of inertial navigation |
| CN107328406A (en) * | 2017-06-28 | 2017-11-07 | 中国矿业大学(北京) | A kind of mine movable object localization method and system based on Multiple Source Sensor |
| CN107990901A (en) * | 2017-11-28 | 2018-05-04 | 元力云网络有限公司 | A kind of sensor-based user direction localization method |
| CN107943042A (en) * | 2017-12-06 | 2018-04-20 | 东南大学 | A kind of earth magnetism fingerprint database automated construction method and device |
| CN110118549A (en) * | 2018-02-06 | 2019-08-13 | 刘禹岐 | A kind of Multi-source Information Fusion localization method and device |
| CN110118549B (en) * | 2018-02-06 | 2021-05-11 | 刘禹岐 | Multi-source information fusion positioning method and device |
| CN108413968B (en) * | 2018-07-10 | 2018-10-09 | 上海奥孛睿斯科技有限公司 | A kind of method and system of movement identification |
| CN108413968A (en) * | 2018-07-10 | 2018-08-17 | 上海奥孛睿斯科技有限公司 | A kind of method and system of movement identification |
| CN111984853B (en) * | 2019-05-22 | 2024-03-22 | 北京车和家信息技术有限公司 | Test driving report generation method and cloud server |
| CN111984853A (en) * | 2019-05-22 | 2020-11-24 | 北京车和家信息技术有限公司 | Test driving report generation method and cloud server |
| CN110849392A (en) * | 2019-11-15 | 2020-02-28 | 上海有个机器人有限公司 | Robot mileage counting data correction method and robot |
| CN110986941A (en) * | 2019-11-29 | 2020-04-10 | 武汉大学 | A method for estimating the installation angle of mobile phones |
| CN111174781A (en) * | 2019-12-31 | 2020-05-19 | 同济大学 | Inertial navigation positioning method based on wearable device combined target detection |
| CN111174781B (en) * | 2019-12-31 | 2022-03-04 | 同济大学 | An Inertial Navigation Positioning Method Based on Joint Target Detection of Wearable Devices |
| CN111256709B (en) * | 2020-02-18 | 2021-11-02 | 北京九曜智能科技有限公司 | Vehicle dead reckoning positioning method and device based on encoder and gyroscope |
| CN111256709A (en) * | 2020-02-18 | 2020-06-09 | 北京九曜智能科技有限公司 | Vehicle dead reckoning positioning method and device based on encoder and gyroscope |
| CN112379395A (en) * | 2020-11-24 | 2021-02-19 | 中国人民解放军海军工程大学 | Positioning navigation time service system |
| CN112379395B (en) * | 2020-11-24 | 2023-09-05 | 中国人民解放军海军工程大学 | Positioning navigation time service system |
| CN114966791A (en) * | 2022-05-12 | 2022-08-30 | 之江实验室 | Inertial-assisted GNSS positioning method suitable for vehicle-mounted smart phone platform |
| CN114966791B (en) * | 2022-05-12 | 2025-08-12 | 之江实验室 | Inertial assisted GNSS positioning method suitable for vehicle-mounted smart phone platform |
| CN115597602A (en) * | 2022-10-20 | 2023-01-13 | Oppo广东移动通信有限公司(Cn) | Robot navigation positioning method and device, readable medium and electronic equipment |
| CN115597602B (en) * | 2022-10-20 | 2025-11-18 | Oppo广东移动通信有限公司 | Robot navigation and positioning methods and devices, readable media and electronic devices |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106017454B (en) | 2018-12-14 |
| WO2017215024A1 (en) | 2017-12-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106017454A (en) | Pedestrian navigation device and method based on novel multi-sensor fusion technology | |
| Ouyang et al. | A survey of magnetic-field-based indoor localization | |
| CN105074381B (en) | Method and apparatus for determining misalignment between equipment and pedestrians | |
| US10184797B2 (en) | Apparatus and methods for ultrasonic sensor navigation | |
| Wang et al. | Pedestrian dead reckoning based on walking pattern recognition and online magnetic fingerprint trajectory calibration | |
| US11875519B2 (en) | Method and system for positioning using optical sensor and motion sensors | |
| US10429196B2 (en) | Method and apparatus for cart navigation | |
| US10652696B2 (en) | Method and apparatus for categorizing device use case for on foot motion using motion sensor data | |
| US10648816B2 (en) | Device and method for integrated navigation based on wireless fingerprints and MEMS sensors | |
| US11041725B2 (en) | Systems and methods for estimating the motion of an object | |
| US20160084937A1 (en) | Systems and methods for determining position information using acoustic sensing | |
| EP3814864A1 (en) | Systems and methods for autonomous machine tracking and localization of mobile objects | |
| US10837794B2 (en) | Method and system for characterization of on foot motion with multiple sensor assemblies | |
| Kourogi et al. | A method of pedestrian dead reckoning using action recognition | |
| JP5750742B2 (en) | Mobile object state estimation device | |
| US9880005B2 (en) | Method and system for providing a plurality of navigation solutions | |
| US9818037B2 (en) | Estimating heading misalignment between a device and a person using optical sensor | |
| Han | Application of inertial navigation high precision positioning system based on SVM optimization | |
| US20240271938A1 (en) | Smartphone-based inertial odometry | |
| US11592295B2 (en) | System and method for position correction | |
| US20250089016A1 (en) | Techniques for device localization | |
| Xie et al. | Holding-manner-free heading change estimation for smartphone-based indoor positioning | |
| Saadatzadeh et al. | Pedestrian dead reckoning using smartphone sensors: an efficient indoor positioning system in complex buildings of smart cities |
| Alasaadi et al. | UniCoor: A smartphone unified coordinate system for ITS applications | |
| Parviainen | Studies on sensor aided positioning and context awareness |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||