
CN114623818A - High-precision wearable helmet data acquisition equipment - Google Patents

High-precision wearable helmet data acquisition equipment

Info

Publication number
CN114623818A
Authority
CN
China
Prior art keywords
data
imu
points
laser
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210185677.1A
Other languages
Chinese (zh)
Other versions
CN114623818B (en)
Inventor
杨必胜
李健平
吴唯同
陈瑞波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202210185677.1A priority Critical patent/CN114623818B/en
Publication of CN114623818A publication Critical patent/CN114623818A/en
Application granted granted Critical
Publication of CN114623818B publication Critical patent/CN114623818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/005: Navigation; navigational instruments not provided for in groups G01C 1/00-G01C 19/00, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/1652: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C 21/1656: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • G01C 21/18: Stabilised platforms, e.g. by gyroscope
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides high-precision wearable helmet data acquisition equipment that integrates a camera, an IMU (inertial measurement unit), and a laser scanner into a helmet and performs hardware time synchronization, so that accurate sensor timestamps are recorded while data are collected. The data acquisition process integrates all sensor data according to the hardware timestamps and divides it into data frames; removes motion distortion from the laser observations within a data frame according to the inertial measurement unit observations and assigns image color information to the points; classifies the laser points in the data frame into boundary feature points, surface feature points, and featureless points according to point cloud geometric curvature and image gray-level gradient; and associates features across consecutive data frames, computes the relative position and attitude, and solves for map point coordinates.

Description

High-precision wearable helmet data acquisition equipment

Technical Field

The present invention relates to the hardware design and data processing of a wearable helmet mobile mapping system, and belongs to the field of computer vision and the research field of automated processing of laser point cloud survey data.

Background

As a new type of basic surveying and mapping data, high-precision 3D point clouds are already widely used in highway, water conservancy, forestry, electric power, and other industries. Mobile mapping is the main means of acquiring high-precision 3D point cloud data and, depending on the carrying platform, can be divided into airborne, vehicle-mounted, and ground-based systems. These methods can handle large-scale mapping tasks, but problems remain: poor accuracy in areas without GNSS signals, difficult and slow operation in complex environments, and excessive overall equipment size. Under the trend toward ubiquitous surveying and mapping, more and more industries need to apply mobile mapping equipment in areas with missing GNSS signals or limited accessibility, and to operate quickly and conveniently to acquire high-precision data.

There has been some research on wearable mobile mapping systems at home and abroad, but it has focused mainly on backpack and handheld devices, for example the Zebedee handheld mobile mapping system designed by Bosse et al. (2012), the StarScan handheld mobile mapping system launched by Beike Tianhui (iSureStar), and the LiBackPack backpack mobile mapping system launched by Digital Green Earth (GreenValley). Backpack and handheld mapping limits the observation angle of the equipment: the observation target cannot be adjusted in real time to follow the operator's viewpoint, which easily causes missing 3D scene data and makes it inconvenient to completely capture 3D point clouds of complex environments. In addition, existing simultaneous localization and mapping (SLAM) methods (Shan and Englot, 2018; Zhang and Singh, 2014) are mostly applied to vehicle, backpack, and handheld platforms, whose motion is relatively gentle compared with a head-mounted sensor. The highly dynamic motion of a walking person, such as turning and vertical bobbing, poses a major challenge to existing SLAM methods.

Related information and literature:

https://www.isurestar.com/starscan.html

https://www.lidar360.com/archives/6358.html

Bosse, M., Zlot, R., Flick, P., 2012. Zebedee: Design of a spring-mounted 3-D range sensor with application to mobile mapping. IEEE Transactions on Robotics 28, 1104-1119.

Shan, T., Englot, B., 2018. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 4758-4765.

Shin, E.-H., El-Sheimy, N., 2004. An unscented Kalman filter for in-motion alignment of low-cost IMUs. Position Location and Navigation Symposium (PLANS 2004). IEEE, pp. 273-279.

Yang, Z., Shen, S., 2016. Monocular visual-inertial state estimation with online initialization and camera-IMU extrinsic calibration. IEEE Transactions on Automation Science and Engineering 14, 39-51.

Zhang, J., Singh, S., 2014. LOAM: Lidar Odometry and Mapping in Real-time. Robotics: Science and Systems.

Summary of the Invention

Building on existing research, the present invention designs a novel wearable helmet mobile mapping system.

The present invention provides high-precision wearable helmet data acquisition equipment that integrates a camera, an IMU, and a laser scanner into a single helmet and performs hardware time synchronization, so that accurate sensor timestamps are collected while the data are collected. The data acquisition process is implemented as follows.

Step 1: integrate all sensor data according to the hardware timestamps and divide it into data frames.

Step 2: remove motion distortion from the laser observations in a data frame according to the inertial measurement unit observations, and assign image color information.

Step 3: classify the laser points in the data frame into boundary feature points, surface feature points, and featureless points according to point cloud geometric curvature and image gray-level gradient.

Step 4: associate consecutive data frames by their different features, compute the relative position and attitude, and solve for map point coordinates; the implementation includes the following processing.

The state quantities that must be maintained in helmet mobile mapping are:

$$x_k = \left[\, p^{w}_{b_k},\; v^{w}_{b_k},\; q^{w}_{b_k},\; b_a,\; b_g \,\right]$$

where $x_k$ is the k-th system state; $p^{w}_{b_k}$, $v^{w}_{b_k}$, and $q^{w}_{b_k}$ are the position, velocity, and attitude, respectively; and $b_a$ and $b_g$ are the acceleration and angular velocity biases of the IMU.

The system state $x_k$ is modeled with a MEMS IMU model, and a pre-integration model is constructed from the IMU state equations.

Laser matching constraints fall into three categories: geometric boundary feature point constraints, geometric plane feature point constraints, and spectral boundary feature point constraints. The constraints for spectral boundary points and geometric boundary points are built as point-to-line distances, while the constraints for geometric plane feature points are built as point-to-plane distances.

Furthermore, in Step 1, integrating all sensor data is implemented as follows: the control unit receives the 200 Hz TOV signal emitted by the IMU, records the signal reception time, and linearly interpolates observations aligned with the system time, thereby achieving IMU time synchronization; the control unit sends a 10 Hz pulse to the camera and records the pulse emission time, thereby achieving camera time synchronization; the control unit sends a 1 Hz pulse to the lidar and records the pulse emission time, thereby achieving lidar time synchronization.

Furthermore, in Step 2, following the pre-integration principle, the IMU observations within 0.1 s are integrated, and the laser point cloud data within 0.1 s are projected into the coordinate system at the initial time of the data frame according to the integration result; the point cloud in that frame-initial coordinate system is then projected into the camera plane coordinate system to obtain the corresponding image spectral information.

Furthermore, in Step 3, geometric plane feature points and geometric boundary feature points in the single-frame point cloud are extracted according to curvature; in addition, based on the spectral information each point obtains from the image, points with a large gray-level gradient are extracted as spectral boundary feature points, and the remaining points are treated as featureless points.

The mobile mapping equipment provided by the present invention is suitable for areas with missing GNSS signals or limited accessibility, can be operated quickly and conveniently to acquire high-precision data, is easy to use and low in cost, fills a market gap, and has significant market value.

Brief Description of the Drawings

Fig. 1 is a method flowchart of an embodiment of the present invention;

Fig. 2 shows the hardware composition of the helmet mobile mapping system of an embodiment of the present invention;

Fig. 3 shows the multi-sensor synchronization principle of the helmet mobile mapping system of an embodiment of the present invention.

Detailed Description

The technical solution of the present invention is described in detail below with reference to the drawings and embodiments.

Building on existing research, the present invention designs a novel wearable helmet mobile mapping system. The system consists of two parts: the hardware design of the wearable helmet mobile mapping system, and the design of a highly dynamic SLAM pipeline. In the hardware design, the camera, IMU, and laser scanner are integrated into a helmet and hardware time synchronization is performed, so that accurate sensor timestamps are recorded while data are collected. The highly dynamic SLAM pipeline proceeds as follows. First, all sensor data are integrated according to the hardware timestamps and divided into data frames; each frame contains one image, the laser observations within 0.1 s, and the inertial measurements within 0.1 s. Second, the laser observations within the 0.1 s of a data frame are de-skewed using the inertial measurement unit observations and assigned image color information. Third, the laser points in the data frame are classified into boundary feature points, surface feature points, and featureless points according to point cloud geometric curvature and image gray-level gradient. Fourth, consecutive data frames are associated by their different features, the relative position and attitude are computed, and map point coordinates are solved.
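As an illustration of this frame structure, the following minimal Python sketch groups the three synchronized streams into 0.1 s frames anchored at the image timestamps; the class and function names are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class SensorFrame:
    t0: float      # frame start time on the system clock, seconds
    image: object  # the single camera image of this frame
    laser: list    # (t, x, y, z) laser points with t in [t0, t0 + dt)
    imu: list      # (t, accel, gyro) IMU samples with t in [t0, t0 + dt)

def split_frames(image_times, images, laser_pts, imu_samples, dt=0.1):
    """Group hardware-timestamped sensor data into dt-long data frames."""
    laser_t = [p[0] for p in laser_pts]
    imu_t = [s[0] for s in imu_samples]
    frames = []
    for t0, img in zip(image_times, images):
        l0, l1 = bisect_left(laser_t, t0), bisect_left(laser_t, t0 + dt)
        i0, i1 = bisect_left(imu_t, t0), bisect_left(imu_t, t0 + dt)
        frames.append(SensorFrame(t0, img, laser_pts[l0:l1], imu_samples[i0:i1]))
    return frames
```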

An embodiment of the present invention provides high-precision wearable helmet data acquisition equipment. The data processing flow is shown in Fig. 1 and includes the following steps.

Step 1: integrate all sensor data according to the hardware timestamps and divide it into data frames.

The hardware integration schematic of this embodiment is shown in Fig. 2. The system contains three types of sensors, namely an IMU, a camera, and a laser scanner (or lidar), and uses an STM32 microcontroller as the control unit; the camera, IMU, and laser scanner are each connected to the control unit. The sensor and control unit types are listed in Table 1.

The time synchronization principle of this embodiment is shown in Fig. 3. The control unit receives the 200 Hz TOV (Time of Valid) signal emitted by the IMU, records the signal reception time, and linearly interpolates observations aligned with the system time, thereby achieving IMU time synchronization. The control unit sends a 10 Hz pulse to the camera and records the pulse emission time, thereby achieving camera time synchronization. The control unit sends a 1 Hz pulse to the lidar and records the pulse emission time, thereby achieving lidar time synchronization.
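A minimal sketch of the IMU-side alignment described above, assuming each 200 Hz sample has already been stamped with the control unit's system clock at TOV arrival; the function name and array layout are illustrative assumptions.

```python
import numpy as np

def align_imu_to_system_time(tov_times, imu_samples, query_times):
    """Linearly interpolate raw IMU samples, stamped at TOV arrival times,
    onto the requested system-clock query times."""
    t = np.asarray(tov_times)         # (N,) system-clock arrival times, increasing
    imu = np.asarray(imu_samples)     # (N, 6): ax, ay, az, gx, gy, gz
    out = np.empty((len(query_times), imu.shape[1]))
    for c in range(imu.shape[1]):     # interpolate each channel independently
        out[:, c] = np.interp(query_times, t, imu[:, c])
    return out
```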

Table 1. Sensor and control unit description


Step 2: de-skew the laser observations within the 0.1 s of a data frame using the inertial measurement unit observations, and assign image color information. Following the pre-integration principle (Yang and Shen, 2016), the IMU observations within the 0.1 s are integrated, and the laser point cloud data within that 0.1 s are projected into the coordinate system at the initial time of the data frame according to the integration result. The point cloud in that frame-initial coordinate system is then projected into the camera plane coordinate system to obtain the corresponding image spectral information.
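A sketch of the per-point de-skew and colorization could look as follows. It assumes a helper pose_at(t) returning the IMU-integrated 4x4 pose of the body at time t relative to the frame start, known lidar-to-body and body-to-camera extrinsic matrices, and a pinhole intrinsic matrix K; all of these names are assumptions for illustration rather than the patent's implementation.

```python
import numpy as np

def deskew_and_colorize(points, pose_at, T_body_lidar, T_cam_body, K, image):
    """De-skew each laser point into the frame-start coordinate system using the
    IMU-integrated pose at its own timestamp, then sample the image for color."""
    out = []
    for t, p in points:                      # p: 3-vector in the lidar frame at time t
        T_start_body = pose_at(t)            # 4x4 pose of the body at t w.r.t. frame start
        p_h = np.append(np.asarray(p), 1.0)  # homogeneous coordinates
        p_start = T_start_body @ T_body_lidar @ p_h   # de-skewed, frame-start coords
        p_cam = T_cam_body @ p_start         # into the camera frame at frame start
        rgb = None
        if p_cam[2] > 0:                     # only points in front of the camera get color
            uv = K @ (p_cam[:3] / p_cam[2])  # pinhole projection
            u, v = int(round(uv[0])), int(round(uv[1]))
            h, w = image.shape[:2]
            if 0 <= u < w and 0 <= v < h:
                rgb = image[v, u]
        out.append((p_start[:3], rgb))
    return out
```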

Step 3: classify the laser points in the data frame into boundary feature points, surface feature points, and featureless points according to point cloud geometric curvature and image gray-level gradient. In this embodiment, the method of (Zhang and Singh, 2014) is preferably followed: geometric plane feature points and geometric boundary feature points are extracted from the single-frame point cloud according to curvature. In addition, based on the spectral information each point obtains from the image, points with a large gray-level gradient are extracted as spectral boundary feature points. The remaining points are treated as featureless points.
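A simplified sketch of this classification, using the LOAM-style local smoothness along a scan line as the curvature measure and a central-difference gray gradient for the image-derived values; the threshold values are illustrative placeholders, not values from the patent.

```python
import numpy as np

def classify_points(pts, gray, k=5, edge_th=1.0, plane_th=0.1, grad_th=30.0):
    """pts: (N, 3) array of points ordered along a scan line; gray: (N,) gray values
    sampled from the image. Returns index lists: geometric edge points, plane points,
    spectral edge points, and featureless points."""
    edges, planes, spec_edges, featureless = [], [], [], []
    for i in range(k, len(pts) - k):
        # LOAM-style smoothness: norm of the summed difference vectors to 2k neighbors
        diff = 2 * k * pts[i] - pts[i - k:i].sum(axis=0) - pts[i + 1:i + k + 1].sum(axis=0)
        c = np.linalg.norm(diff) / (2 * k * np.linalg.norm(pts[i]) + 1e-9)
        grad = abs(float(gray[i + 1]) - float(gray[i - 1])) / 2.0
        if c > edge_th:
            edges.append(i)          # sharp geometry: geometric boundary feature
        elif c < plane_th:
            planes.append(i)         # smooth geometry: geometric plane feature
        elif grad > grad_th:
            spec_edges.append(i)     # large gray-level gradient: spectral boundary feature
        else:
            featureless.append(i)
    return edges, planes, spec_edges, featureless
```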

Step 4: associate consecutive data frames by their different features, compute the relative position and attitude, and solve for map point coordinates. The state quantities that must be maintained in this helmet mobile mapping system are:

$$x_k = \left[\, p^{w}_{b_k},\; v^{w}_{b_k},\; q^{w}_{b_k},\; b_a,\; b_g \,\right]$$

where $x_k$ is the k-th system state; $p^{w}_{b_k}$, $v^{w}_{b_k}$, and $q^{w}_{b_k}$ are the position, velocity, and attitude of the IMU coordinate system (body) at each frame relative to the world coordinate system (world); and $b_a$ and $b_g$ are the biases of the IMU accelerometer and gyroscope, respectively. The present invention adopts the MEMS IMU model (Shin and El-Sheimy, 2004) to model the system IMU state:

$$\dot{p}^{w}_{b} = v^{w}_{b}$$

$$\dot{v}^{w}_{b} = R^{w}_{b}\left(a_m - b_a - a_n\right) + g^{w}$$

$$\dot{q}^{w}_{b} = \frac{1}{2}\, q^{w}_{b} \otimes \begin{bmatrix} 0 \\ w_m - b_g - w_n \end{bmatrix}$$

$$\dot{b}_a = n_{b_a}$$

$$\dot{b}_g = n_{b_g}$$

where $a_m$ and $w_m$ are the acceleration and angular velocity measurements; $a_n$, $w_n$, $n_{b_a}$, and $n_{b_g}$ are the accelerometer error, the gyroscope error, the acceleration random walk, and the angular velocity random walk; $R^{w}_{b}$ is the rotation matrix from the IMU coordinate system to the world coordinate system, evaluated at the k-th system state; $q^{w}_{b}$ is its quaternion form; and $g^{w}$ is the gravity vector in the world coordinate system. With this model the system state can be propagated recursively.
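As an illustration of how this model propagates the state, the sketch below performs one Euler integration step with the biases held constant; the quaternion helpers and parameter names are assumptions, and a production system would use a more careful integration scheme.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions given as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def propagate(p, v, q, ba, bg, a_m, w_m, dt, g_w=np.array([0.0, 0.0, -9.81])):
    """One Euler step of the MEMS IMU kinematics above, biases held constant."""
    R = quat_to_rot(q)
    a_w = R @ (a_m - ba) + g_w                 # bias-corrected specific force in world frame
    p_new = p + v * dt + 0.5 * a_w * dt * dt
    v_new = v + a_w * dt
    dq = np.concatenate([[1.0], 0.5 * (w_m - bg) * dt])   # small-angle quaternion increment
    q_new = quat_mul(q, dq)
    q_new /= np.linalg.norm(q_new)             # renormalize to a unit quaternion
    return p_new, v_new, q_new, ba, bg
```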

To correct the system state with laser matching information and suppress state divergence, a pre-integration model is constructed from the IMU state equations, as follows:

$$\alpha^{b_k}_{b_{k+1}} = \iint_{t \in [t_k,\, t_{k+1}]} R^{b_k}_{t}\left(a_m - b_a - a_n\right)\, \mathrm{d}t^2$$

$$\beta^{b_k}_{b_{k+1}} = \int_{t \in [t_k,\, t_{k+1}]} R^{b_k}_{t}\left(a_m - b_a - a_n\right)\, \mathrm{d}t$$

$$\gamma^{b_k}_{b_{k+1}} = \int_{t \in [t_k,\, t_{k+1}]} \frac{1}{2}\, \Omega\!\left(w_m - b_g - w_n\right)\, \gamma^{b_k}_{t}\, \mathrm{d}t$$

where $\alpha^{b_k}_{b_{k+1}}$, $\beta^{b_k}_{b_{k+1}}$, and $\gamma^{b_k}_{b_{k+1}}$ are the pre-integrated relative position, relative velocity, and relative rotation terms, with the integration computed from $t_k$ to $t_{k+1}$; $R^{b_k}_{t}$ is the rotation matrix at time t relative to frame k; and $\Omega(\cdot)$ denotes the quaternion rate matrix of an angular velocity. In this way all integration starts from the k-th system state in the IMU coordinate system; it is independent of the world coordinate system and of the initial system state, which makes it convenient to take derivatives and compute residuals, and hence to optimize the pose.
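A sketch of accumulating these pre-integrated quantities from the raw IMU samples between two frames, reusing the quaternion helpers from the previous sketch; the noise propagation and midpoint corrections of Yang and Shen (2016) are omitted for brevity.

```python
import numpy as np

def preintegrate(imu, ba, bg):
    """imu: list of (dt, a_m, w_m) samples between t_k and t_{k+1}.
    Returns (alpha, beta, gamma): pre-integrated relative position, velocity,
    and rotation (as a quaternion), all expressed in frame k."""
    alpha = np.zeros(3)
    beta = np.zeros(3)
    gamma = np.array([1.0, 0.0, 0.0, 0.0])   # identity: body at t_k relative to frame k
    for dt, a_m, w_m in imu:
        R = quat_to_rot(gamma)               # rotation of the body at t w.r.t. frame k
        a = R @ (a_m - ba)                   # bias-corrected acceleration in frame k
        alpha += beta * dt + 0.5 * a * dt * dt
        beta += a * dt
        dq = np.concatenate([[1.0], 0.5 * (w_m - bg) * dt])
        gamma = quat_mul(gamma, dq)
        gamma /= np.linalg.norm(gamma)
    return alpha, beta, gamma
```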

Laser matching constraints fall into three categories: geometric boundary feature point constraints, geometric plane feature point constraints, and spectral boundary feature point constraints. The constraints for spectral boundary points and geometric boundary points are built as point-to-line distances, while the constraints for geometric plane feature points are built as point-to-plane distances, as in formula (10). For the current k-th frame, p is the coordinate of each feature point and $(n^T, d)$ are the parameters of a line or plane, where d is a constant. For a line, $n^T$ is the vector from the feature point to the midpoint of the line; for a plane, $n^T$ is the plane normal vector.

$$r = n^{T} p + d \qquad (10)$$
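The two residual types behind this unified form are usually evaluated geometrically, as in the sketch below; representing the line by two sample points and the plane by a unit normal with offset d are illustrative choices, not necessarily the patent's exact parameterization.

```python
import numpy as np

def point_to_line_residual(p, a, b):
    """Distance from feature point p to the line through map points a and b
    (used for geometric and spectral boundary features)."""
    d = b - a
    return np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d)

def point_to_plane_residual(p, n, d):
    """Signed distance from feature point p to the plane n^T x + d = 0,
    with n a unit normal (used for geometric plane features)."""
    return float(n @ p + d)
```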

According to the above scheme, this embodiment of the present invention first designs and integrates a wearable helmet mobile mapping system and achieves time synchronization of the camera, IMU, and laser scanner. A highly dynamic SLAM pipeline is then designed for the wearable helmet mobile mapping system; finally, the position and attitude of the system are solved in real time and the map point coordinates are computed. In a concrete implementation, computer software can be used to run the pipeline automatically.

The specific embodiments described herein merely illustrate the spirit of the present invention. Those skilled in the art may make various modifications or additions to the described embodiments, or substitute them in similar ways, without departing from the spirit of the present invention or exceeding the scope defined by the appended claims.

Claims (4)

1. High-precision wearable helmet data acquisition equipment, characterized in that a camera, an IMU, and a laser scanner are integrated into a single helmet and hardware time synchronization is performed, so that accurate sensor timestamps are collected while the data are collected; the data acquisition process is implemented as follows:

Step 1: integrate all sensor data according to the hardware timestamps and divide it into data frames;

Step 2: remove motion distortion from the laser observations in a data frame according to the inertial measurement unit observations, and assign image color information;

Step 3: classify the laser points in the data frame into boundary feature points, surface feature points, and featureless points according to point cloud geometric curvature and image gray-level gradient;

Step 4: associate consecutive data frames by their different features, compute the relative position and attitude, and solve for map point coordinates, the implementation including the following processing:

the state quantities that must be maintained in helmet mobile mapping are

$$x_k = \left[\, p^{w}_{b_k},\; v^{w}_{b_k},\; q^{w}_{b_k},\; b_a,\; b_g \,\right]$$

where $x_k$ is the k-th system state, $p^{w}_{b_k}$, $v^{w}_{b_k}$, and $q^{w}_{b_k}$ are the position, velocity, and attitude, respectively, and $b_a$ and $b_g$ are the acceleration and angular velocity biases of the IMU;

the system state $x_k$ is modeled with a MEMS IMU model, and a pre-integration model is constructed from the IMU state equations;

laser matching constraints fall into three categories, namely geometric boundary feature point constraints, geometric plane feature point constraints, and spectral boundary feature point constraints, where the constraints for spectral boundary points and geometric boundary points are built as point-to-line distances, and the constraints for geometric plane feature points are built as point-to-plane distances.

2. The high-precision wearable helmet data acquisition equipment according to claim 1, characterized in that in Step 1, integrating all sensor data is implemented as follows: the control unit receives the 200 Hz TOV signal emitted by the IMU, records the signal reception time, and linearly interpolates observations aligned with the system time, thereby achieving IMU time synchronization; the control unit sends a 10 Hz pulse to the camera and records the pulse emission time, thereby achieving camera time synchronization; the control unit sends a 1 Hz pulse to the lidar and records the pulse emission time, thereby achieving lidar time synchronization.

3. The high-precision wearable helmet data acquisition equipment according to claim 1, characterized in that in Step 2, following the pre-integration principle, the IMU observations within 0.1 s are integrated, the laser point cloud data within 0.1 s are projected into the coordinate system at the initial time of the data frame according to the integration result, and the point cloud in that coordinate system is then projected into the camera plane coordinate system to obtain the corresponding image spectral information.

4. The high-precision wearable helmet data acquisition equipment according to claim 1, 2, or 3, characterized in that in Step 3, geometric plane feature points and geometric boundary feature points in the single-frame point cloud are extracted according to curvature, and in addition, based on the spectral information each point obtains from the image, points with a large gray-level gradient are extracted as spectral boundary feature points and the remaining points are treated as featureless points.
CN202210185677.1A 2022-02-28 2022-02-28 High-precision wearable helmet data acquisition equipment Active CN114623818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210185677.1A CN114623818B (en) 2022-02-28 2022-02-28 High-precision wearable helmet data acquisition equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210185677.1A CN114623818B (en) 2022-02-28 2022-02-28 High-precision wearable helmet data acquisition equipment

Publications (2)

Publication Number Publication Date
CN114623818A (en) 2022-06-14
CN114623818B CN114623818B (en) 2024-09-13

Family

ID=81900798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210185677.1A Active CN114623818B (en) 2022-02-28 2022-02-28 High-precision wearable helmet data acquisition equipment

Country Status (1)

Country Link
CN (1) CN114623818B (en)

Citations (3)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160323565A1 (en) * 2015-04-30 2016-11-03 Seiko Epson Corporation Real Time Sensor and Method for Synchronizing Real Time Sensor Data Streams
CN108613675A (en) * 2018-06-12 2018-10-02 武汉大学 Low-cost unmanned aircraft traverse measurement method and system
CN113959437A (en) * 2021-10-14 2022-01-21 重庆数字城市科技有限公司 Measuring method and system for mobile measuring equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yang Gang et al.: "UWB-based indoor cooperative localization algorithm for firefighters", Computer Engineering and Applications, 28 January 2021 (2021-01-28), pages 1-12 *

Also Published As

Publication number Publication date
CN114623818B (en) 2024-09-13

Similar Documents

Publication Publication Date Title
CN108375370B (en) A kind of complex navigation system towards intelligent patrol unmanned plane
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN109376785B (en) A Navigation Method Based on Iterative Extended Kalman Filter Fusion Inertial and Monocular Vision
CN106017463B (en) A kind of aircraft positioning method based on positioning sensor device
CN112254729B (en) A mobile robot positioning method based on multi-sensor fusion
CN112683281B (en) A joint localization method for autonomous vehicles based on vehicle kinematics
CN102538781A (en) Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN107289910B (en) A TOF-based Optical Flow Localization System
CN106056664A (en) Real-time three-dimensional scene reconstruction system and method based on inertia and depth vision
CN110986939A (en) Visual inertial odometer method based on IMU pre-integration
JP6288858B2 (en) Method and apparatus for estimating position of optical marker in optical motion capture
CN113639722B (en) Continuous laser scanning registration auxiliary inertial positioning and attitude determination method
CN114608554A (en) Handheld SLAM equipment and robot instant positioning and mapping method
CN116047567B (en) Satellite-habit combination positioning method and navigation method based on deep learning
CN117584989A (en) A lidar/IMU/vehicle kinematics constrained tightly coupled SLAM system and algorithm
CN112729283A (en) Navigation method based on depth camera/MEMS inertial navigation/odometer combination
CN112556681B (en) Vision-based navigation and positioning method for orchard machine
CN110044377B (en) Vicon-based IMU offline calibration method
CN112862818B (en) Underground parking lot vehicle positioning method combining inertial sensor and multi-fisheye camera
CN116794640B (en) A LiDAR-GPS/IMU self-calibration method for movable carriers
CN114184194A (en) A method for autonomous navigation and positioning of unmanned aerial vehicles in denial environments
CN115856974B (en) GNSS, INS and visual tight combination navigation positioning method based on invariant filtering
CN115145294B (en) A combined navigation system and method for autonomous aerial refueling docking
CN116443028A (en) Head posture data acquisition system and method
CN115290090A (en) SLAM map construction method based on multi-sensor information fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant