[go: up one dir, main page]

CN120803063B - Mobile investigation equipment real-time data analysis method based on multi-sensor fusion - Google Patents

Mobile investigation equipment real-time data analysis method based on multi-sensor fusion

Info

Publication number
CN120803063B
CN120803063B CN202510967246.4A CN202510967246A CN120803063B CN 120803063 B CN120803063 B CN 120803063B CN 202510967246 A CN202510967246 A CN 202510967246A CN 120803063 B CN120803063 B CN 120803063B
Authority
CN
China
Prior art keywords
motion
data
sensor
terminal
early warning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202510967246.4A
Other languages
Chinese (zh)
Other versions
CN120803063A (en
Inventor
周倜
王尚玉
陈娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Yeju Vision Intelligent Equipment Co ltd
Original Assignee
Jiangsu Yeju Vision Intelligent Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Yeju Vision Intelligent Equipment Co ltd filed Critical Jiangsu Yeju Vision Intelligent Equipment Co ltd
Priority to CN202510967246.4A priority Critical patent/CN120803063B/en
Publication of CN120803063A publication Critical patent/CN120803063A/en
Application granted granted Critical
Publication of CN120803063B publication Critical patent/CN120803063B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/80Arrangements for reacting to or preventing system or operator failure
    • G05D1/86Monitoring the performance of the system, e.g. alarm or diagnosis modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)

Abstract

The invention discloses a real-time data analysis method of mobile investigation equipment based on multi-sensor fusion, which relates to the technical field of multi-sensor fusion. The method and the device have the advantages that the communication relay terminal generates the grading early warning instruction according to the dynamic threat index, so that the device can adopt strategies such as emergency braking, speed reduction re-planning or environment monitoring enhancement aiming at different threat levels, the flexibility and the safety of the device for coping with complex environments are improved, the motion monitoring terminal adjusts the motion parameters of the device based on the early warning instruction and the dynamic threat index and executes the obstacle avoidance strategy, the PID controller is combined to optimize the motion trail, and the obstacle avoidance capability and the motion stability of the device in the complex environments are improved.

Description

Mobile investigation equipment real-time data analysis method based on multi-sensor fusion
Technical Field
The invention relates to the technical field of multi-sensor fusion, in particular to a mobile investigation equipment real-time data analysis method based on multi-sensor fusion.
Background
With the rapid development of information technology and sensor technology, mobile investigation equipment is increasingly abundant in application fields such as military investigation, security monitoring, smart city management and the like. In complex field environments, high-risk disaster sites or hidden investigation tasks, equipment needs to sense environmental changes in real time, monitor self motion states and track target characteristics, and severe requirements are put forward on the real-time fusion analysis capability of multi-source heterogeneous data, and the data analysis mode of a traditional single sensor is difficult to meet the accurate decision-making requirement under the complex scene.
At present, the existing data analysis technology of the mobile investigation equipment realizes the preliminary fusion of multi-sensor data, but in the data preprocessing link, the heterogeneous data which are asynchronous in time and space lacks an efficient alignment and cleaning mechanism, so that the feature matrix construction precision is insufficient, and in the risk assessment process, a static weight prediction model is mostly adopted, and the dynamic adjustment cannot be carried out according to the environmental interference degree, the motion track reliability and the target feature quality, so that the threat index calculation has deviation from an actual risk scene.
However, the current technology faces significant challenges in practical application, namely space-time deviation and noise interference exist in environmental factor data, motion state data and target feature data acquired by a multi-source sensor array, the data integrity is difficult to ensure by a traditional cleaning algorithm, the standard deviation of sensor data is increased due to the environmental interference, different attenuation factors cannot be dynamically adapted to an existing interference suppression model, drift risks exist in motion track confidence calculation due to acceleration deviation and course angle error of an IMU and GPS data, the signal-to-noise ratio difference and motion prediction deviation of a target recognition sensor are not sufficiently quantized, the target feature cannot be accurately reflected by a fusion index, and threat index misjudgment easily occurs in a complex scene due to the lack of a self-adaptive updating mechanism of a multi-mode risk prediction model with fixed weight, so that the accuracy of hierarchical early warning and the effectiveness of a device obstacle avoidance strategy are affected.
Therefore, it is necessary to invent a method for analyzing real-time data of a mobile detection device based on multi-sensor fusion to solve the above problems.
Disclosure of Invention
The invention aims to provide a mobile investigation equipment real-time data analysis method based on multi-sensor fusion, so as to solve the problems in the background technology.
In order to achieve the purpose, the invention provides the technical scheme that the real-time data analysis method of the mobile investigation equipment based on multi-sensor fusion comprises an environment sensing terminal, a motion monitoring terminal, a weight optimizing terminal, a data fusion terminal, a decision control terminal, a communication relay terminal and the mobile investigation equipment, and specifically comprises the following steps:
S1, acquiring heterogeneous environment data by an environment sensing terminal through a multi-source sensor array when mobile investigation equipment operates to form a multi-source space-time synchronous data set, wherein the multi-source space-time synchronous data set comprises environment factor data, motion state data and target characteristic data;
s2, the data fusion terminal performs space-time alignment and data cleaning on the multi-source space-time synchronous data set to construct a standardized feature matrix;
S3, the decision control terminal analyzes the standardized feature matrix to obtain an environment interference suppression coefficient, a motion track confidence coefficient and a target feature fusion index, inputs the three coefficients into a multi-mode risk prediction model, and outputs a dynamic threat index;
S4, the communication relay terminal generates a grading early warning instruction according to the dynamic threat index;
s5, the motion monitoring terminal adjusts motion parameters of the equipment based on the early warning instruction and the dynamic threat index, and executes an obstacle avoidance strategy;
And S6, dynamically updating the weight coefficient in the multi-mode risk prediction model based on the dynamic threat index by the weight optimization terminal.
Preferably, the environmental factor data comprise illumination intensity, temperature, humidity atmospheric pressure, wind speed, electromagnetic field intensity and dust concentration, the motion state data comprise three-dimensional acceleration, angular velocity, geomagnetic azimuth angle and GPS positioning coordinates, and the target characteristic data comprise signal-to-noise ratio and measured motion vector of the sensor.
Preferably, the environmental interference suppression coefficient is specifically:
Wherein N is the number of environmental sensors, sigma i is the standard deviation of the data acquired by the ith sensor, mu i is the mean value of the data acquired by the ith sensor, k is an environmental attenuation factor, Δt is the length of a data acquisition time window, and e is a natural constant.
Preferably, the motion trail confidence level is specifically:
Wherein, the The acceleration vector measured for the IMU,The acceleration vector is calculated by the GPS, a max is the maximum allowable acceleration of the equipment, delta theta is the absolute value of the deviation between the geomagnetic azimuth angle and the GPS course angle, and pi is the circumference ratio.
Preferably, the target feature fusion index is specifically:
Where M is the number of target recognition sensors, SNR j is the signal-to-noise ratio of the jth sensor, ω j is the jth sensor weight, For the actual measured motion vector of the object,V max is the device maximum tracking speed, which is the motion prediction vector.
Preferably, the multi-modal risk prediction model specifically includes:
Wherein q=α× (β 1×C12×C23×C3-γ),C1 is an environmental interference suppression coefficient, C 2 is a motion trajectory confidence level, C 3 is a target feature fusion index, α is a scale factor, γ is a risk determination threshold, e is a natural constant, β 1、β2 and β 3 are weight coefficients, β 123 =1 and β 1、β2 and β 3 e [0,1].
Preferably, the hierarchical early warning instruction is:
When the dynamic threat index E is more than or equal to E 1, triggering a first-level early warning instruction, and starting an emergency braking and defending protocol;
When the dynamic threat index E 2≤E<E1 is used, triggering a secondary early warning instruction, and activating equipment deceleration and path re-planning;
when the dynamic threat index is 0 to be less than or equal to E < E 2, triggering a three-level early warning instruction, and starting an environment monitoring enhancement mode and a basic avoidance strategy.
Preferably, the executing process of the motion monitoring terminal is as follows:
a1, analyzing the early warning instruction level, and calling a preset motion parameter mapping table;
a2, calculating an obstacle avoidance vector based on the dynamic threat index:
wherein lambda is the obstacle avoidance gain factor, For the actual measured motion vector of the object,Is a motion prediction vector;
A3, adjusting the motor torque and the steering angle through a PID controller to ensure that the motion trail of the equipment meets the following conditions:
Wherein a limit is the final upper acceleration limit preset value of the device, a max is the maximum acceleration preset value of the device, E is the dynamic threat index, and min is the minimum function.
Preferably, the dynamic updating of the weight coefficient is specifically:
Wherein, the
η(E)=η0×(1+0.5×E),
Wherein e is a natural constant,For the dynamically updated final weight coefficients,For intermediate weights scaled by a dynamic threat index,For short term bias corrected weights, m (E) is a threat response intensity coefficient, η (E) is a weight adjustment rate, η 0 is a reference learning rate, C i is a real-time assessment coefficient,As a sliding average value of the coefficients,For the degree of coefficient fluctuation, E is a dynamic threat index, where i=1, 2,3.
The invention has the technical effects and advantages that:
1. According to the method, the decision control terminal calculates the environment interference suppression coefficient, the motion track confidence coefficient and the target feature fusion index, and outputs the dynamic threat index by utilizing the multi-mode risk prediction model, so that the multi-dimensional quantitative evaluation of the environment risk is realized, and a scientific basis is provided for hierarchical early warning;
2. according to the invention, the communication relay terminal generates the grading early warning instruction according to the dynamic threat indexes, so that the equipment can adopt strategies such as emergency braking, deceleration re-planning or environment monitoring enhancement aiming at different threat grades, and the flexibility and the safety of the equipment for coping with complex environments are improved;
3. according to the invention, the motion monitoring terminal adjusts the motion parameters of the equipment based on the early warning instruction and the dynamic threat index and executes the obstacle avoidance strategy, and the PID controller is combined to optimize the motion trail, so that the obstacle avoidance capability and the motion stability of the equipment in a complex environment are improved;
4. According to the invention, the weight optimization terminal dynamically updates the weight coefficient in the multi-mode risk prediction model based on the dynamic threat index, so that the self-adaptive adjustment of model parameters is realized, and the system can continuously maintain the accurate risk prediction capability in different scenes.
Drawings
Fig. 1 is a schematic diagram of the device connection of the present invention.
FIG. 2 is a flow chart of the method steps of the present invention.
Fig. 3 is a schematic diagram of an implementation process of the motion monitoring terminal of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a device connection schematic diagram shown in fig. 1, which comprises an environment sensing terminal, a motion monitoring terminal, a weight optimizing terminal, a data fusion terminal, a decision control terminal, a communication relay terminal and mobile investigation equipment;
the invention provides a mobile investigation equipment real-time data analysis method based on multi-sensor fusion as shown in fig. 2, which specifically comprises the following steps:
S1, acquiring heterogeneous environment data by an environment sensing terminal through a multi-source sensor array when mobile investigation equipment operates to form a multi-source space-time synchronous data set, wherein the multi-source space-time synchronous data set comprises environment factor data, motion state data and target characteristic data;
Further, in the above technical scheme, the environmental factor data includes illumination intensity, temperature, humidity, atmospheric pressure, wind speed, electromagnetic field intensity and dust concentration, the motion state data includes three-dimensional acceleration, angular velocity, geomagnetic azimuth angle and GPS positioning coordinates, and the target feature data includes signal-to-noise ratio and actually measured motion vector of the sensor.
It is to be noted that the illumination intensity is detected in real time by a photosensitive sensor, such as a photosensitive resistor and a silicon photocell, and the light signal is converted into an electric signal and then quantized and output;
the temperature is sensed by a temperature sensor, such as a thermocouple, a thermistor and a digital temperature chip, and the temperature value is reflected by resistance value or voltage change;
the humidity is measured by a humidity sensor, such as a capacitive humidity sensor and a resistive humidity sensor, and the humidity is calculated by utilizing the influence of water molecules on the dielectric constant or the resistance of the sensor;
the atmospheric pressure adopts an air pressure sensor, such as an MEMS air pressure chip to detect the atmospheric pressure change, and outputs a real-time air pressure value by combining an altitude calibration algorithm;
The wind speed is measured by a wind speed sensor such as an impeller type anemometer and an ultrasonic anemometer, and the impeller rotation or ultrasonic time difference is converted into a wind speed value;
the electromagnetic field intensity is quantified by inducing electromotive force or magnetic field variation by using an electromagnetic field sensor, such as a Hall effect sensor, a coil type sensor, to detect the intensity of an electric field or a magnetic field in a space;
the dust concentration is detected by a dust sensor, such as an optical dust sensor and a capacitive dust sensor, and the concentration is calculated by utilizing the light scattering or charge induction principle;
The three-dimensional acceleration outputs three-dimensional vector data through a triaxial accelerometer, such as the acceleration of MEMS accelerometer measuring equipment in the X, Y, Z axis direction;
The angular velocity is converted into an electric signal by the Coriolis force effect by using a three-axis gyroscope, such as a MEMS gyroscope measuring device, for example, the angular velocity of a rotation around three coordinate axes;
The geomagnetic azimuth angle is used for detecting the direction of the earth magnetic field through a magnetometer, such as a magneto-resistance sensor, and combining a gyroscope and an accelerometer to calculate the azimuth angle of the device;
The GPS positioning coordinates receive satellite signals through a GPS module, analyze coordinate information such as longitude, latitude, altitude and the like, and realize positioning by combining a time stamp;
The signal-to-noise ratio of the sensor calculates the ratio of the power of the output signal of the target sensor, such as a camera and a radar, to the noise power through a signal processing algorithm, and reflects the signal quality;
The actually measured motion vector tracks a target through a visual sensor such as an infrared camera, a millimeter wave radar or a laser radar, and the like, calculates the displacement, the speed and the direction of the target by combining multi-frame data to form a three-dimensional motion vector, and the technical principle is that a target detection algorithm such as YOLO is utilized to identify the target, the motion track of the target is tracked through an optical flow method or Kalman filtering, and the real-time motion vector is calculated;
All the data are acquired in real time through a multisource sensor array of the environment sensing terminal, and after various sensors convert physical quantities into electric signals, heterogeneous environment data are formed through analog-to-digital conversion and preliminary signal processing.
S2, the data fusion terminal performs space-time alignment and data cleaning on the multi-source space-time synchronous data set to construct a standardized feature matrix;
The method comprises the steps of firstly carrying out time-space alignment on time stamp differences of collected data of different sensors in an environment sensing terminal, aligning the heterogeneous data to a unified time reference through an interpolation algorithm or a clock synchronization protocol, ensuring data time sequence consistency, simultaneously converting a local coordinate system of each sensor into a unified global coordinate system through a coordinate transformation matrix, enabling motion state data and environment factor data to be mapped with each other in a space dimension, secondly carrying out data cleaning, firstly denoising original data of the sensors through algorithms such as Kalman filtering, median filtering or Gaussian filtering, secondly identifying abnormal points in the data based on a statistical method or a machine learning algorithm, repairing the abnormal points through modes such as adjacent data interpolation and regression prediction, meanwhile filling the abnormal points with a multiple interpolation method or a time sequence prediction model for the missing data, finally constructing a standardized feature matrix, converting the cleaned heterogeneous data into a unified numerical range, eliminating dimension differences, extracting key features from the cleaned data, arranging the time sequence into feature vectors, integrating the feature vectors into the multi-mode feature vectors, and representing the multi-mode feature vectors, and providing a multi-mode decision-making model for the different types of the control risk-state control terminal.
S3, the decision control terminal analyzes the standardized feature matrix to obtain an environment interference suppression coefficient, a motion track confidence coefficient and a target feature fusion index, inputs the three coefficients into a multi-mode risk prediction model, and outputs a dynamic threat index;
further, in the above technical solution, the environmental interference suppression coefficient is specifically:
Wherein N is the number of environmental sensors, sigma i is the standard deviation of the data acquired by the ith sensor, mu i is the mean value of the data acquired by the ith sensor, k is an environmental attenuation factor, Δt is the length of a data acquisition time window, and e is a natural constant.
It should be noted that the setting of the value of the environmental attenuation factor k is: Wherein ln is the logarithm based on e, T 1/2 is the half-life of the environmental disturbance, specifically taking the coefficient of variation of the sensor data The time required to decay from peak to 50%;
it should be appreciated that the formulation logic of the environmental disturbance rejection factor C 1 is based on multi-sensor data stability assessment, time-decay effects, and disturbance level quantification, where, Is the square of the variation coefficient, is used for measuring the relative discrete degree of single-sensor data, and reflects the data instability degree caused by environmental interference by square amplification fluctuation influence while eliminating the mean value dimension influenceThe method comprises the steps of obtaining an average value of interference degrees of all sensors, realizing multi-sensor fusion to avoid the influence of single sensor abnormality, comprehensively reflecting the interference level of the whole environment, taking e -k×Δt as an exponential decay term, describing the time correlation of the environment interference, enabling the contribution of early data to the current interference assessment to be exponentially attenuated along with the prolongation of a time window Deltat, controlling the attenuation speed by k, weakening the influence of historical data, enhancing the response to real-time interference, and enabling C 1 to be negatively correlated with the interference degree through the structure of a C 1 =1-interference comprehensive term, enabling C 1 to be approximately equal to 1 when the environment is not interfered, enabling C 1 to be reduced when the interference is strong, and intuitively quantifying the environment interference intensity.
Further, in the above technical solution, the motion trajectory confidence coefficient specifically includes:
Wherein, the The acceleration vector measured for the IMU,The acceleration vector is calculated by the GPS, a max is the maximum allowable acceleration of the equipment, delta theta is the absolute value of the deviation between the geomagnetic azimuth angle and the GPS course angle, and pi is the circumference ratio.
It is known that the formula design logic of the motion trail confidence coefficient C 2 is developed around the consistency check of the multi-source sensor, and the reliability of trail calculation is comprehensively evaluated by quantifying the deviation of acceleration and headingIs converted into scalar deviation, divided by a max for normalization, and is unified into the motion capability of the equipment to occupy the ratio dimension in order to eliminate the dimension influence, the square operation amplification deviation influence is that the denominator is 2 when the deviation reaches a max, the part confidence is reduced to one halfIn the dimension of angular motion consistency, geomagnetic heading is easy to be interfered by magnetism, GPS heading depends on continuous positioning, the overlarge deviation of the two implies that heading measurement is unreliable, delta theta takes absolute value, because heading deviation is not directional, only the magnitude is concerned, theoretical maximum deviation is pi, and exceeding is equivalent to reverse small angle, divided by pi to be normalized to [0,1] interval, denominator is adoptedThe method has the advantages that when the heading is completely consistent, the part approaches 1, the deviation reaches 180 degrees and is reduced to one half, the physical intuition that the local confidence coefficient of the heading dimension is halved in the reverse direction is reflected, on the multi-dimensional fusion layer, the acceleration and the heading are mutually independent but cooperatively influence the motion track confidence coefficient, the product operation leads the defects of the two dimensions to be mutually amplified, when the two deviations respectively lead the motion track confidence coefficient to be halved, the whole is reduced to one quarter, and only when the two are consistent, the C 2 is approximate to 1, and the damage of the inconsistency to the motion track confidence coefficient is strictly restrained through a mathematical structure.
Further, in the above technical solution, the target feature fusion index specifically is:
Where M is the number of target recognition sensors, SNR j is the signal-to-noise ratio of the jth sensor, ω j is the jth sensor weight, For the actual measured motion vector of the object,V max is the device maximum tracking speed, which is the motion prediction vector.
It should be noted that the value of the sensor weight ω j is set: wherein R j is the reference signal-to-noise ratio of the jth sensor, A j is the priority coefficient of the jth sensor type, which is specifically set according to industry experience, if the sensor type is millimeter wave radar, the priority coefficient is set to 1 for strong interference resistance, the sensor type is laser radar, the priority coefficient is set to 0.9 for high precision but affected by weather, and the sensor type is an infrared camera, the priority coefficient is set to 0.8 for night advantage.
It should be noted that the first half of the formula of the target feature fusion index C 3 passesThe method aims to comprehensively evaluate the signal reliability of a target recognition sensor, such as a camera and a radar, wherein the signal-to-noise ratio SNR j directly quantifies the signal definition of a j-th sensor, and the preset weight omega j is dynamically allocated according to the environmental adaptability of the sensor type, such as the weight of the radar is higher under strong electromagnetic interference, and the weight of an optical sensor is reduced in a dust environment. Through the weighted summation and the weight normalization processing, the part compresses the result to a [0,1] interval, namely, the output approaches 1 when the signal to noise ratio of all the sensors is extremely high, and otherwise, approaches 0 if the sensors fail or the signal quality is suddenly reduced. The method effectively avoids system misjudgment caused by single sensor faults, improves the anti-interference capability in the multi-source heterogeneous environment, and passes through the second half part of the formulaAiming at solving the risk of target motion prediction and actual measurement disconnection, and calculating actual measurement motion vectorsAnd prediction vectorAnd dividing the Euclidean distance deviation by the maximum tracking speed v max of the equipment for normalization, and converting the absolute deviation into a relative quantity, wherein if the actual measurement is completely consistent with the prediction (the deviation is 0), the item outputs 1, and if the deviation exceeds v max, if the target burst is in high-speed maneuver, the item outputs approximately 0. The design dynamically reflects the real-time precision of the tracking system, and avoids the false alarm caused by model lag or environmental disturbance. Finally, the double-dimensional strong-correlation coupling is realized through multiplication operation of motion C 3 = (data quality item) x (motion consistency item), and the physical significance of the double-dimensional strong-correlation coupling is that the C3 value is obviously reduced when any one of the data quality and the motion consistency is defective, for example, when the signal to noise ratio of a sensor is high but the motion prediction is seriously incorrect, the data quality item is about 1, the motion consistency item is about 0, the C 3 is about 0, the data quality item is about 0 when the motion prediction is accurate but the sensor signal is interfered, the motion consistency item is about 1, the C 3 is about 0, and the design forcing system only outputs a high C 3 value when the sensing is reliable and the prediction is identical, so that the misjudgment rate under a complex scene is radically reduced.
Further, in the above technical solution, the multimodal risk prediction model specifically includes:
Wherein q=α× (β 1×C12×C23×C3-γ),C1 is an environmental interference suppression coefficient, C 2 is a motion trajectory confidence level, C 3 is a target feature fusion index, α is a scale factor, γ is a risk determination threshold, e is a natural constant, β 1、β2 and β 3 are weight coefficients, β 123 =1 and β 1、β2 and β 3 e [0,1].
It should be noted that the initial values of the weight coefficients β 1、β2 and β 3 are set to be β 1=0.4,β2=0.3,β3 =0.3 for a scene with complex environment and low-speed movement, β 1=0.2,β2=0.5,β3 =0.3 for a scene with high maneuvering and target tracking priority, and β 1=0.6,β2=0.3,β3 =0.1 for a scene with severe environment and no specific target;
The value of the scale factor alpha is set to 2.5;
The value of the risk judging threshold value gamma is set to be gamma=gamma 0+△γenv+△γtask, wherein gamma 0=0.6,△γenv is an environment compensation term, and a calculation formula is as follows Wherein N working is the number of effective working environment sensors, N total is the total number of environment sensors,Is the standard deviation of the environmental interference suppression coefficient,As a sliding average value of the environmental interference suppression coefficient, Δγ task is a task compensation term, specifically set to Δγ task =0.15 in the night or severe weather scenario, Δγ task = -0.1 in the investigation mode, and Δγ task =0 in the regular patrol.
S4, the communication relay terminal generates a grading early warning instruction according to the dynamic threat index;
Further, in the above technical solution, the hierarchical early warning instruction is:
When the dynamic threat index E is more than or equal to E 1, triggering a first-level early warning instruction, and starting an emergency braking and defending protocol;
When the dynamic threat index E 2≤E<E1 is used, triggering a secondary early warning instruction, and activating equipment deceleration and path re-planning;
when the dynamic threat index is 0 to be less than or equal to E < E 2, triggering a three-level early warning instruction, and starting an environment monitoring enhancement mode and a basic avoidance strategy.
It should be noted that, the ratio of E 1=0.7,E2 =0.3 may be specifically adjusted according to practical situations, for example, in a complex static environment, E 2 may be reduced to 0.25, and the activation time of the environmental monitoring enhancement mode may be prolonged;
It is known that the emergency braking is to execute torque reversal operation through the motion monitoring terminal so as to enable the acceleration of the equipment to meet The defense protocol is used for activating a physical protection mechanism, such as starting a protection cover, and simultaneously blocking a key module of the equipment to prevent external invasion;
The active deceleration is to limit the acceleration to the upper limit a limit=0.5×amax of the acceleration set as the preset parameter set in the table 1, the speed is reduced smoothly by the PID controller, the path re-planning is to provide a standardized environment matrix for the data fusion terminal, and the motion monitoring terminal recalculates the obstacle avoidance vector based on the standardized environment matrix Generating a new path by adopting an A-or RRT algorithm;
The environment monitoring enhancement mode is to activate standby sensor resources and optimize working parameters, for example, the frame rate of an infrared camera is increased to 60fps to enhance the capturing capability of a dynamic target, the scanning angle of a laser radar is expanded to 180 degrees to cover a wider monitoring area, the basic avoidance strategy is to set an acceleration upper limit a limit=0.2×amax according to a preset parameter set in a table 1, the steering angle change rate is limited to be less than or equal to 15 degrees/s through a PID controller, and the motion trail is ensured to meet the requirement
S5, the motion monitoring terminal adjusts motion parameters of the equipment based on the early warning instruction and the dynamic threat index, and executes an obstacle avoidance strategy;
Further, in the above technical solution, the executing process of the motion monitoring terminal is as shown in fig. 3:
a1, analyzing the early warning instruction level, and calling a preset motion parameter mapping table;
a2, calculating an obstacle avoidance vector based on the dynamic threat index:
wherein lambda is the obstacle avoidance gain factor, For the actual measured motion vector of the object,Is a motion prediction vector;
A3, adjusting the motor torque and the steering angle through a PID controller to ensure that the motion trail of the equipment meets the following conditions:
Wherein a limit is the final upper acceleration limit preset value of the device, a max is the maximum acceleration preset value of the device, E is the dynamic threat index, and min is the minimum function.
It should be noted that the preset motion parameter mapping table is shown in table 1:
TABLE 1
Early warning level Dynamic threat index E Preset parameter set
Primary early warning E≥E1 {λ=0.5,alimit=0.8×amax}
Second-level early warning E2≤E<E1 {λ=1.2,alimit=0.5×amax}
Three-stage early warning 0≤E<E2 {λ=2,alimit=0.2×amax}
Wherein a limit is the final upper acceleration limit preset value of the equipment, and the motion trail of the equipment at the initial stage needs to meet
And S6, dynamically updating the weight coefficient in the multi-mode risk prediction model based on the dynamic threat index by the weight optimization terminal.
Further, in the above technical solution, the dynamic updating of the weight coefficient specifically includes:
Wherein, the
η(E)=η0×(1+0.5×E),
Wherein e is a natural constant,For the dynamically updated final weight coefficients,For intermediate weights scaled by a dynamic threat index,For short term bias corrected weights, m (E) is a threat response intensity coefficient, η (E) is a weight adjustment rate, η 0 is a reference learning rate, C i is a real-time assessment coefficient,As a sliding average value of the coefficients,For the degree of coefficient fluctuation, E is a dynamic threat index, where i=1, 2,3.
It should be noted that the foregoing description is only a preferred embodiment of the present invention, and although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood that modifications, equivalents, improvements and modifications to the technical solution described in the foregoing embodiments may occur to those skilled in the art, and all modifications, equivalents, and improvements are intended to be included within the spirit and principle of the present invention.

Claims (6)

1. The real-time data analysis method of the mobile investigation equipment based on the multi-sensor fusion is characterized by comprising an environment sensing terminal, a motion monitoring terminal, a weight optimizing terminal, a data fusion terminal, a decision control terminal, a communication relay terminal and the mobile investigation equipment, and specifically comprising the following steps of:
S1, acquiring heterogeneous environment data by an environment sensing terminal through a multi-source sensor array when mobile investigation equipment operates to form a multi-source space-time synchronous data set, wherein the multi-source space-time synchronous data set comprises environment factor data, motion state data and target characteristic data;
s2, the data fusion terminal performs space-time alignment and data cleaning on the multi-source space-time synchronous data set to construct a standardized feature matrix;
S3, the decision control terminal analyzes the standardized feature matrix to obtain an environment interference suppression coefficient, a motion track confidence coefficient and a target feature fusion index, inputs the three coefficients into a multi-mode risk prediction model, and outputs a dynamic threat index;
The environmental interference suppression coefficient is specifically:
,
Wherein N is the number of environmental sensors, sigma i is the standard deviation of the data acquired by the ith sensor, mu i is the mean value of the data acquired by the ith sensor, k is an environmental attenuation factor, Δt is the length of a data acquisition time window, and e is a natural constant;
The motion trail confidence level is specifically:
,
Wherein, the The acceleration vector measured for the IMU,For the GPS calculated acceleration vector, a max is the maximum allowable acceleration of the equipment, delta theta is the absolute value of the deviation between the geomagnetic azimuth angle and the GPS course angle, and pi is the circumference ratio;
The target feature fusion index specifically comprises:
,
Where M is the number of target recognition sensors, SNR j is the signal-to-noise ratio of the jth sensor, ω j is the jth sensor weight, For the actual measured motion vector of the object,V max is the maximum tracking speed of the device, which is the motion prediction vector;
S4, the communication relay terminal generates a grading early warning instruction according to the dynamic threat index;
s5, the motion monitoring terminal adjusts motion parameters of the equipment based on the early warning instruction and the dynamic threat index, and executes an obstacle avoidance strategy;
And S6, dynamically updating the weight coefficient in the multi-mode risk prediction model based on the dynamic threat index by the weight optimization terminal.
2. The method for analyzing real-time data of mobile investigation equipment based on multi-sensor fusion according to claim 1, wherein the environmental factor data comprise illumination intensity, temperature, humidity atmospheric pressure, wind speed, electromagnetic field intensity and dust concentration, the motion state data comprise three-dimensional acceleration, angular velocity, geomagnetic azimuth angle and GPS positioning coordinates, and the target characteristic data comprise signal-to-noise ratio and measured motion vector of the sensor.
3. The method for analyzing real-time data of mobile investigation equipment based on multi-sensor fusion according to claim 1, wherein the multi-modal risk prediction model is specifically:
,
Wherein q=α× (β 1×C12×C23×C3-γ),C1 is an environmental interference suppression coefficient, C 2 is a motion trajectory confidence level, C 3 is a target feature fusion index, α is a scale factor, γ is a risk determination threshold, e is a natural constant, β 1、β2 and β 3 are weight coefficients, β 123 =1 and β 1、β2 and β 3 e [0,1].
4. The method for analyzing real-time data of mobile investigation equipment based on multi-sensor fusion according to claim 1, wherein the hierarchical early warning instruction is:
When the dynamic threat index E is more than or equal to E 1, triggering a first-level early warning instruction, and starting an emergency braking and defending protocol;
When the dynamic threat index E 2≤E<E1 is used, triggering a secondary early warning instruction, and activating equipment deceleration and path re-planning;
when the dynamic threat index is 0 to be less than or equal to E < E 2, triggering a three-level early warning instruction, and starting an environment monitoring enhancement mode and a basic avoidance strategy.
5. The method for analyzing real-time data of mobile investigation equipment based on multi-sensor fusion according to claim 1, wherein the executing process of the motion monitoring terminal is as follows:
a1, analyzing the early warning instruction level, and calling a preset motion parameter mapping table;
a2, calculating an obstacle avoidance vector based on the dynamic threat index:
,
wherein lambda is the obstacle avoidance gain factor, For the actual measured motion vector of the object,Is a motion prediction vector;
A3, adjusting the motor torque and the steering angle through a PID controller to ensure that the motion trail of the equipment meets the following conditions:
,
Wherein a limit is the final upper acceleration limit preset value of the device, a max is the maximum acceleration preset value of the device, E is the dynamic threat index, and min is the minimum function.
6. The method for analyzing real-time data of mobile investigation equipment based on multi-sensor fusion according to claim 1, wherein the dynamic updating of the weight coefficient is specifically as follows:
,
Wherein, the ,
,
,
,
Wherein e is a natural constant,For the dynamically updated final weight coefficients,For intermediate weights scaled by a dynamic threat index,For short term bias corrected weights, m (E) is a threat response intensity coefficient, η (E) is a weight adjustment rate, η 0 is a reference learning rate, C i is a real-time assessment coefficient,As a sliding average value of the coefficients,For the degree of coefficient fluctuation, E is a dynamic threat index, where i=1, 2,3.
CN202510967246.4A 2025-07-14 2025-07-14 Mobile investigation equipment real-time data analysis method based on multi-sensor fusion Active CN120803063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202510967246.4A CN120803063B (en) 2025-07-14 2025-07-14 Mobile investigation equipment real-time data analysis method based on multi-sensor fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202510967246.4A CN120803063B (en) 2025-07-14 2025-07-14 Mobile investigation equipment real-time data analysis method based on multi-sensor fusion

Publications (2)

Publication Number Publication Date
CN120803063A CN120803063A (en) 2025-10-17
CN120803063B true CN120803063B (en) 2026-01-30

Family

ID=97309805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202510967246.4A Active CN120803063B (en) 2025-07-14 2025-07-14 Mobile investigation equipment real-time data analysis method based on multi-sensor fusion

Country Status (1)

Country Link
CN (1) CN120803063B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914152A (en) * 2020-06-30 2020-11-10 中国科学院计算技术研究所 Network event early warning method and system
CN120199050A (en) * 2025-03-24 2025-06-24 招商局检测车辆技术研究院有限公司 A laboratory safety monitoring method and system based on multi-data fusion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110246108B (en) * 2018-11-21 2023-06-20 浙江大华技术股份有限公司 Image processing method, device and computer readable storage medium
CN110209184A (en) * 2019-06-21 2019-09-06 太原理工大学 A kind of unmanned plane barrier-avoiding method based on binocular vision system
US12314346B2 (en) * 2024-12-19 2025-05-27 Digital Global Systems, Inc. Systems and methods of sensor data fusion

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914152A (en) * 2020-06-30 2020-11-10 中国科学院计算技术研究所 Network event early warning method and system
CN120199050A (en) * 2025-03-24 2025-06-24 招商局检测车辆技术研究院有限公司 A laboratory safety monitoring method and system based on multi-data fusion

Also Published As

Publication number Publication date
CN120803063A (en) 2025-10-17

Similar Documents

Publication Publication Date Title
CN110850403A (en) Multi-sensor decision-level fused intelligent ship water surface target feeling knowledge identification method
CN118999565B (en) Unmanned aerial vehicle navigation control method based on navigation positioning
CN107289939A (en) Unmanned boat paths planning method based on SVM algorithm
CN120337084A (en) Anomaly detection method for drone data based on multimodal time series modeling
CN120164134B (en) Hydroelectric engineering monitoring method and system based on unmanned aerial vehicle inspection and three-dimensional modeling
CN118419788B (en) Intelligent control system for approaching protection and anti-collision protection of monorail crane personnel
CN120595848A (en) Low-altitude UAV flight monitoring method and system
CN119717881A (en) Three-dimensional laser radar-based intelligent line-imitating flight inspection method for power transmission line
CN117200931B (en) A quadrotor drone attack tracing method based on data flow analysis
CN120213011A (en) A Beidou-based patrol and inspection drone positioning system
CN115951369B (en) Multi-sensor fusion positioning method for complex port environments
CN115861366B (en) Multi-source perception information fusion method and system for target detection
CN120803063B (en) Mobile investigation equipment real-time data analysis method based on multi-sensor fusion
CN120508911B (en) DAS and AI-based dual-branch sound vibration fusion event identification and positioning method
CN119888691B (en) Transparent obstacle detection and map reconstruction method and system based on laser radar point cloud data
CN120993939A (en) A method and system for collaborative formation control of unmanned aerial vehicle (UAV) swarms
CN120524122A (en) Intelligent detection system for slope protection based on deep learning
Liu et al. Range-SLAM: Ultra-Wideband-Based Smoke-Resistant Real-Time Localization and Mapping
Huang et al. A height-measurement algorithm for rotor UAVs based on multilayer space-time fusion
CN111854729B (en) Track association method based on motion information and attribute information
CN117932548B (en) Confidence-based flight obstacle recognition method based on multi-source heterogeneous sensor fusion
CN120254921B (en) Low-altitude aircraft route meteorological data acquisition and processing method
Chen et al. Information fusion structure of VTS sensors for multi-target tracking of vessels
Meng et al. Research on multi-source information fusion based on extended kalman filter
Hamidkhani et al. Event-Triggered Tracking Algorithm for Maritime Vessels Using Remote Sensing and Game Theory

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant