
CN120800411A - Agricultural machinery intelligent path planning system and method based on multi-sensor fusion - Google Patents

Agricultural machinery intelligent path planning system and method based on multi-sensor fusion

Info

Publication number
CN120800411A
CN120800411A
Authority
CN
China
Prior art keywords
data
agricultural machinery
path planning
path
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202511312883.4A
Other languages
Chinese (zh)
Inventor
张立艳
王鹏
王斌
郭著伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Weifang Huabo Agricultural Equipment Co ltd
Original Assignee
Weifang Huabo Agricultural Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Weifang Huabo Agricultural Equipment Co ltd filed Critical Weifang Huabo Agricultural Equipment Co ltd
Priority to CN202511312883.4A priority Critical patent/CN120800411A/en
Publication of CN120800411A publication Critical patent/CN120800411A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/005 - Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/165 - Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments
    • G01C21/1652 - Inertial navigation combined with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 - Inertial navigation combined with passive imaging devices, e.g. cameras
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 - Determining position by combining satellite measurements with a supplementary inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Guiding Agricultural Machines (AREA)

Abstract

The invention discloses an agricultural machinery intelligent path planning system and method based on multi-sensor fusion, relating to the technical field of agricultural machinery automatic control. The system comprises an AI large model learning module, a visual recognition module, a data analysis module and an intelligent control module. The AI large model learning module learns the operating instructions of the agricultural machinery controller; the visual recognition module collects and extracts feature information from the controller panel; the data analysis module diagnoses and analyzes the feature information; and the intelligent control module transmits the analyzed data to the tractor's intelligent driving system and executes the corresponding control actions. The invention solves the problems that agricultural machinery under different communication protocols cannot be controlled in a coordinated way and that existing schemes are costly, complex and inefficient. It offers high compatibility, low cost, high efficiency, strong adaptability and high safety, and can be widely applied in the field of modern agriculture.

Description

Agricultural machinery intelligent path planning system and method based on multi-sensor fusion
Technical Field
The invention relates to the technical field of agricultural machinery automation control, in particular to an agricultural machinery intelligent path planning system and method based on multi-sensor fusion.
Background
In the course of modern agricultural development, the intelligentization of agricultural machinery plays a key role in improving agricultural production efficiency, reducing labor costs and realizing precision agriculture. Path planning is one of the key links in intelligent agricultural machinery operation. Conventional path planning methods, such as navigation relying solely on satellite positioning (e.g. GPS), have many limitations. In complex farmland environments, satellite signals are subject to interference from trees, buildings, terrain undulation and other factors, resulting in loss or drift of the positioning signal. For example, the single-point positioning precision of an ordinary Global Navigation Satellite System (GNSS) receiver is usually only at the meter level, which is difficult to reconcile with the high-precision requirements of agricultural operations such as seeding and weeding, where more precise path planning is needed to avoid damaging crops.
At present, in order to solve the problem of agricultural machinery path planning, main measures adopted include positioning and navigation by using a single satellite navigation system. Although simple, this method has significant drawbacks in the field environment. First, a single satellite navigation system cannot provide sufficient accuracy and reliability, and in particular, in a case where signal occlusion or interference is serious, positioning accuracy may be greatly reduced. Second, this method is not effective in identifying and avoiding obstacles in the agricultural field, such as ditches, rocks, etc., which can cause the agricultural machine to collide or be damaged during operation. In addition, a single satellite navigation system cannot adapt to changes in farmland environments, such as crop growth, terrain changes, etc., which may lead to inaccuracy in path planning and reduced operating efficiency.
Although the prior art improves agricultural machinery path planning to some extent, problems and shortcomings remain. First, existing path planning methods rely on a single satellite navigation system, which cannot provide sufficient accuracy and reliability, especially in complex farmland environments. Second, existing solutions cannot effectively identify and avoid obstacles in the field, which may cause the agricultural machine to collide or be damaged during operation. In addition, existing technical schemes cannot adapt to changes in the farmland environment, such as crop growth and terrain change, which may lead to inaccurate path planning and reduced operating efficiency. Therefore, developing a novel intelligent path planning system and method for agricultural machinery has important practical significance and application value.
Disclosure of Invention
The present invention aims to solve at least to some extent one of the technical problems in the above-described technology.
In order to achieve the above purpose, the first aspect of the present invention provides an intelligent path planning system and method for an agricultural machine based on multi-sensor fusion.
The agricultural machinery intelligent path planning system based on multi-sensor fusion comprises:
a multi-sensor integrated module, which comprises a global navigation satellite system, an inertial measurement unit, a binocular camera and a laser radar, wherein the global navigation satellite system provides the position of the agricultural machinery, the inertial measurement unit measures the acceleration and angular velocity of the agricultural machinery in three-dimensional space, the binocular camera identifies crop rows and obstacle features, and the laser radar acquires three-dimensional point cloud data of the surrounding environment;
a data processing unit, to which the multi-sensor integrated module is connected. The data processing unit performs time synchronization, coordinate conversion and multi-sensor data fusion on the data acquired by each sensor, and carries out global path planning, local path planning and real-time adjustment on the basis of the fused data. The multi-sensor data fusion adopts a fusion algorithm combining Kalman filtering and particle filtering: a Kalman filtering algorithm fuses the global navigation satellite system data with the inertial measurement unit data, while the binocular camera and laser radar data undergo feature extraction and matching and are then fused by a particle filtering algorithm. The global path planning adopts an improved A* algorithm that combines farmland boundary information, crop distribution information and obstacle position information to search for an optimal global path; the local path planning and real-time adjustment re-plan the path within a local range on the basis of the Dijkstra algorithm.
As an improvement, the data processing unit is provided with a high-performance multi-core processor and a parallel computing architecture, and is used for realizing real-time processing of a large amount of sensor data.
As an improvement, the binocular camera recognizes key features such as crop rows and obstacles through its stereoscopic vision capability, solving the scale ambiguity problem of a monocular camera.
As an improvement, the laser radar acquires high-precision three-dimensional point cloud data of the surroundings by emitting laser beams and measuring the time delay of the reflected light, so as to accurately sense terrain undulation and obstacle distribution.
As an improvement, in the global path planning, the improved A* algorithm introduces a dynamic weight mechanism that adjusts the heuristic function weights in real time during path searching, according to the complexity of the farmland environment and the operating requirements of the agricultural machinery.
As an improvement, in the local path planning and real-time adjustment, adjustment parameters can be calculated automatically according to the crop-row bending or offset recognized by the binocular camera, so as to control the steering mechanism of the agricultural machinery.
An agricultural machinery intelligent path planning method based on multi-sensor fusion comprises the following steps:
System initialization: the agricultural machine is started, the global navigation satellite system, inertial measurement unit, binocular camera and laser radar are activated, and farmland environment data are collected;
Data preprocessing: the acquired sensor data are time-synchronized, a unified coordinate system is established, and the sensor data are converted into that coordinate system;
Data fusion: a Kalman filtering algorithm fuses the global navigation satellite system and inertial measurement unit data, features are extracted and matched from the binocular camera and laser radar data respectively, and a particle filtering algorithm fuses the results;
Global path planning: based on the fused data, an improved A* algorithm, combining farmland boundary information, crop distribution information and obstacle position information, searches for an optimal global path from the starting point to the target point;
Local path planning and real-time adjustment: during operation of the agricultural machinery, the path is re-planned within a local range based on the Dijkstra algorithm, using multi-sensor data acquired in real time, so as to avoid sudden obstacles or adapt to environmental changes.
In the data preprocessing, the time synchronization processing controls the time error of each sensor data to be on the order of microseconds.
In the data preprocessing, the relative position and attitude relationships among the sensors are accurately calibrated to obtain the corresponding rotation matrices and translation vectors, realizing the conversion from each sensor's coordinate system to the global coordinate system.
As an improvement, in the global path planning, the improved A* algorithm adjusts the heuristic function weights through a dynamic weight mechanism, so that the planned global path both satisfies the shortest-path requirement and avoids obstacles and already-worked areas.
The method further comprises the step of executing the path, wherein the steering mechanism of the agricultural machine is controlled according to the planned global path and the planned local path so as to keep the machine on the optimal working path.
Advantageous effects
By fusing data from multiple sensors such as GNSS, IMU, binocular camera and laser radar, the invention achieves accurate planning of the agricultural machinery operation path and significantly improves path planning precision, especially in complex farmland environments.
The invention reduces the dependence on a single sensor, enhances the reliability and stability of the system through multi-sensor data fusion, and reduces the operation risk caused by signal interference or loss.
The invention can adjust path planning in real time, avoid obstacles and operated areas and improve the operation efficiency.
The invention can adapt to the change of farmland environment, recognize and track crop rows in real time, dynamically adjust the running path of farm machinery and ensure the accuracy of operation.
Reduced labor cost: intelligent path planning reduces manual intervention, lowering labor costs and improving agricultural production efficiency.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a system framework diagram of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Embodiment 1: Basic implementation of the agricultural machinery intelligent path planning system and method based on multi-sensor fusion
In this embodiment, a tractor is taken as an agricultural machine as an example, and specific implementation processes of an intelligent path planning system and method based on multi-sensor fusion are described in detail.
System hardware construction
Multi-sensor integration: a multi-sensor assembly is mounted on the tractor body. The Global Navigation Satellite System (GNSS) module adopts a BeiDou dual-mode positioning module installed at an unoccluded position on top of the tractor; it receives satellite signals and outputs position information such as longitude, latitude and altitude at a positioning frequency of 10 Hz. The Inertial Measurement Unit (IMU) adopts a six-axis (3-axis acceleration + 3-axis angular velocity) MEMS inertial sensor fixed at the center of the tractor chassis; it measures at 100 Hz and outputs the tractor's acceleration and angular velocity in real time. The binocular camera is an industrial-grade stereo camera with a resolution of 1280×720, mounted at the middle of the tractor's front end with the lens facing the farmland directly ahead; its frame rate is 30 fps and it captures images of crops and obstacles. The laser radar is a 16-line solid-state lidar installed on top of the tractor cab, with a horizontal field of view of 120°, a ranging range of 0.5-100 m and a point cloud density of 200 points/°, used to scan the surrounding environment for three-dimensional point cloud data.
The data processing unit adopts a multi-core processor based on the ARM Cortex-A72 architecture (4 cores, 2.0 GHz main frequency), equipped with 8 GB of DDR4 memory and a 64 GB storage module. It integrates interfaces such as Ethernet and USB 3.0 and connects to the GNSS, IMU, binocular camera and laser radar via wired links to receive and process sensor data in real time.
Data preprocessing flow
Time synchronization: synchronization is achieved through a hardware trigger mechanism (all sensors share the same trigger signal source) with a trigger period of 10 ms, ensuring that the sampling time errors of the GNSS (10 Hz), IMU (100 Hz), binocular camera (30 fps) and laser radar (10 Hz) are controlled within ±50 µs. The synchronized data are time-stamped and stored in a buffer of the data processing unit.
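As a hedged sketch of this synchronization step, the matching of jittery sensor timestamps against the 10 ms trigger grid under the ±50 µs tolerance can be illustrated as follows; the function name and the rejection-of-unmatched-samples behavior are illustrative assumptions, not taken from the patent.

```python
import bisect

TOL_S = 50e-6  # +/-50 microsecond tolerance quoted in the text

def align_to_trigger(trigger_times, sample_times):
    """For each trigger instant, return the index of the (sorted) sample
    whose timestamp falls within TOL_S of it, or None if none qualifies."""
    matched = []
    for t in trigger_times:
        i = bisect.bisect_left(sample_times, t)
        best = None
        for j in (i - 1, i):  # only the two nearest neighbors can match
            if 0 <= j < len(sample_times) and abs(sample_times[j] - t) <= TOL_S:
                if best is None or abs(sample_times[j] - t) < abs(sample_times[best] - t):
                    best = j
        matched.append(best)
    return matched

# Three 10 ms triggers; the second sample jitters by 20 us (accepted),
# the third sample is far off the grid (rejected).
pairing = align_to_trigger([0.0, 0.01, 0.02], [0.0, 0.01002, 0.5])
```

In a real deployment the matched indices would be used to stamp each fused frame with the common trigger time before buffering.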
Coordinate conversion: a global coordinate system is established with the center of the tractor chassis as the origin (X axis along the direction of travel, Y axis horizontal and perpendicular to it, Z axis vertically upward). The relative pose of each sensor with respect to the global coordinate system is pre-calibrated:
The translation vector of the center of the GNSS module relative to the origin of the global coordinate system is (0.5 m, 0, 1.2 m), and the rotation matrix is an identity matrix (assuming that the GNSS is consistent with the attitude of the global coordinate system);
The IMU center coincides with the origin of the global coordinate system, and the rotation matrix is a unit matrix;
the translation vector of the binocular camera's optical center relative to the global origin is (1.5 m, 0, 0.8 m); its rotation matrix, obtained through camera calibration, converts image coordinates into the global coordinate system;
The translation vector of the laser radar center relative to the origin of the global coordinate system is (0, 0, 2.0 m), and the rotation matrix is obtained through joint calibration of the laser radar and the IMU and is used for converting the point cloud data into the global coordinate system.
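The per-sensor conversion described above is the standard rigid transform p_global = R · p_sensor + t. A minimal sketch, using the lidar translation quoted in the text and an identity rotation purely as a placeholder (the real rotation comes from the lidar-IMU joint calibration):

```python
import numpy as np

def to_global(p_sensor, R, t):
    """Map a point from a sensor frame into the chassis-centred global frame."""
    return R @ np.asarray(p_sensor, dtype=float) + t

R_lidar = np.eye(3)                  # placeholder rotation (assumed, for illustration)
t_lidar = np.array([0.0, 0.0, 2.0])  # lidar centre 2.0 m above the chassis origin

# A lidar return 10 m straight ahead at lidar height lands at z = 2.0 m globally
p = to_global([10.0, 0.0, 0.0], R_lidar, t_lidar)
```

The same call, with the calibrated R and t of each sensor, moves GNSS, camera and point-cloud data into the one shared frame used by the fusion step.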
Multi-sensor data fusion process
GNSS and IMU data fusion: an Extended Kalman Filter (EKF) algorithm is adopted, with the IMU's acceleration and angular velocity as the system input and the GNSS position and velocity as the observations, to establish the state and observation equations. During filtering, each time GNSS data (10 Hz) arrive, the IMU integration result (100 Hz) is corrected, suppressing the IMU's accumulated error; the tractor's real-time pose (position, velocity, attitude angle) is output at an update frequency of 100 Hz.
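The predict-at-100-Hz / correct-at-10-Hz loop can be sketched in heavily simplified form: a 1-D linear Kalman filter with state [position, velocity], IMU acceleration driving the prediction and GNSS position as the measurement. The noise values are illustrative assumptions, not figures from the patent.

```python
import numpy as np

class SimpleGnssImuKF:
    """Minimal 1-D stand-in for the EKF described above."""
    def __init__(self, q=0.01, r=1.0):
        self.x = np.zeros(2)            # [position (m), velocity (m/s)]
        self.P = np.eye(2)              # state covariance
        self.Q = q * np.eye(2)          # process noise (assumed)
        self.R = np.array([[r]])        # GNSS measurement noise (assumed)

    def predict(self, accel, dt=0.01):  # 100 Hz IMU step
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + self.Q

    def update(self, gnss_pos):         # 10 Hz GNSS correction
        H = np.array([[1.0, 0.0]])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.array([gnss_pos]) - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P

kf = SimpleGnssImuKF()
for _ in range(10):           # ten 10 ms IMU steps at a constant 1 m/s^2
    kf.predict(accel=1.0)
kf.update(gnss_pos=0.005)     # GNSS fix agrees with the integrated position
```

The production EKF adds attitude states and nonlinear dynamics, but the structure, inertial prediction bounded by satellite corrections, is the same.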
Binocular camera and laser radar data fusion:
The binocular camera computes a disparity map with the SGBM algorithm to generate depth information, extracts the straight-line features (slope and intercept) of crop rows through a Hough transform, recognizes obstacles (such as stones and weed clusters) with a YOLO detector, and outputs their bounding-box coordinates in the image;
the laser radar point cloud undergoes ground segmentation (fitting the ground plane with a RANSAC algorithm) and cluster analysis (with a DBSCAN algorithm) to extract the three-dimensional coordinates and sizes of obstacles;
The two data streams are fused by a particle filtering algorithm: the crop-row features and two-dimensional obstacle information identified by the binocular camera serve as observations, the laser radar's three-dimensional point cloud serves as the state prior, and particle resampling optimizes the estimates of obstacle positions and crop-row boundaries, outputting a fused environment feature map (covering crop-row distribution and three-dimensional obstacle coordinates).
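A toy sketch of this prior-plus-observation fusion for a single quantity (an obstacle's lateral position): lidar-derived particles form the state prior, the camera detection re-weights them, and the resampled posterior mean is the fused estimate. All numbers, noise models and the Gaussian likelihood are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_fuse(lidar_prior_mean, camera_obs, n=500,
                  prior_sigma=0.5, obs_sigma=0.2):
    """One predict/weight/resample cycle of a 1-D particle filter."""
    particles = rng.normal(lidar_prior_mean, prior_sigma, n)  # state prior
    # Gaussian likelihood of each particle under the camera observation
    w = np.exp(-0.5 * ((particles - camera_obs) / obs_sigma) ** 2)
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)    # systematic-style resampling
    return particles[idx].mean()        # fused posterior estimate

# Lidar says the obstacle is near 3.0 m; the camera says 3.2 m. The tighter
# camera noise pulls the fused estimate toward the camera value.
est = particle_fuse(lidar_prior_mean=3.0, camera_obs=3.2)
```

The real module does this jointly over crop-row boundary parameters and 3-D obstacle coordinates, but each cycle has the same three phases.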
Intelligent path planning and execution
Global path planning: based on the fused environment feature map, a global path is planned with the improved A* algorithm. The heuristic function of the algorithm is designed as:
h(n) = w1 · (Euclidean distance from n to the target point) + w2 · (crop-row deviation cost)
where w1 and w2 are dynamic weights (w1 + w2 = 1). In crop-dense areas w2 is set to 0.7 (travel along crop rows is preferred); in open areas w1 is set to 0.8 (the shortest path is prioritized). The planning result is a continuous sequence of path points (1 m apart) from the tractor's initial position to the endpoint at the farmland boundary, stored in the data processing unit.
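A hedged grid-world sketch of the dynamic-weight A* search described above: the heuristic blends Euclidean distance to the goal with a crop-row deviation term, and the weight pair switches by cell type. The grid encoding (0 open, 1 obstacle, 2 crop-dense), the unit step cost and the deviation model are all illustrative assumptions.

```python
import heapq
import math

def a_star(grid, start, goal, crop_row_y=None):
    """A* over a 4-connected grid with a dynamic-weight heuristic."""
    rows, cols = len(grid), len(grid[0])

    def h(node):
        d = math.dist(node, goal)
        dense = grid[node[0]][node[1]] == 2           # 2 marks crop-dense cells
        w1, w2 = (0.3, 0.7) if dense else (0.8, 0.2)  # dynamic weights
        dev = abs(node[0] - crop_row_y) if crop_row_y is not None else 0.0
        return w1 * d + w2 * dev

    open_set = [(h(start), 0.0, start, [start])]
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != 1 and (nr, nc) not in seen):  # 1 = obstacle
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None

field = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [2, 2, 0, 0]]
route = a_star(field, (0, 0), (2, 3), crop_row_y=0)
```

Note that the deviation term makes the heuristic deliberately non-admissible: it trades strict shortest-path optimality for row-following behavior, which is the point of the dynamic weighting.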
Local path planning: while the tractor travels along the global path, the data processing unit receives real-time sensor data every 100 ms. If the laser radar or binocular camera detects a sudden obstacle (such as a newly appeared stone less than 5 m from the tractor), local path planning is triggered:
the current position is taken as the starting point and a point 50 m ahead on the global path as the local target point, and a 5 m × 10 m local environment map is constructed;
the Dijkstra algorithm searches the local map for the shortest path avoiding the obstacle, with 0.5 m between path points;
the local path is smoothly joined to the global path (by B-spline curve fitting) to generate the final execution path.
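The Dijkstra replanning step can be sketched on a small occupancy grid standing in for the local map, with the 0.5 m spacing quoted in the text as the edge cost; the map layout and the blocked cell are illustrative assumptions.

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest path on a 4-connected grid; 0 = free, 1 = obstacle."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                              # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 0.5                      # 0.5 m between path points
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                          # walk back along predecessors
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

local_map = [[0, 0, 0, 0, 0],
             [0, 0, 1, 0, 0],   # 1 = sudden obstacle on the straight route
             [0, 0, 0, 0, 0]]
path, length = dijkstra(local_map, (1, 0), (1, 4))
```

Here the straight 2 m route is blocked, so the planner detours over the adjacent row, giving a 3 m local path that a B-spline pass would then smooth into the global path.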
Path execution: the data processing unit converts the desired path-point positions into steering-angle commands for the tractor (computed by a PID control algorithm) and outputs them to the tractor's steering actuator (an electro-hydraulic steering system), controlling the tractor along the planned path. The steering angle is corrected in real time during travel, keeping the path tracking error below 0.2 m.
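A minimal PID sketch of this control step, turning cross-track error (desired minus actual lateral position) into a steering-angle command; the gains and sample period are illustrative assumptions, not values from the patent.

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, error):
        self.integral += error * self.dt              # accumulate I term
        deriv = (error - self.prev_err) / self.dt     # finite-difference D term
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PID(kp=1.5, ki=0.4, kd=0.1)
angle = pid.step(0.2)   # 0.2 m cross-track error -> steering command
```

In practice the output would be clamped to the actuator's angle limits and the integral term anti-windup-protected before being sent to the electro-hydraulic steering system.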
Embodiment 2: Application of the agricultural machinery intelligent path planning system and method based on multi-sensor fusion to a sowing operation
In this embodiment, a corn planter is the application object, and a path adjustment mechanism for the sowing operation is added on top of Embodiment 1. The specific process is as follows:
After the planter is started, each sensor collects data with the parameters of Embodiment 1. The binocular camera mainly identifies the reserved corn planting rows (boundaries between soil and crop stubble are extracted by color-threshold segmentation), while the laser radar scans the terrain relief (such as ridges and depressions) on both sides of the planting rows.
Data fusion and crop-row tracking: a crop-row tracking module is added on top of the fusion algorithm of Embodiment 1. The real-time offset of the crop row (lateral distance relative to the planter's central axis) is calculated by matching feature points across consecutive binocular camera frames (using the LK optical flow method), and the laser radar terrain data are combined to judge whether the crop-row offset is caused by terrain relief.
Targeted adjustment of path planning:
During global path planning, the heuristic function of the improved A* algorithm adds a cost term for the sowing row spacing: if the distance between the planned path and the reserved planting row deviates from the preset value (e.g. 60 cm ± 5 cm), the path cost increases, ensuring the sowing position stays centred on the reserved planting row;
During local path adjustment, if a lateral crop-row offset caused by terrain relief is detected (e.g. greater than 8 cm), the local path is corrected in real time according to the offset direction, so that the planter's sowing unit always aims at the reserved planting row; the correction varies linearly with the offset (e.g. with a proportionality coefficient of 0.8).
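These two adjustments can be sketched as simple cost and correction functions. The 60 cm ± 5 cm spacing, the 8 cm threshold, and the 0.8 coefficient come from the text above; the penalty weight of 10 per cm is an illustrative assumption:

```python
def row_spacing_cost(base_cost, spacing_cm,
                     target_cm=60.0, tol_cm=5.0, penalty=10.0):
    """Add a penalty to the A* path cost when the planned path
    deviates from the reserved planting-row spacing beyond the
    tolerance (60 cm +/- 5 cm in this embodiment)."""
    deviation = abs(spacing_cm - target_cm)
    if deviation <= tol_cm:
        return base_cost
    return base_cost + penalty * (deviation - tol_cm)

def local_path_correction(offset_cm, threshold_cm=8.0, k=0.8):
    """Proportional local-path correction: no action below the
    8 cm threshold, otherwise a correction linear in the offset."""
    if abs(offset_cm) <= threshold_cm:
        return 0.0
    return k * offset_cm

print(row_spacing_cost(1.0, 62.0))   # within tolerance: base cost only
print(row_spacing_cost(1.0, 70.0))   # 5 cm beyond tolerance: penalised
print(local_path_correction(5.0))    # below threshold: no correction
print(local_path_correction(12.0))   # linear correction, coefficient 0.8
```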
While executing the sowing operation, the data processing unit simultaneously sends control signals to the planter's seed-metering device (based on the path position and the preset sowing density), linking path tracking with sowing control and ensuring consistent sowing depth (the planter's chassis height is adjusted using lidar terrain data) and plant spacing (error below 3 cm).
Through the above embodiments, the intelligent path planning method achieves high-precision path planning for agricultural machinery in complex farmland environments through the cooperation of multi-sensor fusion and intelligent algorithms, and can be widely applied to intelligent operation scenarios of tractors, planters, and other agricultural machinery.
In the description of this specification, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "plurality" means at least two, for example, two, three, etc., unless specifically defined otherwise.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.

Claims (11)

1. An intelligent path planning system for agricultural machinery based on multi-sensor fusion, characterized by comprising:
a multi-sensor integration module, comprising a global navigation satellite system (GNSS), an inertial measurement unit (IMU), a binocular camera, and a lidar; wherein the GNSS provides position information of the agricultural machinery, the IMU measures the acceleration and angular velocity of the agricultural machinery in three-dimensional space, the binocular camera identifies crop rows and obstacle features, and the lidar acquires three-dimensional point cloud data of the surrounding environment;
a data processing unit, connected to the multi-sensor integration module, for performing time synchronization, coordinate transformation, and multi-sensor data fusion on the data collected by each sensor, and for performing global path planning and local path planning with real-time adjustment based on the fused data; the multi-sensor data fusion adopts a fusion algorithm combining Kalman filtering and particle filtering, wherein the GNSS and IMU data are fused by a Kalman filter, and the binocular-camera and lidar data are first subjected to feature extraction and matching and then fused by a particle filter; the global path planning adopts an improved A* algorithm that searches for the optimal global path by combining farmland boundary information, crop distribution information, and obstacle position information; the local path planning and real-time adjustment replans the path within a local range based on the Dijkstra algorithm.
2. The system according to claim 1, wherein the data processing unit is equipped with a high-performance multi-core processor and a parallel computing architecture for real-time processing of large volumes of sensor data.
3. The system according to claim 1, wherein the binocular camera uses its stereo-vision capability to identify key features such as crop rows and obstacles, solving the scale-ambiguity problem of a monocular camera.
4. The system according to claim 1, wherein the lidar emits laser beams and measures the time delay of the reflected light to acquire high-precision three-dimensional point cloud data of the surrounding environment, accurately perceiving terrain relief and obstacle distribution.
5. The system according to claim 1, wherein in the global path planning, the improved A* algorithm introduces a dynamic weight mechanism that adjusts the heuristic-function weight during the path search in real time, according to the complexity of the farmland environment and the operating requirements of the agricultural machinery.
6. The system according to claim 1, wherein in the local path planning and real-time adjustment, adjustment parameters are automatically calculated from the crop-row bending or offset identified by the binocular camera to control the steering mechanism of the agricultural machinery.
7. An intelligent path planning method for agricultural machinery based on multi-sensor fusion, characterized by comprising the following steps:
system initialization: starting the agricultural machinery, activating the GNSS, IMU, binocular camera, and lidar, and collecting farmland environment data;
data preprocessing: time-synchronizing the collected sensor data, establishing a unified coordinate system, and converting all sensor data into that coordinate system;
data fusion: fusing the GNSS and IMU data with a Kalman filter; performing feature extraction and matching on the binocular-camera and lidar data, then fusing them with a particle filter;
global path planning: based on the fused data, searching for the optimal global path from the starting point to the target point using an improved A* algorithm combined with farmland boundary information, crop distribution information, and obstacle position information;
local path planning and real-time adjustment: during operation of the agricultural machinery, replanning the path within a local range based on the Dijkstra algorithm using multi-sensor data collected in real time, so as to avoid sudden obstacles or adapt to environmental changes.
8. The method according to claim 7, wherein in the data preprocessing, the time synchronization controls the timing error of each sensor's data to the microsecond level.
9. The method according to claim 7, wherein in the data preprocessing, the relative position and attitude between the sensors are precisely calibrated to obtain the corresponding rotation matrices and translation vectors, realizing the transformation from each sensor coordinate system to the global coordinate system.
10. The method according to claim 7, wherein in the global path planning, the improved A* algorithm adjusts the heuristic-function weight through a dynamic weight mechanism so that the planned global path satisfies the shortest-path requirement while avoiding obstacles and already-worked areas.
11. The method according to claim 7, further comprising a path execution step: controlling the steering mechanism of the agricultural machinery according to the planned global path and local path, so as to keep the machinery on the optimal operating path.
CN202511312883.4A 2025-09-15 2025-09-15 Agricultural machinery intelligent path planning system and method based on multi-sensor fusion Pending CN120800411A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202511312883.4A CN120800411A (en) 2025-09-15 2025-09-15 Agricultural machinery intelligent path planning system and method based on multi-sensor fusion

Publications (1)

Publication Number Publication Date
CN120800411A true CN120800411A (en) 2025-10-17

Family

ID=97326829

Country Status (1)

Country Link
CN (1) CN120800411A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210088354A1 (en) * 2019-09-20 2021-03-25 Deere & Company Method and system for planning a path of a vehicle
CN115049932A (en) * 2022-07-11 2022-09-13 张家港江苏科技大学产业技术研究院 Crop planting row detection method, device and system and readable storage medium
CN117193326A (en) * 2023-10-17 2023-12-08 太原科技大学 Inter-ridge mobile robot path planning method
US20240224838A1 (en) * 2023-01-09 2024-07-11 Verge Technologies Ip Corp. Method and apparatus for generating cluster contours and curved tracks and for improving farming efficiency
CN118816886A (en) * 2024-06-21 2024-10-22 华智清创(苏州)农业科技有限公司 A path planning method and device for autonomous decision-making on intercropping in farmland
CN118844175A (en) * 2024-09-05 2024-10-29 四川省农业机械科学研究院 Intelligent sowing and fertilization control system and control method
CN119225376A (en) * 2024-11-12 2024-12-31 农业农村部南京农业机械化研究所 Agricultural robot control method, device, equipment and storage medium
CN119245651A (en) * 2024-09-25 2025-01-03 山东农业大学 A path planning method for agricultural robots
CN119527351A (en) * 2025-01-22 2025-02-28 深圳卡睿智行科技有限公司 Adaptive control method and system for autonomous driving trucks
CN120252679A (en) * 2025-03-17 2025-07-04 潍柴动力股份有限公司 High-precision map correction method, device, storage medium and program product
CN120313575A (en) * 2025-03-10 2025-07-15 新疆九御科技有限公司 An autonomous navigation system for agricultural robots based on three-dimensional point cloud model
CN120370906A (en) * 2025-02-27 2025-07-25 昆明理工大学 Unmanned picking robot path planning algorithm for intelligent greenhouse
CN120482088A (en) * 2025-05-13 2025-08-15 医十紫东(北京)智能科技有限公司 Small-sized carrier for carrying people and logistics and unmanned intelligent driving system thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination