
WO2024060209A1 - Point cloud processing method and radar - Google Patents


Info

Publication number
WO2024060209A1
WO2024060209A1 (PCT application PCT/CN2022/120909; CN application CN2022120909W)
Authority
WO
WIPO (PCT)
Prior art keywords
point
points
ground
height
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2022/120909
Other languages
English (en)
Chinese (zh)
Inventor
宋妍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suteng Innovation Technologyco Ltd
Original Assignee
Suteng Innovation Technologyco Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suteng Innovation Technologyco Ltd filed Critical Suteng Innovation Technologyco Ltd
Priority to PCT/CN2022/120909 priority Critical patent/WO2024060209A1/fr
Publication of WO2024060209A1 publication Critical patent/WO2024060209A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection

Definitions

  • the present application relates to the field of computer technology, and in particular to a method and radar for processing point clouds.
  • LiDAR is an active remote sensing device that uses lasers as the light source and photoelectric detection technology. It is an advanced detection method that combines laser technology with modern photoelectric detection technology. LiDAR is widely used in autonomous driving, logistics vehicles, robots, vehicle-road collaboration, public smart transportation and other fields.
  • point cloud filtering can be used to remove background noise and retain effective point clouds.
  • this application provides a method and radar for processing point clouds, which can effectively identify ground points while removing abnormal noise points from the point cloud and meeting the sensing requirements of the lidar, thereby ensuring the accuracy of ground point identification.
  • a first aspect of this application provides a method for processing point clouds, including: obtaining a point cloud, which includes outlier points; determining ground points in the point cloud, where the ground points include first-type points among the outlier points, the first-type points meet a height requirement, and the height requirement is determined based on the ground point height of the previous frame of the current frame point cloud; and outputting at least the ground points.
  • the second aspect of the present application provides a device for processing point clouds, including: a point cloud acquisition module, a ground point determination module and a point cloud output module.
  • the point cloud acquisition module is used to obtain a point cloud, the point cloud includes outliers;
  • the ground point determination module is used to determine ground points in the point cloud, the ground points include first-class points among the outliers, the first-class points meet the height requirement, and the height requirement is determined based on the height of the ground points in the previous frame of the point cloud of the current frame;
  • the point cloud output module is used to output at least the ground points.
  • a third aspect of this application provides a board card, including the device for processing point clouds as mentioned above.
  • a fourth aspect of the present application provides a radar, including the above-mentioned device for processing point clouds.
  • a fifth aspect of the present application provides an electronic device, including: a processor; and a memory on which executable code is stored.
  • when the executable code is executed by the processor, the processor is caused to execute the above method.
  • a sixth aspect of the present application provides a computer-readable storage medium on which executable code is stored.
  • when the executable code is executed by a processor of an electronic device, the processor is caused to execute the above method.
  • a seventh aspect of this application provides a computer program product, which includes executable code. When the executable code is executed, the above method is implemented.
  • some embodiments of the present application determine the first type of points from the outlier points of the point cloud.
  • the first type of points meet the height requirements of the ground.
  • the height requirements are determined based on the ground point height of the previous frame of the current frame point cloud.
  • outliers may be abnormal noise points
  • the first type of points among the outlier points are retained to avoid misjudging ground points as abnormal noise points. This effectively identifies ground points and ensures the accuracy of ground point identification while still removing abnormal noise points from the point cloud to meet the sensing requirements of the lidar.
  • outlier points are screened from two perspectives, height consistency requirements and/or the ground fitting straight line height range, which effectively improves the accuracy of judgment for the first type of points.
  • it is determined based on smoothness whether an outlier point is a second-type point, and/or it is determined from the vector angle whether the outlier point is a third-type point. If a first-type point is neither a second-type point nor a third-type point, the first-type point can be deleted, which effectively improves the identification accuracy of ground points among outlier points.
  • whether the first type of point is a second type of point can be determined by the smoothness and difference between lines, further effectively improving the convenience and accuracy of identifying the second type of point.
  • since the lidar is mounted at a certain height relative to the ground, the distance between two adjacent rows of points increases significantly as the scanning distance increases. Using a fixed smoothness threshold and/or difference threshold to judge point cloud smoothness may therefore be inaccurate. In this embodiment, the smoothness threshold and/or the difference threshold change as the distance between the point and the lidar changes, which further helps to improve the accuracy of the determined ground points.
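  • As an illustrative sketch of such a distance-adaptive threshold (the linear scaling and the base_th/ref_dist parameters below are assumptions for illustration, not values from this application), the threshold can simply grow with the point's range:

```python
def adaptive_smoothness_threshold(dist, base_th=0.05, ref_dist=10.0):
    """Smoothness threshold that grows with the point's distance from the
    lidar, since the spacing between scan rows widens with range.
    The linear scaling and default parameters are illustrative assumptions."""
    return base_th * max(1.0, dist / ref_dist)
```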
  • the point cloud is divided into multiple fitting areas, and ground straight line fitting is performed respectively, which helps to reduce the computational complexity and reduce the consumption of computing resources.
  • the vector angle includes at least one of the first sub-angle, the second sub-angle, or the third sub-angle, so that whether the vector angle conforms to ground features can be judged from multiple angles, further helping to improve the accuracy of the determined ground points.
  • Figure 1 is a schematic diagram of a method for processing point clouds and an application scenario of radar according to an embodiment of the present application
  • Figure 2 is a schematic diagram of a vehicle-mounted lidar ground point cloud according to an embodiment of the present application
  • Figure 3 is a flow chart of a method for processing point clouds according to an embodiment of the present application.
  • Figure 4 is a schematic diagram of outlier points shown in an embodiment of the present application.
  • Figure 5 is a schematic diagram of the height of the fitted point cloud in the lidar coordinate system according to an embodiment of the present application.
  • Figure 6 is a schematic diagram of points that do not meet the smoothness requirements according to an embodiment of the present application.
  • Figure 7 is a schematic diagram of the vector angle shown in an embodiment of the present application.
  • Figure 8 is another flow chart for processing point clouds according to an embodiment of the present application.
  • Figure 9 is another flow chart for processing point clouds according to an embodiment of the present application.
  • Figure 10 is a schematic diagram of an original point cloud according to an embodiment of the present application.
  • Figure 11 is a schematic diagram of a filtered point cloud according to an embodiment of the present application.
  • Figure 12 is a schematic diagram of a processed point cloud according to an embodiment of the present application.
  • Figure 13 is a schematic structural diagram of a device for processing point clouds according to an embodiment of the present application.
  • Figure 14 is a schematic structural diagram of a radar according to an embodiment of the present application.
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the terms first, second, third, etc. may be used in this application to describe various information, but the information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other.
  • first information may also be called second information, and similarly, the second information may also be called first information. Therefore, features defined as “first” and “second” may explicitly or implicitly include one or more of these features.
  • plurality means two or more than two, unless otherwise explicitly and specifically limited.
  • Vehicle-mounted radar: the detection range is from 200 meters to 500 meters, and the identifiable physical attributes can include distance and reflectivity. It can be used for small machines such as vehicles and robots. Vehicle-mounted radar includes vehicle-mounted lidar, vehicle-mounted millimeter-wave radar, vehicle-mounted ultrasonic radar, etc.
  • Lidar An active remote sensing device that uses a laser as a light source and adopts photoelectric detection technology. It is an advanced detection method that combines laser technology with modern photoelectric detection technology. It is composed of transmitting system, receiving system, scanning control system, data processing system and other parts. Its working principle is to transmit a detection signal to the target, and then process the received echo signal to obtain the target's distance, size, speed, reflectivity and other information. Its advantages are high resolution, high sensitivity, strong anti-interference ability, and not affected by dark conditions. It is widely used in fields such as autonomous driving, logistics vehicles, robots, vehicle-road collaboration, and public smart transportation.
  • Vehicle-mounted lidar: emits outgoing light (such as a laser beam) with a wavelength of about 900 nm; the outgoing light is reflected when it encounters an obstacle.
  • the processing unit calculates the distance between the obstacle and the vehicle-mounted lidar based on the time difference between the reflected light and the outgoing light. In addition, the processing unit can estimate the reflectivity of the target based on the cross-section of the reflected light.
  • Vehicle-mounted lidar has a high degree of integration due to its small size.
  • the system architecture suitable for autonomous driving scenarios can include mobile devices, networks and clouds.
  • Mobile devices include but are not limited to: cars, ships, robots, aircraft, etc.
  • the mobile device can be equipped with electronic devices such as sensors to obtain information about obstacles in the surrounding environment of the mobile device.
  • Electronic devices may include: radar, image sensors, etc.
  • Various electronic devices require the use of registers during operation or communication.
  • Embodiments of the present application provide a method and radar for processing point clouds to determine the first type of points that meet the height requirements of the ground from outlier points of the point cloud.
  • since the first type of points are outliers, related technologies may misjudge them as abnormal noise points.
  • the first type of points that meet the height requirements among the outlier points can be retained. This can effectively identify ground points and ensure the accuracy of ground point identification on the basis of removing abnormal noise points in the point cloud to meet the sensing requirements of lidar.
  • Figure 1 is a schematic diagram of a method for processing point clouds and an application scenario of radar according to an embodiment of the present application.
  • FIG. 1 shows the hardware configuration of a vehicle 10 that supports assisted driving or automatic driving functions.
  • the vehicle 10 is equipped with at least one LiDAR (Light Detection and Ranging, LIDAR for short) 11 on the roof and/or the side of the body.
  • the detection area of LIDAR 11 can be fixed.
  • a certain LIDAR 11 can only be used to detect a preset area.
  • the detection area of LIDAR 11 can be adjustable.
  • the lidar on the car body can scan multiple detection areas by adjusting its posture, or by adjusting the field of view of the lidar itself.
  • the vehicle 10 can be equipped with 5 LIDARs 11: on the top of the vehicle, on the front side of the vehicle, on the rear side of the vehicle, on the left side of the vehicle and on the right side of the vehicle.
  • the vehicle 10 may also be equipped with a photographing device.
  • the photographing device can capture the environment in front of it at a prescribed viewing angle.
  • the photographing device may be a single-camera camera, a multi-camera camera, or the like.
  • the vehicle 10 may also be equipped with a plurality of millimeter wave radars surrounding the vehicle 10 .
  • the vehicle 10 is equipped with four millimeter wave radars, with the left side in front of the vehicle, the right side in front of the vehicle, the left side behind the vehicle, and the right side behind the vehicle as detection ranges.
  • Millimeter wave radar can detect the distance of an object present in each detection area and detect the relative speed of the object and the vehicle 10 .
  • the vehicle 10 can also be equipped with a positioning device 12, such as Beidou positioning device, Global Positioning System (GPS), etc.
  • the current position of the vehicle 10 can be determined via the positioning device 12 .
  • the vehicle 10 can also be equipped with an electronic control unit (Electronic Control Unit, ECU for short).
  • ECU Electronic Control Unit
  • the detection signal of at least one of the above-mentioned LIDAR 11, millimeter wave radar and positioning device 12 is sent to the ECU.
  • the ECU can detect and identify obstacles (such as roadblocks, moving objects, trees, adjacent vehicles, etc. around the vehicle 10 ) based on these signals.
  • the ECU can be physically divided into multiple units according to functions, which are collectively referred to as ECUs in this application.
  • although the mobile device is described as a car, this description is not a limitation; it is applicable to a variety of mobile devices, such as land robots, water robots, etc.
  • the point cloud processing method and radar in the embodiments of the present application can be applied to any one or more electronic devices that need to process point clouds, such as the LIDAR 11, millimeter-wave radar, positioning device 12, ECU or communication systems shown in Figure 1.
  • road signal control is coordinated to improve the quality and efficiency of road management.
  • the corresponding autonomous vehicle decisions can be determined based on the information sensed by the sensing system, and the safe distance between autonomous vehicles can be adjusted, so that the vehicles can drive on the road safely and reliably.
  • Figure 2 is a schematic diagram of a ground point cloud of a vehicle-mounted lidar according to an embodiment of the present application.
  • LiDAR will adopt a point cloud filtering scheme to remove background noise and retain effective point clouds.
  • related-art point cloud filtering solutions can eliminate outliers based on neighborhood distance statistics. Effective points must meet conditions such as: the point must have a certain number of adjacent points in its neighborhood, or the point's average distance to its neighborhood must be within 1 sigma, etc.
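  • The statistics-based neighborhood filter described above can be sketched as follows; the brute-force neighbor search and the function names are illustrative, not from this application:

```python
import math

def neighbor_distances(points, idx, k=3):
    """Distances from points[idx] to its k nearest neighbors (brute force)."""
    d = sorted(math.dist(points[idx], q)
               for j, q in enumerate(points) if j != idx)
    return d[:k]

def filter_outliers(points, k=3, n_sigma=1.0):
    """Keep points whose mean k-NN distance is within n_sigma of the global
    mean. A sketch of the neighborhood-distance-statistics scheme described
    above; parameter names and the brute-force search are illustrative."""
    means = [sum(neighbor_distances(points, i, k)) / k
             for i in range(len(points))]
    mu = sum(means) / len(means)
    sigma = math.sqrt(sum((m - mu) ** 2 for m in means) / len(means))
    return [p for p, m in zip(points, means) if abs(m - mu) <= n_sigma * sigma]
```

Note that, as the surrounding text points out, such a filter can discard sparse long-range ground points along with the noise.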
  • Ground point cloud plays a very important role in vehicle-mounted lidar perception.
  • the perception algorithm needs to extract points from the ground and then cluster them to perceive obstacles on the ground; the ground point cloud can also be used for curb and lane line detection for assisted driving. Therefore, the detection capability and integrity of ground point clouds are very important for vehicle-mounted lidar.
  • for ground point clouds, especially long-distance ground point clouds with small pitch angles, the distances between ground point clouds of adjacent scan lines differ considerably. If the outlier elimination method of the related technologies is still used for point cloud filtering, the ground point cloud is easily eliminated by mistake, thus affecting the perception ability of the lidar.
  • in Figure 2, A, B and B, C are not equally spaced, and points O and P do not coincide. For example, when angle AOB and angle BOC are equal, the lengths of line segments AB and BC are different. In addition, refer to the distances between different rows of point clouds shown in Figure 10.
  • if the ground point cloud is not specially processed, some ground points will be judged as noise points and eliminated. If the point cloud filtering algorithm threshold is relaxed to ensure the integrity of the ground point cloud, some noise will pass through, which will affect judgment in application scenarios such as autonomous driving.
  • the first type of points that meet the height requirement are determined from the outlier points; the probability that a first-type point is a ground point is relatively large.
  • Figure 3 is a flow chart of a method for processing point clouds according to an embodiment of the present application.
  • the method for processing a point cloud includes operations S310 to S330 .
  • a point cloud is obtained, and the point cloud includes outlier points.
  • the point cloud may be data collected by lidar.
  • LiDAR can be installed on a variety of mobile platforms, such as vehicles, robots, or exploration equipment. LiDAR obtains multiple reflection signals during the scanning process, and these reflection signals can be converted into point clouds.
  • Figure 4 is a schematic diagram of outlier points according to an embodiment of the present application.
  • the figure shows a partial image of a point cloud in a certain frame, in which most points are clustered together and some points are in an outlier state, that is, outlier points. These outliers may be mistakenly removed when filtering the point cloud.
  • outlier points may be points obtained through point cloud filtering processing.
  • Outliers refer to extremely large and small values in a time series that are far away from the general level of the series.
  • outliers can be identified from point clouds through statistics-based methods. Because outliers appear with low probability in the probability distribution model, low-probability data objects or data samples can be detected. However, the disadvantage is also obvious: samples that appear with low probability are not necessarily outliers. If some ground points become outliers due to special ground structure, they can easily be misjudged as noise points.
  • Proximity-based methods identify outliers from point clouds by modeling, for example, the three nearest neighbors of each data object. If the points within a radius R area are significantly different from the other points in the data set, that is, their second and third nearest neighbors are significantly farther away than those of other objects (beyond a certain standard deviation, such as 1 sigma), the objects in the R area can be marked as proximity-based outliers.
  • Clustering-based methods identify outliers from point clouds and detect outliers by examining the relationship between objects and clusters.
  • an outlier is an object that belongs to a small sparse cluster or does not belong to any cluster.
  • Classification-based methods identify outliers from point clouds. If there are class labels in the training data, outlier detection can be regarded as a classification problem: train a classification model that can distinguish "normal data" from outliers. For example, a classifier that only describes the normal class can be constructed, so that samples that do not belong to the normal class are outliers. Using only the normal class to detect outliers can detect new outliers that are not close to the outliers in the training set. In this way, when a new point arrives, if it is within the decision boundary of the normal class it is a normal point, and if it is outside the decision boundary it is an outlier point. For the construction of decision boundaries, refer to the support vector machine (SVM).
  • SVM support vector machine
  • ground points in the point cloud are determined.
  • the ground points include first-type points among outlier points.
  • the first-type points meet height requirements.
  • the height requirements are determined based on the ground point height of the previous frame of the current frame point cloud.
  • the height requirement can be dynamically changed.
  • the average height range of the ground point cloud in the previous frame of the current frame point cloud can be obtained, and points within this average height range are regarded as meeting the height requirement.
  • the dynamically changing height requirement can automatically compensate for inaccurate height requirements caused by deviations in the fixed mounting position of the lidar, avoiding the elimination of outliers that are actually ground points.
  • it also avoids inaccurate height requirements caused by changes in environmental conditions such as ground conditions, which would otherwise eliminate outliers that are actually ground points.
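  • A minimal sketch of deriving such a dynamic height requirement from the previous frame's ground-point heights; the mean plus or minus a few standard deviations is one plausible statistic, and the function names are illustrative, not from this application:

```python
def height_requirement(prev_ground_heights, q=3):
    """Height range for the current frame, derived from the previous frame's
    ground-point heights as mean +/- q standard deviations.
    The exact statistic is an illustrative assumption."""
    n = len(prev_ground_heights)
    mean = sum(prev_ground_heights) / n
    std = (sum((h - mean) ** 2 for h in prev_ground_heights) / n) ** 0.5
    return mean - q * std, mean + q * std

def meets_height_requirement(height, bounds):
    lo, hi = bounds
    return lo <= height <= hi
```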
  • the height requirement can also be a preset height range.
  • the height range can be 0.9 meters to 1.1 meters.
  • the outlier points can be retained to improve the integrity of the ground point cloud.
  • ground points are output for subsequent processing of the point cloud to facilitate functions such as autonomous driving.
  • point clouds of vehicles, signs, traffic lights and other obstacles can also be output.
  • the processing method for ground point cloud features collected by vehicle-mounted lidar provided in this application can ensure the integrity of the ground point cloud while maintaining the denoising effect of point cloud filtering, thereby providing a reliable input source for perception applications of vehicle-mounted lidar.
  • the first category of points includes first-subcategory points and/or second-subcategory points; the first-subcategory points meet height consistency requirements, and the second-subcategory points are within the height range of the ground fitting straight line.
  • the ground is usually relatively flat. If the height uniformity of each point in the point cloud is poor, the probability that it is a ground point is low. Therefore, the height consistency of points in a point cloud is positively correlated with their probability of being ground points.
  • meeting the height requirement for the first type of point includes: the height of the first-type point is within the ground fitting straight line height range, and the ground fitting straight line height range is determined based on the ground height statistical results of the previous frame of the current frame where the outlier point is located.
  • the ground height statistical results include at least one of the height average, variance and minimum value of the point cloud within the height limit range.
  • the first subcategory point is determined in the following manner: for each outlier point, if the height value of the outlier point is within the height range of the ground fitting straight line, then the outlier point is determined to be a first-subcategory point.
  • the second subclass point is determined as follows: for each of the outliers, a neighborhood point of the outlier is obtained, and if the height consistency between the outlier point and the neighborhood point is within a consistency range, the outlier point is determined to be a second subclass point.
  • the distance, height and reflectivity information of the point cloud data uploaded by the lidar device is used for processing.
  • Distance, height, and reflectivity information are all two-dimensional data. All pixels of the data are traversed, and the data point with row number m and column number n is taken as the target processing point. Its distance value is Dist(m,n), its height value is Height(m,n), and its reflectivity value is Ref(m,n).
  • the radar installation height can be expressed as GroundHeightMeanPre.
  • the lowest point of height is represented as GroundHeightMinPre, and the value of GroundHeightMinPre is set to HeightTh1.
  • the height standard deviation within the height threshold range can be expressed as GroundHeightStdPre.
  • GroundHeightStdPre is set to 0.1×GroundHeightMeanPre.
  • the average height within the height threshold range is expressed as GroundHeightMeanPre.
  • the ground point cloud feature preservation algorithm is then applied, that is, it is further determined whether the outlier needs to be retained so that it can be used as a ground point. Specifically, the current point is judged to be an outlier, and the height of the current outlier point, Height(m,n), is greater than GroundHeightMeanPre-Q×GroundHeightStdPre, less than GroundHeightMeanPre+Q×GroundHeightStdPre, and greater than GroundHeightMinPre-GroundHeightStdPre.
  • if these conditions hold, the outlier point can be initially considered a ground point and should be retained. Otherwise, it is considered not to be a ground point and the next point is processed. It should be noted that after initially considering the outlier point to be a ground point, whether it is a ground point can also be further determined through the height range determination in the next step.
  • Q is an integer greater than 1. For example, Q can be 2, 3, 4, etc. In one embodiment, the value of Q is 3, which represents three standard deviations (3-sigma rule). Using 3-sigma is an analysis method based on statistics for normally distributed data.
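  • The retention test above can be written out directly; parameter names follow GroundHeightMeanPre, GroundHeightStdPre and GroundHeightMinPre, with Q defaulting to the 3-sigma choice:

```python
def retain_as_ground(height, mean_pre, std_pre, min_pre, q=3):
    """First-pass test from the description above: keep an outlier as a
    candidate ground point when its height lies within
    mean_pre +/- q*std_pre and above min_pre - std_pre.
    A sketch; the interface is illustrative."""
    in_window = mean_pre - q * std_pre < height < mean_pre + q * std_pre
    above_min = height > min_pre - std_pre
    return in_window and above_min
```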
  • the first subcategory point can be determined in the following way. If the outlier point is a first-type point, that is, the outlier point meets the height range condition, the data points in the Height(m-floor(L1/2):m+floor(L1/2), n-floor(L2/2):n+floor(L2/2)) neighborhood around the current point are analyzed as heights. If the number of points in the neighborhood whose height differs from Height(m,n) by an absolute value less than the height difference threshold HeigthDiffTh is greater than the statistical threshold HeightCntTh, the height consistency of the point is considered high. Otherwise, the height consistency of the outlier point is poor, and processing proceeds to the next point.
  • the floor() function represents rounding down.
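  • The neighborhood height-consistency check can be sketched as follows; HeigthDiffTh and HeightCntTh are passed as diff_th and cnt_th, and clamping the window at the array border is an illustrative assumption:

```python
import math

def height_consistent(height, m, n, l1, l2, diff_th, cnt_th):
    """Examine the L1 x L2 neighborhood around (m, n) in a 2-D height map:
    the point is 'height consistent' when more than cnt_th neighbors differ
    from Height(m, n) by less than diff_th. Border clamping is an
    illustrative assumption."""
    rows, cols = len(height), len(height[0])
    h0 = height[m][n]
    half1, half2 = math.floor(l1 / 2), math.floor(l2 / 2)
    count = 0
    for i in range(max(0, m - half1), min(rows, m + half1 + 1)):
        for j in range(max(0, n - half2), min(cols, n + half2 + 1)):
            if (i, j) != (m, n) and abs(height[i][j] - h0) < diff_th:
                count += 1
    return count > cnt_th
```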
  • FIG. 5 is a schematic diagram of the height of the fitted point cloud in the lidar coordinate system according to an embodiment of the present application. Points with higher height consistency are included in the ground straight line fitting point set GroundPointList.
  • if GroundPointList is expressed as a ground point cloud, it is approximately a straight line on the XOY-Z two-dimensional plane (i.e., the Dist·cosθ-Height plane, where θ is the pitch angle between the scanning line and the XOY plane). Therefore, least squares fitting can be performed on the point set in the Dist·cosθ-Height plane to estimate the ground straight line and narrow the ground height range.
  • if the absolute value of the slope k is greater than kTh (indicating that the slope is too large and does not conform to ground characteristics), or r is less than rTh (indicating that the data point set does not meet the threshold for a linear relationship), or the absolute value of the intercept b is greater than bTh (indicating that the ground height is too high or too low), the point does not belong to the ground. If none of these three conditions is triggered, the fitted height value of the current point can be obtained from the fitting straight line equation, as shown in Equation (1): RefHeight = k·Dist(m,n)·cosθ + b.
  • the ground height limit range can then be updated to RefHeight-GroundHeightStdPre ~ RefHeight+GroundHeightStdPre.
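  • The fitting and range-update step on the Dist·cosθ-Height plane can be sketched as follows; the default values for kTh, rTh and bTh are illustrative assumptions, not thresholds from this application:

```python
def fit_ground_line(xs, hs):
    """Least-squares line h = k*x + b on the (Dist*cos(theta), Height) plane,
    with Pearson correlation coefficient r, as in the fitting step above."""
    n = len(xs)
    mx, mh = sum(xs) / n, sum(hs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    shh = sum((h - mh) ** 2 for h in hs)
    sxh = sum((x - mx) * (h - mh) for x, h in zip(xs, hs))
    k = sxh / sxx
    b = mh - k * mx
    r = sxh / (sxx * shh) ** 0.5
    return k, b, r

def ground_height_range(x, k, b, r, std_pre, k_th=0.1, r_th=0.9, b_th=3.0):
    """Apply the kTh / rTh / bTh checks described above, then return the
    updated range RefHeight +/- GroundHeightStdPre.
    Threshold defaults are illustrative assumptions."""
    if abs(k) > k_th or r < r_th or abs(b) > b_th:
        return None  # the fit does not look like ground
    ref_height = k * x + b
    return ref_height - std_pre, ref_height + std_pre
```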
  • the point cloud has a large amount of data
  • the retained first type of points can be filtered based on some characteristics of the ground points.
  • the ground is usually relatively flat, and at least some of the first type points that do not conform to straight line fitting can be deleted based on straight line fitting.
  • the above method may also include the following operations.
  • the ground straight line fitting point set is fitted on a specific plane of a specific coordinate system to obtain at least one of the fitting straight line slope, fitting intercept and/or correlation coefficient.
  • the first type points that do not meet the ground fitting requirements are deleted to obtain updated first type points.
  • the data volume of the point cloud is very large, and the calculation of the entire point cloud is very computationally intensive.
  • the point cloud can be partitioned. After obtaining the updated first type of points, fitting can be performed based on methods such as least squares.
  • a particular coordinate system includes multiple fitting regions.
  • fitting the ground straight line fitting point set on a specific plane of a specific coordinate system to obtain at least one of the fitting straight line slope, fitting intercept and/or correlation coefficient includes: for each fitting area, performing least squares fitting on the first-type points in the current fitting area to obtain at least one of the slope of the fitting straight line, the fitting intercept and/or the correlation coefficient.
  • the ground fitting straight line height range can also be updated based on this to improve robustness. For example, an adaptive threshold based on data statistics is more robust than a preset fixed ground height range.
  • the above method may also include the following operations.
  • the height range is updated based on the fit line slope and the fit intercept.
  • the ground fitting straight line height range is updated based on the updated height range.
  • the ground straight line fitting point set GroundPointList is divided into N areas in the horizontal direction. Each area is ground fitted separately.
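The horizontal division of the fitting point set into N areas can be sketched as below. This is a minimal illustration assuming uniform region boundaries; the function name and the uniform-width assumption are not taken from the source.

```python
def region_index(x, x_min, x_max, n_regions):
    """Map a horizontal coordinate x to one of n_regions fitting areas.

    Assumes the areas uniformly partition [x_min, x_max]; coordinates
    outside the range are clamped to the first/last area.
    """
    if x <= x_min:
        return 0
    if x >= x_max:
        return n_regions - 1
    width = (x_max - x_min) / n_regions
    return int((x - x_min) // width)
```

Each area would then accumulate its own point list and be fitted independently, as the text describes.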
  • the fitting input consists of the current point and the points already accumulated ahead of it in the horizontal area where the current point is located.
  • M-1 further points are selected from the ground point set by jumping across rows, with a jump step of Step.
  • The M points are used to perform least squares fitting of the ground straight line, and the slope k, intercept b and correlation coefficient r of the fitted straight line are calculated. If the absolute value of the slope k is greater than kTh, or r is less than rTh, or the absolute value of the intercept b is greater than bTh, the point is not a ground point. If none of these conditions is triggered, fitting can be performed according to Equation (1), and the ground fitting straight line height range is updated to RefHeight − GroundHeightStdPre to RefHeight + GroundHeightStdPre.
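The fitting and threshold check described above can be sketched as follows. The threshold values, the function names, and the use of a non-negative correlation test are illustrative assumptions; the source only specifies the symbols kTh, rTh, bTh, RefHeight, and GroundHeightStdPre.

```python
import numpy as np

def fit_ground_line(points, k_th=0.2, r_th=0.9, b_th=0.5):
    """Least-squares line fit over (horizontal distance, height) pairs.

    `points` is an (M, 2) array: column 0 is the horizontal distance,
    column 1 is the height. k_th / r_th / b_th stand in for the kTh /
    rTh / bTh thresholds of the text. Returns (k, b, r, accepted).
    """
    x, z = points[:, 0], points[:, 1]
    k, b = np.polyfit(x, z, 1)       # slope and intercept of fitted line
    r = np.corrcoef(x, z)[0, 1]      # correlation coefficient
    accepted = abs(k) <= k_th and r >= r_th and abs(b) <= b_th
    return float(k), float(b), float(r), bool(accepted)

def updated_height_range(ref_height, ground_height_std_pre):
    """Height limit range RefHeight ± GroundHeightStdPre from the text."""
    return (ref_height - ground_height_std_pre,
            ref_height + ground_height_std_pre)
```

For a nearly flat, gently sloping road surface, the fit is accepted and the height range then tracks the fitted reference height frame by frame.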
  • the ground height is fitted based on the point cloud in the historical frame to obtain the fitted ground height range.
  • This makes it possible to determine whether the outlier points in the current frame are ground points based on the dynamically updated ground fitted straight line height range, effectively reducing the risk of misidentifying ground points as outlier noise points and deleting them.
  • the dynamically updated ground fitting straight line height range is an adaptive threshold based on data statistics, which is more robust.
  • the ground points also include at least one of the second type of points or the third type of points.
  • the second type of points meet the smoothness requirements
  • the third type of points meet the vector angle requirements.
  • the vector angle is the angle between a line connecting adjacent points in the point cloud and the ground plane.
  • This embodiment can also perform further analysis on the first type of points to improve the accuracy of ground point identification. Specifically, the first type of points obtained above can be further analyzed along dimensions such as smoothness and/or vector angle.
  • the second type of points is determined as follows.
  • the inter-row smoothness and difference in the neighborhood of the current frame where the point is located are obtained.
  • the point cloud obtained after the laser radar scans the ground is distributed row by row.
  • the inter-row smoothness can be evaluated over the two rows of the point cloud adjacent to the outlier; as long as at least one row meets the smoothness requirement, the first type point can be retained.
  • the point is determined to be a second type point based on the inter-row smoothness and the smoothness threshold, and/or based on the difference and the difference threshold.
  • Figure 6 is a schematic diagram illustrating an embodiment of the present application that does not meet smoothness requirements. Referring to Figure 6, although the points in the point cloud are clustered and distributed, the smoothness is not high.
  • obtaining the inter-line smoothness and difference in the neighborhood of the current frame where the point is located may include the following operations.
  • a neighborhood point of the point in the current frame is obtained based on a preset window, and the ratio of the length and width of the preset window is a preset value.
  • the smoothness threshold and/or the dissimilarity threshold can be set based on the distance of the object relative to the lidar.
  • the smoothness threshold and/or the difference threshold can be set to change as the distance between the object represented by the point and the lidar changes.
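One way to realize a threshold that relaxes with distance is simple linear interpolation between a near and a far value. The breakpoint distances and threshold values below are illustrative placeholders, not values from the source.

```python
def distance_adaptive_threshold(dist, near_th, far_th,
                                near_dist=5.0, far_dist=50.0):
    """Linearly relax a threshold as range grows, clamped outside
    [near_dist, far_dist]. The 5 m / 50 m breakpoints are assumptions."""
    if dist <= near_dist:
        return near_th
    if dist >= far_dist:
        return far_th
    t = (dist - near_dist) / (far_dist - near_dist)
    return near_th + t * (far_th - near_th)
```

A far-away, sparsely sampled ground patch thus gets a more lenient smoothness or difference threshold than a nearby one.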
  • If the current outlier point does not meet the height range, subsequent ground judgment is skipped and the next pixel point is processed. If it meets the height range, the data points in its surrounding neighborhood Dist(m-floor(L1/2) : m+floor(L1/2), n-floor(L2/2) : n+floor(L2/2)) are analyzed as distances, where L1 is the length of the vertical window and L2 is the length of the horizontal window; typical sizes are L1 = 1 and L2 = 2. The distance smoothness degree and difference degree of each row in the distance neighborhood are calculated; the formulas for the smoothness degree SmoothCoef and the difference degree VarCoef can be as shown in Equation (2).
  • Judgment is based on the result of Equation (2). Although ground points show large distance differences along the same scan line, the overall change is monotonic and smooth. It is judged whether the following conditions are met: SmoothCoef is less than the threshold Th1 and VarCoef is less than the threshold Th2; the threshold selection changes with distance and is relatively relaxed at long range. If the mth row where the current point is located satisfies these conditions, and more than one of the two adjacent rows also meets them, the point is considered a second type point and is retained, and processing moves to the next pixel; otherwise, it can be judged whether the first type point belongs to the third type.
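Since the exact formulas of Equation (2) are not reproduced in this text, the sketch below uses plausible stand-ins: a smoothness measure based on normalized second differences (which is zero for a monotonic linear range profile) and the coefficient of variation as the difference degree. Both definitions are assumptions for illustration only.

```python
import numpy as np

def row_smoothness(dists):
    """Plausible stand-ins for SmoothCoef / VarCoef of Equation (2).

    SmoothCoef penalizes jagged, non-monotonic range profiles via the
    sum of absolute second differences, normalized by the mean range;
    VarCoef is the coefficient of variation of the row's ranges.
    """
    d = np.asarray(dists, dtype=float)
    second = np.abs(np.diff(d, n=2))              # curvature of the profile
    mean = max(float(d.mean()), 1e-9)             # guard against division by 0
    smooth_coef = float(second.sum()) / mean
    var_coef = float(d.std()) / mean
    return smooth_coef, var_coef
```

A smoothly receding ground row yields small values of both measures, while a noisy row spikes the smoothness term.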
  • the vector angle includes at least one of a first sub-angle, a second sub-angle, or a third sub-angle.
  • the first sub-angle is the angle between the first connection line and the corresponding ground plane.
  • the second sub-angle is the angle between the second connection line and the corresponding ground plane.
  • the third sub-angle is the angle between the third connection line and the corresponding ground plane.
  • the first connection line connects the current point and the previous adjacent point in the same column of the current frame.
  • the second connection line connects the current point and the next adjacent point in the same column of the current frame.
  • the third connection line connects the previous adjacent point and the next adjacent point in the same column of the current frame.
  • FIG. 7 is a schematic diagram of vector angles according to an embodiment of the present application.
  • The vertical angular resolution of the radar enters this calculation. If the nth column where the current point is located satisfies the condition that all three angles are less than the angle threshold AngleTh, and at least 2 adjacent columns also satisfy it, the point is considered part of the ground point cloud and is retained, and processing moves on to the next pixel point; otherwise, the next pixel point is processed directly. All first type points are traversed to determine whether they are third type points, and thus whether each outlier point is retained.
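The three sub-angle checks above can be sketched as follows, assuming a z-up coordinate system in which the ground plane is horizontal. The 10° threshold and the function names are illustrative assumptions; the source only names the threshold AngleTh.

```python
import math

def vector_ground_angle(p_a, p_b):
    """Angle in degrees between the line through two 3-D points (x, y, z)
    and the horizontal ground plane (z-up assumed)."""
    dx, dy, dz = (p_b[0] - p_a[0], p_b[1] - p_a[1], p_b[2] - p_a[2])
    horiz = math.hypot(dx, dy)
    return math.degrees(math.atan2(abs(dz), horiz))

def is_third_type(current, prev_pt, next_pt, angle_th=10.0):
    """True if all three sub-angles stay below AngleTh (10° assumed)."""
    angles = (
        vector_ground_angle(current, prev_pt),   # first connection line
        vector_ground_angle(current, next_pt),   # second connection line
        vector_ground_angle(prev_pt, next_pt),   # third connection line
    )
    return all(a < angle_th for a in angles)
```

Nearly collinear, nearly horizontal triples of same-column points pass the test; a step up a wall or obstacle produces a large sub-angle and fails it.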
  • the point cloud may also include non-outlier points.
  • Figure 8 is another flowchart of processing point clouds according to an embodiment of the present application.
  • operation S310 may be replaced by operation S810 to obtain a point cloud, where the point cloud includes outlier points and non-outlier points.
  • Operation S330 may be replaced by operation S830, which outputs the ground points and at least part of the non-outlier points in the point cloud data.
  • point clouds of pedestrians, vehicles, traffic lights, street lights and other objects can also be output.
  • the point clouds of these objects can also be filtered, classified, etc., in order to implement functions such as autonomous driving.
  • non-outlier points are valid point clouds, including but not limited to pedestrians, plants, or man-made objects.
  • If the height value of the current non-outlier point is greater than the height threshold HeightTh1 and less than the height threshold HeightTh2 (HeightTh1 is generally less than 0, since the ground line sits at a negative height position and can be set according to the vehicle radar installation height), and the distance is less than the distance threshold DistTh1, the point is included in the ground point cloud height statistics of the frame. The lowest height point of the current frame, GroundHeightMinPre, the height average within the height threshold range, GroundHeightMeanPre, and the height standard deviation within the height threshold range, GroundHeightStdPre, are counted and used to correct the ground area selection of the next frame, after which point cloud filtering proceeds to the next point.
  • GroundHeightMeanPre is set to the radar installation height
  • GroundHeightMinPre is set to HeightTh1
  • GroundHeightStdPre is set to 0.1*radar installation height.
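The per-frame statistics and their initial values can be sketched as below. The function name and the fallback behavior when no point qualifies are assumptions; the source specifies only the initial values (installation height, HeightTh1, 0.1 × installation height). Note the sign convention of the installation height depends on the mounting setup.

```python
import numpy as np

def frame_ground_stats(heights, height_th1, height_th2, radar_install_height):
    """Compute (GroundHeightMeanPre, GroundHeightMinPre, GroundHeightStdPre)
    from the heights of qualifying non-outlier points of one frame.

    Falls back to the initial values described in the text when no point
    lies inside (HeightTh1, HeightTh2).
    """
    h = np.asarray([v for v in heights if height_th1 < v < height_th2])
    if h.size == 0:
        # Initialization per the text: mean = installation height,
        # min = HeightTh1, std = 0.1 * installation height.
        return (radar_install_height, height_th1,
                0.1 * radar_install_height)
    return float(h.mean()), float(h.min()), float(h.std())
```

These three values then constrain the initial ground height screening of the next frame.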
  • FIG. 9 is another flowchart of processing a point cloud according to an embodiment of the present application.
  • the lidar uploads the collected point cloud data
  • the processor traverses all uploaded data points (m, n), performs point cloud filtering on the data points, and determines whether the current point is an outlier or a non-outlier. If it is not an outlier, the current point is determined to be a valid point, and the height average, variance and minimum value within the height range limited by the current frame are calculated and used to limit the height range of the next frame.
  • If the current point is an outlier, then based on the ground height statistics of the previous frame, it is determined whether the current point is within the ground height range (which can also be called the initial screening ground height range). If it is not within the ground height range, processing moves to the next point. If it is within the ground height range, height neighborhood data are selected to determine whether height consistency is high.
  • the current point is added to the ground fitting point set, and least squares fitting is performed on the point set on the XOY-Z plane.
  • It is judged whether the slope, intercept and correlation coefficient of the fitted straight line meet the ground fitting requirements. If they do not, the next point is traversed. If they do, the ground height limit range is updated based on the fitted straight line to improve robustness.
  • It is determined whether the smoothness degree and difference degree meet the threshold requirements. If so, the outlier point is determined to be a ground point and is retained. If not, the angle between the pairwise vectors of all row scan points in each column in the neighborhood and the horizontal plane is calculated. Then it is determined whether the vector angle meets the threshold condition. If it does, the current outlier point is determined to be a ground point and is retained; if not, the current outlier point is determined to be a noise point rather than a ground point and is eliminated.
  • the ground height is a condition that must be met for judgment, which may include at least one of height consistency and compliance with the ground fitting straight line height range; of smoothness and vector angle, satisfying either one is sufficient. Furthermore, the order of the above operations can be adjusted.
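The decision logic just described can be condensed as follows. The function is a sketch of the ordering in the text (mandatory height condition, then either smoothness or vector angle); its name and boolean-flag interface are assumptions for illustration.

```python
def is_ground_point(in_height_range, height_consistent, in_fitted_range,
                    smooth_ok, angle_ok):
    """Classify an outlier point as ground per the order described above.

    The ground height condition is mandatory: the point must lie in the
    initial screening height range, and satisfy height consistency and/or
    the fitted-line height range. After that, passing either the
    smoothness test or the vector-angle test is sufficient.
    """
    if not in_height_range:
        return False
    if not (height_consistent or in_fitted_range):
        return False
    return smooth_ok or angle_ok
```

Passing only one of the two secondary tests is enough, which keeps sparse far-range ground points that fail smoothness but pass the angle check.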
  • Figure 10 is a schematic diagram of an original point cloud according to an embodiment of the present application.
  • the dotted arrows indicate ground line points
  • the single solid line arrows indicate lane line points
  • the double solid line arrows indicate noise points. It can be seen that the point cloud on the ground that is farther away from the lidar has a higher degree of dispersion, and the distance between two adjacent lines is also larger, and it is easy to be eliminated as outliers.
  • there are also outliers indicated by double solid arrows in the point cloud which are noise points and need to be removed.
  • Figure 11 is a schematic diagram of a filtered point cloud according to an embodiment of the present application. Referring to Figure 10 and Figure 11 together: after point cloud filtering, the noise is removed; however, the point cloud of the ground far away from the lidar in Figure 10 was mistakenly deleted, and the lane lines were also deleted by mistake. This degrades the lidar's perception capability.
  • Figure 12 is a schematic diagram of a processed point cloud according to an embodiment of the present application.
  • the noise points are removed.
  • the point clouds on the ground farther away from the lidar are more completely preserved, and the lane lines are also preserved.
  • this embodiment can effectively remove abnormal noise in point clouds on the basis of satisfying the sensing capabilities of lidar.
  • Another aspect of the present application also provides a device for processing point clouds.
  • Figure 13 is a schematic structural diagram of a device for processing point clouds according to an embodiment of the present application.
  • the device 1300 for processing point clouds includes: a point cloud acquisition module 1310, a ground point determination module 1320, and a point cloud output module 1330.
  • the point cloud obtaining module 1310 is used to obtain a point cloud, which includes outlier points.
  • the ground point determination module 1320 is used to determine the ground points in the point cloud.
  • the ground points include the first type of points among the outlier points.
  • the first type of points meet the height requirements.
  • the height requirement is determined based on the height of the ground points in the point cloud frame preceding the current point cloud frame.
  • the point cloud output module 1330 is used to output at least ground points. In addition, the point cloud output module 1330 can also be used to output non-outlier points.
  • the first type of points may include first sub-category points and/or second sub-category points, the first sub-category points meet the height consistency requirement, and the second sub-category points are within the height range of the ground fitting straight line.
  • Another aspect of the application also provides a radar.
  • Figure 14 is a schematic structural diagram of a radar according to an embodiment of the present application.
  • the radar 1400 may include circuitry.
  • a circuit could implement the method shown above for processing point clouds.
  • the circuit can be arranged on the circuit board 1410, and multiple chips, such as central control chips, can be arranged on the circuit board 1410.
  • the circuit board 1410 may be disposed in the housing 1420.
  • the radar may be a scanning radar or a non-scanning radar.
  • scanning laser radars include MEMS laser radars, mechanical laser radars, laser radars including multiple scanning devices, etc.
  • Non-scanning laser radars include flash laser radars, phased array laser radars, etc. This application does not limit the type of laser radar.
  • Take the MEMS solid-state lidar as an example. Since the MEMS solid-state lidar scans through the simple harmonic vibration of a galvanometer, its scanning path is realized as a spatial sequence, for example, a slow axis scanning from top to bottom and a fast axis reciprocating from left to right across the field of view. Therefore, the detection range of a MEMS solid-state lidar is divided by the field of view angle corresponding to the slow axis; for example, the slow axis may correspond to a vertical field of view of -13° to 13°.
  • Take the mechanical lidar among scanning radars as an example. Since a mechanical lidar drives the optical system with a mechanical drive device that rotates 360° to achieve scanning, a cylindrical detection area is formed with the lidar at its center. The detection range corresponding to one 360° rotation is the detection range of one frame of data, so the detection range of one mechanical lidar cycle is generally divided by the degree of rotation.
  • For a non-scanning lidar, the received light is processed by the internal photosensitive components and control circuitry and converted into a digital signal that can be recognized and further processed by a computer. The detection field of view of one period is then generally divided by the area of the receiving detector.
  • Another aspect of the application also provides an electronic device.
  • FIG. 15 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • electronic device 1500 may include memory 1510 and processor 1520 .
  • the electronic device 1500 may also be provided with various sensors such as laser radar.
  • the processor 1520 may be a central processing unit (CPU), or other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor or any conventional processor, etc.
  • Memory 1510 may include various types of storage units, such as system memory, read-only memory (ROM), and persistent storage. Among them, ROM can store static data or instructions required by the processor 1520 or other modules of the computer. Persistent storage may be readable and writable storage. Persistent storage may be a non-volatile storage device that does not lose stored instructions and data even when the computer is powered off. In some embodiments, the permanent storage device uses a large-capacity storage device (eg, magnetic or optical disk, flash memory) as the permanent storage device. In other embodiments, the permanent storage device may be a removable storage device (eg, floppy disk, optical drive).
  • System memory can be a read-write storage device or a volatile read-write storage device, such as dynamic random access memory.
  • System memory can store some or all of the instructions and data the processor needs to run.
  • memory 1510 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (e.g., DRAM, SRAM, SDRAM, flash memory, programmable read-only memory); magnetic disks and/or optical disks may also be used.
  • memory 1510 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density disc, a flash memory card (such as an SD card, a mini SD card, a Micro-SD card, etc.), a magnetic floppy disk, etc.
  • Computer-readable storage media do not contain carrier waves and transient electronic signals that are transmitted wirelessly or wired.
  • the memory 1510 stores executable code; when the executable code is executed by the processor 1520, the processor 1520 can be caused to execute part or all of the above-mentioned methods.
  • the method according to the present application can also be implemented as a computer program or computer program product, which computer program or computer program product includes computer program code instructions for executing part or all of the steps in the above method of the present application.
  • The application may also be implemented as a computer-readable storage medium (or a non-transitory machine-readable storage medium, or a machine-readable storage medium) having executable code (or a computer program, or computer instruction code) stored thereon. When the executable code (or computer program, or computer instruction code) is executed by the processor of an electronic device, a server, or the like, the processor is caused to execute part or all of the steps of the above method according to the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method for processing a point cloud, and a radar, are provided. The method for processing a point cloud comprises: acquiring a point cloud, the point cloud comprising outlier points (S310); determining ground points in the point cloud, the ground points comprising a first type of points among the outlier points, the first type of points satisfying a height requirement, the height requirement being determined on the basis of the height of ground points in the point cloud frame preceding the current point cloud frame (S320); and outputting at least the ground points (S330). The method can eliminate abnormal noise points in point clouds and effectively identify ground points while satisfying lidar detection requirements, thereby ensuring the accuracy of ground point identification.
PCT/CN2022/120909 2022-09-23 2022-09-23 Procédé de traitement de nuage de points et radar Ceased WO2024060209A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/120909 WO2024060209A1 (fr) 2022-09-23 2022-09-23 Procédé de traitement de nuage de points et radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/120909 WO2024060209A1 (fr) 2022-09-23 2022-09-23 Procédé de traitement de nuage de points et radar

Publications (1)

Publication Number Publication Date
WO2024060209A1 true WO2024060209A1 (fr) 2024-03-28

Family

ID=90453716

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/120909 Ceased WO2024060209A1 (fr) 2022-09-23 2022-09-23 Procédé de traitement de nuage de points et radar

Country Status (1)

Country Link
WO (1) WO2024060209A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118470331A (zh) * 2024-07-12 2024-08-09 山东大学 基于自适应同心圆模型的地面点云分割方法及系统
CN120949184A (zh) * 2025-08-01 2025-11-14 北京建筑大学 一种地基雷达城市桥梁长时序监测的静杂波去除方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150248577A1 (en) * 2012-09-21 2015-09-03 Umwelt (Australia) Pty. Limited On-ground or near-ground discrete object detection method and system
WO2020006765A1 (fr) * 2018-07-06 2020-01-09 深圳前海达闼云端智能科技有限公司 Procédé de détection de sol, dispositif associé et support de stockage lisible par ordinateur
CN111598780A (zh) * 2020-05-14 2020-08-28 山东科技大学 一种适用于机载LiDAR点云的地形自适应插值滤波方法
US20200372670A1 (en) * 2019-05-24 2020-11-26 Toyota Research Institute, Inc. Systems and methods for object detection including z-domain and range-domain analysis
CN112906449A (zh) * 2020-12-02 2021-06-04 北京中科慧眼科技有限公司 基于稠密视差图的路面坑洼检测方法、系统和设备
CN114170149A (zh) * 2021-11-17 2022-03-11 东南大学 一种基于激光点云的道路几何信息提取方法


Similar Documents

Publication Publication Date Title
CN111753609B (zh) 一种目标识别的方法、装置及摄像机
US11295521B2 (en) Ground map generation
CN109283538B (zh) 一种基于视觉和激光传感器数据融合的海上目标大小检测方法
CN112513679B (zh) 一种目标识别的方法和装置
WO2021097618A1 (fr) Procédé et système de segmentation de nuage de points, et support d'enregistrement informatique
WO2020243962A1 (fr) Procédé de détection d'objet, dispositif électronique et plateforme mobile
WO2021012987A1 (fr) Procédé, appareil et système de détection d'objet, et dispositif associé
CN114821526A (zh) 基于4d毫米波雷达点云的障碍物三维边框检测方法
CN114187579A (zh) 自动驾驶的目标检测方法、装置和计算机可读存储介质
CN115273062A (zh) 一种融合三维激光雷达和单目相机的3d目标检测方法
WO2022198637A1 (fr) Procédé et système de filtrage de bruit en nuage de points et plate-forme mobile
CN115639536B (zh) 基于多传感器融合的无人船感知目标检测方法及装置
WO2020253764A1 (fr) Procédé et appareil de détermination d'informations de région de déplacement
WO2024060209A1 (fr) Procédé de traitement de nuage de points et radar
WO2024159623A1 (fr) Procédé et appareil de détection d'obstacle flottant, dispositif électronique et support de stockage
CN114002708B (zh) 一种面向无人艇应用的尾浪滤除方法
CN115994934B (zh) 数据时间对齐方法、装置以及域控制器
US20240385318A1 (en) Machine-learning based object detection and localization using ultrasonic sensor data
CN113902043B (zh) 目标识别方法、装置及设备
CN119156548A (zh) 一种点云评估方法及装置
CN112766100A (zh) 一种基于关键点的3d目标检测方法
CN116246162A (zh) 一种基于多传感器的无人船过闸识别方法
CN117409393A (zh) 一种焦炉机车激光点云与视觉融合障碍物检测方法及系统
CN116125440A (zh) 用于确定激光雷达传感器的最大作用范围的方法以及设备
CN115527034A (zh) 一种车端点云动静分割方法、装置及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22959216

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22959216

Country of ref document: EP

Kind code of ref document: A1