
CN111275075A - Vehicle detection and tracking method based on 3D laser radar

Info

Publication number
CN111275075A
CN111275075A (application CN202010029355.9A)
Authority
CN
China
Prior art keywords
axis
points
vehicle
point cloud
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010029355.9A
Other languages
Chinese (zh)
Other versions
CN111275075B (en)
Inventor
张晓东
刘毅枫
王则陆
Current Assignee
Shandong Chaoyue CNC Electronics Co Ltd
Original Assignee
Shandong Chaoyue CNC Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shandong Chaoyue CNC Electronics Co Ltd
Priority to CN202010029355.9A
Publication of CN111275075A
Application granted
Publication of CN111275075B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/66 Tracking systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Mathematical Optimization (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pure & Applied Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Analysis (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A 3D-laser-radar-based vehicle detection and tracking method uses a 3D laser radar as the sensor, detects vehicles with an SVM classifier trained on seven groups of vehicle features (vehicle features I through VII), and tracks vehicles with a global nearest neighbor algorithm and unscented Kalman filtering, so that the tracking result improves the true rate of vehicle detection.

Description

Vehicle detection and tracking method based on 3D laser radar
Technical Field
The invention relates to the technical field of unmanned driving, in particular to a vehicle detection and tracking method based on a 3D laser radar.
Background
With the continuous development of control systems, sensors, and artificial intelligence, research on unmanned vehicles has made great progress, and the related technologies are increasingly applied in military, scientific research, industrial, and everyday settings. The research significance of unmanned driving lies not only in the core scientific problems it involves but also in its strategic value and broad application prospects, which is why it attracts such wide public attention.
Because vehicles travel at relatively high speeds, are major participants in road traffic, and are involved in a large share of traffic accidents, they are among the most important objects with which an unmanned vehicle interacts in road environments. With the development of unmanned driving, research on vehicle detection and tracking has advanced markedly and has become one of the key problems in the field. How to improve the true rate of vehicle detection therefore remains a problem in the prior art.
Disclosure of Invention
To overcome the shortcomings of the above technology, the invention provides a 3D-laser-radar-based vehicle detection and tracking method that improves the true rate of vehicle detection.
The technical scheme adopted by the invention for overcoming the technical problems is as follows:
a vehicle detection and tracking method based on a 3D laser radar comprises the following steps:
a) collecting 3D point cloud data on an unmanned vehicle with a 64-line laser radar, preprocessing the data, and obtaining n object samples S_k, k = 1, 2, ..., n, where S_k = {p_1, p_2, ..., p_{N_k}}, p_t is a point in object k, and N_k is the number of points in the k-th object sample;
b) denoting the feature vector of the 3D point cloud by f = (f_1, f_2, ..., f_m), with m = 207;
c) vehicle feature I consists of the six independent terms f_1 to f_6 of the inertia tensor matrix M of object sample S_k;
d) vehicle feature II consists of the six independent terms f_7 to f_12 of the covariance matrix C of object sample S_k;
e) establishing a coordinate system with the center point of the rear axle of the unmanned vehicle as the origin, the vertically upward direction as the positive z-axis, the vehicle heading as the positive x-axis, and the positive y-axis given by the right-hand rule; vehicle feature III is obtained by dividing object sample S_k into 10 layers along the z-axis, projecting the points of each layer onto a plane perpendicular to the z-axis, and calculating each layer's length along the x-axis, giving 10 slice features f_13 to f_22;
f) dividing the plane formed by the x-axis and the z-axis into a 14 × 9 grid and projecting the 3D point cloud onto it to obtain a normalized 2D histogram of the xz plane; the number of points falling in each cell is the value of the corresponding histogram bin, and the 14 × 9 bin values form vehicle feature IV, f_23 to f_148;
g) dividing the plane formed by the y-axis and the z-axis into a 9 × 6 grid and projecting the 3D point cloud onto it to obtain a normalized 2D histogram of the yz plane; the number of points falling in each cell is the value of the corresponding histogram bin, and the 9 × 6 bin values form vehicle feature V, f_149 to f_202;
h) vehicle feature VI consists of the mean and variance of the reflection intensity of object sample S_k, f_203 and f_204;
i) vehicle feature VII consists of the aspect ratios of object sample S_k, f_205 to f_207;
j) preparing a data set of positive and negative samples, where the positive samples are various vehicles scanned by the 3D laser radar from different angles and the negative samples are bushes, building walls, and pedestrians; the data set formed by f_1 to f_207 is divided into a training set and a test set, both containing positive and negative samples; the training set is used to extract vehicle features and train an SVM classifier, and the classification performance of the SVM classifier is verified on the test set;
k) stopping training when the classification performance meets the requirement, and repeating step j) otherwise;
l) using a global nearest neighbor algorithm in combination with a Kalman filtering algorithm for vehicle tracking.
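The train-and-verify loop of steps j) and k) might be sketched as follows; this is an illustration using scikit-learn and random stand-in data, not the patent's actual data set, kernel, or stopping criterion:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in data: one 207-dimensional feature vector f1..f207 per object
# sample; label 1 = vehicle (positive), 0 = bush/wall/pedestrian (negative).
X = rng.normal(size=(200, 207))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Split into training and test sets, both containing positives and negatives.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf")           # SVM classifier on the vehicle features
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # verify classification on the test set
```

In practice the requirement check of step k) would compare `accuracy` (or precision/recall on the vehicle class) against a chosen threshold before accepting the classifier.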
Further, the inertia tensor matrix in step c) is calculated as

$$M=\sum_{t=1}^{N_k}\begin{pmatrix}y_t^2+z_t^2&-x_t y_t&-x_t z_t\\-x_t y_t&x_t^2+z_t^2&-y_t z_t\\-x_t z_t&-y_t z_t&x_t^2+y_t^2\end{pmatrix}$$

where x_t, y_t, z_t are the coordinates of the t-th point of the 3D point cloud in the coordinate system.
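Assuming M is the standard inertia tensor of a unit-mass point set (the patent's equation figure is not reproduced in this text), vehicle feature I might be computed as in this sketch:

```python
import numpy as np

def inertia_features(points):
    """f1-f6: the six independent entries of the symmetric 3x3
    inertia tensor M of an object point cloud (N x 3 array)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    M = np.array([
        [np.sum(y**2 + z**2), -np.sum(x * y),       -np.sum(x * z)],
        [-np.sum(x * y),       np.sum(x**2 + z**2), -np.sum(y * z)],
        [-np.sum(x * z),      -np.sum(y * z),        np.sum(x**2 + y**2)],
    ])
    return M[np.triu_indices(3)]  # symmetric matrix -> 6 independent terms
```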
Further, the covariance matrix in step d) is

$$C=\begin{pmatrix}c_{xx}&c_{xy}&c_{xz}\\c_{yx}&c_{yy}&c_{yz}\\c_{zx}&c_{zy}&c_{zz}\end{pmatrix}$$

where

$$c_{xx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})^2,\quad c_{yy}=\frac{1}{N_k}\sum_{t=1}^{N_k}(y_t-\bar{y})^2,\quad c_{zz}=\frac{1}{N_k}\sum_{t=1}^{N_k}(z_t-\bar{z})^2,$$

$$c_{xy}=c_{yx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})(y_t-\bar{y}),\quad c_{xz}=c_{zx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})(z_t-\bar{z}),\quad c_{yz}=c_{zy}=\frac{1}{N_k}\sum_{t=1}^{N_k}(y_t-\bar{y})(z_t-\bar{z}),$$

and x̄, ȳ, z̄ are the averages of the coordinates of the points of the 3D point cloud in the x, y, and z directions.
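Vehicle feature II can be sketched the same way, assuming the population (1/N_k) covariance implied by the surrounding formulas:

```python
import numpy as np

def covariance_features(points):
    """f7-f12: the six independent entries of the 3x3 covariance
    matrix C of an object point cloud (N x 3 array)."""
    C = np.cov(points.T, bias=True)  # population covariance, 1/N_k weighting
    return C[np.triu_indices(3)]     # symmetric matrix -> 6 independent terms
```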
Further, step f) uses the formulas

$$p=\left\lfloor 14\cdot\frac{x_i-x_{\min}}{x_{\max}-x_{\min}}\right\rfloor,\qquad q=\left\lfloor 9\cdot\frac{z_i-z_{\min}}{z_{\max}-z_{\min}}\right\rfloor$$

to assign the i-th point to grid cell (p, q) and counts the points in each cell, where x_min and x_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-axis, z_min and z_max are the minimum and maximum values on the z-axis, and x_i and z_i are the coordinates of the i-th point on the x-axis and the z-axis.
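A minimal sketch of the grid-histogram features of steps f) and g), assuming equal-width bins over each object's bounding box (NumPy's `histogram2d`; the patent's exact bin-edge convention may differ):

```python
import numpy as np

def grid_histogram_features(points, dims=(0, 2), bins=(14, 9)):
    """Normalized 2D histogram of a cloud projected onto one plane.
    dims=(0, 2) with 14x9 bins -> xz plane, feature IV (f23-f148);
    dims=(1, 2) with 9x6 bins  -> yz plane, feature V (f149-f202)."""
    a, b = points[:, dims[0]], points[:, dims[1]]
    H, _, _ = np.histogram2d(a, b, bins=bins)
    return (H / H.sum()).ravel()  # normalize: bin values sum to 1
```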
Further, step g) uses the formulas

$$p=\left\lfloor 9\cdot\frac{y_i-y_{\min}}{y_{\max}-y_{\min}}\right\rfloor,\qquad q=\left\lfloor 6\cdot\frac{z_i-z_{\min}}{z_{\max}-z_{\min}}\right\rfloor$$

to assign the i-th point to grid cell (p, q) and counts the points in each cell, where y_min and y_max are the minimum and maximum values of the points of the corresponding object point cloud on the y-axis, z_min and z_max are the minimum and maximum values on the z-axis, and y_i and z_i are the coordinates of the i-th point on the y-axis and the z-axis.
Further, step h) calculates the mean reflection intensity by the formula

$$\bar{I}=\frac{1}{N_k}\sum_{i=1}^{N_k}I_i$$

where I_i is the reflection intensity of the i-th point, and the variance by the formula

$$I_{cov}=\frac{1}{N_k}\sum_{i=1}^{N_k}(I_i-\bar{I})^2$$
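Feature VI, following the mean and population-variance formulas above, might be computed as:

```python
import numpy as np

def intensity_features(intensities):
    """f203-f204: mean and (population) variance of the per-point
    reflection intensity of one object sample (1-D array)."""
    mean_i = intensities.mean()
    var_i = ((intensities - mean_i) ** 2).mean()
    return mean_i, var_i
```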
Further, step i) calculates the aspect ratio d_xz by the formula

$$d_{xz}=\frac{x_{\max}-x_{\min}}{z_{\max}-z_{\min}}$$

the aspect ratio d_xy by the formula

$$d_{xy}=\frac{x_{\max}-x_{\min}}{y_{\max}-y_{\min}}$$

and the aspect ratio d_yz by the formula

$$d_{yz}=\frac{y_{\max}-y_{\min}}{z_{\max}-z_{\min}}$$

where x_min, x_max, y_min, y_max, z_min, z_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-, y-, and z-axes.
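Feature VII, following the three bounding-box ratios above, might be computed as:

```python
import numpy as np

def aspect_ratio_features(points):
    """f205-f207: bounding-box aspect ratios d_xz, d_xy, d_yz of an
    object point cloud (N x 3 array)."""
    dx, dy, dz = points.max(axis=0) - points.min(axis=0)
    return dx / dz, dx / dy, dy / dz
```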
The beneficial effects of the invention are: using a 3D laser radar as the sensor, vehicles are detected with an SVM classifier trained on vehicle features I through VII and tracked with a global nearest neighbor algorithm and unscented Kalman filtering; the tracking result is then used to improve the true rate of vehicle detection.
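The data-association half of the tracking step can be sketched as follows; the patent names a global nearest neighbor algorithm, while the Hungarian solver and the gate threshold below are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def gnn_assign(tracks, detections, gate=5.0):
    """Global nearest neighbour: one-to-one track/detection pairing that
    minimizes total distance (Hungarian algorithm), rejecting pairs
    farther apart than `gate`. tracks: M x 2, detections: K x 2 (x, y)."""
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]
```

Each matched detection would then update its track's state through the (unscented) Kalman filter's predict/update cycle; unmatched detections can spawn new tracks.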
The above description is only a preferred embodiment of the present invention, and is only used to illustrate the technical solutions of the present invention, and not to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (7)

1. A vehicle detection and tracking method based on a 3D laser radar is characterized by comprising the following steps:
a) collecting 3D point cloud data on an unmanned vehicle with a 64-line laser radar, preprocessing the data, and obtaining n object samples S_k, k = 1, 2, ..., n, where S_k = {p_1, p_2, ..., p_{N_k}}, p_t is a point in object k, and N_k is the number of points in the k-th object sample;
b) denoting the feature vector of the 3D point cloud by f = (f_1, f_2, ..., f_m), with m = 207;
c) vehicle feature I consists of the six independent terms f_1 to f_6 of the inertia tensor matrix M of object sample S_k;
d) vehicle feature II consists of the six independent terms f_7 to f_12 of the covariance matrix C of object sample S_k;
e) establishing a coordinate system with the center point of the rear axle of the unmanned vehicle as the origin, the vertically upward direction as the positive z-axis, the vehicle heading as the positive x-axis, and the positive y-axis given by the right-hand rule; vehicle feature III is obtained by dividing object sample S_k into 10 layers along the z-axis, projecting the points of each layer onto a plane perpendicular to the z-axis, and calculating each layer's length along the x-axis, giving 10 slice features f_13 to f_22;
f) dividing the plane formed by the x-axis and the z-axis into a 14 × 9 grid and projecting the 3D point cloud onto it to obtain a normalized 2D histogram of the xz plane; the number of points falling in each cell is the value of the corresponding histogram bin, and the 14 × 9 bin values form vehicle feature IV, f_23 to f_148;
g) dividing the plane formed by the y-axis and the z-axis into a 9 × 6 grid and projecting the 3D point cloud onto it to obtain a normalized 2D histogram of the yz plane; the number of points falling in each cell is the value of the corresponding histogram bin, and the 9 × 6 bin values form vehicle feature V, f_149 to f_202;
h) vehicle feature VI consists of the mean and variance of the reflection intensity of object sample S_k, f_203 and f_204;
i) vehicle feature VII consists of the aspect ratios of object sample S_k, f_205 to f_207;
j) preparing a data set of positive and negative samples, where the positive samples are various vehicles scanned by the 3D laser radar from different angles and the negative samples are bushes, building walls, and pedestrians; the data set formed by f_1 to f_207 is divided into a training set and a test set, both containing positive and negative samples; the training set is used to extract vehicle features and train an SVM classifier, and the classification performance of the SVM classifier is verified on the test set;
k) stopping training when the classification performance meets the requirement, and repeating step j) otherwise;
l) using a global nearest neighbor algorithm in combination with a Kalman filtering algorithm for vehicle tracking.
2. The 3D lidar based vehicle detection and tracking method of claim 1, wherein the inertia tensor matrix in step c) is calculated as

$$M=\sum_{t=1}^{N_k}\begin{pmatrix}y_t^2+z_t^2&-x_t y_t&-x_t z_t\\-x_t y_t&x_t^2+z_t^2&-y_t z_t\\-x_t z_t&-y_t z_t&x_t^2+y_t^2\end{pmatrix}$$

where x_t, y_t, z_t are the coordinates of the t-th point of the 3D point cloud in the coordinate system.
3. The 3D lidar based vehicle detection and tracking method of claim 1, wherein the covariance matrix in step d) is

$$C=\begin{pmatrix}c_{xx}&c_{xy}&c_{xz}\\c_{yx}&c_{yy}&c_{yz}\\c_{zx}&c_{zy}&c_{zz}\end{pmatrix}$$

where

$$c_{xx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})^2,\quad c_{yy}=\frac{1}{N_k}\sum_{t=1}^{N_k}(y_t-\bar{y})^2,\quad c_{zz}=\frac{1}{N_k}\sum_{t=1}^{N_k}(z_t-\bar{z})^2,$$

$$c_{xy}=c_{yx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})(y_t-\bar{y}),\quad c_{xz}=c_{zx}=\frac{1}{N_k}\sum_{t=1}^{N_k}(x_t-\bar{x})(z_t-\bar{z}),\quad c_{yz}=c_{zy}=\frac{1}{N_k}\sum_{t=1}^{N_k}(y_t-\bar{y})(z_t-\bar{z}),$$

and x̄, ȳ, z̄ are the averages of the coordinates of the points of the 3D point cloud in the x, y, and z directions.
4. The 3D lidar based vehicle detection and tracking method of claim 1, wherein step f) uses the formulas

$$p=\left\lfloor 14\cdot\frac{x_i-x_{\min}}{x_{\max}-x_{\min}}\right\rfloor,\qquad q=\left\lfloor 9\cdot\frac{z_i-z_{\min}}{z_{\max}-z_{\min}}\right\rfloor$$

to assign the i-th point to grid cell (p, q) and counts the points in each cell, where x_min and x_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-axis, z_min and z_max are the minimum and maximum values on the z-axis, and x_i and z_i are the coordinates of the i-th point on the x-axis and the z-axis.
5. The 3D lidar based vehicle detection and tracking method of claim 1, wherein step g) uses the formulas

$$p=\left\lfloor 9\cdot\frac{y_i-y_{\min}}{y_{\max}-y_{\min}}\right\rfloor,\qquad q=\left\lfloor 6\cdot\frac{z_i-z_{\min}}{z_{\max}-z_{\min}}\right\rfloor$$

to assign the i-th point to grid cell (p, q) and counts the points in each cell, where y_min and y_max are the minimum and maximum values of the points of the corresponding object point cloud on the y-axis, z_min and z_max are the minimum and maximum values on the z-axis, and y_i and z_i are the coordinates of the i-th point on the y-axis and the z-axis.
6. The 3D lidar based vehicle detection and tracking method of claim 1, wherein step h) calculates the mean reflection intensity by the formula

$$\bar{I}=\frac{1}{N_k}\sum_{i=1}^{N_k}I_i$$

where I_i is the reflection intensity of the i-th point, and the variance by the formula

$$I_{cov}=\frac{1}{N_k}\sum_{i=1}^{N_k}(I_i-\bar{I})^2$$
7. The 3D lidar based vehicle detection and tracking method of claim 1, wherein step i) calculates the aspect ratio d_xz by the formula

$$d_{xz}=\frac{x_{\max}-x_{\min}}{z_{\max}-z_{\min}}$$

the aspect ratio d_xy by the formula

$$d_{xy}=\frac{x_{\max}-x_{\min}}{y_{\max}-y_{\min}}$$

and the aspect ratio d_yz by the formula

$$d_{yz}=\frac{y_{\max}-y_{\min}}{z_{\max}-z_{\min}}$$

where x_min, x_max, y_min, y_max, z_min, z_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-, y-, and z-axes.
CN202010029355.9A 2020-01-10 2020-01-10 Vehicle detection and tracking method based on 3D laser radar Active CN111275075B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010029355.9A CN111275075B (en) 2020-01-10 2020-01-10 Vehicle detection and tracking method based on 3D laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010029355.9A CN111275075B (en) 2020-01-10 2020-01-10 Vehicle detection and tracking method based on 3D laser radar

Publications (2)

Publication Number Publication Date
CN111275075A (en) 2020-06-12
CN111275075B CN111275075B (en) 2023-05-02

Family

ID=71001835

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010029355.9A Active CN111275075B (en) 2020-01-10 2020-01-10 Vehicle detection and tracking method based on 3D laser radar

Country Status (1)

Country Link
CN (1) CN111275075B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112099042A (en) * 2020-08-07 2020-12-18 武汉万集信息技术有限公司 Vehicle tracking method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127113A (en) * 2016-06-15 2016-11-16 北京联合大学 A kind of road track line detecting method based on three-dimensional laser radar
US20180108146A1 (en) * 2016-10-13 2018-04-19 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for annotating point cloud data
US20180306922A1 (en) * 2017-04-20 2018-10-25 Baidu Online Network Technology (Beijing) Co., Ltd Method and apparatus for positioning vehicle
CN108714914A (en) * 2018-03-19 2018-10-30 山东超越数控电子股份有限公司 A kind of mechanical arm vision system
CN109443369A (en) * 2018-08-20 2019-03-08 北京主线科技有限公司 The method for constructing sound state grating map using laser radar and visual sensor
CN110246159A (en) * 2019-06-14 2019-09-17 湖南大学 The 3D target motion analysis method of view-based access control model and radar information fusion


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
胡彬; 赵春霞; 郭剑辉; 袁夏: "Vehicle detection based on fusion of active and passive sensors" *
陈俊吉; 皮大伟; 谢伯元; 王洪亮; 王霞: "Road edge recognition algorithm based on geometric features and 3D point cloud features" *
麦新晨; 杨明; 王春香; 王冰: "A vehicle detection and tracking method based on multi-sensor fusion" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112099042A (en) * 2020-08-07 2020-12-18 武汉万集信息技术有限公司 Vehicle tracking method and system
CN112099042B (en) * 2020-08-07 2024-04-12 武汉万集信息技术有限公司 Vehicle tracking method and system

Also Published As

Publication number Publication date
CN111275075B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
Liu et al. SMURF: Spatial multi-representation fusion for 3D object detection with 4D imaging radar
CN113506318B (en) Three-dimensional target perception method under vehicle-mounted edge scene
CN110210389B (en) A multi-target recognition and tracking method for road traffic scenes
CN109186625B (en) Method and system for accurately positioning intelligent vehicle by using hybrid sampling filtering
CN110942000B (en) Unmanned vehicle target detection method based on deep learning
CN114359876B (en) Vehicle target identification method and storage medium
CN111580131B (en) Method for 3D lidar smart car to recognize vehicles on the highway
CN110349192B (en) A tracking method for online target tracking system based on 3D laser point cloud
CN111260683A (en) Target detection and tracking method and device for three-dimensional point cloud data
CN111461048B (en) Vision-based parking lot drivable area detection and local map construction method
CN110320504A (en) A kind of unstructured road detection method based on laser radar point cloud statistics geometrical model
CN109596078A (en) Multi-information fusion spectrum of road surface roughness real-time testing system and test method
CN112613378A (en) 3D target detection method, system, medium and terminal
CN107167811A (en) The road drivable region detection method merged based on monocular vision with laser radar
CN106022381A (en) Automatic extraction technology of street lamp poles based on vehicle laser scanning point clouds
Wang et al. Vehicle-road environment perception under low-visibility condition based on polarization features via deep learning
CN105550688B (en) The classification method and device of point cloud data
CN106560835A (en) Guideboard identification method and device
CN110850394B (en) Automatic driving laser radar intensity calibration method
Shi et al. Weather recognition based on edge deterioration and convolutional neural networks
CN110599497A (en) Drivable region segmentation method based on deep neural network
CN116573017A (en) Method, system, device and medium for sensing foreign objects in urban rail train running boundary
CN112396655A (en) Point cloud data-based ship target 6D pose estimation method
CN117765507A (en) Foggy day traffic sign detection method based on deep learning
CN117761658A (en) Multi-target detection method and system for park conveying robot based on laser radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 250000 No. 2877 Kehang Road, Suncun Town, Jinan High-tech District, Shandong Province
Applicant after: Chaoyue Technology Co.,Ltd.
Address before: 250014 no.2877 Kehang Road, Suncun Town, high tech Zone, Jinan City, Shandong Province
Applicant before: SHANDONG CHAOYUE DATA CONTROL ELECTRONICS Co.,Ltd.
GR01 Patent grant