CN111275075A - Vehicle detection and tracking method based on 3D laser radar - Google Patents
Vehicle detection and tracking method based on 3D laser radar
- Publication number
- CN111275075A (application CN202010029355.9A)
- Authority
- CN
- China
- Prior art keywords
- axis
- points
- vehicle
- point cloud
- calculating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/253—Fusion techniques of extracted features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/277—Analysis of motion involving stochastic approaches, e.g. using Kalman filters
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
A 3D laser radar-based vehicle detection and tracking method uses the 3D laser radar as a sensor, detects vehicles with an SVM classifier trained on seven groups of vehicle features (vehicle features I through VII), and tracks vehicles with a global nearest neighbor algorithm combined with unscented Kalman filtering, so that the tracking result can be used to improve the accuracy of vehicle detection.
Description
Technical Field
The invention relates to the technical field of unmanned driving, in particular to a vehicle detection and tracking method based on a 3D laser radar.
Background
With the continuous development of control systems, sensors and artificial intelligence, research on unmanned vehicles has made great progress, and the related technologies are increasingly applied in military, scientific, industrial and everyday settings. The scientific significance of unmanned driving lies not only in the core scientific problems it contains, but also in its strategic value and broad application prospects, so it attracts a great deal of public attention.
Because vehicles travel relatively fast, are major participants in road traffic and are a leading cause of traffic accidents, they are important objects for an unmanned vehicle to interact with in all kinds of road environments. With the development of unmanned driving, research on vehicle detection and tracking has advanced considerably and has become one of the key problems in the field of unmanned automobiles. How to improve the accuracy of vehicle detection therefore remains an open problem in the prior art.
Disclosure of Invention
To overcome the above shortcomings, the invention provides a 3D laser radar-based vehicle detection and tracking method that improves the accuracy of vehicle detection.
The technical scheme adopted by the invention for overcoming the technical problems is as follows:
a vehicle detection and tracking method based on a 3D laser radar comprises the following steps:
a) collecting 3D point cloud data on an unmanned vehicle with a 64-line laser radar, preprocessing the data, and obtaining n object samples S_k, k = 1, 2, ..., n, where each sample is the set of points belonging to object k and N_k is the number of points contained in the k-th object sample;
b) denoting the feature vector of the 3D point cloud by f = (f_1, f_2, ..., f_m), with m = 207;
c) vehicle feature I consists of the independent terms f_1~f_6 of the inertia tensor matrix M of the object sample S_k;
d) vehicle feature II consists of the independent terms f_7~f_12 of the covariance matrix C of the object sample S_k;
e) establishing a coordinate system with the center point of the rear axle of the unmanned vehicle as the origin, the vertically upward direction as the positive z-axis, the vehicle heading as the positive x-axis, and the positive y-axis given by the right-hand rule; vehicle feature III is obtained by dividing the object sample S_k into 10 layers along the z-axis, projecting the points of each layer onto a plane perpendicular to the z-axis and calculating the length along the x-axis, yielding 10 layer features f_13~f_22;
f) dividing the plane formed by the x-axis and the z-axis into a 14 × 9 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the xz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 14 × 9 bins form vehicle feature IV, f_23~f_148;
g) dividing the plane formed by the y-axis and the z-axis into a 9 × 6 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the yz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 9 × 6 bins form vehicle feature V, f_149~f_202;
h) vehicle feature VI consists of the mean and variance of the reflection intensity of the object sample S_k, f_203~f_204;
i) vehicle feature VII consists of the aspect ratios of the object sample S_k, f_205~f_207;
j) preparing a data set containing positive and negative samples, where the positive samples are various vehicles scanned by the 3D laser radar from different angles and the negative samples are shrubs, building walls and pedestrians; the data set formed by f_1~f_207 is divided into a training set and a test set, each containing both positive and negative samples; the training set is used to extract vehicle features and train an SVM classifier, and the classification performance of the SVM classifier is verified on the test set;
k) stopping training when the classification performance meets the requirement, and repeating step j) if it does not;
l) tracking vehicles with a global nearest neighbor algorithm combined with an unscented Kalman filtering algorithm.
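Steps j) and k) amount to a standard supervised training loop over the 207-dimensional feature vectors. A minimal scikit-learn sketch is given below; the RBF kernel, the feature scaling and the train/test split ratio are illustrative assumptions, since the patent does not specify the SVM parameters:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_vehicle_classifier(X, y, test_size=0.3, seed=0):
    """X: (n_samples, 207) matrix of feature vectors f1..f207.
    y: 1 for vehicle (positive sample), 0 for non-vehicle (negative).
    Returns the fitted classifier and its accuracy on the held-out test set."""
    # Stratified split keeps positive and negative samples in both sets,
    # as step j) requires.
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=test_size, random_state=seed, stratify=y)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X_tr, y_tr)
    return clf, clf.score(X_te, y_te)
```

If the returned accuracy does not meet the requirement of step k), the data set preparation of step j) would be repeated with more or better samples.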
Further, the inertia tensor matrix in step c) is calculated as M = Σ_t [[y_t² + z_t², −x_t·y_t, −x_t·z_t], [−x_t·y_t, x_t² + z_t², −y_t·z_t], [−x_t·z_t, −y_t·z_t, x_t² + y_t²]], where x_t, y_t, z_t are the coordinate values of the t-th point of the 3D point cloud in the coordinate system.
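The six independent terms f_1~f_6 can be read off the symmetric inertia tensor. A minimal numpy sketch, treating every point as unit mass (the patent text does not state a mass weighting):

```python
import numpy as np

def inertia_tensor_features(points):
    """Return the 6 independent entries f1..f6 of the inertia tensor
    of a point cloud, treating each point as unit mass.
    points: (N, 3) array of x, y, z coordinates."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    Ixx = np.sum(y**2 + z**2)
    Iyy = np.sum(x**2 + z**2)
    Izz = np.sum(x**2 + y**2)
    Ixy = -np.sum(x * y)
    Ixz = -np.sum(x * z)
    Iyz = -np.sum(y * z)
    # M is symmetric, so its 3 diagonal and 3 off-diagonal entries
    # are the 6 independent terms.
    return np.array([Ixx, Iyy, Izz, Ixy, Ixz, Iyz])
```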
Further, the covariance matrix in step d) is calculated as C = (1/N_k) Σ_t (p_t − p̄)(p_t − p̄)^T, where p_t = (x_t, y_t, z_t)^T is the t-th point of the object sample S_k and p̄ is the mean of its points.
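Feature II reduces to the upper triangle of the 3 × 3 point covariance. A minimal numpy sketch (np.cov's sample normalization, N − 1, is used here; the patent does not state which normalization it intends, and the choice does not change the feature's discriminative role):

```python
import numpy as np

def covariance_features(points):
    """Return the 6 independent entries f7..f12 of the 3x3 covariance
    matrix C of a point cloud (C is symmetric, so 6 unique terms).
    points: (N, 3) array of x, y, z coordinates."""
    C = np.cov(points, rowvar=False)  # 3x3 covariance over x, y, z
    iu = np.triu_indices(3)           # upper-triangle indices
    return C[iu]
```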
Further, in step f) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊14(x_i − x_min)/(x_max − x_min)⌋ and q = ⌊9(z_i − z_min)/(z_max − z_min)⌋, where x_min, x_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-axis, z_min, z_max are the minimum and maximum values on the z-axis, x_i is the coordinate value of the i-th point on the x-axis and z_i is its coordinate value on the z-axis.
Further, in step g) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊9(y_i − y_min)/(y_max − y_min)⌋ and q = ⌊6(z_i − z_min)/(z_max − z_min)⌋, where y_min, y_max are the minimum and maximum values of the points of the corresponding object point cloud on the y-axis, z_min, z_max are the minimum and maximum values on the z-axis, y_i is the coordinate value of the i-th point on the y-axis and z_i is its coordinate value on the z-axis.
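Steps f) and g) are the same operation on different coordinate pairs, so a single helper can produce both feature groups. The uniform binning between the per-object minima and maxima is an assumption, since the source formula image is not legible:

```python
import numpy as np

def plane_histogram_features(a, b, bins_a, bins_b):
    """Normalized 2D histogram of coordinate pairs (a_i, b_i),
    e.g. a = x, b = z with bins (14, 9) for feature IV, or
    a = y, b = z with bins (9, 6) for feature V.
    Returns the bin counts, divided by the point count, as a flat vector."""
    hist, _, _ = np.histogram2d(a, b, bins=(bins_a, bins_b))
    return hist.flatten() / len(a)
```

With this helper, `plane_histogram_features(x, z, 14, 9)` would yield the 126 values f_23~f_148 and `plane_histogram_features(y, z, 9, 6)` the 54 values f_149~f_202.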
Further, in step h) the mean reflection intensity is calculated by the formula Ī = (1/N_k) Σ_i I_i, where I_i is the reflection intensity of the i-th point, and the variance I_cov is calculated by the formula I_cov = (1/N_k) Σ_i (I_i − Ī)².
Further, in step i) the aspect ratios are calculated as d_xz = (x_max − x_min)/(z_max − z_min), d_xy = (x_max − x_min)/(y_max − y_min) and d_yz = (y_max − y_min)/(z_max − z_min), where x_min, x_max, y_min, y_max and z_min, z_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-, y- and z-axes respectively.
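Features VI and VII reduce the cloud to intensity statistics and bounding-box extent ratios. A minimal sketch, assuming the ratios are ratios of the axis-aligned bounding-box extents, as the min/max variables in the formulas suggest:

```python
import numpy as np

def intensity_and_ratio_features(points, intensity):
    """f203-f204: mean and variance of reflection intensity;
    f205-f207: bounding-box extent ratios d_xz, d_xy, d_yz.
    points: (N, 3) array; intensity: (N,) array of per-point intensities."""
    mean_i = intensity.mean()
    var_i = ((intensity - mean_i) ** 2).mean()   # population variance, 1/N_k
    ext = points.max(axis=0) - points.min(axis=0)  # extents (dx, dy, dz)
    dx, dy, dz = ext
    return np.array([mean_i, var_i, dx / dz, dx / dy, dy / dz])
```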
The invention has the following beneficial effects: the method uses a 3D laser radar as the sensor, detects vehicles with an SVM classifier trained on vehicle features I through VII, tracks vehicles with a global nearest neighbor algorithm combined with unscented Kalman filtering, and uses the tracking result to improve the accuracy of vehicle detection.
Detailed Description
The present invention is further explained below.
A vehicle detection and tracking method based on a 3D laser radar comprises the following steps:
a) collecting 3D point cloud data on an unmanned vehicle with a 64-line laser radar, preprocessing the data, and obtaining n object samples S_k, k = 1, 2, ..., n, where each sample is the set of points belonging to object k and N_k is the number of points contained in the k-th object sample;
b) denoting the feature vector of the 3D point cloud by f = (f_1, f_2, ..., f_m), with m = 207;
c) vehicle feature I consists of the independent terms f_1~f_6 of the inertia tensor matrix M of the object sample S_k;
d) vehicle feature II consists of the independent terms f_7~f_12 of the covariance matrix C of the object sample S_k;
e) establishing a coordinate system with the center point of the rear axle of the unmanned vehicle as the origin, the vertically upward direction as the positive z-axis, the vehicle heading as the positive x-axis, and the positive y-axis given by the right-hand rule; vehicle feature III is obtained by dividing the object sample S_k into 10 layers along the z-axis, projecting the points of each layer onto a plane perpendicular to the z-axis and calculating the length along the x-axis, yielding 10 layer features f_13~f_22;
f) dividing the plane formed by the x-axis and the z-axis into a 14 × 9 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the xz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 14 × 9 bins form vehicle feature IV, f_23~f_148;
g) dividing the plane formed by the y-axis and the z-axis into a 9 × 6 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the yz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 9 × 6 bins form vehicle feature V, f_149~f_202;
h) vehicle feature VI consists of the mean and variance of the reflection intensity of the object sample S_k, f_203~f_204;
i) vehicle feature VII consists of the aspect ratios of the object sample S_k, f_205~f_207;
j) preparing a data set containing positive and negative samples, where the positive samples are various vehicles scanned by the 3D laser radar from different angles and the negative samples are shrubs, building walls and pedestrians; the data set formed by f_1~f_207 is divided into a training set and a test set, each containing both positive and negative samples; the training set is used to extract vehicle features and train an SVM classifier, and the classification performance of the SVM classifier is verified on the test set;
k) stopping training when the classification performance meets the requirement, and repeating step j) if it does not;
l) tracking vehicles with a global nearest neighbor algorithm combined with an unscented Kalman filtering algorithm.
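Step l) associates new detections to existing tracks with a globally optimal nearest neighbor assignment and smooths each track with Kalman filtering. The sketch below uses the Hungarian algorithm for the assignment and, for brevity, a linear constant-velocity filter where the patent specifies the unscented variant; all class and function names are illustrative:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def gnn_assign(tracks, detections):
    """Global nearest neighbor: pair tracks with detections so that the
    total Euclidean distance is minimized (Hungarian algorithm).
    tracks: (T, 2), detections: (D, 2) arrays of x, y positions.
    Returns a list of (track_index, detection_index) pairs."""
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows.tolist(), cols.tolist()))

class ConstantVelocityKF:
    """Linear stand-in for the unscented Kalman filter of step l).
    State: [x, y, vx, vy]; measurement: [x, y]."""
    def __init__(self, x, y, dt=0.1):
        self.s = np.array([x, y, 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = 0.01 * np.eye(4)  # process noise (assumed)
        self.R = 0.1 * np.eye(2)   # measurement noise (assumed)

    def predict(self):
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, z):
        y = z - self.H @ self.s                      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.s = self.s + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.s[:2]
```

In a full tracker, each frame would call `predict()` on every track, run `gnn_assign` between the predicted positions and the SVM-confirmed detections, and call `update()` with each assigned detection.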
The method uses a 3D laser radar as the sensor, detects vehicles with an SVM classifier trained on vehicle features I through VII, and tracks vehicles with a global nearest neighbor algorithm combined with unscented Kalman filtering; the tracking result is used to improve the accuracy of vehicle detection.
The inertia tensor matrix in step c) is calculated as M = Σ_t [[y_t² + z_t², −x_t·y_t, −x_t·z_t], [−x_t·y_t, x_t² + z_t², −y_t·z_t], [−x_t·z_t, −y_t·z_t, x_t² + y_t²]], where x_t, y_t, z_t are the coordinate values of the t-th point of the 3D point cloud in the coordinate system.
The covariance matrix in step d) is calculated as C = (1/N_k) Σ_t (p_t − p̄)(p_t − p̄)^T, where p_t = (x_t, y_t, z_t)^T is the t-th point of the object sample S_k and p̄ is the mean of its points.
In step f) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊14(x_i − x_min)/(x_max − x_min)⌋ and q = ⌊9(z_i − z_min)/(z_max − z_min)⌋, where x_min, x_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-axis, z_min, z_max are the minimum and maximum values on the z-axis, x_i is the coordinate value of the i-th point on the x-axis and z_i is its coordinate value on the z-axis.
In step g) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊9(y_i − y_min)/(y_max − y_min)⌋ and q = ⌊6(z_i − z_min)/(z_max − z_min)⌋, where y_min, y_max are the minimum and maximum values of the points of the corresponding object point cloud on the y-axis, z_min, z_max are the minimum and maximum values on the z-axis, y_i is the coordinate value of the i-th point on the y-axis and z_i is its coordinate value on the z-axis.
In step h) the mean reflection intensity is calculated by the formula Ī = (1/N_k) Σ_i I_i, where I_i is the reflection intensity of the i-th point, and the variance I_cov is calculated by the formula I_cov = (1/N_k) Σ_i (I_i − Ī)².
In step i) the aspect ratios are calculated as d_xz = (x_max − x_min)/(z_max − z_min), d_xy = (x_max − x_min)/(y_max − y_min) and d_yz = (y_max − y_min)/(z_max − z_min), where x_min, x_max, y_min, y_max and z_min, z_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-, y- and z-axes respectively.
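The layer feature III of step e) slices the cloud into 10 equal slabs along the z-axis and measures each slab's extent along the x-axis. A minimal sketch (empty slabs are assumed to contribute 0, which the patent does not specify):

```python
import numpy as np

def layer_length_features(points, n_layers=10):
    """f13-f22: split the cloud into n_layers equal slabs along z;
    for each slab, return the length of its x-extent (0 if empty).
    points: (N, 3) array of x, y, z coordinates."""
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_layers + 1)
    feats = np.zeros(n_layers)
    for i in range(n_layers):
        top = (i == n_layers - 1)  # include the upper edge in the last slab
        mask = (z >= edges[i]) & ((z <= edges[i + 1]) if top else (z < edges[i + 1]))
        if mask.any():
            xs = points[mask, 0]
            feats[i] = xs.max() - xs.min()
    return feats
```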
The above is only a preferred embodiment of the present invention, intended to illustrate its technical solutions rather than to limit its protection scope. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (7)
1. A vehicle detection and tracking method based on a 3D laser radar is characterized by comprising the following steps:
a) collecting 3D point cloud data on an unmanned vehicle with a 64-line laser radar, preprocessing the data, and obtaining n object samples S_k, k = 1, 2, ..., n, where each sample is the set of points belonging to object k and N_k is the number of points contained in the k-th object sample;
b) denoting the feature vector of the 3D point cloud by f = (f_1, f_2, ..., f_m), with m = 207;
c) vehicle feature I consists of the independent terms f_1~f_6 of the inertia tensor matrix M of the object sample S_k;
d) vehicle feature II consists of the independent terms f_7~f_12 of the covariance matrix C of the object sample S_k;
e) establishing a coordinate system with the center point of the rear axle of the unmanned vehicle as the origin, the vertically upward direction as the positive z-axis, the vehicle heading as the positive x-axis, and the positive y-axis given by the right-hand rule; vehicle feature III is obtained by dividing the object sample S_k into 10 layers along the z-axis, projecting the points of each layer onto a plane perpendicular to the z-axis and calculating the length along the x-axis, yielding 10 layer features f_13~f_22;
f) dividing the plane formed by the x-axis and the z-axis into a 14 × 9 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the xz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 14 × 9 bins form vehicle feature IV, f_23~f_148;
g) dividing the plane formed by the y-axis and the z-axis into a 9 × 6 grid, projecting the 3D point cloud onto this grid to obtain a normalized 2D histogram of the yz plane, and counting the points in each cell; the count in a cell is the value of the corresponding histogram bin, and the values of the 9 × 6 bins form vehicle feature V, f_149~f_202;
h) vehicle feature VI consists of the mean and variance of the reflection intensity of the object sample S_k, f_203~f_204;
i) vehicle feature VII consists of the aspect ratios of the object sample S_k, f_205~f_207;
j) preparing a data set containing positive and negative samples, where the positive samples are various vehicles scanned by the 3D laser radar from different angles and the negative samples are shrubs, building walls and pedestrians; the data set formed by f_1~f_207 is divided into a training set and a test set, each containing both positive and negative samples; the training set is used to extract vehicle features and train an SVM classifier, and the classification performance of the SVM classifier is verified on the test set;
k) stopping training when the classification performance meets the requirement, and repeating step j) if it does not;
l) tracking vehicles with a global nearest neighbor algorithm combined with an unscented Kalman filtering algorithm.
4. The 3D lidar based vehicle detection and tracking method of claim 1, wherein: in step f) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊14(x_i − x_min)/(x_max − x_min)⌋ and q = ⌊9(z_i − z_min)/(z_max − z_min)⌋, where x_min, x_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-axis, z_min, z_max are the minimum and maximum values on the z-axis, x_i is the coordinate value of the i-th point on the x-axis and z_i is its coordinate value on the z-axis.
5. The 3D lidar based vehicle detection and tracking method of claim 1, wherein: in step g) the number of points falling in grid cell (p, q) is counted, a point i being assigned to the cell with p = ⌊9(y_i − y_min)/(y_max − y_min)⌋ and q = ⌊6(z_i − z_min)/(z_max − z_min)⌋, where y_min, y_max are the minimum and maximum values of the points of the corresponding object point cloud on the y-axis, z_min, z_max are the minimum and maximum values on the z-axis, y_i is the coordinate value of the i-th point on the y-axis and z_i is its coordinate value on the z-axis.
7. The 3D lidar based vehicle detection and tracking method of claim 1, wherein: in step i) the aspect ratios are calculated as d_xz = (x_max − x_min)/(z_max − z_min), d_xy = (x_max − x_min)/(y_max − y_min) and d_yz = (y_max − y_min)/(z_max − z_min), where x_min, x_max, y_min, y_max and z_min, z_max are the minimum and maximum values of the points of the corresponding object point cloud on the x-, y- and z-axes respectively.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010029355.9A CN111275075B (en) | 2020-01-10 | 2020-01-10 | Vehicle detection and tracking method based on 3D laser radar |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010029355.9A CN111275075B (en) | 2020-01-10 | 2020-01-10 | Vehicle detection and tracking method based on 3D laser radar |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN111275075A true CN111275075A (en) | 2020-06-12 |
| CN111275075B CN111275075B (en) | 2023-05-02 |
Family
ID=71001835
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010029355.9A Active CN111275075B (en) | 2020-01-10 | 2020-01-10 | Vehicle detection and tracking method based on 3D laser radar |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN111275075B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112099042A (en) * | 2020-08-07 | 2020-12-18 | 武汉万集信息技术有限公司 | Vehicle tracking method and system |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106127113A (en) * | 2016-06-15 | 2016-11-16 | 北京联合大学 | A kind of road track line detecting method based on three-dimensional laser radar |
| US20180108146A1 (en) * | 2016-10-13 | 2018-04-19 | Beijing Baidu Netcom Science And Technology Co., Ltd. | Method and apparatus for annotating point cloud data |
| US20180306922A1 (en) * | 2017-04-20 | 2018-10-25 | Baidu Online Network Technology (Beijing) Co., Ltd | Method and apparatus for positioning vehicle |
| CN108714914A (en) * | 2018-03-19 | 2018-10-30 | 山东超越数控电子股份有限公司 | A kind of mechanical arm vision system |
| CN109443369A (en) * | 2018-08-20 | 2019-03-08 | 北京主线科技有限公司 | The method for constructing sound state grating map using laser radar and visual sensor |
| CN110246159A (en) * | 2019-06-14 | 2019-09-17 | 湖南大学 | The 3D target motion analysis method of view-based access control model and radar information fusion |
Non-Patent Citations (3)
| Title |
|---|
| Hu Bin; Zhao Chunxia; Guo Jianhui; Yuan Xia: "Vehicle detection based on active and passive sensor fusion" * |
| Chen Junji; Pi Dawei; Xie Boyuan; Wang Hongliang; Wang Xia: "Road edge recognition algorithm based on geometric features and 3D point cloud features" * |
| Mai Xinchen; Yang Ming; Wang Chunxiang; Wang Bing: "A vehicle detection and tracking method based on multi-sensor fusion" * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111275075B (en) | 2023-05-02 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | CB02 | Change of applicant information | Address after: 250000 No. 2877 Kehang Road, Suncun Town, Jinan High-tech District, Shandong Province; Applicant after: Chaoyue Technology Co.,Ltd. Address before: 250014 No. 2877 Kehang Road, Suncun Town, High-tech Zone, Jinan City, Shandong Province; Applicant before: SHANDONG CHAOYUE DATA CONTROL ELECTRONICS Co.,Ltd. |
| | GR01 | Patent grant | |