Disclosure of Invention
In order to overcome the above-mentioned shortcomings of the prior art, the present invention provides an image angle calculation method, system, terminal device and computer-readable storage medium based on multiple candidate matching points. By taking all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, as many correct matching point pairs as possible are retained; by clustering the angle differences of the matching point pairs, wrong matching points are removed and the correct image rotation angle is obtained.
A first object of the present invention is to provide an image angle calculation method based on multiple candidate matching points.
A second object of the present invention is to provide an image angle calculation system based on multiple candidate matching points.
A third object of the present invention is to provide a terminal device.
A fourth object of the present invention is to provide a computer-readable storage medium.
The first object of the present invention can be achieved by adopting the following technical scheme:
an image angle calculation method based on multiple candidate matching points, the method comprising:
placing a hub of the same model as the hub to be tested on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line;
extracting a feature point set M = {M_1, M_2, …, M_m} from the standard hub image I_M with the background removed;
placing the hub to be tested on the conveyor belt and shooting it, the position of the camera being the same as when the hub of the same model as the hub to be tested was shot, so as to obtain an image of the hub to be tested;
extracting a feature point set N = {N_1, N_2, …, N_n} from the hub image I_N to be detected with the background removed;
for any feature point M_i in the feature point set M, selecting from the feature point set N all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, forming a pair of candidate matching points from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of the feature point M_i, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all the feature points in the feature point set M, and recording the class containing the largest number of angle differences as Θ_max; the mean value of all the angle differences in Θ_max is the rotation angle of the hub image to be detected.
Further, the set of candidate matching points of the feature point M_i is denoted as P_i.
Calculating the angle difference of a candidate matching point pair of the feature point M_i specifically includes:
the angle difference between M_i and the k-th candidate matching point in P_i is

\theta_i^k = \operatorname{atan}\frac{y_{M_i} - y_c^M}{x_{M_i} - x_c^M} - \operatorname{atan}\frac{y_k - y_c^N}{x_k - x_c^N}

where atan is the arctangent function; in the image I_M, x_c^M and y_c^M are the abscissa and ordinate of the image center, respectively, and x_{M_i} and y_{M_i} are the abscissa and ordinate of the feature point M_i, respectively; in the image I_N, x_c^N and y_c^N are the abscissa and ordinate of the image center, respectively, and x_k and y_k are the abscissa and ordinate of the k-th candidate matching point in the set P_i, respectively.
Further, for any feature point M_i in the feature point set M, all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint are selected from the feature point set N as candidate matching points, specifically:

P_i = \left\{ N_j \;\middle|\; d(M_i, N_j) \le T_1 d_{\max},\ \frac{d_{\min}^i}{d(M_i, N_j)} \ge T_2,\ \left| \rho(M_i) - \rho(N_j) \right| \le T_3 \right\}

where j = 1, 2, …, n, and P_i is the set of candidate matching points N_j for the feature point M_i; d(M_i, N_j) is the feature distance between the feature points M_i and N_j; d_max is the maximum possible feature distance; d_min^i is the minimum feature distance found in the point set N for the feature point M_i; ρ(M_i) is the Euclidean distance between the coordinates of the feature point M_i and the center coordinates of the image I_M; ρ(N_j) is the Euclidean distance between the coordinates of the candidate matching point N_j and the center coordinates of the image I_N; T_1 is a feature distance threshold coefficient used to ensure that the feature distance of a candidate matching point is sufficiently small; T_2 ∈ [0, 1] is a feature ratio threshold used to ensure that the feature distance of a candidate matching point is sufficiently small compared with that of the best matching point; T_3 is a center distance threshold used to ensure that the difference between the distances of M_i and N_j to their respective image centers is sufficiently small.
Further, the feature distance is calculated by using the SURF algorithm.
Further, clustering the angle differences of the candidate matching point pairs of all the feature points in the feature point set M includes:
reordering the angle differences of the candidate matching point pairs in the order of acquisition, recorded as θ_t, t = 1, 2, …, l, where l is the number of candidate matching point pairs;
letting k = 1 and assigning θ_1 to the angle difference class Θ_1;
letting m = 2;
letting j = 1;
calculating the mean value θ̄_j of the angle difference class Θ_j; if |θ_m − θ̄_j| ≤ T_4, assigning θ_m to the angle difference class Θ_j; otherwise letting j = j + 1, and if j ≤ k, returning to the step of calculating the mean value θ̄_j of the angle difference class Θ_j and continuing the subsequent operations;
if j = k + 1, letting k = k + 1 and assigning θ_m to the new angle difference class Θ_k, where T_4 is a set angle difference threshold;
letting m = m + 1; while m ≤ l, returning to the step of letting j = 1 and continuing the subsequent operations.
Further, removing the background includes:
adopting Hough circle transformation to the hub image to obtain the outer contour of the hub;
setting all pixel values outside the outer contour of the hub to 0;
cropping the image according to the minimum circumscribed rectangle of the outer contour of the hub to obtain the image with the background removed;
wherein the hub image is the standard hub image and the hub image to be detected, respectively.
Further, the SURF algorithm is adopted to extract the feature point sets from the images, the images being the standard hub image I_M with the background removed and the hub image I_N to be detected with the background removed, respectively.
The second object of the invention can be achieved by adopting the following technical scheme:
an image angle computing system based on multiple candidate matching points, the system comprising:
The first image acquisition module is used for placing a hub of the same model as the hub to be tested on the conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line;
the first extraction module is used for extracting a feature point set M = {M_1, M_2, …, M_m} from the standard hub image I_M with the background removed;
the second image acquisition module is used for placing the hub to be tested on the conveyor belt and shooting it with the camera in the same position as when the hub of the same model as the hub to be tested was shot, so as to obtain an image of the hub to be tested;
the second extraction module is used for extracting a feature point set N = {N_1, N_2, …, N_n} from the hub image I_N to be detected with the background removed;
the candidate matching module is used for selecting, for any feature point M_i in the feature point set M, all feature points from the feature point set N whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, forming a pair of candidate matching points from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of the feature point M_i, where i = 1, 2, …, m;
and the clustering module is used for clustering the angle differences of the candidate matching point pairs of all the feature points in the feature point set M, and recording the class containing the largest number of angle differences as Θ_max; the mean value of all the angle differences in Θ_max is the rotation angle of the hub image to be detected.
The third object of the present invention can be achieved by adopting the following technical scheme:
The terminal device comprises a processor and a memory for storing a program executable by the processor, wherein the processor implements the above image angle calculation method based on multiple candidate matching points when executing the program stored in the memory.
The fourth object of the present invention can be achieved by adopting the following technical scheme:
A computer-readable storage medium storing a program which, when executed by a processor, implements the above-described image angle calculation method based on multiple candidate matching points.
Compared with the prior art, the invention has the following beneficial effects:
1. Compared with the standard method, in which only one matching point is found for each feature point, the present invention takes all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, and can therefore obtain more correct matching point pairs;
2. By simply clustering the angle differences of the matching point pairs, the correct matching point pairs can be screened out more quickly and the correct image rotation angle can be obtained.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application; all other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application. It should be understood that the detailed description is intended to illustrate the application, not to limit it.
Example 1:
As shown in fig. 1, the image angle calculation method based on multiple candidate matching points provided in this embodiment includes an offline flow and an online flow:
(1) Offline flow.
The method specifically comprises the following steps:
(1-1) placing a hub of which the model is identical to that of the hub to be tested on a conveyor belt, manually rotating the hub to enable the center line of a certain spoke to coincide with the horizontal line, and shooting by a camera fixed above the conveyor belt to obtain a standard hub image, as shown in fig. 2.
(1-2) Removing the standard hub image background.
Because the hub is circular, the specific steps for removing the background in the embodiment are as follows:
(1-2-1) obtaining the hub outer contour by using Hough circle transformation.
(1-2-2) Setting all pixel values outside the outer contour of the hub to 0.
(1-2-3) Intercepting the image according to the minimum circumscribed rectangle of the outer contour of the hub to obtain a standard image I M after removing the background, as shown in figure 3.
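Steps (1-2-2) and (1-2-3) can be sketched in NumPy as follows. This is a minimal illustration, not the patent's exact implementation: the helper name remove_background is hypothetical, and the circle parameters (cx, cy, r) are assumed to have already been detected in step (1-2-1), e.g. with OpenCV's cv2.HoughCircles, and rounded to integers.

```python
import numpy as np

def remove_background(img, cx, cy, r):
    """Zero out pixels outside the hub's outer contour (a circle) and
    crop to the circle's minimum circumscribed rectangle.

    img    : 2-D grayscale image as a NumPy array
    cx, cy : integer circle center (column, row), e.g. from cv2.HoughCircles
    r      : integer circle radius in pixels
    """
    h, w = img.shape
    ys, xs = np.ogrid[:h, :w]
    inside = (xs - cx) ** 2 + (ys - cy) ** 2 <= r ** 2
    out = np.where(inside, img, 0)           # step (1-2-2): zero outside contour
    # step (1-2-3): crop to the circle's minimum circumscribed rectangle
    x0, x1 = max(cx - r, 0), min(cx + r + 1, w)
    y0, y1 = max(cy - r, 0), min(cy + r + 1, h)
    return out[y0:y1, x0:x1]
```

The cropped result corresponds to the background-removed image I_M (or I_N in the online flow).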
(1-3) Extracting the feature point set M = {M_1, M_2, …, M_m} of the image I_M and the feature descriptor corresponding to each feature point by using a feature extraction algorithm.
The feature extraction algorithm adopted in this embodiment is the SURF algorithm.
(2) Online flow.
The method specifically comprises the following steps:
(2-1) placing the hub to be tested on a conveyor belt, and shooting with a camera at the same position as the offline step (1-1) to obtain an image of the hub to be tested, as shown in fig. 4.
(2-2) Removing the background of the hub image to be detected, in the same manner as offline step (1-2), to obtain the background-removed hub image I_N to be detected, as shown in fig. 5.
(2-3) Extracting the feature point set N = {N_1, N_2, …, N_n} of the image I_N and the feature descriptor corresponding to each feature point by using a feature extraction algorithm, in the same manner as offline step (1-3).
(2-4) Obtaining candidate matching point pairs.
The specific process is as follows:
For each feature point M_i in the feature point set M, its candidate matching point set is

P_i = \left\{ N_j \;\middle|\; d(M_i, N_j) \le T_1 d_{\max},\ \frac{d_{\min}^i}{d(M_i, N_j)} \ge T_2,\ \left| \rho(M_i) - \rho(N_j) \right| \le T_3,\ j = 1, 2, \ldots, n \right\}

In the formula, d(M_i, N_j) is the feature distance between the feature points M_i and N_j, calculated with the SURF algorithm; d_max is the maximum possible feature distance; d_min^i is the minimum feature distance that can be found in the point set N for the feature point M_i; ρ(M_i) is the Euclidean distance between the coordinates of the feature point M_i and the center coordinates of the image I_M; ρ(N_j) is the Euclidean distance between the coordinates of the feature point N_j and the center coordinates of the image I_N. T_1 is a feature distance threshold coefficient ensuring that the feature distance of a candidate matching point is sufficiently small; T_2 ∈ [0, 1] is a feature ratio threshold ensuring that the feature distance of a candidate matching point is sufficiently small compared with that of the best matching point, i.e., that the candidate matching point is a reasonable alternative to the best matching point; T_3 is a center distance threshold ensuring that the difference between the distances of the two points to their respective image centers is sufficiently small.
In this embodiment, T_1 takes the value 0.15, T_2 takes the value 0.6, and T_3 takes the value 15.
M_i and each feature point in P_i constitute a pair of candidate matching points.
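The three constraints above can be sketched for one feature point M_i as follows. This is a minimal sketch under stated assumptions: the function name candidate_set and its arguments are hypothetical, the descriptor distances d(M_i, N_j) are assumed to be precomputed (e.g. from SURF descriptors), and a small epsilon guards the ratio test against zero distances.

```python
import numpy as np

def candidate_set(dist_row, d_max, rho_m, rho_n, T1=0.15, T2=0.6, T3=15.0):
    """Select the candidate matching points P_i for one feature point M_i.

    dist_row : (n,) feature distances d(M_i, N_j) for all N_j in N
    d_max    : maximum possible feature distance
    rho_m    : Euclidean distance from M_i to the center of I_M
    rho_n    : (n,) Euclidean distances from each N_j to the center of I_N
    Returns the indices j of the candidate matching points.
    """
    d_min = dist_row.min()                                  # best-match distance for M_i
    small_enough = dist_row <= T1 * d_max                   # absolute distance test (T_1)
    near_best = d_min / np.maximum(dist_row, 1e-12) >= T2   # ratio test vs. best match (T_2)
    center_ok = np.abs(rho_m - rho_n) <= T3                 # center distance constraint (T_3)
    return np.nonzero(small_enough & near_best & center_ok)[0]
```

Each returned index j pairs N_j with M_i as one candidate matching point pair.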
(2-5) Calculating the angle differences of all candidate matching point pairs.
The angle difference between M_i and the j-th feature point in P_i is

\theta_i^j = \operatorname{atan}\frac{y_{M_i} - y_c^M}{x_{M_i} - x_c^M} - \operatorname{atan}\frac{y_j - y_c^N}{x_j - x_c^N}

In the formula, atan is the arctangent function whose output quadrant is determined by the signs of the input numerator and denominator; in the image I_M, x_c^M and y_c^M are the abscissa and ordinate of the image center, respectively, and x_{M_i} and y_{M_i} are the abscissa and ordinate of the feature point M_i, respectively; in the image I_N, x_c^N and y_c^N are the abscissa and ordinate of the image center, respectively, and x_j and y_j are the abscissa and ordinate of the j-th feature point in the point set P_i, respectively.
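The quadrant-aware arctangent described above corresponds to the standard atan2 function. A minimal sketch of the angle difference computation follows; the helper name angle_difference is hypothetical, and the final normalization to [-180, 180) is an added convenience so that equivalent rotations compare equal, not a step stated in the patent.

```python
import math

def angle_difference(m_xy, center_m, p_xy, center_n):
    """Angle difference between M_i and one candidate matching point.

    Each point's angle is measured from its image center with the
    quadrant-aware arctangent (atan2), then the two angles are
    subtracted. Coordinates are (x, y) tuples; result is in degrees.
    """
    ang_m = math.degrees(math.atan2(m_xy[1] - center_m[1], m_xy[0] - center_m[0]))
    ang_n = math.degrees(math.atan2(p_xy[1] - center_n[1], p_xy[0] - center_n[0]))
    diff = ang_m - ang_n
    # normalize to [-180, 180) so equivalent rotations compare equal
    return (diff + 180.0) % 360.0 - 180.0
```

For example, a point to the right of one image center matched to a point above the other center yields a -90 degree difference.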
(2-6) Simply clustering the angle differences of the candidate matching point pairs, and solving the image rotation angle theta.
The method comprises the following specific steps:
(2-6-1) Reordering the angle differences of the candidate matching point pairs in the order of acquisition, denoted θ_i, i = 1, 2, …, l, where l is the number of candidate matching point pairs.
(2-6-2) Letting k = 1 and assigning θ_1 to the angle difference class Θ_1.
(2-6-3) Comparing each subsequent θ_i, i = 2, …, l, with the mean value θ̄_j of every existing angle difference class Θ_j, j = 1, …, k. If |θ_i − θ̄_j| ≤ T_4, assigning θ_i to the angle difference class Θ_j; otherwise letting k = k + 1 and assigning θ_i to the new angle difference class Θ_k.
Here T_4 is the angle difference threshold; in this embodiment T_4 takes the value 5.
(2-6-4) Recording the class containing the largest number of angle differences as Θ_max; the mean value of all angle differences in this class is the calculated image rotation angle θ.
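Steps (2-6-1) through (2-6-4) amount to a one-pass clustering followed by taking the mean of the largest cluster. A minimal sketch, with the hypothetical helper name rotation_angle:

```python
def rotation_angle(angle_diffs, T4=5.0):
    """One-pass clustering of the candidate pairs' angle differences:
    each angle difference joins the first existing class whose mean is
    within T4 of it, otherwise it opens a new class; the mean of the
    largest class is returned as the image rotation angle.
    """
    classes = [[angle_diffs[0]]]                   # k = 1: theta_1 starts class 1
    for theta in angle_diffs[1:]:
        for cls in classes:
            if abs(theta - sum(cls) / len(cls)) <= T4:
                cls.append(theta)                  # assign to existing class
                break
        else:
            classes.append([theta])                # open a new class
    largest = max(classes, key=len)                # class with the most members
    return sum(largest) / len(largest)
```

Because angle differences from wrong matches are scattered while those from correct matches agree, the largest class isolates the correct pairs and its mean is a robust estimate of θ.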
The result of rotating the image I_N by this angle is shown in fig. 6; it can be seen that the hub spokes substantially coincide with the horizontal line after rotation.
Those skilled in the art will appreciate that all or part of the steps in a method implementing the above embodiments may be implemented by a program to instruct related hardware, and the corresponding program may be stored in a computer readable storage medium.
It should be noted that although the method operations of the above embodiments are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order or that all illustrated operations be performed in order to achieve desirable results. Rather, the depicted steps may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform.
Example 2:
As shown in fig. 7, the present embodiment provides an image angle computing system based on multiple candidate matching points, the system including a first image acquisition module 701, a first extraction module 702, a second image acquisition module 703, a second extraction module 704, a candidate matching module 705, and a clustering module 706, wherein:
The first image acquisition module 701 is used for placing a hub of the same model as the hub to be tested on the conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line;
the first extraction module 702 is configured to extract a feature point set M = {M_1, M_2, …, M_m} from the standard hub image I_M with the background removed;
the second image acquisition module 703 is configured to place the hub to be tested on the conveyor belt and shoot it with the camera in the same position as when the hub of the same model as the hub to be tested was shot, so as to obtain an image of the hub to be tested;
the second extraction module 704 is configured to extract a feature point set N = {N_1, N_2, …, N_n} from the hub image I_N to be detected with the background removed;
the candidate matching module 705 is configured to select, for any feature point M_i in the feature point set M, all feature points from the feature point set N whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, combine M_i with each candidate matching point to form a pair of candidate matching points, and calculate the angle differences of the candidate matching point pairs of the feature point M_i, where i = 1, 2, …, m;
and the clustering module 706 is configured to cluster the angle differences of the candidate matching point pairs of all the feature points in the feature point set M, and record the class containing the largest number of angle differences as Θ_max; the mean value of all the angle differences in Θ_max is the rotation angle of the hub image to be detected.
The specific implementation of each module in this embodiment may refer to embodiment 1 above and will not be described in detail here. It should be noted that the system provided in this embodiment is illustrated only by the division of the above functional modules; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure may be divided into different functional modules to complete all or part of the functions described above.
Example 3:
The present embodiment provides a terminal device, which may be a computer. As shown in fig. 8, the device comprises a processor 802, a memory, an input device 803, a display 804 and a network interface 805 connected through a system bus 801. The processor 802 is configured to provide computing and control capabilities; the memory comprises a nonvolatile storage medium 806 and an internal memory 807, wherein the nonvolatile storage medium 806 stores an operating system, a computer program and a database, and the internal memory 807 provides an environment for running the operating system and the computer program in the nonvolatile storage medium. When the processor 802 executes the computer program stored in the memory, the image angle calculation method based on multiple candidate matching points of embodiment 1 above is implemented as follows:
placing a hub of the same model as the hub to be tested on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line;
extracting a feature point set M = {M_1, M_2, …, M_m} from the standard hub image I_M with the background removed;
placing the hub to be tested on the conveyor belt and shooting it, the position of the camera being the same as when the hub of the same model as the hub to be tested was shot, so as to obtain an image of the hub to be tested;
extracting a feature point set N = {N_1, N_2, …, N_n} from the hub image I_N to be detected with the background removed;
for any feature point M_i in the feature point set M, selecting from the feature point set N all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, forming a pair of candidate matching points from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of the feature point M_i, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all the feature points in the feature point set M, and recording the class containing the largest number of angle differences as Θ_max; the mean value of all the angle differences in Θ_max is the rotation angle of the hub image to be detected.
Example 4:
the present embodiment provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image angle calculation method based on multiple candidate matching points of embodiment 1 described above, as follows:
placing a hub of the same model as the hub to be tested on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line;
extracting a feature point set M = {M_1, M_2, …, M_m} from the standard hub image I_M with the background removed;
placing the hub to be tested on the conveyor belt and shooting it, the position of the camera being the same as when the hub of the same model as the hub to be tested was shot, so as to obtain an image of the hub to be tested;
extracting a feature point set N = {N_1, N_2, …, N_n} from the hub image I_N to be detected with the background removed;
for any feature point M_i in the feature point set M, selecting from the feature point set N all feature points whose feature distances are sufficiently small and which satisfy the center distance constraint as candidate matching points, forming a pair of candidate matching points from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of the feature point M_i, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all the feature points in the feature point set M, and recording the class containing the largest number of angle differences as Θ_max; the mean value of all the angle differences in Θ_max is the rotation angle of the hub image to be detected.
The computer readable storage medium of the present embodiment may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of a computer-readable storage medium may include, but are not limited to, an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The above-mentioned embodiments are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any equivalent substitution or modification that a person skilled in the art can make according to the technical solution and the inventive concept of the present invention within the scope disclosed by the present patent shall fall within the protection scope of the present invention.