
CN118261960B - Image angle calculation method and system based on multiple candidate matching points - Google Patents

Image angle calculation method and system based on multiple candidate matching points

Info

Publication number: CN118261960B
Authority: CN (China)
Prior art keywords: image, feature, sub, wheel hub, candidate matching
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Application number: CN202410441063.4A
Other languages: Chinese (zh)
Other versions: CN118261960A (en)
Inventors: 胡志辉, 王洎, 黄茜
Current assignee: South China University of Technology (SCUT)
Original assignee: South China University of Technology (SCUT)
Application filed by South China University of Technology (SCUT)
Priority to CN202410441063.4A
Publication of application CN118261960A; application granted; publication of CN118261960B

Classifications

    • G06T7/60: Image analysis; analysis of geometric attributes
    • G06T7/13: Image analysis; segmentation; edge detection
    • G06T7/66: Analysis of geometric attributes of image moments or centre of gravity
    • G06V10/46: Extraction of image or video features; descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; salient regional features
    • G06V10/753: Organisation of the matching processes; transform-based matching, e.g. Hough transform
    • G06T2207/20061: Indexing scheme for image analysis; transform domain processing; Hough transform
    • Y02P90/30: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation; computing systems specially adapted for manufacturing


Abstract


This invention discloses an image angle calculation method and system based on multiple candidate matching points. The method includes: placing a wheel hub of the same model as the one to be tested on a conveyor belt and aligning the center line of one of its spokes with the horizontal line; photographing the hub to obtain a standard hub image; placing the hub to be tested on the conveyor belt and photographing it to obtain an image of the hub to be tested; extracting feature point sets M and N from the background-removed standard hub image and hub image to be tested, respectively; obtaining, from M and N, a candidate matching point set Pi for any feature point Mi in M; pairing Mi with each feature point in Pi to form candidate matching point pairs; calculating the angle differences of the candidate matching point pairs of feature point Mi; and clustering the angle differences of the candidate matching point pairs of all feature points in M, the mean of all angle differences in the class containing the most angle differences being the rotation angle. The invention obtains more correct matching point pairs and recovers the correct image rotation angle through clustering.

Description

Image angle calculation method and system based on multiple candidate matching points
Technical Field
The present invention relates to the field of image feature matching technologies, and in particular, to an image angle computing method, system, terminal device and computer readable storage medium based on multiple candidate matching points.
Background
The motor vehicle hub is a typical disc-shaped, symmetrically cast product. Since possible internal defects of each hub must be detected with an X-ray device, the hub has to be placed directly under the X-ray detector. For hubs with typical diameters between 24 and 30 inches, and limited by the size of the X-ray detector, typically only one or two spoke areas can be inspected at one station. Because the spokes are equally spaced in angle, after the first spoke has been photographed and inspected at the reference position, the hub is rotated by a set angle to the next station, and a full rotation completes the inspection of all spokes. However, when a hub comes onto the line it is difficult to guarantee that a spoke is aligned with the reference line without deviation; the hub must therefore be rotated so that a spoke coincides with the reference line, ensuring that each shot after rotation captures a spoke area rather than the hollow, material-free region between two spokes. The measurement of the spoke angle and the corresponding adjustment are therefore necessary technical links in hub inspection.
Feature point matching algorithms are also commonly used for angular measurement of objects because of their good rotational and scale invariance. However, when the measured object contains many similar features and is geometrically symmetric as a whole, the number of incorrect matching points produced by conventional feature point matching algorithms increases greatly, and the number of correct matching points is often too small to recover the correct angle.
Disclosure of Invention
To overcome the above shortcomings of the prior art, the present invention provides an image angle calculation method, system, terminal device and computer-readable storage medium based on multiple candidate matching points: all feature points whose feature distance is small enough and which satisfy the center distance constraint are used as candidate matching points, so that as many correct matching point pairs as possible are retained, and incorrect matching points are removed by clustering the angle differences of the matching point pairs, yielding the correct image rotation angle.
A first object of the present invention is to provide an image angle calculation method based on multiple candidate matching points.
A second object of the present invention is to provide an image angle calculation system based on multiple candidate matching points.
A third object of the present invention is to provide a terminal device.
A fourth object of the present invention is to provide a computer-readable storage medium.
The first object of the present invention can be achieved by adopting the following technical scheme:
an image angle calculation method based on multiple candidate matching points, the method comprising:
Placing a hub of the same model as the hub to be tested on a conveyor belt, and making the center line of one of its spokes coincide with the horizontal line;
extracting a feature point set M = {M_1, M_2, …, M_m} from the background-removed standard hub image I_M;
placing the hub to be tested on the conveyor belt and photographing it, with the camera in the same position as when the hub of the same model was photographed, to obtain an image of the hub to be tested;
extracting a feature point set N = {N_1, N_2, …, N_n} from the background-removed hub image I_N to be detected;
for any feature point M_i in the feature point set M, selecting from the feature point set N all feature points whose feature distance is small enough and which satisfy the center distance constraint as candidate matching points, forming a candidate matching point pair from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of feature point M_i, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M; the class containing the largest number of angle differences is denoted θ_max, and the mean of all angle differences in θ_max is the rotation angle of the hub image to be detected.
Further, the set of candidate matching points of the feature point M_i is denoted as P_i.
Calculating the angle difference of the candidate matching point pairs of the feature point M_i specifically includes:
the angle difference between M_i and the kth candidate matching point in P_i is

θ_i^k = atan((y_{M_i} - y_{c_M}) / (x_{M_i} - x_{c_M})) - atan((y_k - y_{c_N}) / (x_k - x_{c_N}))

where atan is the arctangent function; in image I_M, (x_{c_M}, y_{c_M}) are the abscissa and ordinate of the image center and (x_{M_i}, y_{M_i}) are the abscissa and ordinate of feature point M_i; in image I_N, (x_{c_N}, y_{c_N}) are the abscissa and ordinate of the image center and (x_k, y_k) are the abscissa and ordinate of the kth candidate matching point in the set P_i.
Further, for any feature point M_i in the feature point set M, all feature points with sufficiently small feature distances that satisfy the center distance constraint are selected from the feature point set N as candidate matching points, specifically:

P_i = { N_j ∈ N | d(M_i, N_j) ≤ T_1·d_max, d_min / d(M_i, N_j) ≥ T_2, |r_{M_i} - r_{N_j}| ≤ T_3 }, j = 1, 2, …, n

where P_i is the set of candidate matching points N_j for feature point M_i; d(M_i, N_j) is the feature distance between feature points M_i and N_j; d_max is the maximum possible feature distance; d_min is the minimum feature distance found in the point set N for feature point M_i; r_{M_i} is the Euclidean distance between the coordinates of feature point M_i and the center coordinates of image I_M; r_{N_j} is the Euclidean distance between the coordinates of candidate matching point N_j and the center coordinates of image I_N; T_1 is a feature distance threshold coefficient used to guarantee that the feature distance of a candidate matching point is small enough; T_2 ∈ [0,1] is a feature ratio threshold used to guarantee that the feature distance of a candidate matching point is small enough compared with the feature distance of the best matching point; and T_3 is a center distance threshold used to guarantee that the distances of the feature points M_i and N_j from their respective image centers differ by little enough.
Further, the feature distance is calculated using the SURF algorithm.
Further, the clustering of the angle differences of the candidate matching point pairs of all feature points in the feature point set M includes:
reordering the angle differences of the candidate matching point pairs in the order of acquisition, denoted θ_t, t = 1, 2, …, l, where l is the number of candidate matching point pairs;
letting k = 1 and assigning θ_1 to the angle difference class Θ_1;
letting m = 2;
letting j = 1;
computing the mean of the angle difference class Θ_j; if the absolute difference between θ_m and this mean is no greater than T_4, assigning θ_m to the angle difference class Θ_j; otherwise letting j = j + 1 and, if j ≤ k, returning to compute the mean of the angle difference class Θ_j and continuing the subsequent operations;
if j = k + 1, letting k = k + 1 and assigning θ_m to the new angle difference class Θ_k, where T_4 is a set angle difference threshold;
letting m = m + 1; if m ≤ l, returning to j = 1 and continuing the subsequent operations.
Further, removing the background includes:
applying a Hough circle transform to the hub image to obtain the outer contour of the hub;
setting all pixel values outside the outer contour of the hub to 0;
cropping the image to the minimum circumscribed rectangle of the outer contour of the hub to obtain the background-removed image;
the hub image is, respectively, the standard hub image and the hub image to be detected.
Further, the SURF algorithm is used to extract the feature point sets from the images, namely the background-removed standard hub image I_M and the background-removed hub image I_N to be detected.
The second object of the invention can be achieved by adopting the following technical scheme:
an image angle computing system based on multiple candidate matching points, the system comprising:
The first image acquisition module is used for placing the wheel hub with the same model as the wheel hub to be tested on the conveyor belt, and enabling the center line of one spoke of the wheel hub to coincide with the horizontal line;
the first extraction module is used for extracting a feature point set M = {M_1, M_2, …, M_m} from the background-removed standard hub image I_M;
the second image acquisition module is used for placing the hub to be tested on the conveyor belt and obtaining an image of the hub to be tested, the camera being in the same position as when the hub of the same model as the hub to be tested was photographed;
the second extraction module is used for extracting a feature point set N = {N_1, N_2, …, N_n} from the background-removed hub image I_N to be detected;
the candidate matching module is used for selecting, for any feature point M_i in the feature point set M, all feature points from the feature point set N whose feature distance is small enough and which satisfy the center distance constraint as candidate matching points, forming a candidate matching point pair from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of feature point M_i, where i = 1, 2, …, m;
and the clustering module is used for clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M; the class containing the largest number of angle differences is denoted θ_max, and the mean of all angle differences in θ_max is the rotation angle of the hub image to be detected.
The third object of the present invention can be achieved by adopting the following technical scheme:
A terminal device comprising a processor and a memory storing a program executable by the processor, wherein the processor implements the above image angle calculation method based on multiple candidate matching points when executing the program stored in the memory.
The fourth object of the present invention can be achieved by adopting the following technical scheme:
A computer-readable storage medium storing a program which, when executed by a processor, implements the above-described image angle calculation method based on multiple candidate matching points.
Compared with the prior art, the invention has the following beneficial effects:
1. Compared with the standard approach of finding only one matching point per feature point, the present invention takes all feature points with a small enough feature distance that satisfy the center distance constraint as candidate matching points, and can therefore obtain more correct matching point pairs;
2. by simply clustering the angle differences of the candidate matching point pairs, correct matching point pairs can be screened out faster and the correct image rotation angle obtained.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for the embodiments are briefly described below. The drawings in the following description are only some embodiments of the present invention; other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flowchart of an image angle calculation method based on multiple candidate matching points according to embodiment 1 of the present invention;
FIG. 2 is a standard hub image (spoke centerline in image coincident with horizontal) of embodiment 1 of the present invention;
FIG. 3 is the background-removed standard hub image I_M in embodiment 1 of the present invention;
FIG. 4 is an image of a hub to be tested (the center line of the spoke in the image has a certain angle with the horizontal line) according to embodiment 1 of the present invention;
FIG. 5 is the background-removed image I_N of the hub to be tested in embodiment 1 of the present invention;
FIG. 6 is a rotated image of a hub to be tested according to embodiment 1 of the present invention;
FIG. 7 is a block diagram showing the structure of an image angle calculation system based on multiple candidate matching points according to embodiment 2 of the present invention;
fig. 8 is a block diagram showing the structure of a terminal device according to embodiment 3 of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments are described clearly and completely below with reference to the accompanying drawings. The described embodiments are some, but not all, embodiments of the present application; all other embodiments obtained by those skilled in the art without inventive effort based on these embodiments fall within the scope of protection of the present application. It should be understood that the detailed description is intended to illustrate the application and not to limit it.
Example 1:
as shown in fig. 1, the image angle calculating method based on multiple candidate matching points provided in this embodiment includes two processes of offline and online:
(1) And (5) off-line flow.
The method specifically comprises the following steps:
(1-1) placing a hub of which the model is identical to that of the hub to be tested on a conveyor belt, manually rotating the hub to enable the center line of a certain spoke to coincide with the horizontal line, and shooting by a camera fixed above the conveyor belt to obtain a standard hub image, as shown in fig. 2.
(1-2) Removing the standard hub image background.
Because the hub is circular, the specific steps for removing the background in the embodiment are as follows:
(1-2-1) Obtaining the hub outer contour by using the Hough circle transform.
(1-2-2) Setting all pixel values outside the hub outer contour to 0.
(1-2-3) Cropping the image to the minimum circumscribed rectangle of the hub outer contour to obtain the background-removed standard image I_M, as shown in fig. 3.
(1-3) Extracting the feature point set M = {M_1, M_2, …, M_m} of the image I_M and the feature descriptor corresponding to each feature point by using a feature extraction algorithm.
The feature extraction algorithm adopted in this embodiment is the SURF algorithm.
(2) And (5) online flow.
The method specifically comprises the following steps:
(2-1) placing the hub to be tested on a conveyor belt, and shooting with a camera at the same position as the offline step (1-1) to obtain an image of the hub to be tested, as shown in fig. 4.
(2-2) Removing the background of the hub image to be detected in the same way as offline step (1-2), obtaining the background-removed hub image I_N to be detected, as shown in fig. 5.
(2-3) Extracting the feature point set N = {N_1, N_2, …, N_n} of the image I_N and the feature descriptor corresponding to each feature point by using a feature extraction algorithm, in the same way as offline step (1-3).
(2-4) Obtaining candidate matching point pairs.
The specific process is as follows:
For each M_i in the feature point set M, its candidate matching point set is:

P_i = { N_j ∈ N | d(M_i, N_j) ≤ T_1·d_max, d_min / d(M_i, N_j) ≥ T_2, |r_{M_i} - r_{N_j}| ≤ T_3 }, j = 1, 2, …, n

where d(M_i, N_j) is the feature distance between feature points M_i and N_j, computed from the SURF descriptors; d_max is the maximum possible feature distance; d_min is the minimum feature distance that can be found in the point set N for feature point M_i; r_{M_i} is the Euclidean distance between the coordinates of feature point M_i and the center coordinates of image I_M; r_{N_j} is the Euclidean distance between the coordinates of feature point N_j and the center coordinates of image I_N; T_1 is a feature distance threshold coefficient ensuring that the feature distance of a candidate matching point is small enough; T_2 ∈ [0,1] is a feature ratio threshold ensuring that the feature distance of a candidate matching point is small enough compared with that of the best matching point, i.e., that the candidate is a reasonable alternative to the best match; and T_3 is a center distance threshold ensuring that the distances of the two points from their respective image centers differ by little enough.
In this embodiment, T_1 = 0.15, T_2 = 0.6, and T_3 = 15.
M_i and each feature point in P_i constitute a pair of candidate matching points.
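Step (2-4) can be sketched in NumPy as follows. The function name, argument layout and the exact inequality forms (d ≤ T_1·d_max, d ≤ d_min/T_2, |r_M − r_N| ≤ T_3) are this sketch's reconstruction from the threshold descriptions above, since the original formula image is not reproduced here.

```python
import numpy as np

def candidate_set(f_m, p_m, c_m, F_n, P_n, c_n, d_max,
                  T1=0.15, T2=0.6, T3=15.0):
    """Indices j such that N_j is a candidate match for one feature point M_i.
    f_m / F_n are descriptors, p_m / P_n image coordinates, c_m / c_n the
    image centers; d_max is the maximum possible feature distance."""
    d = np.linalg.norm(F_n - f_m, axis=1)        # feature distances to all N_j
    ok_abs = d <= T1 * d_max                     # absolutely small enough (T1)
    ok_ratio = d <= d.min() / T2                 # close to the best match (T2)
    r_m = np.linalg.norm(np.asarray(p_m) - c_m)  # distance to center of I_M
    r_n = np.linalg.norm(P_n - c_n, axis=1)      # distances to center of I_N
    ok_center = np.abs(r_n - r_m) <= T3          # center distance constraint (T3)
    return np.flatnonzero(ok_abs & ok_ratio & ok_center)
```

With the embodiment's thresholds, a point qualifies only when all three tests pass, so one M_i may keep several candidates instead of a single nearest neighbor.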
(2-5) Calculating the angle differences of all candidate matching point pairs.
The angle difference between M_i and the jth feature point in P_i is:

θ_i^j = atan((y_{M_i} - y_{c_M}) / (x_{M_i} - x_{c_M})) - atan((y_j - y_{c_N}) / (x_j - x_{c_N}))

where atan is a quadrant-aware arctangent whose output quadrant is determined by the signs of the input numerator and denominator; in image I_M, (x_{c_M}, y_{c_M}) are the abscissa and ordinate of the image center and (x_{M_i}, y_{M_i}) are the abscissa and ordinate of feature point M_i; in image I_N, (x_{c_N}, y_{c_N}) are the abscissa and ordinate of the image center and (x_j, y_j) are the abscissa and ordinate of the jth feature point in the point set P_i.
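A minimal sketch of step (2-5), using `math.atan2` as the quadrant-aware arctangent described above; the sign convention (angle of M_i minus angle of the candidate) and the wrapping range are choices of this sketch.

```python
import math

def angle_difference(p_m, c_m, p_n, c_n):
    """Angle of M_i about the center of I_M minus the angle of the candidate
    point about the center of I_N, in degrees, wrapped to [-180, 180)."""
    a_m = math.atan2(p_m[1] - c_m[1], p_m[0] - c_m[0])
    a_n = math.atan2(p_n[1] - c_n[1], p_n[0] - c_n[0])
    diff = math.degrees(a_m - a_n)
    return (diff + 180.0) % 360.0 - 180.0
```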
(2-6) Simply clustering the angle differences of the candidate matching point pairs, and solving the image rotation angle theta.
The method comprises the following specific steps:
(2-6-1) Reordering the angle differences of the candidate matching point pairs in the order of acquisition, denoted θ_i, i = 1, 2, …, l, where l is the number of candidate matching point pairs.
(2-6-2) Letting k = 1 and assigning θ_1 to the angle difference class Θ_1.
(2-6-3) Comparing each following θ_i, i = 2, …, l, with the mean of every existing angle difference class Θ_j, j = 1, …, k; if the absolute difference between θ_i and the mean of class Θ_j is no greater than T_4, θ_i is assigned to the angle difference class Θ_j; otherwise, letting k = k + 1 and assigning θ_i to the new class Θ_k.
Here T_4 is the angle difference threshold; in this embodiment T_4 = 5.
(2-6-4) Denoting the class containing the largest number of angle differences as Θ_max; the mean of all angle differences in this class is the calculated image rotation angle θ.
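Steps (2-6-1) to (2-6-4) amount to a simple one-pass clustering, sketched below; the function name is illustrative.

```python
def cluster_rotation_angle(angle_diffs, T4=5.0):
    """Each angle difference joins the first existing class whose mean is
    within T4 degrees, otherwise it starts a new class; the result is the
    mean of the largest class, i.e. the estimated rotation angle."""
    classes = []                          # each class is a list of angles
    for theta in angle_diffs:
        for c in classes:
            if abs(theta - sum(c) / len(c)) <= T4:
                c.append(theta)           # join this class
                break
        else:
            classes.append([theta])       # start a new class
    largest = max(classes, key=len)
    return sum(largest) / len(largest)
```

Because incorrect matches produce scattered angle differences while correct ones agree, the largest class typically collects the correct pairs and its mean is a robust angle estimate.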
The result of rotating the image I_N by this angle is shown in fig. 6; the hub spokes substantially coincide with the horizontal line after rotation.
Those skilled in the art will appreciate that all or part of the steps in a method implementing the above embodiments may be implemented by a program to instruct related hardware, and the corresponding program may be stored in a computer readable storage medium.
It should be noted that although the method operations of the above embodiments are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in that particular order or that all illustrated operations be performed in order to achieve desirable results. Rather, the depicted steps may change the order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step to perform, and/or one step decomposed into multiple steps to perform.
Example 2:
As shown in fig. 7, the present embodiment provides an image angle computing system based on multiple candidate matching points, the system including a first image acquisition module 701, a first extraction module 702, a second image acquisition module 703, a second extraction module 704, a candidate matching module 705, and a clustering module 706, wherein:
the first image acquisition module 701 is used for placing the hub with the same model as the hub to be tested on a conveyor belt, and enabling the center line of one spoke of the hub to coincide with a horizontal line;
the first extraction module 702 is configured to extract a feature point set M = {M_1, M_2, …, M_m} from the background-removed standard hub image I_M;
the second image acquisition module 703 is configured to place the hub to be tested on the conveyor belt and obtain an image of the hub to be tested, the camera being in the same position as when the hub of the same model as the hub to be tested was photographed;
the second extraction module 704 is configured to extract a feature point set N = {N_1, N_2, …, N_n} from the background-removed hub image I_N to be detected;
the candidate matching module 705 is configured to select, for any feature point M_i in the feature point set M, all feature points from the feature point set N whose feature distance is small enough and which satisfy the center distance constraint as candidate matching points, to form a candidate matching point pair from M_i and each candidate matching point, and to calculate the angle differences of the candidate matching point pairs of feature point M_i, where i = 1, 2, …, m;
and the clustering module 706 is configured to cluster the angle differences of the candidate matching point pairs of all feature points in the feature point set M; the class containing the largest number of angle differences is denoted θ_max, and the mean of all angle differences in θ_max is the rotation angle of the hub image to be tested.
The specific implementation of each module in this embodiment may refer to embodiment 1 above and will not be described in detail here. It should be noted that the system provided in this embodiment is illustrated only by the division of functional modules described above; in practical applications, the functions may be allocated to different functional modules as needed, that is, the internal structure may be divided into different functional modules to complete all or part of the functions described above.
Example 3:
This embodiment provides a terminal device, which may be a computer. As shown in fig. 8, it comprises a processor 802, a memory, an input device 803, a display 804 and a network interface 805 connected through a system bus 801. The processor provides computing and control capabilities; the memory includes a nonvolatile storage medium 806 and an internal memory 807, where the nonvolatile storage medium 806 stores an operating system, a computer program and a database, and the internal memory 807 provides an environment for running the operating system and the computer program in the nonvolatile storage medium. When the processor 802 executes the computer program stored in the memory, the image angle calculation method based on multiple candidate matching points of embodiment 1 above is implemented as follows:
Placing a hub of the same model as the hub to be tested on a conveyor belt, and making the center line of one of its spokes coincide with the horizontal line;
extracting a feature point set M = {M_1, M_2, …, M_m} from the background-removed standard hub image I_M;
placing the hub to be tested on the conveyor belt and photographing it, with the camera in the same position as when the hub of the same model was photographed, to obtain an image of the hub to be tested;
extracting a feature point set N = {N_1, N_2, …, N_n} from the background-removed hub image I_N to be detected;
for any feature point M_i in the feature point set M, selecting from the feature point set N all feature points whose feature distance is small enough and which satisfy the center distance constraint as candidate matching points, forming a candidate matching point pair from M_i and each candidate matching point, and calculating the angle differences of the candidate matching point pairs of feature point M_i, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M; the class containing the largest number of angle differences is denoted θ_max, and the mean of all angle differences in θ_max is the rotation angle of the hub image to be detected.
Embodiment 4:
The present embodiment provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the image angle calculation method based on multiple candidate matching points of Embodiment 1 above, as follows:
Placing a hub of the same model as the hub under test on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line, and photographing the hub to obtain a standard hub image;
extracting a feature point set M = {M1, M2, …, Mm} from the background-removed standard hub image IM;
placing the hub under test on the conveyor belt and photographing it, the camera being in the same position as when the hub of the same model was photographed, to obtain an image of the hub under test;
extracting a feature point set N = {N1, N2, …, Nn} from the background-removed image IN of the hub under test;
for any feature point Mi in the feature point set M, selecting from the feature point set N all feature points whose feature distance is sufficiently small and which satisfy the center distance constraint as candidate matching points, forming a candidate matching point pair from Mi and each candidate matching point, and calculating the angle difference of each candidate matching point pair of Mi, where i = 1, 2, …, m;
Clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M, and denoting the class containing the largest number of angle differences as θmax; the mean of all angle differences in θmax is the rotation angle of the hub image under test.
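The per-pair angle difference — the angle of the candidate point about the centre of IN minus the angle of Mi about the centre of IM — might look as follows. Note that atan2 is used here instead of plain atan so that the quadrant is handled correctly, and the wrapping into [-180, 180) is an illustrative choice:

```python
import math

def angle_difference(pt_m, center_m, pt_n, center_n):
    """Angle difference (degrees) for one candidate matching point pair.

    pt_m / center_m : feature point M_i and the centre of image I_M
    pt_n / center_n : the k-th candidate point and the centre of image I_N
    """
    a_m = math.atan2(pt_m[1] - center_m[1], pt_m[0] - center_m[0])
    a_n = math.atan2(pt_n[1] - center_n[1], pt_n[0] - center_n[0])
    diff = math.degrees(a_n - a_m)
    return (diff + 180.0) % 360.0 - 180.0   # wrap into [-180, 180)
```

For example, a point at (1, 0) in the standard image matched to a point at (0, 1) in the image under test (both centres at the origin) yields a difference of 90 degrees.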
The computer readable storage medium of the present embodiment may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of a computer-readable storage medium may include, but are not limited to, an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The above-mentioned embodiments are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any equivalent substitution or modification made by a person skilled in the art according to the technical solution and inventive concept disclosed in this patent falls within the protection scope of the present invention.

Claims (8)

1. An image angle calculation method based on multiple candidate matching points, characterized in that the method comprises:
placing a hub of the same model as the hub under test on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line; photographing the hub to obtain a standard hub image;
extracting a feature point set M = {M1, M2, …, Mm} from the background-removed standard hub image IM;
placing the hub under test on the conveyor belt and photographing it, the camera being in the same position as when the hub of the same model was photographed, to obtain an image of the hub under test;
extracting a feature point set N = {N1, N2, …, Nn} from the background-removed image IN of the hub under test;
for any feature point Mi in the feature point set M, selecting from the feature point set N all feature points whose feature distance is sufficiently small and which satisfy the center distance constraint as candidate matching points; forming a candidate matching point pair from Mi and each candidate matching point; and calculating the angle difference of each candidate matching point pair of Mi, where i = 1, 2, …, m;
clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M, and denoting the class containing the largest number of angle differences as θmax; the mean of all angle differences in θmax is the rotation angle of the hub image under test;
wherein, for any feature point Mi in the feature point set M, selecting from the feature point set N all feature points whose feature distance is sufficiently small and which satisfy the center distance constraint as candidate matching points is specifically:
Pi = { Nj | d(Mi, Nj) ≤ T1 · dmax, d(Mi, Nj) ≤ dimin / T2, |r(Mi) − r(Nj)| ≤ T3 }
where j = 1, 2, …, n; Pi is the set of candidate matching points Nj of the feature point Mi; d(Mi, Nj) is the feature distance between the feature points Mi and Nj; dmax is the maximum possible feature distance; dimin is the minimum feature distance found for the feature point Mi in the point set N; r(Mi) is the Euclidean distance between the coordinates of the feature point Mi and the center coordinates of the image IM; r(Nj) is the Euclidean distance between the coordinates of the candidate matching point Nj and the center coordinates of the image IN; T1 is a feature distance threshold coefficient used to ensure that the feature distance of a candidate matching point is sufficiently small; T2 ∈ [0, 1] is a feature ratio threshold used to ensure that the feature distance of a candidate matching point differs little from the feature distance of the optimal matching point; and T3 is a center distance threshold used to ensure that the difference between the distances of the coordinates of Mi and of Nj from their corresponding image center coordinates is sufficiently small;
and the calculating of the angle difference of the candidate matching point pairs of the feature point Mi is specifically: the angle difference between Mi and the kth candidate matching point in Pi is
θi,k = atan((yN,k − ycN) / (xN,k − xcN)) − atan((yMi − ycM) / (xMi − xcM))
where atan is the arctangent function; in the image IM, (xcM, ycM) are the abscissa and ordinate of the image center and (xMi, yMi) are the abscissa and ordinate of the feature point Mi; in the image IN, (xcN, ycN) are the abscissa and ordinate of the image center and (xN,k, yN,k) are the abscissa and ordinate of the kth candidate matching point in the set Pi.
2. The image angle calculation method according to claim 1, characterized in that the feature distance is calculated using the SURF algorithm.
3. The image angle calculation method according to claim 1, characterized in that clustering the angle differences of the candidate matching point pairs of all feature points in the feature point set M comprises:
reordering the angle differences of the candidate matching point pairs in the order in which they were obtained, denoted θt, t = 1, 2, …, l, where l is the number of candidate matching point pairs;
letting c = 1 and assigning θ1 to the angle difference class Θ1;
letting m = 2;
letting d = 1;
computing the mean μd of the angle difference class Θd; if |θm − μd| ≤ T4, assigning θm to the angle difference class Θd; otherwise letting d = d + 1 and, if d ≤ c, returning to the step of computing the mean of the angle difference class Θd and continuing the subsequent operations;
if d == c + 1, letting c = c + 1 and assigning θm to the angle difference class Θc, where T4 is a set angle difference threshold;
letting m = m + 1 and, while m ≤ l, returning to the step d = 1 and continuing the subsequent operations.
4. The image angle calculation method according to claim 1, characterized in that removing the background comprises:
applying a Hough circle transform to the hub image to obtain the outer contour of the hub;
setting all pixel values of the image outside the outer contour of the hub to 0;
cropping the image according to the minimum bounding rectangle of the outer contour of the hub to obtain the background-removed image;
wherein the hub images are the standard hub image and the hub image under test, respectively.
5. The image angle calculation method according to claim 1, characterized in that the SURF algorithm is used to extract the feature point sets from the images; the images are the background-removed standard hub image IM and the background-removed hub image IN under test, respectively.
6. An image angle calculation system based on multiple candidate matching points, characterized in that the system comprises:
a first image acquisition module, configured to place a hub of the same model as the hub under test on a conveyor belt so that the center line of one spoke of the hub coincides with the horizontal line, and to photograph the hub to obtain a standard hub image;
a first extraction module, configured to extract a feature point set M = {M1, M2, …, Mm} from the background-removed standard hub image IM;
a second image acquisition module, configured to place the hub under test on the conveyor belt and photograph it, the camera being in the same position as when the hub of the same model was photographed, to obtain an image of the hub under test;
a second extraction module, configured to extract a feature point set N = {N1, N2, …, Nn} from the background-removed image IN of the hub under test;
a candidate matching module, configured to, for any feature point Mi in the feature point set M, select from the feature point set N all feature points whose feature distance is sufficiently small and which satisfy the center distance constraint as candidate matching points, form a candidate matching point pair from Mi and each candidate matching point, and calculate the angle difference of each candidate matching point pair of Mi, where i = 1, 2, …, m; and
a clustering module, configured to cluster the angle differences of the candidate matching point pairs of all feature points in the feature point set M, denote the class containing the largest number of angle differences as θmax, and take the mean of all angle differences in θmax as the rotation angle of the hub image under test;
wherein selecting from the feature point set N all feature points whose feature distance is sufficiently small and which satisfy the center distance constraint as candidate matching points for any feature point Mi in the feature point set M is specifically:
Pi = { Nj | d(Mi, Nj) ≤ T1 · dmax, d(Mi, Nj) ≤ dimin / T2, |r(Mi) − r(Nj)| ≤ T3 }
where j = 1, 2, …, n; Pi is the set of candidate matching points Nj of the feature point Mi; d(Mi, Nj) is the feature distance between the feature points Mi and Nj; dmax is the maximum possible feature distance; dimin is the minimum feature distance found for the feature point Mi in the point set N; r(Mi) is the Euclidean distance between the coordinates of the feature point Mi and the center coordinates of the image IM; r(Nj) is the Euclidean distance between the coordinates of the candidate matching point Nj and the center coordinates of the image IN; T1 is a feature distance threshold coefficient used to ensure that the feature distance of a candidate matching point is sufficiently small; T2 ∈ [0, 1] is a feature ratio threshold used to ensure that the feature distance of a candidate matching point differs little from the feature distance of the optimal matching point; and T3 is a center distance threshold used to ensure that the difference between the distances of the coordinates of Mi and of Nj from their corresponding image center coordinates is sufficiently small;
and the calculating of the angle difference of the candidate matching point pairs of the feature point Mi is specifically: the angle difference between Mi and the kth candidate matching point in Pi is
θi,k = atan((yN,k − ycN) / (xN,k − xcN)) − atan((yMi − ycM) / (xMi − xcM))
where atan is the arctangent function; in the image IM, (xcM, ycM) are the abscissa and ordinate of the image center and (xMi, yMi) are the abscissa and ordinate of the feature point Mi; in the image IN, (xcN, ycN) are the abscissa and ordinate of the image center and (xN,k, yN,k) are the abscissa and ordinate of the kth candidate matching point in the set Pi.
7. A terminal device, comprising a processor and a memory for storing a program executable by the processor, characterized in that, when the processor executes the program stored in the memory, the image angle calculation method according to any one of claims 1-5 is implemented.
8. A computer-readable storage medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the image angle calculation method according to any one of claims 1-5 is implemented.
CN202410441063.4A 2024-04-12 2024-04-12 Image angle calculation method and system based on multiple candidate matching points Active CN118261960B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410441063.4A CN118261960B (en) 2024-04-12 2024-04-12 Image angle calculation method and system based on multiple candidate matching points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410441063.4A CN118261960B (en) 2024-04-12 2024-04-12 Image angle calculation method and system based on multiple candidate matching points

Publications (2)

Publication Number Publication Date
CN118261960A CN118261960A (en) 2024-06-28
CN118261960B true CN118261960B (en) 2025-10-31

Family

ID=91610892

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410441063.4A Active CN118261960B (en) 2024-04-12 2024-04-12 Image angle calculation method and system based on multiple candidate matching points

Country Status (1)

Country Link
CN (1) CN118261960B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119922274A (en) * 2025-04-07 2025-05-02 浙江方泰显示技术有限公司 A same-screen multi-channel video display control system

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109614977A (en) * 2018-11-14 2019-04-12 华南理工大学 A kind of wheel hub model identification method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
KR101032446B1 (en) * 2009-11-26 2011-05-03 광주과학기술원 Apparatus and method for detecting vertices of images

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN109614977A (en) * 2018-11-14 2019-04-12 华南理工大学 A kind of wheel hub model identification method

Also Published As

Publication number Publication date
CN118261960A (en) 2024-06-28

Similar Documents

Publication Publication Date Title
CN109816733B (en) Camera parameter initialization method and device, camera parameter calibration method and device and image acquisition system
JP2919284B2 (en) Object recognition method
CN116343095B (en) A vehicle trajectory extraction method based on video splicing and related equipment
WO2025025559A1 (en) Tab-defect detection method and system
CN112198878B (en) Instant map construction method and device, robot and storage medium
CN107356213B (en) Optical filter concentricity measuring method and terminal equipment
CN107025449B (en) Oblique image straight line feature matching method constrained by local area with unchanged visual angle
CN110189375A (en) An Image Target Recognition Method Based on Monocular Vision Measurement
CN116468760B (en) Multi-source remote sensing image registration method based on anisotropic diffusion description
CN110770741B (en) Lane line recognition method and device, vehicle
CN111738320A (en) Occlusion Workpiece Recognition Method Based on Template Matching
CN118261960B (en) Image angle calculation method and system based on multiple candidate matching points
CN103077528A (en) Rapid image matching method based on DCCD (Digital Current Coupling)-Laplace and SIFT (Scale Invariant Feature Transform) descriptors
CN111754556B (en) Incremental unmanned aerial vehicle aerial photography overlapping degree detection method and system
CN115019069A (en) Template matching method, template matching device and storage medium
CN113255702B (en) Target detection method and target detection device based on graph matching
JPH04242106A (en) Face recognizing apparatus
CN111915645B (en) Image matching method and device, computer equipment and computer readable storage medium
CN119152011A (en) High-precision measuring method, device, equipment and medium for workpiece size with angular point
CN112258395A (en) Image splicing method and device shot by unmanned aerial vehicle
CN115953600B (en) Multi-modal image matching method and system based on multi-directional filtering channel features
CN116128974A (en) Multi-camera collaborative calibration self-checking method and device on road pole, electronic equipment and storage medium
CN113902936A (en) Stereoscopic vision matching method for engine nozzle under double constraint conditions
CN111127311A (en) Image registration method based on micro-coincident area
CN119399283B (en) Industrial vision automatic alignment method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant