CN119169074A - A method for calculating the symmetry center of an image - Google Patents
- Publication number: CN119169074A
- Application number: CN202411661625.2A
- Authority
- CN
- China
- Prior art keywords
- image
- axis
- roi
- counting
- symmetry
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/68—Analysis of geometric attributes of symmetry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/66—Analysis of geometric attributes of image moments or centre of gravity
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Geometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Investigating Or Analysing Biological Materials (AREA)
- Apparatus For Radiation Diagnosis (AREA)
Abstract
The invention discloses a method for calculating the symmetry center of an image, comprising the following steps: S1, acquiring an image; S2, framing an ROI rectangular region containing a symmetrical pattern in the image; S3, calculating the symmetry axis in the y-axis direction within the ROI rectangular region; S4, calculating the symmetry axis in the x-axis direction within the ROI rectangular region; S5, calculating the symmetry center of the image, namely calculating the intersection of the symmetry axis in the y-axis direction obtained in step S3 and the symmetry axis in the x-axis direction obtained in step S4. According to the invention, after the image is acquired in real time, an ROI rectangular region containing the image center is selected, the symmetry axes in the x-axis and y-axis directions are calculated within that region, and the image center point is determined from the intersection of the two symmetry axes, which effectively improves calculation efficiency and accuracy.
Description
Technical Field
The invention relates to the field of algorithms, and in particular to a method for calculating the symmetry center of an image.
Background
In machine vision testing, measurement, and inspection, the center point of a symmetrical pattern in the image captured by a camera is often required, both for positioning in subsequent product inspection and for relating measurement data to the center point. In some positioning applications, the center point of a symmetrical product must be calculated so that a guided manipulator can grip the product at its center, or so that parts can be aligned correspondingly for assembly; quickly calculating the center point of a symmetrical product in an image is therefore a core step in an automated vision guidance system.
In the existing method for determining a product's center point, the center is located on the product's design drawing, the drawing is converted into an image, and the symmetry center of that image is then calculated.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method for calculating the symmetry center of an image that, after acquiring the image in real time, selects an ROI rectangular region containing the image center, calculates the symmetry axes in the x-axis and y-axis directions within that region, and determines the image center point from the intersection of the two symmetry axes, thereby effectively improving calculation efficiency and accuracy.
The technical scheme adopted by the invention is a method for calculating the symmetry center of an image, comprising the following steps:
S1, acquiring an image: after the camera captures an image of the product to be inspected, the image information is transmitted to an industrial personal computer;
S2, framing an ROI rectangular region containing a symmetrical pattern in the image: in the image acquired in step S1, an ROI rectangular region containing the symmetrical pattern is framed as the calibration region for calculation;
S3, calculating the symmetry axis in the y-axis direction within the ROI rectangular region: a counting axis x = n is selected in the ROI rectangular region framed in step S2, the image regions on the left and right sides of the counting axis x = n are extracted, the two sides are compared, the narrower image is taken as the superimposed image, the left and right images are overlapped, and the similarity is then calculated;
S4, calculating the symmetry axis in the x-axis direction within the ROI rectangular region: a counting axis y = m is selected in the ROI rectangular region framed in step S2, the image regions on the upper and lower sides of the counting axis y = m are extracted, the two sides are compared, the shorter image is taken as the superimposed image, the upper and lower images are overlapped, and the similarity is then calculated;
S5, calculating the symmetry center of the image: from the symmetry axis of the ROI region in the y-axis direction obtained in step S3 and the symmetry axis of the ROI region in the x-axis direction obtained in step S4, the intersection of the two symmetry axes is calculated as the symmetry center of the image.
Preferably, the ROI rectangular region in step S2 is the calibration region for calculation within the image acquired in step S1; the ROI rectangular region has width w and height h.
Preferably, the starting position of the counting axis x = n selected in step S3 is the leftmost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the axis advances column by column as x = n + 1 with n ranging from 0 to w - 2 so as to cover the ROI rectangular region, and the similarity S(n) corresponding to each counting axis is recorded as the axes are taken column by column, where S(n) is the similarity corresponding to x = n.
Preferably, in step S3, when the sizes of the images on the left and right sides of the counting axis x = n are compared, the image on the left side of the counting axis is taken as the superimposed image when n < w/2, and the image on the right side is taken as the superimposed image when n ≥ w/2.
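The strip-extraction and side-selection rule above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the patent's reference code; the function name `superimposed_pair` and the [row, column] array layout are assumptions of mine:

```python
import numpy as np

def superimposed_pair(roi, n):
    """For a counting axis x = n, return equal-width strips immediately
    left and right of the axis; the narrower side fixes the strip width,
    which matches taking the smaller image as the superimposed image."""
    h, w = roi.shape[:2]
    # The left side has n columns (0 .. n-1), the right side has w-1-n columns.
    width = n if n < w / 2 else w - 1 - n
    left = roi[:, n - width:n]            # columns n-width .. n-1
    right = roi[:, n + 1:n + 1 + width]   # columns n+1 .. n+width
    return left, right
```

The upper/lower case for a counting axis y = m is the same construction applied to rows instead of columns.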
Preferably, in step S3, after the superimposed image is selected, the counting axis x = n is taken as the folding center line, the superimposed image on the left or right side is flipped, and the similarity between the unflipped image and the flipped superimposed image is then calculated.
Preferably, the starting position of the counting axis y = m selected in step S4 is the uppermost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the axis advances row by row as y = m - 1 with m ranging from h - 1 down to 2 so as to cover the ROI rectangular region, and the similarity S(m) corresponding to each counting axis is recorded as the axes are taken row by row, where S(m) is the similarity corresponding to y = m.
Preferably, in step S4, when the sizes of the images on the upper and lower sides of the counting axis y = m are compared, the image on the lower side of the counting axis is taken as the superimposed image when m < h/2, and the image on the upper side is taken as the superimposed image when m ≥ h/2.
Preferably, in step S4, after the superimposed image is selected, the counting axis y = m is taken as the folding center line, the superimposed image on the upper or lower side is subjected to the flipped-pixel-value conversion, and the similarity between the unflipped image and the flipped superimposed image is then calculated.
Preferably, the flipped-pixel-value conversion of the superimposed image about the counting axis x = n is calculated as dst(x, y) = src(cols - x - 1, y), and the flipped-pixel-value conversion of the superimposed image about the counting axis y = m is calculated as dst(x, y) = src(x, rows - y - 1), where src is the pixel value of the image before flipping, dst is the pixel value of the image after flipping, x and y are the corresponding coordinate values, rows is the height of the flipped superimposed image, and cols is its width; folding about the vertical axis x = n mirrors the image left-right, and folding about the horizontal axis y = m mirrors it top-bottom.
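As a quick sanity check of the two flip conversions, here is a minimal NumPy sketch (my own illustration, assuming arrays indexed [row, column], i.e. [y, x]):

```python
import numpy as np

img = np.arange(12).reshape(3, 4)  # toy "image": rows = 3, cols = 4

# Left-right mirror, i.e. dst(x, y) = src(cols - x - 1, y):
flip_lr = img[:, ::-1]
# Top-bottom mirror, i.e. dst(x, y) = src(x, rows - y - 1):
flip_ud = img[::-1, :]
```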
Preferably, the calculation formula of the similarity S(n) is:

S(n) = Σ(x,y) [T′(x, y) · I′(x, y)] / √( Σ(x,y) T′(x, y)² · Σ(x,y) I′(x, y)² );

wherein,

T′(x, y) = T(x, y) - (1 / (w·h)) · Σ(x′,y′) T(x′, y′);

I′(x, y) = I(x, y) - (1 / (w·h)) · Σ(x′,y′) I(x′, y′);

T(x, y) represents the pixel value at (x, y) in the left or upper image, I(x, y) represents the pixel value at (x, y) in the right or lower image, x takes values in (0, w - 1), y takes values in (0, h - 1), w is the width of the ROI rectangular region, and h is its height.
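The similarity above can be written as a short Python/NumPy function. A minimal sketch (the function name `similarity` is mine; it computes the mean-centered normalized correlation described in the text):

```python
import numpy as np

def similarity(T, I):
    """Normalized correlation coefficient of two equal-size images.
    Returns a value in [-1, 1]; 1 means the images are identical up to
    a uniform brightness offset and positive contrast scaling."""
    T = np.asarray(T, dtype=float)
    I = np.asarray(I, dtype=float)
    Tc = T - T.mean()   # T'(x, y): subtract the gray-level mean
    Ic = I - I.mean()   # I'(x, y)
    denom = np.sqrt((Tc ** 2).sum() * (Ic ** 2).sum())
    if denom == 0.0:    # a constant image carries no structure to match
        return 0.0
    return float((Tc * Ic).sum() / denom)
```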
The invention has the following beneficial effects:
The invention provides a method for calculating the symmetry center of an image that acquires the image in real time, selects an ROI rectangular region containing the image center, calculates the symmetry axes in the x-axis and y-axis directions within that region, and determines the image center point from the intersection of the two symmetry axes, thereby effectively improving calculation efficiency and accuracy.
The invention aims to provide a calculation method for visually guiding and positioning a product center in an automated production process, realizing real-time matching calculation of the product image center during production; compared with traditional positioning methods, both the calculation efficiency of positioning and the calculation precision of center positioning are effectively improved. Specifically, a CCD camera captures an image of the product in real time to form an acquired image. An ROI rectangular region containing the image symmetry center is framed on the acquired image as the subsequent calculation region, its width w and height h being chosen according to the required calculation accuracy. Within the framed ROI rectangular region, a counting axis x = n is selected, starting from the leftmost vertical line of the region; with the counting axis as the center, images of a certain width are extracted on the left and right sides. When n < w/2 the image on the left side of the counting axis is taken as the superimposed image, and when n ≥ w/2 the image on the right side is taken, i.e. the narrower image serves as the image for the subsequent similarity calculation. The superimposed image then undergoes the flipped-pixel-value conversion, and the pixel similarity of the left and right images is calculated to obtain the similarity S(n) corresponding to x = n. The counting axis is then advanced column by column as x = n + 1, the similarity of each axis is calculated over the whole ROI rectangular region, and the counting axis with the maximum similarity is taken as the vertical symmetry axis of the ROI rectangular region. The horizontal symmetry axis is determined on the same principle: the counting axis y = m at the uppermost side of the ROI rectangular region is taken as the initial axis, the similarity of each row is calculated row by row downward, and the axis with the maximum similarity is taken as the transverse symmetry axis of the ROI rectangular region. Finally, the intersection of the calculated vertical and transverse symmetry axes is taken as the image center.
Drawings
Fig. 1 is a functional block diagram of the present invention.
FIG. 2 is a schematic diagram of a rectangular region of a frame-selected ROI according to the present invention.
Fig. 3 is a schematic diagram of the present invention with the counting axis x=n.
FIG. 4 is a schematic diagram of the selection of the superimposed image for the counting axis x = n according to the present invention.
FIG. 5 is a schematic diagram of the selection of the superimposed image for the counting axis y = m according to the present invention.
FIG. 6 is a schematic diagram of the flip of the superimposed image in the vertical direction according to the present invention.
FIG. 7 is a schematic diagram of the flip of the superimposed image in the transverse direction according to the present invention.
FIG. 8 is a schematic illustration of the center of symmetry of an acquired image according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It is to be noted that all directional indications in the embodiments of the present invention, such as up, down, left, right, front, and rear, are used only to explain the relative positional relationship and movement of the components in a particular posture (as shown in the drawings); if that particular posture changes, the directional indication changes accordingly.
In the present invention, unless explicitly specified and limited otherwise, the terms "connected," "fixed," and the like are to be construed broadly, and for example, "connected" may be either fixedly connected or detachably connected or integrally formed, may be mechanically connected or electrically connected, may be directly connected or indirectly connected through an intermediate medium, and may be in communication with each other or in an interaction relationship between two elements, unless explicitly specified otherwise. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
As shown in fig. 1, a method for calculating the symmetry center of an image includes the following steps:
S1, acquiring an image: after the camera captures an image of the product to be inspected, the image information is transmitted to an industrial personal computer;
S2, framing an ROI rectangular region containing a symmetrical pattern in the image: in the image acquired in step S1, an ROI rectangular region containing the symmetrical pattern is framed as the calibration region for calculation;
S3, calculating the symmetry axis in the y-axis direction within the ROI rectangular region: a counting axis x = n is selected in the ROI rectangular region framed in step S2, the image regions on the left and right sides of the counting axis x = n are extracted, the two sides are compared, the narrower image is taken as the superimposed image, the left and right images are overlapped, and the similarity is then calculated;
S4, calculating the symmetry axis in the x-axis direction within the ROI rectangular region: a counting axis y = m is selected in the ROI rectangular region framed in step S2, the image regions on the upper and lower sides of the counting axis y = m are extracted, the two sides are compared, the shorter image is taken as the superimposed image, the upper and lower images are overlapped, and the similarity is then calculated;
S5, calculating the symmetry center of the image: from the symmetry axis of the ROI region in the y-axis direction obtained in step S3 and the symmetry axis of the ROI region in the x-axis direction obtained in step S4, the intersection of the two symmetry axes is calculated as the symmetry center of the image.
As shown in fig. 2, as an embodiment of the present invention, the ROI rectangular region in step S2 is the calibration region for calculation within the image acquired in step S1; the ROI rectangular region has width w and height h.
As shown in fig. 3, as an embodiment of the present invention, the starting position of the counting axis x = n selected in step S3 is the leftmost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the axis advances column by column as x = n + 1 with n ranging from 0 to w - 2 so as to cover the ROI rectangular region, and the similarity S(n) corresponding to each counting axis is recorded as the axes are taken column by column, where S(n) is the similarity corresponding to x = n.
As shown in fig. 4, as a preferred embodiment of the present invention, when the sizes of the images on the left and right sides of the counting axis x = n are compared in step S3, the image on the left side of the counting axis is taken as the superimposed image when n < w/2, and the image on the right side is taken as the superimposed image when n ≥ w/2.
In step S3, after the superimposed image is selected, the counting axis x = n is taken as the folding center line, the superimposed image on the left or right side is flipped, and the similarity between the unflipped image and the flipped superimposed image is then calculated.
As shown in fig. 5, as an embodiment of the present invention, the starting position of the counting axis y = m selected in step S4 is the uppermost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the axis advances row by row as y = m - 1 with m ranging from h - 1 down to 2 so as to cover the ROI rectangular region, and the similarity S(m) corresponding to each counting axis is recorded as the axes are taken row by row, where S(m) is the similarity corresponding to y = m.
In step S4, when the sizes of the images on the upper and lower sides of the counting axis y = m are compared, the image on the lower side of the counting axis is taken as the superimposed image when m < h/2, and the image on the upper side is taken as the superimposed image when m ≥ h/2.
In step S4, after the superimposed image is selected, the counting axis y = m is taken as the folding center line, the superimposed image on the upper or lower side is subjected to the flipped-pixel-value conversion, and the similarity between the unflipped image and the flipped superimposed image is then calculated.
As shown in figs. 6 to 7, which are schematic diagrams of the flip conversion of the superimposed image in the vertical and transverse directions, the flipped-pixel-value conversion of the superimposed image about the counting axis x = n is calculated as dst(x, y) = src(cols - x - 1, y), and the flipped-pixel-value conversion about the counting axis y = m is calculated as dst(x, y) = src(x, rows - y - 1), where src is the pixel value of the image before flipping, dst is the pixel value after flipping, x and y are the corresponding coordinate values, rows is the height of the flipped superimposed image, and cols is its width.
Furthermore, the invention provides a method for calculating the symmetry center of an image that acquires the image in real time, selects an ROI rectangular region containing the image center, calculates the symmetry axes in the x-axis and y-axis directions within that region, and determines the image center point from the intersection of the two symmetry axes, thereby effectively improving calculation efficiency and accuracy. The invention aims to provide a calculation method for visually guiding and positioning a product center in an automated production process, realizing real-time matching calculation of the product image center during production; compared with traditional positioning methods, both the calculation efficiency of positioning and the calculation precision of center positioning are effectively improved. Specifically, a CCD camera captures the product image in real time; an ROI rectangular region of width w and height h containing the image symmetry center is framed according to the required calculation accuracy; the counting axis x = n is scanned column by column from the leftmost vertical line of the region, the narrower side of each axis (the left image when n < w/2, the right image when n ≥ w/2) is taken as the superimposed image, subjected to the flipped-pixel-value conversion, and compared with the opposite side to obtain the similarity S(n); the counting axis with the maximum similarity is the vertical symmetry axis of the ROI rectangular region. On the same principle, the counting axis y = m starts at the uppermost side of the ROI rectangular region, the similarity of each row is calculated row by row downward, and the axis with the maximum similarity is the transverse symmetry axis. Finally, the intersection of the vertical and transverse symmetry axes is taken as the image center.
Example 1
The calculation formula of the similarity S(n) is as follows:

S(n) = Σ(x,y) [T′(x, y) · I′(x, y)] / √( Σ(x,y) T′(x, y)² · Σ(x,y) I′(x, y)² );

wherein,

T′(x, y) = T(x, y) - (1 / (w·h)) · Σ(x′,y′) T(x′, y′);

I′(x, y) = I(x, y) - (1 / (w·h)) · Σ(x′,y′) I(x′, y′);

T(x, y) represents the pixel value at (x, y) in the left or upper image, I(x, y) represents the pixel value at (x, y) in the right or lower image, x takes values in (0, w - 1), y takes values in (0, h - 1), w is the width of the ROI rectangular region, and h is its height.
This formula is a normalized correlation coefficient calculation; it eliminates the influence of illumination and noise and has strong robustness. In essence, the gray average of the image is subtracted from the gray value at each position: the two superimposed images are first mean-centered, the centered gray values at corresponding coordinates of the two images are multiplied, and the products over all pixels are accumulated, so a larger sum indicates higher similarity between the two images. The denominator of the formula is a normalization, so the final similarity value lies between -1 and 1, with 1 indicating that the two images are completely identical; the maximum value can therefore be determined, and the counting axis corresponding to the maximum value is the symmetry axis.
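The illumination-invariance claim is easy to check numerically; a small self-contained sketch (the helper `ncc` is my own name for the formula above):

```python
import numpy as np

def ncc(T, I):
    # Normalized correlation coefficient (mean-centered, then normalized).
    Tc, Ic = T - T.mean(), I - I.mean()
    d = np.sqrt((Tc ** 2).sum() * (Ic ** 2).sum())
    return float((Tc * Ic).sum() / d) if d else 0.0

rng = np.random.default_rng(0)
patch = rng.random((8, 8))
same = ncc(patch, patch)                 # identical images -> 1.0
brighter = ncc(patch, patch + 0.3)       # uniform brightness offset -> still 1.0
other = ncc(patch, rng.random((8, 8)))   # unrelated image -> much lower score
```

Adding a constant to every pixel does not change the mean-centered values, so a uniform illumination shift leaves the score untouched.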
As a worked calculation example, take the width w as 900 and the height h as 700;
(1) Calculating symmetry axis in vertical direction
The value range of n is [180,720];
calculation results corresponding to the similarity calculation value S (n):
-0.066,-0.064,-0.062,-0.059,-0.057,-0.055,-0.053,-0.052,-0.050,-0.049,-0.047,-0.045,-0.041,-0.038,-0.034,-0.031,-0.028,-0.025,-0.022,-0.019,-0.016,-0.014,-0.011,-0.007,-0.004,0.000,0.004,0.008,0.011,0.014,0.017,0.020,0.023,0.025,0.028,0.030,0.034,0.037,0.040,0.043,0.046,0.049,0.051,0.053,0.056,0.058,0.061,0.064,0.067,0.070,0.074,0.077,0.080,0.083,0.085,0.086,0.089,0.091,0.094,0.097,0.100,0.103,0.106,0.109,0.112,0.113,0.115,0.116,0.118,0.120,0.122,0.125,0.128,0.131,0.134,0.137,0.139,0.140,0.141,0.142,0.143,0.145,0.147,0.150,0.153,0.157,0.159,0.161,0.162,0.163,0.165,0.166,0.167,0.170,0.173,0.176,0.180,0.183,0.185,0.186,0.187,0.188,0.189,0.190,0.191,0.193,0.197,0.200,0.203,0.205,0.207,0.208,0.208,0.209,0.210,0.211,0.213,0.215,0.219,0.222,0.225,0.227,0.228,0.229,0.229,0.230,0.230,0.231,0.232,0.235,0.238,0.242,0.245,0.248,0.250,0.252,0.253,0.253,0.254,0.254,0.256,0.258,0.261,0.264,0.268,0.269,0.268,0.265,0.261,0.256,0.250,0.246,0.243,0.243,0.245,0.248,0.252,0.255,0.257,0.257,0.256,0.253,0.250,0.247,0.246,0.246,0.246,0.246,0.246,0.246,0.247,0.249,0.252,0.258,0.266,0.274,0.282,0.287,0.288,0.286,0.281,0.274,0.268,0.264,0.262,0.263,0.268,0.274,0.281,0.288,0.293,0.298,0.302,0.306,0.310,0.313,0.314,0.315,0.311,0.299,0.279,0.254,0.231,0.215,0.206,0.205,0.209,0.217,0.225,0.233,0.240,0.245,0.248,0.251,0.258,0.270,0.288,0.312,0.336,0.361,0.381,0.391,0.397,0.409,0.433,0.471,0.521,0.585,0.658,0.735,0.806,0.853,0.860,0.825,0.768,0.707,0.647,0.590,0.539,0.495,0.457,0.424,0.393,0.359,0.322,0.289,0.264,0.247,0.237,0.234,0.233,0.233,0.230,0.225,0.216,0.206,0.197,0.189,0.184,0.182,0.185,0.192,0.204,0.218,0.232,0.246,0.257,0.266,0.272,0.275,0.277,0.276,0.270,0.262,0.251,0.239,0.227,0.216,0.207,0.202,0.202,0.206,0.212,0.221,0.228,0.235,0.240,0.242,0.242,0.241,0.239,0.236,0.231,0.224,0.216,0.207,0.199,0.191,0.187,0.186,0.191,0.199,0.208,0.217,0.224,0.228,0.229,0.226,0.221,0.213,0.205,0.197,0.190,0.184,0.180,0.178,0.178,0.180,0.184,0.191,0.198,0.205,0.211,0.215,0.217,0.216,0.213,0.207,0.200,0.194,0.187,0.183,0.180,0.178,0.178,0.179,0.181,0.183,0.186,0.190,0.193,0.195,0.196,0.196,0.194,0.191,0.188,0.185,0.180,0.176,0.173,0.171,0.170,0.169,0.168,0.167,0.165,0.164,0.163,0.162,0.160,0.159,0.158,0.158,0.157,0.156,0.155,0.153,0.151,0.148,0.146,0.143,0.142,0.140,0.138,0.137,0.135,0.134,0.133,0.132,0.131,0.129,0.127,0.125,0.123,0.121,0.119,0.118,0.116,0.115,0.114,0.113,0.111,0.109,0.106,0.104,0.102,0.100,0.098,0.097,0.095,0.093,0.091,0.089,0.087,0.085,0.083,0.080,0.079,0.077,0.076,0.074,0.072,0.069,0.067,0.065,0.063,0.061,0.058,0.056,0.055,0.053,0.051,0.049,0.047,0.045,0.043,0.040,0.037,0.034,0.032,0.031,0.029,0.027,0.025,0.024,0.022,0.019,0.016,0.012,0.008,0.006,0.004,0.003,0.001,-0.000,-0.002,-0.003,-0.005,-0.008,-0.012,-0.015,-0.017,-0.018,-0.019,-0.020,-0.021,-0.021,-0.023,-0.024,-0.026,-0.028,-0.029,-0.031,-0.031,-0.032,-0.032,-0.031,-0.028,-0.018,-0.005,0.009,0.023,0.036,0.051,0.064,0.078,0.092,0.106,0.120,0.134,0.149,0.163,0.178,0.194,0.209,0.224,0.239,0.254,0.266,0.272,0.266,0.250,0.232,0.213,0.195,0.176,0.157,0.137,0.117,0.096,0.075,0.054,0.032,0.011,-0.010,-0.030,-0.046,-0.061,-0.074,-0.087,-0.098,-0.105,-0.107,-0.104,-0.098,-0.091,-0.083,-0.074,-0.066,-0.057,-0.049,-0.040,-0.032,-0.023,-0.014,-0.005,0.004,0.010,0.015,0.017,0.018,0.018,0.017,0.017,0.016,0.016,0.015,0.014,0.013
The maximum S(n) is S(412) = 0.860.
(2) Calculating symmetry axis in horizontal direction
The value range of m is [140,560];
calculation results corresponding to the similarity calculation value S(m):
-0.022,-0.020,-0.017,-0.012,-0.008,-0.003,0.004,0.011,0.017,0.020,0.022,0.025,0.028,0.031,0.033,0.035,0.036,0.037,0.040,0.045,0.050,0.053,0.058,0.064,0.069,0.074,0.077,0.079,0.080,0.081,0.083,0.085,0.088,0.090,0.093,0.095,0.098,0.100,0.104,0.107,0.111,0.116,0.120,0.124,0.127,0.130,0.132,0.133,0.133,0.132,0.133,0.134,0.136,0.138,0.141,0.145,0.149,0.155,0.160,0.163,0.165,0.166,0.166,0.167,0.167,0.168,0.168,0.169,0.171,0.174,0.176,0.179,0.182,0.184,0.187,0.190,0.193,0.195,0.196,0.198,0.201,0.203,0.205,0.207,0.208,0.208,0.208,0.207,0.207,0.207,0.208,0.211,0.215,0.220,0.225,0.230,0.234,0.236,0.236,0.235,0.233,0.231,0.229,0.228,0.228,0.229,0.231,0.235,0.240,0.246,0.251,0.255,0.258,0.259,0.259,0.257,0.256,0.254,0.253,0.252,0.252,0.253,0.253,0.255,0.258,0.261,0.264,0.268,0.272,0.276,0.279,0.281,0.281,0.281,0.280,0.278,0.276,0.275,0.274,0.273,0.273,0.274,0.276,0.280,0.285,0.291,0.297,0.303,0.307,0.310,0.311,0.310,0.305,0.298,0.289,0.282,0.278,0.276,0.277,0.280,0.286,0.296,0.308,0.318,0.323,0.322,0.316,0.306,0.295,0.284,0.273,0.263,0.255,0.250,0.250,0.254,0.263,0.276,0.290,0.303,0.313,0.318,0.316,0.306,0.287,0.262,0.237,0.217,0.203,0.197,0.194,0.195,0.199,0.204,0.210,0.218,0.232,0.254,0.280,0.305,0.327,0.343,0.353,0.356,0.353,0.346,0.338,0.335,0.348,0.381,0.438,0.518,0.611,0.701,0.774,0.819,0.810,0.754,0.664,0.565,0.478,0.412,0.360,0.326,0.322,0.349,0.383,0.416,0.437,0.434,0.405,0.355,0.294,0.235,0.186,0.153,0.141,0.148,0.166,0.191,0.219,0.248,0.275,0.297,0.315,0.328,0.336,0.338,0.337,0.333,0.328,0.321,0.316,0.313,0.312,0.312,0.313,0.318,0.325,0.335,0.343,0.348,0.350,0.347,0.342,0.333,0.322,0.308,0.296,0.286,0.280,0.277,0.279,0.284,0.293,0.302,0.310,0.317,0.320,0.320,0.315,0.307,0.297,0.287,0.279,0.273,0.270,0.269,0.271,0.275,0.279,0.282,0.284,0.285,0.286,0.286,0.285,0.283,0.280,0.276,0.271,0.267,0.264,0.262,0.260,0.258,0.257,0.256,0.256,0.256,0.257,0.258,0.259,0.260,0.261,0.261,0.259,0.255,0.251,0.246,0.242,0.237,0.233,0.229,0.228,0.229,0.231,0.234,0.237,0.240,0.241,0.240,0.236,0.231,0.225,0.220,0.216,0.213,0.210,0.208,0.206,0.207,0.208,0.209,0.210,0.210,0.209,0.208,0.207,0.206,0.204,0.202,0.198,0.193,0.188,0.185,0.182,0.180,0.177,0.175,0.176,0.178,0.179,0.180,0.179,0.177,0.174,0.171,0.167,0.163,0.158,0.155,0.152,0.150,0.147,0.145,0.143,0.141,0.140,0.140,0.140,0.139,0.138,0.137,0.135,0.132,0.128,0.123,0.117,0.113,0.109,0.107,0.106,0.104,0.103,0.102,0.101,0.099,0.096,0.093,0.090,0.087,0.084,0.082,0.079,0.075,0.070,0.064,0.059,0.055,0.051,0.048,0.045,0.043,0.043,0.044,0.044,0.041,0.037,0.031
The maximum S(m) value is S(355) = 0.819.
The symmetry center in the selected ROI image is (412, 355);
as shown in FIG. 8, the center of symmetry is finally determined as the intersection of x = 412 and y = 355.
The examples of the present invention are presented only to describe specific embodiments and are not intended to limit the scope of the invention. Those skilled in the art may make certain modifications in light of these embodiments, and all equivalent changes and modifications falling within the meaning and range of equivalency of the claims are intended to be embraced therein.
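The final step of the worked example above, taking the position of the maximum of each similarity profile and intersecting the two axes, can be sketched as follows. This is a minimal illustration, not code from the patent; the helper name `center_from_profiles` and the toy profile arrays are assumptions.

```python
import numpy as np

def center_from_profiles(s_x: np.ndarray, s_y: np.ndarray) -> tuple[int, int]:
    """Return the symmetry center (n*, m*): the counting-axis positions that
    maximize the column-wise profile S(n) and the row-wise profile S(m)."""
    return int(np.argmax(s_x)), int(np.argmax(s_y))

# Toy profiles shaped like the worked example: S(n) peaks at n = 412,
# and S(m) peaks at m = 355 with the quoted value 0.819.
s_n = np.zeros(600); s_n[412] = 0.9
s_m = np.zeros(600); s_m[355] = 0.819
print(center_from_profiles(s_n, s_m))  # (412, 355)
```

Because each axis is searched independently, the center falls out directly as the pair of arg-max indices.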
Claims (10)
1. A method for calculating the symmetry center of an image, characterized by comprising the following steps:
S1, image acquisition: after a camera captures an image of the product to be inspected, the image information is transmitted to an industrial personal computer;
S2, framing an ROI rectangular region containing a symmetrical pattern in the image: in the image acquired in step S1, framing the ROI rectangular region containing the symmetrical pattern as the calibration region for calculation;
S3, calculating the symmetry axis in the y-axis direction within the ROI rectangular region: selecting a counting axis x = n in the ROI rectangular region selected in step S2, extracting the image regions on the left and right sides of the counting axis x = n, comparing the images on the two sides, taking the narrower image as the folded image, folding the left and right images over each other, and then performing a similarity calculation;
S4, calculating the symmetry axis in the x-axis direction within the ROI rectangular region: selecting a counting axis y = m in the ROI rectangular region selected in step S2, extracting the image regions on the upper and lower sides of the counting axis y = m, comparing the images on the two sides, taking the shorter image as the folded image, folding the upper and lower images over each other, and then performing a similarity calculation;
S5, calculating the symmetry center of the image: from the symmetry axis of the ROI region in the y-axis direction obtained in step S3 and the symmetry axis of the ROI region in the x-axis direction obtained in step S4, calculating the intersection point of the two symmetry axes as the symmetry center of the image.
2. The method of claim 1, wherein the ROI rectangular region in step S2 is the calibration region for calculation within the image acquired in step S1, the ROI rectangular region having a width w and a height h.
3. The method for calculating the symmetry center of an image according to claim 2, wherein the starting position of the counting axis x = n selected in step S3 is the leftmost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the counting axis is advanced column by column as x = n + 1 with n ranging from 0 to w - 2 so as to cover the ROI rectangular region, and the similarity S(n) corresponding to each counting axis is recorded as the axes are traversed, S(n) being the similarity corresponding to x = n.
4. The method for calculating the symmetry center of an image according to claim 3, wherein, when determining the image sizes on the left and right sides of the counting axis x = n in step S3, the image on the left side of the counting axis is taken as the folded image when n < w/2, and the image on the right side of the counting axis is taken as the folded image when n ≥ w/2.
5. The method of claim 4, wherein, in step S3, the folded image is selected, the left or right folded image is flipped using the counting axis x = n as the fold center line, and a similarity calculation is then performed between the non-flipped folded image and the flipped folded image.
6. The method for calculating the symmetry center of an image according to claim 2, wherein the starting position of the counting axis y = m selected in step S4 is the uppermost axis of the ROI rectangular region; during counting, one image pixel is taken as one counting unit, the counting axis is advanced row by row as y = m - 1 with m ranging from h - 1 down to 2 so as to cover the ROI rectangular region, and the similarity S(m) corresponding to each counting axis is recorded as the axes are traversed, S(m) being the similarity corresponding to y = m.
7. The method of calculating the symmetry center of an image according to claim 6, wherein, when determining the image sizes on the upper and lower sides of the counting axis y = m in step S4, the image on the lower side of the counting axis is taken as the folded image when m < h/2, and the image on the upper side of the counting axis is taken as the folded image when m ≥ h/2.
8. The method of claim 7, wherein, in step S4, the folded images are selected, the upper or lower folded image is flipped by pixel-value conversion using the counting axis y = m as the fold center line, and a similarity calculation is then performed between the non-flipped folded image and the flipped folded image.
9. The method for calculating the symmetry center of an image according to claim 5 or 6, wherein the pixel-value conversion for flipping the folded image about the counting axis x = n is dst(x, y) = src(x, rows - y - 1), and the pixel-value conversion for flipping the folded image about the counting axis y = m is dst(x, y) = src(cols - x - 1, y), where src is the pixel value of the image before flipping, dst is the pixel value of the image after flipping, x and y are the corresponding coordinate values, rows is the height of the flipped folded image, and cols is the width of the flipped folded image.
10. The method for calculating the symmetry center of an image according to claim 3 or 4, wherein the similarity S(n) is calculated by the formula

$$S(n) = \frac{\sum_{x,y} T'(x,y)\, I'(x,y)}{\sqrt{\sum_{x,y} T'(x,y)^2 \cdot \sum_{x,y} I'(x,y)^2}}$$

wherein

$$T'(x,y) = T(x,y) - \frac{1}{w \cdot h} \sum_{x',y'} T(x',y')$$

$$I'(x,y) = I(x,y) - \frac{1}{w \cdot h} \sum_{x',y'} I(x',y')$$

T(x, y) represents the pixel value at (x, y) in the left or upper image, I(x, y) represents the pixel value at (x, y) in the right or lower image, x ranges over (0, w - 1), y ranges over (0, h - 1), w is the width of the ROI rectangular region, and h is the height of the ROI rectangular region.
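Claims 1 through 10 together describe a fold-and-compare search: slide a candidate axis across the ROI, mirror the narrower side onto the wider side, score the overlap, and intersect the two best axes. The sketch below is a non-authoritative NumPy illustration of that technique; the use of zero-mean normalized cross-correlation for S(n) is an assumption about the unrendered formula in claim 10, and all function names are invented for this example.

```python
import numpy as np

def _ncc(a: np.ndarray, b: np.ndarray) -> float:
    # Zero-mean normalized cross-correlation between two equally sized patches.
    a = a.astype(np.float64) - a.mean()
    b = b.astype(np.float64) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def vertical_axis_scores(roi: np.ndarray) -> np.ndarray:
    # For each candidate axis x = n, mirror the narrower side onto the wider
    # side (claims 3-5) and record the similarity S(n).
    h, w = roi.shape
    scores = np.zeros(w)
    for n in range(1, w - 1):
        half = min(n, w - n)            # width of the narrower side
        left = roi[:, n - half:n]       # patch just left of the axis
        right = roi[:, n:n + half]      # patch just right of the axis
        # Horizontal flip of the fold: dst(x, y) = src(cols - x - 1, y)
        scores[n] = _ncc(left, right[:, ::-1])
    return scores

def symmetry_center(roi: np.ndarray) -> tuple[int, int]:
    # Claim 1: best vertical axis, best horizontal axis, then the intersection.
    s_n = vertical_axis_scores(roi)
    s_m = vertical_axis_scores(roi.T)   # reuse the column search on rows
    return int(np.argmax(s_n)), int(np.argmax(s_m))

# Synthetic check: a bright square centered at (40, 25) in a 50x80 image.
img = np.zeros((50, 80))
img[15:35, 30:50] = 1.0
print(symmetry_center(img))  # (40, 25)
```

Transposing the ROI lets one search routine serve both axis directions, which mirrors the symmetry between steps S3 and S4 of claim 1.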
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411661625.2A CN119169074B (en) | 2024-11-20 | 2024-11-20 | Image symmetry center calculating method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119169074A true CN119169074A (en) | 2024-12-20 |
| CN119169074B CN119169074B (en) | 2025-03-04 |
Family
ID=93891836
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202411661625.2A Active CN119169074B (en) | 2024-11-20 | 2024-11-20 | Image symmetry center calculating method |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119169074B (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080021502A1 (en) * | 2004-06-21 | 2008-01-24 | The Trustees Of Columbia University In The City Of New York | Systems and methods for automatic symmetry identification and for quantification of asymmetry for analytic, diagnostic and therapeutic purposes |
| CN106557767A (en) * | 2016-11-15 | 2017-04-05 | 北京唯迈医疗设备有限公司 | A kind of method of ROI region in determination interventional imaging |
| WO2023068542A1 (en) * | 2021-10-21 | 2023-04-27 | 한국전자통신연구원 | Method and computing device for global localization of mobile robot |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN112614176B (en) | A belt conveyor material volume measurement method, device and storage medium | |
| CN109785291B (en) | Lane line self-adaptive detection method | |
| CN105225482B (en) | Vehicle detecting system and method based on binocular stereo vision | |
| CN108416355B (en) | A collection method of industrial field production data based on machine vision | |
| CN108510476B (en) | Mobile phone screen circuit detection method based on machine vision | |
| CN109978940B (en) | Visual measurement method for SAB safety airbag size | |
| US11763463B2 (en) | Information processing apparatus, control method, and program | |
| CN101807257A (en) | Method for identifying information of image tag | |
| CN106174830A (en) | Garment dimension automatic measurement system based on machine vision and measuring method thereof | |
| CN102663760A (en) | Location and segmentation method for windshield area of vehicle in images | |
| CN117670887B (en) | Tin soldering height and defect detection method based on machine vision | |
| JP4774390B2 (en) | Character segmentation device, method and program | |
| CN115018735B (en) | Crack width identification method and system based on Hough transformation correction two-dimensional code image | |
| JP5100688B2 (en) | Object detection apparatus and program | |
| CN119169074B (en) | Image symmetry center calculating method | |
| WO2019097690A1 (en) | Image processing device, control method, and control program | |
| JP4801934B2 (en) | Moving object detection device | |
| CN116030430A (en) | Rail identification method, device, equipment and storage medium | |
| KR102583655B1 (en) | Method for detecting moving objects, device and program using the same | |
| CN115471537A (en) | Monocular camera-based moving target distance and height measuring method | |
| CN115690723A (en) | Detection frame filtering method, device, equipment, medium and vehicle | |
| JP2008217330A (en) | Speed estimation method and speed estimation program | |
| JP2007298376A (en) | Boundary position determination apparatus, method for determining boundary position, program for causing computer to function as the apparatus, and recording medium | |
| CN116755081A (en) | Target detection technology integrating vision and radar | |
| JPH09178855A (en) | Obstacle detection method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |