
HK1073372B - Image matching method and image matching system - Google Patents


Info

Publication number
HK1073372B
Authority
HK
Hong Kong
Prior art keywords
image
matching
converted
images
processing
Application number
HK05105836.2A
Other languages
Chinese (zh)
Other versions
HK1073372A1 (en)
Inventor
饭塚健
Original Assignee
Sony Corporation (索尼株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from JP2003348293A (JP4345426B2)
Application filed by Sony Corporation (索尼株式会社)
Publication of HK1073372A1
Publication of HK1073372B

Description

Image matching method and image matching system
Technical Field
The present invention relates to an image matching method for matching two images, such as blood vessel images, fingerprint images, still images, or moving images, based on linear components in the images, and to a program and an image matching system therefor.
Background
As a system for matching image information, various image matching systems have been known in the past. For example, there has been known an information processing apparatus which compares a registered image with an image for comparison (below, also called a "matching image") in a predetermined positional relationship to calculate a correlation value and matches the registered image against the matching image based on that correlation value, and an information processing apparatus which generates the correlation value by processing in units of pixels (refer to, for example, Japanese Unexamined Patent Publication No. 2000-194862).
However, in the above-described information processing apparatus, when a parallel movement, rotation, enlargement, reduction, or other deviation occurs between the registered image and the matching image, it is difficult to appropriately generate the correlation value, and sufficient matching accuracy may not be obtained in some cases. There is therefore a need for improvement.
Disclosure of Invention
An object of the present invention is to provide an image matching method capable of matching images with high accuracy, and a program and an image matching system therefor.
According to a first aspect of the present invention, there is provided an image matching method for matching a first image and a second image, comprising: a first step of performing image conversion processing according to a distance from a reference position in each of the first and second images and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generating first and second converted images in a two-dimensional space defined by the distance and the angle; and a second step of performing matching processing of the second image by the first image on the basis of correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the first converted image and the second converted image generated in the first step.
Preferably, in the first step, the method performs the image conversion process to generate the first converted image and the second converted image by converting the point in each image into a curved line pattern and converting the linear component in each image into a pattern of a plurality of overlapping curved lines based on a distance from the reference position to a closest point on a straight line passing through the points in the images and an angle formed by the straight line passing through the reference position and the closest point and a reference axis including the reference position.
Further, according to a second aspect of the present invention, there is provided a program executed by an information processing apparatus for matching a first image and a second image, comprising: a first program that performs image conversion processing according to a distance from a reference position in each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generates a first converted image and a second converted image in a two-dimensional space defined by the distance and the angle; and a second program that performs matching processing of the second image by the first image, based on correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the first converted image and the second converted image generated in the first program.
Preferably, in the first program, the program performs the image conversion processing to generate the first converted image and the second converted image by converting the point in each image into a curved line pattern and converting the linear component in each image into a pattern of a plurality of overlapping curved lines based on a distance from the reference position to a closest point on a straight line passing through the points in the images and an angle formed by the straight line passing through the reference position and the closest point and a reference axis including the reference position.
Further, according to a third aspect of the present invention, there is provided an image matching system for matching a first image and a second image, comprising: conversion means for performing image conversion processing according to a distance from a reference position in each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generating a first converted image and a second converted image in a two-dimensional space defined by the distance and the angle; and matching means for performing matching processing of the second image by the first image on the basis of correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the first converted image and the second converted image generated by the conversion means.
Preferably, in the conversion means, the image conversion processing is performed to generate the first converted image and the second converted image by converting the points in each image into a pattern of curved lines and converting the linear components in each image into a pattern of a plurality of superimposed curved lines based on a distance from the reference position to a closest point on a straight line passing through the points in the images and an angle formed by the straight line passing through the reference position and the closest point and a reference axis including the reference position.
According to the present invention, each of the first step, the first program, and the conversion means performs image conversion processing in accordance with a distance from a reference position in each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position to generate the first converted image and the second converted image in a two-dimensional space defined by the distance and the angle, and each of the second step, the second program, and the matching means performs matching processing of the second image by the first image in accordance with correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the generated first converted image and the second converted image.
Drawings
The above and other objects and features of the present invention will be more apparent by referring to the following description of the embodiments of the present invention taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a functional block diagram showing the hardware configuration of an image matching system according to the present invention;
FIG. 2 is a functional block diagram showing the software configuration of the image matching system shown in FIG. 1;
fig. 3A and 3B are diagrams for explaining the operation of the conversion unit shown in fig. 2;
fig. 4A to 4F are diagrams for explaining the operation of the conversion unit shown in fig. 2, in which fig. 4A is a diagram of a sample image va1, and fig. 4B is a diagram of an image va2 obtained by rotating the image va1 shown in fig. 4A exactly by a predetermined angle θ; fig. 4C is a diagram of an image va3 obtained by moving the image va2 shown in fig. 4B in parallel; FIG. 4D is a diagram of an image hva1 obtained by subjecting the image va1 shown in FIG. 4A to image conversion processing; fig. 4E is a diagram of an image hva2 obtained by performing image conversion processing on the image va2 shown in fig. 4B; FIG. 4F is a view of an image hva3 obtained by subjecting the image va3 shown in FIG. 4C to an image conversion process;
FIGS. 5A-5F are diagrams for explaining the operation of the conversion unit shown in FIG. 2;
fig. 6 is a functional block diagram of a specific example of the correlation value generation unit shown in fig. 2;
fig. 7A to 7C are diagrams for explaining correlation values of the correlation intensity image G (p, q), where fig. 7A and 7B are diagrams of signals S1621 and S1622 as converted images, and fig. 7C is a diagram of intensity peaks of the correlation intensity image G (p, q);
fig. 8 is a diagram for explaining the correlation intensity image G (p, q);
FIG. 9 is a flowchart for explaining the operation of the image matching system shown in FIG. 1;
FIG. 10 is a functional block diagram of an image matching system according to a second embodiment of the present invention;
fig. 11A and 11B are diagrams for explaining the operation of the position correlation unit shown in fig. 10;
fig. 12A and 12B are diagrams for explaining the operation of the similarity generating unit;
fig. 13A to 13C are diagrams for explaining the operation of the similarity generating unit shown in fig. 10;
fig. 14 is a diagram for explaining the operation of the image matching system shown in fig. 10;
fig. 15A to 15C are diagrams for explaining the operation of the image matching system according to the third embodiment of the present invention; and
fig. 16 is a flowchart for explaining the operation of the image matching system according to the third embodiment of the present invention.
Detailed Description
Fig. 1 is a functional block diagram showing the hardware configuration of an image matching system according to a first embodiment of the present invention. An image matching system (information processing apparatus) 1 of this embodiment shown in fig. 1, for example, has an image input section 11, a memory 12, a conversion processing unit 13, an extraction processing unit 14, a Fast Fourier Transform (FFT) processing unit 15, a Central Processing Unit (CPU) 16, and an operation processing unit 17. For example, the image input section 11 is connected to the memory 12. The memory 12, the conversion processing unit 13, the extraction processing unit 14, the FFT processing unit 15, and the CPU 16 are connected by a bus BS.
The image input section 11 is an input section for accepting image input from the outside. For example, the image input section 11 receives as input a registered image AIM and an image to be compared with the registered image AIM (also referred to as a "matching image RIM"). The memory 12 stores, for example, images input from the image input section 11. In addition, the memory 12 stores, for example, the registered image AIM, the matching image RIM, a program PRG, and the like, as shown in fig. 1. The program PRG is run by, for example, the CPU 16, and includes routines for realizing the conversion processing, the correlation processing, the matching processing, and the like relating to the present invention.
The conversion processing unit 13 executes image conversion processing to be explained later under the control of the CPU16, and outputs the processing result to the CPU16. The conversion processing unit 13 preferably employs a dedicated circuit constituted by hardware so as to perform, for example, image conversion processing at high speed.
The extraction processing unit 14 executes extraction processing (also referred to as "mask processing") to be explained later under the control of the CPU 16, and outputs the result to the CPU 16. The extraction processing unit 14 preferably employs a dedicated circuit constituted by hardware so as to perform the extraction processing at high speed, for example.
The Fast Fourier Transform (FFT) processing unit 15 performs two-dimensional Fourier transform processing on images stored in the memory 12 under the control of, for example, the CPU 16, and outputs the processing results to the CPU 16 or the like.
The operation processing unit 17 executes predetermined processing, such as releasing an electronic lock, when, for example, the registered image AIM matches the matching image RIM, according to a processing result of the CPU 16 to be explained later.
The CPU 16 executes the matching processing of the embodiment of the present invention according to, for example, the program PRG stored in the memory 12, the registered image AIM, and the matching image RIM. In addition, the CPU 16 controls the image input section 11, the memory 12, the conversion processing unit 13, the extraction processing unit 14, the FFT processing unit 15, the operation processing unit 17, and the like to realize the processing of the embodiment.
Fig. 2 is a functional block diagram of software-like components of the image matching system shown in fig. 1. For example, as illustrated in fig. 2, the CPU16 running the program in the memory 12 realizes the functions of the conversion unit 161, the extraction unit 162, the correlation value generation unit 163, and the matching unit 164. The conversion unit 161 corresponds to conversion means of the present invention, and the correlation value generation unit 163 and the matching unit 164 correspond to matching means of the present invention.
The conversion unit 161 causes the conversion processing unit 13, that is, hardware dedicated to image conversion processing, to perform the processing. In more detail, for example, the conversion unit 161 performs image conversion processing on the registered image AIM and outputs the processing result as a signal S1611. In addition, the conversion unit 161 performs image conversion processing on the matching image RIM and outputs the processing result as a signal S1612.
Fig. 3A and 3B are diagrams for explaining the operation of the conversion unit shown in fig. 2. The conversion unit 161 performs image conversion processing according to, for example, a distance from a reference position within each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position to generate the first converted image and the second converted image within a two-dimensional space defined by the distance and the angle.
In more detail, the conversion unit 161 performs image processing for converting a point in each image into a curved line pattern according to a distance ρ 0 from the reference position O to a closest point P0 on a straight line L0 passing through the point in the image and an angle θ 0 formed by a straight line n0 passing through the reference position O and the closest point P0 and a reference axis including the reference position O, and converting a linear component in the image into a plurality of superimposed curved line patterns to generate a first converted image and a second converted image.
For simplicity of explanation, for example, as shown in fig. 3A, it is assumed that a straight line L0 and points P1 (x1, y1), P2 (x2, y2), and P3 (x3, y3) on the straight line L0 exist on one X-Y plane. If a straight line passing through the origin (reference position) O and perpendicular to the straight line L0 is denoted n0, the straight line n0 forms an angle θ0 with the x-axis serving as the reference axis, and the distance from the origin O to the straight line L0 is |ρ0|. The straight line L0 can therefore be specified by the parameters (ρ0, θ0). The image conversion processing of the present invention performed on coordinates (x, y) on the X-Y plane is defined by, for example, the following formula (1):
ρ=x·cosθ+y·sinθ (1)
for example, when the conversion processing shown in formula (1) is performed for each of the points P1, P2, and P3, the points are converted into curves in the ρ - θ space as shown in fig. 3B. In more detail, the conversion processing is for converting the point P1 (x 1, y 1) into the curve PL1 (x 1 · cos θ + y1 · sin θ), converting the point P2 (x 2, y 2) into the curve PL2 (x 2 · cos θ + y2 · sin θ), and converting the point P3 (x 3, y 3) into the curve PL3 (x 3 · cos θ + y3 · sin θ). The patterns of these curves PL1, PL2 and PL3 intersect at an intersection CP (ρ 0, θ 0) in ρ - θ space. In the ρ - θ space, the intersection (ρ 0, θ 0) corresponds to the linear component L0 on the X-Y plane. In contrast, as shown in FIG. 3A, the linear component on the X-Y plane corresponds to the intersection CP of the patterns PL1, PL2 and PL3 in the ρ - θ space.
As explained above, the image conversion processing converts an image into the ρ-θ space, and the degree of overlap of the curve patterns in the processing result indicates which linear components are dominant on the X-Y plane before processing. Rotation and parallel movement of the image on the X-Y plane correspond, after the processing, to parallel movements in the θ direction and the ρ direction in the ρ-θ space.
Fig. 4A to 4F are diagrams for explaining the operation of the conversion unit shown in fig. 2, where fig. 4A is a diagram of a sample image va1, and fig. 4B is a diagram of an image va2 obtained by rotating the image va1 shown in fig. 4A exactly by a predetermined angle θ; fig. 4C is a diagram of an image va3 obtained by moving the image va2 shown in fig. 4B in parallel; for example, in FIGS. 4A-4C, the X-axis is plotted on the ordinate and the Y-axis is plotted on the abscissa. FIG. 4D is a diagram of an image hva1 obtained by performing image conversion processing on the image va1 shown in FIG. 4A; fig. 4E is a diagram of an image hva2 obtained by performing image conversion processing on the image va2 shown in fig. 4B; fig. 4F is a diagram of an image hva3 obtained by performing image conversion processing on the image va3 shown in fig. 4C. For example, in FIGS. 4D-4F, the ρ -axis is plotted on the ordinate and the θ -axis is plotted on the abscissa.
When the conversion unit 161 performs the image conversion processing on, for example, the image va1 including the straight lines La1 and La2 shown in fig. 4A, it generates the image hva1 shown in fig. 4D, in which the superimposed curve patterns in the ρ-θ space form two points. For simplicity of explanation, only the points with a large degree of overlap of the curve patterns are shown. When the conversion unit 161 performs the image conversion processing on the image va2 obtained by rotating the image va1 exactly by a predetermined angle θ as shown in fig. 4B, it generates the image hva2 shown in fig. 4E. Compared with the image hva1 shown in fig. 4D, the image hva2 is parallel-shifted in the θ direction by an amount corresponding to the rotation angle θ. When the conversion unit 161 performs the image conversion processing on the image va3 obtained by parallel-moving the image va2 exactly by a predetermined amount as shown in fig. 4C, it generates the image hva3 shown in fig. 4F. Compared with the image hva2 shown in fig. 4E, the image hva3 is parallel-shifted in the ρ direction by an amount corresponding to the amount of parallel movement. From these features, by detecting the amount of parallel shift between the images after the image conversion processing and calculating the degree of correlation, image matching can be performed which takes into account the rotation angle and parallel movement of the images before the image conversion processing.
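The shift behavior described above can also be checked numerically: rotating the X-Y plane about the origin by an angle α maps a line with parameters (ρ0, θ0) to (ρ0, θ0 + α), and translating by (tx, ty) maps it to (ρ0 + tx·cosθ0 + ty·sinθ0, θ0). A small sketch (the helper functions and sample values are illustrative assumptions, not from the patent):

```python
import numpy as np

def line_params(p1, p2):
    """(rho, theta) of the straight line through two points, using
    rho = x*cos(theta) + y*sin(theta); theta is normalized to [0, pi)."""
    (x1, y1), (x2, y2) = p1, p2
    nx, ny = y2 - y1, x1 - x2              # a normal direction of the line
    theta = np.arctan2(ny, nx) % np.pi
    rho = x1 * np.cos(theta) + y1 * np.sin(theta)
    return rho, theta

def rotate(p, a):
    """Rotate a point about the origin O by angle a."""
    x, y = p
    return (x * np.cos(a) - y * np.sin(a), x * np.sin(a) + y * np.cos(a))

p1, p2 = (3.0, 1.0), (3.0, 5.0)            # the line x = 3: (rho0, theta0) = (3, 0)
rho0, th0 = line_params(p1, p2)

a = np.deg2rad(30)                         # rotation: theta shifts by a, rho unchanged
rho_r, th_r = line_params(rotate(p1, a), rotate(p2, a))

tx, ty = 2.0, -1.0                         # translation: rho shifts, theta unchanged
rho_t, th_t = line_params((p1[0] + tx, p1[1] + ty), (p2[0] + tx, p2[1] + ty))
```

The rotated line keeps ρ = 3 while θ moves by exactly 30°, and the translated line keeps θ = 0 while ρ moves to 5, matching the parallel shifts seen in figs. 4D to 4F.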
Fig. 5A to 5F are diagrams for explaining the operation of the conversion unit shown in fig. 2. The conversion unit 161 performs image conversion processing on the registered image AIM shown in fig. 5A, for example, to generate the image S1611 shown in fig. 5C, and performs image conversion processing on the matching image RIM shown in fig. 5B to generate the image S1612 shown in fig. 5D. Each pixel in the images S1611 and S1612 is set to a value according to the degree of overlap of the curve patterns. In the images displayed by the predetermined halftone in this embodiment, a pixel is displayed whiter as the degree of overlap of the curve patterns increases. As explained later, the matching unit 164 performs matching processing according to this degree of overlap of the curve patterns, thereby performing matching processing based on the linear components in the original X-Y space.
The extraction unit 162 extracts, for each of the first converted image and the second converted image, a region whose degree of overlap of the curve patterns in the converted image is larger than a previously set threshold value. In more detail, for example, the extraction unit 162 extracts a region in which the degree of overlap of the curve patterns is larger than the previously set threshold value from the signal S1611 of the first converted image shown in fig. 5C, generates the image signal S1621 shown in fig. 5E, and outputs it to the correlation value generation unit 163. Also, the extraction unit 162 extracts a region in which the degree of overlap of the curve patterns is larger than the previously set threshold value from the signal S1612 of the second converted image shown in fig. 5D, generates the image signal S1622 shown in fig. 5F, and outputs it to the correlation value generation unit 163. By performing such extraction processing, noise components such as point components different from the linear components in the X-Y space of, for example, the registered image AIM and the matching image RIM are eliminated. The extraction unit 162 causes the extraction processing unit 14, that is, hardware dedicated to the extraction processing (mask processing), to perform the processing.
The correlation value generation unit 163 performs matching processing of the first image and the second image according to the results of correlation processing of the signal S1621 and the signal S1622 based on the first converted image and the second converted image at a plurality of different relative positions along the first direction and the second direction orthogonal to the first direction. Here, the first direction and the second direction indicate an X direction and a Y direction (or a θ axis direction and a ρ axis direction) in the converted image.
In more detail, the correlation value generation unit 163 generates a correlation value according to the degree of overlap of the patterns in the first converted image and the second converted image and the coincidence/non-coincidence of the patterns in the first converted image and the second converted image based on the first and second converted images, and outputs the generated correlation value as a signal S163 to the matching unit 164.
Fig. 6 is a functional block diagram of a specific example of the correlation value generation unit 163 shown in fig. 2. As shown in fig. 6, the correlation value generation unit 163 has a correlation unit 1631 and a correlation value detection unit 1632. The correlation unit 1631 performs correlation processing based on the signals S1621 and S1622 using, for example, a phase-only filter, and outputs the processing result as a signal S1631 to the correlation value detection unit 1632. As shown in fig. 6, the correlation unit 1631 has, for example, fourier transform units 16311 and 16312, a combining unit 16313, a phase extraction unit 16314, and an inverse fourier transform unit 16315.
For an image pA(m, n) of M × N pixels, for example, the fourier transform unit 16311 performs a fourier transform on the signal S1621 as shown in equation (2), generates fourier image data X(u, v), and outputs it as a signal S16311 to the combining unit 16313. Likewise, for an image pB(m, n) of M × N pixels, the fourier transform unit 16312 performs a fourier transform on the signal S1622 as shown in equation (3), generates fourier image data Y(u, v), and outputs it as a signal S16312 to the combining unit 16313.
The fourier image data X(u, v) is composed of an amplitude spectrum C(u, v) and a phase spectrum θ(u, v) as shown in formula (2), while the fourier image data Y(u, v) is composed of an amplitude spectrum D(u, v) and a phase spectrum φ(u, v) as shown in formula (3).
The combining unit 16313 combines the data X(u, v) and Y(u, v) generated at the fourier transform units 16311 and 16312 and correlates them. For example, the combining unit 16313 generates X(u, v) · Y*(u, v) and outputs it to the phase extraction unit 16314. Here, Y*(u, v) is the complex conjugate of Y(u, v).
The phase extraction unit 16314 eliminates the amplitude component from the combined signal output from the combining unit 16313 and extracts the phase information. For example, the phase extraction unit 16314 extracts the phase component Z(u, v) = exp{j(θ(u, v) − φ(u, v))} from X(u, v) · Y*(u, v).
The extraction of the phase information is not limited to the above form. For example, phase information may first be extracted according to equations (4) and (5) based on the outputs of the fourier transform units 16311 and 16312, and then only the phase components may be combined as shown in equation (6) to generate Z(u, v).
X′(u, v) = e^{jθ(u, v)} (4)
Y′(u, v) = e^{jφ(u, v)} (5)
Z(u, v) = X′(u, v) · (Y′(u, v))* = e^{j(θ(u, v) − φ(u, v))} (6)
The inverse fourier transform unit 16315 performs inverse fourier transform processing based on the phase-only signal Z(u, v) output from the phase extraction unit 16314 to generate a correlation intensity image. In more detail, the inverse fourier transform unit 16315 performs inverse fourier transform processing on the signal Z(u, v) as shown in equation (7), generates the correlation intensity image G(p, q), and outputs it as the signal S1631 to the correlation value detection unit 1632.
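The processing of the correlation unit 1631 described by equations (2) to (7) is commonly known as phase-only correlation. The following NumPy sketch (function names and the random test images are illustrative assumptions, not from the patent) reproduces the pipeline: Fourier-transform both images, keep only the phase of the combined spectrum X(u, v) · Y*(u, v), and inverse-transform to obtain G(p, q):

```python
import numpy as np

def phase_only_correlation(img_a, img_b):
    """Correlation unit 1631 per equations (2)-(7): Fourier-transform both
    images, keep only the phase of the combined spectrum X * conj(Y),
    and inverse-transform to get the correlation intensity image G(p, q)."""
    X = np.fft.fft2(img_a)                      # eq. (2): X(u, v)
    Y = np.fft.fft2(img_b)                      # eq. (3): Y(u, v)
    cross = X * np.conj(Y)                      # combining unit 16313
    Z = cross / (np.abs(cross) + 1e-12)         # phase extraction, eq. (6)
    return np.real(np.fft.ifft2(Z))             # eq. (7): G(p, q)

rng = np.random.default_rng(0)
img = rng.random((32, 32))
shifted = np.roll(img, (3, 5), axis=(0, 1))     # a circular parallel movement

G = phase_only_correlation(shifted, img)
peak = np.unravel_index(G.argmax(), G.shape)    # peak position reveals the shift
# For identical inputs the peak sits at the origin; here it sits at (3, 5),
# and its height G.max() serves as the correlation value.
```

The peak position plays the role of the deviation-dependent peak PP of fig. 8 (the origin here corresponds, after centering, to the image center position O), and the peak height is the correlation value passed to the matching unit.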
The correlation value detection unit 1632 detects a correlation value from, for example, the peak intensity in the correlation intensity image G (p, q) based on the signal S1631 output from the correlation unit 1631, and outputs the detected correlation value to the matching unit 164 as a signal S163. For example, the correlation value detection unit 1632 defines the maximum peak intensity in the correlation intensity image G (p, q) as a correlation value.
Fig. 7A to 7C are diagrams for explaining correlation values of the correlation intensity image G(p, q). Fig. 7A and 7B are diagrams showing the signals S1621 and S1622 of the converted images; fig. 7C is a diagram of the intensity peak of the correlation intensity image G(p, q). Fig. 8 is a diagram for explaining the correlation intensity image G(p, q). The correlation value generation unit 163 performs correlation processing based on the images S1621 and S1622 shown in, for example, fig. 7A and 7B, generates the correlation intensity image G(p, q) shown in fig. 7C, and outputs it as the signal S1631. In fig. 7C, the z-axis indicates the correlation intensity at a point (p, q). The correlation value detection unit 1632 outputs the correlation intensity of the peak PP having the maximum correlation intensity to the matching unit 164 as the correlation value signal S163. When there is no rotational deviation or parallel movement deviation between, for example, the images S1621 and S1622, the peak PP having the maximum correlation intensity is formed at the image center position O of the correlation intensity image S1631, as shown in fig. 8. When there is a rotational deviation or a parallel movement deviation between the images S1621 and S1622, the peak PP having the maximum correlation intensity is formed at a position deviated from the image center position O by an amount corresponding to the rotational deviation or the parallel movement deviation.
When the correlation intensity image is generated by the above-described correlation processing, even if there is a rotational deviation or a parallel shift deviation between the images S1621 and S1622, the correlation peak can be found as a correlation value based on the correlation intensity image.
The matching unit 164 matches the registered image AIM and the matching image RIM based on the signal S163 indicating the correlation value output from the correlation value generating unit 163. In more detail, the matching unit 164 determines that the registered image AIM and the matching image RIM are consistent when the correlation value is greater than a predetermined threshold value, and determines that they are inconsistent when the correlation value is less than the threshold value. When the image matching system of this embodiment is used for a vein matching system in the security field, for example, the CPU16 causes the operation processing unit 17 to execute predetermined processing such as unlocking the electronic lock according to the matching result of the matching unit 164.
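The decision rule of the matching unit 164 can be sketched as a single threshold comparison (the threshold value 0.1 is an arbitrary illustration; in practice it would be tuned to the required matching accuracy):

```python
def match(correlation_value, threshold=0.1):
    """Matching unit 164: report coincidence when the maximum peak
    intensity of the correlation intensity image exceeds the threshold."""
    return correlation_value > threshold

# e.g. the CPU could drive the operation processing unit with this result:
# if match(peak_intensity): release_electronic_lock()   # hypothetical calls
```

In the vein-matching application mentioned above, this boolean result is what would gate the predetermined processing such as releasing the electronic lock.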
Fig. 9 is a flowchart for explaining the operation of the image matching system shown in fig. 1. Referring to figs. 3A and 3B, figs. 5A to 5F, figs. 7A to 7C, and fig. 9, the operation of the image matching system 1 is explained below, focusing on the operation of the CPU 16.
For example, an image AIM to be registered is previously input from the image input section 11 and stored in the memory 12. In step ST1, the matching image RIM is input from the image input section 11 and stored in the memory 12. In step ST2, based on the matching image RIM shown in fig. 5B, the conversion unit 161 performs image processing for converting a point in the image into a curved pattern according to the distance ρ0 from the reference position O to the closest point P0 on the straight line L0 passing through the point in the image and the angle θ0 formed by the straight line n0 passing through the reference position O and the closest point P0 and the x-axis serving as the reference axis including the reference position O, as shown in fig. 3A, and for converting a linear component L in the image into a pattern of a plurality of superimposed curved lines PL, thereby generating the signal S1612 as a converted image in the ρ-θ space, as shown in fig. 5D, for example.
In step ST3, the extraction unit 162 performs extraction processing (mask processing) on a region whose degree of overlap of the curve patterns is larger than the previously set threshold value, based on the converted image S1612. In more detail, as described above, each pixel of the image S1612 is set to a value according to the degree of overlap of the curve patterns. Among the images displayed by the predetermined halftone, the higher the degree of overlap of the curve patterns, the whiter the displayed image. For example, the extraction unit 162 extracts a region whose degree of overlap of the curve patterns in the converted image S1612 shown in fig. 5D is larger than the previously set threshold value, generates the image S1622 shown in fig. 5F, and outputs it to the correlation value generation unit 163.
In step ST4, the CPU16 reads the registered image AIM stored in the memory 12. In step ST5, based on the registered image AIM shown for example in fig. 5A, the conversion unit 161 performs image processing for converting each point in the image into a curved pattern according to the distance ρ0 from the reference position O to the closest point P0 on a straight line L0 passing through the point and the angle θ between the straight line n0 passing through the reference position O and the closest point P0 and the x-axis as a reference axis including the reference position O, as shown in fig. 3A, thereby converting a linear component L in the image into a pattern of a plurality of superimposed curved lines PL, and generates a signal S1611 as a converted image in the ρ-θ space, as shown for example in fig. 5C.
Steps ST1 to ST5 correspond to a first step of performing image conversion processing according to a distance from a reference position of each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generating a first converted image and a second converted image in a two-dimensional space defined by the distance and the angle, of the present invention.
In step ST6, the extraction unit 162 performs extraction processing (mask processing) on an area whose degree of overlap of the curve patterns in one converted image is larger than a previously set threshold value, based on the converted image S1611. For example, the extracting unit 162 extracts a region whose degree of overlap of the curve patterns in the converted image S1611 as shown in fig. 5C is larger than the threshold value set previously, generates an image S1621 as shown in fig. 5E, for example, and outputs it to the correlation value generating unit 163.
The correlation value generating unit 163 generates the correlation values of the registered image AIM and the matched image RIM based on the degree of overlap of the patterns in the converted image S1621 and the converted image S1622 and the coincidence/non-coincidence of the patterns in the converted image S1621 and the converted image S1622. In more detail, in step ST7, the Fourier transform units 16311 and 16312 of the correlation unit 1631 perform Fourier transform processing, for example, as shown in equations (2) and (3), on the converted images S1621 and S1622, and output the processing results as signals S16311 and S16312 to the combining unit 16313.
The processing of steps ST1 to ST7 need not be in the order described above. For example, after the conversion unit 161 performs the conversion process on the registered image AIM and the matched image RIM, the extraction unit 162 may perform the extraction process on the converted image.
In step ST8, the combining unit 16313 performs the combining processing as described above based on the signals S16311 and S16312, and outputs the processing result as a signal S16313 to the phase extracting unit 16314. In step ST9, the phase extraction unit 16314 extracts only the phase component from the signal S16313 and outputs it as a signal S16314 to the inverse Fourier transform unit 16315.
In step ST10, the inverse Fourier transform unit 16315 performs inverse Fourier transform processing based on the signal S16314, and outputs it to the correlation value detection unit 1632 as a signal S1631 as illustrated in fig. 7C. The magnitude of the correlation intensity peak of the correlation intensity image S1631 shows the degree of correlation between the converted images after image conversion. When there is a parallel shift deviation between the converted images, the position of the correlation intensity peak of the correlation intensity image S1631 is shifted from the center position O by exactly the amount corresponding to the parallel shift deviation between the converted images, but this shift has no influence on the correlation intensity.
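Steps ST7 to ST10 amount to phase-only correlation: the spectra of the two converted images are combined, only the phase component is kept, and the inverse transform yields a correlation intensity image whose peak position encodes the parallel shift and whose peak height measures the correlation. A small sketch using a naive DFT follows; the tiny image values are illustrative, and a practical system would use an FFT instead.

```python
import cmath

def dft2(f):
    """Naive 2-D DFT (O(N^4), acceptable for a tiny demo image)."""
    n = len(f)
    return [[sum(f[y][x] * cmath.exp(-2j * cmath.pi * (u * y + v * x) / n)
                 for y in range(n) for x in range(n))
             for v in range(n)] for u in range(n)]

def phase_correlate(f1, f2):
    """Combine the spectra, keep only the phase component, and inverse
    transform.  The peak location gives the circular shift between f1 and
    f2; the peak height measures how well the images correlate."""
    n = len(f1)
    F1, F2 = dft2(f1), dft2(f2)
    out = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            c = F1[u][v].conjugate() * F2[u][v]
            p = c / abs(c) if abs(c) > 1e-12 else 0   # phase only
            for y in range(n):
                for x in range(n):
                    out[y][x] += (p * cmath.exp(2j * cmath.pi *
                                  (u * y + v * x) / n)).real / (n * n)
    return out

f1 = [[0, 0, 0, 0], [0, 9, 5, 0], [0, 4, 1, 0], [0, 0, 0, 0]]
# f2 is f1 circularly shifted by (dy, dx) = (2, 1).
f2 = [[f1[(y - 2) % 4][(x - 1) % 4] for x in range(4)] for y in range(4)]
g = phase_correlate(f1, f2)
peak = max((g[y][x], y, x) for y in range(4) for x in range(4))
print(peak[1], peak[2])   # -> 2 1: the peak position reveals the shift
```

The peak lands at the shift offset while its height (here 1.0, a perfect match) is unaffected by the shift, which is exactly the property exploited in step ST10.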
In step ST11, the correlation value detecting unit 1632 defines the intensity of the correlation intensity peak PP as a correlation value, and outputs the signal S163 to the matching unit 164.
In step ST12, the matching unit 164 performs matching based on the signal S163 indicating the correlation value from the correlation value detection unit 1632. In more detail, the matching unit 164 determines whether the correlation value is greater than a previously determined threshold value, and when it is determined that the correlation value is greater, outputs a matching result signal S164 indicating that the registered image AIM and the matched image RIM are identical. On the other hand, in step ST12, when it is determined that the correlation value is smaller than the previously decided threshold value, the matching unit 164 outputs a matching result signal S164 indicating that the registered image AIM and the matched image RIM do not coincide, and ends the series of processes.
Steps ST7 to ST12 correspond to a second step of performing matching processing of the first image and the second image based on correlation processing results at a plurality of different correlation positions along the first direction and a second direction orthogonal to the first direction in the first converted image and the second converted image generated in the first step of the present invention.
As described above, in this embodiment, the conversion unit 161 performs image conversion processing according to the distance from the reference position of each of the first and second images and the angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generates the first and second converted images in a two-dimensional space defined by the distance and the angle. More specifically, based on the registered image AIM and the matched image RIM, the conversion unit 161 performs image processing for converting each point in the image into a curved pattern according to the distance ρ0 from the reference position O to the closest point P0 on a straight line L0 passing through the point and the angle θ between the straight line n0 passing through the reference position O and the closest point P0 and the x-axis as a reference axis including the reference position O, thereby converting a linear component L in the image into a pattern of a plurality of superimposed curved lines PL, and generates the signals S1611 and S1612 as converted images in the ρ-θ space. The correlation value generation unit 163 generates the correlation value S163 according to the patterns in the converted images S1611 and S1612, and the matching unit 164 performs matching processing of the registered image AIM and the matched image RIM based on the correlation value S163 and outputs the matching result signal S164.
That is, the matching unit 164 performs matching according to the degree of overlap of the patterns in the converted images S1611 and S1612 generated by the conversion unit 161 and the coincidence/non-coincidence of those patterns, and thus can match the images with high accuracy. In addition, even when there are a parallel shift deviation and a rotation angle deviation between the registered image AIM and the matched image RIM, these deviations appear as parallel shift deviations between the converted images S1611 and S1612 after the image conversion processing of the present invention. In the correlation processing of the present invention, even when such a parallel shift deviation exists between the converted images S1611 and S1612, a correlation value can be generated and matching can be performed by simple processing.
For example, in the normal image matching process, after the correction process of the parallel shift deviation and the rotation angle deviation between the images to be matched is performed, it is necessary to perform a process with a large load, for example, the matching process is performed in units of pixels, but in the image matching of this embodiment, it is not necessary to perform such a correction process, and thus the matching process can be performed at a low load and at a high speed.
Fig. 10 is a functional block diagram of an image matching system according to a second embodiment of the present invention. The image matching system 1a of this embodiment performs Hough transform processing on the basis of the registered image AIM and the matched image RIM, corrects the parallel shift deviation of the converted images, generates a similarity serving as a correlation value between the converted images after the position correction, and performs matching processing between the registered image AIM and the matched image RIM on the basis of the similarity.
The image processing apparatus 1a has the same hardware composition as the functional block diagram shown in fig. 1, for example, and thus its description is omitted. In terms of software, the image processing apparatus 1a realizes the functions of the conversion unit 161, the extraction unit 162, the correlation value generation unit 163a, the position correction unit 170, and the matching unit 164a by the CPU16 executing the program PRG stored in the memory 12, for example, as shown in fig. 10. The second embodiment differs from the first in that the position correction unit 170 is added, the correlation value generation unit 163a outputs the signal S1631 used for position correction processing, and the function of the matching unit 164a differs. Components having the same functions in the first and second embodiments are denoted by the same reference numerals, and their explanations are omitted.
The correlation value generation unit 163a performs correlation processing based on the images S1621 and S1622 as shown in fig. 7A and 7B, for example, and generates a correlation intensity image G (p, q) as shown in fig. 7C, which is output as a signal S1631.
The position correction unit 170 performs position correction processing based on the signal S1631 output from the correlation value generation unit 163a, and the signals S1621 and S1622 output from the extraction unit 162, that is, performs position correction processing based on the patterns in the first converted image and the second converted image, and outputs the results of the position correction processing as a signal S1701 and a signal S1702 to the matching unit 164a.
Fig. 11A and 11B are diagrams for explaining the operation of the position correcting unit shown in fig. 10. The numerical values show the correlation peak intensities of the correlation image data on its X-Y plane. For example, when the registered image AIM and the matched image RIM include patterns of binarized linear components (linear shapes), the correlation peak intensity (also referred to as correlation intensity) has a small value as shown in figs. 11A and 11B even between images having a large correlation.
For example, the position correction unit 170 specifies N correlation values having higher correlation strengths and their correlation peak positions, that is, 8 correlation values and correlation peak positions in this embodiment, as candidates for the positional relationship in two dimensions between the registered image AIM and the matched image RIM shown in fig. 11A, for example. The position correction unit 170 performs position correction by performing a plurality of position correction processes as necessary, for example parallel movement, so that the patterns of the registered image AIM and the matched image RIM substantially coincide, based on the plurality of correlation values and the correlation peak positions corresponding thereto.
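Selecting the N strongest peaks as positional candidates and correcting by parallel movement can be sketched as follows; the helper names and the circular-shift convention are assumptions for illustration, not the patent's exact procedure.

```python
def top_n_peaks(corr, n):
    """Pick the N strongest cells of a correlation intensity image as
    candidate displacements between the two converted images."""
    cells = sorted(((corr[y][x], y, x) for y in range(len(corr))
                    for x in range(len(corr[0]))), reverse=True)
    return [(y, x) for _, y, x in cells[:n]]

def shift_image(img, dy, dx):
    """Circularly translate an image by (dy, dx) to align the patterns."""
    h, w = len(img), len(img[0])
    return [[img[(y - dy) % h][(x - dx) % w] for x in range(w)]
            for y in range(h)]

corr = [[0.1, 0.0, 0.2],
        [0.0, 0.9, 0.0],
        [0.4, 0.0, 0.3]]
print(top_n_peaks(corr, 3))   # -> [(1, 1), (2, 0), (2, 2)]
```

Each candidate offset would then be applied with `shift_image` before the similarity between the aligned converted images is evaluated.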
The matching unit 164a generates a correlation value based on the patterns in the two converted images, and performs matching processing between the registered image AIM and the matched image RIM based on the generated correlation value and a previously set threshold value. In addition, the matching unit 164a performs the matching processing based on the sum of the correlation values corresponding to the different positions and a previously set threshold value, according to the results of the plurality of position correction processes.
In more detail, the matching unit 164a has a similarity generating unit 1641, a decision unit 1642, and a summing unit 1643. Fig. 12A and 12B are diagrams for explaining the operation of the similarity generating unit 1641. The similarity generating unit 1641 performs comparison processing on each of a plurality of different positional relationships in the first converted image and the second converted image, based on the signals S1701 and S1702 as shown for example in figs. 12A and 12B, and generates a similarity as a correlation value based on the result of the comparison processing.
For example, when the two images are f1 (m, n) and f2 (m, n), the similarity generating unit 1641 calculates the similarity Sim by, for example, formula (8) and outputs the calculation result as S1641.
Fig. 13A to 13C are diagrams for explaining the operation of the similarity generating unit shown in fig. 10. When generating the similarity of two images including linear components (also referred to as linear shapes) as shown in fig. 13A and 13B, for example, the similarity generating unit 1641 generates the similarity in accordance with the number of intersections CP of the two images as shown in fig. 13C. For simplicity of explanation, the linear component is represented by a black pixel having a bit value of "1", and the other components are represented by a white pixel having a bit value of "0".
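Formula (8) is not reproduced in this excerpt; a conventional choice consistent with the intersection-count behaviour described above is the normalized cross-correlation, whose numerator for binary (0/1) line images is exactly the number of crossing pixels CP. The definition below is therefore an assumption for illustration.

```python
import math

def similarity(f1, f2):
    """Normalized cross-correlation of two equal-size images.  For binary
    line images the numerator counts the pixels where both images are 1,
    i.e. the intersection count CP described in the text.  (Assumed form
    of formula (8), which is not reproduced in this excerpt.)"""
    num = sum(a * b for r1, r2 in zip(f1, f2) for a, b in zip(r1, r2))
    d1 = sum(a * a for r in f1 for a in r)
    d2 = sum(b * b for r in f2 for b in r)
    return num / math.sqrt(d1 * d2) if d1 and d2 else 0.0

line_a = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # diagonal line
line_b = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]   # anti-diagonal line
print(similarity(line_a, line_a))   # -> 1.0 (identical lines)
print(similarity(line_a, line_b))   # crossing lines share one pixel -> 1/3
```

Two identical lines give a similarity of 1, while two lines that merely cross share a single intersection pixel and score much lower.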
The summing unit 1643 sums up the similarity Sim based on the signal S1641 and outputs the summed result to the decision unit 1642 as the signal S1643. The decision unit 1642 matches the registered image AIM and the matched image RIM based on the signal S1641 indicating the similarity generated by the similarity generating unit 1641. For example, when the similarity is larger than a predetermined value, the decision unit 1642 determines that the registered image AIM and the matched image RIM coincide. In addition, when the summed value of the similarity Sim indicated by the signal S1643 from the summing unit 1643 is larger than a predetermined threshold value, the decision unit 1642 determines that the registered image AIM and the matched image RIM coincide.
Fig. 14 is a diagram for explaining the operation of the image matching system of the second embodiment shown in fig. 10. The operation of the image matching system is explained with reference to fig. 14, mainly in terms of the operation of the CPU. The same operations as in the first embodiment are given the same reference numerals, and their explanations are omitted. The processing of steps ST1 to ST10 is the same as that of the first embodiment, and its description is therefore omitted. Step ST2 and step ST6 correspond to the fourth step and the fourth procedure of the present invention. In step ST111, the position correction unit 170 performs position correction processing based on the correlation intensity image G(p, q) output as the signal S1631 from the correlation value generation unit 163a and the signals S1621 and S1622 output from the extraction unit 162, that is, based on the patterns in the first converted image and the second converted image, and outputs the position correction processing results to the matching unit 164a as the signal S1701 and the signal S1702. Step ST111 corresponds to the third step and the third procedure of the present invention.
In more detail, based on the signal S1631, the position correction unit 170 specifies (selects) the N candidates Pi (P0, P1, ..., PN-1) having the higher correlation values and their correlation peak positions, for example 8 in this embodiment as shown in fig. 11A, as candidates for the positional relationship in the two-dimensional space between the registered image AIM and the matched image RIM.
In step ST112, the summing unit 1643 initializes the variables for summation. For example, it initializes the variable i to 0 and the summed value S to 0. In step ST113, the position correction unit 170 performs position correction processing of the registered image AIM and the matched image RIM based on, for example, the coordinates of each candidate Pi and the amount of deviation from the center of the correlation image data corresponding thereto.
In step ST114, the similarity Sim (i) is calculated by the similarity generating unit 1641 and output to the summing unit 1643 and the decision unit 1642.
The decision unit 1642 compares the similarity Sim(i) with the previously set first threshold th1. When the similarity Sim(i) is smaller than the first threshold th1 (step ST115), the summing unit 1643 sums up the similarity Sim(i), in more detail by the formula S = S + Sim(i), and outputs the result to the decision unit 1642 (ST116). In step ST117, the decision unit 1642 compares the summed value S with the previously set second threshold th2. When the summed value S is smaller than the second threshold th2, it compares the variable i with the value N-1 (ST118). When the variable i does not coincide with the value N-1, it adds 1 to the variable i (ST119) and returns to the processing of step ST113. When the variable i coincides with the value N-1 in step ST118, it determines that the images do not coincide (ST120).
On the other hand, in the comparison processing in step ST115, when the similarity Sim(i) is larger than the first threshold th1, the decision unit 1642 determines that the images coincide. In the comparison processing in step ST117, when the summed value S is larger than the second threshold th2, the decision unit 1642 determines that the images coincide (ST121). When the image matching system of this embodiment is used for, for example, a blood vessel pattern matching system in the security field, the operation processing unit 17 performs an operation such as unlocking an electronic lock.
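The decision procedure of steps ST112 to ST121 can be sketched as a two-threshold loop over the candidate positions; the threshold values below are illustrative, not taken from the patent.

```python
def match_with_candidates(similarities, th1, th2):
    """Walk the N candidate positions in order: an individual similarity
    above th1 is an immediate match; otherwise the similarities are summed
    and compared against th2; if neither threshold is reached after all
    candidates, the images do not match."""
    s = 0.0
    for sim in similarities:
        if sim > th1:
            return True          # one strong candidate suffices   (ST115)
        s += sim                 # S = S + Sim(i)                   (ST116)
        if s > th2:
            return True          # accumulated evidence suffices    (ST117)
    return False                 # all N candidates exhausted       (ST120)

print(match_with_candidates([0.9], th1=0.8, th2=2.0))             # -> True
print(match_with_candidates([0.3, 0.4, 0.5], th1=0.8, th2=1.0))   # -> True
print(match_with_candidates([0.1, 0.1, 0.1], th1=0.8, th2=1.0))   # -> False
```

The second call shows the point of the summation: no single candidate clears th1, yet the accumulated similarity over several candidate positions still identifies a match.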
As described above, in this embodiment, the position correction unit 170 generates a plurality of candidates indicating correction positions and performs a plurality of position correction processes on the registered image AIM and the matched image RIM according to the generated candidates, and the decision unit 1642 performs matching processing based on the summed value of the similarities serving as correlation values according to the patterns in the converted images. Therefore, even when the correlation between the two image data to be compared is small, matching can be performed with high accuracy by summing up the similarities calculated for each positional relationship of the plurality of candidates, compared with the case where matching is performed by a single similarity alone.
In addition, since it is determined that the converted images match as soon as the similarity Sim is larger than the first threshold th1, the matching processing can be performed at high speed.
Note that the present invention is not limited to this embodiment. Various preferred modifications may be made. For example, in this embodiment, the similarity degree generation unit calculates the similarity degree by the formula (8), but the present invention is not limited thereto. For example, the similarity generating unit may perform the similarity calculation process suitable for the correlation of linear components (linear patterns).
In addition, the first threshold th1 and the second threshold th2 are fixed values, but the present invention is not limited thereto. For example, it is possible to perform matching with higher accuracy by causing each threshold to vary according to the image pattern.
It is also possible that the image matching system 1b of the third embodiment of the present invention stores a plurality of images as registered images or matched images, performs correlation processing between converted images having a small resolution (i.e., having a small image size) first when performing matching processing of the images, and performs matching processing between images having a normal resolution (i.e., a normal image size) according to the first embodiment or the second embodiment based on the correlation processing result.
In more detail, the image matching system 1b of this embodiment performs image conversion processing based on the distance from a reference position in each of the first and second images and the angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generates third and fourth converted images whose resolution is lower than that of the first and second converted images in a two-dimensional space defined by the distance and the angle. It determines whether to perform the high-resolution (normal resolution) correlation processing and matching processing based on correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the generated third and fourth converted images; when the correlation value is lower than a predetermined threshold, it shelves the matching processing for that image and performs matching processing on another image, and it continues with high-resolution matching processing only for images whose correlation value is higher than the threshold. The functional block diagram of the image matching system 1b of this embodiment has the same composition as that of the image matching system of the first embodiment, and its explanation is therefore omitted.
Fig. 15 is a diagram for explaining the operation of the image matching system of the third embodiment of the present invention. The size of the ρ-θ plane (also referred to as a parameter space) generated by the image conversion processing of this embodiment determines how finely the straight lines on the X-Y plane are segmented as parameters. The larger the size of the parameter space, the finer the segmentation of the straight lines and the correspondingly higher the resolution. For example, the conversion unit 161 performs image conversion processing with a high-resolution parameter space size (for example, 180 × 180 pixels) based on the image vb1 including straight lines having a rotation angle deviation as shown in fig. 15A, and generates an image vb2 shown in fig. 15B. In addition, the conversion unit 161 generates an image vb3 indicating the result of image conversion processing with a low-resolution parameter space size (for example, 30 × 30 pixels) based on the image vb1 including straight lines having a rotation angle deviation as shown in fig. 15A.
When comparing the image vb2 and the image vb3, in the high-resolution image vb2 shown in fig. 15B, for example, straight lines having an angular deviation before image conversion are classified into different θ parameters (θ1, θ2), whereas in the low-resolution image vb3 shown in fig. 15C they are classified into the same θ parameter (θ3). The processing speed of the matching processing between images after the image conversion processing of the present invention depends on the size of the parameter space. In more detail, the larger the size of the parameter space, that is, the higher the resolution, the longer the processing time and the larger the processing load; the smaller the size of the parameter space, that is, the lower the resolution, the shorter the processing time and the smaller the processing load. In the image matching system of this embodiment, when matching is performed between the input matched image RIM and a plurality of registered images AIM stored in the memory 12, the correlation values are first calculated in a low-resolution parameter space, and images having low correlation values are thereby excluded from the coincidence candidates, which shortens the time taken for the entire matching process.
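The resolution effect described above can be illustrated by quantizing line angles into the θ bins of the parameter space; the specific angles and bin counts below are chosen only for illustration.

```python
def theta_bin(theta_deg, n_bins):
    """Quantize a line angle in [0, 180) degrees into one of n_bins
    theta cells of the rho-theta parameter space."""
    return int(theta_deg * n_bins / 180.0)

# Two nearly parallel lines, 2 degrees apart.
a, b = 44.0, 46.0
print(theta_bin(a, 180), theta_bin(b, 180))   # 44 46: distinct at high resolution
print(theta_bin(a, 30), theta_bin(b, 30))     # 7 7: merged at low resolution
```

At 180 bins the two lines occupy distinct θ parameters (the θ1, θ2 case of image vb2), while at 30 bins they collapse into one θ parameter (the θ3 case of image vb3), which is why the coarse space is cheaper but less discriminative.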
Fig. 16 is a flowchart for explaining the operation of the image matching system of the third embodiment of the present invention. By referring to fig. 16, differences from the first embodiment and the second embodiment are explained mainly in terms of the operation of the CPU of the image matching system.
For example, a plurality of registered images AIM are previously input from the image input section 11 and stored in the memory 12. In step ST201, the matched image RIM is input from the image input section 11 and stored in the memory 12.
Before performing a matching process of high resolution (normal resolution), an image conversion process is performed based on a first image and a second image in accordance with a distance from a reference position in each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and third and fourth converted images having a lower resolution than the first and second converted images are generated in a two-dimensional space defined by the distance and the angle, and whether to perform the high resolution conversion process and the matching process is determined based on correlation process results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the generated third and fourth converted images.
Specifically, in step ST202, a low-resolution parameter space is set. In step ST203, based on the matched image RIM, the conversion unit 161 performs image processing for converting each point in the image into a curved pattern according to the distance ρ0 from the reference position O to the closest point P0 on a straight line L0 passing through the point and the angle θ between the straight line n0 passing through the reference position O and the closest point P0 and the x-axis as a reference axis including the reference position O, as shown in fig. 3A, thereby converting a linear component L in the image into a pattern of a plurality of superimposed curved lines PL, and generates a signal S1612 as a converted image in the ρ-θ space.
In step ST204, based on the converted image S1612, the extraction unit 162 performs extraction processing (masking processing) on the region whose degree of overlap of the curved patterns in the converted image is larger than the previously set threshold value. In more detail, as described above, a value is set in each pixel of the image S1612 according to the degree of overlap of the curved patterns; when the image is displayed in a predetermined halftone, the higher the degree of overlap of the curved patterns, the whiter the displayed region. For example, the extraction unit 162 extracts the region whose degree of overlap of the curved patterns in the converted image S1612 is larger than the previously set threshold value, generates an image S1622, and outputs it to the correlation value generation unit 163.
In step ST205, the CPU16 reads the registered image AIM stored in the memory 12. In step ST206, the conversion unit 161 performs image processing for converting a point in the image into a curved pattern according to a distance ρ 0 from the reference position O to the closest point P0 on the straight line L0 passing through the point in the image and an angle θ between the straight line n0 passing through the reference position O and the closest point P0 and the x-axis as a reference axis including the reference position O as illustrated in fig. 3A based on the registered image AIM, and converting the linear component L in the image into a pattern of a plurality of superimposed curved lines PL, generating a signal S1611 as a converted image in ρ - θ space.
In step ST207, the extraction unit 162 performs extraction processing (mask processing) on an area whose degree of overlap of the curve patterns in one converted image is larger than a previously set threshold value, based on the converted image S1611. For example, the extraction unit 162 extracts an area whose degree of overlap of the curve patterns in the converted image S1611 is larger than a previously set threshold value, generates an image S1621, and outputs it to the correlation value generation unit 163.
The correlation value generation unit 163 generates a correlation value between the registered image AIM and the matched image RIM based on the degree of overlap of the patterns in the converted image S1621 and the converted image S1622 and the coincidence/non-coincidence of the patterns in the converted image S1621 and the converted image S1622. In more detail, in step ST208, the Fourier transform units 16311 and 16312 of the correlation unit 1631 perform Fourier transform processing, for example, as shown in equations (2) and (3), on the converted images S1621 and S1622, and output the processing results as signals S16311 and S16312 to the combining unit 16313.
The processing of steps ST201-ST208 need not be in the order described above. For example, after the conversion unit 161 performs the conversion processing on the registered image AIM and the matched image RIM, the extraction unit 162 may perform extraction processing (mask processing) on the converted image.
In step ST209, the combining unit 16313 performs the combining processing as described above based on the signals S16311 and S16312, and outputs the processing result as a signal S16313 to the phase extraction unit 16314. In step ST210, the phase extraction unit 16314 extracts only the phase component from the signal S16313 and outputs it as a signal S16314 to the inverse Fourier transform unit 16315.
In step ST211, the inverse Fourier transform unit 16315 performs inverse Fourier transform processing based on the signal S16314, and outputs it as a signal S1631 to the correlation value detection unit 1632, as illustrated in fig. 7C. The magnitude of the correlation intensity peak of the correlation intensity image S1631 shows the degree of correlation between the converted images after image conversion. When there is a parallel shift deviation between the converted images, the position of the correlation intensity peak of the correlation intensity image S1631 is shifted from the center position O by exactly the amount corresponding to the parallel shift deviation between the converted images, but this shift has no influence on the correlation intensity.
In step ST212, the correlation value detecting unit 1632 defines the intensity of the correlation intensity peak PP as a correlation value, and outputs the signal S163 to the matching unit 164.
In step ST213, the matching unit 164 performs matching based on the signal S163 indicating the correlation value from the correlation value detection unit 1632. In more detail, the matching unit 164 decides whether or not the correlation value is larger than the previously determined threshold value, and when it determines that the correlation value is smaller, it shelves the matching of that registered image AIM, reads another registered image AIM from the memory 12, and returns to the processing of step ST206.
On the other hand, when determining in step ST213 that the correlation value is larger than the previously decided threshold value, the matching unit 164 determines the registered image AIM to be a coincidence candidate for the matched image RIM, and sets a high-resolution parameter space.
Subsequently, the same processing as in steps ST203 to ST212 is performed on the image in the high-resolution parameter space (steps ST216 to ST224).
In step ST225, the matching unit 164 performs matching based on the signal S163 indicating the correlation value from the correlation value detection unit 1632. In more detail, the matching unit 164 determines whether the correlation value is greater than a previously determined threshold value. When it is determined that the correlation value is larger, it outputs a result signal S164 indicating that the registered image AIM and the matched image RIM coincide (step ST226). On the other hand, when it is determined that the correlation value is smaller than the threshold value previously set in step ST225, the matching unit 164 outputs a result signal S164 indicating that the registered image AIM and the matched image RIM do not coincide (step ST227), reads another registered image AIM from the memory 12 (step ST228), sets a low resolution (step ST229), and returns to the processing of step ST206.
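The overall flow of steps ST206 to ST229 — screen each registered image AIM at low resolution, then confirm surviving candidates at high resolution — can be sketched as follows. This is a hypothetical outline: the function name, the `convert`/`correlate` callables, and the threshold values are illustrative placeholders, not the patent's implementation.

```python
def match_registered_images(matched_image, registered_images,
                            convert, correlate,
                            th_low=0.3, th_high=0.6):
    """Return the first registered image judged to coincide, else None.

    convert(img, resolution) stands in for the parameter-space conversion;
    correlate(a, b) stands in for the correlation value detection (S163).
    """
    rim_low = convert(matched_image, "low")
    rim_high = convert(matched_image, "high")
    for aim in registered_images:
        # Low-resolution screening pass (cf. steps ST206 to ST213).
        if correlate(convert(aim, "low"), rim_low) <= th_low:
            continue                     # shelve this AIM, try the next one
        # High-resolution confirmation pass (cf. steps ST216 to ST225).
        if correlate(convert(aim, "high"), rim_high) > th_high:
            return aim                   # result signal S164: images coincide
    return None                          # no registered image coincides
```

The cheap low-resolution pass rejects most non-matching registered images early, so the expensive high-resolution correlation runs only for promising candidates.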
As described above, in this embodiment, before the high-resolution (normal-resolution) matching processing, image conversion processing is performed on the first image and the second image in accordance with the distance from a reference position in each image and the angle formed between a straight line passing through the reference position and a reference axis including the reference position, and a third converted image and a fourth converted image, whose resolution is lower than that of the first and second converted images, are generated in the two-dimensional space defined by that distance and angle. Whether to perform the high-resolution conversion processing and matching processing is then decided based on the results of correlation processing at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the generated third and fourth converted images. When the correlation value is low, matching of that image is shelved and the matching processing of another image is performed, so the processing time of the overall matching processing can be shortened. Furthermore, since the low-resolution image processing is performed first, the processing load is reduced.
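The image conversion described here — mapping each point into a curve over distance ρ and angle θ so that collinear points yield overlapping curves — is in essence a Hough transform. A minimal sketch under that reading (the bin counts, the centered reference position, and the accumulator layout are illustrative choices, not the patent's):

```python
import numpy as np

def to_parameter_space(image, n_rho=64, n_theta=64):
    h, w = image.shape
    cy, cx = h / 2.0, w / 2.0                 # reference position O
    rho_max = np.hypot(cy, cx)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_rho, n_theta))
    for y, x in zip(*np.nonzero(image)):
        # Each point maps to the curve rho = x'cos(theta) + y'sin(theta).
        rho = (x - cx) * np.cos(thetas) + (y - cy) * np.sin(thetas)
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        acc[idx, np.arange(n_theta)] += 1     # curves of collinear points overlap
    return acc

img = np.zeros((32, 32)); img[:, 16] = 1.0    # one vertical linear component
acc = to_parameter_space(img)                  # all 32 points vote into one bin
```

A straight line in the original image then becomes a single peak in the (ρ, θ) accumulator, which is what makes correlation of the converted images sensitive to shared linear components while tolerating translation of those lines.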
Note that the present invention is not limited to this embodiment; any suitable modifications may be made. For example, the matching time can be further shortened by performing the low-resolution image conversion processing on the registered images AIM in advance and performing the matching processing between those converted images.
For example, the present invention may be used in security-related applications for matching two blood vessel images, fingerprint images, still images, or moving images based on linear components in the images.
Summarizing the effects of the present invention, according to the present invention, an image matching method capable of matching images with high accuracy, and a program and an image matching system therefor can be provided.
While the invention has been described with reference to specific embodiments chosen for purposes of illustration, it will be understood by those skilled in the art that various changes may be made therein without departing from the basic concept and scope of the invention.

Claims (13)

1. An image matching method for matching a first image and a second image, comprising:
a first step of performing image conversion processing according to a distance from a reference position in each of the first and second images and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generating first and second converted images in a two-dimensional space defined by the distance and the angle; and
a second step of performing matching processing of the first image to the second image according to correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the first and second converted images generated in the first step, wherein the first direction is a distance axis direction ρ and the second direction is an angle axis direction θ.
2. The image matching method according to claim 1, wherein in the first step, the method performs the image conversion process to generate the first converted image and the second converted image by converting the point in each image into a curved line pattern according to a distance from a reference position to a closest point on a straight line passing through the point in the image and an angle formed between the straight line passing through the reference position and the closest point and a reference axis including the reference position, and converting a linear component in each image into a plurality of patterns of overlapping curves.
3. The image matching method according to claim 1, wherein in the second step, the first and second converted images generated in the first step are Fourier-transformed along said first direction and said second direction, and the first image and the second image are matched based on correlation values corresponding to the phase components of the resulting Fourier transform processing values.
4. The image matching method according to claim 2, further comprising a third step of performing position correction based on the patterns in the first and second converted images generated in the first step,
wherein in the second step, matching of the first and second images is performed based on the degree of overlapping of the patterns in the first and second converted images, the position of which has been corrected in the third step, and the coincidence or non-coincidence of the patterns in the first and second converted images.
5. The image matching method of claim 4, further comprising a fourth step of extracting a region of the first and second converted images, the region having a degree of overlap of the curve patterns in one converted image greater than a predetermined threshold;
wherein in the third step, position correction is performed based on the patterns in the extracted regions of the first and second converted images in the fourth step; and
wherein in the second step, image matching is performed based on coincidence or non-coincidence of the patterns in the extracted regions of the first and second converted images whose positions are corrected in the third step.
6. The image matching method according to claim 4, wherein in the second step,
a plurality of different positional relationships in the first and second converted images generated in the first step are compared,
generating a similarity as a correlation value from the comparison result, and
performing image matching according to the generated similarity.
7. The image matching method of claim 1, further comprising an additional step for judging whether the process of the first step is performed or not before performing the first step,
in the course of this additional step of the process,
performing image conversion processing on the first and second images based on a distance from a reference position in the first and second images and an angle between a line passing through the reference position and a reference axis including the reference position;
generating third and fourth converted images having a lower resolution than the first and second converted images in a two-dimensional space defined by the distance and the angle; and
the determination is made based on the results of correlation processing at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the third and fourth converted images, the first direction being a distance axis direction ρ and the second direction being an angle axis direction θ.
8. An image matching system for matching a first image and a second image, comprising:
conversion means for performing image conversion processing in accordance with a distance from a reference position in each of the first image and the second image and an angle formed by a straight line passing through the reference position and a reference axis including the reference position, and generating a first converted image and a second converted image in a two-dimensional space defined by the distance and the angle; and
matching means for performing matching processing of the first image to the second image on the basis of correlation processing results at a plurality of different relative positions along a first direction and a second direction orthogonal to the first direction in the first converted image and the second converted image generated in the conversion means, wherein the first direction is a distance axis direction ρ and the second direction is an angle axis direction θ.
9. The image matching system of claim 8, wherein in the conversion means, the image conversion processing is performed to generate the first converted image and the second converted image by converting the points in each image into a curved line pattern and converting the linear components in each image into a plurality of patterns of overlapping curves based on a distance from the reference position to a closest point on a straight line passing through the points in the images and an angle formed between the straight line passing through the reference position and the closest point and a reference axis including the reference position.
10. The image matching system of claim 8, wherein the matching means Fourier-transforms the first and second converted images generated in said conversion means along said first direction and said second direction, and matches the first image and the second image based on correlation values corresponding to the phase components of the resulting Fourier transform processing values.
11. The image matching system of claim 9, further comprising position correction means for performing position correction based on patterns in the first and second converted images generated in the conversion means, and
wherein the matching means performs matching of the first and second images based on the degree of overlapping of the patterns in the first and second converted images after the position correction and coincidence or non-coincidence of the patterns in the first and second converted images.
12. The image matching system of claim 11, further comprising extracting means for extracting an area of the first and second converted images, the area having a degree of overlap of the curve pattern in one converted image greater than a predetermined threshold;
wherein the position correcting means performs position correction based on the patterns of the first and second converted images in the region extracted in the extracting means; and
wherein the matching means performs image matching based on coincidence or non-coincidence of the patterns in the extracted regions of the first and second converted images whose positions are corrected in the position correcting means.
13. The image matching system of claim 11, wherein the matching means,
a plurality of different positional relationships in the first and second converted images generated in the conversion means are compared,
generating a similarity as a correlation value from the comparison results, and
performing image matching according to the generated similarity.
HK05105836.2A 2003-10-07 2005-07-11 Image matching method and image matching system HK1073372B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2003348293A JP4345426B2 (en) 2003-10-07 2003-10-07 Image collation method, program, and image collation apparatus
JP2003-348293 2003-10-07

Publications (2)

Publication Number Publication Date
HK1073372A1 HK1073372A1 (en) 2005-09-30
HK1073372B true HK1073372B (en) 2008-08-08

Similar Documents

Publication Publication Date Title
CN100362524C (en) Image matching method and image matching system
US7136505B2 (en) Generating a curve matching mapping operator by analyzing objects of interest and background information
US7778467B2 (en) Image matching system and image matching method and program
JP4428067B2 (en) Image collation apparatus, program, and image collation method
CN100371953C (en) Image matching method and image matching device
US7139432B2 (en) Image pattern matching utilizing discrete curve matching with a mapping operator
US7171048B2 (en) Pattern matching system utilizing discrete curve matching with a mapping operator
HK1073372B (en) Image matching method and image matching system
US20030194133A1 (en) Pattern matching utilizing discrete curve matching with multiple mapping operators
US7120301B2 (en) Efficient re-sampling of discrete curves
JP5045763B2 (en) Biometric authentication device, biometric authentication method, blood vessel information storage device, and blood vessel information storage method
JP4161942B2 (en) Image collation method, image collation apparatus, and program
JP4380205B2 (en) Image collation device and image collation method
HK1092259B (en) Image matching method and image matching apparatus