
CN111931811B - Calculation method based on super-pixel image similarity - Google Patents

Calculation method based on super-pixel image similarity

Info

Publication number
CN111931811B
CN111931811B (application CN202010607158.0A)
Authority
CN
China
Prior art keywords
pixel
image
similarity
super
key frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010607158.0A
Other languages
Chinese (zh)
Other versions
CN111931811A (en)
Inventor
王卫
李珍珍
王梅云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Jusha Display Technology Co Ltd
Nanjing Jusha Medical Technology Co Ltd
Original Assignee
Nanjing Jusha Display Technology Co Ltd
Nanjing Jusha Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Jusha Display Technology Co Ltd, Nanjing Jusha Medical Technology Co Ltd filed Critical Nanjing Jusha Display Technology Co Ltd
Priority to CN202010607158.0A priority Critical patent/CN111931811B/en
Publication of CN111931811A publication Critical patent/CN111931811A/en
Priority to PCT/CN2021/098184 priority patent/WO2022001571A1/en
Application granted granted Critical
Publication of CN111931811B publication Critical patent/CN111931811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a calculation method based on super-pixel image similarity. The method comprises the following steps: selecting a key frame image from the image data through a network model; dividing the key frame image into a super-pixel segmentation map of different pixel blocks by an image segmentation algorithm; horizontally flipping the key frame image; extracting the segmentation boundary from the super-pixel segmentation map; superimposing the segmentation boundary on the horizontally flipped key frame image to obtain a superimposed image; and comparing corresponding pixel blocks in the super-pixel segmentation map and the superimposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks. The invention automatically screens key frame images from the acquired image video stream and identifies them, thereby realizing intelligent computer-aided diagnosis.

Description

Calculation method based on super-pixel image similarity
Technical Field
The invention belongs to the field of image recognition, and particularly relates to a calculation method based on super-pixel image similarity.
Background
In recent years, the application of deep learning in the field of image processing has developed rapidly, but the classification and identification of images remains an important challenge for deep learning. With ever-increasing data volumes, the need for reliable computer-aided diagnosis methods keeps growing, and there is currently extensive research by relevant scholars into applying artificial intelligence to computer-aided diagnosis. A computer-aided diagnosis method extracts effective features from image data of one or several modalities and classifies and identifies the extracted feature samples with machine learning methods. However, current computer-aided diagnosis imaging methods have the following problems: (1) the network model is trained only for a certain part or a specific part and is not applicable to other parts; (2) the key frame and the region of interest are selected manually, which offers little convenience.
Disclosure of Invention
Aiming at the problems existing in the prior art, the invention provides a calculation method based on super-pixel image similarity, which can judge the similarity of the pixel blocks obtained by segmenting a selected key frame image, so as to solve the problems of low practicality and low precision in the prior art.
In order to solve the technical problems, the invention adopts the following technical scheme:
a calculation method based on super-pixel image similarity comprises the following steps:
selecting a key frame image from the image data through a network model;
dividing the key frame image into a super-pixel segmentation map of different pixel blocks by an image segmentation algorithm;
horizontally flipping the key frame image;
extracting a segmentation boundary from the super-pixel segmentation map;
superimposing the segmentation boundary on the horizontally flipped key frame image to obtain a superimposed image;
and comparing corresponding pixel blocks in the super-pixel segmentation map and the superimposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
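A minimal sketch of the flip-and-superimpose step above, using NumPy; the helper name `flip_and_overlay` and the convention of marking boundary pixels as 0 are illustrative assumptions, not part of the patent text:

```python
import numpy as np

def flip_and_overlay(key_frame, boundary_mask):
    """Flip the key frame horizontally, then draw the super-pixel
    segmentation boundary onto the flipped frame (boundary pixels
    are set to 0 here purely as a marker convention)."""
    flipped = np.fliplr(key_frame)   # horizontal flip
    overlaid = flipped.copy()
    overlaid[boundary_mask] = 0      # superimpose the segmentation boundary
    return overlaid
```

The flipped frame and the original then carry the same segmentation blocks, so each block can be compared with its horizontal mirror.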
Further, the training method of the network model is as follows:
selecting a key frame image of a corresponding part according to the part to be identified;
constructing a training set of the key frame images;
constructing a test set and a verification set that do not overlap with the training set;
and training, according to the training set and the verification set, the network model with the highest accuracy in identifying the key frame images on the test set.
Further, the segmentation process of the key frame image is as follows:
converting the key frame image into a feature vector;
constructing a distance measurement standard according to the feature vector;
clustering local image pixels according to the distance measurement standard;
and iteratively optimizing the clusters until the difference between each pixel point and its cluster center no longer changes, obtaining a super-pixel segmentation map of different pixel blocks.
Further, the method also includes calculating color distance and spatial distance of the searched pixel points according to the distance metric.
Further, the color distance and the spatial distance are calculated as follows:
d_c = √((l_j − l_i)² + (a_j − a_i)² + (b_j − b_i)²),
d_s = √((x_j − x_i)² + (y_j − y_i)²),
D_i = √(d_c² + (d_s / S)² · m²),
where i and j represent key frame image x and key frame image y, l, a and b are the feature vector components in the Lab color space, x and y are the coordinates in the XY plane, d_c and d_s denote the color distance and the spatial distance respectively, D_i is the distance between the pixel point and the seed point, m is a constant, and S is the maximum spatial distance.
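As a check on the distance definitions above, a small sketch (the function name and the default values of m and S are illustrative, not from the patent):

```python
import math

def slic_distance(p_i, p_j, m=10.0, S=20.0):
    """Combined SLIC-style distance for 5-D features (l, a, b, x, y):
    d_c is the color distance in Lab space, d_s the spatial distance,
    and D_i folds them together with constant m and maximum spatial
    distance S."""
    li, ai, bi, xi, yi = p_i
    lj, aj, bj, xj, yj = p_j
    d_c = math.sqrt((lj - li) ** 2 + (aj - ai) ** 2 + (bj - bi) ** 2)
    d_s = math.sqrt((xj - xi) ** 2 + (yj - yi) ** 2)
    D_i = math.sqrt(d_c ** 2 + (d_s / S) ** 2 * m ** 2)
    return d_c, d_s, D_i
```

With m = S, color and spatial differences contribute on the same scale; smaller m makes the clusters track color more and the grid less.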
Further, the method for obtaining the similarity comprises the following steps:
converting the super-pixel segmentation map and the superimposed image into a YIQ color space image;
calculating PC values of the two color space images;
computing the similarity between each pair of corresponding points on the images from the PC values;
and weighting the similarity measure to obtain the similarity of each pair of super pixel blocks.
Further, the similarity is calculated as follows:
S_PC(x, y) = (2·Pc(x)·Pc(y) + T_1) / (Pc(x)² + Pc(y)² + T_1),
S_G(x, y) = (2·G(x)·G(y) + T_2) / (G(x)² + G(y)² + T_2),
where Pc denotes the phase consistency information of the images x and y, G denotes the gradient magnitude, S_PC(x, y) is the feature similarity of the two images x and y, S_G(x, y) is the gradient similarity of the two images x and y, and T_1 and T_2 are constants.
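The two per-point similarity terms can be sketched directly; the default values T1 = 0.85 and T2 = 160 are borrowed from the FSIM literature as placeholders, since the text only says the constants exist:

```python
def s_pc(pc_x, pc_y, T1=0.85):
    """Feature (phase consistency) similarity of two images at a point."""
    return (2 * pc_x * pc_y + T1) / (pc_x ** 2 + pc_y ** 2 + T1)

def s_g(g_x, g_y, T2=160.0):
    """Gradient-magnitude similarity of two images at a point."""
    return (2 * g_x * g_y + T2) / (g_x ** 2 + g_y ** 2 + T2)
```

Both terms equal 1 exactly when the two values agree and fall toward 0 as they diverge, which is why the constants only need to keep the denominator nonzero.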
Further, the overall similarity is calculated as follows:
FSIM = Σ_Ω [S_PC(x, y)]^α · [S_G(x, y)]^β · Pc_n(x, y) / Σ_Ω Pc_n(x, y),
Pc_n(x, y) = max[Pc(x), Pc(y)],
where x and y are key frame image x and key frame image y, Ω denotes the entire spatial domain, S_PC(x, y) is the feature similarity of the two images x and y, S_G(x, y) is the gradient similarity of the two images x and y, α and β are positive integers, Pc denotes the phase consistency information of the images x and y, and n indexes each pair of super pixel blocks.
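The weighted aggregation over Ω can be sketched with NumPy arrays standing in for the per-pixel maps (the function name and array interface are illustrative assumptions):

```python
import numpy as np

def fsim(s_pc_map, s_g_map, pc_x, pc_y, alpha=1, beta=1):
    """Aggregate per-pixel similarities over the whole domain,
    weighted at each pixel by Pc_n = max(Pc(x), Pc(y))."""
    s = (s_pc_map ** alpha) * (s_g_map ** beta)  # combined per-pixel score
    pc_n = np.maximum(pc_x, pc_y)                # weight: stronger phase info
    return float((s * pc_n).sum() / pc_n.sum())
```

Pixels with strong phase consistency in either image dominate the score, so structurally salient regions count more than flat background.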
Further, the output of the network model uses a Softmax function, calculated as follows:
P_i = e^{z_i} / Σ_{j=1}^{k} e^{z_j},
where e is the natural constant, j indexes the classes, k is the total number of classes, z_i is the i-th component of the k-dimensional vector, and P_i is the probability that the image is predicted as the i-th class.
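A minimal Softmax sketch matching the definition above (subtracting max(z) is a standard numerical-stability trick, not part of the patent text, and does not change the result):

```python
import numpy as np

def softmax(z):
    """P_i = e^{z_i} / sum_j e^{z_j}, computed on the shifted logits
    z - max(z) so that the exponentials cannot overflow."""
    e = np.exp(np.asarray(z, dtype=float) - np.max(z))
    return e / e.sum()
```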
A computing system based on super-pixel image similarity, the system comprising:
a screening module, used for selecting a key frame image from the image data through a network model;
a segmentation module, used for dividing the key frame image into a super-pixel segmentation map of different pixel blocks through an image segmentation algorithm;
a flipping module, used for horizontally flipping the key frame image;
an extraction module, used for extracting a segmentation boundary from the super-pixel segmentation map;
a superposition module, used for superimposing the segmentation boundary on the horizontally flipped key frame image to obtain a superimposed image;
and a comparison module, used for comparing corresponding pixel blocks in the super-pixel segmentation map and the superimposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
A computing system based on super-pixel image similarity, the system comprising a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is operative according to the instructions to perform steps according to the method described above.
A computer readable storage medium having stored thereon a computer program which when executed by a processor realizes the steps of the method described above.
Compared with the prior art, the invention has the following beneficial effects:
according to the invention, the key frame images are screened from the image data through the network model, and the convenience, practicality and precision of the method are improved through processing and similarity identification of the key frame images; according to the method, a network model with highest accuracy for identifying the key frames of the images of different positions can be trained according to the different key frames of the different positions, and the problems that the existing method is single and is only suitable for the auxiliary diagnosis of the images of the specific positions are solved.
Drawings
FIG. 1 is a schematic diagram of a process for assisting in diagnosing symmetrical parts of a human body according to an embodiment of the method of the present invention;
FIG. 2 is a graph showing the effect of super-pixel segmentation on a brain MRI image in an embodiment of the method of the present invention;
FIG. 3 is a super-pixel segmentation boundary extracted from a super-pixel segmented brain MRI in an embodiment of the method of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are intended to be illustrative only and not limiting in any way.
First, the specific operation method of the present invention is described:
the embodiment of the invention provides a calculation method based on the similarity of the super-pixel image, and the embodiment of the invention applies the method to the auxiliary diagnosis process, but the method is not limited to the application field provided in the embodiment, and the invention can be equivalently applied to other fields except the auxiliary diagnosis.
Fig. 1 is a schematic diagram of a process of calculating the similarity of pixel blocks of an image in an embodiment of the method of the present invention, where the method is applied to auxiliary diagnosis, in the figure, image data is input into a network model, a key frame image x is obtained, the key frame image x is horizontally turned to obtain a key frame image y, the key frame image x and the key frame image y are respectively subjected to image segmentation, similarity analysis is performed on each pair of super pixel blocks, and finally a lesion area is located. The method comprises the following specific steps:
step 1, training a network model with highest accuracy for identifying the key frame images of the images of different parts of the human body according to different key frames diagnosed by doctors. And selecting a human body part needing image analysis, and automatically screening out a key frame image from the acquired image data through a network model. The specific implementation steps are as follows:
step 1.1, selecting key frames of the corresponding part according to the part to be identified and labeling them, to construct a training set for key frame identification;
step 1.2, based on the training set of step 1.1, constructing a test set and a verification set that do not overlap with the training set;
and step 1.3, using the training set and the verification set constructed in steps 1.1 and 1.2, selecting a network model whose depth suits the data volume of the data set, and training the network model with the highest accuracy on the test set. The network model classifies and identifies the input image data; the key frame image label is 0, and the image for which the network model predicts the highest class-0 probability is the key frame image. The Softmax function used for the final output of the network model is calculated as follows:
P_i = e^{z_i} / Σ_{j=1}^{k} e^{z_j},
where e is the natural constant, j indexes the classes, k is the total number of classes, z_i is the i-th component of the k-dimensional vector, and P_i is the probability that the image is predicted as the i-th class.
And 2, labeling each pixel of the key frame image with the simple linear iterative clustering image segmentation algorithm (super-pixel segmentation) and dividing the pixels into several pixel sets. This subdivides pixel points with similar characteristics, such as texture, information entropy and brightness, into an irregular block; the method is compatible with the grayscale and color images common in image segmentation, runs fast, maintains relatively complete contours, and suits the desired segmentation of the region of interest. The specific implementation steps are as follows:
step 2.1, converting the color key frame image into a 5-dimensional feature vector in the Lab color space and XY coordinates, where the Lab color space is device-independent and consists of the luminance L and the color channels a and b, and XY are the plane coordinates of the position;
and 2.2, constructing a new distance measurement standard for the feature vector obtained by the conversion in the step 2.1 so as to cluster local image pixels. Initializing data, distributing different labels for each pixel point in a neighborhood around a seed point, calculating the labels belonging to a clustering center, storing the distance from the pixel point to the pixel center, and calculating the color distance and the space distance of the searched pixel point by a new distance measurement standard, wherein the calculation formula is as follows:
where i and j represent key frame image x and key frame image y, respectively. l, a and b are feature vectors in Lab color space, and x and y are feature vectors in XY coordinates. d, d c And d s Respectively expressed as a color distance and a space distance to obtain a distance D between a pixel point and a seed point i . m is a constant, and the value range is [1,40 ]]Typically, the value is 10, and s is the maximum spatial distance. In the process, each pixel point is searched for multiple times, the minimum value of the distance between the pixel point and surrounding seed points is taken, and the corresponding seed point is the clustering center of the pixel;
and 2.3, performing continuous iterative optimization until the difference between each pixel point and the clustering center is not changed, and finding that the effect of super-pixel segmentation after 20 iterations at most is optimal through multiple image segmentation experiences.
And 3, flipping the key frame image horizontally, extracting the current super-pixel segmentation boundary from the super-pixel segmentation map obtained in step 2, and superimposing the extracted boundary on the flipped key frame image, so that the two maps contain the corresponding horizontally mirrored parts within the same segmentation blocks. The specific implementation steps are as follows:
step 3.1, on the basis of steps 1 and 2, the part containing the region of interest and the super-pixel segmentation boundary are obtained, and the key frame image is flipped horizontally;
step 3.2, the super-pixel segmentation boundary is superimposed on the horizontally flipped key frame image, segmenting the flipped image so that the two images contain the corresponding super pixel blocks of the horizontally mirrored parts within the same segmentation blocks;
and 4, performing similarity comparison on corresponding pixel blocks in the super-pixel segmentation key frame image and the key frame image with the super-pixel segmentation boundary overlapped and turned over by using a phase consistency algorithm to respectively obtain the similarity of each segmented super-pixel block. The specific implementation steps are as follows:
step 4.1, converting the super-pixel segmentation key frame image and the key frame image with the super-pixel segmentation boundary overlapped and turned into a YIQ color space image, wherein the Y component represents the brightness information of the image, the I, Q component represents the chromaticity information, and the YIQ color space can separate the brightness and chromaticity of the color image;
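The color-space conversion in step 4.1 can be sketched with the standard NTSC RGB-to-YIQ matrix; the patent does not state the exact coefficients it uses, so this matrix is an assumption:

```python
import numpy as np

# Standard NTSC RGB -> YIQ matrix: Y carries luminance, I and Q chrominance.
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def rgb_to_yiq(img):
    """img: (..., 3) array of RGB values in [0, 1]; returns YIQ values."""
    return img @ RGB2YIQ.T
```

Python's standard library exposes the same conversion per pixel as `colorsys.rgb_to_yiq`, which can serve as a cross-check.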
step 4.2, calculating the PC values of the two images, where PC is a measure of the phase consistency information of an image; the similarity between chromaticity features and the gradient magnitude are combined to obtain the similarity between each pair of points on the images; the calculation formulas are as follows:
in the formula, pc is used for representing phase consistency information of the images x and y, and G is used for representing gradient amplitude. S is S PC (x, y) is the feature similarity of the two images x, y, S G (x, y) is the gradient similarity of the two images x, y, T 1 Constant sum T 2 The function of the constant is to avoid zero denominator and 0.001;
and 4.3, weighting the chromaticity characteristic similarity measurement by combining the characteristic similarity and the gradient amplitude of the images on the basis of the step 4.2 to obtain the similarity of each point, and further obtaining the similarity between the two images, wherein the calculation formula is as follows:
wherein x and y are key frame image x and key frame image y, Ω represents the whole airspace, S PC (x, y) is the feature similarity of the two images x, y, S G (x, y) is the gradient similarity of the two images x, y, alpha and beta are positive integers, and are mainly used for adjusting the weight between the characteristic similarity and the gradient similarity, pc represents the phase consistency information of the images x, y, n represents the analysis of each pair of super pixel block labels, pc n (x,y)=max[Pc(x),Pc(y)]For weighting the similarity of the two images as a whole. The similarity FSIM between the two images is obtained through calculation, and the closer the FSIM is, the lower the similarity of the two images is.
And 5, setting a threshold to analyze the similarity of each pair of super pixel blocks. The lower the similarity of a pair of super pixel blocks in the key frame image, the higher the possibility that this part of the key frame image is cancerous: the nutrients, density and so on of diseased tissue differ from those of normal tissue, so its pixels differ from those of normal tissue. The coordinates of the super pixel blocks are obtained according to the set threshold. Extensive experimental tests show that the located pathological part is closest to the expected result when the similarity is between 0.15 and 0.48, so the threshold is set to 0.15-0.48; that is, super pixel blocks whose similarity falls within this range are located as positions that differ from the normal side and are suspected of being diseased, so the pathological part can be accurately located. The specific implementation steps are as follows:
step 5.1, analyzing the similarity of each pair of super pixel blocks and sorting all the super pixel blocks in ascending order of similarity;
and 5.2, analyzing the super pixel blocks ordered in the step 5.1, and obtaining coordinates of the super pixel blocks by setting a threshold value, so that positions which are different from the normal positions and suspected to be diseased are positioned, and pathological positions which are likely to be diseased can be accurately positioned.
The following is a specific embodiment for performing auxiliary diagnosis of human symmetrical part image identification by applying the method of the invention: the implementation of the present invention will be described in detail using brain nuclear magnetic resonance imaging as an example.
Magnetic resonance imaging is widely used clinically to assess brain lesions due to its high degree of soft tissue resolution. However, with increasing data volumes and possible experience errors in visual discrimination, there is an increasing need for automatic and reliable methods of locating pathological brain sites.
Firstly, step 1 is executed: key frames of the corresponding part are selected according to the part to be identified and labeled, and a training set, a test set and a verification set for key frame identification are constructed. A network model whose depth suits the data volume of the data set is selected, and the network model with the highest accuracy on the test set is trained for screening out key frame images.
Next, step 2 is executed: each pixel of the key frame image is labeled with the simple linear iterative clustering image segmentation algorithm and divided into several pixel sets, which subdivides pixels with similar characteristics, such as texture, information entropy and brightness, into an irregular block. The effect of super-pixel segmentation is shown in FIG. 2, and the extracted super-pixel segmentation boundary is shown in FIG. 3.
Thirdly, step 3 is executed: the key frame image is flipped horizontally, and the extracted super-pixel segmentation boundary, shown in FIG. 3, is superimposed on the flipped key frame image, segmenting the flipped image so that the two images contain the corresponding horizontally mirrored parts within the same segmentation blocks;
step 4 is executed, and similarity calculation is carried out on corresponding pixel blocks in the super-pixel segmentation key frame image and the key frame image with the super-pixel segmentation boundary overlapped and turned over by a phase consistency algorithm, so that the similarity of each segmented super-pixel block is obtained respectively;
and finally, executing the step 5, sorting each pair of super pixel blocks according to the similarity, and acquiring the coordinates of the super pixel blocks with the similarity within the threshold value through the set threshold value range of 0.15-0.48, so that the pathological part of the patient can be accurately positioned.
A computing system based on super-pixel image similarity, the system comprising:
a screening module, used for selecting a key frame image from the image data through a network model;
a segmentation module, used for dividing the key frame image into a super-pixel segmentation map of different pixel blocks through an image segmentation algorithm;
a flipping module, used for horizontally flipping the key frame image;
an extraction module, used for extracting a segmentation boundary from the super-pixel segmentation map;
a superposition module, used for superimposing the segmentation boundary on the horizontally flipped key frame image to obtain a superimposed image;
and a comparison module, used for comparing corresponding pixel blocks in the super-pixel segmentation map and the superimposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
A computing system based on super-pixel image similarity, the system comprising a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is operative according to the instructions to perform steps according to the method described above.
A computer readable storage medium having stored thereon a computer program which when executed by a processor realizes the steps of the method described above.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical aspects of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the above embodiments, it should be understood by those of ordinary skill in the art that: modifications and equivalents may be made to the specific embodiments of the invention without departing from the spirit and scope of the invention, which is intended to be covered by the claims.

Claims (8)

1. The calculation method based on the super-pixel image similarity is characterized by comprising the following steps of:
selecting a key frame image from the image data through a network model;
dividing the key frame image into a super-pixel segmentation map of different pixel blocks through an image segmentation algorithm;
horizontally flipping the key frame image;
extracting a segmentation boundary from the super-pixel segmentation map;
superimposing the segmentation boundary on the horizontally flipped key frame image to obtain a superimposed image;
comparing corresponding pixel blocks in the super-pixel segmentation map and the superimposed image through a phase consistency algorithm, to obtain the similarity of each pair of segmented pixel blocks;
the method for obtaining the similarity comprises the following steps:
converting the super-pixel segmentation map and the superimposed image into YIQ color space images;
calculating the PC values of the two color space images;
calculating the point-wise similarity between the two images from the PC values;
and weighting the point-wise similarities to obtain the similarity of each pair of super-pixel blocks.
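The sequence of steps in claim 1 can be sketched end to end. In the sketch below, a simple grid labeling stands in for the super-pixel segmentation algorithm, and all function names and parameters are illustrative rather than taken from the patent:

```python
import numpy as np

def grid_labels(h, w, block=8):
    """Stand-in for the super-pixel algorithm: label pixels by grid block."""
    rows = np.arange(h)[:, None] // block
    cols = np.arange(w)[None, :] // block
    return rows * ((w + block - 1) // block) + cols

def block_pairs(key_frame):
    """Pair each segmented pixel block with the corresponding region of the
    horizontally flipped key frame, following the steps of claim 1."""
    labels = grid_labels(*key_frame.shape[:2])      # super-pixel segmentation map
    flipped = key_frame[:, ::-1]                    # horizontal flip
    # Segmentation boundary: pixels whose right/bottom neighbour has another label.
    boundary = np.zeros(labels.shape, dtype=bool)
    boundary[:, :-1] |= labels[:, :-1] != labels[:, 1:]
    boundary[:-1, :] |= labels[:-1, :] != labels[1:, :]
    overlay = flipped.copy()
    overlay[boundary] = overlay.max()               # superimpose the boundary
    # Each pair of corresponding pixel blocks goes to the similarity comparison.
    return [(key_frame[labels == n], overlay[labels == n])
            for n in np.unique(labels)]

pairs = block_pairs(np.random.rand(32, 32))
```

Each element of `pairs` would then be compared with the phase consistency measure of claims 6 and 7.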
2. The method for computing similarity based on super-pixel images according to claim 1, wherein the network model is trained as follows:
selecting key frame images of a corresponding part according to the part to be identified;
constructing a training set from the key frame images;
constructing a test set and a validation set that do not overlap with the training set;
and training, from the training set and the validation set, the network model that achieves the highest accuracy in identifying key frame images on the test set.
3. The method for computing similarity based on super-pixel images according to claim 1, wherein the process of segmentation of the key frame image is as follows:
converting the key frame image into a feature vector;
constructing a distance measurement standard according to the feature vector;
clustering local image pixels according to the distance measurement standard;
and iteratively optimizing the clusters until the distance between each pixel point and its cluster center no longer changes, thereby obtaining a super-pixel segmentation map of different pixel blocks.
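The four segmentation steps of claim 3 follow a SLIC-style recipe. A compact sketch, under the assumptions of a grayscale image, a plain (value, x, y) feature vector, and the standard SLIC combined distance (the patent's exact formula is not reproduced here):

```python
import numpy as np

def superpixel_iterate(img, k=16, m=10.0, iters=20):
    """Cluster pixels on (value, x, y) feature vectors and iterate until the
    pixel-to-center assignments stop changing."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([img.ravel(), xs.ravel(), ys.ravel()], axis=1).astype(float)
    S = np.sqrt(h * w / k)                        # expected super-pixel spacing
    # Seed the k cluster centers at evenly spaced pixels.
    centers = feats[np.linspace(0, h * w - 1, k, dtype=int)].copy()
    labels = np.full(h * w, -1)
    for _ in range(iters):
        d_c = np.abs(feats[None, :, 0] - centers[:, None, 0])    # color distance
        d_s = np.hypot(feats[None, :, 1] - centers[:, None, 1],
                       feats[None, :, 2] - centers[:, None, 2])  # spatial distance
        D = np.sqrt((d_c / m) ** 2 + (d_s / S) ** 2)  # combined metric, SLIC form
        new_labels = D.argmin(axis=0)
        if np.array_equal(new_labels, labels):        # assignments unchanged: stop
            break
        labels = new_labels
        for j in range(k):                            # move centers to cluster means
            members = feats[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return labels.reshape(h, w)

seg = superpixel_iterate(np.random.rand(16, 16), k=4)
```

The convergence test mirrors the claim's stopping condition: iteration ends once no pixel changes its nearest cluster center.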
4. The method for computing similarity based on super-pixel images according to claim 3, further comprising calculating the color distance and the spatial distance of the searched pixels based on the distance metric.
5. The method of claim 4, wherein the color distance and the spatial distance are calculated according to the following formula:
wherein i and j index key frame image x and key frame image y, l, a, and b are the feature vectors in the Lab color space, x and y are the feature vectors in the XY coordinates, d_c and d_s denote the color distance and the spatial distance respectively, D_i is the distance between the pixel point and the seed point, m is a constant, and S is the maximum spatial distance.
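Since the formula itself is not reproduced above, the quantities named in claim 5 can only be sketched under an assumption; the snippet below uses the standard SLIC distances (l, a, b from the Lab color space, x, y from the image coordinates; the defaults for m and S are illustrative):

```python
import math

def color_distance(p, q):
    """d_c: Euclidean distance in Lab color space between two pixels,
    each given as a (l, a, b, x, y) feature vector."""
    return math.dist(p[:3], q[:3])

def spatial_distance(p, q):
    """d_s: Euclidean distance in the XY plane."""
    return math.dist(p[3:], q[3:])

def combined_distance(p, q, m=10.0, S=16.0):
    """D_i: distance between a pixel point and a seed point, trading off
    color against space; m is a constant and S is the maximum spatial
    distance. Assumed SLIC form: sqrt((d_c / m)^2 + (d_s / S)^2)."""
    return math.sqrt((color_distance(p, q) / m) ** 2
                     + (spatial_distance(p, q) / S) ** 2)
```

Dividing d_c by m and d_s by S normalizes the two terms so that neither dominates the clustering.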
6. The method for calculating the similarity based on the super pixel image according to claim 1, wherein the calculation formula of the similarity is as follows:
wherein PC denotes the phase consistency information of the images x and y, G denotes the gradient magnitude, S_PC(x, y) is the feature similarity of the two images x and y, S_G(x, y) is the gradient similarity of the two images x and y, and T_1 and T_2 are constants.
7. The method for calculating the similarity based on the super pixel image according to claim 1, wherein the calculation formula of the similarity is as follows:
PC_n(x, y) = max[PC(x), PC(y)],
wherein x and y are key frame image x and key frame image y, Ω denotes the whole spatial domain, S_PC(x, y) is the feature similarity of the two images x and y, S_G(x, y) is the gradient similarity of the two images x and y, α and β are positive integers, PC denotes the phase consistency information of the images x and y, and n indexes each pair of super-pixel blocks.
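Claims 6 and 7 describe an FSIM-style measure. As the formula images are not reproduced in this text, the sketch below assumes the standard FSIM definitions of S_PC and S_G and the PC-weighted pooling over the whole spatial domain (T_1, T_2, α, and β as named in the claims; the default values are illustrative):

```python
import numpy as np

def fsim_like(pc_x, pc_y, g_x, g_y, T1=0.85, T2=160.0, alpha=1, beta=1):
    """Pool point-wise phase-consistency and gradient similarities over the
    whole spatial domain, weighted by PC_n(x, y) = max[PC(x), PC(y)]."""
    # Assumed FSIM-style point-wise similarities.
    s_pc = (2 * pc_x * pc_y + T1) / (pc_x**2 + pc_y**2 + T1)  # feature similarity
    s_g = (2 * g_x * g_y + T2) / (g_x**2 + g_y**2 + T2)       # gradient similarity
    pc_n = np.maximum(pc_x, pc_y)                             # pooling weight
    return float(np.sum(s_pc**alpha * s_g**beta * pc_n) / np.sum(pc_n))

pc = np.full((8, 8), 0.5)   # phase-consistency maps of the two images
g = np.full((8, 8), 20.0)   # gradient-magnitude maps
score = fsim_like(pc, pc, g, g)
```

Identical PC and gradient maps give a score of exactly 1, and any disagreement pulls the score below 1, which matches the measure's use as a per-block similarity.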
8. The method for computing the similarity of the super pixel images according to claim 1, wherein the result output of the network model adopts a Softmax function, and the computation is shown as follows:
where e is the natural constant, j indexes the classes to be classified, k is the total number of classes, z_i is the i-th component of the k-dimensional vector, and P_i is the probability of being predicted as the i-th class in the image classification.
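A numerically stable sketch of the Softmax in claim 8; subtracting the maximum score before exponentiation leaves the probabilities unchanged while avoiding overflow:

```python
import numpy as np

def softmax(z):
    """P_i = e^{z_i} / sum_j e^{z_j} for a k-dimensional score vector z."""
    e = np.exp(z - np.max(z))   # shift by the max for numerical stability
    return e / e.sum()

p = softmax(np.array([2.0, 1.0, 0.1]))
```

The outputs sum to 1 and the largest score receives the largest probability, so the network's predicted class is simply the argmax of `p`.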
CN202010607158.0A 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity Active CN111931811B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010607158.0A CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity
PCT/CN2021/098184 WO2022001571A1 (en) 2020-06-29 2021-06-03 Computing method based on super-pixel image similarity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010607158.0A CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity

Publications (2)

Publication Number Publication Date
CN111931811A CN111931811A (en) 2020-11-13
CN111931811B true CN111931811B (en) 2024-03-29

Family

ID=73317721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010607158.0A Active CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity

Country Status (2)

Country Link
CN (1) CN111931811B (en)
WO (1) WO2022001571A1 (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111931811B (en) * 2020-06-29 2024-03-29 南京巨鲨显示科技有限公司 Calculation method based on super-pixel image similarity
CN112669346B (en) * 2020-12-25 2024-02-20 浙江大华技术股份有限公司 Pavement emergency determination method and device
CN114519864B (en) * 2022-01-14 2025-10-03 江苏大学 A group pig identification method based on feature fusion
CN114663749A (en) * 2022-02-21 2022-06-24 北京箩筐时空数据技术有限公司 Training method and device for landslide mass recognition model, electronic equipment and storage medium
CN114821298B (en) * 2022-03-22 2024-08-06 大连理工大学 A multi-label remote sensing image classification method with adaptive semantic information
CN115294131A (en) * 2022-10-08 2022-11-04 南通海发水处理工程有限公司 Sewage treatment quality detection method and system
CN115641327B (en) * 2022-11-09 2023-05-09 浙江天律工程管理有限公司 Building engineering quality supervision and early warning system based on big data
CN116091509B (en) * 2022-12-01 2025-08-19 中国科学院深圳先进技术研究院 Super-pixel segmentation method, system, equipment and storage medium
CN115880295B (en) * 2023-02-28 2023-05-12 吉林省安瑞健康科技有限公司 Computer-aided tumor ablation navigation system with accurate positioning function
CN115914649B (en) * 2023-03-01 2023-05-05 广州高通影像技术有限公司 Data transmission method and system for medical video
CN116310337A (en) * 2023-03-21 2023-06-23 山西大学 An Unsupervised Image Segmentation Method Based on Multi-Representative Superpixels
CN116630658A (en) * 2023-06-08 2023-08-22 南京富岛油气智控科技有限公司 A Judgment Method of Information Entropy Fusion Working Condition Based on Euclidean Distance and Gradient Similarity of Indicator Diagram Eigenvector
CN117115490A (en) * 2023-06-20 2023-11-24 齐鲁理工学院 Image pre-segmentation method, device, medium, electronic equipment
CN116863469B (en) * 2023-06-27 2024-05-14 首都医科大学附属北京潞河医院 Deep learning-based surgical anatomy part identification labeling method
CN116630311B (en) * 2023-07-21 2023-09-19 聊城市瀚格智能科技有限公司 Pavement damage identification alarm method for highway administration
CN116823811B (en) * 2023-08-25 2023-12-01 汶上县誉诚制衣有限公司 Functional jacket surface quality detection method
CN117173175B (en) * 2023-11-02 2024-02-09 湖南格尔智慧科技有限公司 Image similarity detection method based on super pixels
CN118644715A (en) * 2024-06-13 2024-09-13 北京化工大学 A fine classification method for salt pan salt precipitation areas based on combined time-series PolSAR and spectral features
CN118865345B (en) * 2024-09-25 2024-12-13 青岛农业大学 Abnormal phenotype identification method for cucumber leaves
CN119379663B (en) * 2024-11-20 2025-10-21 成都信息工程大学 A method for image target detection based on multi-level judgment

Citations (3)

Publication number Priority date Publication date Assignee Title
CN108600865A (en) * 2018-05-14 2018-09-28 西安理工大学 A kind of video abstraction generating method based on super-pixel segmentation
CN109712143A (en) * 2018-12-27 2019-05-03 北京邮电大学世纪学院 A kind of Fast image segmentation method based on super-pixel multiple features fusion
CN109712153A (en) * 2018-12-25 2019-05-03 杭州世平信息科技有限公司 A kind of remote sensing images city superpixel segmentation method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
WO2016150873A1 (en) * 2015-03-20 2016-09-29 Ventana Medical Systems, Inc. System and method for image segmentation
CN111931811B (en) * 2020-06-29 2024-03-29 南京巨鲨显示科技有限公司 Calculation method based on super-pixel image similarity


Also Published As

Publication number Publication date
WO2022001571A1 (en) 2022-01-06
CN111931811A (en) 2020-11-13

Similar Documents

Publication Publication Date Title
CN111931811B (en) Calculation method based on super-pixel image similarity
Xie et al. A context hierarchical integrated network for medical image segmentation
CN108364288B (en) Segmentation method and device for breast cancer pathological image
Zhang et al. Automated semantic segmentation of red blood cells for sickle cell disease
Ortiz et al. Improving MR brain image segmentation using self-organising maps and entropy-gradient clustering
CN107644420B (en) Vascular image segmentation method and MRI system based on centerline extraction
CN107133651B (en) The functional magnetic resonance imaging data classification method of subgraph is differentiated based on super-network
CN111476292A (en) Small sample element learning training method for medical image classification processing artificial intelligence
Pan et al. Cell detection in pathology and microscopy images with multi-scale fully convolutional neural networks
CN114600155A (en) Weakly Supervised Multi-Task Learning for Cell Detection and Segmentation
CN108257135A (en) The assistant diagnosis system of medical image features is understood based on deep learning method
Wu et al. A supervoxel classification based method for multi-organ segmentation from abdominal ct images
CN110415234A (en) Brain tumor dividing method based on multi-parameter magnetic resonance imaging
CN112164082A (en) Method for segmenting multi-modal MR brain image based on 3D convolutional neural network
CN106056595A (en) Method for automatically identifying whether thyroid nodule is benign or malignant based on deep convolutional neural network
CN104616289A (en) Removal method and system for bone tissue in 3D CT (Three Dimensional Computed Tomography) image
Zeng et al. Unsupervised skin lesion segmentation via structural entropy minimization on multi-scale superpixel graphs
CN113096080A (en) Image analysis method and system
CN114926486B (en) Thyroid ultrasound image intelligent segmentation method based on multi-level improvement
CN114882048A (en) Image segmentation method and system based on wavelet scattering learning network
CN116403211A (en) Segmentation and clustering method and system based on single-cell pathology image cell nuclei
Chuang et al. Efficient triple output network for vertebral segmentation and identification
CN114972745A (en) A Trigeminal Neural Automatic Segmentation Method Based on Deep Network
Banerjee et al. A CADe system for gliomas in brain MRI using convolutional neural networks
CN112614092A (en) Spine detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant