CN117788835A - Engineering camouflage effect evaluation method and related device - Google Patents
- Publication number
- CN117788835A (application CN202311867947.8A)
- Authority
- CN
- China
- Prior art keywords
- feature similarity
- camouflage
- similarity extraction
- effect image
- engineering
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses an engineering camouflage effect evaluation method and a related device. The method comprises the following steps: obtaining a camouflage effect image of the engineering camouflage, and performing brightness feature similarity extraction processing on the camouflage effect image; performing color feature similarity extraction processing on the camouflage effect image; performing texture feature similarity extraction processing on the camouflage effect image; performing shape feature similarity extraction processing on the camouflage effect image; performing comprehensive similarity calculation processing on the brightness, color, texture and shape feature similarity extraction results to obtain a comprehensive similarity result; and obtaining an engineering camouflage effect evaluation result based on the comprehensive similarity result. In the embodiment of the invention, when the camouflage effect is evaluated, the problem that the evaluation result is inconsistent with subjective perception because of averaging over an overly large background area is avoided, and the objective differences between the camouflaged area and the background are mined relatively comprehensively.
Description
Technical Field
The invention relates to the field of image processing and computing, in particular to an engineering camouflage effect evaluation method and a related device.
Background
Engineering camouflage effect detection and evaluation mainly relies on expert interpretation: an interpretation expert examines the target area to determine whether the target can be found, and thereby judges whether the camouflage effect is good or bad. However, the number of such experts is limited, so this approach cannot be popularized and applied on a large scale. Moreover, the interpretation result is closely related to subjective factors such as the expert's physical and mental state at the time and accumulated experience.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides an engineering camouflage effect evaluation method and a related device, which avoid the problem that the evaluation result is inconsistent with subjective perception because of averaging over an overly large background area when the camouflage effect is evaluated, and which mine the objective differences between the camouflaged area and the background relatively comprehensively.
In order to solve the technical problems, an embodiment of the present invention provides an engineering camouflage effect evaluation method, which includes:
obtaining a camouflage effect image of engineering camouflage, and carrying out brightness feature similarity extraction processing on the camouflage effect image of engineering camouflage to obtain a brightness feature similarity extraction result;
performing color feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a color feature similarity extraction result;
Performing texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a texture feature similarity extraction result;
performing shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a shape feature similarity extraction result;
performing comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result to obtain a comprehensive similarity result;
and obtaining an engineering camouflage effect evaluation result based on the comprehensive similarity result.
Optionally, the brightness feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes:
performing luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function to obtain luminance distribution histogram matrix information, wherein the luminance distribution histogram matrix information contains image information of the camouflage effect image and is used for reflecting the probability distribution of the image pixels of the camouflage effect image;
and performing brightness feature similarity extraction processing on the luminance distribution histogram matrix information based on the cv2.compareHist() function.
Optionally, the luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function includes:
calculating the probability p(k) that luminance level k occurs in the camouflage effect image, where p(k) is calculated as:

p(k) = r_k / R, k = 0, 1, ..., N - 1

where N is the total number of luminance levels; R is the total number of pixels in the camouflage effect image; and r_k is the number of pixels at luminance level k;
and cumulatively summing the probability distribution p(k) to obtain the luminance distribution histogram matrix information h(k), where the cumulative summation is:

h(k) = S_k = sum_{j=0}^{k} p(j), k = 0, 1, ..., n

where n = N - 1; S_k denotes the cumulative distribution over the luminance level range [0, N - 1]; for an 8-bit image, the 256 luminance levels are mapped to 0-255.
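The cumulative normalized histogram defined by p(k) and h(k) above can be sketched in a few lines of NumPy (the function name and tiny example image are illustrative, not part of the patent):

```python
import numpy as np

def cumulative_luminance_histogram(gray, n_levels=256):
    """Return h(k): the cumulative normalized luminance histogram of an image."""
    # r_k: number of pixels at each luminance level k in [0, n_levels - 1]
    r, _ = np.histogram(gray, bins=n_levels, range=(0, n_levels))
    p = r / gray.size          # p(k) = r_k / R
    return np.cumsum(p)        # h(k) = S_k = sum_{j=0..k} p(j)

img = np.array([[0, 0, 1], [2, 255, 255]], dtype=np.uint8)
h = cumulative_luminance_histogram(img)
print(round(h[0], 3), round(h[-1], 3))  # h(0) = 2/6; h(255) is always 1.0
```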
Optionally, the color feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes:
reading the camouflage effect image from memory based on the cv2.imdecode() function, and converting the camouflage effect image into a preset image format to form an image array;
creating RGB three channels based on the image array to obtain a color histogram matrix, and taking the color histogram matrix as the color feature, wherein each of the RGB three channels is set to 16 bins to obtain the color histogram matrix;
and performing Bhattacharyya distance calculation processing on the color features based on the cv2.compareHist() function, and performing the color feature similarity extraction processing based on the Bhattacharyya distance calculation result.
Optionally, the texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes:
obtaining the image energy, entropy, contrast, correlation and inverse difference moment of the camouflage effect image based on the methods of the skimage library, and taking them as the texture features of the camouflage effect image;
and performing texture feature similarity calculation processing on the texture features based on the Euclidean distance algorithm.
Optionally, the shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes:
extracting contour features from the color image of the camouflage effect image based on a color-difference Sobel operator, and performing shape feature description processing on the contour features using the 7 Hu invariant moment vectors to obtain descriptive shape features;
and performing shape feature similarity calculation processing on the descriptive shape features based on the Euclidean distance algorithm.
Optionally, the performing comprehensive similarity calculation on the luminance feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result, and the shape feature similarity extraction result includes:
Normalizing the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result based on a Gaussian normalization processing formula to obtain a normalized brightness feature similarity extraction result, a normalized color feature similarity extraction result, a normalized texture feature similarity extraction result and a normalized shape feature similarity extraction result;
and carrying out comprehensive weighting processing based on weight coefficients respectively corresponding to the normalized brightness feature similarity extraction result, the normalized color feature similarity extraction result, the normalized texture feature similarity extraction result and the normalized shape feature similarity extraction result, wherein the weight coefficients are determined by adopting an entropy weight method.
In addition, the embodiment of the invention also provides an engineering camouflage effect evaluation device, which comprises:
a first similarity extraction module: used for obtaining a camouflage effect image of the engineering camouflage, and performing brightness feature similarity extraction processing on the camouflage effect image to obtain a brightness feature similarity extraction result;
a second similarity extraction module: used for performing color feature similarity extraction processing on the camouflage effect image to obtain a color feature similarity extraction result;
a third similarity extraction module: used for performing texture feature similarity extraction processing on the camouflage effect image to obtain a texture feature similarity extraction result;
a fourth similarity extraction module: used for performing shape feature similarity extraction processing on the camouflage effect image to obtain a shape feature similarity extraction result;
a comprehensive similarity calculation module: used for performing comprehensive similarity calculation processing on the brightness, color, texture and shape feature similarity extraction results to obtain a comprehensive similarity result;
an obtaining module: used for obtaining an engineering camouflage effect evaluation result based on the comprehensive similarity result.
In addition, the embodiment of the invention also provides an evaluation device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the steps of the method in any one of the above steps when executing the computer program.
In addition, the embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, the computer program implementing the steps of any one of the methods when being executed by a processor.
In the embodiment of the invention, when the camouflage effect is evaluated, the problem that the evaluation result is inconsistent with subjective perception because of averaging over an overly large background area is avoided, the objective differences between the camouflaged area and the background are mined relatively comprehensively, and a specific reference is provided for the direction in which the target camouflage effect can be improved.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings which are required in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an engineering camouflage effect evaluation method in an embodiment of the invention;
FIG. 2 is a schematic diagram showing the structural composition of an engineering disguising effect evaluation device in an embodiment of the present invention;
fig. 3 is a schematic structural composition of an evaluation apparatus in the embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In one embodiment, in the present application, both camouflage and reconnaissance are based on objective differences between the target and the background, and reconnaissance is to expand and identify such differences by various techniques, and camouflage is to eliminate, reduce, skew or mimic such differences by various techniques. The objective difference between the camouflage target and the background is reflected on the image as the difference between the image features. According to the target interpretation characteristics, the brightness characteristics, the color characteristics, the texture characteristics and the shape characteristics are very key image characteristics, and the similarity of the target and the background characteristics can represent the fusion degree of the target and the background and reflect the camouflage effect of the target.
A picture without the camouflaged target is taken as the reference picture and the camouflaged picture as the target picture; if only the camouflaged picture is available, the camouflaged target is cropped out as the target picture and its surrounding area is taken as the reference picture. Similarity calculation is then performed between the target picture and the reference picture, and the quality of the camouflage effect is judged according to the similarity value.
The method is affected by factors such as picture definition, the shooting equipment, external conditions and the complexity of the area surrounding the camouflaged target. To reduce errors, the similarity is calculated separately from the brightness, color, texture and shape features of the images, and the calculated similarities are finally given weight coefficients and normalized, so that the final comprehensive similarity is calculated scientifically and objectively.
Referring to fig. 1, fig. 1 is a flow chart of an engineering disguising effect evaluation method according to an embodiment of the invention.
As shown in fig. 1, a method for evaluating camouflage effect of engineering, the method comprising:
s11: obtaining a camouflage effect image of engineering camouflage, and carrying out brightness feature similarity extraction processing on the camouflage effect image of engineering camouflage to obtain a brightness feature similarity extraction result;
In the implementation of the invention, the brightness feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes: performing luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function to obtain luminance distribution histogram matrix information, wherein the luminance distribution histogram matrix information contains image information of the camouflage effect image and is used for reflecting the probability distribution of the image pixels of the camouflage effect image; and performing brightness feature similarity extraction processing on the luminance distribution histogram matrix information based on the cv2.compareHist() function.
Further, the luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function includes: calculating the probability p(k) that luminance level k occurs in the camouflage effect image:

p(k) = r_k / R, k = 0, 1, ..., N - 1

where N is the total number of luminance levels; R is the total number of pixels in the camouflage effect image; and r_k is the number of pixels at luminance level k; and cumulatively summing the probability distribution p(k) to obtain the luminance distribution histogram matrix information h(k):

h(k) = S_k = sum_{j=0}^{k} p(j), k = 0, 1, ..., n

where n = N - 1; S_k denotes the cumulative distribution over the luminance level range [0, N - 1]; for an 8-bit image, the 256 luminance levels are mapped to 0-255.
Specifically, after the camouflage effect image of the engineering camouflage is obtained, brightness feature extraction processing needs to be performed on it: the brightness distribution of the image is represented by the L channel of the LAB color space, and the degree of coincidence between the brightness distribution histograms of the two pictures is defined as the brightness feature similarity.
First, luminance histogram matrix extraction processing is performed on the camouflage effect image through the cv2.calcHist() function to obtain the luminance distribution histogram matrix information, which contains the image information of the camouflage effect image and reflects the probability distribution of its pixels. For an image f(x, y), where (x, y) denotes the pixel at coordinates x, y, the probability p(k) that luminance level k occurs in the camouflage effect image is calculated as:

p(k) = r_k / R, k = 0, 1, ..., N - 1

where N is the total number of luminance levels; R is the total number of pixels in the camouflage effect image; and r_k is the number of pixels at luminance level k. The probability distribution is then cumulatively summed to obtain the cumulative normalized luminance distribution histogram matrix information h(k):

h(k) = S_k = sum_{j=0}^{k} p(j), k = 0, 1, ..., n

where n = N - 1; S_k denotes the cumulative distribution over the luminance level range [0, N - 1]; for an 8-bit image, the 256 luminance levels are mapped to 0-255.
Then, brightness feature similarity extraction processing is performed on the luminance distribution histogram matrix information through the cv2.compareHist() function to obtain the brightness feature similarity extraction result. That is, to compare two histograms H1 and H2, a comparison criterion d(H1, H2) measuring histogram similarity must first be chosen. The histograms H1 and H2 are calculated from the two input images and normalized to the same scale space; the similarity of the two histograms, and hence of the images, is then obtained by calculating the distance between H1 and H2. The Bhattacharyya measure evaluates to 1 for a complete match and 0 for a complete mismatch; for normalized histograms it is calculated as:

d(H1, H2) = sum_k sqrt(H1(k) * H2(k))

The Bhattacharyya measure achieves the best results when calculating histogram similarity, but its calculation is the most complex.
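As a hedged illustration of the matched-equals-1 measure described above, a plain NumPy version of the Bhattacharyya coefficient might look like the sketch below (function name illustrative). Note that OpenCV's cv2.compareHist with cv2.HISTCMP_BHATTACHARYYA computes the related distance, which is 0 for identical histograms.

```python
import numpy as np

def bhattacharyya_coefficient(h1, h2):
    """Bhattacharyya coefficient: 1 = identical distributions, 0 = disjoint."""
    h1 = h1 / h1.sum()  # normalize both histograms to probability distributions
    h2 = h2 / h2.sum()
    return float(np.sum(np.sqrt(h1 * h2)))

a = np.array([2.0, 2.0, 4.0])
print(bhattacharyya_coefficient(a, a))  # identical histograms -> 1.0
```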
S12: performing color feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a color feature similarity extraction result;
In the implementation of the invention, the color feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes: reading the camouflage effect image from memory based on the cv2.imdecode() function, and converting the camouflage effect image into a preset image format to form an image array; creating RGB three channels based on the image array to obtain a color histogram matrix, and taking the color histogram matrix as the color feature, wherein each of the RGB three channels is set to 16 bins; and performing Bhattacharyya distance calculation processing on the color features based on the cv2.compareHist() function, and performing the color feature similarity extraction processing based on the Bhattacharyya distance calculation result.
Specifically, common color spaces include RGB, HSV and Lab. The RGB color space (red, green, blue), with each channel ranging over [0, 255], is easy to understand but not intuitive under continuous color transformation. A histogram of the RGB color space is obtained and normalized, and the degree of coincidence of the color distribution histograms is calculated to define the color feature similarity.
First, the camouflage effect image is read from memory through the cv2.imdecode() function and converted into a preset image format to form an image array; this function is mainly used for reading pictures from the network, and if the picture is local, the cv2.imread() function can be used instead.
Second, an RGB three-channel histogram matrix is created from the acquired image as the color feature, with each of the three channels set to 16 bins, so that a color histogram is obtained. A histogram, also known as a quality distribution chart, is a statistical report that represents a data distribution by a series of vertical stripes or line segments of unequal height; the data type is generally plotted on the horizontal axis and the distribution on the vertical axis. The histogram is a widely used tool in image processing; its essence is the graphical representation of a probability distribution, and it can also be used to represent vectors. For images, not only gray-level histograms but also histograms of arbitrary form can be created by suitable design methods.
Finally, Bhattacharyya distance calculation is performed on the obtained color histograms through cv2.compareHist().
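A minimal NumPy equivalent of the 16-bins-per-channel RGB histogram step could look as follows (the function name is illustrative; the patent itself builds the histogram with OpenCV's cv2.calcHist):

```python
import numpy as np

def rgb_histogram_16(img):
    """img: H x W x 3 uint8 array -> flattened 16x16x16 colour histogram."""
    pixels = img.reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(16, 16, 16), range=[(0, 256)] * 3)
    return hist.ravel() / pixels.shape[0]   # normalize so the bins sum to 1

img = np.zeros((4, 4, 3), dtype=np.uint8)   # all-black test image
h = rgb_histogram_16(img)
print(h.shape, h.sum())  # (4096,) 1.0 -- all mass in the first bin
```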
S13: performing texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a texture feature similarity extraction result;
In the implementation of the invention, the texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes: obtaining the image energy, entropy, contrast, correlation and inverse difference moment of the camouflage effect image based on the methods of the skimage library, and taking them as the texture features of the camouflage effect image; and performing texture feature similarity calculation processing on the texture features based on the Euclidean distance algorithm.
Specifically, the image, which is easily affected by illumination changes, is converted into an LBP image with good resistance to illumination changes, and the LBP image is used to build the gray-level co-occurrence matrix features of the image. The texture characteristics of the camouflaged target and the background are represented by the energy (ASM), entropy (ENT), contrast (CON), correlation (COR) and inverse difference moment (HOMO) of the gray-level co-occurrence matrix, and the similarity of the texture distributions is defined by combining these with the Euclidean distance.
First, the image energy (ASM), entropy (ENT), contrast (CON), correlation (COR) and inverse difference moment (HOMO) are obtained as texture features by the methods of the skimage library. Writing the normalized co-occurrence matrix as δ_{φ,d}(a,b):

Inverse difference moment (HOMO): measures the uniformity of the local intensity of the gray-level image; the more uniform the local intensity, the larger the inverse difference moment:

HOMO = sum_{a,b} δ_{φ,d}(a,b) / (1 + (a - b)^2)

Angular second moment (ASM): also called energy, a measure of image uniformity; the more uniform the gray-level distribution of the image, the larger the corresponding ASM value, and conversely the smaller:

ASM = sum_{a,b} δ_{φ,d}(a,b)^2

Entropy (ENT): measures the amount of information in the target image, texture information being one aspect of it. The larger the entropy, the more dispersed the elements of the image, and vice versa; the magnitude of the entropy represents the uniformity or complexity of the texture of the target image:

ENT = -sum_{a,b} δ_{φ,d}(a,b) log2 δ_{φ,d}(a,b)

Contrast (CON): reflects the degree of local gray-level variation; the larger the gray-value differences in the image and the sharper the image edges, the larger the contrast. Typically k = 2 and λ = 1:

CON = sum_{a,b} |a - b|^k δ_{φ,d}(a,b)^λ

Dissimilarity (DIS): similar to the contrast, but responds better to local characteristics; when the local contrast increases, the dissimilarity also increases:

DIS = sum_a sum_b δ_{φ,d}(a,b) |a - b|

Correlation (COR): a metric of the linear relationship of the gray-level pixels of the target image, representing the similarity of the row and column gray levels of the gray-level co-occurrence matrix:

COR = [sum_{a,b} a * b * δ_{φ,d}(a,b) - μ_a μ_b] / (σ_a σ_b)

where μ_a, μ_b and σ_a, σ_b are the means and standard deviations of the row and column marginal distributions of δ_{φ,d}.
Finally, the texture feature similarity of the obtained texture feature vector matrices is calculated through the Euclidean distance algorithm:

D(G_m1, G_m2) = sqrt( sum_i (G_m1(i) - G_m2(i))^2 )

where G_m denotes the texture feature vector matrix of an image.
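As a rough, self-contained sketch of the co-occurrence-matrix statistics above (in practice skimage.feature.graycomatrix and graycoprops compute all five properties, as the patent indicates), two of them might be written like this; the function names and toy image are illustrative:

```python
import numpy as np

def glcm(gray, levels, dx=1, dy=0):
    """Normalized co-occurrence matrix delta(a, b) for pixel offset (dx, dy)."""
    m = np.zeros((levels, levels))
    h, w = gray.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[gray[y, x], gray[y + dy, x + dx]] += 1
    return m / m.sum()

def asm_energy(d):       # ASM = sum of delta^2
    return float(np.sum(d ** 2))

def contrast(d):         # CON = sum of (a - b)^2 * delta
    a, b = np.indices(d.shape)
    return float(np.sum((a - b) ** 2 * d))

g = np.array([[0, 0], [0, 0]])   # perfectly uniform image
d = glcm(g, levels=2)
print(asm_energy(d), contrast(d))  # uniform image -> 1.0 0.0
```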
S14: performing shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a shape feature similarity extraction result;
In the implementation of the invention, the shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage includes: extracting contour features from the color image of the camouflage effect image based on a color-difference Sobel operator, and performing shape feature description processing on the contour features using the 7 Hu invariant moment vectors to obtain descriptive shape features; and performing shape feature similarity calculation processing on the descriptive shape features based on the Euclidean distance algorithm.
Specifically, the color-difference Sobel operator is adopted to extract the contour features from the color image, the 7 Hu invariant moment vectors are used to describe the shape features, and the Euclidean distance is adopted to calculate the similarity of the shape feature vectors.
As for color moments: the color moment is a simple and effective color feature representation, comprising the first moment (mean), the second moment (standard deviation) and the third moment (skewness), among others. Since color information is mainly distributed in the low-order moments, the first, second and third moments are sufficient to express the color distribution of the image:

μ_i = (1/N) sum_j p_{i,j}
σ_i = [ (1/N) sum_j (p_{i,j} - μ_i)^2 ]^{1/2}
s_i = [ (1/N) sum_j (p_{i,j} - μ_i)^3 ]^{1/3}

where p_{i,j} is the i-th color component of the j-th pixel of the color image and N is the number of pixels in the image.
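The three colour moments above can be sketched directly in NumPy (names illustrative; note that a constant image has zero second and third moments):

```python
import numpy as np

def color_moments(img):
    """img: H x W x 3 array -> 9-vector [mean, std, skew-root] per channel."""
    px = img.reshape(-1, 3).astype(float)
    mu = px.mean(axis=0)                          # first moment
    sigma = np.sqrt(((px - mu) ** 2).mean(axis=0))  # second moment
    s = np.cbrt(((px - mu) ** 3).mean(axis=0))      # third root keeps the sign
    return np.concatenate([mu, sigma, s])

img = np.full((2, 2, 3), 10, dtype=np.uint8)
m = color_moments(img)
print(m[:3], m[3:6])  # constant image: means = 10, deviations = 0
```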
The Hu moment values are invariant under rotation, scaling, mirroring and translation; that is, they remain essentially unchanged when the same or a similar shape undergoes rotation, scaling and translation transformations. For an obstacle of fixed shape such as a colored tent, changes such as scaling and translation occur during the flight of an unmanned aerial vehicle, but the Hu moment values change little; for a flame, whose area and contour change irregularly, the Hu moment values also change irregularly. The Hu moments are a set of 7 invariant moments obtained from the second- and third-order central moments, as shown in the following formulas:

M1 = y20 + y02
M2 = (y20 - y02)^2 + 4*y11^2
M3 = (y30 - 3y12)^2 + (3y21 - y03)^2
M4 = (y30 + y12)^2 + (y21 + y03)^2
M5 = (y30 - 3y12)(y30 + y12)[(y30 + y12)^2 - 3(y21 + y03)^2] + (3y21 - y03)(y21 + y03)[3(y30 + y12)^2 - (y21 + y03)^2]
M6 = (y20 - y02)[(y30 + y12)^2 - (y21 + y03)^2] + 4*y11(y30 + y12)(y21 + y03)
M7 = (3y21 - y03)(y30 + y12)[(y30 + y12)^2 - 3(y21 + y03)^2] - (y30 - 3y12)(y21 + y03)[3(y30 + y12)^2 - (y21 + y03)^2]
Finally, the shape feature similarity of the obtained shape feature vector matrices is calculated through the Euclidean distance algorithm:
where M_t represents the shape feature vector matrix of the image.
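As an illustrative sketch (not the patented implementation itself), the 7 Hu invariant moments can be computed from normalized central moments with plain NumPy; in practice OpenCV's cv2.HuMoments(cv2.moments(img)) yields the same quantities. The function names and the toy arrays below are assumptions for demonstration only.

```python
import numpy as np

def hu_moments(img):
    """Seven Hu invariant moments of a 2-D grayscale array (illustrative sketch)."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def eta(p, q):
        # normalized central moment of order p + q
        mu = ((x - xc) ** p * (y - yc) ** q * img).sum()
        return mu / m00 ** (1 + (p + q) / 2.0)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    m1 = n20 + n02
    m2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    m3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    m4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    m5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    m6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    m7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([m1, m2, m3, m4, m5, m6, m7])

def shape_similarity(img1, img2):
    # Euclidean distance between Hu-moment vectors; smaller = more similar
    return float(np.linalg.norm(hu_moments(img1) - hu_moments(img2)))
```

Because the central moments are taken about the image centroid, translating a shape leaves the Hu vector unchanged, which is exactly the invariance property described above.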
S15: performing comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result to obtain a comprehensive similarity result;
in the implementation process of the present invention, the performing a comprehensive similarity calculation process on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result, and the shape feature similarity extraction result includes: normalizing the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result based on a Gaussian normalization processing formula to obtain a normalized brightness feature similarity extraction result, a normalized color feature similarity extraction result, a normalized texture feature similarity extraction result and a normalized shape feature similarity extraction result; and carrying out comprehensive weighting processing based on weight coefficients respectively corresponding to the normalized brightness feature similarity extraction result, the normalized color feature similarity extraction result, the normalized texture feature similarity extraction result and the normalized shape feature similarity extraction result, wherein the weight coefficients are determined by adopting an entropy weight method.
Specifically, the evaluation indexes are normalized. Camouflage practice shows that the distribution of image feature similarity is approximately Gaussian; therefore, Gaussian normalization is applied to the similarities of the different features.
And carrying out normalization processing on the calculated value data of the image feature similarity, wherein a Gaussian normalization processing formula is as follows:
where x_i represents the similarity index value of a color, brightness, shape, or texture feature of the image; x̄ represents the mean of the image feature similarities; and σ represents the distribution variance of the feature similarities. The weight coefficients corresponding to the brightness, shape, color, and texture features of the image are determined by the entropy weight method, and the weight coefficients can be fine-tuned and retrained through the management interface to improve accuracy.
Calculation example, with input similarities [0.528, 0.43, 0.981, 0.764]:
1. Calculate the mean: tran_avg = sum(tranData)/len(tranData) = (0.528 + 0.43 + 0.981 + 0.764)/4 = 0.67575
2. Calculate the variance: tran_var = np.var(tranData) = 0.045797
3. Normalization bounds:
min = tran_avg - 3*tran_var = 0.538358
max = tran_avg + 3*tran_var = 0.813142
Normalization:
tranData[i] < min: Y[i] = 0
tranData[i] > max: Y[i] = 1
min <= tranData[i] <= max: Y[i] = (tranData[i] - tran_avg)/3 + 1
Y = [0, 0, 1, 1.0294]
4. Calculate the weights:
a. Calculate the entropy value e(j) of the j-th index, starting from e = [0, 0, 0, 0] with k = 1/log10(4) = 1.66096:
Y[0] == 0: e[0] = 0
Y[1] == 0: e[1] = 0
Y[2] != 0: e[2] = -1.66096*(1*log10(1)) = 0
Y[3] != 0: e[3] = -1.66096*(1.0294*log10(1.0294)) = -0.0215
b. Calculate the information entropy redundancy: d = np.ones_like(e) - e = [1, 1, 1, 1] - [0, 0, 0, -0.0215] = [1, 1, 1, 1.0215]
c. Calculate the weight of each index: w = d/np.sum(d) = [1, 1, 1, 1.0215]/4.0215 = [0.249, 0.249, 0.249, 0.253]
5. Calculate the comprehensive similarity: 0×0.249 + 0×0.249 + 1×0.249 + 1.0294×0.253 = 0.50944.
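The worked example above can be reproduced in a few lines of Python. This is a sketch of the arithmetic only: the variable names follow the example's pseudocode, the in-range normalization branch is written to match the printed Y values, and unrounded weights are kept throughout, so the final score comes out ≈0.510 rather than the 0.50944 obtained from the rounded weights.

```python
import math

tranData = [0.528, 0.43, 0.981, 0.764]
n = len(tranData)

# 1-2. mean and (population) variance, as np.var would compute them
avg = sum(tranData) / n
var = sum((x - avg) ** 2 for x in tranData) / n

# 3. Gaussian normalization with 3*variance bounds, as in the example
lo, hi = avg - 3 * var, avg + 3 * var
Y = [0.0 if x < lo else 1.0 if x > hi else (x - avg) / 3 + 1
     for x in tranData]

# 4. entropy weights (base-10 logarithms, following the example)
k = 1 / math.log10(n)
e = [0.0 if y == 0 else -k * y * math.log10(y) for y in Y]   # entropy per index
d = [1 - ej for ej in e]                                     # redundancy
w = [dj / sum(d) for dj in d]                                # weights

# 5. comprehensive similarity
score = sum(y * wj for y, wj in zip(Y, w))
```

With the example's inputs this gives Y = [0, 0, 1, 1.0294] and weights ≈ [0.249, 0.249, 0.249, 0.254], hence a comprehensive similarity of about 0.510.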
S16: and obtaining an engineering camouflage effect evaluation result based on the comprehensive similarity result.
Specifically, the engineering camouflage effect evaluation result can be obtained by integrating the similarity result.
In the embodiment of the invention, when the camouflage effect is evaluated, the problem that the evaluation result is inconsistent with subjective perception because the background area is too large and a simple average is taken is avoided; the objective differences between the camouflage area and the background are mined relatively comprehensively, providing a specific reference for the direction of improvement of the target camouflage effect.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an engineering camouflage effect evaluation device according to an embodiment of the invention.
As shown in fig. 2, the engineering camouflage effect evaluation device comprises:
The first similarity extraction module 21: configured to obtain a camouflage effect image of the engineering camouflage, and perform brightness feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a brightness feature similarity extraction result;
In the implementation process of the invention, the brightness feature similarity extraction processing of the camouflage effect image of the engineering camouflage comprises the following steps: performing luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function to obtain luminance distribution histogram matrix information, wherein the luminance distribution histogram matrix information contains the image information of the camouflage effect image and reflects the probability distribution of the image pixels of the camouflage effect image; and performing brightness feature similarity extraction processing on the luminance distribution histogram matrix information based on the cv2.compareHist() function.
Further, the luminance histogram matrix extraction processing of the camouflage effect image based on the cv2.calcHist() function includes: calculating the probability p(k) of occurrence of brightness level k in the camouflage effect image, where p(k) is calculated as follows:
p(k) = r_k / R, where N represents the number of brightness levels, R represents the total number of pixels in the camouflage effect image, and r_k represents the number of pixels at brightness level k. The probability distribution p(k) is then cumulatively summed to obtain the brightness distribution histogram matrix information h(k), with the cumulative summation calculated as follows:
wherein n=n-1; for brightness level ranges of [0, N-1 ]]Is represented as S k The method comprises the steps of carrying out a first treatment on the surface of the For the brightness of the image, its 256-level high image is transformed to 0-255.
Specifically, after the camouflage effect image of the engineering camouflage is obtained, the camouflage effect image needs to be subjected to brightness feature extraction processing, namely, the brightness distribution of the image is represented by L in the LAB color space, and the coincidence degree of the brightness distribution histograms of the two pictures is defined as the brightness feature similarity.
Firstly, luminance histogram matrix extraction processing is performed on the camouflage effect image through the cv2.calcHist() function to obtain the luminance distribution histogram matrix information, which contains the image information of the camouflage effect image and reflects the probability distribution of its image pixels. For the image f(x, y), where (x, y) denotes the pixel at coordinates (x, y), the probability p(k) of occurrence of brightness level k in the camouflage effect image is calculated as follows:
p(k) = r_k / R, where N represents the number of brightness levels, R represents the total number of pixels in the camouflage effect image, and r_k represents the number of pixels at brightness level k. The probability distribution is then cumulatively summed to obtain the cumulative normalized luminance distribution histogram matrix information h(k), with the cumulative summation calculated as follows:
wherein n=n-1; for brightness level ranges of [0, N-1 ]]Is represented as S k The method comprises the steps of carrying out a first treatment on the surface of the For the brightness of the image, its 256-level high image is transformed to 0-255.
Then, luminance feature similarity extraction processing is performed on the luminance distribution histogram matrix information through the cv2.compareHist() function to obtain the luminance feature similarity extraction result. That is, to compare two histograms H1 and H2, a comparison standard d(H1, H2) for measuring histogram similarity must first be selected. The histograms H1 and H2 are calculated from the two input images and normalized to the same scale space; the similarity of the two histograms is then obtained by calculating the distance between H1 and H2, which in turn compares the similarity of the images. With the Bhattacharyya measure, a value of 1 indicates a complete match and a value of 0 a complete mismatch; the calculation formula is as follows:
The Bhattacharyya measure achieves the best results when calculating histogram similarity, but its calculation is the most complex.
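The Bhattacharyya comparison that cv2.compareHist() performs can be sketched directly in NumPy. For two histograms normalized to sum to 1, the Bhattacharyya coefficient is Σ√(h1·h2), which is 1 for a complete match and 0 for a complete mismatch, matching the convention stated above (function name and test histograms are illustrative assumptions):

```python
import numpy as np

def bhattacharyya_coefficient(h1, h2):
    """Similarity of two histograms, each normalized to sum to 1."""
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.sum(np.sqrt(h1 * h2)))
```

Note that OpenCV's HISTCMP_BHATTACHARYYA mode reports the related distance, where 0 means a perfect match; the coefficient form above is the one whose 1 = match, 0 = mismatch convention is used in this description.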
The second similarity extraction module 22: configured to perform color feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a color feature similarity extraction result;
In the implementation process of the invention, the color feature similarity extraction processing of the camouflage effect image of the engineering camouflage comprises the following steps: reading the camouflage effect image from the memory based on the cv2.imdecode() function and converting it into a preset image format to form an image array; creating RGB three channels based on the image array to obtain a color histogram matrix and taking the color histogram matrix as the color feature, wherein each of the RGB three channels is set to 16 bins to obtain the color histogram matrix; and performing Bhattacharyya distance calculation processing on the color features based on the cv2.compareHist() function, and performing color feature similarity extraction processing based on the Bhattacharyya distance calculation result.
Specifically, common color spaces include RGB, HSV, and Lab. The RGB color space (red, green, blue, each channel in the range [0, 255]) is easy to understand, although it is not intuitive for continuous color transformation. A histogram of the RGB color space is obtained and normalized, and the coincidence degree of the color distribution histograms is calculated to define the color feature similarity.
Firstly, the camouflage effect image is read from the memory through the cv2.imdecode() function and converted into a preset image format to form an image array; this is mainly used for reading pictures from the network, and for local pictures the imread function can be used instead.
Secondly, an RGB three-channel histogram matrix is created from the acquired image as the color feature, with each of the three channels set to 16 bins, yielding the color histogram. A histogram, also known as a quality distribution chart, is a statistical report that represents the data distribution by a series of vertical stripes or segments of unequal height; the horizontal axis generally represents the data type and the vertical axis the distribution. The histogram is a widely used tool in image processing; it is essentially a graphical representation of a probability distribution and can also be used to represent vectors. For images, not only gray-level histograms but also histograms of arbitrary form can be created by appropriate design.
Finally, the Bhattacharyya distance of the obtained color histograms is calculated through cv2.compareHist().
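A sketch of the 16-bins-per-channel RGB color histogram described above, built with NumPy rather than cv2.calcHist (the function name and the toy image are assumptions for illustration):

```python
import numpy as np

def rgb_histogram(img, bins=16):
    """Concatenated per-channel histogram of an RGB uint8 image, 16 bins each."""
    hists = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(float)
    return h / h.sum()   # normalize so the full histogram sums to 1
```

The resulting 48-element vector (3 channels × 16 bins) can be fed to the same Bhattacharyya comparison used for the brightness feature.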
The third similarity extraction module 23: configured to perform texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a texture feature similarity extraction result;
In the implementation process of the invention, the texture feature similarity extraction processing of the camouflage effect image of the engineering camouflage comprises the following steps: obtaining the image energy, entropy, contrast, correlation, and inverse difference moment of the camouflage effect image based on the methods of the skimage library, and taking them as the texture features of the camouflage effect image; and performing texture feature similarity calculation processing on the texture features based on the Euclidean distance algorithm.
Specifically, an image that is easily affected by illumination changes is converted into an LBP image with good robustness to illumination changes, and the LBP image is used to build the gray-level co-occurrence matrix features of the image. The texture characteristics of the camouflage target and the background are represented by the energy (ASM), entropy (H), contrast (CON), correlation (COR), and inverse difference moment (L) of the gray-level co-occurrence matrix, and the similarity of the texture distributions is defined using the Euclidean distance.
Firstly, the image energy (ASM), entropy (ENT), contrast (CON), correlation (COR), and inverse difference moment (HOMO) are obtained as texture features through the methods of the skimage library.
Inverse difference moment (HOMO): measures the homogeneity of the local gray-level intensity of the image; the more homogeneous the local intensity, the larger the inverse difference moment;
Angular second moment (ASM): also called energy, a measure of image uniformity; the more uniform the gray-level distribution of the image, the larger the corresponding ASM value, and conversely the smaller;
Entropy (ENT): a quantity measuring the amount of information in the target image, of which texture information is one aspect. The more dispersed the elements in the image, the greater the entropy, and vice versa; the magnitude of the entropy represents the uniformity or complexity of the texture of the target image;
ENT = -Σ_{a,b} δ_{φ,d}(a,b) · log2 δ_{φ,d}(a,b);
Contrast (CON): reflects the degree of local gray-level variation in the image; the larger the gray-value differences in the image and the sharper the image edges, the larger the contrast. Typically k = 2 and λ = 1;
Dissimilarity (DIS): similar to contrast, but responds better to local characteristics; when the local contrast increases, the dissimilarity also increases;
DIS = Σ_a Σ_b δ_{φ,d}(a,b) · |a - b|;
Correlation (COR): a measure of the linear relationship of the gray-level pixels of the target image, representing the similarity of the gray-level relationships between the rows and columns of the gray-level co-occurrence matrix:
Finally, the texture feature similarity of the obtained texture feature vector matrices is calculated through the Euclidean distance algorithm:
where G_m represents the texture feature vector matrix of the image.
The fourth similarity extraction module 24: configured to perform shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a shape feature similarity extraction result;
In the implementation process of the invention, the shape feature similarity extraction processing of the camouflage effect image of the engineering camouflage comprises the following steps: extracting contour features from the color image of the camouflage effect image based on a color-difference Sobel operator, and performing shape feature description processing on the contour features by using the 7 Hu invariant moment vectors to obtain described shape features; and performing shape feature similarity calculation processing on the described shape features based on the Euclidean distance algorithm.
Specifically, the color difference Sobel operator is adopted to extract the outline features in the color image, 7 Hu invariant moment vectors are used for describing the shape features, and the Euclidean distance is adopted to calculate the similarity of the shape feature vectors.
As for color moments: the color moment is a simple and effective color feature representation, including the first moment (mean), the second moment (standard deviation), and the third moment (skewness), among others. Since color information is mainly concentrated in the low-order moments, the first, second, and third moments are sufficient to express the color distribution in the image.
where p_{i,j} represents the i-th color component of the j-th pixel of the color image, and N represents the number of pixels in the image.
The Hu moment values are invariant to rotation, scaling, mirroring, and translation; that is, the same or similar shapes keep nearly identical Hu moments after rotation, scaling, and translation transformations. For a fixed-shape obstacle such as a colored tent, scaling, translation, and similar changes occur during the flight of the unmanned aerial vehicle, but the Hu moment values change little; for the irregularly changing area and contour of a flame, the Hu moment values correspondingly change irregularly as well. The Hu moments are a set of 7 invariant moments obtained from the second- and third-order central moments, as shown in the following formulas:
M1 = y20 + y02;
M2 = (y20 - y02)^2 + 4·y11^2;
M3 = (y30 - 3y12)^2 + (3y21 - y03)^2;
M4 = (y30 + y12)^2 + (y21 + y03)^2;
M5 = (y30 - 3y12)(y30 + y12)[(y30 + y12)^2 - 3(y21 + y03)^2] + (3y21 - y03)(y21 + y03)[3(y30 + y12)^2 - (y21 + y03)^2];
M6 = (y20 - y02)[(y30 + y12)^2 - (y21 + y03)^2] + 4y11(y30 + y12)(y21 + y03);
M7 = (3y21 - y03)(y30 + y12)[(y30 + y12)^2 - 3(y21 + y03)^2] - (y30 - 3y12)(y21 + y03)[3(y30 + y12)^2 - (y21 + y03)^2];
where ypq denotes the normalized central moment of order p + q.
Finally, the shape feature similarity of the obtained shape feature vector matrices is calculated through the Euclidean distance algorithm:
where M_t represents the shape feature vector matrix of the image.
The comprehensive similarity calculation module 25: configured to perform comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result, and the shape feature similarity extraction result to obtain a comprehensive similarity result;
In the implementation process of the present invention, the performing a comprehensive similarity calculation process on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result, and the shape feature similarity extraction result includes: normalizing the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result based on a Gaussian normalization processing formula to obtain a normalized brightness feature similarity extraction result, a normalized color feature similarity extraction result, a normalized texture feature similarity extraction result and a normalized shape feature similarity extraction result; and carrying out comprehensive weighting processing based on weight coefficients respectively corresponding to the normalized brightness feature similarity extraction result, the normalized color feature similarity extraction result, the normalized texture feature similarity extraction result and the normalized shape feature similarity extraction result, wherein the weight coefficients are determined by adopting an entropy weight method.
Specifically, the evaluation indexes are normalized. Camouflage practice shows that the distribution of image feature similarity is approximately Gaussian; therefore, Gaussian normalization is applied to the similarities of the different features.
And carrying out normalization processing on the calculated value data of the image feature similarity, wherein a Gaussian normalization processing formula is as follows:
where x_i represents the similarity index value of a color, brightness, shape, or texture feature of the image; x̄ represents the mean of the image feature similarities; and σ represents the distribution variance of the feature similarities. The weight coefficients corresponding to the brightness, shape, color, and texture features of the image are determined by the entropy weight method, and the weight coefficients can be fine-tuned and retrained through the management interface to improve accuracy.
Calculation example, with input similarities [0.528, 0.43, 0.981, 0.764]:
1. Calculate the mean: tran_avg = sum(tranData)/len(tranData) = (0.528 + 0.43 + 0.981 + 0.764)/4 = 0.67575
2. Calculate the variance: tran_var = np.var(tranData) = 0.045797
3. Normalization bounds:
min = tran_avg - 3*tran_var = 0.538358
max = tran_avg + 3*tran_var = 0.813142
Normalization:
tranData[i] < min: Y[i] = 0
tranData[i] > max: Y[i] = 1
min <= tranData[i] <= max: Y[i] = (tranData[i] - tran_avg)/3 + 1
Y = [0, 0, 1, 1.0294]
4. Calculate the weights:
a. Calculate the entropy value e(j) of the j-th index, starting from e = [0, 0, 0, 0] with k = 1/log10(4) = 1.66096:
Y[0] == 0: e[0] = 0
Y[1] == 0: e[1] = 0
Y[2] != 0: e[2] = -1.66096*(1*log10(1)) = 0
Y[3] != 0: e[3] = -1.66096*(1.0294*log10(1.0294)) = -0.0215
b. Calculate the information entropy redundancy: d = np.ones_like(e) - e = [1, 1, 1, 1] - [0, 0, 0, -0.0215] = [1, 1, 1, 1.0215]
c. Calculate the weight of each index: w = d/np.sum(d) = [1, 1, 1, 1.0215]/4.0215 = [0.249, 0.249, 0.249, 0.253]
5. Calculate the comprehensive similarity: 0×0.249 + 0×0.249 + 1×0.249 + 1.0294×0.253 = 0.50944.
The obtaining module 26: configured to obtain the engineering camouflage effect evaluation result based on the comprehensive similarity result.
Specifically, the engineering camouflage effect evaluation result can be obtained by integrating the similarity result.
In the embodiment of the invention, when the camouflage effect is evaluated, the problem that the evaluation result is inconsistent with subjective perception because the background area is too large and a simple average is taken is avoided; the objective differences between the camouflage area and the background are mined relatively comprehensively, providing a specific reference for the direction of improvement of the target camouflage effect.
The embodiment of the invention provides a computer-readable storage medium on which an application program is stored; when executed by a processor, the program implements the engineering camouflage effect evaluation method of any one of the above embodiments. The computer-readable storage medium includes, but is not limited to, any type of disk (including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks), ROM (Read-Only Memory), RAM (Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash memory, magnetic cards, or optical cards. That is, a storage device includes any medium that stores or transmits information in a form readable by a device (e.g., a computer or mobile phone), and may be a read-only memory, a magnetic disk, an optical disk, or the like.
The embodiment of the invention also provides a computer application program which runs on a computer and is used for executing the engineering camouflage effect evaluation method of any one of the above embodiments.
Further, fig. 3 is a schematic structural composition diagram of the evaluation apparatus in the embodiment of the present invention.
The embodiment of the invention also provides an evaluation device, as shown in fig. 3. The evaluation device includes a processor 302, a memory 303, an input unit 304, a display unit 305, and the like. Those skilled in the art will appreciate that the evaluation device architecture shown in fig. 3 does not constitute a limitation of all devices, and may include more or fewer components than shown, or may combine certain components. The memory 303 may be used to store an application 301 and various functional modules, and the processor 302 runs the application 301 stored in the memory 303, thereby performing various functional applications of the device and data processing. The memory may be internal memory or external memory, or include both internal memory and external memory. The internal memory may include read-only memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), flash memory, or random access memory. The external memory may include a hard disk, floppy disk, ZIP disk, U-disk, tape, etc. The disclosed memory includes, but is not limited to, these types of memory. The memory disclosed herein is by way of example only and not by way of limitation.
The input unit 304 is used for receiving input of a signal and receiving keywords input by a user. The input unit 304 may include a touch panel and other input devices. The touch panel may collect touch operations on or near the user (e.g., the user's operation on or near the touch panel using any suitable object or accessory such as a finger, stylus, etc.), and drive the corresponding connection device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., play control keys, switch keys, etc.), a trackball, mouse, joystick, etc. The display unit 305 may be used to display information input by a user or information provided to the user and various menus of the terminal device. The display unit 305 may take the form of a liquid crystal display, an organic light emitting diode, or the like. The processor 302 is a control center of the terminal device, connects various parts of the entire device using various interfaces and lines, performs various functions and processes data by running or executing software programs and/or modules stored in the memory 303, and invoking data stored in the memory.
As one embodiment, the evaluation device includes: the system comprises one or more processors 302, a memory 303, one or more application programs 301, wherein the one or more application programs 301 are stored in the memory 303 and configured to be executed by the one or more processors 302, and the one or more application programs 301 are configured to perform the engineering disguise effect assessment method in any one of the above embodiments.
In the embodiment of the invention, when the camouflage effect is evaluated, the problem that the evaluation result is inconsistent with subjective perception because the background area is too large and a simple average is taken is avoided; the objective differences between the camouflage area and the background are mined relatively comprehensively, providing a specific reference for the direction of improvement of the target camouflage effect.
In addition, the engineering camouflage effect evaluation method and related device provided by the embodiments of the invention have been described in detail above, and specific examples have been used to illustrate the principle and implementation of the invention; the description of the above embodiments is intended only to help understand the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific embodiments and application scope in accordance with the ideas of the present invention; in view of the above, the content of this description should not be construed as limiting the present invention.
Claims (10)
1. An engineering camouflage effect evaluation method, comprising:
obtaining a camouflage effect image of engineering camouflage, and carrying out brightness feature similarity extraction processing on the camouflage effect image of engineering camouflage to obtain a brightness feature similarity extraction result;
Performing color feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a color feature similarity extraction result;
performing texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a texture feature similarity extraction result;
performing shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a shape feature similarity extraction result;
performing comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result to obtain a comprehensive similarity result;
and obtaining an engineering camouflage effect evaluation result based on the comprehensive similarity result.
2. The engineering camouflage effect evaluation method according to claim 1, wherein the processing of extracting the brightness feature similarity of the camouflage effect image of the engineering camouflage includes:
performing luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function to obtain luminance distribution histogram matrix information, wherein the luminance distribution histogram matrix information contains the image information of the camouflage effect image and is used for reflecting the probability distribution of the image pixels of the camouflage effect image;
and performing brightness feature similarity extraction processing on the luminance distribution histogram matrix information based on the cv2.compareHist() function.
3. The engineering camouflage effect evaluation method according to claim 2, wherein the performing luminance histogram matrix extraction processing on the camouflage effect image based on the cv2.calcHist() function comprises:
calculating the probability p (k) of occurrence of the brightness level k in the camouflage effect image, wherein the calculation process of p (k) is as follows:
p(k) = r_k / R, wherein N represents the number of brightness levels, R represents the total number of pixels in the camouflage effect image, and r_k represents the number of pixels at brightness level k;
and carrying out accumulated summation calculation on the probability p (k) distribution to obtain brightness distribution histogram matrix information h (k), wherein the accumulated summation calculation process is as follows:
wherein n=n-1; for brightness level ranges of [0, N-1 ]]Is represented as S k The method comprises the steps of carrying out a first treatment on the surface of the For the brightness of the image, its 256-level high image is transformed to 0-255.
4. The engineering camouflage effect evaluation method according to claim 1, wherein performing the color feature similarity extraction processing on the camouflage effect image of the engineering camouflage comprises:
reading the camouflage effect image from the memory based on the cv2.imdecode() function, and converting the camouflage effect image into a preset image format to form an image array;
computing a histogram over the three RGB channels of the image array to obtain a color histogram matrix, and taking the color histogram matrix as the color feature, wherein each of the three RGB channels is quantized into 16 bins to obtain the color histogram matrix;
and carrying out Bhattacharyya distance calculation processing on the color features based on the cv2.compareHist() function, and carrying out the color feature similarity extraction processing based on the Bhattacharyya distance calculation result.
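The color step of claim 4 can be sketched as below. This is an assumption-laden simplification: each RGB channel is binned independently into 16 bins and the three histograms concatenated (a joint cv2.calcHist over all three channels would instead give 16³ bins), and the Bhattacharyya distance is written out by hand; on L1-normalized histograms it matches `cv2.compareHist(..., cv2.HISTCMP_BHATTACHARYYA)`.

```python
import numpy as np

def color_histogram(img, bins=16):
    """Per-channel 16-bin RGB histogram, concatenated and L1-normalized."""
    hists = [np.histogram(img[..., c].ravel(), bins=bins, range=(0, 256))[0]
             for c in range(3)]
    h = np.concatenate(hists).astype(np.float64)
    return h / h.sum()

def bhattacharyya_distance(p, q):
    """d = sqrt(1 - BC) with BC = sum(sqrt(p * q)); 0 for identical
    distributions, 1 for fully disjoint ones."""
    bc = np.sqrt(p * q).sum()
    return float(np.sqrt(max(0.0, 1.0 - bc)))

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
hist = color_histogram(img)
```

A small Bhattacharyya distance between target and background histograms corresponds to high color similarity, i.e. better color camouflage.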
5. The engineering camouflage effect evaluation method according to claim 1, wherein performing the texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage comprises:
obtaining the energy, entropy, contrast, correlation and inverse difference moment of the camouflage effect image based on the skimage library, and taking the energy, entropy, contrast, correlation and inverse difference moment as the texture features of the camouflage effect image;
and carrying out texture feature similarity calculation processing on the texture features based on a Euclidean distance algorithm.
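The five statistics named in claim 5 are the classic gray-level co-occurrence matrix (GLCM) features. The sketch below hand-rolls a single-offset GLCM in numpy so it is self-contained (in practice `skimage.feature.graycomatrix`/`graycoprops` would be used, as the claim indicates); the quantization level and offset are illustrative choices, not taken from the patent.

```python
import numpy as np

def glcm_features(gray, levels=8, dx=1):
    """GLCM for the horizontal neighbour offset, then energy, entropy,
    contrast, correlation and inverse difference moment."""
    q = (gray.astype(np.int64) * levels) // 256        # quantize to `levels`
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-dx].ravel(), q[:, dx:].ravel()), 1)
    p = glcm / glcm.sum()                              # joint probabilities
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    nz = p[p > 0]
    return np.array([
        (p ** 2).sum(),                                # energy
        -(nz * np.log2(nz)).sum(),                     # entropy
        ((i - j) ** 2 * p).sum(),                      # contrast
        (((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j)
         if sd_i * sd_j else 1.0),                     # correlation
        (p / (1.0 + (i - j) ** 2)).sum(),              # inverse diff. moment
    ])

def texture_similarity(f1, f2):
    """Euclidean distance between texture feature vectors."""
    return float(np.linalg.norm(f1 - f2))
```

Comparing the feature vector of the camouflaged region against that of the background by Euclidean distance gives the texture similarity: the smaller the distance, the closer the textures.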
6. The engineering camouflage effect evaluation method according to claim 1, wherein performing the shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage comprises:
extracting contour features from the color image of the camouflage effect image based on a color-difference Sobel operator, and carrying out shape feature description processing on the contour features using the 7 Hu invariant moment vectors to obtain described shape features;
and carrying out shape feature similarity calculation processing on the described shape features based on the Euclidean distance algorithm.
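The 7 Hu invariant moments of claim 6 are typically obtained via `cv2.HuMoments(cv2.moments(contour))`; the numpy version below spells out the underlying normalized central moments so the invariance is visible. It operates on a binary shape mask rather than a Sobel-extracted contour, which is a simplifying assumption.

```python
import numpy as np

def hu_moments(binary):
    """The 7 Hu moments of a binary shape mask, invariant to translation,
    scale and rotation."""
    y, x = np.indices(binary.shape)
    m = binary.astype(np.float64)
    m00 = m.sum()
    cx, cy = (x * m).sum() / m00, (y * m).sum() / m00   # centroid
    def eta(p, q):
        """Normalized central moment eta_pq."""
        mu = (((x - cx) ** p) * ((y - cy) ** q) * m).sum()
        return mu / m00 ** (1 + (p + q) / 2.0)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    a, b = n30 + n12, n21 + n03
    return np.array([
        n20 + n02,
        (n20 - n02) ** 2 + 4 * n11 ** 2,
        (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
        a ** 2 + b ** 2,
        (n30 - 3 * n12) * a * (a ** 2 - 3 * b ** 2)
        + (3 * n21 - n03) * b * (3 * a ** 2 - b ** 2),
        (n20 - n02) * (a ** 2 - b ** 2) + 4 * n11 * a * b,
        (3 * n21 - n03) * a * (a ** 2 - 3 * b ** 2)
        - (n30 - 3 * n12) * b * (3 * a ** 2 - b ** 2),
    ])
```

Because the moments are translation-invariant, the Euclidean distance between the two 7-vectors measures genuine shape difference rather than difference in position within the frame.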
7. The engineering camouflage effect evaluation method according to claim 1, wherein performing the comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result comprises:
normalizing the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result based on a Gaussian normalization processing formula to obtain a normalized brightness feature similarity extraction result, a normalized color feature similarity extraction result, a normalized texture feature similarity extraction result and a normalized shape feature similarity extraction result;
and carrying out comprehensive weighting processing based on weight coefficients respectively corresponding to the normalized brightness feature similarity extraction result, the normalized color feature similarity extraction result, the normalized texture feature similarity extraction result and the normalized shape feature similarity extraction result, wherein the weight coefficients are determined by adopting an entropy weight method.
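The fusion step of claim 7 can be sketched as below. Assumptions are flagged inline: "Gaussian normalization" is read here as z-score normalization (the claim does not spell out the formula), and the entropy weight method is written in its standard form, with samples in rows and the four similarity results in columns.

```python
import numpy as np

def gaussian_normalize(scores):
    """z-score (Gaussian) normalization of one similarity result across samples."""
    s = np.asarray(scores, dtype=np.float64)
    sd = s.std()
    return (s - s.mean()) / sd if sd else np.zeros_like(s)

def entropy_weights(matrix):
    """Entropy weight method: columns carrying more information
    (lower entropy) receive larger weights; weights sum to 1."""
    x = np.asarray(matrix, dtype=np.float64)
    x = x - x.min(axis=0) + 1e-12                 # shift columns positive
    p = x / x.sum(axis=0)                         # column-wise proportions
    m = x.shape[0]
    e = -(p * np.log(p)).sum(axis=0) / np.log(m)  # per-column entropy in [0, 1]
    d = 1.0 - e                                   # degree of divergence
    return d / d.sum()

# comprehensive similarity = weighted sum of the four normalized results:
# S = w @ [s_brightness, s_color, s_texture, s_shape]
```

With weights from the entropy weight method, the comprehensive similarity is simply the dot product of the weight vector with the four normalized extraction results.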
8. An engineering camouflage effect evaluation device, the device comprising:
a first similarity extraction module: used for obtaining a camouflage effect image of the engineering camouflage, and carrying out brightness feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a brightness feature similarity extraction result;
a second similarity extraction module: used for carrying out color feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a color feature similarity extraction result;
a third similarity extraction module: used for carrying out texture feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a texture feature similarity extraction result;
a fourth similarity extraction module: used for carrying out shape feature similarity extraction processing on the camouflage effect image of the engineering camouflage to obtain a shape feature similarity extraction result;
a comprehensive similarity calculation module: used for carrying out comprehensive similarity calculation processing on the brightness feature similarity extraction result, the color feature similarity extraction result, the texture feature similarity extraction result and the shape feature similarity extraction result to obtain a comprehensive similarity result;
and an obtaining module: used for obtaining the engineering camouflage effect evaluation result based on the comprehensive similarity result.
9. An evaluation device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311867947.8A CN117788835A (en) | 2023-12-29 | 2023-12-29 | Engineering camouflage effect evaluation method and related device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117788835A true CN117788835A (en) | 2024-03-29 |
Family
ID=90396299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311867947.8A Pending CN117788835A (en) | 2023-12-29 | 2023-12-29 | Engineering camouflage effect evaluation method and related device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117788835A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN119762810A (en) * | 2024-12-30 | 2025-04-04 | 北京理工大学 | A method for evaluating the effect of digital camouflage |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11004129B2 (en) | Image processing | |
KR102783839B1 (en) | Method for detecting skin quality, method for classifying skin quality grades, device for detecting skin quality, electronic device and storage medium | |
Pavithra et al. | An efficient framework for image retrieval using color, texture and edge features | |
US20220366194A1 (en) | Computer Vision Systems and Methods for Blind Localization of Image Forgery | |
Phung et al. | Skin segmentation using color pixel classification: analysis and comparison | |
US8401292B2 (en) | Identifying high saliency regions in digital images | |
CN111860670A (en) | Domain adaptive model training method, image detection method, device, equipment and medium | |
Wang et al. | A pixel-based color image segmentation using support vector machine and fuzzy C-means | |
US6674915B1 (en) | Descriptors adjustment when using steerable pyramid to extract features for content based search | |
KR101548928B1 (en) | Invariant visual scene and object recognition | |
US8340412B2 (en) | Image processing | |
US8064694B2 (en) | Nonhuman animal integument pixel classification | |
Zhang et al. | Dual-channel multi-task CNN for no-reference screen content image quality assessment | |
Jung et al. | Eye detection under varying illumination using the retinex theory | |
Sakthivel et al. | Color image segmentation using SVM pixel classification image | |
Guo et al. | Multitemporal hyperspectral images change detection based on joint unmixing and information coguidance strategy | |
CN112991448A (en) | Color histogram-based loop detection method and device and storage medium | |
CN111639212A (en) | Image retrieval method in mining intelligent video analysis | |
CN111105436B (en) | Target tracking method, computer device and storage medium | |
CN117788835A (en) | Engineering camouflage effect evaluation method and related device | |
Yang et al. | EHNQ: Subjective and objective quality evaluation of enhanced night-time images | |
Cao et al. | Grayscale Image Colorization Using an Adaptive Weighted Average Method. | |
Huang et al. | Robust skin detection in real-world images | |
Aznaveh et al. | A new color based method for skin detection using RGB vector space | |
Ko et al. | Adaptive growing and merging algorithm for image segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||