
CN1295643C - Automatic identifying method for skin micro image symptom - Google Patents

Automatic identifying method for skin micro image symptom

Info

Publication number
CN1295643C
CN1295643C · CNB200410053539XA · CN200410053539A
Authority
CN
China
Prior art keywords
symptom
average
recorded
edge
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB200410053539XA
Other languages
Chinese (zh)
Other versions
CN1588429A (en)
Inventor
胡越黎
曹家麟
冉峰
赵倩
冯栩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology
Priority to CNB200410053539XA
Publication of CN1588429A
Application granted
Publication of CN1295643C
Anticipated expiration
Legal status: Expired - Fee Related (current)

Landscapes

  • Image Analysis (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The present invention relates to a method for automatically identifying symptoms in skin micro-images. After a skin micro-image is preprocessed, the resulting symptom segmentation image is input into a microcomputer; characteristic parameters are extracted from the segmentation image and converted into a feature vector that reflects the essence of the symptom class; the feature vector is then classified by a recognition algorithm based on a support vector machine to identify the attributes of the symptom, and the identified attributes are displayed as text on the microcomputer display. The present invention lays a foundation for, and can serve as the implementation basis of, a skin diagnosis system.

Description

Skin micro-image symptom automatic identifying method
Technical field
The present invention relates to an automatic image identification method, and in particular to a method for automatically identifying symptoms in skin micro-images.
Background technology
Automatic identification of skin micro-image symptoms involves two steps: first, characteristic parameters are extracted from the symptom segmentation image obtained from the skin micro-image; these parameters are then used for classification and recognition, from which the attributes of the skin symptom are derived.
In the prior art, skin symptom features are extracted by imitating the visual diagnosis of a dermatologist: the size of the symptom (expressed by diameter or area, or by comparison with familiar objects such as a needle point, needle tip, rice grain, mung bean, pea, peanut, longan, lychee, walnut, egg, fist or palm), its shape (ellipse, hemisphere, pointed, pancake-like, polygonal, arc, annular, irregular), color (red, yellow, brown, black, white, normal skin color, etc.), tint (scarlet, light red, dark red, etc.), surface (smooth or rough, papillary or cauliflower-like, peaked, with or without a central umbilication), edge (clear or blurred, regular or infiltrating, curved, etc.), texture (solid or soft, cystic, fluctuant) and arrangement (patterns formed by multiple lesions: linear, polycyclic, clustered, scattered, banded or umbrella-like). This approach is rather complicated, and it is difficult to grasp and analyse accurately.
Many image classification methods exist, such as statistical pattern classification, structural methods, classification trees and neural networks. For high-dimensional, large-scale classification problems without prior knowledge, however, these methods rarely give satisfactory results. Artificial neural networks are effective for small-scale classification, but their learning algorithms have inherent drawbacks: there is still no reliable rule for determining the network structure, and training easily converges to local optima. For small-sample, multi-class training sets, artificial neural networks therefore tend to overfit severely, i.e. the fitting result is good but the predictive ability is poor.
Summary of the invention
The object of the present invention is to provide a method that automatically identifies the attributes of a symptom from the preprocessed symptom segmentation image of a given skin micro-image.
To achieve the above object, the present invention adopts the following technical solution:
A method for automatically identifying symptoms in skin micro-images, characterised in that the symptom segmentation image obtained by preprocessing a skin micro-image is input into a microcomputer; characteristic parameters extracted from the segmentation image are converted into a feature vector that reflects the essence of the symptom class; the feature vector is classified by a recognition algorithm based on a support vector machine to identify the attributes of the symptom; and the attributes are then displayed as text on the microcomputer display. The concrete steps are:
1) The concrete steps of extracting the characteristic parameters of the symptom segmentation image and converting them into a feature vector that reflects the essence of the symptom class are:
A. Extract the geometric features of the symptom:
(a) Extract the area of the symptom: count the number of pixels whose value is 1 in the segmentation map of each symptom; this count is the area of the symptom;
(b) Extract the maximum and minimum diameters of the symptom: project the symptom segmentation map in the direction at angle θ to the horizontal to obtain a projection vector; count the number of elements equal to 1 in this vector and denote it N(θ); increment θ by 1° from 0° to 179°, giving 180 values of N(θ); take the maximum of these 180 values as the maximum diameter of the symptom and the minimum as the minimum diameter;
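The area and projection-based diameter computation above can be prototyped in a few lines of NumPy. The sketch below is illustrative only — it is not the patent's implementation, and the function name `area_and_diameters` and the unit-length binning of the projected coordinates are our own assumptions:

```python
import numpy as np

def area_and_diameters(mask):
    """mask: 2-D binary array of the symptom segmentation map (1 = symptom, 0 = background)."""
    ys, xs = np.nonzero(mask)
    area = xs.size                                   # number of pixels with value 1

    widths = []
    for theta_deg in range(180):                     # theta = 0 .. 179 degrees in 1-degree steps
        theta = np.deg2rad(theta_deg)
        # project every symptom pixel onto the direction (cos theta, sin theta)
        proj = xs * np.cos(theta) + ys * np.sin(theta)
        # N(theta): number of occupied unit-length bins of the projection vector
        widths.append(np.unique(np.round(proj).astype(int)).size)

    return area, max(widths), min(widths)            # area, maximum diameter, minimum diameter
```

Counting the occupied bins of the projected coordinates plays the role of counting the non-zero elements of the projection vector described above.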
B. Extract the color features of the symptom:
(a) Color features of the symptom interior: the differences between the average color quantities of the symptom interior and of the background skin are taken as color features:
i. Compute the hue feature of the symptom, denoted Hcha: Hcha = (averHzh − averHbei) × 360, where averHzh is the average hue of the symptom interior and averHbei is the average hue of the background skin;
ii. Compute the saturation (S) feature, denoted Scha: Scha = averSzh − averSbei, where averSzh is the average saturation of the symptom interior and averSbei is the average saturation of the background skin;
iii. Compute the intensity (I) feature, denoted Icha: Icha = averIzh − averIbei, where averIzh is the average intensity of the symptom interior and averIbei is the average intensity of the background skin;
iv. Compute the gray-level (G) feature, denoted Gcha: Gcha = averGzh − averGbei, where averGzh is the average gray value of the symptom interior and averGbei is the average gray value of the background skin;
(b) color characteristic at symptom edge extracts: outside the feature of difference as color of the average color amount at symptom edge and the average color amount of background skin, the difference of average color amount of also selecting the average color amount at symptom edge and symptom inside is as the feature of color:
I. the difference of the average color amount of the average color amount at symptom edge and background skin
● calculate the tone difference of symptom edge and background skin, be designated as ebHcha:ebHcha=(averHedge-averHbei) * 360; In the formula: the average tone at symptom edge is designated as averHedge; The average tone of background skin is designated as averHbei;
● calculate the saturation degree difference of symptom edge and background skin, be designated as ebScha, ebScha=averSedge-averSbei; In the formula: the average staturation at symptom edge is designated as averSedge; The average staturation of background skin is designated as averSbei;
● calculate the strength difference of symptom edge and background skin, be designated as ebIcha, ebIcha=averIedge-averIbei; In the formula: the mean intensity at symptom edge is designated as averIedge: the mean intensity of background skin is designated as averIbei;
● calculate the gray scale difference value of symptom edge and background skin, be designated as ebGcha, ebGcha=averGedge-averGbei; In the formula: the average gray at symptom edge is designated as averGedge; The average gray of background skin is designated as averGbei;
Ii. the difference of the average color amount of the average color amount at symptom edge and symptom inside
● calculate the tone difference of symptom edge and symptom inside, be designated as ezHcha, ebHcha=(averHedge-averHbei) * 360, in the formula: the average tone at symptom edge is designated as averHedge; The average tone of symptom inside is designated as averHzh;
● calculate the saturation degree difference of symptom edge and symptom inside, be designated as ezScha, ezScha=averSedge-averSzh, in the formula: the average staturation at symptom edge is designated as averSedge; The average staturation of symptom inside is designated as averSzh;
● calculate the strength difference of symptom edge and symptom inside, be designated as ezIcha, ezIcha=averIedge-averIzh, in the formula: the mean intensity at symptom edge is designated as averIedge; The mean intensity of symptom inside is designated as averIzh;
● calculate the gray scale difference value of symptom edge and symptom inside, be designated as ezGcha, ezGcha=averGedge-averGzh, in the formula: the average gray at symptom edge is designated as averGedge; The average gray of symptom inside is designated as averGzh;
(c) Extraction and transformation of the shape features of the symptom: let f(x, y) be the value of the image at pixel (x, y); the (p+q)-th order moment m_pq of the whole image is computed as:
m_pq = Σ_x Σ_y f(x, y) x^p y^q,   p, q = 0, 1, 2, …
The central moment μ_pq of the image is computed as:
μ_pq = Σ_x Σ_y (x − x̄)^p (y − ȳ)^q f(x, y)
where x̄ = Σ_x Σ_y x f(x, y) / Σ_x Σ_y f(x, y) and ȳ = Σ_x Σ_y y f(x, y) / Σ_x Σ_y f(x, y).
The normalized central moment η_pq is computed as:
η_pq = μ_pq / μ_00^γ
where γ = (p + q + 2)/2 and p + q = 2, 3, …
From the second- and third-order moments a group of seven invariant moments can be derived:
φ1 = η20 + η02
φ2 = (η20 − η02)² + 4η11²
φ3 = (η30 − 3η12)² + (3η21 − η03)²
φ4 = (η30 + η12)² + (η21 + η03)²
φ5 = (η30 − 3η12)(η30 + η12)[(η30 + η12)² − 3(η21 + η03)²] + (3η21 − η03)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²]
φ6 = (η20 − η02)[(η30 + η12)² − (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03)
φ7 = (3η21 − η03)(η30 + η12)[(η30 + η12)² − 3(η21 + η03)²] − (η30 − 3η12)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²]
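For illustration, these seven invariant (Hu) moments can be computed directly from the formulas above. The sketch below is our own reading of the equations, not the patent's code; OpenCV's `cv2.moments`/`cv2.HuMoments` compute equivalent quantities:

```python
import numpy as np

def hu_invariant_moments(f):
    """f: 2-D image array (e.g. the binary segmentation map); returns the seven invariant moments."""
    ys, xs = np.mgrid[0:f.shape[0], 0:f.shape[1]]     # ys: row index grid, xs: column index grid

    def m(p, q):                                       # raw moment m_pq
        return np.sum(f * xs**p * ys**q)

    m00 = m(0, 0)
    xbar, ybar = m(1, 0) / m00, m(0, 1) / m00

    def eta(p, q):                                     # normalized central moment eta_pq
        mu = np.sum(f * (xs - xbar)**p * (ys - ybar)**q)
        return mu / m00**((p + q + 2) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    phi1 = n20 + n02
    phi2 = (n20 - n02)**2 + 4 * n11**2
    phi3 = (n30 - 3*n12)**2 + (3*n21 - n03)**2
    phi4 = (n30 + n12)**2 + (n21 + n03)**2
    phi5 = ((n30 - 3*n12)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            + (3*n21 - n03)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2))
    phi6 = ((n20 - n02)*((n30 + n12)**2 - (n21 + n03)**2)
            + 4*n11*(n30 + n12)*(n21 + n03))
    phi7 = ((3*n21 - n03)*(n30 + n12)*((n30 + n12)**2 - 3*(n21 + n03)**2)
            - (n30 - 3*n12)*(n21 + n03)*(3*(n30 + n12)**2 - (n21 + n03)**2))
    return np.array([phi1, phi2, phi3, phi4, phi5, phi6, phi7])
```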
2) The concrete steps of the recognition computation are:
A. Select a kernel function: different inner-product kernel functions lead to different support vector machine algorithms; the three classes of kernel functions most commonly used at present are:
i. Polynomial kernel: k(x, y) = (x·y + 1)^d,  d = 1, 2, …
ii. RBF (Radial Basis Function) kernel: K(x, y) = exp(−‖x − y‖² / (2σ²))
iii. Sigmoid kernel: K(x, y) = tanh[b(x·y) − c]
B. Training: set the ideal outputs of the positive and negative samples to 1 and −1 respectively, and train with the SVM training algorithm (i.e. the classification function above) until convergence; this yields an SVM-based classifier;
C. Recognition: feed the sample to be recognized into the classifier; the output is a real value, and the class of the sample is decided by comparing it with a preset threshold;
Steps a)–c) apply to two-class problems; for skin micro-image classification the samples are divided into four classes, so k(k−1)/2 pairwise classifiers can be designed, each trained on the data of two classes;
D. Repeat b) for the chloasma and blackhead samples to obtain classifier SVM1;
E. Repeat b) for the chloasma and acne samples to obtain classifier SVM2;
F. Repeat b) for the chloasma and freckle samples to obtain classifier SVM3;
G. Repeat b) for the freckle and blackhead samples to obtain classifier SVM4;
H. Repeat b) for the freckle and acne samples to obtain classifier SVM5;
I. Repeat b) for the blackhead and acne samples to obtain classifier SVM6;
J. During classification, a voting (scoring) strategy is used: a sample to be classified is tested with each of the six classifiers obtained in training, i.e. step c); each result counts one point, and the points are accumulated per class;
K. The class with the highest accumulated score is taken as the class of the test sample;
L. Computation of the recognition rate:
M. Suppose there are n initial samples; each time one sample is held out for testing and the remaining n−1 are used as training samples;
N. For these n−1 training samples, repeat steps d)–i);
O. Test with the held-out sample, obtaining an accuracy of either 0 or 100%;
P. Repeat this process n times, so that every sample serves once as the test sample; the final accuracy is the mean of the n results;
Q. The kernel function is selected by running steps m)–p) and choosing the kernel with the highest recognition rate; a code sketch of this procedure is given below.
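The following scikit-learn sketch outlines steps b)–q), assuming the 22-dimensional feature vectors are already available and that the four classes are labelled 0–3. The class names, function names and the use of `sklearn.svm.SVC` are our assumptions, not part of the patent; note that `SVC` already performs one-vs-one voting internally — the pairwise classifiers are spelled out only to mirror SVM1–SVM6 and the voting of steps j)–k):

```python
import itertools
import numpy as np
from sklearn.svm import SVC

CLASSES = ["chloasma", "freckle", "blackhead", "acne"]   # assumed label order 0..3

def train_pairwise_classifiers(X, y, kernel="poly", degree=3):
    """One SVM per class pair: k(k-1)/2 = 6 classifiers for four classes (steps d-i)."""
    classifiers = {}
    for a, b in itertools.combinations(range(len(CLASSES)), 2):
        idx = np.isin(y, [a, b])
        # polynomial kernel of the (x.y + 1)^d family (gamma left at its scikit-learn default)
        clf = SVC(kernel=kernel, degree=degree, coef0=1.0)
        clf.fit(X[idx], np.where(y[idx] == a, 1, -1))     # ideal outputs +1 / -1
        classifiers[(a, b)] = clf
    return classifiers

def predict_by_voting(classifiers, x):
    """Each pairwise classifier casts one vote; the class with the highest score wins (steps j-k)."""
    scores = np.zeros(len(CLASSES))
    for (a, b), clf in classifiers.items():
        scores[a if clf.predict(x.reshape(1, -1))[0] == 1 else b] += 1
    return int(np.argmax(scores))

def leave_one_out_accuracy(X, y, **svm_kwargs):
    """Steps m-p: hold one sample out, train the six classifiers on the rest, test the held-out sample."""
    hits = 0
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        clfs = train_pairwise_classifiers(X[keep], y[keep], **svm_kwargs)
        hits += predict_by_voting(clfs, X[i]) == y[i]
    return hits / len(y)
```

Step q) then amounts to calling `leave_one_out_accuracy` once per candidate kernel and keeping the kernel with the highest value.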
Compared with the prior art, the present invention has the following obvious substantive features and notable advantages. The geometric features (symptom area, maximum diameter and minimum diameter), the color features (differences between the average color quantities of the symptom interior, the symptom edge and the background skin) and the group of invariant moments representing shape, all extracted from the symptom segmentation image of the skin micro-image and obtained by computation, reflect the essence of the symptom class more accurately. The recognition method based on a support vector machine (SVM) copes well with the strong nonlinearity, high dimensionality and small sample size of skin micro-images; the recognition system is simple and convenient to compute and generalises well, making it well suited to real-time processing. The present invention lays a foundation for, and can serve as the implementation basis of, a skin diagnosis system.
Description of drawings
Fig. 1 is the block diagram of the system used in one embodiment of the invention.
Fig. 2 shows micro-images of chloasma in the example of Fig. 1, where (a) is the symptom segmentation map, (b) is the symptom edge map and (c) is the chloasma color image.
Fig. 3 is the system flowchart of the example of Fig. 1.
Embodiment
One embodiment of the present invention provides an automatic identification method for four typical skin symptoms in skin micro-images: chloasma, freckle, blackhead and acne. Using the system shown in Fig. 1, feature extraction and recognition are carried out according to the flow diagram of Fig. 3:
1. Feature extraction
Taking chloasma as an example, Fig. 2 shows the symptom segmentation map, symptom edge map and symptom color image. The concrete steps are:
1) Computing the geometric features of the symptom
1.1 Extract the area of the symptom
The number of pixels with value 1 in Fig. 2(a) is 10765; this count is the area of the symptom.
1.2 Extract the maximum and minimum diameters of the symptom
1.2.1 Project Fig. 2(a) in the direction at angle θ to the horizontal to obtain a projection vector;
1.2.2 Count the number of elements equal to 1 in this vector and denote it N(θ);
1.2.3 Increment θ by 1° from 0° to 179°, giving 180 values of N(θ);
1.2.4 Take the maximum of these 180 values as the maximum diameter of the symptom and the minimum as the minimum diameter. For Fig. 2(a), the maximum diameter is 151 and the minimum diameter is 105.
2) Color features of the symptom
2.1 Color features of the symptom interior in Fig. 2
2.1.1 Hue feature: Hcha = (averHzh − averHbei) × 360 = −220.071
2.1.2 Saturation (S) feature: Scha = averSzh − averSbei = 0.248057
2.1.3 Intensity (I) feature: Icha = averIzh − averIbei = −12.7408
2.1.4 Gray-level (G) feature: Gcha = averGzh − averGbei = −24.9796
2.2 Color features of the symptom edge in Fig. 2
2.2.1 Differences between the average color quantities of the symptom edge and of the background skin
2.2.1.1 Hue difference: ebHcha = (averHedge − averHbei) × 360 = −224.289
2.2.1.2 Saturation difference: ebScha = averSedge − averSbei = 0.125347
2.2.1.3 Intensity difference: ebIcha = averIedge − averIbei = −8.34654
2.2.1.4 Gray-level difference: ebGcha = averGedge − averGbei = −16.0704
2.2.2 Differences between the average color quantities of the symptom edge and of the symptom interior
2.2.2.1 Hue difference: ezHcha = (averHedge − averHzh) × 360 = −4.21761
2.2.2.2 Saturation difference: ezScha = averSedge − averSzh = −0.122709
2.2.2.3 Intensity difference: ezIcha = averIedge − averIzh = 4.39426
2.2.2.4 Gray-level difference: ezGcha = averGedge − averGzh = 8.90916
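The color-difference features of sections 2.1 and 2.2 can be sketched as follows, assuming an RGB-to-HSI conversion of the usual textbook form and three binary masks for the symptom interior, edge and background skin. The conversion details, the grey-value definition and the function names are our assumptions, since the patent does not specify them:

```python
import numpy as np

def rgb_to_hsi(img):
    """img: float RGB array scaled to [0, 1]; returns H (0..1), S, I using the standard HSI conversion."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    eps = 1e-8
    i = (r + g + b) / 3.0
    s = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2.0 * np.pi - theta) / (2.0 * np.pi)
    return h, s, i

def color_difference_features(img, inner, edge, background):
    """inner/edge/background: boolean masks; returns the 12 color-difference features of 2.1 and 2.2."""
    h, s, i = rgb_to_hsi(img)
    gray = img.mean(axis=-1)                        # a simple stand-in for the grey value G
    feats = []
    for region_a, region_b in [(inner, background), (edge, background), (edge, inner)]:
        feats.append((h[region_a].mean() - h[region_b].mean()) * 360)   # Hcha / ebHcha / ezHcha
        feats.append(s[region_a].mean() - s[region_b].mean())           # Scha / ebScha / ezScha
        feats.append(i[region_a].mean() - i[region_b].mean())           # Icha / ebIcha / ezIcha
        feats.append(gray[region_a].mean() - gray[region_b].mean())     # Gcha / ebGcha / ezGcha
    return feats
```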
3) Shape features of the symptom in Fig. 2
The seven invariant moments are computed:
φ1 = η20 + η02 = 0.73399
φ2 = (η20 − η02)² + 4η11² = 2.6432
φ3 = (η30 − 3η12)² + (3η21 − η03)² = 3.2293
φ4 = (η30 + η12)² + (η21 + η03)² = 4.61191
φ5 = (η30 − 3η12)(η30 + η12)[(η30 + η12)² − 3(η21 + η03)²] + (3η21 − η03)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²] = 7.87397
φ6 = (η20 − η02)[(η30 + η12)² − (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03) = 6.10877
φ7 = (3η21 − η03)(η30 + η12)[(η30 + η12)² − 3(η21 + η03)²] − (η30 − 3η12)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²] = 7.78115
In this way the chloasma image is converted into a 22-dimensional feature vector, e.g. [10765, 151, 105, −220.071, 0.248057, −12.7408, −24.9796, −224.289, 0.125347, −8.34654, −16.0704, −4.21761, −0.122709, 4.39426, 8.90916, 0.73399, 2.6432, 3.2293, 4.61191, 7.87397, 6.10877, 7.78115].
2. Skin micro-image recognition based on a support vector machine (SVM)
We collected 289 skin micro-images, labelled them under the guidance of dermatology professionals, and randomly divided them into two groups used as training and test samples (159 and 130 examples respectively); the 22 feature values are computed for each sample, so the training set is converted into 159 vectors of 22 dimensions.
Six classifiers are built, each trained on the data of two classes. During training, three classes of kernel functions were tried: (1) the linear kernel; (2) the polynomial kernel; (3) the radial basis function kernel; the kernel giving the best leave-one-out (LOOCV) accuracy is selected. Experiments show that the polynomial kernel k(x, y) = (x·y + 1)³ gives the highest classifier recognition rate; the results are shown in Table 1.
Table 1  Leave-one-out recognition rates of the kernel functions under different parameters

Kernel function | Parameter | LOO recognition rate / %
RBF             | none      | 55.88
Linear          | none      | 79.41
Polynomial      | d = 1     | 79.41
Polynomial      | d = 2     | 82.54
Polynomial      | d = 3     | 88.24
Polynomial      | d = 4     | 85.29
Polynomial      | d = 5     | 58.82
Polynomial      | d = 6, 7  | 11.76
Polynomial      | d = 8, 9  | 2.94
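The kernel comparison of Table 1 can be reproduced in outline with scikit-learn's leave-one-out cross-validation. This is a hedged sketch under our own naming; the original data set is not available, so the accuracies it returns will not match the table:

```python
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

def compare_kernels(X, y):
    """Leave-one-out accuracy for the kernels tried in Table 1; returns {label: accuracy}."""
    candidates = {"rbf": SVC(kernel="rbf"),
                  "linear": SVC(kernel="linear")}
    for d in range(1, 10):                                 # polynomial degrees d = 1 .. 9
        candidates[f"poly d={d}"] = SVC(kernel="poly", degree=d, coef0=1.0)

    loo = LeaveOneOut()
    return {name: cross_val_score(clf, X, y, cv=loo).mean()
            for name, clf in candidates.items()}
```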
Table 2 gives the classification results on the test samples (130 examples) for an SVM trained on the training data (159 examples).

Table 2  Recognition rates of the SVM classifier on the test samples

Sample class                 | Chloasma (30) | Freckle (23) | Blackhead (16) | Acne (32) | Normal skin (11)
Classification accuracy / %  | 88.1          | 90.0         | 86.2           | 85.2      | 100.0

Claims (1)

1. A method for automatically identifying symptoms in skin micro-images, characterised in that the symptom segmentation image obtained by preprocessing a skin micro-image is input into a microcomputer; characteristic parameters extracted from the segmentation image are converted into a feature vector that reflects the essence of the symptom class; the feature vector is classified by a recognition algorithm based on a support vector machine to identify the attributes of the symptom; and the attributes are then displayed as text on the microcomputer display. The concrete steps are:
1) Extracting the characteristic parameters of the symptom segmentation image and converting them into a feature vector that reflects the essence of the symptom class:
a. Extract the geometric features of the symptom:
(a) Area: count the number of pixels whose value is 1 in the segmentation map of each symptom; this count is the area of the symptom;
(b) Maximum and minimum diameters: project the segmentation map in the direction at angle θ to the horizontal to obtain a projection vector; count the elements equal to 1 in this vector and denote the count N(θ); increment θ by 1° from 0° to 179°, giving 180 values of N(θ); take the maximum of these values as the maximum diameter of the symptom and the minimum as the minimum diameter;
b. Extract the color features of the symptom:
(a) Color features of the symptom interior, taken as the differences between the average color quantities of the symptom interior and of the background skin:
i. hue feature Hcha = (averHzh − averHbei) × 360, where averHzh and averHbei are the average hues of the symptom interior and of the background skin;
ii. saturation (S) feature Scha = averSzh − averSbei, where averSzh and averSbei are the average saturations of the symptom interior and of the background skin;
iii. intensity (I) feature Icha = averIzh − averIbei, where averIzh and averIbei are the average intensities of the symptom interior and of the background skin;
iv. gray-level (G) feature Gcha = averGzh − averGbei, where averGzh and averGbei are the average gray values of the symptom interior and of the background skin;
(b) Color features of the symptom edge, taken as the differences between the average color quantities of the symptom edge and of the background skin, and between the symptom edge and the symptom interior:
i. edge vs. background skin: hue difference ebHcha = (averHedge − averHbei) × 360, saturation difference ebScha = averSedge − averSbei, intensity difference ebIcha = averIedge − averIbei and gray-level difference ebGcha = averGedge − averGbei, where averHedge, averSedge, averIedge and averGedge are the average hue, saturation, intensity and gray value of the symptom edge;
ii. edge vs. symptom interior: hue difference ezHcha = (averHedge − averHzh) × 360, saturation difference ezScha = averSedge − averSzh, intensity difference ezIcha = averIedge − averIzh and gray-level difference ezGcha = averGedge − averGzh;
(c) Extraction and transformation of the shape features of the symptom: let f(x, y) be the value of the image at pixel (x, y); the (p+q)-th order moment of the whole image is m_pq = Σ_x Σ_y f(x, y) x^p y^q, p, q = 0, 1, 2, …; the central moment is μ_pq = Σ_x Σ_y (x − x̄)^p (y − ȳ)^q f(x, y), where x̄ = Σ_x Σ_y x f(x, y) / Σ_x Σ_y f(x, y) and ȳ = Σ_x Σ_y y f(x, y) / Σ_x Σ_y f(x, y); the normalized central moment is η_pq = μ_pq / μ_00^γ with γ = (p + q + 2)/2, p + q = 2, 3, …; from the second- and third-order moments the group of seven invariant moments φ1–φ7 defined above is derived;
2) The recognition computation:
a. Select a kernel function; the three classes of kernel functions most commonly used at present are the polynomial kernel k(x, y) = (x·y + 1)^d (d = 1, 2, …), the RBF (Radial Basis Function) kernel K(x, y) = exp(−‖x − y‖² / (2σ²)) and the sigmoid kernel K(x, y) = tanh[b(x·y) − c];
b. Training: set the ideal outputs of the positive and negative samples to 1 and −1 respectively, and train with the SVM training algorithm (the classification function above) until convergence, yielding an SVM-based classifier;
c. Recognition: feed the sample to be recognized into the classifier; the output is a real value and the class of the sample is decided by a preset threshold;
Steps a)–c) apply to two-class problems; for skin micro-image classification the samples are divided into four classes, so k(k−1)/2 pairwise classifiers are designed, each trained on the data of two classes;
d.–i. Repeat b) for the chloasma/blackhead, chloasma/acne, chloasma/freckle, freckle/blackhead, freckle/acne and blackhead/acne sample pairs to obtain classifiers SVM1–SVM6;
j. During classification, use a voting strategy: test the sample to be classified with each of the six classifiers obtained in training, i.e. step c); each result counts one point, accumulated per class;
k. Take the class with the highest accumulated score as the class of the test sample;
l. Computation of the recognition rate:
m. Suppose there are n initial samples; each time hold out one sample for testing and use the remaining n−1 as training samples;
n. For these n−1 training samples, repeat steps d)–i);
o. Test with the held-out sample, obtaining an accuracy of either 0 or 100%;
p. Repeat this process n times, so that every sample serves once as the test sample; the final accuracy is the mean of the n results;
q. The kernel function is selected by running steps m)–p) and choosing the kernel with the highest recognition rate.
CNB200410053539XA 2004-08-06 2004-08-06 Automatic identifying method for skin micro image symptom Expired - Fee Related CN1295643C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB200410053539XA CN1295643C (en) 2004-08-06 2004-08-06 Automatic identifying method for skin micro image symptom

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB200410053539XA CN1295643C (en) 2004-08-06 2004-08-06 Automatic identifying method for skin micro image symptom

Publications (2)

Publication Number Publication Date
CN1588429A CN1588429A (en) 2005-03-02
CN1295643C true CN1295643C (en) 2007-01-17

Family

ID=34602907

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB200410053539XA Expired - Fee Related CN1295643C (en) 2004-08-06 2004-08-06 Automatic identifying method for skin micro image symptom

Country Status (1)

Country Link
CN (1) CN1295643C (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8154612B2 (en) * 2005-08-18 2012-04-10 Qualcomm Incorporated Systems, methods, and apparatus for image processing, for color classification, and for skin color detection
CN102799860A (en) * 2012-06-28 2012-11-28 济南大学 Method for holographic recognition of microscopic image
US9727969B2 (en) * 2012-08-17 2017-08-08 Sony Corporation Image processing device, image processing method, program, and image processing system
CN103310099A (en) * 2013-05-30 2013-09-18 佛山电视台南海分台 Method and system for realizing augmented reality by adopting image capture and recognition technology
US20170032285A1 (en) * 2014-04-09 2017-02-02 Entrupy Inc. Authenticating physical objects using machine learning from microscopic variations
CN105205490B (en) * 2015-09-23 2019-09-24 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN112037162B (en) * 2019-05-17 2022-08-02 荣耀终端有限公司 Facial acne detection method and equipment


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1442823A (en) * 2002-12-30 2003-09-17 潘国平 Individual identity automatic identification system based on iris analysis
WO2004060165A1 (en) * 2003-01-07 2004-07-22 Japan Science And Technology Agency Osteoporosis diagnosis support device using panorama x-ray image

Also Published As

Publication number Publication date
CN1588429A (en) 2005-03-02

Similar Documents

Publication Publication Date Title
Zhang et al. Classification of medical images in the biomedical literature by jointly using deep and handcrafted visual features
Suzuki et al. Automatic segmentation and classification of human intestinal parasites from microscopy images
Şevik et al. Identification of suitable fundus images using automated quality assessment methods
CN1275190C (en) Method and device for correcting image askew
CN1162798C (en) Analysis method of tongue color, fur color and tongue thickness in traditional Chinese medicine based on multi-class support vector machine
CN1273516A (en) Method and system for automated detection of clustered microcalcifications from digital mammograms
JP2022528961A (en) Methods and systems for selecting embryos
JP2022551683A (en) Methods and systems for non-invasive genetic testing using artificial intelligence (AI) models
JP6945253B2 (en) Classification device, classification method, program, and information recording medium
Burie et al. ICFHR2016 competition on the analysis of handwritten text in images of balinese palm leaf manuscripts
CN1977286A (en) Object recognition method and apparatus therefor
CN1445715A (en) Systems and methods that facilitate pattern recognition
CN1295643C (en) Automatic identifying method for skin micro image symptom
Lins et al. ICDAR 2019 time-quality binarization competition
CN1091905C (en) How to Build the Database of Character Recognition System
Shajahan et al. Identification and counting of soybean aphids from digital images using shape classification
CN101034440A (en) Identification method for spherical fruit and vegetables
CN102129576B (en) Method for extracting duty ratio parameter of all-sky aurora image
Lins et al. Doceng'2020 time-quality competition on binarizing photographed documents
CN1931087A (en) Automatic tongue picture grain analysis method
CN1442823A (en) Individual identity automatic identification system based on iris analysis
Mettripun Thai herb leaves classification based on properties of image regions
CN1632823A (en) Automatic fingerprint classification system and method
CN101968851A (en) Medical image processing method based on dictionary studying upsampling
Cogan et al. Deep understanding of breast density classification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070117