Disclosure of Invention
The invention aims to solve the defects in the prior art and provides an image data processing method, an image data processing system, computer equipment and a storage medium.
In order to achieve the above purpose, the invention adopts the following technical scheme that the image data processing method comprises the following steps:
S1, obtaining image target boundary pixel data, calculating boundary gradient change data and local pixel density change data, judging the form offset direction and amplitude, extracting the area with the largest offset, setting local form adjustment weights, calculating direction compensation values, adjusting the image target contour, calculating contour matching difference values, and generating a morphology compensation data set;
S2, calculating the illumination gradient of the image target area based on the morphology compensation data set, extracting the spectral reflectivity, analyzing the optical characteristic difference between the image target and the shielding object, calculating dynamic illumination compensation parameters by combining the illumination change trend, analyzing the illumination matching error, and extracting an illumination matching data set;
S3, calculating an optical characteristic vector of an image target area according to the illumination matching data set, comparing the spectrum characteristic difference of the image target and the shielding object, analyzing the optical independence of the image target area, calculating a motion track difference value of the image target area, and obtaining a separation boundary value of the image target area;
S4, calculating a compensated matching score by combining the image target region separation boundary value and the morphology compensation data set, adjusting the classification matching priority of the image target region, and correcting the classification weight to obtain an image target classification result.
As a further scheme of the invention, the specific steps of S1 are as follows:
S101, obtaining boundary pixels of an image target area, calculating gradient change values of the boundary pixels, counting pixel density change values in a local area, calculating matching deviation values of the image target form according to the pixel density change values and the gradient change values, and judging form deviation directions and amplitudes of the target area according to the matching deviation values to obtain form deviation data;
S102, calculating the offset gradient of a local area based on the form offset data, extracting the area with the largest offset, setting local form adjustment weight, calculating a direction compensation value according to the form adjustment weight, and adjusting the outline of the image target area by combining the direction compensation value to obtain an outline form adjustment value;
S103, calculating a contour matching difference value based on the contour form adjustment value, analyzing the form change characteristics of the local area by combining the contour matching difference value, screening the offset contour area, calculating the form compensation intensity of the offset area, and establishing a form compensation data set by combining the area compensation intensity.
As a further scheme of the invention, the specific steps of S2 are as follows:
S201, acquiring illumination intensity of a target area in the morphological compensation data set, calculating illumination change rates of adjacent pixel points in the horizontal and vertical directions, screening gradient abrupt change areas, and calculating gradient change trend according to difference of adjacent gradient point values to obtain illumination gradient values of the target area of the image;
S202, calculating the spectral reflectivity of the pixel points of the target area over different wavelength ranges based on the illumination gradient value of the image target area, screening areas with differing reflectivity, calculating the spectral reflectance deviation value of the target and the shielding object, and acquiring the optical characteristic difference between the target and the shielding object by combining the illumination gradient change;
S203, calculating an illumination intensity change range based on the optical characteristic difference of the target and the shielding object, calculating illumination compensation data according to the illumination intensity change and the spectral reflectance deviation, and calculating illumination matching errors among areas by combining form compensation information to obtain an illumination matching data set.
As a further aspect of the present invention, the specific calculation formula for the spectral reflectance deviation value of the target and the shielding object is:

$$\Delta R(\lambda) = \frac{\left|\sum_{i=1}^{n} G_i(\lambda)\,R_i^{occ}(\lambda) - \sum_{i=1}^{n} G_i(\lambda)\,R_i^{tar}(\lambda)\right|}{\sum_{i=1}^{n} G_i(\lambda)}$$

The spectral reflectance deviation value is calculated, and the optical characteristic difference between the target and the shielding object is acquired by combining the illumination gradient change;

Wherein $\Delta R(\lambda)$ represents the spectral reflectance deviation value of the target and the shielding object at wavelength $\lambda$, $G_i(\lambda)$ represents the illumination gradient value of pixel point $i$ at wavelength $\lambda$, $R_i^{tar}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the target area at wavelength $\lambda$, $R_i^{occ}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the shielding object area at wavelength $\lambda$, $n$ represents the total number of selected pixel points in the target area and the shielding object area, and $\sum_{i=1}^{n} G_i(\lambda)$ represents the sum of the illumination gradient values of all selected pixel points at that wavelength.
As a further scheme of the invention, the specific steps of S3 are as follows:
S301, acquiring the illumination matching data set, extracting optical characteristic data of all pixel points in a target area, calculating the spectrum intensity of the pixel points under different wavelengths, and counting the numerical distribution range of illumination characteristics in the area to obtain the optical characteristic data distribution information of the image target area;
S302, calculating the reflectance ratio of the target area to the shielding object area at each wavelength based on the distribution information of the optical characteristic data of the image target area and the optical characteristic data of the corresponding shielding object area, calculating the spectral deviation according to the spectral energy distribution, analyzing the change trend of the spectral deviation over different wave bands, and determining the degree of independence of the spectral characteristics of the target area compared with the shielding object to obtain the optical independence value of the image target area;
S303, calculating the motion track deviation degree of the target area and the shielding object area based on the optical independence numerical value of the image target area and combining the motion track data of the image target area, adjusting the boundary range of the target area according to the optical independence numerical value, comparing the optical characteristic change of the target area and the background area, calculating the spectrum and the motion characteristic change rate of the edge area, and obtaining the boundary point of the target area and the background area according to the change rate to obtain the separation boundary value of the image target area.
As a further scheme of the invention, the specific steps of S4 are as follows:
S401, obtaining a separation boundary value of the image target area, calculating gradient change, continuity degree and sealing level of boundary pixels, screening an area meeting the integrity requirement, adjusting the form of boundary contour pixels based on the screened boundary value, establishing compensated boundary contour data, calculating a matching error after form compensation, and generating a matching error value;
S402, calculating the classification matching priority of the target area based on the matching error value, extracting optical characteristics including light intensity balance degree, color gradient and reflection level, and calculating optical contrast, boundary optical gradient and influence degree of light interference by combining the separation boundary value of the target area to obtain an optical contrast priority value;
S403, adjusting the target region classification standard based on the optical contrast priority value, calculating a classification weight correction value, correcting the classification data of the target region according to the matching error value and the optical independence value, screening the regions meeting the classification correction requirements, and obtaining an image target classification result.
As a further aspect of the present invention, the classification weight correction value is calculated for each target region from the quantities defined below; the classification data of the target region are then corrected by combining the optical independence value, the regions meeting the classification correction requirements are screened, and the image target classification result is obtained;

Wherein $\Delta W_j$ represents the classification weight correction value of the $j$-th target region, $P_j$ represents the optical contrast priority value of the $j$-th target region, $E_j$ represents the matching error of the $j$-th target region, $I_j$ represents the optical independence value of the $j$-th target region, $C_j$ represents the initial classification criterion value of the $j$-th target region, and $N$ represents the total number of target regions.
An image data processing system for performing the above-described image data processing method, the image data processing system comprising:
The form matching adjustment module obtains the boundary pixel value of the target area, calculates the gradient change rate of adjacent pixels, counts the pixel density change quantity of the unit area, calculates the form matching deviation value, judges the deviation direction and amplitude, extracts the area with the maximum deviation quantity based on the local deviation gradient value, sets the form adjustment weight, calculates the direction compensation value, adjusts the boundary contour of the target area, calculates the contour matching difference value after adjustment, and obtains the form compensation data set;
The illumination matching optimization module calculates an illumination intensity gradient value of a target area based on the form compensation data set, extracts surface spectral reflectivity, calculates the optical characteristic difference between the target area and a shielding object, calculates dynamic illumination compensation parameters according to illumination change trend, combines the form compensation data set, constructs an illumination matching error model, and acquires an illumination matching data set;
The optical characteristic analysis module calculates an optical characteristic vector of the target area based on the illumination matching data set, compares the spectrum characteristic difference of the target area and the shielding object, calculates optical independence, analyzes a motion track deviation value and acquires an image target area separation boundary value;
The target classification correction module calculates a target region matching score based on the image target region separation boundary value and combines the morphological compensation data set, adjusts the classification matching priority, calculates an optical independence score according to the target region separation boundary value, corrects the target classification weight, and obtains an image target classification result.
Compared with the prior art, the invention has the advantages and positive effects that:
According to the method, boundary pixel extraction and gradient change calculation capture local pixel density fluctuations, making target morphology adjustment more accurate. Combining the illumination intensity gradient with spectral reflectivity extraction improves the optical matching degree of the target area, and motion track analysis optimizes target contour separation precision. Recognition accuracy is improved based on the separation boundary values and the morphology compensation data. The overall scheme enhances target contour precision, optical adaptability and classification stability, and maintains high recognition reliability in complex environments.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, an image data processing method includes the following steps:
S1, obtaining image target boundary pixels, calculating boundary gradient change values, counting local pixel density changes, calculating image target form matching bias values, judging form bias directions and amplitudes, calculating local area bias gradients, extracting a maximum offset area, setting local form adjustment weights, calculating direction compensation values, adjusting image target area outlines, calculating contour matching difference values, and generating a form compensation data set;
S2, calculating an illumination intensity gradient value of an image target area according to the form compensation data set, extracting the spectral reflectivity of the surface of the image target, calculating the optical characteristic difference between the image target and a shielding object, calculating dynamic illumination compensation parameters of the image target area based on illumination change trend, analyzing illumination matching errors by combining form compensation information, and extracting an illumination matching data set;
S3, calculating an optical characteristic vector of an image target area based on the illumination matching data set, comparing the spectrum characteristic difference of the image target and the shielding object, analyzing the optical independence of the image target area, calculating a motion track difference value of the image target area, removing an area with consistent motion mode but different optical characteristic, correcting the outline of the image target, and obtaining a separation boundary value of the image target area;
S4, calculating a compensated image target region matching score according to the morphological compensation data set based on the image target region separation boundary value, adjusting the image target region classification matching priority, calculating an optical independence score of the image target region according to the image target region separation boundary value, adjusting an image target classification standard, calculating an image target classification weight correction value, and obtaining an image target classification result.
The morphology compensation data set comprises a boundary gradient change value, a local pixel density change, a morphology matching deviation value, a morphology deviation direction and amplitude, a local area deviation gradient, a deviation maximum area, a local morphology adjustment weight, a direction compensation value and a contour matching difference value, the illumination matching data set comprises an illumination intensity gradient value, a surface spectral reflectivity, an optical characteristic difference, a dynamic illumination compensation parameter and an illumination matching error model, the image target area separation boundary value comprises a corrected image target contour, an optical independence analysis result, a motion track difference value and a rejecting area range, and the image target classification adjustment value comprises a matching score, a classification matching priority, an optical independence score, a classification standard and a classification weight correction value.
The specific steps of S1 are as follows:
S101, obtaining boundary pixels of an image target area, calculating gradient change values of the boundary pixels, counting pixel density change values in a local area, calculating matching deviation values of the image target form according to the pixel density change values and the gradient change values, and judging form deviation directions and amplitudes of the target area according to the matching deviation values to obtain form deviation data;
Boundary pixels of the image target area are acquired as follows. Contour information of the target area is first extracted through an edge detection algorithm; a Canny edge detection operator can be adopted, with a low threshold $T_L$ and a high threshold $T_H$ set to screen appropriate edge pixels, yielding a boundary pixel set $B$. Gradient change values of the boundary pixels are then calculated: the horizontal and vertical gradients are computed with the Sobel operator and combined into an overall gradient value $G$. A number of local windows $W_k$ are divided within the target area, and the pixel density variation within each window is counted, i.e. the number of bright pixels $N_h$ and the number of dark pixels $N_l$, from which the pixel density variation value $D$ is calculated to measure the brightness fluctuation of the local area. Combining $D$ with the gradient information of the boundary pixels, the degree of change of the overall morphology is analyzed. The matching deviation value $\Delta$ of the morphology is then calculated by comparing the relation between the gradient change value $G$ and the pixel density variation value $D$: the differences between the gradient and the pixel density variation value are accumulated, so that $\Delta$ reflects the degree of the morphology offset. A morphology offset threshold $\varepsilon$ is set; if the deviation value $\Delta$ exceeds $\varepsilon$, it is judged that the target area has an obvious morphology offset, the offset direction and amplitude are further analyzed, and the final morphology offset data are acquired.
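The deviation test in S101 can be sketched in pure Python. This is a minimal illustration under stated assumptions: the sample gradient values, the 8×8 window counts, the definition of the density variation as a normalised bright/dark imbalance, the accumulation rule, and the threshold are all hypothetical, since the source does not fix them numerically:

```python
def density_variation(n_bright, n_dark, window_area):
    # Pixel density variation D for one local window: the imbalance
    # between bright and dark pixel counts, normalised by window size
    # (illustrative definition; the text does not fix the formula).
    return (n_bright - n_dark) / window_area

def matching_deviation(gradients, densities):
    # Accumulate the difference between the gradient value G and the
    # density variation D over all local windows, giving the
    # morphology matching deviation value.
    return sum(abs(g - d) for g, d in zip(gradients, densities))

# Per-window boundary gradient values (assumed sample data).
grads = [0.42, 0.55, 0.38]
# Bright/dark pixel counts in three assumed 8x8 windows.
dens = [density_variation(30, 10, 64),
        density_variation(22, 18, 64),
        density_variation(40, 5, 64)]

delta = matching_deviation(grads, dens)   # accumulated deviation value
THRESHOLD = 1.0                           # assumed morphology-offset limit
has_offset = delta > THRESHOLD            # offset judged only above the limit
```

When `delta` stays below the limit, the region is treated as morphologically stable and no offset direction/amplitude analysis is triggered.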
S102, calculating an offset gradient of a local area based on the form offset data, extracting an area with the largest offset, setting local form adjustment weight, calculating a direction compensation value according to the form adjustment weight, and adjusting the outline of the image target area by combining the direction compensation value to obtain an outline form adjustment value;
The offset gradient of the local region is calculated based on the morphology offset data: the change rates of the morphology offset data in the horizontal and vertical directions are first computed, the degree of deviation within the local area is analyzed, and the gradient change amount is calculated using partial differentials. The region with the largest offset is then extracted: the region with the highest gradient change is screened out by a non-maximum suppression method. A local morphology adjustment weight $w$ is set; the weight is determined according to the degree of offset, such that the larger the offset, the higher the adjustment weight. On this basis, the direction compensation value $c$ is calculated as the product of the adjustment weight and the offset gradient; $c$ reflects the correction amount required for the local morphology adjustment. The direction compensation value is then applied to the contour adjustment of the target area, and the target contour is optimized by morphological transformation, specifically morphological dilation or erosion, to compensate the morphology change in the offset direction so that the adjusted contour is closer to the original morphology. The contour morphology adjustment value is finally obtained.
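A minimal sketch of the weight and compensation computation in S102. Only the rule "compensation value = adjustment weight × offset gradient" comes from the text; the proportional normalisation of the weights and all sample values are assumptions:

```python
def adjustment_weights(offsets):
    # Local morphology adjustment weights: proportional to the offset
    # magnitude, normalised so they sum to 1 (an assumed scheme; the
    # text only states "the larger the offset, the higher the weight").
    total = sum(offsets)
    return [o / total for o in offsets]

def direction_compensation(weights, offset_gradients):
    # Direction compensation value: product of adjustment weight and
    # offset gradient, as stated in the text.
    return [w * g for w, g in zip(weights, offset_gradients)]

offsets = [0.2, 0.5, 0.3]      # assumed per-region offset magnitudes
grads = [1.0, 2.0, 1.5]        # assumed per-region offset gradients
w = adjustment_weights(offsets)
c = direction_compensation(w, grads)   # correction amount per region
```

The region with the largest offset (here the second one) receives both the highest weight and the largest compensation value, which is the qualitative behaviour the step requires.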
S103, calculating a contour matching difference value based on a contour form adjustment value, analyzing form change characteristics of a local area by combining the contour matching difference value, screening an offset contour area, calculating form compensation intensity of the offset area, and establishing a form compensation data set by combining the area compensation intensity;
The contour matching difference value of the target area is calculated based on the contour morphology adjustment value: a contour matching degree $M$ is set and calculated through a morphological matching algorithm, and a matching difference value $\Delta M$ is defined from it; the larger the matching difference value, the more obvious the contour change. The morphology change characteristics of the local area are then analyzed by combining the contour matching difference value, the morphology change regions are extracted, the offset contour regions are screened, and the morphology compensation intensity $S$ of each offset region is calculated; the compensation intensity is directly proportional to the matching difference value, so the larger the matching difference value, the higher the corresponding morphology compensation intensity. Finally, a morphology compensation data set is established by combining the compensation intensities of the regions; the data set stores the morphology compensation information of the different areas, ensuring that subsequent morphology adjustment can be accurately optimized based on the matching differences and applied to the correction of the target contour.
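The proportionality between matching difference and compensation intensity in S103 can be sketched as follows; the complement-based difference definition, the gain constant and the sample matching degrees are assumptions not fixed by the source:

```python
def matching_difference(match_degree):
    # Contour matching difference: assumed here to be the complement
    # of a matching degree M in [0, 1]; larger value = bigger change.
    return 1.0 - match_degree

def compensation_intensity(diff, k=2.0):
    # Morphology compensation intensity, directly proportional to the
    # matching difference value; the gain k is an assumed constant.
    return k * diff

# Assumed matching degrees for three contour regions.
regions = {"A": 0.95, "B": 0.70, "C": 0.85}
# Morphology compensation data set: intensity stored per region.
dataset = {name: compensation_intensity(matching_difference(m))
           for name, m in regions.items()}
```

Region B, with the lowest matching degree, ends up with the highest compensation intensity, matching the stated proportionality.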
The specific steps of S2 are as follows:
S201, obtaining illumination intensity of a target area in a morphological compensation data set, calculating illumination change rates of adjacent pixel points in horizontal and vertical directions, screening gradient abrupt change areas, and calculating gradient change trend according to difference of adjacent gradient point values to obtain illumination gradient values of the target area of an image;
The illumination intensity of the target area in the morphology compensation data is acquired as follows. First, the pixel points of the target area are extracted from the original image data and their illumination intensities are recorded. For each pixel point $p(i,j)$, the change rate of the illumination intensity with respect to its horizontal neighbour $p(i+1,j)$ and vertical neighbour $p(i,j+1)$ is calculated; a gradient operator such as the Sobel operator is used to obtain the horizontal gradient $G_x$ and the vertical gradient $G_y$, and the overall gradient information is acquired through the gradient magnitude formula

$$G = \sqrt{G_x^2 + G_y^2}$$

When screening gradient mutation regions, a mutation threshold $T_m$ is set; if the gradient $G$ at a certain point exceeds $T_m$, the point is judged to belong to a mutation region. To further judge the gradient change trend, first-derivative and second-derivative analysis is introduced:

The first derivative $G'$ of the gradient value is used to judge the increasing or decreasing trend of the gradient;

The second derivative $G''$ represents the change of the first-order change rate and is used to capture the "stationarity" or "variability" of the local gradient change.

If $G''$ falls within a threshold range $[-\varepsilon, \varepsilon]$, the gradient change is judged to be stable and the region is judged to be a stable gradient region; otherwise the point is judged to be a mutation point.

Finally, the illumination gradient values of the target area are obtained: the whole target area is traversed, a gradient distribution matrix is constructed, the calculation results are stored into the morphology compensation data set, and the illumination gradient value data set of the target area is obtained.
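The first/second-derivative stationarity test described above can be illustrated on a one-dimensional gradient profile; the sample profile and the threshold range are assumptions:

```python
def first_derivative(g):
    # Discrete first derivative of a 1-D gradient profile.
    return [g[i + 1] - g[i] for i in range(len(g) - 1)]

def classify_points(g, eps=0.05):
    # Label each point by the second derivative G'': 'stable' when it
    # falls inside the assumed range [-eps, eps], else 'mutation'.
    d1 = first_derivative(g)
    d2 = first_derivative(d1)
    return ["stable" if -eps <= v <= eps else "mutation" for v in d2]

# Assumed gradient values along one row: smooth ramp, then a jump.
profile = [0.10, 0.12, 0.14, 0.16, 0.60, 0.62]
labels = classify_points(profile)
```

The jump between 0.16 and 0.60 produces a large second derivative on both sides, so those points are flagged as mutation points while the smooth ramp is flagged as stable.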
S202, calculating the spectral reflectivity of the pixel points of the target area over different wavelength ranges based on the illumination gradient value of the image target area, screening areas with differing reflectivity, calculating the spectral reflectance deviation value of the target and the shielding object, and acquiring the optical characteristic difference between the target and the shielding object by combining the illumination gradient change;
The specific calculation formula for the spectral reflectance deviation value of the target and the shielding object is:

$$\Delta R(\lambda) = \frac{\left|\sum_{i=1}^{n} G_i(\lambda)\,R_i^{occ}(\lambda) - \sum_{i=1}^{n} G_i(\lambda)\,R_i^{tar}(\lambda)\right|}{\sum_{i=1}^{n} G_i(\lambda)}$$

The spectral reflectance deviation value is calculated, and the optical characteristic difference between the target and the shielding object is acquired by combining the illumination gradient change;

Wherein $\Delta R(\lambda)$ represents the spectral reflectance deviation value of the target and the shielding object at wavelength $\lambda$, $G_i(\lambda)$ represents the illumination gradient value of pixel point $i$ at wavelength $\lambda$, $R_i^{tar}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the target area at wavelength $\lambda$, $R_i^{occ}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the shielding object area at wavelength $\lambda$, $n$ represents the total number of selected pixel points in the target area and the shielding object area, and $\sum_{i=1}^{n} G_i(\lambda)$ represents the sum of the illumination gradient values of all selected pixel points at that wavelength.
Parameter acquisition:

$G_i(\lambda)$ represents the illumination gradient value of pixel point $i$ at wavelength $\lambda$; the illumination gradient of each pixel point in the target area is calculated through image processing. Assume that at a wavelength of 550 nm, the illumination gradient values of the 5 selected pixel points are 0.8, 0.6, 0.9, 0.7 and 0.85 respectively.

$R_i^{tar}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the target area at wavelength $\lambda$, measured by spectral imaging. Assume that at 550 nm the spectral reflectances of the 5 selected pixel points are 0.35, 0.40, 0.38, 0.36 and 0.37 respectively.

$R_i^{occ}(\lambda)$ represents the spectral reflectance of pixel point $i$ of the shielding object area at wavelength $\lambda$, measured by spectral imaging. Assume that at 550 nm the spectral reflectances of the 5 selected pixel points are 0.50, 0.48, 0.52, 0.49 and 0.51 respectively.

The calculation steps are as follows:

Step 1, calculate the sum of the illumination gradient values: $\sum_{i=1}^{5} G_i = 0.8 + 0.6 + 0.9 + 0.7 + 0.85 = 3.85$;

Step 2, calculate the weighted average spectral reflectance of the target area;

The numerator is $(0.8 \times 0.35) + (0.6 \times 0.40) + (0.9 \times 0.38) + (0.7 \times 0.36) + (0.85 \times 0.37) = 0.28 + 0.24 + 0.342 + 0.252 + 0.3145 = 1.4285$;

The weighted average spectral reflectance is then $1.4285 / 3.85 \approx 0.371$;

Step 3, calculate the weighted average spectral reflectance of the shielding object area;

The numerator is $(0.8 \times 0.50) + (0.6 \times 0.48) + (0.9 \times 0.52) + (0.7 \times 0.49) + (0.85 \times 0.51) = 0.40 + 0.288 + 0.468 + 0.343 + 0.4335 = 1.9325$;

The weighted average spectral reflectance is then $1.9325 / 3.85 \approx 0.502$;

Step 4, calculate the spectral reflectance deviation value of the target and the shielding object: $\Delta R(550\,\mathrm{nm}) = |0.502 - 0.371| = 0.131$.

The results illustrate:

The calculated spectral reflectance deviation value $\Delta R = 0.131$ shows that at a wavelength of 550 nm there is a difference of 0.131 in the gradient-weighted spectral reflectance of the target area and the shielding object area. The result can be used to further analyze the difference in the optical characteristics of the target and the shielding object.
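The worked example can be reproduced in a few lines of Python, using the sample values from the example itself:

```python
# Sample values from the worked example at wavelength 550 nm.
g = [0.8, 0.6, 0.9, 0.7, 0.85]        # illumination gradient values
r_tar = [0.35, 0.40, 0.38, 0.36, 0.37]  # target-area reflectances
r_occ = [0.50, 0.48, 0.52, 0.49, 0.51]  # occluder-area reflectances

g_sum = sum(g)  # Step 1: sum of gradient values (3.85)

# Steps 2 and 3: gradient-weighted average reflectance of each area.
avg_tar = sum(gi * ri for gi, ri in zip(g, r_tar)) / g_sum
avg_occ = sum(gi * ri for gi, ri in zip(g, r_occ)) / g_sum

# Step 4: spectral reflectance deviation value (~0.131).
deviation = abs(avg_occ - avg_tar)
```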
S203, calculating an illumination intensity variation range based on the optical characteristic difference of the target and the shielding object, calculating illumination compensation data according to the illumination intensity variation and the spectral reflectivity deviation, and calculating illumination matching errors among areas by combining form compensation information to obtain an illumination matching data set;
The illumination intensity variation range is calculated based on the optical characteristic difference between the target and the shielding object, and illumination compensation data are calculated according to the illumination intensity variation and the spectral reflectance deviation. An illumination compensation coefficient $k$ is set, and the compensation weight of each pixel point is calculated from it. The illumination matching error between areas is then calculated by combining the morphology compensation information: an error metric $E$ is set and computed as the mean square error

$$E = \frac{1}{n}\sum_{i=1}^{n}\left(I_i^{comp} - I_i^{ref}\right)^2$$

wherein $I_i^{comp}$ is the compensated illumination value and $I_i^{ref}$ is the reference illumination value of pixel point $i$; the illumination matching data set is finally obtained.
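The mean-square illumination matching error can be sketched directly from its definition; the sample compensated and reference illumination values below are assumptions:

```python
def matching_error(compensated, reference):
    # Mean square error E between the compensated illumination values
    # and the reference illumination values, per the text.
    n = len(compensated)
    return sum((c - r) ** 2 for c, r in zip(compensated, reference)) / n

comp = [100.0, 104.0, 98.0, 101.0]   # assumed compensated illumination
ref = [100.0, 102.0, 99.0, 100.0]    # assumed reference illumination
err = matching_error(comp, ref)       # (0 + 4 + 1 + 1) / 4 = 1.5
```

A smaller `err` indicates that the illumination compensation has brought the region closer to the reference illumination, which is what the matching data set records per region pair.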
The specific steps of S3 are as follows:
S301, acquiring an illumination matching data set, extracting optical characteristic data of all pixel points in a target area, calculating the spectrum intensity of the pixel points under different wavelengths, and counting the numerical distribution range of illumination characteristics in the area to obtain the optical characteristic data distribution information of the image target area;
The illumination matching data set is acquired as follows. First, a designated target area is selected from the target image, and the optical characteristic data of all pixel points of the area are extracted through pixel coordinates; the optical characteristics of each pixel point include its reflectivity, transmissivity and absorptivity at different wavelengths. The spectral responses of the target area at multiple wavelengths are captured with a hyperspectral imaging device, and the spectral intensities of the pixel points at different wavelengths are calculated. Let the spectral intensity of a certain pixel point at wavelength $\lambda$ be $I(\lambda)$; the intensity can be expressed as:

$$I(\lambda) = I_0(\lambda)\,\rho(\lambda)$$

Wherein $I_0(\lambda)$ represents the intensity of the incident light and $\rho(\lambda)$ represents the reflectivity of the pixel point. For a given target area, performing this calculation on all pixel points and summarizing the spectral intensity data yields the overall spectral response of the area at different wavelengths. To further analyze the distribution of the illumination characteristic data, the distribution range of the optical data of all pixel points in the area is counted; the maximum value, minimum value, mean value and standard deviation can be calculated. Assume the set of spectral intensities of all $N$ pixel points in the target area at a certain wavelength is $\{I_1, I_2, \ldots, I_N\}$; the maximum and minimum values are respectively:

$$I_{max} = \max_{1 \le i \le N} I_i$$

$$I_{min} = \min_{1 \le i \le N} I_i$$

The mean and standard deviation are calculated as follows:

$$\mu = \frac{1}{N}\sum_{i=1}^{N} I_i$$

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(I_i - \mu\right)^2}$$
As a practical example, if an image target area includes 100 pixel points and, at a wavelength of 500 nm, its spectral intensity has a maximum value of 1200, a minimum value of 800, a mean value of 1000 and a standard deviation of about 115.47, then the numerical distribution range of the illumination characteristic of the area is [800, 1200]; the optical characteristic data distribution information of the image target area is finally obtained.
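The statistics in S301 can be computed with a short sketch. The five-point intensity sample is an assumption for illustration (the 100-pixel example above is not reproduced here):

```python
def intensity_stats(intensities):
    # Max, min, mean and population standard deviation of the spectral
    # intensities collected at one wavelength, per the formulas above.
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((x - mean) ** 2 for x in intensities) / n
    return max(intensities), min(intensities), mean, var ** 0.5

sample = [800, 900, 1000, 1100, 1200]   # assumed intensity sample
i_max, i_min, mu, sigma = intensity_stats(sample)
```

The pair `(i_min, i_max)` gives the numerical distribution range of the illumination characteristic, and `sigma` summarises its spread, which is the distribution information the step records.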
S302, calculating the reflectance ratio of the target area to the shielding object area at each wavelength based on the distribution information of the optical characteristic data of the image target area and the optical characteristic data of the corresponding shielding object area, calculating the spectral deviation according to the spectral energy distribution, analyzing the change trend of the spectral deviation over different wave bands, and determining the degree of independence of the spectral characteristics of the target area compared with the shielding object to obtain the optical independence value of the image target area;
based on the distribution information of the optical characteristic data of the image target area, firstly, acquiring the optical characteristic data of the area corresponding to the shielding object, comparing and analyzing the optical characteristic data with the optical characteristic data of the target area, calculating the reflectivity ratio of the wavelength of the target area to the wavelength of the shielding object area, and setting the wavelength of the target area Average reflectance at this point isThe average reflectivity of the occlusion region isIts reflectance ratio can be expressed as:
$R(\lambda) = \dfrac{\rho_t(\lambda)}{\rho_o(\lambda)}$;
The ratio reflects the difference in optical reflection characteristics between the target area and the shielding object area: when $R(\lambda) > 1$, the reflectivity of the target area is higher than that of the shielding object, and conversely lower. To further analyze the optical independence of the target area, the spectral deviation degree needs to be calculated; it measures the difference of the spectral characteristics of the target area compared with the shielding object and can be expressed as:
$D(\lambda) = \rho_t(\lambda) - \rho_o(\lambda)$;
In order to analyze the variation trend of the spectral deviation degree in different wave bands, the variation rate of the spectral deviation degree over a plurality of wavelengths is calculated. Assume the spectral deviation degree set at wavelengths $\lambda_1, \lambda_2, \dots, \lambda_n$ is $\{D(\lambda_1), D(\lambda_2), \dots, D(\lambda_n)\}$; the rate of change between adjacent wavelengths can be calculated as:
$V_i = \dfrac{D(\lambda_{i+1}) - D(\lambda_i)}{\lambda_{i+1} - \lambda_i}$;
If the reflectivity of the target area at the wavelengths of 500 nm, 600 nm and 700 nm is 0.5, 0.6 and 0.7 respectively, and the reflectivity of the shielding object area at the same wavelengths is 0.4, 0.45 and 0.5 respectively, then the reflectance ratios are about 1.25, 1.33 and 1.4, the spectral deviations are 0.1, 0.15 and 0.2, and the rates of change are both 0.0005; the optical independence value of the target area is determined by these calculation results.
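The ratio, deviation, and change-rate computations of S302 can be sketched as follows, reproducing the worked numbers from the text:

```python
def reflectance_ratio(rho_t, rho_o):
    """Reflectance ratio of the target area to the shielding object area."""
    return rho_t / rho_o

def spectral_deviation(rho_t, rho_o):
    """Spectral deviation degree: target reflectivity minus the occluder's."""
    return rho_t - rho_o

def deviation_change_rates(wavelengths, deviations):
    """Rate of change of the spectral deviation between adjacent wavelengths."""
    return [(deviations[i + 1] - deviations[i]) / (wavelengths[i + 1] - wavelengths[i])
            for i in range(len(deviations) - 1)]

wavelengths = [500, 600, 700]    # nm
target = [0.5, 0.6, 0.7]         # target-area reflectivities
occluder = [0.4, 0.45, 0.5]      # shielding-object reflectivities
ratios = [reflectance_ratio(t, o) for t, o in zip(target, occluder)]
deviations = [spectral_deviation(t, o) for t, o in zip(target, occluder)]
rates = deviation_change_rates(wavelengths, deviations)
# ratios ≈ [1.25, 1.33, 1.4]; deviations ≈ [0.1, 0.15, 0.2]; rates ≈ [0.0005, 0.0005]
```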
S303, calculating the motion trail deviation degree of the target area and the shielding object area based on the optical independence value of the image target area and combining the motion trail data of the image target area, adjusting the boundary range of the target area according to the optical independence value, comparing the optical characteristic change of the target area and the background area, calculating the spectrum and the motion characteristic change rate of the edge area, and obtaining the boundary points of the target area and the background area according to the change rate to obtain the separation boundary value of the image target area;
Based on the optical independence value of the image target area and combining the motion trail data of the target area, calculate the motion trail offset degree of the target area and the shielding object area. Let the trail of the target area be $T_t = \{p_1, p_2, \dots, p_n\}$ and the trail of the shielding object area be $T_o = \{q_1, q_2, \dots, q_n\}$; the offset between the two is calculated as $\Delta T = \frac{1}{n}\sum_{i=1}^{n} \lVert p_i - q_i \rVert$;
In order to determine the boundary range of the target area more accurately, the boundary is adjusted according to the optical independence value. Let the initial boundary be $B_0$ and the adjusted boundary be $B'$; the boundary adjustment function is:
$B' = B_0 + O \cdot k$;
where $k$ is the adjustment coefficient, set according to the spectral characteristics of different wave bands, and $O$ is the optical independence value. Then, by comparing the optical characteristic change of the target area and the background area, the change rate of the spectral and motion characteristics of the edge region is calculated. Let the spectral intensity of an edge pixel point of the target area at times $t_1$ and $t_2$ be $S(t_1)$ and $S(t_2)$; the rate of change is calculated as follows: $V_s = \dfrac{S(t_2) - S(t_1)}{t_2 - t_1}$;
if the initial boundary of the target area is 50 pixels, the optical independence value is 0.2, and the adjustment coefficient is 10, the adjusted boundary is 50 + 0.2 × 10 = 52 pixels; if the spectral intensity of a certain edge pixel point changes from 1000 to 1050 within a 500 ms time interval, the spectral change rate is 100 units/second. Finally, the boundary points of the target area and the background area are obtained, giving the separation boundary value of the image target area.
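A minimal sketch of the boundary adjustment and edge change rate in S303, reproducing the worked example (boundary 50 px, independence 0.2, coefficient 10; intensity 1000 → 1050 over 0.5 s):

```python
def adjust_boundary(initial_boundary, independence, coeff):
    """Adjusted boundary B' = B0 + O * k."""
    return initial_boundary + independence * coeff

def spectral_change_rate(s_start, s_end, dt_seconds):
    """Rate of change of spectral intensity over a time interval (units/second)."""
    return (s_end - s_start) / dt_seconds

boundary = adjust_boundary(50, 0.2, 10)       # 50 + 0.2 * 10 = 52 pixels
rate = spectral_change_rate(1000, 1050, 0.5)  # (1050 - 1000) / 0.5 s = 100 units/s
```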
The specific steps of S4 are as follows:
S401, obtaining a separation boundary value of an image target area, calculating gradient change, continuity degree and sealing level of boundary pixels, screening an area meeting the integrity requirement, adjusting the form of boundary contour pixels based on the screened boundary value, establishing compensated boundary contour data, calculating a matching error after form compensation, and generating a matching error value;
Obtain the separation boundary value of the image target area, extract the edge contour of the target area through an image segmentation algorithm, select the Canny operator to calculate the boundary pixel gradient, and set an edge detection threshold $T_e$ to screen gradient mutation points. The gradient changes in the x direction and the y direction are calculated with the Sobel operator on the image $f(x, y)$, and a gradient threshold $T_g$ is set to screen significant gradient points; the variance $\sigma_g^2$ of consecutive pixel gradients is calculated, and when $\sigma_g^2 < T_\sigma$ the boundary gradient is considered continuous. The boundary closure level $C_b$ is calculated as the proportion of closed boundary pixel points to the total boundary points, and a closure threshold $T_c$ is set to screen complete boundary areas. For the screened boundary data, the boundary contour is adjusted by a morphological dilation operation with a structural element of size $s$, and the compensated boundary contour data is constructed. The matching error between the morphologically compensated boundary and the original boundary is then calculated through the Hausdorff distance $H(A, B) = \max\{\sup_{a \in A} \inf_{b \in B} d(a, b),\ \sup_{b \in B} \inf_{a \in A} d(a, b)\}$, where $A$ is the original boundary point set, $B$ is the compensated boundary point set, and $d$ is the Euclidean distance. When $H(A, B) < T_H$, the compensation is considered successful, and the matching error value is finally generated.
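The Hausdorff-distance check in S401 can be sketched as follows; the point sets and the success threshold here are hypothetical, not taken from the text:

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two 2-D point sets,
    using the Euclidean distance d(a, b)."""
    def directed(P, Q):
        return max(min(math.dist(p, q) for q in Q) for p in P)
    return max(directed(A, B), directed(B, A))

# Hypothetical original (A) and compensated (B) boundary point sets.
A = [(0, 0), (0, 1), (1, 1)]
B = [(0, 0), (0, 1), (1, 2)]
h = hausdorff(A, B)   # farthest mismatch: (1, 1) vs (1, 2) -> 1.0
threshold = 2.0       # assumed compensation-success threshold
compensation_ok = h < threshold
```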
S402, calculating the classification matching priority of a target area based on the matching error value, extracting optical characteristics including light intensity balance degree, color gradient and reflection level, and calculating optical contrast, boundary optical gradient and influence degree of light interference by combining with a separation boundary value of the target area to obtain an optical contrast priority value;
First, the obtained matching errors need to be classified according to a set numerical range, for example with 0 to 5 as the total error interval: errors between 0 and 1.5 are classified as high priority, errors between 1.5 and 3 as medium priority, and areas exceeding 3 as low priority. Setting the error range in this way ensures that the priority distribution covers the whole target area. Then, an optical characteristic extraction operation is carried out on each area, covering light intensity balance degree, color gradient and reflection level. The light intensity balance degree can be obtained by counting the dispersion of the gray values of all pixels in the area; for example, in an industrial production line image, if the pixel brightness in the area is concentrated between 125 and 135, the balance degree is high, whereas a brightness distribution spanning 60 to 210 indicates a low balance degree. The color gradient can be extracted by observing the red, green and blue channels of the image separately; for example, if the color values of adjacent pixels in the blue channel are 70, 75, 72, 110 and 115, the color gradient of the area is large, while values concentrated between 70 and 75 indicate a small gradient. In commodity surface detection, this analysis can be used to distinguish an edge bar code from a smooth surface. The reflection level can be judged by the intensity of bright regions: in an indoor image, if the brightness of a local area obviously exceeds the surrounding area and there is no obvious light source, the area exhibits specular reflection or light interference. The reflection intensity can also be observed through the brightness jump frequency; for example, if 6 brightness fluctuations exceeding 30 units occur within 10 consecutive pixels, the reflection level is regarded as high. After the optical characteristic extraction is finished, the optical characteristics are fused with the separation boundary value of the target area, and the optical contrast is calculated from the difference between the maximum and minimum brightness values: if the brightness of a certain area floats between 50 and 240, the contrast is obviously high, whereas brightness distributed between 110 and 130 indicates low contrast; in practice a contrast difference exceeding 150 marks a high-contrast area. The boundary optical gradient is then judged by selecting a pixel line close to the boundary and comparing the brightness difference between adjacent pixels along the line: if the overall change is stable, the boundary optical gradient is low; otherwise the boundary is a strong boundary caused by strong light or a color jump. The degree of influence of light interference can be judged from whether there is obvious overexposure or cast shadow in the image; for example, a straight strong-light band or a dark black band across the target area can be initially identified as a light interference area. The number of interference pixels is counted and compared with the total pixel number of the area; a proportion above 30% indicates significant interference. After all the characteristics are combined, each index is given an empirically set weight, for example 0.3 for light intensity balance degree, 0.3 for color gradient and 0.3 for reflection level; each index is multiplied by its weight and the products are summed into a total score. Areas whose total score exceeds a set threshold are assigned a higher priority, and the optical contrast priority value of each comparison area is thereby obtained.
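The error-based priority classes and the weighted fusion of optical indices described above can be sketched as follows; the weights follow the example values in the text, while the feature values and their normalization to [0, 1] are assumptions:

```python
def error_priority(error):
    """Map a matching error in the total interval [0, 5] to a priority class."""
    if error <= 1.5:
        return "high"
    if error <= 3:
        return "medium"
    return "low"

def optical_priority_score(features, weights):
    """Weighted sum of (assumed normalized) optical-feature indices."""
    return sum(weights[name] * value for name, value in features.items())

weights = {"balance": 0.3, "color_gradient": 0.3, "reflection": 0.3}
features = {"balance": 0.8, "color_gradient": 0.5, "reflection": 0.2}  # assumed
score = optical_priority_score(features, weights)  # 0.24 + 0.15 + 0.06 = 0.45
```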
S403, adjusting a target region classification standard based on the optical contrast priority value, calculating a classification weight correction value, correcting classification data of the target region according to the matching error value and the optical independence value, screening a region meeting the classification correction requirement, and obtaining an image target classification result;
the specific calculation formula for calculating the classification weight correction value is as follows:
;
where $W$ represents the classification weight correction value, $C_i$ represents the optical contrast priority value of the $i$-th target area, $E_i$ represents the matching error of the $i$-th target area, $O_i$ represents the optical independence value of the $i$-th target area, $S_i$ represents the initial classification standard value of the $i$-th target area, and $N$ represents the total number of target areas.
The optical contrast priority value $C_i$ is quantified by measuring the difference in light intensity between the target area and the background. Using the Michelson contrast definition from optics, the contrast calculation formula is as follows:
$C_i = \dfrac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}}$;
where $I_{\max}$ and $I_{\min}$ represent the maximum and minimum light intensity values of the target area, respectively. Assuming values of $I_{\max}$ and $I_{\min}$ for a target area (unit: light intensity value), then:
;
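A sketch of the contrast calculation from the maximum and minimum light intensity; the example intensities 240 and 50 are borrowed from the contrast discussion in S402:

```python
def light_contrast(i_max, i_min):
    """Contrast C = (I_max - I_min) / (I_max + I_min)."""
    return (i_max - i_min) / (i_max + i_min)

c = light_contrast(240, 50)  # 190 / 290 ≈ 0.655
```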
The matching error value $E_i$ is determined by calculating the difference between the target area and a reference template. The normalized correlation coefficient matching method is adopted, and the calculation formula is as follows:
$E_i = \dfrac{\sum_{x,y} \left(T(x,y) - \bar{T}\right)\left(I(x,y) - \bar{I}\right)}{\sqrt{\sum_{x,y} \left(T(x,y) - \bar{T}\right)^2 \sum_{x,y} \left(I(x,y) - \bar{I}\right)^2}}$;
where $T(x,y)$ and $I(x,y)$ represent the pixel values of the template and the target region respectively, and $\bar{T}$ and $\bar{I}$ are their average values; a value of $E_i$ is calculated for each target area under this assumption.
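The normalized correlation coefficient above can be sketched for flattened pixel lists; the template and region values are hypothetical:

```python
def ncc(template, region):
    """Normalized correlation coefficient between a template and a
    target region of the same size (given as flat pixel lists)."""
    n = len(template)
    mt = sum(template) / n
    mr = sum(region) / n
    num = sum((t - mt) * (r - mr) for t, r in zip(template, region))
    den = (sum((t - mt) ** 2 for t in template)
           * sum((r - mr) ** 2 for r in region)) ** 0.5
    return num / den

# A region that matches the template up to a uniform brightness shift
# correlates perfectly (coefficient 1.0).
score = ncc([10, 20, 30, 40], [12, 22, 32, 42])
```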
The optical independence value $O_i$ represents the degree of independence of the target area in terms of optical characteristics. The independence is quantified by analyzing the optical properties of the target area, such as spectral characteristics or polarization characteristics; an optical independence value is assumed for each target area.
The initial classification standard value $S_i$ is the value obtained by performing a preliminary classification of the target area according to a preset classification standard; an initial classification standard value is assumed for each target area.
The total number of target areas $N$ represents the number of target areas involved in the calculation; here $N = 5$ is assumed.
Substituting the parameters into a formula to calculate a classification weight correction value:
Assume that there are 5 target areas, the parameters of which are as follows:
Region 1: ;
Region 2: ;
Region 3: ;
Region 4: ;
Region 5: ;
First, the numerator part is calculated:
;
;
Then, the denominator part is calculated:
;
;
;
Thus, the denominator is 1.7958 + 0.45 = 2.2458;
Finally, the classification weight correction value $W$ is:
;
The result shows that the calculated classification weight correction value $W$ is about 0.4765. This value is used to adjust the classification data of the target areas and screen the areas meeting the classification correction requirement, obtaining the image target classification result.
Referring to fig. 2, an image data processing system includes:
The form matching adjustment module obtains the boundary pixel value of the target area, calculates the gradient change rate of adjacent pixels, counts the pixel density change quantity of the unit area, calculates the form matching deviation value, judges the deviation direction and amplitude, extracts the area with the maximum deviation quantity based on the local deviation gradient value, sets the form adjustment weight, calculates the direction compensation value, adjusts the boundary contour of the target area, calculates the contour matching difference value after adjustment, and obtains the form compensation data set;
The illumination matching optimization module calculates an illumination intensity gradient value of a target area based on the form compensation data set, extracts the surface spectral reflectivity, calculates the optical characteristic difference between the target area and the shielding object, calculates dynamic illumination compensation parameters according to the illumination change trend, combines the form compensation data set, constructs an illumination matching error model, and acquires an illumination matching data set;
The optical characteristic analysis module calculates an optical characteristic vector of a target area based on the illumination matching data set, compares the spectrum characteristic difference of the target area and the shielding object, calculates optical independence, analyzes a motion trail deviation value, eliminates an area with consistent motion modes but different optical characteristics, and acquires an image target area separation boundary value;
The target classification correction module calculates a target region matching score based on the image target region separation boundary value and combines the morphology compensation data set, adjusts the classification matching priority, calculates an optical independence score according to the target region separation boundary value, corrects the target classification weight, and obtains an image target classification result.
The present invention is not limited to the above embodiments; any equivalent embodiment that can be conceived from the technical disclosure described above, whether applied in this field or others, and any simple modification, equivalent change or alteration made to the above embodiments according to the technical substance of the present invention, still falls within the protection scope of the technical solution of the present invention.