
CN119887816A - Image data processing method, system, computer equipment and storage medium - Google Patents

Image data processing method, system, computer equipment and storage medium

Info

Publication number
CN119887816A
CN119887816A
Authority
CN
China
Prior art keywords
value, calculating, target, target area, area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202510367847.1A
Other languages
Chinese (zh)
Inventor
鄢涛
鲜翰禹
吴闻辰
刘永红
赵卫东
王述珍
肖國行
罗钦文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu University
Original Assignee
Chengdu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu University
Priority to CN202510367847.1A
Publication of CN119887816A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract


The present invention relates to the field of image processing technology, specifically to an image data processing method, system, computer equipment and storage medium, comprising the following steps: acquiring image target boundary pixel data, calculating boundary gradient change data and local pixel density change data, determining the morphological offset direction and amplitude, extracting the maximum-offset area, setting local morphological adjustment weights, calculating direction compensation values, adjusting the image target contour, and generating a morphological compensation data set. In the present invention, boundary pixel extraction and gradient change calculation capture local pixel density fluctuations, making target morphology adjustment more accurate. Combining light-intensity gradient and spectral reflectance extraction improves the optical matching of the target area, and target contour separation accuracy is optimized in trajectory analysis; recognition accuracy is improved based on separation boundary values and morphological compensation data. The overall solution enhances target contour accuracy, optical adaptability and classification stability, and maintains high recognition reliability in complex environments.

Description

Image data processing method, system, computer equipment and storage medium
Technical Field
The present invention relates to the field of image technologies, and in particular, to an image data processing method, an image data processing system, a computer device, and a storage medium.
Background
The technical field of image processing comprises many aspects of digital image data, such as acquisition, analysis, transformation and application; its core content includes technologies such as image acquisition, storage, compression, enhancement, recognition, segmentation and reconstruction. The field covers acquiring original image data from image acquisition equipment and processing it by means of mathematical transformation, filtering, histogram equalization and the like, so that the image data is suitable for different application requirements. Image processing technology is widely applied in fields such as computer vision, medical image analysis, remote sensing monitoring and industrial detection, often involving links such as image feature extraction, edge detection, pattern matching and object recognition. With the improvement of hardware performance and the development of algorithm optimization, image processing methods continue to evolve, gradually developing toward higher precision and lower computational cost.
An image data processing method refers to the processing steps and operating modes applied to image data, involving links such as data acquisition, transformation, storage and analysis. Such methods are generally based on mathematical techniques such as multi-channel filtering, image matrix operations, pixel-level calculation and Fourier transformation, and perform processes such as denoising, feature extraction, boundary detection or color correction on the input image data. For example, in image denoising, the image is smoothed with methods such as Gaussian filtering or median filtering to reduce noise interference; in feature extraction, gradient operations or edge detection operators are applied to obtain the contour information of the image; and in image color adjustment, brightness, contrast or saturation is adjusted by means of a color-space transformation. The method performs specific processing on the image data according to predetermined calculation steps for subsequent analysis or application.
In image target boundary processing, the prior art relies on edge detection operators to extract contour information, but adapts poorly to changes in local pixel density and is easily disturbed by noise, leading to inaccurate boundary extraction results. In morphology matching, morphological offset analysis is generally performed through feature-point comparison, and the gradient changes of local areas are not fully considered, so errors exist in judging the offset direction and amplitude. In illumination compensation, optical characteristic correction is mostly carried out by global brightness adjustment or color-space conversion, which ignores the spectral reflectance difference between the target and the shielding object, so that the target area is easily confused with the background under complex illumination. In motion analysis, existing methods mainly rely on trajectory characteristics for target separation and cannot effectively combine optical characteristics, making it difficult to reject regions with similar motion modes but different optical properties, which affects the accuracy of the target contour. In target classification, matching is performed according to morphological characteristics while the influence of optical independence is ignored, so classification stability is low in environments with large illumination variation. These problems easily increase the recognition error rate in complex application scenarios and affect the reliability of the final analysis results.
Disclosure of Invention
The invention aims to solve the defects in the prior art and provides an image data processing method, an image data processing system, computer equipment and a storage medium.
In order to achieve the above purpose, the invention adopts the following technical scheme that the image data processing method comprises the following steps:
S1, obtaining image target boundary pixel data, calculating boundary gradient change data and local pixel density change data, judging the shape offset direction and amplitude, extracting the area with the largest offset, setting local shape adjustment weight, calculating direction compensation value, adjusting image target contour, calculating contour matching difference value, and generating a shape compensation data set;
S2, calculating illumination gradient of an image target area based on the morphological compensation data set, extracting spectral reflectivity, analyzing optical characteristic difference of the image target and a shielding object, calculating dynamic illumination compensation parameters by combining illumination change trend, analyzing illumination matching error, and extracting an illumination matching data set;
S3, calculating an optical characteristic vector of an image target area according to the illumination matching data set, comparing the spectrum characteristic difference of the image target and the shielding object, analyzing the optical independence of the image target area, calculating a motion track difference value of the image target area, and obtaining a separation boundary value of the image target area;
S4, calculating a compensated matching score by combining the image target region separation boundary value and the morphological compensation data set, adjusting the classification matching priority of the image target region, and correcting the classification weight to obtain an image target classification result.
As a further scheme of the invention, the specific steps of S1 are as follows:
S101, obtaining boundary pixels of an image target area, calculating gradient change values of the boundary pixels, counting pixel density change values in a local area, calculating matching deviation values of the image target form according to the pixel density change values and the gradient change values, and judging form deviation directions and amplitudes of the target area according to the matching deviation values to obtain form deviation data;
S102, calculating the offset gradient of a local area based on the form offset data, extracting the area with the largest offset, setting local form adjustment weight, calculating a direction compensation value according to the form adjustment weight, and adjusting the outline of the image target area by combining the direction compensation value to obtain an outline form adjustment value;
S103, calculating a contour matching difference value based on the contour form adjustment value, analyzing the form change characteristics of the local area by combining the contour matching difference value, screening the offset contour area, calculating the form compensation intensity of the offset area, and establishing a form compensation data set by combining the area compensation intensity.
As a further scheme of the invention, the specific steps of S2 are as follows:
S201, acquiring illumination intensity of a target area in the morphological compensation data set, calculating illumination change rates of adjacent pixel points in the horizontal and vertical directions, screening gradient abrupt change areas, and calculating gradient change trend according to difference of adjacent gradient point values to obtain illumination gradient values of the target area of the image;
S202, calculating the spectral reflectivity of a pixel point of a target area in a difference wavelength range based on the illumination gradient value of the target area of the image, screening areas with poor reflectivity, calculating the spectral reflectivity deviation value of the target and a shielding object, and acquiring the optical characteristic difference of the target and the shielding object by combining the illumination gradient change;
S203, calculating an illumination intensity change range based on the optical characteristic difference of the target and the shielding object, calculating illumination compensation data according to the illumination intensity change and the spectral reflectance deviation, and calculating illumination matching errors among areas by combining form compensation information to obtain an illumination matching data set.
As a further aspect of the present invention, the specific calculation formula for calculating the spectral reflectance deviation value of the target and the shielding object is:

$$\Delta R(\lambda) = \frac{\sum_{i=1}^{n} G_i(\lambda)\,\bigl(R_i^{t}(\lambda) - R_i^{s}(\lambda)\bigr)}{\sum_{i=1}^{n} G_i(\lambda)}$$

The spectral reflectance deviation value is calculated, and the optical characteristic difference between the target and the shielding object is acquired by combining the illumination gradient change.

Wherein $\Delta R(\lambda)$ represents the spectral reflectance deviation value of the target and the shielding object at wavelength $\lambda$; $G_i(\lambda)$ represents the illumination gradient value of pixel point $i$ at wavelength $\lambda$; $R_i^{t}(\lambda)$ represents the spectral reflectance of pixel point $i$ in the target area at wavelength $\lambda$; $R_i^{s}(\lambda)$ represents the spectral reflectance of pixel point $i$ in the shielding-object area at wavelength $\lambda$; $n$ represents the total number of selected pixels in the target region and the occlusion region; and $\sum_{i=1}^{n} G_i(\lambda)$ represents the sum of the illumination gradient values of all selected pixel points in the wavelength range.
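A minimal numeric sketch of the gradient-weighted reflectance deviation described here. Function and variable names are mine, and the weighting form (pixels under stronger illumination gradients contributing more) is an assumption consistent with the terms the patent lists:

```python
import numpy as np

def reflectance_deviation(grad, refl_target, refl_occluder):
    """Gradient-weighted spectral reflectance deviation at one wavelength.

    grad: illumination gradient value of each selected pixel; refl_target /
    refl_occluder: spectral reflectance of each pixel in the target and
    shielding-object areas (all names are illustrative).
    """
    grad = np.asarray(grad, dtype=float)
    diff = np.asarray(refl_target, dtype=float) - np.asarray(refl_occluder, dtype=float)
    # Pixels under stronger illumination gradients contribute more.
    return float(np.sum(grad * diff) / np.sum(grad))
```

With uniform gradients the deviation reduces to the mean per-pixel reflectance difference between the two areas.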
As a further scheme of the invention, the specific steps of S3 are as follows:
S301, acquiring the illumination matching data set, extracting optical characteristic data of all pixel points in a target area, calculating the spectrum intensity of the pixel points under different wavelengths, and counting the numerical distribution range of illumination characteristics in the area to obtain the optical characteristic data distribution information of the image target area;
S302, calculating the reflectance ratio of the wavelengths of the target area and the shelter area based on the distribution information of the optical characteristic data of the target area of the image and the optical characteristic data of the corresponding area of the shelter, calculating the spectrum deviation according to the spectrum energy distribution, analyzing the change trend of the spectrum deviation in a difference wave band, and determining the independence degree of the spectrum characteristic of the target area compared with the shelter to obtain the optical independence numerical value of the target area of the image;
S303, calculating the motion track deviation degree of the target area and the shielding object area based on the optical independence numerical value of the image target area and combining the motion track data of the image target area, adjusting the boundary range of the target area according to the optical independence numerical value, comparing the optical characteristic change of the target area and the background area, calculating the spectrum and the motion characteristic change rate of the edge area, and obtaining the boundary point of the target area and the background area according to the change rate to obtain the separation boundary value of the image target area.
As a further scheme of the invention, the specific steps of S4 are as follows:
S401, obtaining a separation boundary value of the image target area, calculating gradient change, continuity degree and sealing level of boundary pixels, screening an area meeting the integrity requirement, adjusting the form of boundary contour pixels based on the screened boundary value, establishing compensated boundary contour data, calculating a matching error after form compensation, and generating a matching error value;
S402, calculating the classification matching priority of the target area based on the matching error value, extracting optical characteristics including light intensity balance degree, color gradient and reflection level, and calculating optical contrast, boundary optical gradient and influence degree of light interference by combining the separation boundary value of the target area to obtain an optical contrast priority value;
S403, adjusting a target region classification standard based on the optical contrast priority value, calculating a classification weight correction value, correcting classification data of the target region according to the matching error value and the optical independence value, screening the region meeting the classification correction requirement, and obtaining an image target classification result.
As a further aspect of the present invention, the specific calculation formula for calculating the classification weight correction value is:

$$W = \frac{1}{N}\sum_{i=1}^{N} \frac{P_i \cdot O_i}{E_i + C_i}$$

The classification weight correction value is calculated, the classification data of the target area are corrected by combining the optical independence value, the areas meeting the classification correction requirements are screened, and the image target classification result is obtained.

Wherein $W$ represents the classification weight correction value; $P_i$ represents the optical contrast priority value of the $i$-th target area; $E_i$ represents the matching error of the $i$-th target area; $O_i$ represents the optical independence value of the $i$-th target area; $C_i$ represents the initial classification criterion value of the $i$-th target area; and $N$ represents the total number of target areas.
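A hedged sketch of one plausible reading of this correction: an average over target regions of optical-contrast priority times optical independence, damped by the matching error plus the initial classification criterion. The exact combination is an assumption, since the patent's formula did not survive extraction:

```python
import numpy as np

def classification_weight_correction(priority, match_err, independence, criterion):
    """Average over target regions of priority * independence, damped by the
    matching error plus the initial classification criterion (this exact
    combination is an assumption)."""
    p = np.asarray(priority, dtype=float)
    e = np.asarray(match_err, dtype=float)
    o = np.asarray(independence, dtype=float)
    c = np.asarray(criterion, dtype=float)
    return float(np.mean(p * o / (e + c)))
```

Larger matching errors shrink a region's contribution, while high priority and independence raise it, which matches the qualitative behavior the text describes.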
An image data processing system for performing the above-described image data processing method, the image data processing system comprising:
The form matching adjustment module obtains the boundary pixel value of the target area, calculates the gradient change rate of adjacent pixels, counts the pixel density change quantity of the unit area, calculates the form matching deviation value, judges the deviation direction and amplitude, extracts the area with the maximum deviation quantity based on the local deviation gradient value, sets the form adjustment weight, calculates the direction compensation value, adjusts the boundary contour of the target area, calculates the contour matching difference value after adjustment, and obtains the form compensation data set;
The illumination matching optimization module calculates an illumination intensity gradient value of a target area based on the form compensation data set, extracts surface spectral reflectivity, calculates the optical characteristic difference between the target area and a shielding object, calculates dynamic illumination compensation parameters according to illumination change trend, combines the form compensation data set, constructs an illumination matching error model, and acquires an illumination matching data set;
The optical characteristic analysis module calculates an optical characteristic vector of the target area based on the illumination matching data set, compares the spectrum characteristic difference of the target area and the shielding object, calculates optical independence, analyzes a motion track deviation value and acquires an image target area separation boundary value;
The target classification correction module calculates a target region matching score based on the image target region separation boundary value and combines the morphological compensation data set, adjusts the classification matching priority, calculates an optical independence score according to the target region separation boundary value, corrects the target classification weight, and obtains an image target classification result.
Compared with the prior art, the invention has the advantages and positive effects that:
According to the method, local pixel density fluctuation is captured through boundary pixel extraction and gradient change calculation, so that target morphology adjustment is more accurate. Combining illumination intensity gradient and spectral reflectivity extraction improves the optical matching degree of the target area, and target contour separation precision is optimized in motion track analysis. Recognition accuracy is improved based on separation boundary values and morphology compensation data. The overall scheme enhances target contour precision, optical adaptability and classification stability, and maintains high recognition reliability in complex environments.
Drawings
FIG. 1 is a schematic flow chart of the steps of the present invention;
FIG. 2 is a block diagram of a system according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, an image data processing method includes the following steps:
S1, obtaining image target boundary pixels, calculating boundary gradient change values, counting local pixel density changes, calculating image target form matching bias values, judging form bias directions and amplitudes, calculating local area bias gradients, extracting a maximum offset area, setting local form adjustment weights, calculating direction compensation values, adjusting image target area outlines, calculating contour matching difference values, and generating a form compensation data set;
S2, calculating an illumination intensity gradient value of an image target area according to the form compensation data set, extracting the spectral reflectivity of the surface of the image target, calculating the optical characteristic difference between the image target and a shielding object, calculating dynamic illumination compensation parameters of the image target area based on illumination change trend, analyzing illumination matching errors by combining form compensation information, and extracting an illumination matching data set;
S3, calculating an optical characteristic vector of an image target area based on the illumination matching data set, comparing the spectrum characteristic difference of the image target and the shielding object, analyzing the optical independence of the image target area, calculating a motion track difference value of the image target area, removing an area with consistent motion mode but different optical characteristic, correcting the outline of the image target, and obtaining a separation boundary value of the image target area;
S4, calculating a compensated image target region matching score according to the morphological compensation data set based on the image target region separation boundary value, adjusting the image target region classification matching priority, calculating an optical independence score of the image target region according to the image target region separation boundary value, adjusting an image target classification standard, calculating an image target classification weight correction value, and obtaining an image target classification result.
The morphology compensation data set comprises a boundary gradient change value, a local pixel density change, a morphology matching deviation value, a morphology deviation direction and amplitude, a local area deviation gradient, a deviation maximum area, a local morphology adjustment weight, a direction compensation value and a contour matching difference value, the illumination matching data set comprises an illumination intensity gradient value, a surface spectral reflectivity, an optical characteristic difference, a dynamic illumination compensation parameter and an illumination matching error model, the image target area separation boundary value comprises a corrected image target contour, an optical independence analysis result, a motion track difference value and a rejecting area range, and the image target classification adjustment value comprises a matching score, a classification matching priority, an optical independence score, a classification standard and a classification weight correction value.
The specific steps of S1 are as follows:
S101, obtaining boundary pixels of an image target area, calculating gradient change values of the boundary pixels, counting pixel density change values in a local area, calculating matching deviation values of the image target form according to the pixel density change values and the gradient change values, and judging form deviation directions and amplitudes of the target area according to the matching deviation values to obtain form deviation data;
Boundary pixels of the image target area are acquired: contour information of the target area is first extracted with an edge detection algorithm. A Canny edge detection operator can be used, with a low threshold T_low and a high threshold T_high set to screen suitable edge pixels and select the boundary pixel set B. Gradient change values of the boundary pixels are then calculated: a Sobel operator computes the horizontal and vertical gradients, which are combined into an overall gradient value G. The target area is divided into several local windows, and the pixel density variation within each window is counted, i.e. the number of highlighted pixels N_h and the number of low-brightness pixels N_l, from which the pixel density variation value D is calculated to measure the brightness fluctuation of the local area; combined with the gradient information of the boundary pixels, the degree of overall morphological change is analyzed. The morphological matching deviation value M is then obtained by comparing the gradient change value G with the pixel density variation value D: the differences between the gradient and the pixel density variation value are accumulated, so that M reflects the degree of morphological offset. A morphological offset threshold M_t is set; if the deviation value M exceeds M_t, the target area is judged to have an obvious morphological offset, the offset direction and amplitude are further analyzed, and the final morphological offset data are obtained.
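The S101 flow can be sketched as follows. This is a simplified, numpy-only sketch: `np.gradient` stands in for the Sobel operator, the deviation is the per-window gap between mean gradient and density variation, and the window size and thresholds `t_low`, `t_high`, `t_m` are hypothetical:

```python
import numpy as np

def morphological_offset_data(img, win=8, t_low=0.2, t_high=0.8, t_m=0.5):
    """Gradient change + local pixel-density change -> morphological deviation.

    img is a grayscale image with values in [0, 1]; win, t_low, t_high and
    the offset threshold t_m are all hypothetical parameters.
    """
    # Overall gradient value G (np.gradient stands in for the Sobel operator).
    gy, gx = np.gradient(img.astype(float))
    grad = np.hypot(gx, gy)
    # Boundary pixel set: gradient magnitude between the two screening thresholds.
    boundary = (grad > t_low) & (grad < t_high)

    records = []
    h, w = img.shape
    for r in range(0, h - win + 1, win):              # local windows
        for c in range(0, w - win + 1, win):
            patch = img[r:r + win, c:c + win]
            n_hi = int((patch > 0.5).sum())           # highlighted pixels
            n_lo = patch.size - n_hi                  # low-brightness pixels
            density = abs(n_hi - n_lo) / patch.size   # pixel density variation D
            g_mean = grad[r:r + win, c:c + win].mean()
            dev = abs(g_mean - density)               # morphological matching deviation
            records.append((r, c, dev, dev > t_m))    # offset judged against t_m
    return boundary, records
```

Each record carries the window origin, its deviation value, and the offset judgment, which together stand in for the "morphological offset data" of the text.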
S102, calculating an offset gradient of a local area based on the form offset data, extracting an area with the largest offset, setting local form adjustment weight, calculating a direction compensation value according to the form adjustment weight, and adjusting the outline of the image target area by combining the direction compensation value to obtain an outline form adjustment value;
The offset gradient of the local region is calculated based on the morphology offset data. The rate of change of the morphology offset data in the horizontal and vertical directions is first computed by partial differentiation to obtain the gradient change amount, which characterizes the degree of deviation in the local area. The region with the largest offset is then extracted by screening out the area of highest gradient change with a non-maximum suppression method. A local morphology adjustment weight is set according to the degree of offset, so that the larger the offset, the higher the adjustment weight. On this basis, the direction compensation value C is calculated as the product of the adjustment weight and the offset gradient; C reflects the correction amount required for local morphology adjustment. The direction compensation value is then applied to the contour adjustment of the target area, and the target contour is optimized with morphological transformations; the specific operations include morphological dilation or erosion to compensate the morphological change in the offset direction, so that the adjusted contour is closer to the original morphology. Finally, the contour morphology adjustment value is obtained.
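A minimal sketch of the S102 computation. A plain `argmax` is a crude stand-in for the non-maximum-suppression screening, and the weight (local offset relative to the maximum offset) is an illustrative choice satisfying "larger offset, higher weight":

```python
import numpy as np

def direction_compensation(offset_map):
    """Offset gradient -> max-offset region -> adjustment weight -> compensation.

    offset_map holds per-pixel morphology offset magnitudes; argmax is a crude
    stand-in for the non-maximum-suppression screening in the text.
    """
    # Rate of change of the offset data in the horizontal/vertical directions.
    dy, dx = np.gradient(offset_map.astype(float))
    offset_grad = np.hypot(dx, dy)
    # Location with the largest offset gradient.
    peak = np.unravel_index(np.argmax(offset_grad), offset_grad.shape)
    # Local morphology adjustment weight: the larger the offset, the higher it is.
    weight = offset_map / (offset_map.max() + 1e-12)
    # Direction compensation value: product of adjustment weight and offset gradient.
    comp = weight * offset_grad
    return peak, comp
```

The contour itself would then be adjusted by morphological dilation or erosion (e.g. `scipy.ndimage.binary_dilation`) guided by `comp`; that step is omitted here.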
S103, calculating a contour matching difference value based on a contour form adjustment value, analyzing form change characteristics of a local area by combining the contour matching difference value, screening an offset contour area, calculating form compensation intensity of the offset area, and establishing a form compensation data set by combining the area compensation intensity;
The contour matching difference value of the target area is calculated based on the contour morphology adjustment value. A contour matching degree is computed with a morphological matching algorithm, and a matching difference value d is defined from the matching degree such that the larger the matching difference value, the more obvious the contour change. The morphological change characteristics of the local area are then analyzed in combination with the contour matching difference value to extract the morphological change region, and the offset contour regions are screened. The morphology compensation intensity S of each offset region is calculated so that the compensation intensity is proportional to the matching difference value: the larger the matching difference value, the higher the corresponding morphology compensation intensity. Finally, a morphology compensation data set is established from the regional compensation intensities; it stores the morphology compensation information of the different regions, ensuring that subsequent morphology adjustment can be accurately optimized based on the matching difference and applied to the correction of the target contour.
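The S103 bookkeeping can be sketched in a few lines. Taking the matching difference as one minus the matching degree, and the gain `k` and screening threshold `d_min`, are assumptions on my part:

```python
def morphology_compensation_set(match_degree, k=1.0, d_min=0.1):
    """Matching difference -> compensation intensity -> compensation data set.

    match_degree maps a region id to its contour matching degree in [0, 1];
    d = 1 - matching degree, the gain k and the screening threshold d_min
    are all assumptions.
    """
    dataset = {}
    for region, m in match_degree.items():
        d = 1.0 - m                    # larger d = more obvious contour change
        if d >= d_min:                 # screen the offset contour regions
            dataset[region] = k * d    # compensation intensity proportional to d
    return dataset
```

Regions whose contours already match closely fall below `d_min` and receive no compensation entry.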
The specific steps of S2 are as follows:
S201, obtaining illumination intensity of a target area in a morphological compensation data set, calculating illumination change rates of adjacent pixel points in horizontal and vertical directions, screening gradient abrupt change areas, and calculating gradient change trend according to difference of adjacent gradient point values to obtain illumination gradient values of the target area of an image;
Acquiring the illumination intensity of the target area in the morphology compensation data: first, the pixel points of the target area are extracted from the original image data and their illumination intensities are recorded. For each pixel point, the change rate of the illumination intensity relative to its adjacent pixels in the horizontal and vertical directions is calculated; a gradient calculation method, such as the Sobel operator, is adopted to compute the horizontal gradient Gx and the vertical gradient Gy, and the overall gradient information is acquired through the gradient amplitude formula G = sqrt(Gx² + Gy²). When screening gradient mutation regions, a mutation threshold T is set; if the gradient at a certain point satisfies G > T, the point is judged to belong to a mutation region. For further judging the gradient change trend, first-derivative and second-derivative analysis is introduced:
the first derivative G′ of the gradient value is used for judging the increasing and decreasing trend of the gradient;
the second derivative G″ represents the change of the first-order rate of change and is used for capturing the stationarity or variability of the local gradient change;
if G″ falls within a set threshold range, the gradient change is judged to be stable and the region is judged to be a stable-gradient region; otherwise the point is judged to be a mutation point.
Finally, the illumination gradient value of the target area is obtained: the whole target area is traversed, a gradient distribution matrix is constructed, and the calculation results are stored into the morphology compensation data set, finally obtaining the illumination gradient value data set of the target area.
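The gradient screening described above can be sketched as a small, self-contained routine; the 3×3 Sobel kernels are standard, and the step-edge test image and the mutation threshold value are illustrative assumptions:

```python
import numpy as np

def sobel_gradients(img):
    """Compute horizontal/vertical Sobel gradients and the gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal kernel
    ky = kx.T                                                         # vertical kernel
    pad = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return gx, gy, np.hypot(gx, gy)   # magnitude G = sqrt(Gx^2 + Gy^2)

# A vertical step edge: left half dark, right half bright.
img = np.zeros((5, 6))
img[:, 3:] = 100
gx, gy, mag = sobel_gradients(img)

threshold = 50.0                  # mutation threshold T (assumed value)
mutation_mask = mag > threshold   # gradient-mutation region
```

Traversing the whole region this way yields the gradient distribution matrix described in the text.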
S202, calculating the spectral reflectivity of a pixel point of a target area in a difference wavelength range based on an illumination gradient value of the target area of the image, screening areas with different reflectivity, calculating a spectral reflectivity deviation value of the target and a shielding object, and acquiring the optical characteristic difference of the target and the shielding object by combining the illumination gradient change;
The specific calculation formula for calculating the spectral reflectance deviation value of the target and the shielding object is as follows:
ΔR(λ) = | Σ_{i=1}^{n} G_i(λ) · R_t,i(λ) − Σ_{i=1}^{n} G_i(λ) · R_o,i(λ) | / Σ_{i=1}^{n} G_i(λ);
Calculating a spectral reflectance deviation value, and acquiring the optical characteristic difference of the target and the shielding object by combining the illumination gradient change;
wherein ΔR(λ) represents the spectral reflectance deviation value of the target and the shielding object at the wavelength λ; G_i(λ) represents the illumination gradient value of the pixel point i at the wavelength λ; R_t,i(λ) represents the spectral reflectance of the pixel point i of the target area at the wavelength λ; R_o,i(λ) represents the spectral reflectance of the pixel point i of the shielding object area at the wavelength λ; n represents the total number of selected pixel points in the target region and the occlusion region; and Σ_{i=1}^{n} G_i(λ) represents the sum of the illumination gradient values of all selected pixel points at the wavelength.
Parameter acquisition: G_i(λ) represents the illumination gradient value of the pixel point i at the wavelength λ, calculated for each pixel point in the target area through an image processing technique. Assume that at the wavelength of 550 nm the illumination gradient values of the five selected pixels are 0.8, 0.6, 0.9, 0.7 and 0.85 respectively.
R_t,i(λ) represents the spectral reflectance of the pixel point i of the target area at the wavelength λ, measured by a spectral imaging technique. Assume that at the wavelength of 550 nm the spectral reflectances of the five selected pixels are 0.35, 0.40, 0.38, 0.36 and 0.37 respectively.
R_o,i(λ) represents the spectral reflectance of the pixel point i of the shielding object area at the wavelength λ, measured by a spectral imaging technique. Assume that at the wavelength of 550 nm the spectral reflectances of the five selected pixels are 0.50, 0.48, 0.52, 0.49 and 0.51 respectively.
The calculation steps are as follows:
step 1, calculating the sum of the illumination gradient values: Σ G_i = 0.8 + 0.6 + 0.9 + 0.7 + 0.85 = 3.85;
Step 2, calculating the weighted average spectral reflectivity of the target area;
Calculated molecular fraction (0.8x0.35) + (0.6x0.40) + (0.9x0.38) + (0.7x0.36) + (0.85 x0.37) =0.28+0.24+0.342+0.252+0.3145= 1.4285;
then a weighted average spectral reflectance is calculated: ;
Step 3, calculating the weighted average spectral reflectivity of the shelter area ;
the numerator is calculated as (0.8 × 0.50) + (0.6 × 0.48) + (0.9 × 0.52) + (0.7 × 0.49) + (0.85 × 0.51) = 0.40 + 0.288 + 0.468 + 0.343 + 0.4335 = 1.9325;
then the weighted average spectral reflectance is calculated: 1.9325 / 3.85 ≈ 0.502;
Step 4, calculating the spectral reflectance deviation value of the target and the shielding object: ΔR(550 nm) = |0.371 − 0.502| = 0.131;
The results illustrate:
The calculated spectral reflectance deviation value ΔR(550 nm) ≈ 0.131 shows that at a wavelength of 550 nm there is a difference of 0.131 between the spectral reflectance of the target region and that of the mask region. The result can be used to further analyze the difference in optical properties of the target and the obstruction.
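The worked example can be reproduced in a few lines; the per-pixel values below are read off from the products shown in the numerator calculations above:

```python
# Gradient-weighted spectral reflectance deviation at a single wavelength
# (550 nm), using the five sample pixels from the worked example.
G  = [0.8, 0.6, 0.9, 0.7, 0.85]        # illumination gradient values
Rt = [0.35, 0.40, 0.38, 0.36, 0.37]    # target-region spectral reflectance
Ro = [0.50, 0.48, 0.52, 0.49, 0.51]    # occlusion-region spectral reflectance

g_sum   = sum(G)                                      # sum of gradient weights
avg_t   = sum(g * r for g, r in zip(G, Rt)) / g_sum   # weighted mean, target
avg_o   = sum(g * r for g, r in zip(G, Ro)) / g_sum   # weighted mean, occlusion
delta_R = abs(avg_t - avg_o)                          # spectral reflectance deviation
```

Running this recovers the 0.131 deviation stated in the example.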
S203, calculating an illumination intensity variation range based on the optical characteristic difference of the target and the shielding object, calculating illumination compensation data according to the illumination intensity variation and the spectral reflectivity deviation, and calculating illumination matching errors among areas by combining form compensation information to obtain an illumination matching data set;
Calculating the illumination intensity variation range based on the optical characteristic difference of the target and the shielding object, and calculating illumination compensation data according to the illumination intensity variation and the spectral reflectance deviation: an illumination compensation coefficient is set, and the compensation weight of each pixel point is calculated by the compensation formula. The illumination matching error between areas is then calculated in combination with the morphology compensation information: an error measurement index E is set and calculated using the mean square error E = (1/n) · Σ (I_comp,i − I_ref,i)², wherein I_comp,i is the compensated illumination value and I_ref,i is the reference illumination value; the illumination matching data set is finally obtained.
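A minimal sketch of the mean-square illumination matching error; the compensated and reference illumination values are illustrative assumptions:

```python
def matching_error(compensated, reference):
    """Mean-squared error between compensated and reference illumination values."""
    assert len(compensated) == len(reference)
    return sum((c - r) ** 2 for c, r in zip(compensated, reference)) / len(reference)

# Illustrative per-pixel illumination values (assumed data).
I_comp = [102.0, 98.0, 101.0, 100.0]   # compensated illumination values
I_ref  = [100.0, 100.0, 100.0, 100.0]  # reference illumination values
E = matching_error(I_comp, I_ref)      # (4 + 4 + 1 + 0) / 4 = 2.25
```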
The specific steps of S3 are as follows:
S301, acquiring an illumination matching data set, extracting optical characteristic data of all pixel points in a target area, calculating the spectrum intensity of the pixel points under different wavelengths, and counting the numerical distribution range of illumination characteristics in the area to obtain the optical characteristic data distribution information of the image target area;
The illumination matching data set is acquired as follows: first, a designated target area is selected from the target image, and the optical characteristic data of all pixel points of the area are extracted through their pixel coordinates; the optical characteristics of each pixel point include its reflectivity, transmissivity and absorptivity at different wavelengths. A hyperspectral imaging device is used to capture the spectral responses of the target area at a plurality of wavelengths, and the spectral intensity of each pixel point at the different wavelengths is calculated. Setting the spectral intensity of a certain pixel point at the wavelength λ as I(λ), the intensity can be expressed as: I(λ) = I_0(λ) · R(λ);
wherein I_0(λ) represents the intensity of the incident light and R(λ) represents the reflectivity of the pixel point. For a given target area, by performing this calculation on all pixel points and summarizing the spectral intensity data, the overall spectral response of the area at different wavelengths can be obtained. In order to further analyze the distribution of the illumination characteristic data, the distribution range of the optical data of all pixel points of the area is counted; the maximum value, the minimum value, the mean value and the standard deviation can be calculated. Assuming the spectral intensity set of all pixel points in the target area at a certain wavelength is {I_1, I_2, …, I_N}, the maximum value and the minimum value are respectively:
I_max = max(I_1, I_2, …, I_N);
I_min = min(I_1, I_2, …, I_N);
The mean and standard deviation are calculated as follows:
μ = (1/N) · Σ_{i=1}^{N} I_i;
σ = sqrt( (1/N) · Σ_{i=1}^{N} (I_i − μ)² );
In a practical case, if a certain image target area includes 100 pixels and, at a wavelength of 500 nm, the maximum spectral intensity is 1200, the minimum is 800, the mean is 1000 and the standard deviation is about 115.47, then the numerical distribution range of the illumination characteristic of the area is [800, 1200]; finally, the optical characteristic data distribution information of the image target area is obtained.
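The distribution statistics can be computed as follows; the five sample intensities are an illustrative assumption, not the 100-pixel data set of the example:

```python
import math

# Per-pixel spectral intensities of a target region at one wavelength
# (illustrative assumption).
I = [800, 950, 1000, 1050, 1200]

i_max = max(I)
i_min = min(I)
mean  = sum(I) / len(I)
std   = math.sqrt(sum((x - mean) ** 2 for x in I) / len(I))   # population std dev

# Numerical distribution range of the illumination feature.
value_range = (i_min, i_max)
```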
S302, calculating the reflectance ratio of the wavelength of the target area and the shelter area based on the distribution information of the optical characteristic data of the target area of the image and the optical characteristic data of the corresponding area of the shelter, calculating the spectrum deviation according to the spectrum energy distribution, analyzing the change trend of the spectrum deviation in a difference wave band, and determining the independence degree of the spectrum characteristic of the target area compared with the shelter to obtain the optical independence value of the target area of the image;
Based on the distribution information of the optical characteristic data of the image target area, the optical characteristic data of the area corresponding to the shielding object are first acquired and compared with those of the target area, and the reflectance ratio of the target area to the shielding object area at each wavelength is calculated. Setting the average reflectance of the target area at the wavelength λ as R_t(λ) and the average reflectance of the occlusion region as R_o(λ), the reflectance ratio can be expressed as:
Ratio(λ) = R_t(λ) / R_o(λ);
The ratio reflects the difference in optical reflection characteristics between the target area and the mask area: when Ratio(λ) > 1, the reflectivity of the target area is higher than that of the shielding object, and conversely it is lower. In order to further analyze the optical independence of the target area, the spectral deviation degree needs to be calculated; it measures the difference of the spectral characteristics of the target area compared with the shielding object and can be expressed as:
D(λ) = R_t(λ) − R_o(λ);
In order to analyze the variation trend of the spectral deviation degree in different wave bands, the variation rate of the spectral deviation degree over a plurality of wavelengths is calculated. Assuming the spectral deviation degrees at the wavelengths λ_1, λ_2, …, λ_m are D(λ_1), D(λ_2), …, D(λ_m), the rate of change between adjacent wavelengths can be calculated as:
ΔD_i = ( D(λ_{i+1}) − D(λ_i) ) / ( λ_{i+1} − λ_i );
If the reflectivity of the target area at the wavelengths of 500 nm, 600 nm and 700 nm is 0.5, 0.6 and 0.7 respectively, and the reflectivity of the shielding area at the same wavelengths is 0.4, 0.45 and 0.5 respectively, then the reflectance ratios are 1.25, 1.33 and 1.4 respectively, the spectral deviations are 0.1, 0.15 and 0.2 respectively, and both change rates are 0.0005; the optical independence value of the target area is determined by these calculation results.
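The ratio, deviation, and change-rate computations of this step can be reproduced directly from the example figures:

```python
# Reflectance ratio, spectral deviation, and its rate of change across
# wavelengths, using the numbers of the worked example.
wavelengths = [500, 600, 700]          # nm
R_target    = [0.5, 0.6, 0.7]          # target-region reflectivity
R_occluder  = [0.4, 0.45, 0.5]         # occluder-region reflectivity

ratios     = [t / o for t, o in zip(R_target, R_occluder)]   # 1.25, 1.33, 1.4
deviations = [t - o for t, o in zip(R_target, R_occluder)]   # 0.1, 0.15, 0.2
rates = [(deviations[i + 1] - deviations[i]) / (wavelengths[i + 1] - wavelengths[i])
         for i in range(len(deviations) - 1)]                # both 0.0005
```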
S303, calculating the motion trail deviation degree of the target area and the shielding object area based on the optical independence value of the image target area and combining the motion trail data of the image target area, adjusting the boundary range of the target area according to the optical independence value, comparing the optical characteristic change of the target area and the background area, calculating the spectrum and the motion characteristic change rate of the edge area, and obtaining the boundary points of the target area and the background area according to the change rate to obtain the separation boundary value of the image target area;
Based on the optical independence value of the image target area, the motion trail offset degree of the target area and the shelter area is calculated in combination with the motion trail data of the target area: setting the track of the target area as T_t(t) and the track of the shielding object area as T_o(t), the offset between the two is calculated as: ΔT(t) = | T_t(t) − T_o(t) |;
In order to determine the boundary range of the target area more accurately, the boundary is adjusted according to the optical independence value: setting the initial boundary as B_0, the optical independence value as V and the adjusted boundary as B_adj, the boundary adjustment function is:
B_adj = B_0 + α · V;
wherein α is the adjustment coefficient, set according to the spectral characteristics of the different wavebands. Then, by comparing the optical characteristic changes of the target region and the background region, the change rate of the spectral and motion characteristics of the edge region is calculated: setting the spectral intensities of an edge pixel point of the target region at the times t_1 and t_2 as I(t_1) and I(t_2), the change rate is calculated as: v = ( I(t_2) − I(t_1) ) / ( t_2 − t_1 );
If the initial boundary of the target area is 50 pixels, the optical independence value is 0.2 and the adjustment coefficient is 10, the adjusted boundary is 50 + 0.2 × 10 = 52 pixels; if the spectral intensity of a certain edge pixel point changes from 1000 to 1050 within a 500 ms time interval, the spectral change rate is 100 units per second. Finally, the boundary points of the target area and the background area are obtained, thereby obtaining the separation boundary value of the image target area.
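Both numeric checks in this step can be verified in a few lines:

```python
# Boundary adjustment from the optical independence value, and the spectral
# change rate of an edge pixel, matching the numeric example above.
B0    = 50       # initial boundary (pixels)
indep = 0.2      # optical independence value
alpha = 10       # adjustment coefficient
B_adj = B0 + indep * alpha          # adjusted boundary: 52 pixels

I_t1, I_t2 = 1000, 1050             # spectral intensity at two instants
dt = 0.5                            # 500 ms interval, in seconds
rate = (I_t2 - I_t1) / dt           # 100 units per second
```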
The specific steps of S4 are as follows:
S401, obtaining a separation boundary value of an image target area, calculating gradient change, continuity degree and sealing level of boundary pixels, screening an area meeting the integrity requirement, adjusting the form of boundary contour pixels based on the screened boundary value, establishing compensated boundary contour data, calculating a matching error after form compensation, and generating a matching error value;
Obtaining the separation boundary value of the image target area: the edge contour of the target area is extracted through an image segmentation algorithm, the Canny operator is selected to calculate the boundary pixel gradients, and an edge detection threshold is set to screen gradient mutation points. The gradient changes in the x and y directions are calculated through the Sobel operator, and a gradient threshold is set to screen significant gradient points; the variance of consecutive pixel gradients is calculated, and when the variance is below a set value the boundary gradient is considered continuous. The boundary closure level is then calculated as the proportion of closed boundary pixel points to the total boundary points, and a closure threshold is set to screen complete boundary areas. The screened boundary data are adjusted by a morphological dilation operation with a set structuring element size, and the compensated boundary contour data are constructed. The matching error between the morphologically compensated boundary and the original boundary is calculated through the Hausdorff distance H(A, B) = max( max_{a∈A} min_{b∈B} d(a, b), max_{b∈B} min_{a∈A} d(a, b) ), wherein A is the original boundary point set, B is the compensated boundary point set and d is the Euclidean distance; when the error is below a set threshold the compensation is considered successful, and finally the matching error value is generated.
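The Hausdorff-distance matching error can be sketched as follows; the two small boundary point sets are illustrative assumptions:

```python
import math

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two 2-D point sets (Euclidean)."""
    def directed(P, Q):
        # For each point in P, its distance to the nearest point of Q;
        # take the worst case.
        return max(min(math.dist(p, q) for q in Q) for p in P)
    return max(directed(A, B), directed(B, A))

# Original vs. compensated boundary point sets (illustrative assumptions).
A = [(0, 0), (1, 0), (2, 0)]
B = [(0, 1), (1, 1), (2, 1)]
H = hausdorff(A, B)   # every point is 1 pixel from the other set -> 1.0
```

Comparing `H` against a set threshold then decides whether the compensation is considered successful.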
S402, calculating the classification matching priority of a target area based on the matching error value, extracting optical characteristics including light intensity balance degree, color gradient and reflection level, and calculating optical contrast, boundary optical gradient and influence degree of light interference by combining with a separation boundary value of the target area to obtain an optical contrast priority value;
Firstly, the obtained matching errors are classified according to a set numerical range. For example, taking 0 to 5 as the total error interval, errors between 0 and 1.5 are classified as high priority, errors between 1.5 and 3 as medium priority, and areas exceeding 3 as low priority; setting the error range in this way ensures that the priority distribution covers the whole target area. Then, optical characteristic extraction is carried out on each area, including the light intensity balance degree, the color gradient and the reflection level. The light intensity balance degree can be obtained by counting the dispersion of the gray values of all pixels in the area: for example, in an industrial production-line image, if the pixel brightness in an area is concentrated between 125 and 135, the balance degree is high, whereas if the brightness distribution spans from 60 to 210 the area has a low balance degree. The color gradient can be extracted by observing the red, green and blue channels of the image separately: for example, if the color values of adjacent pixels in the blue channel are 70, 75, 72, 110 and 115, the color gradient of the area is large, while values concentrated between 70 and 75 indicate a small gradient; in commodity surface detection this analysis can be used to distinguish an edge bar code from a smooth surface. The reflection level can be judged from the intensity of bright areas: in an indoor image, if the brightness of a local area obviously exceeds its surroundings and no obvious light source exists, the area contains specular reflection or light interference; the reflection intensity can also be observed through the brightness jump frequency, for example, if the brightness fluctuates by more than 30 units six times within 10 consecutive pixels, the reflection level is regarded as high. After the optical characteristic extraction is finished, the optical characteristics are fused with the separation boundary value of the target area and the optical contrast is calculated from the difference between the maximum and minimum brightness values: if the brightness of an area floats between 50 and 240 the contrast is obviously high, whereas a distribution between 110 and 130 indicates low contrast; in practice, a contrast difference exceeding 150 marks a high-contrast area. The boundary optical gradient is then judged by selecting a pixel line close to the boundary and comparing the brightness difference between adjacent pixels in the line: a stable overall change indicates a low boundary optical gradient, otherwise the boundary is a strong boundary caused by strong light or a color jump. The degree of influence of light interference can be judged from whether obvious overexposure or cast shadows exist in the image; for example, a straight strong-light band or a dark black band across the target area can be initially identified as a light interference area, the number of interference pixels is counted and compared with the total number of pixels in the area, and a proportion above 30% indicates strong interference. After all the characteristics are combined, each index is given an empirical weight, for example 0.3 for the light intensity balance degree, 0.3 for the color gradient and 0.3 for the reflection level; each index is multiplied by its weight and the products are summed, and the optical contrast priority value of the area is finally obtained from the weighted total score.
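The weighted fusion at the end of this step can be sketched as below; the index scores and the fourth weight are illustrative assumptions (the text names example weights of 0.3 for the balance degree, color gradient, and reflection level):

```python
# Weighted fusion of the optical indices into a contrast priority value.
# Index scores are assumed to be pre-normalized to [0, 1]; the boundary-gradient
# weight is an assumption added so the weights sum to 1.
weights = {"balance": 0.3, "color_gradient": 0.3,
           "reflection": 0.3, "boundary_gradient": 0.1}
scores  = {"balance": 0.8, "color_gradient": 0.6,
           "reflection": 0.4, "boundary_gradient": 0.9}

priority = sum(weights[k] * scores[k] for k in weights)   # weighted total score
```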
S403, adjusting a target region classification standard based on the optical contrast priority value, calculating a classification weight correction value, correcting classification data of the target region according to the matching error value and the optical independence value, screening a region meeting the classification correction requirement, and obtaining an image target classification result;
the specific calculation formula for calculating the classification weight correction value is as follows:
;
calculating a classification weight correction value, correcting classification data of a target area by combining the optical independence value, screening an area meeting classification correction requirements, and obtaining an image target classification result;
wherein W represents the classification weight correction value; C_i represents the optical contrast priority value of the i-th target area; E_i represents the matching error value of the i-th target area; I_i represents the optical independence value of the i-th target area; S_i represents the initial classification standard value of the i-th target area; and N represents the total number of target areas.
The optical contrast priority value C_i is quantified by measuring the difference in light intensity between the target area and the background; the contrast calculation formula is:
C = ( I_max − I_min ) / ( I_max + I_min );
wherein I_max and I_min represent the maximum and minimum light intensity values of the target area, respectively. Substituting the measured maximum and minimum light intensity values of a target area then yields its optical contrast priority value.
The matching error value E_i is determined by calculating the difference between the target area and a reference template. The normalized correlation coefficient matching method is adopted, and the calculation formula is:
γ = Σ_{x,y} ( T(x, y) − T̄ ) ( I(x, y) − Ī ) / sqrt( Σ_{x,y} ( T(x, y) − T̄ )² · Σ_{x,y} ( I(x, y) − Ī )² );
wherein T(x, y) and I(x, y) represent the pixel values of the template and the target region respectively, and T̄ and Ī are their average values; the matching error value is calculated on this basis.
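A minimal sketch of the normalized correlation coefficient named above; the sample template and patch are illustrative assumptions:

```python
import math

def ncc(T, I):
    """Normalized cross-correlation between a template and a target patch."""
    mt = sum(T) / len(T)
    mi = sum(I) / len(I)
    num = sum((t - mt) * (i - mi) for t, i in zip(T, I))
    den = math.sqrt(sum((t - mt) ** 2 for t in T) * sum((i - mi) ** 2 for i in I))
    return num / den

T = [10, 20, 30, 40]
I = [12, 22, 32, 42]          # same shape, shifted brightness -> perfect correlation
match_error = 1 - ncc(T, I)   # 0 means a perfect match
```

A brightness offset leaves the correlation at 1, which is why this measure is preferred over a raw pixel difference.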
The optical independence value I_i represents the degree of independence of the target area in terms of optical characteristics; the independence is quantified by analyzing the optical properties of the target area, such as its spectral characteristics or polarization characteristics. Assume the optical independence value of a certain target area has been obtained.
The initial classification standard value S_i is the numerical value obtained by performing a preliminary classification of the target area according to a preset classification standard. Assume the initial classification standard value of a certain target area has been obtained.
The total number of target areas N represents the number of target areas involved in the calculation; assume N = 5.
Substituting the parameters into a formula to calculate a classification weight correction value:
Assume that there are 5 target areas, the parameters of which are as follows:
region 1: ;
Region 2: ;
Region 3: ;
region 4: ;
Region 5: ;
First, the numerator part is calculated:
;
;
then, the denominator part is calculated:
;
;
;
Thus, the denominator is 1.7958+0.45= 2.2458;
Finally, the classification weight correction value W is:
;
The result shows that the calculated classification weight correction value W is about 0.4765. This value is used to adjust the classification data of the target areas, screen the areas meeting the classification correction requirement, and obtain the image target classification result.
Referring to fig. 2, an image data processing system includes:
The form matching adjustment module obtains the boundary pixel value of the target area, calculates the gradient change rate of adjacent pixels, counts the pixel density change quantity of the unit area, calculates the form matching deviation value, judges the deviation direction and amplitude, extracts the area with the maximum deviation quantity based on the local deviation gradient value, sets the form adjustment weight, calculates the direction compensation value, adjusts the boundary contour of the target area, calculates the contour matching difference value after adjustment, and obtains the form compensation data set;
The illumination matching optimization module calculates an illumination intensity gradient value of a target area based on the form compensation data set, extracts the surface spectral reflectivity, calculates the optical characteristic difference between the target area and the shielding object, calculates dynamic illumination compensation parameters according to the illumination change trend, combines the form compensation data set, constructs an illumination matching error model, and acquires an illumination matching data set;
The optical characteristic analysis module calculates an optical characteristic vector of a target area based on the illumination matching data set, compares the spectrum characteristic difference of the target area and the shielding object, calculates optical independence, analyzes a motion trail deviation value, eliminates an area with consistent motion modes but different optical characteristics, and acquires an image target area separation boundary value;
The target classification correction module calculates a target region matching score based on the image target region separation boundary value and combines the morphology compensation data set, adjusts the classification matching priority, calculates an optical independence score according to the target region separation boundary value, corrects the target classification weight, and obtains an image target classification result.
The present invention is not limited to the above embodiments; equivalent embodiments derived from the above technical disclosure may likewise be applied to other fields. Any simple modification, equivalent change or adaptation made to the above embodiments according to the technical substance of the present invention still falls within the protection scope of the technical solution of the present invention.

Claims (10)

1. An image data processing method, characterized by comprising the steps of:
S1, obtaining image target boundary pixel data, calculating boundary gradient change data and local pixel density change data, judging the shape offset direction and amplitude, extracting the area with the largest offset, setting local shape adjustment weight, calculating direction compensation value, adjusting image target contour, calculating contour matching difference value, and generating a shape compensation data set;
s2, calculating illumination gradient of an image target area based on the morphological compensation data set, extracting spectral reflectivity, analyzing optical characteristic difference of the image target and a shielding object, calculating dynamic illumination compensation parameters by combining illumination change trend, analyzing illumination matching error, and extracting an illumination matching data set;
S3, calculating an optical characteristic vector of an image target area according to the illumination matching data set, comparing the spectrum characteristic difference of the image target and the shielding object, analyzing the optical independence of the image target area, calculating a motion track difference value of the image target area, and obtaining a separation boundary value of the image target area;
And S4, calculating a compensated matching score by combining the image target region separation boundary value and the morphological compensation data set, adjusting the classification matching priority of the image target region, and correcting the classification weight to obtain an image target classification result.
2. The image data processing method according to claim 1, wherein the specific steps of S1 are:
S101, obtaining boundary pixels of an image target area, calculating gradient change values of the boundary pixels, counting pixel density change values in a local area, calculating matching deviation values of the image target form according to the pixel density change values and the gradient change values, and judging form deviation directions and amplitudes of the target area according to the matching deviation values to obtain form deviation data;
S102, calculating the offset gradient of a local area based on the form offset data, extracting the area with the largest offset, setting local form adjustment weight, calculating a direction compensation value according to the form adjustment weight, and adjusting the outline of the image target area by combining the direction compensation value to obtain an outline form adjustment value;
S103, calculating a contour matching difference value based on the contour form adjustment value, analyzing the form change characteristics of the local area by combining the contour matching difference value, screening the offset contour area, calculating the form compensation intensity of the offset area, and establishing a form compensation data set by combining the area compensation intensity.
3. The image data processing method according to claim 1, wherein the specific step of S2 is:
S201, acquiring illumination intensity of a target area in the morphological compensation data set, calculating illumination change rates of adjacent pixel points in the horizontal and vertical directions, screening gradient abrupt change areas, and calculating gradient change trend according to difference of adjacent gradient point values to obtain illumination gradient values of the target area of the image;
S202, calculating the spectral reflectivity of a pixel point of a target area in a difference wavelength range based on the illumination gradient value of the target area of the image, screening areas with different reflectivity, calculating the spectral reflectivity deviation value of the target and a shielding object, and acquiring the optical characteristic difference of the target and the shielding object by combining the illumination gradient change;
S203, calculating an illumination intensity change range based on the optical characteristic difference of the target and the shielding object, calculating illumination compensation data according to the illumination intensity change and the spectral reflectance deviation, and calculating illumination matching errors among areas by combining form compensation information to obtain an illumination matching data set.
4. The image data processing method according to claim 3, wherein the specific calculation formula for calculating the spectral reflectance deviation value of the object and the obstruction is:
ΔR(λ) = | Σ_{i=1}^{n} G_i(λ) · R_t,i(λ) − Σ_{i=1}^{n} G_i(λ) · R_o,i(λ) | / Σ_{i=1}^{n} G_i(λ);
Calculating a spectral reflectance deviation value, and acquiring the optical characteristic difference of the target and the shielding object by combining the illumination gradient change;
wherein ΔR(λ) represents the spectral reflectance deviation value of the target and the shielding object at the wavelength λ; G_i(λ) represents the illumination gradient value of the pixel point i at the wavelength λ; R_t,i(λ) represents the spectral reflectance of the pixel point i of the target area at the wavelength λ; R_o,i(λ) represents the spectral reflectance of the pixel point i of the shielding object area at the wavelength λ; n represents the total number of selected pixel points in the target region and the occlusion region; and Σ_{i=1}^{n} G_i(λ) represents the sum of the illumination gradient values of all selected pixel points at the wavelength.
5. The image data processing method according to claim 1, wherein the specific steps of S3 are:
S301, acquiring the illumination matching data set, extracting the optical characteristic data of all pixel points in the target area, calculating the spectral intensity of the pixel points at different wavelengths, and counting the numerical distribution range of the illumination characteristics in the area to obtain the optical characteristic data distribution information of the image target area;
S302, calculating the per-wavelength reflectance ratio between the target area and the shielding object area based on the optical characteristic data distribution information of the image target area and the optical characteristic data of the corresponding shielding object area, calculating the spectral deviation according to the spectral energy distribution, analyzing the change trend of the spectral deviation across different wavelength bands, and determining the degree of independence of the spectral characteristics of the target area relative to the shielding object to obtain the optical independence value of the image target area;
S303, calculating the degree of motion track deviation between the target area and the shielding object area based on the optical independence value of the image target area in combination with the motion track data of the image target area, adjusting the boundary range of the target area according to the optical independence value, comparing the optical characteristic changes of the target area and the background area, calculating the change rates of the spectral and motion characteristics of the edge area, and determining the boundary points between the target area and the background area from these change rates to obtain the separation boundary value of the image target area.
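A minimal sketch of the S302 computation, assuming per-band reflectance vectors for the target and shielding object; collapsing the per-band deviation into a single independence score by averaging is an illustrative choice, not specified by the claim:

```python
import numpy as np

def optical_independence(r_target, r_occluder):
    """Per-wavelength reflectance ratio of target vs. shielding object,
    spectral deviation per band, and the deviation's change trend across
    adjacent bands. Averaging the deviation into one score is an
    assumption made for illustration.
    """
    ratio = (np.asarray(r_target, dtype=float)
             / np.asarray(r_occluder, dtype=float))
    deviation = np.abs(ratio - 1.0)  # spectral deviation per band
    trend = np.diff(deviation)       # change trend across adjacent bands
    return float(deviation.mean()), trend

# Three wavelength bands; the target diverges from the occluder
# in the second and third bands.
score, trend = optical_independence([0.6, 0.8, 0.9], [0.6, 0.4, 0.3])
```

A ratio near 1 in every band means the target is spectrally indistinguishable from the occluder (score near 0); growing deviation across bands, visible in `trend`, indicates increasing spectral independence.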
6. The image data processing method according to claim 1, wherein the specific steps of S4 are:
S401, acquiring the separation boundary value of the image target area, calculating the gradient change, continuity degree and closure degree of the boundary pixels, screening areas meeting the integrity requirement, adjusting the form of the boundary contour pixels based on the screened boundary values, establishing compensated boundary contour data, calculating the matching error after morphological compensation, and generating a matching error value;
S402, calculating the classification matching priority of the target area based on the matching error value, extracting optical characteristics including the light intensity balance degree, color gradient and reflection level, and calculating the optical contrast, the boundary optical gradient and the degree of influence of light interference in combination with the separation boundary value of the target area to obtain an optical contrast priority value;
S403, adjusting the target area classification standard based on the optical contrast priority value, calculating a classification weight correction value, correcting the classification data of the target area according to the matching error value and the optical independence value, and screening areas meeting the classification correction requirements to obtain the image target classification result.
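The S401 screening of boundary pixels by continuity and closure can be sketched as follows; the contour representation (an ordered list of points) and the gap threshold are assumptions for illustration:

```python
import numpy as np

def screen_boundary(contour, gap_thresh=1.5):
    """Check a candidate boundary contour (ordered (x, y) points) for
    continuity (no adjacent-point gap exceeding `gap_thresh`) and closure
    (first and last points within `gap_thresh` of each other). The
    threshold is illustrative, not taken from the claim.
    """
    contour = np.asarray(contour, dtype=float)
    gaps = np.linalg.norm(np.diff(contour, axis=0), axis=1)
    continuous = bool((gaps <= gap_thresh).all())
    closed = float(np.linalg.norm(contour[0] - contour[-1])) <= gap_thresh
    return continuous, closed

# A unit square traced point by point: continuous and closed.
square = [[0, 0], [0, 1], [1, 1], [1, 0]]
cont, closed = screen_boundary(square)
```

Contours failing either check would be rejected before the morphological compensation step, which only adjusts boundaries that already meet the integrity requirement.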
7. The image data processing method according to claim 6, wherein the specific calculation formula for calculating the classification weight correction value is:

$$W=\frac{1}{N}\sum_{i=1}^{N}\frac{P_i \cdot I_i}{E_i + C_i}$$

the classification weight correction value is calculated, the classification data of the target area is corrected in combination with the optical independence value, and the areas meeting the classification correction requirements are screened to obtain the image target classification result;

wherein $W$ represents the classification weight correction value, $P_i$ represents the optical contrast priority value of the $i$-th target area, $E_i$ represents the matching error value of the $i$-th target area, $I_i$ represents the optical independence value of the $i$-th target area, $C_i$ represents the initial classification criterion value of the $i$-th target area, and $N$ represents the total number of target areas.
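The formula image for claim 7 is also missing from this text; under one assumed reading consistent with its variable list, $W=\frac{1}{N}\sum_{i} P_i I_i/(E_i+C_i)$, the correction value can be computed as:

```python
import numpy as np

def classification_weight_correction(priority, error, independence, criterion):
    """Classification weight correction value W: each target area's
    optical-contrast priority scaled by its optical independence and
    damped by its matching error plus initial classification criterion,
    averaged over all areas. The exact combination is an assumed reading
    of claim 7, whose original formula image is missing.
    """
    p, e, i, c = (np.asarray(a, dtype=float)
                  for a in (priority, error, independence, criterion))
    return float(np.mean(p * i / (e + c)))

# Two target areas with differing priority, error, independence, criterion.
w = classification_weight_correction([2.0, 4.0], [1.0, 1.0], [0.5, 1.0], [1.0, 3.0])
```

Under this reading, a high matching error pulls an area's contribution down, while high priority and independence push it up, matching the narrative of S403.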
8. An image data processing system for performing the image data processing method of any one of claims 1 to 7, the image data processing system comprising:
The form matching adjustment module, which obtains the boundary pixel values of the target area, calculates the gradient change rate of adjacent pixels, counts the pixel density variation per unit area, calculates a form matching deviation value, judges the deviation direction and amplitude, extracts the area with the maximum deviation based on the local deviation gradient value, sets form adjustment weights, calculates a direction compensation value, adjusts the boundary contour of the target area, and calculates the contour matching difference value after adjustment to obtain the morphological compensation data set;
The illumination matching optimization module, which calculates the illumination intensity gradient value of the target area based on the morphological compensation data set, extracts the surface spectral reflectance, calculates the optical characteristic difference between the target area and the shielding object, calculates dynamic illumination compensation parameters according to the illumination change trend, and constructs an illumination matching error model in combination with the morphological compensation data set to acquire the illumination matching data set;
The optical characteristic analysis module, which calculates the optical characteristic vector of the target area based on the illumination matching data set, compares the spectral characteristic difference between the target area and the shielding object, calculates the optical independence, and analyzes the motion track deviation value to acquire the separation boundary value of the image target area;
The target classification correction module, which calculates the target area matching score based on the separation boundary value of the image target area in combination with the morphological compensation data set, adjusts the classification matching priority, calculates an optical independence score according to the separation boundary value of the target area, and corrects the target classification weight to obtain the image target classification result.
9. A computer device comprising a memory and a processor, wherein the memory has stored therein a computer program, which when executed by the processor implements the image data processing system of claim 8.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the image data processing method of any one of claims 1 to 7.
CN202510367847.1A 2025-03-26 2025-03-26 Image data processing method, system, computer equipment and storage medium Pending CN119887816A (en)

Published as CN119887816A on 2025-04-25.



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination