
CN115089203B - DR imaging analysis method and DR imaging equipment - Google Patents

DR imaging analysis method and DR imaging equipment

Info

Publication number
CN115089203B
CN115089203B
Authority
CN
China
Prior art keywords
image
parameter
features
acquiring
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210657368.XA
Other languages
Chinese (zh)
Other versions
CN115089203A (en)
Inventor
许鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mindray Bio Medical Electronics Co Ltd
Wuhan Mindray Technology Co Ltd
Original Assignee
Wuhan Mindray Biomedical Technology Co ltd
Shenzhen Mindray Bio Medical Electronics Co Ltd
Wuhan Mindray Medical Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Mindray Biomedical Technology Co ltd, Shenzhen Mindray Bio Medical Electronics Co Ltd, Wuhan Mindray Medical Technology Research Institute Co Ltd filed Critical Wuhan Mindray Biomedical Technology Co ltd
Priority to CN202210657368.XA
Publication of CN115089203A
Application granted
Publication of CN115089203B
Active legal status
Anticipated expiration

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 - Control of apparatus or devices for radiation diagnosis
    • A61B6/542 - Control of apparatus or devices for radiation diagnosis involving control of exposure
    • A61B6/52 - Devices using data or image processing specially adapted for radiation diagnosis
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G06T7/40 - Analysis of texture
    • G06T7/41 - Analysis of texture based on statistical description of texture

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Quality & Reliability (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)

Abstract


A DR imaging analysis method and DR imaging device, the method comprising: acquiring a digital X-ray radiographic DR image of a target object through a DR imaging device, the DR image comprising at least one of an original DR image and a processed image of the original DR image; extracting image features from the DR image; acquiring device parameter features based on device parameters of the DR imaging device, and/or acquiring object parameter features based on object parameters of the target object; and determining a DR feature index based on the image features, device parameter features, and/or object parameter features, the DR feature index being used to characterize the comprehensive level between the X-ray radiation dose received by the target object and the image quality of the DR image. The present application obtains a DR feature index based on the image features and parameter features of the DR image that can objectively characterize the comprehensive level between the X-ray radiation dose received by the target object and the image quality of the DR image, thereby providing an objective basis for judgment for the operator.

Description

Analysis method of DR imaging and DR imaging apparatus
Technical Field
The application relates to the technical field of medical images, in particular to a DR imaging analysis method and DR imaging equipment.
Background
Digital radiography (DR) images are a common type of medical digital image, widely used in physical examination and routine medical imaging diagnosis. With the development of digital X-ray detectors and digital image-processing systems, DR images can achieve diagnostic image quality over a wide range of exposure doses. In practice, the final image effect often depends on the operator's experience, and the accuracy of that judgment is susceptible to various factors such as differences in experience, subjective differences, and post-processing.
To determine the exposure dose of a DR image, one approach uses an Exposure Index (EI) to indicate the magnitude of a single exposure. However, the Exposure Index is generally computed simply from the gray information of the DR image alone, and it cannot adapt well to complex and variable clinical scenarios.
Disclosure of Invention
This summary introduces a selection of concepts in simplified form that are further described in the detailed description. It is not intended to identify key or essential features of the claimed subject matter, nor to be used as an aid in determining the scope of the claimed subject matter.
In one aspect, an embodiment of the present invention provides a method for analyzing DR imaging, where the method includes:
acquiring a digital radiography (DR) image of a target object by a DR imaging device, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image;
extracting image features from the DR image, wherein the image features comprise at least one of gray entropy features, texture features, noise features, gradient features and divergence features;
acquiring device parameter features according to device parameters of the DR imaging device, and/or acquiring object parameter features according to object parameters of the target object; and
determining a DR feature index according to the image features and the device parameter features and/or the object parameter features, wherein the DR feature index is used for representing the comprehensive level between the radiation dose of X-rays received by the target object and the image quality of the DR image.
Another aspect of an embodiment of the present invention provides a DR imaging apparatus including an X-ray generator, a detector, a processor, and a display;
the X-ray generator is used for generating X-rays, emitting the X-rays to a target tissue site of a target object and controlling the X-rays to pass through the target tissue site;
the detector is configured to receive X-rays after passing through the target tissue site;
the processor is configured to process the X-rays after passing through the target tissue site to obtain a digital radiography (DR) image, wherein the DR image includes at least one of a DR original image and an image obtained by processing the DR original image;
the processor is further configured to obtain a DR feature index according to the DR imaging analysis method described above, where the DR feature index is used to characterize the comprehensive level between the radiation dose of X-rays received by the target object and the image quality of the DR image; and
the display is used for displaying the DR characteristic index.
According to the DR imaging analysis method and the DR imaging device of the embodiments of the present application, a DR feature index that objectively represents the comprehensive level between the radiation dose of X-rays received by the target object and the image quality of the DR image is obtained from the image features of the DR image together with the device parameter features and/or the object parameter features, thereby providing the operator with an objective basis for judgment.
Drawings
The above and other objects, features and advantages of the present application will become more apparent from the following more particular description of embodiments of the present application, as illustrated in the accompanying drawings. The accompanying drawings are incorporated in and constitute a part of this specification; they provide a further understanding of embodiments of the application and, together with the description, serve to explain the application without limiting it. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 shows a schematic flow chart of an analysis method of DR imaging according to one embodiment of the present invention;
FIG. 2 shows a schematic diagram of model training and DR imaging analysis using a trained model in accordance with one embodiment of the present invention;
FIG. 3 shows a schematic block diagram of a DR imaging apparatus according to one embodiment of the present invention;
Fig. 4 shows a block diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some, not all, of the embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein. Based on the embodiments described in the present application, all other embodiments that a person skilled in the art would obtain without inventive effort shall fall within the scope of the application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art that the application may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the application.
It should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, detailed structures will be presented in the following description in order to illustrate the technical solutions presented by the present application. Alternative embodiments of the application are described in detail below, however, the application may have other implementations in addition to these detailed descriptions.
Next, an analysis method 100 of DR imaging according to an embodiment of the present application is described first with reference to fig. 1. As shown in fig. 1, the analysis method 100 of DR imaging may include the following steps:
In step S110, a digital radiography (DR) image of a target object is acquired by a DR imaging apparatus, the DR image including at least one of a DR original image and an image obtained by processing the DR original image;
in step S120, image features are extracted from the DR image, the image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features;
in step S130, device parameter features are acquired according to device parameters of the DR imaging device, and/or object parameter features are acquired according to object parameters of the target object;
in step S140, a DR feature index is determined according to the image features and the device parameter features and/or the object parameter features, the DR feature index being used to represent the comprehensive level between the radiation dose of X-rays received by the target object and the image quality of the DR image.
According to the DR imaging analysis method 100 of the embodiment of the application, image features and parameter features are extracted from the DR image, and a DR feature index that objectively represents the comprehensive level between the radiation dose of X-rays received by the target object and the image quality of the DR image is obtained from them. This provides the operator with an objective basis for judgment and can guide in-hospital quality control, dose-reduced imaging, and similar directions. The DR feature index may also be referred to as a DR objectification index or by any other suitable designation.
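The patent leaves the exact rule for combining the features into the DR feature index open at this point (a trained model is also contemplated, per Fig. 2). Purely as a hedged illustration, the sketch below fuses already-normalized image features and parameter features with a simple weighted sum; the function name, the uniform default weights, and the normalization assumption are hypothetical, not the patent's method:

```python
import numpy as np

def dr_feature_index(image_features, parameter_features, weights=None):
    """Hedged sketch: fuse normalized image features and device/object
    parameter features into one scalar DR feature index.

    The weighted-sum rule and uniform default weights are assumptions
    for illustration; the patent also contemplates a trained model."""
    feats = np.asarray(list(image_features) + list(parameter_features),
                       dtype=float)
    if weights is None:
        weights = np.full(feats.size, 1.0 / feats.size)  # uniform weights
    return float(np.dot(weights, feats))
```

With features [1.0, 3.0] and [2.0] and uniform weights, the index reduces to their mean.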
In step S110, the DR image may be acquired locally from the DR imaging apparatus or from another external apparatus, which is not specifically limited herein. When acquired locally, the DR image may be one acquired by the device in real time, or one previously acquired and stored on the DR imaging device.
In one possible implementation, the specific process of acquiring the DR image locally is as follows: the DR imaging apparatus is controlled to emit X-rays toward a target tissue site of a target object; the X-rays pass through the target tissue site and are received; and the received X-rays are processed to obtain the digital radiography DR image. The target tissue site may be a tissue site to be examined of a human or other animal body, such as the head or the abdomen.
The DR image of the embodiment of the present invention may be at least one of a DR original image and an image after the DR original image is processed. In the DR imaging process, the X-rays emitted by the X-ray generator are attenuated to different degrees after passing through the human body and are received by the detector. Illustratively, the upper layer of the detector is an X-ray conversion medium having photoconductive properties capable of converting X-rays into an electrical signal, producing positive and negative charges that are moved along an electric field in the form of a current under a bias voltage, collected by an array of detector detection units, each detection unit corresponding to a pixel of the DR image. The digital signal, i.e., DR original image, can be obtained by amplifying the electric signal, analog-to-digital converting, and the like. The DR original image is a gray-scale image that has not undergone post-processing such as contrast adjustment, brightness adjustment, and the like. The image processed by the DR original image may be an image obtained by subjecting the DR original image to a certain transformation relationship, or post-processing such as image contrast, image brightness adjustment, etc., that is, the image processed by the DR original image may be regarded as an image obtained by subjecting the DR original image to any post-processing, which is not particularly limited herein.
In step S120, image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features are extracted from the DR image. That is, the image features extracted in step S120 may be all of gray entropy features, texture features, noise features, gradient features, and divergence features, or any one or more of them. The gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature respectively reflect gray information, texture information, noise information, gradient information and divergence information of the DR image, and the multi-dimensional image feature is beneficial to obtaining a more accurate DR feature index.
In one embodiment, the gray entropy feature is the amount of information obtained after statistically screening out redundant gray-level sources in the DR image. Based on the transfer characteristics of X-ray quanta converted to digital signals by the detector of the DR system, each individual gray level in the DR image may be regarded as an information source, and the DR imaging process may be regarded as a process of transferring information through these gray-level sources. Among all gray levels, some gray levels are occupied by no pixel value and are therefore regarded as redundant gray-level sources. Because the extraction process of the gray entropy feature screens out these redundant sources, more effective image information can be extracted.
Illustratively, gray entropy features may be extracted from a DR image in the following manner:
First, a first probability statistical distribution pH(i) of each gray-level source in the DR image is obtained, where i is an image gray value of the DR image. Then, according to the magnitude of the first probability statistical distribution, the first probability statistical distribution pNH(i) of the non-redundant gray-level sources is determined from pH(i). Illustratively, if pH(i) is greater than 0, the gray level i is a non-redundant source with pNH(i) = pH(i); whereas if pH(i) equals 0, the gray level i is a redundant source and is screened out, namely:

pNH(i) = pH(i), for all i such that pH(i) > 0    formula (1)
After obtaining the first probability statistical distribution pNH(i) of the non-redundant gray-level sources, a second probability statistical distribution pH'(i) of each non-redundant source is obtained as the ratio of pNH(i) to the sum ΣpNH(i) over all non-redundant sources, namely:

pH'(i) = pNH(i) / ΣpNH(i)    formula (2)
Finally, entropy is calculated from the second probability statistical distribution pH'(i) of the non-redundant gray-level sources to obtain the gray entropy feature H(Image). The entropy may be computed with the natural logarithm ln (base e) or a logarithm with another base, and is expressed as:

H(Image) = −Σ pH'(i) · ln pH'(i)    formula (3)

In this way, the gray entropy feature obtained after redundant gray-level sources are screened out is acquired.
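The three-step procedure above (screen out redundant gray-level sources, renormalize, take the entropy) can be sketched in a few lines of Python; the function name and the 16-bit default level count are illustrative assumptions:

```python
import numpy as np

def gray_entropy(image, levels=65536):
    """Gray entropy over non-redundant gray-level sources (sketch).

    Gray levels that never occur in the image (pH(i) == 0) are treated
    as redundant sources and excluded before renormalization."""
    # First probability distribution pH(i) over all gray levels.
    hist = np.bincount(image.ravel(), minlength=levels).astype(float)
    p_h = hist / hist.sum()
    # Keep only non-redundant sources: pNH(i) = pH(i) where pH(i) > 0.
    p_nh = p_h[p_h > 0]
    # Renormalize: pH'(i) = pNH(i) / sum(pNH(i)).
    p = p_nh / p_nh.sum()
    # Entropy with the natural logarithm (other bases differ by a constant).
    return float(-np.sum(p * np.log(p)))
```

For an image whose pixels are spread uniformly over three gray levels, the result is ln 3, regardless of how many redundant levels the gray range nominally contains.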
The texture features represent the spatial variation of gray values across the DR image and can reflect abstract characteristics of the variation patterns of human tissue. Illustratively, a statistical method may be employed to extract the texture features of the DR image, yielding statistical characteristics of texture regions based on the gray-scale properties of pixels and their neighborhoods. The statistical method is mainly described below, but texture features may also be extracted by geometric methods, model-based methods, or other suitable approaches.
When a statistical method is used to extract texture features from a DR image, a texture feature description matrix is first extracted from the gray values of each pixel in the DR image, and then at least one two-dimensional component of the texture feature description matrix is extracted as at least one texture feature.
In one embodiment, the texture feature description matrix describes how gray values change with the distance and direction between different pixels in the DR image. Let the DR image be Image = f(x, y); the texture feature description matrix P(i, j) is then:

P(i, j) = #{ ((x1, y1), (x2, y2)) ∈ M × N | f(x1, y1) = i, f(x2, y2) = j }    formula (4)
where #{x} denotes the number of elements in the set x. Assuming the distance between the two points (x1, y1) and (x2, y2) in the image is k, the texture feature description matrix for different directions l can be expanded to P(i, j, k, l). By counting gray-value changes at different distances and in different directions, the expanded matrix captures information in more dimensions. For example, to calculate the actual feature values at different angles, super-resolution interpolation may be performed on the DR image in each direction to obtain sub-pixel gray values at the new target positions, from which the texture feature description matrices for different directions are obtained.
After the texture feature description matrix is obtained, at least one two-dimensional component of the texture feature description matrix is extracted to obtain texture features, and the key characteristics of the texture feature description matrix are reflected by the texture features. Illustratively, the texture feature comprises at least one of:
a first texture feature P1, whose value reflects the sharpness of the DR image and the groove depth of the texture: the deeper the texture grooves, the larger P1; the shallower the grooves, the smaller P1;
a second texture feature P2, which measures, from the value distribution in the texture feature description matrix, the degree of similarity between the parallel and normal directions in the DR image; the larger P2, the greater the similarity of image gray levels in different directions;
a third texture feature P3, which measures, from the value distribution in the texture feature description matrix, the uniformity of the gray-level change distribution in the DR image; its value reflects the uniformity of the image gray distribution and the coarseness of the texture, and the larger P3, the more stable the texture change of the DR image;
a fourth texture feature P4, which measures, from the value distribution in the texture feature description matrix, the local variation of the DR image texture; the larger P4, the stronger the regularity of the texture.
The texture features extracted from the texture feature description matrix are not limited to the above four, and in other embodiments, other two-dimensional components of the texture feature description matrix may be extracted as texture features.
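The texture feature description matrix of formula (4) is, for a single distance and direction, the classic gray-level co-occurrence matrix, and it can be sketched as below. The specific statistics shown (contrast-, energy-, and homogeneity-like measures) are common choices and are only assumed to correspond to the P1/P3/P4 of the text; the function names are hypothetical:

```python
import numpy as np

def cooccurrence(image, dx=1, dy=0, levels=8):
    """Texture feature description matrix P(i, j): relative frequency of
    gray-value pairs (i, j) at pixel offset (dy, dx)."""
    P = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[image[y, x], image[y + dy, x + dx]] += 1.0
    return P / max(P.sum(), 1.0)  # normalize counts to probabilities

def texture_components(P):
    """Example two-dimensional components of P (assumed analogues of the
    P1, P3, P4 described in the text)."""
    i, j = np.indices(P.shape)
    p1 = float(np.sum((i - j) ** 2 * P))           # contrast: groove depth
    p3 = float(np.sum(P ** 2))                     # energy: uniformity
    p4 = float(np.sum(P / (1.0 + np.abs(i - j))))  # homogeneity: regularity
    return p1, p3, p4
```

On a perfectly flat image, all co-occurring pairs land in P[0, 0], so the contrast-like component is 0 while the energy- and homogeneity-like components reach their maximum of 1.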
The image noise mainly consists of X-ray quantum noise. The distribution of X-ray quanta follows a Poisson distribution, whose variance is proportional to the average number of detected quanta. Based on this property of X-ray quantum noise, the degree of fluctuation of the X-ray quanta can be measured from the DR image as the noise feature; that is, the noise feature reflects the degree of fluctuation of X-ray quanta in the DR image.
In one embodiment, extracting the noise feature from the DR image comprises: filtering the DR image to obtain a high-frequency image, which mainly contains noise information; taking the root mean square over the neighborhood of each pixel in the high-frequency image to obtain a noise distribution image; and counting the noise value distribution in the noise distribution image to obtain the noise feature.
As one implementation, filtering the DR image to obtain the high-frequency image comprises low-pass filtering the DR image I to extract its low-frequency components, obtaining a low-frequency image I1, and then obtaining the high-frequency image I2 from the difference between the DR image and the low-frequency image. In other implementations, the DR image I may instead be high-pass filtered directly to obtain the high-frequency image I2.
Illustratively, the DR image I may be subjected to Gaussian low-pass filtering to obtain the low-frequency image I1. Since the image is a two-dimensional signal, a two-dimensional Gaussian function is used; the two-dimensional Gaussian filtering kernel is:

G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))

The high-frequency image I2 is obtained as the difference between the image I and the low-frequency image I1, i.e., I2 = I − I1. The noise distribution image I3 is then obtained by extracting the effective information in I2. In one embodiment, I3 is formed by taking the root mean square over the neighborhood of each pixel in the high-frequency image; the root mean square may also be approximated with the L1 norm or another equivalent measure, for example:

I3(i, j) = sqrt( (1 / |N(i, j)|) · Σ over (l, k) ∈ N(i, j) of I2(l, k)² )

where I3(i, j) is the value of the noise distribution image I3 at pixel (i, j), I2(l, k) is the value of the high-frequency image I2 at pixel (l, k), and N(i, j) is the neighborhood of pixel (i, j).
After the noise distribution image I3 is acquired, the noise value distribution is counted to obtain the noise feature. For example, the noise value interval with the highest distribution probability in the noise distribution image may be determined, and the noise value of that interval used as the noise feature.
Specifically, the pixel values of the noise distribution image I3 are first divided into M intervals, each of width d. A histogram vector h of length M is initialized, where each component h(i) represents the number of pixel values falling in the i-th interval. For a pixel (i, j) in I3, the interval corresponding to its value is R[I3(i, j)]. Each pixel of I3 is traversed and h(R[I3(i, j)]) is accumulated to obtain h. The main peak of h is its maximum value max(h), attained at the interval R0 = argmax(h); the noise value R0 · d is then taken as the noise feature.
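The whole noise pipeline (low-pass filter, high-frequency residual, neighborhood RMS map, histogram main peak) can be sketched as follows. A simple box mean stands in for the Gaussian low-pass of the text, and the kernel sizes and bin count are illustrative assumptions:

```python
import numpy as np

def _box_mean(img, k):
    """Separable box mean filter (stands in for the Gaussian low-pass
    and for the neighborhood averaging)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def noise_feature(image, k_low=9, k_rms=5, n_bins=64):
    """Noise feature sketch: main-peak noise value R0 * d of the
    local-RMS noise map."""
    img = image.astype(float)
    low = _box_mean(img, k_low)                 # low-frequency image I1
    high = img - low                            # high-frequency image I2 = I - I1
    rms = np.sqrt(_box_mean(high ** 2, k_rms))  # noise map I3: local RMS
    counts, edges = np.histogram(rms, bins=n_bins)
    r0 = int(np.argmax(counts))                 # main-peak interval R0
    d = edges[1] - edges[0]                     # interval width d
    return r0 * d                               # noise value R0 * d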
In some embodiments, the image features further comprise gradient features. According to the difference of absorption coefficients of X-rays penetrating through different tissues of a human body, region distribution of different tissues can be formed in a DR image, boundaries with different degrees exist between different tissue regions, and the definition degree of the boundaries can be quantified by extracting gradient features, namely the gradient features are used for representing the definition of the boundaries of the different tissues in the DR image. In one embodiment, extracting gradient features from the DR image includes determining regional distributions of different tissues in the DR image, and obtaining sharpness of boundaries of the regional distributions of different tissues as gradient features.
Illustratively, the gradient characteristics may be calculated as follows:
wherein Grad (x, y) is used to characterize the gradient magnitude at coordinates (x, y), and Image (x, y) is used to characterize the pixel value at coordinates (x, y).
In some embodiments, the image features further include a divergence feature, in the DR image, the boundaries of the different tissues are not strictly defined, but rather there is a degree of transition, and the degree of intensity and/or trend consistency of such transition may be quantified by extracting the divergence feature, i.e., the divergence feature is used to characterize the degree of transition intensity and/or trend consistency of the boundaries of the different tissues in the DR image. In one embodiment, the extracting of the divergence features from the DR image comprises determining regional distributions of different tissues in the DR image, and obtaining transition intensity and/or trend consistency of boundaries of the regional distributions of the different tissues as the divergence features.
Illustratively, the divergence characteristics may be calculated as follows:
Wherein Diver (x, y) is used to characterize the magnitude of the divergence at coordinates (x, y), image (x, y) is used to characterize the pixel value at coordinates (x, y), actan represents the arctangent function.
In some embodiments, in addition to at least one of the above image features, other image features may be extracted from the DR image for obtaining DR feature indexes, for example, gray gradient, gray average, variance, pixel value information, signal-to-noise ratio, contrast-to-noise ratio, and the like, which may be specific according to practical situations, and is not limited herein.
In step S130, device parameter characteristics are acquired according to device parameters of the DR imaging device, and/or object parameter characteristics are acquired according to object parameters of the target object. The device parameter features and object parameter features may be collectively referred to as parameter features.
The device parameter features may be obtained from device parameters that can affect the image quality of the DR image or that relate to the X-ray radiation dose received by the target object. When acquiring the device parameter features, they may be acquired according to parameter setting values of the DR imaging device, or according to parameter detection values. After the device parameters are obtained, they may be processed according to a specific operation rule to obtain the device parameter features.
In one embodiment, the device parameter features include a tube voltage parameter feature and/or a tube current parameter feature, the tube voltage and tube current being the primary factors affecting the exposure dose. Specifically, the X-ray generator of the DR imaging device mainly comprises a high-voltage generator and a bulb tube; the high-voltage generator supplies the tube voltage to the bulb tube, electrons generated by the filament in the bulb tube are accelerated under the action of the tube voltage and bombard the target material to generate X-rays, and the electrons received by the target material form the tube current. Generally, the tube voltage controls the energy and quality of the X-ray beam generated by the bulb tube, which in turn determines the penetration of the X-rays; the penetrating intensity determines the amount of radiation impinging on the detector and thus the contrast of the DR image. The tube current controls the quantity of X-rays in the beam, which determines the upper limit of the amount of X-rays that can impinge on the detector during DR imaging, and thus the density of the DR image. If the tube voltage and tube current are too high, too many X-rays penetrate the human body and image noise increases; if they are too low, too few X-rays penetrate the human body and image details cannot be clearly displayed. The tube voltage parameter feature and the tube current parameter feature can reflect whether the tube voltage and tube current are suitable, which is ultimately embodied in the DR feature index.
When acquiring the tube voltage parameter characteristics, the tube voltage parameter characteristics can be acquired according to the tube voltage setting value of a console of the DR imaging equipment, and also can be acquired according to the tube voltage receiving value of a high-voltage generator of the DR imaging equipment. Similarly, when acquiring the tube current parameter characteristics, the tube current parameter characteristics may be acquired from the tube current setting value of the console of the DR imaging apparatus, or may be acquired from the tube current reception value of the high voltage generator of the DR imaging apparatus.
The device parameter features may also include an exposure time parameter feature. Specifically, the high voltage from the high-voltage generator drives the bulb tube to generate X-rays and enter the exposure process, during which the detector receives the X-rays and converts them into electrical signals; the exposure time determines whether the X-ray dose reaches the standard. Illustratively, the exposure time parameter feature may be acquired from an exposure time setting value of a console of the DR imaging apparatus, or from an exposure time reception value of the high-voltage generator of the DR imaging apparatus, or from a feedback value of an automatic exposure control device of the DR imaging apparatus.
The device parameter characteristics may also include shot body position and/or body shape parameter characteristics. The tube voltage, tube current, exposure parameters, etc. applied to different photographing positions/body types are different. For example, the photographing body position and/or body type parameter characteristics may be acquired from body position and/or body type setting values of a console of the DR imaging apparatus, or from a visualized image obtained by an image pickup device of the DR imaging apparatus.
The device parameter features may also include a photographing distance parameter feature. The photographing distance generally refers to the source-to-image distance (SID), i.e., the distance between the X-ray generator and the detector; the SID can affect the size of the irradiation field. For example, the photographing distance parameter feature may be acquired according to a photographing distance setting value of a console of the DR imaging apparatus, or according to a photographing distance measured by a first measuring device of the DR imaging apparatus, wherein the first measuring device may be an additional measuring device of the beam limiter of the DR imaging apparatus.
The device parameter features may also include an irradiation field parameter feature. The irradiation field is the exposure area of the body surface irradiated by the X-rays, and its size is closely related to the X-ray radiation dose and the quality of the DR image. The irradiation field mainly depends on the beam limiter and the photographing distance: the beam limiter is an optical device that controls the irradiation field of the X-rays emitted by the bulb tube in a focusing manner, thereby reducing the projection range of the X-rays on the premise of meeting the X-ray imaging requirements and avoiding unnecessary radiation dose. Generally, the irradiation field should be reduced as much as possible while still covering the photographed part, so as to improve the DR image quality and reduce the radiation dose received by the target object. Illustratively, because the irradiation field depends on the beam limiter and the photographing distance, the irradiation field parameter feature may be obtained according to an irradiation field setting value of a console of the DR imaging apparatus, or according to the photographing distance parameter feature and the opening size fed back by the beam limiter of the DR imaging apparatus, in combination with a preset inference rule.
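As an illustration of such a preset inference rule (the similar-triangles model and the focus-to-beam-limiter distance are hypothetical, not stated in the text), the irradiation field size at the detector could be inferred from the beam-limiter opening and the photographing distance:

```python
def estimate_field_size(opening_mm, sid_mm, focus_to_limiter_mm):
    """Estimate the irradiation-field side length at the detector plane.

    The patent only says the field can be inferred from the beam-limiter
    opening and the photographing distance via a preset rule; this sketch
    assumes a point focus and similar triangles, with the focus-to-beam-
    limiter distance as an extra (hypothetical) parameter.
    """
    return opening_mm * sid_mm / focus_to_limiter_mm

# Example: 20 mm opening 200 mm from the focus, SID of 1000 mm.
field = estimate_field_size(20.0, 1000.0, 200.0)  # -> 100.0 mm per side
```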
The grid is capable of eliminating stray radiation and improving the sharpness of the DR image, and thus, in some embodiments, the device parameter features may also include grid parameter features derived from the grid parameters. In particular, grid parameter characteristics may be obtained from grid settings or grid parameter feedback values of a console of the DR imaging apparatus. Grid parameters include the thickness of the grid, the width of the bars, the gap width, the grid ratio, etc.
In addition to device parameter features, parameter features may also include object parameter features. Because of the differences of attenuation coefficients and thicknesses among human tissues, when X-rays penetrate different tissues of a human body, the attenuation degrees are different, and different images can be obtained after imaging treatment. Accordingly, corresponding object parameter characteristics may be obtained from object parameters of the target object related to tissue attenuation coefficients or thickness.
The object parameter features include, for example, a height parameter feature and/or a weight parameter feature. When the DR imaging device is provided with an image pickup device, the height parameter feature and/or weight parameter feature may be obtained according to the visualized image acquired by the image pickup device; specifically, image recognition may be performed on the visualized image to obtain the height parameter and/or weight parameter, and thereby the height parameter feature and/or weight parameter feature. Alternatively, the height parameter feature and/or weight parameter feature may be obtained according to an input body position and/or body type setting value. In addition, the height parameter feature and/or weight parameter feature may be obtained according to a height parameter and/or weight parameter measured by a second measuring device of the DR imaging device; specifically, the second measuring device comprises a height measuring device for measuring the height parameter to obtain the height parameter feature, or a weight measuring device for measuring the weight parameter to obtain the weight parameter feature.
The object parameter features may also include, for example, an age parameter feature and/or a gender parameter feature, wherein the age parameter feature is determined according to the age parameter and the gender parameter feature is determined according to the gender parameter. The age parameter feature and/or gender parameter feature may be obtained according to the visualized image acquired by the image pickup device of the DR imaging device; specifically, image recognition may be performed on the visualized image to obtain the age parameter and/or gender parameter, and thereby the age parameter feature and/or gender parameter feature. Alternatively, the age and/or gender parameter features may be obtained according to an input body position and/or body type setting value, or acquired from age information and/or gender information entered through an input means of the DR imaging apparatus.
Referring to fig. 2, the step of extracting the image features and the step of acquiring the parameter features may mutually affect or optimize each other based on a preset rule. That is, the extraction of the image features may be adjusted according to the device parameter features and/or object parameter features, or the acquisition of the device parameter features and/or object parameter features may be adjusted according to the image features. For example, the statistical scale or direction of the texture features among the image features may be adjusted to different extents depending on the photographing body position and/or body type parameter features among the device parameter features. Specifically, for a larger body position/body type the texture can be considered richer, and the statistical scale or direction of the texture features can be enlarged. Other parameters can likewise be adjusted with respect to one another in the above manner, according to the association between the device parameter/object parameter and the image feature, which is not described in detail herein. Through this mutual influence and optimization, the image features and the parameter features can better represent the image information and parameter information, so that the DR feature index can better reflect the feature information content of the DR image.
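A minimal sketch of this adjustment rule (the body-type labels and offset values are illustrative assumptions) might map the photographing body position/body type parameter feature to the statistical scales used for texture features:

```python
def texture_scales(body_type):
    """Choose GLCM-style statistical scales (pixel offsets) per body type.

    The patent states only that larger body positions/types justify a
    larger statistical scale for texture features; the body-type labels
    and offset values below are illustrative assumptions.
    """
    scales = {
        "small": [1, 2],
        "medium": [1, 2, 4],
        "large": [1, 2, 4, 8],  # richer texture -> enlarge the scale range
    }
    return scales.get(body_type, [1, 2, 4])  # default for unknown types

offsets = texture_scales("large")
```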
In step S140, a DR feature index reflecting the feature information content of the DR image is determined according to the image features, and the device parameter features and/or the object parameter features.
In one embodiment, with continued reference to FIG. 2, the image features as well as the device parameter features and/or object parameter features may be input into a target model, and the DR feature index output by the target model is obtained. The target model may be a model that has already been trained, or a model that has not yet been trained. The target model can be trained offline, mainly by the following steps:
First, a DR image set including a plurality of sample DR images, i.e., DR images as training samples, is acquired. An image feature set may be derived from the DR image set, the image feature set comprising sample image features extracted from a plurality of sample DR images. Illustratively, at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature is extracted for each sample DR image in the DR image set, respectively, and a plurality of sample image features extracted from a plurality of sample DR images in the DR image set collectively constitute an image feature set, each sample DR image in the DR image set corresponds to at least one sample image feature in the image feature set, and the method of extracting sample image features from sample DR images may refer to the method of extracting image features from DR images in step S120.
According to the corresponding device parameters and/or object parameters of each sample DR image, a parameter feature set can be obtained, wherein the parameter feature set comprises device parameter features and/or object parameter features. For example, for each sample DR image in the DR image set, the device parameter feature and/or the object parameter feature are acquired respectively, the plurality of sample parameter features acquired from the plurality of sample DR images in the DR image set collectively form a parameter feature set, each sample DR image in the DR image set corresponds to at least one device parameter feature and/or object parameter feature in the parameter feature set, and the method for acquiring the device parameter feature and/or the object parameter feature from the sample DR image may refer to the method for acquiring the device parameter feature and/or the object parameter feature from the DR image in step S130.
A scoring set may also be derived from the DR image set, the scoring set including scores derived from evaluating the plurality of sample DR images. For example, if the sample DR image in the DR image set is a DR original image, image processing may be performed on a plurality of sample DR images in the DR image set, so as to obtain a plurality of DR diagnostic images, that is, visual images used for diagnosis by a doctor, and the plurality of DR diagnostic images form a DR diagnostic image set, where the score set is obtained according to the DR diagnostic image set, and the score set includes scores obtained by evaluating the plurality of DR diagnostic images. In some embodiments, the clinical expert may evaluate the DR diagnostic images in the DR diagnostic image set according to the quality of the DR diagnostic images, and give an expert score, where the higher the score of the DR diagnostic image is, the more the content of the basic feature information is included in the DR original image corresponding thereto, whereas the lower the score of the DR diagnostic image is, the less the content of the basic feature information is included in the DR original image corresponding thereto. The expert scores of the DR diagnostic images constitute an expert score set. Alternatively, if the DR image in the DR image set is a visual image for diagnosis by a doctor, the score set may be obtained directly from the DR image set.
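Assembling the three sets described above into a training sample set can be sketched as follows (the feature values and scores are hypothetical placeholders):

```python
# Illustrative assembly of the three training sets described above.
# Field names and values are hypothetical; each sample pairs the image
# features and parameter features of one sample DR image with its score.
samples = [
    {"image_features": [0.8, 0.3], "param_features": [70.0, 5.0], "score": 4.5},
    {"image_features": [0.4, 0.9], "param_features": [90.0, 8.0], "score": 2.0},
]

image_feature_set = [s["image_features"] for s in samples]
parameter_feature_set = [s["param_features"] for s in samples]
score_set = [s["score"] for s in samples]

# One training example = concatenated feature vector plus its expert score.
training_set = [
    (img + par, score)
    for img, par, score in zip(image_feature_set, parameter_feature_set, score_set)
]
```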
Then, the acquired image feature set, parameter feature set and score set are used as the training sample set to train the initial model, obtaining the target model. The target model may be a traditional machine learning model or a deep learning model, including a neural network, a support vector machine, linear discriminant analysis, etc., and the training method includes model training methods such as linear regression and gradient descent. For example, an optimal mapping function from image features to scores may be learned such that the error between the DR feature index obtained by mapping the image features and the actually calibrated expert score is minimized. The optimal mapping function is then applied to the image features acquired in step S120 and the parameter features acquired in step S130 to obtain a DR feature index prediction closest to the expert score.
In one embodiment, when the image feature set, the parameter feature set and the score set are used for classification regression training, the classification regression formula is:

f(x) = Σ_{i=1}^{m} α_i · k(x, x_i) + b

where x is the feature vector, {x_i}, i = 1, ..., m, are the support vectors, {α_i} are the weighting coefficients, b is a bias, and k is a kernel function. Each value of the feature vector x is obtained through a linear transfer function:

x = w ⊙ x̃ + s

where x̃ is the input feature, and w and s are the scaling and translation parameter vectors, respectively. The kernel function k may employ, for example, a Gaussian kernel k(x, x_i) = exp(−‖x − x_i‖² / σ²) or a polynomial kernel k(x, x_i) = (x · x_i + 1)^d.
It should be noted that the embodiment of the present invention does not limit the linear transfer function, kernel function or parameters used in the regression training method.
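A minimal sketch of the prediction side of such a kernel regression, f(x) = Σᵢ αᵢ·k(x, xᵢ) + b with a Gaussian kernel (the support vectors, coefficients, bias and kernel width below are illustrative assumptions, consistent with the note that the embodiment does not limit the kernel or its parameters):

```python
import math

def rbf_kernel(x, xi, sigma=1.0):
    """Gaussian kernel k(x, x_i) = exp(-||x - x_i||^2 / sigma^2)."""
    sq = sum((a - b) ** 2 for a, b in zip(x, xi))
    return math.exp(-sq / sigma ** 2)

def predict_index(x, support_vectors, alphas, bias, sigma=1.0):
    """f(x) = sum_i alpha_i * k(x, x_i) + b, the regression form above."""
    return sum(
        a * rbf_kernel(x, xi, sigma) for a, xi in zip(alphas, support_vectors)
    ) + bias

# Hypothetical support vectors and coefficients from offline training.
sv = [[0.0, 0.0], [1.0, 1.0]]
alpha = [2.0, -1.0]
b = 0.5

score = predict_index([0.0, 0.0], sv, alpha, b)  # 2.5 - exp(-2)
```

In practice the support vectors, coefficients and bias would come from the offline training described above rather than being hard-coded.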
After the trained target model is obtained in the model training stage, in practical application, at least one of the gray entropy feature, texture feature, noise feature, gradient feature and divergence feature extracted in step S120, and at least one of the device parameter feature and object parameter feature acquired in step S130, are input into the trained network model to obtain a DR feature index characterizing the comprehensive level between the X-ray radiation dose received by the target object and the image quality of the DR image; the DR feature index may be displayed by a display device or otherwise output. It is understood that the comprehensive level here refers to the combined level of the image quality of the DR image and the X-ray radiation dose received. When the user considers the image quality relatively good and already meeting the requirement, the DR feature index may be lowered to reduce the received X-ray radiation dose; when the image quality is relatively poor, the DR feature index may be raised to improve the image quality. Factors affecting the DR feature index include the image features, the device parameter features and/or the object parameter features described herein, and the DR feature index is output by the trained model.
In some embodiments, after obtaining the DR characteristic index, a prompt message for guiding adjustment of the emission dose of the X-ray may be further output according to the DR characteristic index. Since the DR characteristic index characterizes the comprehensive level between the radiation dose of the X-ray received by the target object and the image quality of the DR image, it can be determined whether the radiation dose of the X-ray received by the target object is suitable according to the DR characteristic index, and when it is determined that the radiation dose of the X-ray received by the target object is unsuitable, a prompt message for guiding and adjusting the emission dose of the X-ray is output, so as to guide the user to increase or decrease the emission dose of the X-ray.
In some embodiments, prompt information about whether the DR feature index satisfies the requirement may further be output according to the relationship between the value of the DR feature index and a preset threshold. Specifically, the value of the DR feature index may be compared with the preset threshold; when the value falls outside the preset threshold range, it is determined that the DR feature index does not meet the requirement, and prompt information is output in the form of text, graphics, voice, etc., to prompt the user to adjust the imaging parameters in time.
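The threshold comparison and prompt can be sketched as follows (the threshold range and prompt wording are illustrative assumptions):

```python
def dose_prompt(dr_index, low=0.4, high=0.8):
    """Return a prompt when the DR feature index leaves a preset range.

    The threshold range and prompt wording are illustrative assumptions;
    the patent only requires comparing the index to a preset threshold
    and prompting the user to adjust imaging parameters in time.
    """
    if dr_index < low:
        return "DR feature index below range: consider raising image quality"
    if dr_index > high:
        return "DR feature index above range: consider reducing X-ray dose"
    return None  # index satisfies the requirement; no prompt needed

msg = dose_prompt(0.9)
```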
In some embodiments, after obtaining the DR feature index, the image quality of the DR image may be further evaluated according to the DR feature index, and the evaluation result may be output. For example, in order to more intuitively embody the image quality of the DR image, classification or grading may be performed based on the DR feature index, to obtain a qualitative quality evaluation result, or a semi-quantitative quality grading result, to prompt the user of the quality of the DR image.
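A semi-quantitative grading of the DR feature index might look like this (the grade boundaries and labels are illustrative assumptions):

```python
def grade_quality(dr_index):
    """Semi-quantitative grading of DR image quality from the index.

    The grade boundaries and labels are illustrative assumptions; the
    patent only describes classifying or grading based on the DR
    feature index to prompt the user about the image quality.
    """
    if dr_index >= 0.8:
        return "excellent"
    if dr_index >= 0.5:
        return "acceptable"
    return "poor"

grade = grade_quality(0.65)
```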
Based on the above description, the analysis method for DR imaging according to the embodiments of the present application obtains, according to the image features and the device parameter features of the DR image and/or the object parameter features, a DR feature index capable of objectively reflecting a comprehensive level between a radiation dose of X-rays to which the target object is subjected and an image quality of the DR image, thereby providing an objective judgment basis for an operator.
Another aspect of an embodiment of the present invention provides a DR imaging apparatus. Referring to fig. 3, the DR imaging apparatus 300 comprises an X-ray generator 310, a detector 320, a processor 330 and a display 340, wherein the processor 330 is communicatively coupled to the X-ray generator 310, the detector 320 and the display 340. The X-ray generator 310 is configured to generate X-rays, emit the X-rays toward a target tissue site of a target subject, and control the X-rays to pass through the target tissue site; the detector 320 is configured to receive the X-rays after they pass through the target tissue site; the processor 330 is configured to process the X-rays after passing through the target tissue site to obtain a digital radiography DR image; and the display 340 is configured to display the DR image.
Illustratively, the X-ray generator 310 comprises a bulb tube for emitting X-rays, and the detector 320 comprises a flat panel detector, which specifically comprises an upper layer of X-ray conversion medium and a lower layer of detection unit array: the conversion medium converts the incident X-rays into positive and negative charges, which move along the electric field in the form of currents under a bias voltage, are collected by the detection unit array, and are converted into electrical signals. The detector 320 includes, but is not limited to, amorphous silicon, amorphous selenium, CCD, etc. When acquiring images with the DR imaging apparatus 300, the positions of the X-ray generator 310 and the detector 320 need to be fixed in a designated manner, so that the X-rays emitted by the X-ray generator 310 irradiate the detector 320 after being transmitted through the target tissue; the detector 320 can thus acquire the X-rays that have passed through the target tissue site to obtain the DR image.
The processor 330 is configured to process the X-rays after passing through the target tissue site to obtain the digital radiography DR image. The processor 330 is further configured to perform the analysis method 100 of DR imaging described above to obtain a DR feature index reflecting the comprehensive level between the X-ray dose received by the target object and the image quality of the DR image, and the display 340 is configured to display the DR feature index. Specific details of the analysis method 100 of DR imaging can be found above. According to the DR imaging device 300 provided by the embodiment of the application, a DR feature index that can objectively reflect the comprehensive level between the X-ray radiation dose received by the target object and the image quality of the DR image is obtained according to the image features and the device parameter features and/or object parameter features of the DR image, thereby providing an objective judgment basis for the operator.
Referring to fig. 4, an embodiment of the present invention further provides an electronic device 400, which may be used to implement the above-described analysis method 100 for DR imaging. The electronic device 400 may be a DR imaging device, for example, a flat panel detector or an automatic exposure controller, or a mobile DR device or a fixed DR device, etc., and may also be another computer or terminal device, for example, a mobile phone, a computer, a palmtop computer, etc., which is not specifically limited herein.
The electronic device 400 includes a memory 410 and a processor 420, where the memory 410 stores a computer program executed by the processor 420, and the computer program executes the steps of the analysis method 100 for DR imaging when executed by the processor, and details thereof are referred to above and are not described herein.
Wherein the processor 420 may be implemented in software, hardware, firmware, or any combination thereof, may use circuitry, single or multiple application specific integrated circuits, single or multiple general purpose integrated circuits, single or multiple microprocessors, single or multiple programmable logic devices, or any combination of the foregoing circuits and/or devices, or other suitable circuits or devices, and the processor 420 may control other components in the electronic device 400 to perform the desired functions. Memory 410 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory and/or cache memory, etc. The non-volatile memory may include, for example, read-only memory, hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer readable storage medium that can be executed by the processor 420 to implement the DR image analysis method and/or other various desired functions of the present invention. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer readable storage medium.
Furthermore, according to an embodiment of the present invention, there is also provided a computer storage medium on which program instructions are stored, which program instructions, when executed by a computer or processor, are adapted to carry out the respective steps of the analysis method of DR imaging of any embodiment of the present invention. In some embodiments, the computer storage medium is a non-volatile computer readable storage medium that may include a storage program area that may store an operating system, application programs required for at least one function, data created during execution of program instructions, and the like. Further, the non-volatile computer-readable storage medium may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the non-transitory computer readable storage medium optionally includes memory remotely located relative to the processor.
By way of example, the computer storage media may comprise, for example, a hard disk of a personal computer, a memory component of a tablet computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disc read-only memory (CD-ROM), a USB memory, a memory card of a smart phone, or any combination of the foregoing. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Furthermore, according to an embodiment of the present invention, there is also provided a computer program, which may be stored on a cloud or local storage medium. Which when executed by a computer or processor is adapted to carry out the respective steps of the analysis method of DR imaging of an embodiment of the invention.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules in an item analysis device according to embodiments of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the technical scope disclosed herein, and such variations or substitutions shall fall within the protection scope of the present application. The protection scope of the application is subject to the protection scope of the claims.

Claims (17)

1. A DR imaging analysis method, the method comprising:
acquiring a digital radiography (DR) image of a target object by a DR imaging device, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image;
extracting image features from the acquired DR image, wherein the image features comprise at least one of gray entropy features, texture features, noise features, gradient features and divergence features;
acquiring device parameter features according to device parameters of the DR imaging device and/or acquiring object parameter features according to object parameters of the target object, wherein the device parameter features comprise at least one of a tube voltage parameter feature, a tube current parameter feature, a shooting body position and/or body type parameter feature, an exposure time parameter feature, a shooting distance parameter feature, an irradiation field parameter feature, and a grid parameter feature;
determining a DR feature index according to the image features and the device parameter features and/or the object parameter features, wherein the DR feature index characterizes the combined level of the radiation dose of X-rays received by the target object and the image quality of the DR image.
2. The method of claim 1, wherein:
acquiring the device parameter features comprises acquiring the tube voltage parameter feature, and acquiring the tube voltage parameter feature comprises:
acquiring the tube voltage parameter feature according to a tube voltage setting value of a console of the DR imaging device, or acquiring the tube voltage parameter feature according to a tube voltage receiving value of a high voltage generator of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the tube current parameter feature, and acquiring the tube current parameter feature comprises:
acquiring the tube current parameter feature according to a tube current setting value of a console of the DR imaging device, or acquiring the tube current parameter feature according to a tube current receiving value of a high voltage generator of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the shooting body position and/or body type parameter feature, and acquiring the shooting body position and/or body type parameter feature comprises:
acquiring the shooting body position and/or body type parameter feature according to a body position and/or body type setting value of a console of the DR imaging device, or acquiring the shooting body position and/or body type parameter feature according to a visual image obtained by a camera of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the exposure time parameter feature, and acquiring the exposure time parameter feature comprises:
acquiring the exposure time parameter feature according to an exposure time setting value of a console of the DR imaging device, or acquiring the exposure time parameter feature according to an exposure time receiving value of a high voltage generator of the DR imaging device, or acquiring the exposure time parameter feature according to a feedback value of an automatic exposure control device of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the shooting distance parameter feature, and acquiring the shooting distance parameter feature comprises:
acquiring the shooting distance parameter feature according to a shooting distance setting value of a console of the DR imaging device, or acquiring the shooting distance parameter feature according to a shooting distance measured by a first measuring device of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the irradiation field parameter feature, and acquiring the irradiation field parameter feature comprises:
acquiring the irradiation field parameter feature according to an irradiation field setting value of a console of the DR imaging device, or acquiring the irradiation field parameter feature according to the shooting distance parameter feature and an opening size fed back by a beam limiter of the DR imaging device; and/or
acquiring the device parameter features comprises acquiring the grid parameter feature, and acquiring the grid parameter feature comprises:
acquiring the grid parameter feature according to a grid setting value of a console of the DR imaging device, or according to a grid parameter feedback value.
3. The method of claim 1, wherein the object parameter features comprise at least one of a height parameter feature, a weight parameter feature, an age parameter feature, and a gender parameter feature.
4. The method of claim 3, wherein:
acquiring the height parameter feature and/or the weight parameter feature comprises:
acquiring the height parameter feature and/or the weight parameter feature according to a visual image obtained by a camera of the DR imaging device, or according to an input body position and/or body type setting value, or according to a height parameter and/or a weight parameter measured by a second measuring device of the DR imaging device; and/or
acquiring the age parameter feature and/or the gender parameter feature comprises:
acquiring the age parameter feature and/or the gender parameter feature according to a visual image obtained by a camera of the DR imaging device, or according to an input body position and/or body type setting value, or according to age information and/or gender information input via an input device of the DR imaging device.
5. The method of claim 1, wherein extracting the image features comprises extracting the gray entropy features, and extracting the gray entropy features comprises:
obtaining a first probability statistical distribution of each gray level information source in the DR image;
determining the first probability statistical distributions of the non-redundant gray level information sources from the first probability statistical distributions of the gray level information sources according to the magnitudes of the first probability statistical distributions;
obtaining a second probability statistical distribution of each non-redundant gray level information source as the ratio of its first probability statistical distribution to the sum of the first probability statistical distributions of all the non-redundant gray level information sources;
performing an entropy calculation on the second probability statistical distributions of the non-redundant gray level information sources to obtain the gray entropy features.
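For illustration only, the four steps of this claim admit a compact numerical sketch. This is an interpretive example, not the patented implementation: the redundancy criterion (a small probability threshold), the 256-level histogram, and the base-2 logarithm are all assumptions the claim leaves open.

```python
import numpy as np

def gray_entropy(image: np.ndarray, threshold: float = 1e-4) -> float:
    # Step 1: first probability statistical distribution of each gray level.
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    p1 = hist / hist.sum()
    # Step 2: keep only the non-redundant gray level sources, here taken to
    # be levels whose probability exceeds a small threshold (an assumption).
    kept = p1[p1 > threshold]
    # Step 3: second distribution = ratio of each kept probability to the
    # sum of all kept probabilities.
    p2 = kept / kept.sum()
    # Step 4: entropy calculation over the second distribution.
    return float(-np.sum(p2 * np.log2(p2)))
```

On this reading, a constant image yields an entropy of 0 and an image split evenly between two gray levels yields 1 bit.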
6. The method of claim 1, wherein extracting the image features comprises extracting the texture features, and extracting the texture features comprises:
obtaining a texture feature description matrix according to the gray value of each pixel point in the DR image, wherein the texture feature description matrix describes how the gray values change with the distance and direction between different pixel points in the DR image;
extracting at least one two-dimensional component of the texture feature description matrix as the texture features.
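A gray-level co-occurrence matrix (GLCM) is the classic instance of such a texture feature description matrix: the matrix for one pixel offset (distance and direction) serves as one two-dimensional component. The sketch below is illustrative only; the choice of offset, the normalization, and the scalar contrast/energy statistics derived afterwards are assumptions, not requirements of the claim.

```python
import numpy as np

def glcm(image: np.ndarray, dx: int = 1, dy: int = 0, levels: int = 256) -> np.ndarray:
    # Count gray-value pairs at offset (dy, dx); the offset fixes the
    # distance and direction between the two pixel points of each pair.
    # Non-negative offsets are assumed for brevity.
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()  # normalize to a joint probability matrix

def texture_stats(m: np.ndarray) -> tuple[float, float]:
    # Common scalar descriptors derived from one GLCM component.
    i, j = np.indices(m.shape)
    contrast = float(np.sum((i - j) ** 2 * m))  # local gray-value variation
    energy = float(np.sum(m ** 2))              # uniformity of the texture
    return contrast, energy
```

For a perfectly flat image the GLCM collapses to a single entry, giving zero contrast and maximal energy.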
7. The method of claim 1, wherein extracting the image features comprises extracting the noise features, and extracting the noise features comprises:
filtering the DR image to obtain a high-frequency image;
computing the root mean square over the neighborhood of each pixel point in the high-frequency image to obtain a noise distribution image;
performing statistics on the distribution of noise values in the noise distribution image to obtain the noise features.
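One way to read these three steps is sketched below. It is illustrative only: the mean filter standing in for the unspecified high-pass stage, the 3x3 neighborhood, and the median as the summary statistic are all assumptions.

```python
import numpy as np

def box_filter(img: np.ndarray, size: int = 3) -> np.ndarray:
    # Simple mean filter with edge padding (stands in for any low-pass filter).
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def noise_feature(image: np.ndarray, size: int = 3) -> float:
    img = image.astype(float)
    # Step 1: high-frequency image = original minus its low-pass version.
    high = img - box_filter(img, size)
    # Step 2: root mean square over each pixel point's neighborhood
    # gives the noise distribution image.
    rms = np.sqrt(box_filter(high ** 2, size))
    # Step 3: summarize the noise distribution (median chosen here).
    return float(np.median(rms))
```

A noise-free flat image maps to 0, and the statistic grows with the noise amplitude.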
8. The method of claim 1, wherein extracting the image features comprises extracting the gradient features, and extracting the gradient features comprises:
determining the region distribution of different tissues in the DR image;
obtaining the sharpness of the boundaries between the different tissue region distributions as the gradient features.
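As an illustration of these two steps: a plain threshold split stands in for the unspecified tissue segmentation, and the mean gradient magnitude across region boundaries stands in for "sharpness". Both choices are assumptions, not part of the claim.

```python
import numpy as np

def gradient_feature(image: np.ndarray, threshold: float) -> float:
    img = image.astype(float)
    # Region distribution of different tissues: a two-class threshold split
    # (the claim leaves the segmentation method open).
    mask = img > threshold
    # Boundary locations: horizontally adjacent pixels with different labels.
    edge = mask[:, 1:] != mask[:, :-1]
    # Sharpness of the region boundaries: gradient magnitude across them.
    gx = np.abs(np.diff(img, axis=1))
    return float(gx[edge].mean()) if edge.any() else 0.0
```

A hard step edge between two regions yields its full step height; an image with no region boundary yields 0.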
9. The method of claim 1, wherein extracting the image features comprises extracting the divergence features, and extracting the divergence features comprises:
determining the region distribution of different tissues in the DR image;
obtaining the transition intensity and/or trend consistency of the boundaries between the different tissue region distributions as the divergence features.
10. The method of claim 1, further comprising adjusting the extraction of the image features based on the device parameter features and/or the object parameter features and/or adjusting the acquisition of the device parameter features and/or the object parameter features based on the image features.
11. The method of claim 1, wherein determining the DR feature index according to the image features and the device parameter features and/or the object parameter features comprises:
inputting the image features and the device parameter features and/or the object parameter features into a target model, and obtaining the DR feature index output by the target model.
12. The method of claim 11, wherein the training process of the target model comprises:
acquiring a DR image set, wherein the DR image set comprises a plurality of sample DR images;
obtaining an image feature set according to the DR image set, wherein the image feature set comprises a plurality of sample image features extracted from the plurality of sample DR images;
obtaining a parameter feature set, wherein the parameter feature set comprises device parameter features obtained according to device parameters of the DR imaging device that generated the sample DR images and/or object parameter features obtained according to object parameters of the target objects of the sample DR images;
obtaining a score set according to the DR image set, wherein the score set comprises scores obtained by evaluating the plurality of sample DR images;
training an initial model with the image feature set, the parameter feature set and the score set as a training sample set, to obtain the target model.
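The training process of claim 12 can be sketched as follows, with an ordinary linear least-squares fit standing in for the unspecified "initial model". The feature layout (image features concatenated with parameter features, plus a bias column) and the model class are assumptions for illustration only.

```python
import numpy as np

def train_target_model(image_features, param_features, scores):
    # Training sample set: image features concatenated with device/object
    # parameter features, one row per sample DR image, plus a bias column.
    X = np.hstack([image_features, param_features])
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    # Fit model weights to the evaluation scores; the "target model" here is
    # a linear regressor, since the claim does not fix the model class.
    w, *_ = np.linalg.lstsq(X, np.asarray(scores, dtype=float), rcond=None)
    return w

def predict_dr_index(w, image_feature, param_feature):
    # Apply the trained model to one new sample's features.
    x = np.concatenate([image_feature, param_feature, [1.0]])
    return float(x @ w)
```

Any regressor (gradient-boosted trees, a neural network) could replace the least-squares fit without changing the surrounding data flow.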
13. The method of claim 1, wherein acquiring the digital radiography (DR) image of the target object by the DR imaging device comprises:
controlling the DR imaging device to emit X-rays toward a target tissue site of the target object and controlling the X-rays to pass through the target tissue site;
receiving the X-rays after they pass through the target tissue site;
processing the received X-rays to acquire the digital radiography (DR) image.
14. The method of claim 13, further comprising:
outputting prompt information for guiding adjustment of the emission dose of the X-rays according to the DR feature index.
15. The method of claim 1, further comprising:
displaying the DR feature index, and outputting prompt information indicating whether the DR feature index meets a requirement, according to the relationship between the value of the DR feature index and a preset threshold.
16. The method of claim 1, further comprising:
evaluating the image quality of the DR image according to the DR feature index, and outputting an evaluation result.
17. A DR imaging device, comprising an X-ray generator, a detector, a processor and a display, wherein:
the X-ray generator is configured to generate X-rays, emit the X-rays toward a target tissue site of a target object, and control the X-rays to pass through the target tissue site;
the detector is configured to receive the X-rays after they pass through the target tissue site;
the processor is configured to process the received X-rays to obtain a digital radiography (DR) image, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image;
the processor is further configured to obtain a DR feature index according to the DR imaging analysis method of any one of claims 1-16, the DR feature index characterizing the combined level of the radiation dose of X-rays received by the target object and the image quality of the DR image;
the display is configured to display the DR feature index.
CN202210657368.XA 2022-06-10 2022-06-10 DR imaging analysis method and DR imaging equipment Active CN115089203B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210657368.XA CN115089203B (en) 2022-06-10 2022-06-10 DR imaging analysis method and DR imaging equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210657368.XA CN115089203B (en) 2022-06-10 2022-06-10 DR imaging analysis method and DR imaging equipment

Publications (2)

Publication Number Publication Date
CN115089203A CN115089203A (en) 2022-09-23
CN115089203B true CN115089203B (en) 2025-10-21

Family

ID=83291129

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210657368.XA Active CN115089203B (en) 2022-06-10 2022-06-10 DR imaging analysis method and DR imaging equipment

Country Status (1)

Country Link
CN (1) CN115089203B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119523500A (en) * 2023-08-22 2025-02-28 深圳迈瑞生物医疗电子股份有限公司 DR image analysis method and DR imaging system
CN117115466B (en) * 2023-09-11 2025-07-25 四川大学 Head side X-ray film age and sex estimation method and system based on hidden variable model
CN119579615B (en) * 2025-02-06 2025-04-29 万里云医疗信息科技(北京)有限公司 DR image quality inspection method, device and computer readable storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104039262A (en) * 2011-09-30 2014-09-10 儿童医院医疗中心 Consistent and verifiable optimization method for computed tomography (CT) radiation dose
CN114359129A (en) * 2020-10-13 2022-04-15 深圳迈瑞生物医疗电子股份有限公司 DR image analysis method and electronic device

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US10085698B2 (en) * 2016-01-26 2018-10-02 Genereal Electric Company Methods and systems for automated tube current modulation
CN108742663A (en) * 2018-04-03 2018-11-06 深圳蓝韵医学影像有限公司 Exposure dose evaluation method, device and computer readable storage medium
CN109587389B (en) * 2018-12-19 2020-12-04 上海联影医疗科技股份有限公司 A method and system for collecting images with a digital grid system
CN111493909B (en) * 2020-04-30 2023-10-03 上海联影医疗科技股份有限公司 Medical image scanning method, medical image scanning device, computer equipment and storage medium
WO2021232195A1 (en) * 2020-05-18 2021-11-25 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for image optimization

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN104039262A (en) * 2011-09-30 2014-09-10 儿童医院医疗中心 Consistent and verifiable optimization method for computed tomography (CT) radiation dose
CN114359129A (en) * 2020-10-13 2022-04-15 深圳迈瑞生物医疗电子股份有限公司 DR image analysis method and electronic device

Also Published As

Publication number Publication date
CN115089203A (en) 2022-09-23

Similar Documents

Publication Publication Date Title
CN115089203B (en) DR imaging analysis method and DR imaging equipment
US12020821B2 (en) Method and apparatus of predicting fracture risk
US7646902B2 (en) Computerized detection of breast cancer on digital tomosynthesis mammograms
CN106413236B (en) A kind of exposure parameter method of adjustment and device
US8965078B2 (en) Projection-space denoising with bilateral filtering in computed tomography
CN105559813B (en) Medical diagnostic imaging apparatus and medical image-processing apparatus
CN113689342A (en) A method and system for optimizing image quality
CN109938764A (en) A kind of adaptive multiple location scan imaging method and its system based on deep learning
JP2014064941A (en) Method for calculating brightness level of digital x-ray image for medical application
CN104700389B (en) Object identifying method in dual intensity CT scan image
CN105488781A (en) Dividing method based on CT image liver tumor focus
US7873196B2 (en) Medical imaging visibility index system and method for cancer lesions
Katherine et al. CT scan image segmentation based on hounsfield unit values using Otsu thresholding method
CN116309806A (en) A CSAI-Grid RCNN-based method for locating regions of interest in thyroid ultrasound images
US20240237956A1 (en) Systems, Methods, and Media for Generating Low-Energy Virtual Monoenergetic Images from Multi-Energy Computed Tomography Data
CN114359129B (en) DR image analysis method and electronic device
Fazilov et al. Comparative analysis of noise reduction filters for the quality enhancement of mammography images
JPH08263648A (en) Detection method for abnormal shading candidate
Petrov et al. Model and human observer reproducibility for detection of microcalcification clusters in digital breast tomosynthesis images of three-dimensionally structured test object
CN120144800B (en) Artificial intelligence assisted CT image quality enhancement and noise reduction method
Dovganich et al. Automatic quality control in lung X-ray imaging with deep learning
Ilyasova et al. Segmentation of lung images using textural features
CN119523513A (en) Pre-exposure prediction method for DR imaging system and DR imaging system
CN119523516A (en) DR characteristic index calibration method and DR imaging system
CN119523511A (en) Exposure control method of DR imaging system and DR imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: Building 5, No. 828 Gaoxin Avenue, Donghu New Technology Development Zone, Wuhan City, Hubei Province, 430206

Applicant after: Wuhan Mindray Biomedical Technology Co.,Ltd.

Applicant after: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.

Address before: 430223 floor 3, building B1, zone B, high tech medical device Park, No. 818, Gaoxin Avenue, Donghu New Technology Development Zone, Wuhan, Hubei Province (Wuhan area of free trade zone)

Applicant before: Wuhan Mairui Medical Technology Research Institute Co.,Ltd.

Country or region before: China

Applicant before: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20251210

Address after: 430206 Hubei Province Wuhan City Donghu New Technology Development Zone JiuLong South Road 18 Building 5

Patentee after: Wuhan Mindray Technology Co.,Ltd.

Country or region after: China

Patentee after: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.

Address before: Building 5, No. 828 Gaoxin Avenue, Donghu New Technology Development Zone, Wuhan City, Hubei Province, 430206

Patentee before: Wuhan Mindray Biomedical Technology Co.,Ltd.

Country or region before: China

Patentee before: SHENZHEN MINDRAY BIO-MEDICAL ELECTRONICS Co.,Ltd.
