Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed in order precisely. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
The intima-media structure of an artery lies between two nearly parallel boundaries; the IMT is the distance between the Lumen-Intima Interface (LII) and the Media-Adventitia Interface (MAI). A normal IMT is below 1.0 mm; 1.0-1.2 mm indicates intimal thickening, 1.2-1.4 mm indicates plaque formation, and an IMT greater than 1.4 mm indicates carotid stenosis. During IMT measurement, an appropriate detection depth needs to be set and, if necessary, the image needs to be magnified so that the upper and lower boundary positions can be observed accurately.
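By way of illustration only, the diagnostic thresholds above can be summarized in a small helper function; the function name, return labels, and the handling of values falling exactly on a boundary are assumptions of this sketch, not part of the embodiments:

```python
def classify_imt(imt_mm: float) -> str:
    """Map an IMT value in millimeters to the diagnostic categories
    listed above. Handling of the exact 1.0/1.2/1.4 mm boundary
    values is an assumption, since the stated ranges overlap there."""
    if imt_mm < 1.0:
        return "normal"
    if imt_mm <= 1.2:
        return "intimal thickening"
    if imt_mm <= 1.4:
        return "plaque formation"
    return "carotid stenosis"
```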
The traditional IMT measurement method relies largely on manual measurement by doctors. Such an examination is easily influenced by the personal experience and subjective judgment of the doctor and suffers from large inter-observer measurement errors. Meanwhile, purely manual measurement is time-consuming and inefficient. In recent years, researchers have mainly proposed two approaches to automatic IMT measurement: semi-automatic IMT measurement and fully automatic IMT measurement. In a semi-automatic IMT measurement scheme, a doctor visually interprets an ultrasound image of a patient's carotid artery, outlines the intima-media area to be measured with a sampling frame tool on the ultrasound device, and selects an IMT measurement type; a background algorithm is then invoked to calculate the IMT parameters in real time and draw the contour boundaries of the LII and MAI within the sampling frame. This greatly reduces the workload of the ultrasound examiner and improves examination efficiency. However, such a scheme lacks the capability of automatically distinguishing IMT measurement types, so in practice the doctor needs to manually select the current IMT measurement type; or the scheme supports only semi-automatic measurement of the far-field IMT, so that when near-field IMT measurement is required, the corresponding semi-automatic IMT measurement algorithm cannot give a suitable result. A fully automatic IMT measurement scheme uses various algorithms to automatically measure and analyze the IMT in the region to be measured.
Although this approach reduces the number of steps performed by the physician, it has several drawbacks. On the one hand, the algorithm must not only select a proper position for IMT analysis but also complete the IMT contour analysis, which greatly increases its execution time; the efficiency of the algorithm therefore struggles to meet the requirement of real-time IMT measurement, and doctors still need to check whether the measured position is appropriate. On the other hand, because the anterior-wall intima-media images poorly, a fully automatic scheme often cannot measure the anterior and posterior walls together, or can only measure the posterior wall to a certain extent, and it places high demands on image quality. Moreover, because the sampling frame is selected by the algorithm itself, the doctor cannot re-measure the IMT in a region of interest.
At present, most research realizes automatic extraction of the upper and lower IMT contour lines by methods based on gradient boundary detection, active contour models, dynamic programming, the Hough transform, and the like. Among these, the Snake model gives the best extraction results, but it suffers from computational stability problems and can hardly meet the requirement of extracting the upper and lower IMT contour lines in real time.
In view of this, in some embodiments of the present disclosure, for an input ultrasound video stream, the algorithm automatically identifies the IMT measurement type within a manually selected sampling frame and performs a complete IMT measurement procedure on the first frame of ultrasound image. For the other frames, only part of the IMT measurement procedure is performed, based on the target mask image of the previous frame of ultrasound image, so that repeated operations are avoided, computational efficiency is improved, and the IMT of each frame of ultrasound image in the ultrasound video stream is extracted in real time.
Fig. 1 is a schematic view of an application scenario of an intima-media measuring system according to some embodiments of the present description.
In some embodiments, the intima-media measurement system for vascular ultrasound images may be applied in a variety of fields. For example, in the medical field, intima-media thickness measurements are used to aid in the diagnosis and treatment of cardiovascular diseases, such as atherosclerosis, coronary heart disease, and the like. Based on the measured intima-media thickness, a physician can determine the extent of vascular stenosis, evaluate the condition, and formulate a treatment regimen. As another example, in the research field, intima-media thickness measurements are used to study the progression of cardiovascular disease and to assess drug efficacy. As yet another example, arterial ultrasound may measure the intima-media layers of various organs or vessels, such as the carotid arteries, heart, blood vessels, liver, gall bladder, and the like.
In some embodiments, as shown in fig. 1, the application scenario 100 of the intima-media measurement system includes an ultrasound device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150.
The ultrasound device 110 can be used for ultrasound imaging of an object to be scanned. The object to be scanned may be biological or non-biological. For example, the object to be scanned may be a patient, an artificial object, or the like. The object to be scanned can be subjected to an ultrasound examination in any body position, for example, a supine position, a lateral position, a prone position, a semi-prone position, or a sitting position. The object to be scanned may comprise a specific part, organ, tissue and/or body part of the patient. By way of example only, the object to be scanned may include a head, brain, neck, body, shoulder, arm, chest, heart, stomach, blood vessels, soft tissue, knee, foot, or the like, or a combination thereof.
The ultrasound device 110 can reflect image information of internal body tissue of the subject to be scanned to assist the physician in disease diagnosis. In some embodiments, the ultrasound device 110 is capable of acquiring ultrasound images using differences in the physical characteristics of the ultrasound waves and the acoustic properties of the object, which may be displayed and/or recorded in the form of waveforms, curves, or images of features associated with the object. In some embodiments, the ultrasound device 110 comprises one of an ultrasound pulse echo imaging device, an ultrasound echo doppler imaging device, an ultrasound electronic endoscope, an ultrasound doppler blood flow analysis device, an ultrasound anthropometric device, or the like, or any combination thereof. In some embodiments, the ultrasound device 110 is capable of receiving, via the network 120, relevant information and/or imaging operation instructions from the object to be scanned sent by the terminal device 130 or the processing device 140, and may send the scan data or ultrasound images to the processing device 140, the storage device 150, or the terminal device 130.
Network 120 refers to any suitable network that facilitates the exchange of information and/or data by one or more components in application scenario 100. In some embodiments, one or more other components of application scenario 100 (e.g., ultrasound device 110, terminal device 130, processing device 140, storage device 150, etc.) may exchange information and/or data with each other over network 120. For example, the processing device 140 can acquire ultrasound images from the ultrasound device 110 over the network 120. As another example, processing device 140 may be capable of obtaining user instructions from terminal device 130 via network 120.
In some embodiments, network 120 is any one or more of a wired network or a wireless network. For example, the network 120 includes a combination of one or more of a cable network, a wired network, a fiber optic network, a telecommunications network, a local area network, a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, and the like. The network connection between the components may use any one or more of the above-mentioned means. In some embodiments, network 120 may use a point-to-point, shared, centralized, or other topology, or a combination of topologies. In some embodiments, network 120 may include one or more network access points. For example, network 120 may include wired and/or wireless network access points, such as base stations and/or network switching points, through which one or more components of application scenario 100 may access network 120 for data and/or information exchange.
Terminal device 130 can be in communication and/or connection with ultrasound device 110, processing device 140, and/or storage device 150. For example, the terminal device 130 can send one or more control instructions to the ultrasound device 110 over the network 120 to control the ultrasound device 110 to scan the object to be scanned as instructed. In some embodiments, the terminal device 130 includes one or any combination of a mobile device 130-1, a tablet 130-2, a laptop 130-3, etc., or other input and/or output enabled devices. In some embodiments, the terminal device 130 is capable of remotely operating the ultrasound device 110. In some embodiments, the terminal device 130 is capable of operating the ultrasound device 110 via a wireless connection. In some embodiments, the terminal device 130 is capable of receiving information and/or instructions entered by a user and transmitting the received information and/or instructions to the ultrasound device 110 or the processing device 140 via the network 120. In some embodiments, terminal device 130 is capable of receiving data and/or information from processing device 140. In some embodiments, the terminal device 130 may be part of the processing device 140. In some embodiments, the terminal device 130 may be integral with the processing device 140 as a console for the ultrasound device 110. In some embodiments, the application scenario 100 may omit the terminal device 130.
The processing device 140 is capable of processing data and/or information obtained from the ultrasound device 110, the terminal device 130, and/or the storage device 150. For example, the processing device 140 can acquire a current frame of ultrasound image from the ultrasound device 110 and determine the intima-media boundary of the current frame of ultrasound image. In some embodiments, the processing device 140 can send the processing results to the terminal device 130. For example, the processing device 140 may transmit the acquired current frame ultrasound image, intima-media boundary, intima-media thickness, etc. to the terminal device 130 and display them on one or more display devices in the terminal device 130. In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, processing device 140 can access information and/or data stored at ultrasound device 110, terminal device 130, and/or storage device 150 via network 120. As another example, processing device 140 may be directly connected to ultrasound device 110, terminal device 130, and/or storage device 150 to access information and/or data stored therein. In some embodiments, the processing device 140 can be implemented on a cloud platform. In some embodiments, the processing device 140 is part of the ultrasound device 110 or the terminal device 130.
Storage device 150 may store data, instructions, and/or other information. In some embodiments, storage device 150 may store data obtained from terminal device 130 and/or processing device 140. In some embodiments, storage device 150 may store data and/or instructions that are executed or used by processing device 140 to perform the exemplary methods described herein. In some embodiments, storage device 150 may include a combination of one or more of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform.
In some embodiments, the storage device 150 is capable of connecting to the network 120 to communicate with one or more other components (e.g., the processing device 140, the terminal device 130, etc.) in the application scenario 100. One or more components in the application scenario 100 are capable of accessing data or instructions stored in the storage device 150 through the network 120. In some embodiments, the storage device 150 can be directly connected to or in communication with one or more other components (e.g., the processing device 140, the terminal device 130, etc.) in the application scenario 100. In some embodiments, the storage device 150 may be part of the processing device 140.
It is noted that the application scenario 100 of the intima-media measuring system is provided for illustrative purposes only and is not intended to limit the scope of the present description. Various changes and modifications may be made by one of ordinary skill in the art in light of the description herein. For example, the application scenario 100 may also include databases, information sources, and the like. As another example, the application scenario 100 may be implemented on other devices to implement similar or different functionality. However, such changes and modifications do not depart from the scope of the present specification.
Fig. 2 is a schematic diagram of an intima-media measurement system 200 according to some embodiments of the present disclosure.
As shown in fig. 2, the intima-media measurement system 200 includes an acquisition module 210, a classification module 220, an extraction module 230, a mask module 240, and a determination module 250. In some embodiments, the acquisition module 210, the classification module 220, the extraction module 230, the mask module 240, the determination module 250 may be implemented by the processing device 140.
In some embodiments, the acquisition module 210 is configured to acquire a region of interest of the ultrasound image.
In some embodiments, the classification module 220 is configured to determine the intima-media measurement type based on the region of interest by a preset classification algorithm.
In some embodiments, the classification module 220 is further configured to: extract image features based on the region of interest to obtain image feature data; and classify the intima-media measurement type based on the image feature data to determine the intima-media measurement type.
In some embodiments, the classification module 220 is further configured to: extract an ultrasound image region corresponding to the region of interest and optimize the brightness distribution of the ultrasound image region to determine a first image; perform multi-scale image reconstruction based on the first image to determine a second image; and extract image features based on the second image to determine the image feature data.
In some embodiments, the extraction module 230 is configured to extract an image to be measured from the current frame of ultrasound image based on the region of interest. In some embodiments, the extraction module 230 is configured to perform image feature optimization on the current frame of ultrasound image based on the region of interest to obtain the image to be measured.
In some embodiments, the extraction module 230 is further configured to: expand the region of interest and extract the image to be measured from the current frame of ultrasound image according to the expanded region. In some embodiments, the extraction module 230 is further configured to: perform gray-level optimization and scale optimization on the image area corresponding to the expanded region in the ultrasound image to obtain the image to be measured.
In some embodiments, the mask module 240 is configured to obtain a target mask image of the current frame ultrasound image based on the image to be measured.
In some embodiments, the mask module 240 is further configured to: perform gradient extraction based on the image to be measured to obtain a gradient image; determine an estimation result of the intima-media boundary based on the gradient image; acquire an initial mask image of the current frame of ultrasound image based on the estimation result of the intima-media boundary; and obtain a target mask image of the current frame of ultrasound image from the initial mask image through iterative calculation with a preset contour optimization algorithm.
In some embodiments, the mask module 240 is further configured to: extract an image gradient based on the image to be measured to obtain a forward gradient image; extract a phase gradient based on the image to be measured to obtain a phase gradient image; and determine the gradient image based on the forward gradient image and the phase gradient image.
In some embodiments, the mask module 240 is further configured to: determine the upper and lower boundary positions of each column of pixels in the gradient image based on the estimation result of the intima-media boundary; and perform intima-media boundary fluctuation analysis based on the upper and lower boundary positions to determine the initial mask image of the current frame of ultrasound image.
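By way of illustration only, the column-wise boundary search and fluctuation analysis described above may be sketched as follows. The argmax-based search, the suppression window, and the median-distance filter are assumptions of this sketch, not the claimed algorithm:

```python
import numpy as np

def initial_mask_from_gradient(grad: np.ndarray) -> np.ndarray:
    """For each column of a gradient image, take the two strongest
    responses as candidate upper/lower boundary rows, reject columns
    whose boundaries deviate far from the column-wise median (a crude
    fluctuation analysis), and fill a binary mask between the
    accepted boundaries."""
    h, w = grad.shape
    upper = np.zeros(w, dtype=int)
    lower = np.zeros(w, dtype=int)
    for c in range(w):
        col = grad[:, c].astype(float)
        i1 = int(np.argmax(col))          # strongest response
        col2 = col.copy()
        lo, hi = max(0, i1 - 2), min(h, i1 + 3)
        col2[lo:hi] = -np.inf             # suppress its neighborhood
        i2 = int(np.argmax(col2))         # second-strongest response
        upper[c], lower[c] = min(i1, i2), max(i1, i2)
    # fluctuation analysis: keep only columns near the median boundary
    med_u, med_l = np.median(upper), np.median(lower)
    ok = (np.abs(upper - med_u) < 5) & (np.abs(lower - med_l) < 5)
    mask = np.zeros((h, w), dtype=np.uint8)
    for c in np.where(ok)[0]:
        mask[upper[c]:lower[c] + 1, c] = 1
    return mask
```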
In some embodiments, the determination module 250 is configured to determine the intima-media thickness parameter of the current frame ultrasound image from the target mask image of the current frame ultrasound image based on the intima-media measurement type.
In some embodiments, the determination module 250 is further configured to: generate an intima-media boundary based on the target mask image of the current frame of ultrasound image and the region of interest; determine a first contour line of the media-adventitia boundary and a second contour line of the lumen-intima boundary based on the intima-media measurement type and the intima-media boundary; and determine an intima-media thickness parameter corresponding to the intima-media measurement type based on the first contour line and the second contour line.
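By way of illustration only, once the two contour lines are available, per-column distances between them yield thickness parameters. The choice of reported statistics and the pixel-to-millimeter conversion in this sketch are assumptions:

```python
import numpy as np

def imt_parameters(mai_rows, lii_rows, mm_per_pixel):
    """Given, for each image column, the row position of the
    media-adventitia interface (first contour line) and of the
    lumen-intima interface (second contour line), return mean/max/min
    thickness in millimeters."""
    d = np.abs(np.asarray(mai_rows, float) - np.asarray(lii_rows, float))
    t = d * mm_per_pixel  # convert pixel distance to millimeters
    return {"mean": float(t.mean()),
            "max": float(t.max()),
            "min": float(t.min())}
```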
For further description of the acquisition module 210, classification module 220, extraction module 230, mask module 240, determination module 250, see fig. 3-5 and their associated description.
In some embodiments, the intima-media measurement system 200 for arterial ultrasound images further includes a real-time measurement module (not shown in fig. 2). In some embodiments, the real-time measurement module obtains a next frame of ultrasound image following the current frame of ultrasound image; determines a target mask image of the next frame of ultrasound image based on the target mask image of the current frame of ultrasound image and the next frame of ultrasound image; and determines the intima-media thickness parameter of the next frame of ultrasound image based on the intima-media measurement type and the target mask image of the next frame of ultrasound image. In some embodiments, the real-time measurement module is further configured to: extract a first image area corresponding to the region of interest from the next frame of ultrasound image; determine an initial mask image of the next frame of ultrasound image based on the target mask image of the current frame of ultrasound image and the first image area; and determine the target mask image of the next frame of ultrasound image based on the initial mask image of the next frame of ultrasound image. For more description of the real-time measurement module, see fig. 6 and its associated description.
It should be noted that the above description of the intima-media measurement system 200 and its modules is for convenience only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily, or a subsystem may be constructed in connection with other modules, without departing from such principles. In some embodiments, the acquisition module 210, the classification module 220, the extraction module 230, the mask module 240, and the determination module 250 may be different modules in one system, or one module may implement the functions of two or more of these modules. For example, the modules may share one memory module, or each module may have its own memory module. Such variations are within the scope of the present description.
Fig. 3 is an exemplary flowchart of an intima-media measuring method for arterial ultrasound images, according to some embodiments of the present description. In some embodiments, the process 300 may be performed by the processing device 140 of the intima-media measurement system based on arterial ultrasound images. As shown in fig. 3, the process 300 includes the following steps.
In step 310, a region of interest of an ultrasound image is acquired. In some embodiments, step 310 may be performed by the acquisition module 210 or the processing device 140.
The ultrasound image includes at least one frame. In some embodiments, the ultrasound image may be a real-time image, i.e., an image scanned in real-time by the ultrasound device. In some embodiments, the ultrasound image may also be a historical image stored in a storage device.
The region of interest refers to the portion of the ultrasound image that needs to be processed. For example, the region of interest may be the region of the intima-media structure in the ultrasound image. In some embodiments, the region of interest may be the region selected within a sampling frame. The sampling frame refers to a graphical structure that performs a box-selection task in the ultrasound image; it may mark the region of interest to distinguish it from other portions of the ultrasound image. The sampling frame can be a closed graphical structure with a rectangular, circular, elliptical, or other shape.
In some embodiments, the processing device may obtain the region of interest of the ultrasound image in a variety of ways. For example, the processing device may acquire the region of interest of the ultrasound image through manual input. For example, a user may interact with a user interface of the terminal device and outline a sampling frame around the carotid intima-media of the subject to select the enclosed region as the region of interest. In some embodiments, the user is an operator of the intima-media measurement system 200 for arterial ultrasound images, e.g., a doctor or the like.
In some embodiments, the processing device may obtain parameter information (e.g., location, shape, area, size, etc.) of the region of interest based on the sampling frame.
Step 320, based on the region of interest, determining the intima-media measurement type by a preset classification algorithm. In some embodiments, step 320 may be performed by classification module 220 or processing device 140.
The intima-media measurement type refers to the type of intima-media being measured. Intima-media measurement types may include anterior wall IMT, posterior wall IMT, and anterior-posterior wall IMT. Anterior wall IMT refers to measuring the thickness of the anterior wall of the intima-media, where the anterior wall refers to the intima-media structure on the side close to the ultrasound probe. Posterior wall IMT refers to measuring the thickness of the posterior wall of the intima-media, where the posterior wall refers to the intima-media structure on the side remote from the probe. Anterior-posterior wall IMT refers to measuring the thickness of both the anterior and posterior intima-media walls.
In some embodiments, the processing device may determine the intima-media measurement type based on the region of interest in a variety of ways, including by means of machine learning models, image matching, feature algorithms, and the like. For example, the processing device may perform image matching in a database based on the region of interest, obtain reference image data with high similarity, and determine the intima-media measurement type corresponding to the reference image data as the intima-media measurement type corresponding to the current region of interest. The database may include preset correspondence between different reference image data and different intima-media measurement types. The database may be determined based on a priori knowledge or historical data.
In some embodiments, the preset classification algorithm comprises: extracting image features based on the region of interest to obtain image feature data; and classifying the intima-media measurement type based on the image feature data to determine the intima-media measurement type.
The image feature data is data describing image features of the region of interest. In some embodiments, the image feature data includes geometric features, color features, texture features, shape features, histogram features, and the like of the region of interest. In some embodiments, the image feature data includes gray-scale distribution features of the region of interest. The gray-scale distribution feature may represent the number of pixels at each gray value in the region of interest and/or the distribution of the gray values of the pixels in the region of interest.
In some embodiments, the processing device may perform image feature extraction in a variety of ways. For example, the processing device may image-encode the region of interest and use the encoded data as the image feature data; methods of image coding include, but are not limited to, compression coding, transform coding, predictive coding, and the like. As another example, the processing device may perform image feature extraction on the region of interest with an image feature extraction algorithm. Exemplary image feature extraction algorithms include, but are not limited to, the SIFT, HOG, LBP, and Haar feature extraction algorithms. The image feature extraction method in the embodiments of the present disclosure is not particularly limited and may be performed by operations known to those skilled in the art.
In some embodiments, the processing device may extract an ultrasound image region corresponding to the region of interest and perform shading optimization on the ultrasound image region to determine a first image; perform multi-scale image reconstruction based on the first image to determine a second image; and extract image features based on the second image to determine the image feature data.
The ultrasound image region refers to a portion of the ultrasound image that is located within the region of the sampling frame.
In some embodiments, the ultrasound image region may be an image region that includes at least one of a carotid blood vessel, a carotid blood vessel wall, an inner, middle and outer membrane of a blood vessel wall, and an external structure of a blood vessel, or any combination thereof.
The first image is an image obtained by optimizing the brightness distribution of the ultrasonic image area.
Shading optimization refers to a method of adjusting the exposure, contrast, and color of the ultrasound image region. By optimizing the light-dark distribution, the gray-scale difference between blood flow and the vessel wall in the ultrasound image region can be enhanced.
In some embodiments, the processing device may perform shading optimization in a number of ways. For example, the processing device may perform shading optimization on the ultrasound image region by histogram equalization, percentage stretching, and the like, to obtain the first image.
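By way of illustration only, histogram equalization, one of the shading-optimization options mentioned above, may be sketched as a plain global equalization (percentage stretching would be analogous); this sketch assumes an 8-bit grayscale input:

```python
import numpy as np

def equalize_hist(img: np.ndarray) -> np.ndarray:
    """Global histogram equalization for an 8-bit grayscale image:
    map each gray level through the normalized cumulative histogram,
    spreading dark lumen and brighter wall tissue apart."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    if cdf[-1] == cdf_min:          # constant image: nothing to equalize
        return img.copy()
    # classic equalization formula, scaled back to the 0..255 range
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[img]
```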
The second image is an image obtained by reconstructing the multi-scale image of the first image.
Multi-scale image reconstruction is a method of reconstructing an image through multi-scale transforms. The multi-scale transforms include wavelet transforms, image interpolation, image reconstruction, and the like.
In some embodiments, the processing device may filter out the detail information of the first image by multi-scale image reconstruction to obtain a second image that retains gray-transition features. The detail information includes at least one of a color feature, a texture feature, a shape feature, and the like of the first image. The gray-transition feature is a feature related to the gray-scale difference between blood flow and the vessel wall.
In some embodiments, the processing device may perform multi-scale image reconstruction in a variety of ways. For example, the processing device may filter the detail information of the first image through an image filter. The image filter may include a gaussian filter, an average filter, and the like.
In some embodiments, the processing device may alternately repeat a plurality of different multi-scale image reconstruction methods on the first image to obtain the second image. The multi-scale image reconstruction methods include the image pyramid, the Laplacian pyramid, and the like. The image resampling operator may comprise nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, etc., or any combination thereof. Alternate repetition refers to processing with a plurality of different multi-scale image reconstruction methods in an alternating manner according to a preset sequence. The preset sequence may be a manually preset sequence.
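By way of illustration only, one round of such pyramid-style reconstruction (Gaussian-like smoothing, downsampling by two, then interpolating back to the original size) may be sketched as follows; the 1-4-6-4-1 kernel, the single-level depth, and the nearest-neighbor upsampling are illustrative choices of this sketch:

```python
import numpy as np

def smooth_reconstruct(img: np.ndarray) -> np.ndarray:
    """Blur with a separable 1-4-6-4-1 (Gaussian-like) kernel,
    subsample by 2, then upsample back with nearest-neighbor
    interpolation: fine texture is filtered out while coarse gray
    transitions (e.g. lumen vs. wall) survive."""
    k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    f = img.astype(float)
    # separable blur along rows, then along columns ('same' keeps size)
    f = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, f)
    f = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, f)
    small = f[::2, ::2]                                      # downsample by 2
    up = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)   # NN upsample
    return up[: img.shape[0], : img.shape[1]]
```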
In some embodiments, before performing multi-scale image reconstruction on the first image, the classification module 220 may further adjust a size of the first image to obtain a first image with a preset size; and performing multi-scale image reconstruction on the first image with the preset size to obtain a second image. The preset size can be a manual preset value, a system default value and the like.
In some embodiments of the present description, shading optimization can improve the light-and-shade details of the ultrasound image region and enhance the gray-level difference between the blood flow and the vessel wall; performing multi-scale image reconstruction on the first image can filter out unnecessary detail information and retain the content that deserves more attention, while ensuring the resolution and definition of the second image.
In some embodiments, the processing device may determine the intima-media measurement type by performing intima-media measurement type classification in a variety of ways based on the image characteristic data. In some embodiments, the processing device may perform intima-media measurement type classification based on gray-scale distribution characteristics of the region of interest. In some embodiments, the processing device may determine the relative position of the blood vessel in the region of interest based on the gray-scale distribution characteristics of the region of interest, and determine the intima-media measurement type according to the vessel relative position. Since the gray values of pixels inside the blood vessel are lower than the gray values of pixels outside the blood vessel, the processing device can determine the vessel relative position according to the position of the portion of the region of interest whose gray values are lower than a preset gray threshold. The preset gray threshold may be a system default value, an empirical value, a manually preset value, or any combination thereof, and may be set according to actual requirements, which is not limited in this specification.
In some embodiments, the vessel relative position includes any of a lower part of the region of interest, a middle part of the region of interest, an upper part of the region of interest, and the like. The upper part of the region of interest refers to a region that partially or entirely contains the upper edge of the region of interest, i.e., the side edge near the ultrasound probe. The lower part of the region of interest refers to a region that partially or entirely contains the lower edge of the region of interest, i.e., the side edge away from the ultrasound probe. The middle part of the region of interest refers to regions other than the upper and lower parts of the region of interest. When the vessel relative position is the middle part of the region of interest, the gray values of the middle part of the region of interest are lower than those of the upper and lower parts. When the vessel relative position is the upper part of the region of interest, the gray values of the upper part of the region of interest are lower than those of the lower part.
In some embodiments, the processing device may determine that the intima-media measurement type is anterior wall IMT in response to the vessel relative position being the lower part of the region of interest; determine that the intima-media measurement type is posterior wall IMT in response to the vessel relative position being the upper part of the region of interest; and determine that the intima-media measurement type is anterior-posterior wall IMT in response to the vessel relative position being the middle part of the region of interest. By way of example only, as shown in fig. 5, in the region of interest 510 of fig. A, the gray level of the upper region is lower than that of the lower region, so the vessel relative position is located in the upper part of the region of interest 510, and the intima-media measurement type of the region of interest 510 may be determined to be posterior wall IMT; in the region of interest 520 of fig. B, the gray level of the upper region is higher than that of the lower region, so the vessel relative position is located in the lower part of the region of interest 520, and the intima-media measurement type of the region of interest 520 may be determined to be anterior wall IMT; similarly, the intima-media measurement type of the region of interest 530 in fig. C may be determined to be posterior wall IMT, that of the region of interest 540 in fig. D to be anterior-posterior wall IMT, that of the region of interest 550 in fig. E to be anterior wall IMT, and that of the region of interest 560 in fig. F to be anterior wall IMT.
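A hedged NumPy sketch of the gray-distribution rule above (the threshold of 60 and the one-third position split are assumed illustration values, not values from the embodiments; returning `None` models the case where the type cannot be determined):

```python
import numpy as np

def classify_imt_type(roi, gray_threshold=60):
    """Locate the dark (low-gray) vessel lumen within the ROI and map
    its relative vertical position to an intima-media measurement type."""
    rows = np.where((roi < gray_threshold).any(axis=1))[0]
    if rows.size == 0:
        return None  # lumen not found; re-acquire image / redraw ROI
    center = rows.mean() / roi.shape[0]  # relative vertical lumen position
    if center < 1 / 3:
        return "posterior wall IMT"        # vessel in upper part of ROI
    if center > 2 / 3:
        return "anterior wall IMT"         # vessel in lower part of ROI
    return "anterior-posterior wall IMT"   # vessel in middle of ROI
```

This mirrors the mapping in the text: a dark region near the top (the probe side) implies posterior wall IMT, near the bottom implies anterior wall IMT, and in the middle implies anterior-posterior wall IMT.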
In some embodiments, the preset classification algorithm may further include processing the region of interest through a classification model to determine the intima-media measurement type. In some embodiments, the classification model is a machine learning model, for example, a support vector machine classifier, a decision tree, or the like, in any combination. In some embodiments, the input of the classification model includes the region of interest and/or an ultrasound image containing the region of interest, and the output is the intima-media measurement type. In some embodiments, the classification model may be trained in a variety of possible ways based on a large number of labeled training samples, for example, by updating parameters based on a gradient descent method. In some embodiments, a training sample includes a sample region of interest and/or a sample ultrasound image containing the sample region of interest, and its label is the intima-media measurement type of the sample region of interest. The training samples may be determined based on historical data, and the labels may be annotated by the system or manually based on the historical data, for example, in the manner described in the previous examples.
In some embodiments of the present disclosure, classifying the intima-media measurement type through the image feature data can improve classification accuracy and, compared with manual classification, save human resources while improving classification efficiency. Further, using the classification model can further improve the accuracy and efficiency of classification.
In some embodiments, when the processing device determines that the intima-media measurement type of the region of interest cannot be determined, the processing device may trigger an instruction to reacquire the ultrasound image and/or to redefine the region of interest, and end the current process.
And step 330, extracting an image to be measured from the ultrasonic image of the current frame based on the region of interest. In some embodiments, step 330 may be performed by extraction module 230.
The current frame ultrasound image refers to the ultrasound image on which intima-media measurement needs to be performed at the current moment. For example, the current frame ultrasound image may be a frame of ultrasound image selected from historical ultrasound images. For another example, the current frame ultrasound image may be a frame of ultrasound image acquired by the ultrasound device in real time at the current moment.
The image to be measured refers to an image containing the user-selected intima-media region.
In some embodiments, the processing device may extract the image to be measured from the current frame ultrasound image in a number of ways based on the region of interest. For example, the processing device may directly treat the region of interest as an image to be measured.
In some embodiments, the processing device may adjust the region of interest and extract an image to be measured from the current frame ultrasound image based on the adjusted region.
The adjusted region is a region obtained by adjusting the sampling frame of the region of interest. In some embodiments, the adjustment process may include, but is not limited to, one or more of expansion, shrinking, translation, and the like.
In some embodiments, the processing device may expand the region of interest and extract the image to be measured from the current frame ultrasound image based on the expanded region. The expansion region is a region obtained by performing expansion processing on a sampling frame of the region of interest.
In some embodiments, the sampling frame can be expanded outward according to a preset ratio. For example, when the sampling frame is rectangular, the long side and/or the short side of the rectangular sampling frame may be expanded outward according to the preset ratio. For another example, when the sampling frame is circular, the diameter of the circular sampling frame may be expanded outward according to the preset ratio. The preset ratio may be a manually preset value, a system default value, and the like, and the preset ratio is greater than 1. The outward-expansion modes of sampling frames with different shapes may be different and may be set according to actual requirements.
In some embodiments, the processing device may determine the frame selection area of the expanded sampling frame as an expansion area, and crop the current frame ultrasound image based on the expansion area to obtain an image to be measured.
In some embodiments of the present disclosure, by expanding the region of interest, a measurement region larger than the manually selected one can be obtained, which effectively avoids unclear intima-media boundaries in the sampling frame and helps improve the definition of the intima-media boundaries.
In some embodiments, the processing device may perform gray level optimization and multi-scale optimization on the image area corresponding to the adjusted area in the ultrasound image, to obtain an image to be measured.
The image area corresponding to the adjusted area refers to the image area extracted from the current frame of ultrasonic image based on the adjusted area.
In some embodiments, the processing device may perform gray-level optimization and multi-scale optimization on the image region corresponding to the adjusted region in the ultrasound image in a variety of ways. For example, the processing device may perform gray-level optimization on the image region corresponding to the adjusted region by means of histogram equalization, percentage stretching, and the like. For another example, the processing device may perform multi-scale optimization on the image region corresponding to the adjusted region through a USM (unsharp masking) sharpening enhancement algorithm, Laplacian sharpening, frequency-domain sharpening, and other algorithms.
It should be noted that a single-scale sharpening scheme has a limited sharpening effect: if the parameters are set unreasonably, over-sharpening easily occurs, the image noise increases, and the subsequent recognition of the intima-media boundary is affected. Conversely, if a multi-scale sharpening scheme with more scales is adopted, the execution efficiency of the algorithm is affected.
In some embodiments, the processing device may perform multi-scale optimization on the image region corresponding to the adjusted region by using a Laplacian-pyramid image processing method to obtain the image to be measured. The processing device may process the image region corresponding to the adjusted region through Laplacian pyramid transformation to obtain image edge information, and superpose the extracted image edge information onto the image region corresponding to the adjusted region through inverse Laplacian pyramid transformation to obtain the image to be measured.
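As an illustration, the gray-level optimization (percentage stretching) and a one-level Laplacian-pyramid sharpening can be sketched in NumPy as follows (the gain, percentiles, and the 2x2-average reduce are assumed values, not parameters from the embodiments):

```python
import numpy as np

def percent_stretch(img, low=1, high=99):
    """Percentage stretching: map the [low, high] percentile range to [0, 255]."""
    lo, hi = np.percentile(img, [low, high])
    out = (img.astype(float) - lo) / max(hi - lo, 1e-6) * 255.0
    return np.clip(out, 0, 255)

def pyramid_sharpen(img, gain=1.5):
    """One-level Laplacian-pyramid sharpening: extract the detail (edge)
    layer, amplify it, and superpose it back onto the image."""
    img = img.astype(float)
    h, w = img.shape
    small = img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    up = small.repeat(2, axis=0).repeat(2, axis=1)[:h, :w]
    detail = img - up                       # Laplacian (edge) information
    return np.clip(img + gain * detail, 0, 255)
```

On a flat region the detail layer is zero and the image is unchanged, while at gray transitions (such as LII and MAI) the amplified detail layer boosts edge contrast.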
In some embodiments of the present disclosure, performing gray-level optimization on the image region corresponding to the adjusted region can optimize the light-and-shade distribution of that region, and performing multi-scale optimization on it can further improve the sharpness characteristics of the image, increase the sharpness information of boundary regions such as LII and MAI, and improve the definition of the image. The Laplacian-pyramid image processing scheme performs image sharpening with a multi-stage pyramid approach, which can optimize the execution efficiency of the algorithm and improve the definition of the image, facilitating the subsequent determination of a clear intima-media boundary of the current frame ultrasound image.
Step 340, based on the image to be measured, obtaining the target mask image of the current frame ultrasonic image. In some embodiments, step 340 may be performed by mask module 240.
The target mask image is an image obtained by masking out part of the ultrasound image. In some embodiments, the target mask image may be an image that obscures non-important content in the ultrasound image, leaving only important content. The important content may include the intima-media boundary. In some embodiments, the important content may also include the vascular portion. The target mask image may be in various forms, such as a binary image, a gray-scale image, and the like.
In some embodiments, the target mask image may be a binary image generated based on the intima-media boundary of the image to be measured, representing contour information of the intima-media boundary of the image to be measured. In some embodiments, the target mask image is a binary image consistent with the size of the image to be measured and may be represented by a matrix containing only the values 0 and 1, where 0 represents other tissue regions and 1 represents the intima-media tissue region (i.e., the contour information of the intima-media boundary). In some embodiments, the target mask image may also be in the form of a multi-valued image (where 0 represents background and different values represent different types of vascular membranes).
In some embodiments, the processing device may acquire the target mask image of the current frame ultrasound image in a variety of ways based on the image to be measured. For example, the mask module 240 may perform image segmentation on the image to be measured using an image segmentation algorithm to obtain a target mask image for the current frame of ultrasound image. Image segmentation algorithms include, but are not limited to, thresholding, region growing, and the like. For another example, the mask module 240 may further perform segmentation processing on the image to be measured through the segmentation model to obtain a target mask image of the current frame of ultrasound image. The segmentation model may be a trained machine learning model, among other things. For example, a deep learning model based on VB-Net network, etc.
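As one concrete instance of the thresholding route (a standard technique, not the embodiments' own algorithm), Otsu's method picks a global threshold that maximizes between-class variance of the histogram and yields a binary target mask:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray threshold maximising the
    between-class variance of the histogram (img is uint8)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()      # class probabilities
        if w0 == 0 or w1 == 0:
            continue
        m0 = (np.arange(t) * p[:t]).sum() / w0       # class means
        m1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2               # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

def segment_mask(img):
    """Binary target mask: 255 for pixels at/above the Otsu threshold."""
    return np.where(img >= otsu_threshold(img), 255, 0).astype(np.uint8)
```

In practice a segmentation model or region growing may give better results on noisy ultrasound data; this only illustrates the simplest thresholding branch.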
In some embodiments, the processing device may perform gradient extraction on the image to be measured, obtain a gradient image, determine an estimation result of the intima-media boundary based on the gradient image, and further determine the target mask image. For more description of determining the target mask image, see fig. 4 and its associated description.
Step 350, based on the intima-media measurement type, determining the intima-media thickness parameter of the current frame ultrasound image according to the target mask image of the current frame ultrasound image. In some embodiments, step 350 may be performed by determination module 250.
The intima-media thickness parameter refers to a parameter related to the thickness of the intima-media boundary. For example, the intima-media thickness parameters may include maximum, minimum, mean, standard deviation, etc. parameters of the thickness of the intima-media boundary.
Intima-media boundary refers to the dividing line between the intima of an arterial vessel and the media of an arterial vessel in the image to be measured.
In some embodiments, the intima-media boundaries include the Lumen-Intima Interface (LII) and the Media-Adventitia Interface (MAI).
In some embodiments, the processing device may determine the intima-media boundary of the current frame ultrasound image from the target mask image of the current frame ultrasound image in a variety of ways based on the intima-media measurement type. For example, the processing device may determine the intima-media boundary in the target mask image of the current frame ultrasound image based on gray-scale distribution characteristics of different intima-media measurement types (e.g., anterior wall IMT, posterior wall IMT, and anterior-posterior wall IMT). For example, when the intima-media measurement type is the anterior wall IMT, the processing device may analyze each column of the target mask image, search from top to bottom starting from the first pixel of each column until the gray value of a pixel in the column is 255, record the information of that pixel as a media-adventitia boundary point, and determine the media-adventitia boundary points of all columns as the media-adventitia boundary of the current frame ultrasound image; and search from bottom to top until the gray value of a pixel in the column is less than 255 (i.e., 0), record the information of that pixel as a lumen-intima boundary point, and determine the lumen-intima boundary points of all columns as the lumen-intima boundary of the current frame ultrasound image.
As yet another example, when the intima-media measurement type is posterior wall IMT, the processing device may analyze each column of the target mask image, search from top to bottom starting from the first pixel of each column until the gray value of a pixel in the column is not 255, record the information of that pixel as a lumen-intima boundary point, and determine the lumen-intima boundary points of all columns as the lumen-intima boundary of the current frame ultrasound image; and search from bottom to top until the gray value of a pixel in the column is 255, record the information of that pixel as a media-adventitia boundary point, and determine the media-adventitia boundary points of all columns as the media-adventitia boundary of the current frame ultrasound image. For another example, the processing device may implement initial extraction of the LII and MAI contours based on the Canny operator, Hough transform, and the like, and then input them into a Snake (active contour) model for iterative optimization to output the final LII and MAI contours. The manner of extracting the contour boundaries is not particularly limited in the embodiments of the present specification and may be an operation well known to those skilled in the art.
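The posterior-wall column search just described can be sketched as follows (NumPy; 255 is the gray maximum of the binary mask, and `None` marking columns where no boundary point is found is an assumed convention):

```python
import numpy as np

def posterior_wall_boundaries(mask):
    """Column-wise boundary search for a posterior-wall IMT mask.
    Scanning each column top-down, the first pixel whose gray value is
    not 255 is the lumen-intima (LII) point; scanning bottom-up, the
    first pixel whose gray value is 255 is the media-adventitia (MAI)
    point. Returns per-column row indices (or None if absent)."""
    lii, mai = [], []
    for col in mask.T:                       # iterate over columns
        not_white = np.where(col != 255)[0]
        white = np.where(col == 255)[0]
        lii.append(int(not_white[0]) if not_white.size else None)
        mai.append(int(white[-1]) if white.size else None)
    return lii, mai
```

The anterior-wall case described earlier is the mirror image of this search (first 255 pixel top-down, first non-255 pixel bottom-up).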
In some embodiments, the processing device may also generate an intima-media boundary based on the target mask image of the current frame ultrasound image and the region of interest; determining a first contour line of an adventitia boundary of the intima and a second contour line of an intima boundary of the lumen based on the intima-media measurement type and the intima-media boundary; and determining an intima-media thickness parameter corresponding to the intima-media measurement type based on the first contour line and the second contour line.
The first contour line is a line for reflecting the shape of the media adventitia boundary (MAI). The second contour line is a line for reflecting the shape of the lumen intima boundary (LII).
In some embodiments, the processing device may connect the media-adventitia boundary points of the target mask image to obtain the first contour line in the target mask image, and connect the lumen-intima boundary points to obtain the second contour line in the target mask image. As shown in fig. 5, the intima-media measurement type of the region of interest 510 in fig. A is posterior wall IMT, where 511 is the second contour line formed by connecting the lumen-intima boundary points, and 512 is the first contour line formed by connecting the media-adventitia boundary points; the intima-media measurement type of the region of interest 520 in fig. B is anterior wall IMT, where 521 is the first contour line formed by connecting the media-adventitia boundary points, and 522 is the second contour line formed by connecting the lumen-intima boundary points; similarly, 531 in fig. C may be determined as the second contour line formed by connecting the lumen-intima boundary points, and 532 as the first contour line formed by connecting the media-adventitia boundary points; in fig. D, 541 is a first contour line formed by connecting media-adventitia boundary points, 542 is a second contour line formed by connecting lumen-intima boundary points, 543 is a second contour line formed by connecting lumen-intima boundary points, and 544 is a first contour line formed by connecting media-adventitia boundary points; 551 in fig. E is the first contour line formed by connecting the media-adventitia boundary points, and 552 is the second contour line formed by connecting the lumen-intima boundary points; in fig. F, 561 is the first contour line formed by connecting the media-adventitia boundary points, and 562 is the second contour line formed by connecting the lumen-intima boundary points.
In some embodiments, the processing device may obtain the first contour line and the second contour line in the target mask image based on a boundary tracking algorithm. In some embodiments, the processing device may cut the target mask image based on the position, size, and shape of the region of interest, perform a logical operation on the region of interest in the current frame ultrasound image based on the cut target mask image to obtain the first contour line and the second contour line in the region of interest, and send the current frame ultrasound image after the logical operation to a display interface of the terminal device for display. The logical operation may be a pixel-wise AND operation.
In some embodiments, the processing device may determine the intima-media thickness corresponding to the intima-media measurement type in a variety of ways based on the first and second profiles. For example, the processing device may calculate a distance between a pixel point on the first contour line and a pixel point on the second contour line of each column based on the first contour line and the second contour line of the current frame ultrasound image, take a maximum value of the distance as an IMT maximum thickness, a minimum value of the distance as an IMT minimum thickness, and an average value of the distances of all columns as an IMT average thickness. For another example, the processing device may also take the standard deviation of the distances of all columns as the standard deviation of IMT. For another example, the processing device may further use the number of pixels between the pixels at two ends of the first contour line as the length of the adventitia boundary of the middle membrane in the sampling frame.
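A minimal sketch of these per-column thickness statistics (the `pixel_spacing_mm` calibration parameter is an assumption; a real system would derive it from the imaging depth):

```python
import numpy as np

def imt_statistics(first_contour, second_contour, pixel_spacing_mm=1.0):
    """Per-column distance between the MAI contour (first) and the LII
    contour (second): returns max, min, mean and standard deviation of
    the intima-media thickness. Contours are per-column row indices."""
    d = np.abs(np.asarray(first_contour, float) -
               np.asarray(second_contour, float)) * pixel_spacing_mm
    return {"max": d.max(), "min": d.min(),
            "mean": d.mean(), "std": d.std()}
```

For example, contours at rows [10, 11, 12] and [5, 5, 5] give distances [5, 6, 7], hence a maximum IMT of 7, a minimum of 5, and a mean of 6 (in pixel units when `pixel_spacing_mm` is 1).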
In some embodiments of the present disclosure, by drawing the first contour line and the second contour line, the shapes and positions of the media-adventitia boundary and the lumen-intima boundary can be intuitively displayed, which improves the user experience, facilitates the user's selection of a measurement position of interest, and improves the flexibility and convenience of measurement.
In some embodiments, the processing device may perform the intima-media measurement method of arterial ultrasound images shown in some embodiments of the present description for each frame of ultrasound image acquired by the ultrasound device. In some embodiments, the processing device may perform the method for each frame of ultrasound image acquired by the ultrasound device in chronological order. In some embodiments, after determining the intima-media boundary and intima-media thickness of the current frame ultrasound image, the processing device may make intima-media measurements for the next frame ultrasound image in the manner described in flow 600. For more description, see fig. 6 and its related description.
In some embodiments of the present disclosure, determining the intima-media measurement type from the region of interest can save the time of manual determination, eliminate the influence of the operator's personal factors, improve the accuracy of determination, and facilitate the subsequent determination of the intima-media boundary; accurately segmenting the intima-media by determining the target mask image improves the accuracy of IMT measurement. The contour information of the intima-media boundary can be obtained through the target mask image of the current frame ultrasound image, which helps determine the target mask image of the next frame ultrasound image and improves calculation efficiency while improving calculation accuracy. Since ultrasound imaging places high requirements on the image frame rate, using the target mask image significantly improves efficiency.
Fig. 4 is an exemplary flow chart for determining a target mask image according to some embodiments of the present description. In some embodiments, the process 400 may be performed based on the processing device 140 or the mask module 240 of the intima-media measurement system of the arterial ultrasound image. As shown in fig. 4, the process 400 includes the following steps.
Step 410, gradient extraction is performed based on the image to be measured, and a gradient image is obtained.
The gradient image is used to describe differences or variations between individual pixels in the ultrasound image. For example, the gradient image may be a matrix of image gradients for individual pixels in the ultrasound image. Wherein, the image gradient is the variation trend of the pixel point in different directions. In some embodiments, for each pixel, an image gradient of the image to be measured on the pixel may be determined according to the pixel value of the pixel and the pixel values of the neighboring pixels. The neighborhood pixel point refers to a pixel point in the neighborhood of a certain pixel point. The neighborhood includes a 4-neighborhood, an 8-neighborhood, a diagonal neighborhood, etc. Correspondingly, the neighborhood pixel points comprise the pixel points in the 4-neighborhood, 8-neighborhood or diagonal neighborhood of a certain pixel point.
In some embodiments, the processing device may perform gradient extraction in a variety of ways. For example, the processing device may perform a convolution operation on the image to be measured based on a Sobel operator, a Prewitt operator, a Laplacian operator, or the like, to obtain the gradient image.
In some embodiments, the image gradient of each pixel point can be represented by a vector as (B1, B2, …), where vector element B1 represents the gradient of pixel point B in the horizontal direction and vector element B2 represents the gradient of pixel point B in the vertical direction.
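For illustration, the per-pixel gradient vector (B1, B2) can be computed with 3x3 Sobel kernels in pure NumPy (zero padding and the correlation form are implementation choices, not requirements of the embodiments):

```python
import numpy as np

def sobel_gradients(img):
    """Horizontal (B1) and vertical (B2) image gradients via 3x3 Sobel
    kernels, applied as a zero-padded 'same' cross-correlation."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    p = np.pad(img.astype(float), 1)
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    for i in range(3):
        for j in range(3):
            win = p[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return gx, gy  # per-pixel gradient vector (B1, B2)
```

At a vertical step edge the horizontal component responds strongly while the vertical component stays near zero, matching the directional interpretation of (B1, B2) above.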
In some embodiments, the processing device performs image gradient extraction based on the image to be measured to acquire a forward gradient image; performs phase gradient extraction based on the image to be measured to obtain a phase gradient image; and determines the gradient image based on the forward gradient image and the phase gradient image.
The forward gradient image is a gradient image determined from the forward gradient of the image to be measured. A forward (positive) gradient means that the pixel value increases along the gradient direction, that is, the brightness or intensity of pixels increases along that direction. In some embodiments, the gradient direction in which image gradient extraction is performed may be the ultrasound detection direction or the opposite of the ultrasound detection direction. In some embodiments, the gradient direction in which image gradient extraction is performed may be the Y direction of the sampling frame or the opposite of the Y direction. The Y direction of the sampling frame may be the direction in which the frame line of the sampling frame is perpendicular or approximately perpendicular (e.g., at an included angle greater than 45 degrees and less than 90 degrees) to the blood flow direction, the direction pointing toward the ultrasound probe.
In some embodiments, the processing device may perform image gradient extraction in a variety of ways. For example, the processing device may calculate a pixel value change of each pixel point in the image to be measured in the ultrasound detection direction or the Y direction, obtain an initial gradient image, traverse each pixel point in the initial gradient image, and reject the pixel point with the negative gradient direction, so as to obtain a positive gradient image.
In some embodiments, the processing device may perform convolution operation on the image to be measured through a Prewitt operator in the ultrasound detection direction or the Y direction and a Sobel operator in the vertical direction to obtain an initial gradient image.
The phase gradient image is a gradient image determined according to the phase change rate of the image to be measured.
In some embodiments, the processing device may perform phase gradient extraction in a variety of ways. For example, the processing device may calculate a frequency matrix and a homography matrix of the image to be measured, perform Fourier transform on them, perform multi-scale transform processing, and then filter with a LogGabor filter. The filtered result is processed by inverse Fourier transform, the local energy is calculated with the Hilbert transform, and a Rayleigh distribution model is used for processing to obtain a symmetric energy map, i.e., the phase gradient image.
In some embodiments, the processing device may determine the gradient image in a variety of ways based on the forward gradient image and the phase gradient image. For example, the processing device may multiply each pixel point of the forward gradient image and the phase gradient image correspondingly to obtain a gradient image.
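A hedged sketch of the forward-gradient extraction and the pixel-wise fusion described above (the phase gradient image is taken as a given input here, since its LogGabor-based computation is beyond a short sketch):

```python
import numpy as np

def forward_gradient(img):
    """Gradient along the ultrasound detection (vertical/Y) direction;
    pixels whose gradient is negative are rejected (set to 0), keeping
    only positive (dark-to-bright) transitions."""
    g = np.zeros(img.shape, dtype=float)
    g[1:] = img[1:].astype(float) - img[:-1].astype(float)
    return np.where(g > 0, g, 0.0)

def fuse_gradients(forward, phase):
    """Pixel-wise product of the forward and phase gradient images,
    keeping only boundaries supported by both."""
    return forward * phase
```

The multiplication acts as a soft AND: a boundary survives only where both the intensity gradient (image information) and the phase gradient (frequency-domain information) respond, which suppresses interference from other boundaries.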
In some embodiments of the present disclosure, the phase gradient image can effectively preserve the edge features of the image to be measured; the extracted frequency characteristics of the image to be measured are not easily affected by contrast changes of the image, so more complete image features are obtained under complex conditions (such as brightness changes); multiplying the phase gradient image and the forward gradient image pixel by pixel to obtain the gradient image preserves the boundary information common to the phase gradient image (frequency-domain information) and the forward gradient image (image information), suppresses interference from other boundaries, and facilitates the subsequent estimation of the intima-media boundary.
Step 410, determining an estimation result of the intima-media boundary based on the gradient image.
The estimation result of the intima-media boundary refers to the preliminarily determined intima-media boundary in the gradient image, i.e., the estimated intima-media boundary. The estimation result of the intima-media boundary may include an estimation result of the LII and/or an estimation result of the MAI.
In some embodiments, the processing device may determine the estimation result of the intima-media boundary of the gradient image based on the gradient image in a number of ways. In some embodiments, the processing device may analyze each column or each row of the gradient image to obtain the gradient distribution of that column or row, and use a gradient value that meets a preset condition as the target threshold of the column or row; perform binarization processing on the gradients of the pixel points of the column or row based on the target threshold, and obtain a binary image corresponding to the gradient image based on the binarization results of all columns or rows; and determine the estimation result of the intima-media boundary in the binary image corresponding to the gradient image. The gradient distribution refers to the distribution of the gradient values of the different pixels in a certain column.
The preset condition refers to an evaluation condition for selecting the target threshold. For example, the preset condition may be that the number of occurrences of a certain gradient value reaches a preset number. As another example, the preset condition may be that the proportion of occurrences of a certain gradient value is greater than or equal to a proportion threshold (e.g., 90%). The preset number and the proportion threshold may be determined based on experiments or experience.
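The per-column target threshold and binarization can be sketched as below; using a quantile as a stand-in for the occurrence-proportion condition is an assumption for illustration:

```python
import numpy as np

def binarize_by_column(grad, duty_ratio=0.9):
    """Per-column thresholding sketch: for each column, take the gradient
    value at the duty-ratio quantile as the target threshold (an assumed
    stand-in for the preset condition), then binarize the column."""
    out = np.zeros_like(grad, dtype=np.uint8)
    for c in range(grad.shape[1]):
        col = grad[:, c]
        target = np.quantile(col, duty_ratio)
        # Strong gradients (likely boundary) -> gray maximum 255; rest -> 0.
        out[:, c] = np.where(col >= target, 255, 0)
    return out

# Toy gradient image: one strong response per column at row 5.
grad = np.zeros((10, 3))
grad[5, :] = 10.0
binary = binarize_by_column(grad)
```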
In general, the gradient values at the intima-media boundary are large, so the binarization result of a pixel point whose gradient value is greater than or equal to the target threshold may be set to the gray-level maximum (e.g., 255); the gradient values away from the intima-media boundary are small, so the binarization result of a pixel point whose gradient value is smaller than the target threshold may be set to the gray-level minimum (e.g., 0).
In some cases, noise interference may cause the binarization result of pixel points other than the intima-media boundary to be the gray-level maximum. In some embodiments, the processing device may perform connected domain analysis based on the binary image corresponding to the gradient image, update or correct the binary image based on the result of the connected domain analysis, and determine a target binary image of the gradient image. Correspondingly, the processing device may determine the estimation result of the intima-media boundary in the target binary image corresponding to the gradient image.
A connected domain, i.e., a connected region, refers to a region formed by pixel points that have the same gray value and are adjacent in position in the binary image corresponding to the gradient image. In some embodiments, a connected domain may be a region formed by a plurality of adjacent pixel points with the gray-level maximum in the binary image corresponding to the gradient image.
In some embodiments, the processing device may extract the connected domain in the binary image corresponding to the gradient image using an existing connected domain extraction technique. For example, the processing device may extract the connected domain in the binary image corresponding to the gradient image using an algorithm for pixel-by-pixel comparison.
In some embodiments, the connected domain analysis includes: classifying the connected domains based on their areas; in response to the area of a connected domain being greater than or equal to a first area threshold, retaining the connected domain; in response to the area of a connected domain being less than a second area threshold (the second area threshold being less than the first area threshold), removing the connected domain; and in response to the area of a connected domain being less than the first area threshold and greater than or equal to the second area threshold, performing an elongation analysis on the connected domain.
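A minimal sketch of the three-way area classification, assuming scipy's default 4-connectivity labeling and illustrative area thresholds:

```python
import numpy as np
from scipy import ndimage

def classify_connected_domains(binary, first_area, second_area):
    """Three-way area test: keep large components, remove small ones, and
    flag mid-sized ones for elongation analysis."""
    labels, n = ndimage.label(binary > 0)
    keep, remove, analyze = [], [], []
    for i in range(1, n + 1):
        area = int((labels == i).sum())
        if area >= first_area:
            keep.append(i)
        elif area < second_area:
            remove.append(i)
        else:
            analyze.append(i)
    return labels, keep, remove, analyze

# Toy binary image: a large blob (area 20), a mid-size strip (area 5),
# and a single noisy pixel (area 1).
bin_img = np.zeros((10, 10), dtype=np.uint8)
bin_img[0:4, 0:5] = 255
bin_img[6, 0:5] = 255
bin_img[9, 9] = 255
labels, keep, remove, analyze = classify_connected_domains(bin_img, 10, 3)
```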
In some embodiments, retaining a connected domain may mean retaining the original gray values of all pixel points in the connected domain. Removing a connected domain may mean updating the original gray value (e.g., the gray-level maximum) of all pixel points in the connected domain to the opposite value (e.g., the gray-level minimum).
In some embodiments, the processing device may perform the elongation analysis based on characteristics of the intima-media boundary. The characteristics of the intima-media boundary are characteristics reflecting its shape and size, such as the length, width, and/or aspect ratio of the intima-media boundary.
In some embodiments, the elongation analysis includes analyzing whether the length of the connected domain meets a length requirement, retaining connected domains that meet the length requirement, and removing those that do not. The length requirement may be that the length of the connected domain lies between a set lower length limit and upper length limit; the lower limit may be determined based on experience or historical data, and the upper limit may be determined based on the size of the sampling frame. In some embodiments, the elongation analysis includes analyzing whether the width of the connected domain meets a width requirement, retaining connected domains that meet the width requirement, and removing those that do not. The width requirement is similar to the length requirement and is not described in detail herein. In some embodiments, the elongation analysis includes analyzing whether the aspect ratio of the connected domain meets a ratio requirement, retaining connected domains that meet the ratio requirement, and removing those that do not. The ratio requirement is that the aspect ratio of the connected domain lies between set lower and upper ratio limits, which may be determined based on experience or historical data.
In some embodiments, the processing device may determine a target binary image of the gradient image based on the result of the connected domain analysis. The target binary image may be used to determine an estimate of the intima-media boundary.
In some embodiments, the processing device may determine an initial binary image of the gradient image based on the result of the connected domain analysis, and perform a filtering analysis based on the initial binary image to determine the target binary image of the gradient image. In some embodiments, the filtering analysis includes: determining the ratio of the area of the sampling frame to the area of each connected domain in the initial binary image, removing connected domains whose area ratio is smaller than a preset ratio threshold, and retaining connected domains whose area ratio is greater than or equal to the preset ratio threshold, thereby obtaining the target binary image of the gradient image.
In some embodiments, the processing device may connect the plurality of adjacent pixel points corresponding to the gray-level maximum in the target binary image to obtain the estimation result of the intima-media boundary. In some embodiments, the processing device may further perform interpolation processing on the target binary image (for example, changing gray-level minimum values between two adjacent connected domains to the gray-level maximum) and then connect the adjacent pixel points corresponding to the gray-level maximum to obtain the estimation result of the intima-media boundary. Through interpolation processing, connected domains that belong to the same intima-media boundary but are spatially dispersed can be joined into one large connected domain, improving the accuracy of the estimation result of the intima-media boundary.
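The interpolation step can be approximated by a horizontal morphological closing, as sketched below; treating gap filling as a closing with an assumed gap width is an illustration, not the embodiment's exact rule:

```python
import numpy as np
from scipy import ndimage

def bridge_gaps(binary, gap=3):
    """Turn short horizontal runs of gray-minimum between two connected
    domains into gray-maximum, merging boundary fragments. The gap width
    is an illustrative assumption."""
    structure = np.ones((1, gap), dtype=bool)
    closed = ndimage.binary_closing(binary > 0, structure=structure)
    return np.where(closed, 255, 0).astype(np.uint8)

# Two boundary fragments in one row, separated by a one-pixel gap.
binary = np.zeros((1, 10), dtype=np.uint8)
binary[0, 0:4] = 255
binary[0, 5:10] = 255
bridged = bridge_gaps(binary)
```

After closing, the gap between the two fragments is filled, so they form a single connected domain.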
Step 430, acquiring an initial mask image of the current frame ultrasound image based on the estimation result of the intima-media boundary.
The initial mask image is a preliminarily determined mask image used to determine the target mask image.
In some embodiments, the processing device may obtain the initial mask image of the current frame ultrasound image in a number of ways based on the estimation result of the intima-media boundary. For example, the processing device may multiply each pixel point corresponding to the estimation result of the intima-media boundary by the corresponding pixel point of the image to be measured to obtain the initial mask image. In the initial mask image, the pixel values of the intima-media boundary portion remain unchanged, and the pixel values of the remaining regions are 0.
In some embodiments, the processing device may perform smoothing processing on the estimation result of the intima-media boundary, and multiply each pixel point corresponding to the smoothed intima-media boundary by the corresponding pixel point of the image to be measured to obtain the initial mask image. The smoothing processing includes denoising, filtering, etc., of the estimated intima-media boundary. Smoothing makes the estimation result of the intima-media boundary smoother and reduces the influence of noise and jitter.
In some embodiments, the processing device may smooth the estimated intima-media boundary in a variety of ways, for example, by a weighted average method, an exponential average method, or a Savitzky-Golay (SG) filtering method.
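A minimal sketch of SG filtering applied to an estimated boundary represented as one depth value per column; the window length and polynomial order are illustrative assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

# Boundary depths per column; the spike at index 3 mimics a noisy estimate.
boundary = np.array([10, 10, 11, 25, 11, 10, 10, 11, 10, 10], dtype=float)

# Savitzky-Golay smoothing: fit a low-order polynomial over a sliding
# window. window_length=5 and polyorder=2 are illustrative choices.
smooth = savgol_filter(boundary, window_length=5, polyorder=2)
```

The spike is pulled toward its neighbors while the flat segments are left nearly unchanged, which is the desired effect of the smoothing step.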
In some embodiments, the processing device may determine the upper and lower boundary positions of each column of pixels in the gradient image based on the estimation of the intima-media boundary; and based on the upper and lower boundary positions, performing intima-media boundary fluctuation analysis, and determining an initial mask image of the current frame ultrasonic image.
The upper and lower boundary positions refer to the positions of the LII and the MAI in each column of pixels in the gradient image. In some embodiments, the upper and lower boundary positions include an upper boundary position and a lower boundary position. The upper boundary position refers to the position of the boundary that lies relatively higher in a column of pixel points, and the lower boundary position refers to the position of the boundary that lies relatively lower in that column.
Different intima-media measurement types correspond to different boundary types at the upper and lower boundary positions. The boundary type is either the LII or the MAI. Illustratively, when the intima-media measurement type is anterior wall IMT, the upper boundary position is the position of the LII and the lower boundary position is the position of the MAI; when the intima-media measurement type is posterior wall IMT, the upper boundary position is the position of the MAI and the lower boundary position is the position of the LII.
In some embodiments, the processing device may analyze the gradient values of each column of pixels of the gradient image to determine the upper and lower boundary positions of that column. For example, the processing device may analyze each column of the gradient image, searching from the first pixel point of the column, from top to bottom or from bottom to top, until the gradients of the pixel points of the column reach peak values; record the position information of the two peak pixel points; and finally determine the upper and lower boundary positions according to the intima-media measurement type.
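The column-wise peak search can be sketched as below, assuming (as an illustration) that the two largest absolute gradient values in each column mark the two boundaries:

```python
import numpy as np

def column_boundaries(grad):
    """Per-column search for the two strongest gradient peaks, returned as
    (upper, lower) row indices for each column. Taking the top two |grad|
    values is an assumed simplification of the embodiment's peak search."""
    upper, lower = [], []
    for c in range(grad.shape[1]):
        col = np.abs(grad[:, c])
        two = np.argsort(col)[-2:]   # rows of the two largest peaks
        upper.append(int(two.min()))
        lower.append(int(two.max()))
    return np.array(upper), np.array(lower)

# Toy gradient image with peaks at rows 3 and 7 in every column.
grad = np.zeros((10, 4))
grad[3, :] = 5.0
grad[7, :] = 8.0
upper, lower = column_boundaries(grad)
```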
In an actual ultrasonic inspection scene, the intima-media boundary in the ultrasound image is continuous and does not exhibit large fluctuations; therefore, intima-media boundary fluctuation analysis is needed to meet the requirements of practical application scenarios.
In some embodiments, the processing device may perform the intima-media boundary fluctuation analysis in a number of ways to determine the initial mask image of the current frame ultrasound image. For example, the processing device may perform statistical analysis on the upper and lower boundary positions of each column of pixel points of the gradient image to obtain change amplitude data of the upper boundary positions and the lower boundary positions, respectively, and smooth, based on morphological transformation, any upper or lower boundary position whose change amplitude is larger than an amplitude threshold. The amplitude threshold may be a manually preset value or a system default value.
The change amplitude data reflect the changes in boundary position across different columns. For example, the change amplitude data may include a sequence composed of the change amplitudes of the upper or lower boundary positions of different columns. The change amplitude refers to the amount by which a certain boundary position changes compared with a specified object. The specified object is either a boundary position with a specified adjacency relationship or the average of all boundary positions of the same type. The specified adjacency relationship refers to an adjacency relationship between one boundary position and at least one other boundary position; for example, it may refer to a specified side of a boundary position, such as the position directly adjacent on the left.
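The change amplitude measured against the directly adjacent left column can be sketched as follows; the amplitude threshold of 5 pixels is an illustrative assumption:

```python
import numpy as np

# Lower-boundary row positions per column; the jump at column 3 mimics an
# abrupt fluctuation that should be flagged for smoothing.
lower = np.array([20, 20, 21, 35, 21, 20], dtype=float)

# Change amplitude vs. the directly adjacent column on the left.
amplitude = np.abs(np.diff(lower))

# Columns whose amplitude exceeds the (assumed) threshold are flagged.
flagged = np.where(amplitude > 5)[0] + 1
```

Columns 3 and 4 are flagged: the boundary jumps up by 14 rows and back down again, which the fluctuation analysis would smooth.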
In some embodiments, the processing device may perform the morphological transformation based on morphological operators, including but not limited to erosion, dilation, and opening and closing operations. The intima-media boundary is generally flat and smooth, but due to the influence of noise, small burrs may exist near the intima surface in the initial mask image. The initial mask image therefore needs to be processed with morphological operators, for example, erosion followed by dilation, to eliminate small burrs and smooth larger object boundaries without noticeably changing the boundary area.
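The erosion-before-dilation step (morphological opening) can be sketched as follows, with an assumed horizontal structuring element:

```python
import numpy as np
from scipy import ndimage

# A flat boundary band plus an isolated burr pixel above it.
mask = np.zeros((9, 9), dtype=bool)
mask[4, :] = True      # flat boundary band
mask[1, 4] = True      # small burr from noise

# Erosion followed by dilation (opening) with a 1x3 horizontal element:
# removes the burr while largely preserving the band.
opened = ndimage.binary_opening(mask, structure=np.ones((1, 3), dtype=bool))
```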
It should be noted that, through gradient extraction and intima-media boundary estimation, a mask image suitable for general scenes can be obtained. However, when the gradient information is not obvious (making the estimation result of the intima-media boundary too small), when the interference noise is too large (making the estimation result too large), or when holes or horizontal depressions exist in the gradient image, the estimation result of the intima-media boundary may deviate substantially from the true value, reducing the accuracy, stability, and robustness of the system.
In some embodiments of the present description, by performing the intima-media boundary fluctuation analysis, sharp increases in the boundary are suppressed and sudden decreases are moderated, ensuring a normal transition of the boundary. Morphological operators are used to optimize and adjust cases of large boundary fluctuation, so that a higher-quality estimation result of the intima-media boundary that better matches the actual situation can be obtained, the handling of other complex conditions (such as noise, plaque, etc.) is avoided, and the robustness of the system is improved.
Step 440, obtaining a target mask image of the current frame ultrasound image through iterative calculation of a preset contour optimization algorithm, based on the initial mask image of the current frame ultrasound image.
In some embodiments, the preset contour optimization algorithm may include an active contour model (Active Contour Model). In some embodiments, the preset contour optimization algorithm may include a geodesic active contour (Geodesic Active Contours, GAC) algorithm. In some embodiments, the preset contour optimization algorithm may include a morphological Snake algorithm.
In some embodiments, the processing device may obtain the target mask image of the current frame ultrasound image in a variety of ways based on the initial mask image of the current frame ultrasound image. For example, the processing device may input the initial mask image into a morphological Snake algorithm for iterative optimization to obtain the target mask image of the current frame ultrasound image.
In some embodiments, the processing device may determine the number of iterations of the morphological Snake algorithm based on the width of the initial mask image, with different widths corresponding to different numbers of iterations.
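The following is only a loose stand-in, not the morphological Snake algorithm itself: repeated opening/closing mimics its curvature-smoothing operator (the full algorithm also includes an image-attraction term), and the width-to-iterations mapping is an assumed illustration:

```python
import numpy as np
from scipy import ndimage

def morphological_smooth(mask, num_iter):
    """Simplified stand-in for one ingredient of the morphological Snake:
    alternating opening and closing smooths the contour of a binary mask."""
    out = mask.astype(bool)
    for _ in range(num_iter):
        # Default 3x3 cross structuring element.
        out = ndimage.binary_closing(ndimage.binary_opening(out))
    return out

def iterations_from_width(width, scale=0.25, low=5, high=60):
    """Tie the iteration count to the mask width; this mapping is an
    assumed illustration, not the embodiment's exact rule."""
    return int(np.clip(round(width * scale), low, high))

# A small blob with a detached spur pixel: smoothing removes the spur.
mask = np.zeros((9, 9), dtype=bool)
mask[3:6, 3:6] = True
mask[1, 4] = True
smoothed = morphological_smooth(mask, iterations_from_width(4))
```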
In some embodiments of the present disclosure, a morphological Snake algorithm is used to perform iterative contour optimization on the initial mask image; compared with the traditional numerical Snake algorithm, it has higher computational efficiency and more robust results. The mask image iteratively optimized by the morphological Snake algorithm fits the intima-media boundary more closely, yielding a target mask image of higher quality that better matches the actual situation.
Fig. 6 is an exemplary flow chart of processing a next frame of ultrasound image according to some embodiments of the present description. In some embodiments, the process 600 may be performed by a processing device of an intima-media measurement system based on an arterial ultrasound image. As shown in fig. 6, the process 600 includes the steps of:
Step 610, a next frame of ultrasound image of the current frame of ultrasound image is acquired.
The next frame of ultrasound image refers to any frame of ultrasound image acquired after the current frame of ultrasound image. For example, the next frame of ultrasound image may be the ultrasound image acquired by the ultrasound device at the next time instant after the current time instant. A preset interval separates the current time instant from the next time instant; the preset interval may be a manually preset value, a system default value, or the like.
In some embodiments, the processing device may periodically acquire ultrasound images acquired by the ultrasound device and store them in the storage device. After performing intima-media measurement on the current frame of ultrasound image, the processing device may read a next frame of ultrasound image of the current frame of ultrasound image from the storage device according to the image acquisition time. In some embodiments, the processing device may control the ultrasound device to acquire the next frame of ultrasound image in real time after performing intima-media measurement on the current frame of ultrasound image, and send the next frame of ultrasound image to the processing device for processing.
Step 620, determining a target mask image of the next frame of ultrasound image based on the target mask image of the current frame of ultrasound image and the next frame of ultrasound image.
In some embodiments, the processing device may determine the target mask image of the next frame of ultrasound image based on the target mask image of the current frame of ultrasound image by various means such as image merging (overlaying) and registration. For example, the processing device may overlay the target mask image of the current frame of ultrasound image onto the region of interest of the next frame of ultrasound image to obtain an initial mask image of the next frame of ultrasound image, and then determine the target mask image of the next frame of ultrasound image from this initial mask image in the manner described in the foregoing embodiments.
In some embodiments, the processing device may extract a first image region corresponding to the region of interest from the next frame of ultrasound image; determining an initial mask image of a next frame of ultrasonic image based on the target mask image of the current frame of ultrasonic image and the first image area; a target mask image of the next frame of ultrasound image is determined based on the initial mask image of the next frame of ultrasound image.
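These three steps can be sketched as below; the ROI tuple layout and the 3x3 dilation seed (per the preprocessing mentioned later) are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def next_initial_mask(prev_target_mask, next_frame, roi):
    """Crop the first image region from the next frame using the sampling-
    frame ROI, then seed its initial mask from the previous frame's target
    mask after a small morphological dilation (an assumed preprocessing)."""
    top, bottom, left, right = roi
    first_region = next_frame[top:bottom, left:right]
    seed = ndimage.binary_dilation(prev_target_mask > 0,
                                   structure=np.ones((3, 3), dtype=bool))
    return first_region, np.where(seed, 255, 0).astype(np.uint8)

# Toy data: a 5x5 target mask with one boundary pixel, a 20x20 next frame.
prev_mask = np.zeros((5, 5), dtype=np.uint8)
prev_mask[2, 2] = 255
next_frame = np.arange(400).reshape(20, 20)
region, init_mask = next_initial_mask(prev_mask, next_frame, (2, 7, 3, 8))
```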
The first image region refers to the image region corresponding to the region of interest in the next frame of ultrasound image. In some embodiments, the processing device may extract the first image region from the next frame of ultrasound image according to the previously determined sampling frame.
In some embodiments, the processing device may perform appropriate preprocessing on the target mask image of the current frame of ultrasound image, such as morphological dilation operation, morphological opening and closing operation, and the like, to obtain an initial mask image of the next frame of ultrasound image.
In some embodiments, the processing device may determine the target mask image of the next frame of ultrasound image based on its initial mask image in a manner similar to that of fig. 4, in which the target mask image of the current frame of ultrasound image is determined based on the initial mask image of the current frame of ultrasound image. See fig. 4 and its associated description for more details.
In some embodiments of the present disclosure, by extracting the first image region corresponding to the region of interest in the next frame of ultrasound image, registration between the region of interest and the first image region can be achieved, which helps to quickly and accurately obtain the intima-media boundary in the first image region based on the target mask image of the current frame of ultrasound image.

In some embodiments, the processing device may also determine the target mask image of the next frame of ultrasound image in a number of ways based on the target mask image of the current frame of ultrasound image and the target mask images of historical frame ultrasound images. For example, the processing device may fuse (e.g., by weighted fusion) the target mask image of the current frame ultrasound image with the target mask images of historical frame ultrasound images to determine the target mask image of the next frame of ultrasound image. A historical frame ultrasound image may be any of the one or more frames of ultrasound images on which intima-media measurement was performed before the current frame ultrasound image.
Step 630, determining the intima-media boundary of the next frame of ultrasound image based on the intima-media measurement type and the target mask image of the next frame of ultrasound image.
In some embodiments, the processing device determines the intima-media boundary of the next frame of ultrasound image based on the intima-media measurement type and the target mask image of the next frame of ultrasound image, in a manner similar to the determination of the intima-media boundary of the current frame of ultrasound image based on the intima-media measurement type and the target mask image of the current frame of ultrasound image in fig. 1. For more description, see fig. 1 and the associated description.
In some embodiments of the present disclosure, because the ultrasound device acquires images in real time, the position of the intima-media structure in adjacent frames varies only within a certain range. Determining the initial mask image of the next frame of ultrasound image from the target mask image of the current frame of ultrasound image and the first image region therefore avoids re-determining the intima-media measurement type and re-computing the gradient image for subsequent ultrasound images, improving computational efficiency and enabling real-time measurement of the intima-media thickness in ultrasound video.
It should be noted that the above description of the flows 300, 400, and 600 is for illustration and description only, and is not intended to limit the scope of applicability of the present description. Various modifications and changes to the processes 300, 400, and 600 may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
Fig. 7 is an exemplary schematic diagram of real-time measurements shown in accordance with some embodiments of the present description.
In some embodiments, the processing device may acquire a region of interest of a multi-frame ultrasound image; determine the intima-media measurement type based on the region of interest; extract an image to be measured from the current frame ultrasound image based on the region of interest; acquire a target mask image of the current frame ultrasound image based on the image to be measured; determine a target mask image of the next frame of ultrasound image based on the target mask image of the current frame ultrasound image; and determine the intima-media boundary of the multi-frame ultrasound image based on the intima-media measurement type and the target mask images of the multi-frame ultrasound image.
In some embodiments, the multi-frame ultrasound image may be a multi-frame continuous ultrasound image in an ultrasound video stream. The ultrasonic video stream can be acquired by an ultrasonic device in real time.
In some embodiments, the probe of the ultrasound device remains stationary during real-time ultrasound scanning, so the region of interest is consistent across the multi-frame ultrasound image. After the doctor determines the region of interest of the 1st frame ultrasound image through the sampling frame, the processing device can determine the region of interest of subsequently acquired ultrasound images through registration or the like. For more description of determining a region of interest, see fig. 3 and its associated description.
The intima-media boundary of the multi-frame ultrasonic image refers to the intima-media boundary corresponding to each frame of ultrasonic image in the multi-frame ultrasonic image.
In some embodiments, when acquiring the 1st frame ultrasound image, the processing device may take it as the current frame ultrasound image and determine its target mask image and intima-media measurement result (including the intima-media boundary and/or the intima-media thickness) in the manner described in some embodiments of figs. 2-6, and further determine the intima-media measurement result of the next frame of ultrasound image based on the target mask image of the 1st frame ultrasound image.
Referring to fig. 7, the processing device may determine an intima-media measurement type 730 of the 1 st frame ultrasound image based on the 1 st frame ultrasound image 711 and the region of interest 720, and extract an image 740 to be measured from the 1 st frame ultrasound image 711; acquiring an initial mask image 751 of the 1 st frame ultrasonic image based on the image to be measured 740; acquiring a target mask image 761 of the 1 st frame ultrasonic image based on the initial mask image 751 of the 1 st frame ultrasonic image; and determining an intima-media measurement 771 of the 1 st frame ultrasound image based on the target mask image 761, the region of interest 720, and/or the intima-media measurement type 730 of the 1 st frame ultrasound image. After determining the target mask image 761 of the 1 st frame of ultrasound image or after acquiring the next frame of ultrasound image 712, the processing device may determine the target mask image 762 of the next frame of ultrasound image based on the target mask image 761 of the 1 st frame of ultrasound image and determine the intima-media measurement 772 of the next frame of ultrasound image based on the target mask image 762 of the next frame of ultrasound image, the region of interest 720, and/or the intima-media measurement type 730. 
By analogy, the processing device may determine a target mask image 763 for the n-1 frame ultrasound image, and an intima-media measurement 773 for the n-1 frame ultrasound image; and after the target mask image 763 of the n-1 st frame ultrasound image is determined or the n-th frame ultrasound image 713 is acquired, determining the target mask image 764 of the n-th frame ultrasound image based on the target mask image 763 of the n-1 st frame ultrasound image and determining the intima-media measurement 774 of the n-th frame ultrasound image based on the target mask image 764 of the n-th frame ultrasound image, the region of interest 720, and/or the intima-media measurement type 730. For more relevant description, see fig. 3-6 and their associated description.
In some embodiments, during real-time intima-media measurement, the processing device may perform a time-series analysis on the intima-media measurement results of the multi-frame ultrasound image to determine an IMT time-series analysis report. In some embodiments, the IMT time-series analysis report may include the intima-media measurement results corresponding to the multiple frames of ultrasound images. In some embodiments, the IMT time-series analysis report may include a trend analysis chart of the intima-media thickness, i.e., a chart of the measured intima-media thickness values over time. As shown in fig. 7, the processing device may perform a time-series analysis on the intima-media measurement result 771 of the 1st frame ultrasound image, the intima-media measurement result 772 of the next frame ultrasound image, ..., the intima-media measurement result 773 of the n-1st frame ultrasound image, and the intima-media measurement result 774 of the n-th frame ultrasound image, and determine an IMT time-series analysis report 780.
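The assembly of per-frame IMT values into a time-series report with a linear trend can be sketched as follows; the report fields and the sample values are illustrative assumptions:

```python
import numpy as np

# Illustrative per-frame IMT values (mm) for frames 1..5.
imt_series = np.array([0.82, 0.84, 0.83, 0.85, 0.86])
frames = np.arange(1, imt_series.size + 1)

# Linear trend of IMT over frames (slope in mm per frame).
slope, intercept = np.polyfit(frames, imt_series, 1)

report = {
    "mean_imt_mm": float(imt_series.mean()),
    "max_imt_mm": float(imt_series.max()),
    "trend_mm_per_frame": float(slope),
}
```

A positive trend over many frames would surface in the trend analysis chart, which is information a single-frame measurement cannot provide.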
In some embodiments of the present disclosure, by determining the target mask image of the 1st frame ultrasound image and then determining, in real time, the target mask image and intima-media measurement result of each subsequent frame based on it, the efficiency of determining the intima-media boundary of each subsequent frame can be improved, which facilitates real-time measurement of the intima-media thickness in ultrasound video. Performing a time-series analysis on the intima-media measurement results of the multiple frames also effectively avoids large errors between the IMT measurement items and the true values when a single frame is poorly chosen.
There is also provided in one or more embodiments of the present specification an intima-media measuring device for an arterial ultrasound image, comprising processing means for performing an intima-media measuring method for an arterial ultrasound image as described in any of the embodiments above.
There is further provided in one or more embodiments of the present specification a computer readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform the intima-media measurement method for an arterial ultrasound image as described in any of the embodiments above.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations to the present disclosure may occur to one skilled in the art. Such modifications, improvements, and adaptations are suggested within this specification, and are therefore intended to be included within the spirit and scope of the exemplary embodiments of the present disclosure.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is included in at least one embodiment of the present specification. Thus, it should be emphasized and appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present specification may be combined as suitable.
Furthermore, the order in which the elements and sequences are processed, and the use of numbers, letters, or other designations in the description, are not intended to limit the order in which the processes and methods of the specification are performed unless explicitly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, they are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that in order to simplify the presentation disclosed in this specification and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.