WO2009044316A1 - System and method for real-time multi-slice acquisition and display of medical ultrasound images - Google Patents
System and method for real-time multi-slice acquisition and display of medical ultrasound images
- Publication number
- WO2009044316A1 (PCT application PCT/IB2008/053906)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- image planes
- planes
- imaging system
- interest
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/52074—Composite displays, e.g. split-screen displays; Combination of multiple images or of images and alphanumeric tabular information
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/48—Diagnostic techniques
- A61B8/483—Diagnostic techniques involving the acquisition of a 3D volume of data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52053—Display arrangements
- G01S7/52057—Cathode ray tube displays
- G01S7/5206—Two-dimensional coordinated display of distance and direction; B-scan display
- G01S7/52063—Sector scan display
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52079—Constructional features
- G01S7/52084—Constructional features related to particular user interfaces
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52085—Details related to the ultrasound signal acquisition, e.g. scan sequences
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/54—Control of the diagnostic device
- A61B8/543—Control of the diagnostic device involving acquisition triggered by a physiological signal
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8909—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
- G01S15/8915—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
- G01S15/8925—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array the array being a two-dimensional transducer configuration, i.e. matrix or orthogonal linear arrays
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/8906—Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
- G01S15/8993—Three dimensional imaging systems
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Radiology & Medical Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Human Computer Interaction (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
An ultrasound imaging system is disclosed for real-time creation and display of a number of images corresponding to parallel slices of a target region in a patient. The ultrasound imaging system includes a matrix transducer array capable of electronic beam steering. The system further includes a controller capable of driving the array to image a sequence of parallel or other selectively oriented planes located throughout the target region and of simultaneously displaying the multiple image planes acquired in real time. The system also includes a user interface capable of allowing the user to control the number, orientation, and spacing of the multiple acquired planes. During imaging, the system may repeatedly acquire a sequence of planes and display the frames from each plane in a sub-pane of a multi-slice display area, such as 2x2, 2x3, or 3x3 image sub-panes. The user may adjust the plane number, orientation, and spacing to encompass the region of interest, spanning a lesion or organ, while increasing or decreasing the number of planes and/or their scanning parameters to achieve a suitable frame rate for each sub-pane.
Description
SYSTEM AND METHOD FOR REAL-TIME MULTI-SLICE ACQUISITION AND DISPLAY OF MEDICAL ULTRASOUND IMAGES
[001] This invention relates to systems and methods for real-time capture and display of medical ultrasound images and, more particularly, to real-time capture and display of multi-slice medical ultrasound images of a volumetric region while maintaining an acceptable frame rate.
[002] Medical ultrasound imaging is increasingly being applied to image a region of the body in three dimensions. Medical ultrasound also derives much of its utility from its real-time aspect. That is, many dynamic phenomena, such as blood flow, are best viewed in real-time during an examination. Likewise, ultrasound-guided interventional procedures, such as tumor biopsy or ablation, take advantage of real-time acquisition and display of medical ultrasound images.
[003] For many such applications, it is important for the medical ultrasound system to display the real-time images with a sufficient display frame rate. With current technology, the display frame rate that medical ultrasound systems can provide is ultimately limited by the speed of sound through human tissue. That is, there is a fixed time required for an ultrasound pulse to travel into human tissue to a specific target and for the echoes to return. This fixed round-trip time sets a lower bound on the time needed to scan an entire image region and hence limits the attainable frame rate.
[004] For the majority of commercial ultrasound systems, the tissue of interest is interrogated by a series of focused, steered beams of sound that are progressively advanced through a 2D plane or 3-D volume such that the image of the plane or volume is constructed out of scan lines of echo signals acquired over time. In some cases, such as flow imaging, multiple pulses must also be fired along each scan line direction so that a Doppler estimate of motion may be computed. Imaging a larger volume of tissue or acquiring multiple samples per line for flow imaging will cause additional reduction of the frame rate that is attainable.
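The frame-rate budget implied by the two preceding paragraphs can be sketched with a few lines of arithmetic. The following Python snippet is illustrative only and is not part of the patent disclosure; the 1540 m/s speed of sound is the conventional soft-tissue value, while the depth, line count, and ensemble length are assumed example figures.

```python
# Minimal sketch (illustrative assumptions): estimating the attainable frame
# rate from the pulse round-trip time through tissue.
SPEED_OF_SOUND_M_S = 1540.0          # nominal speed of sound in soft tissue

def max_frame_rate(depth_m, lines_per_frame, pulses_per_line=1):
    """Upper bound on frame rate set by pulse round-trip time alone."""
    round_trip_s = 2.0 * depth_m / SPEED_OF_SOUND_M_S     # one pulse-echo event
    time_per_frame_s = round_trip_s * lines_per_frame * pulses_per_line
    return 1.0 / time_per_frame_s

# B mode: 15 cm depth, 128 lines, one pulse per line
print(round(max_frame_rate(0.15, 128), 1))                       # ~40.1 Hz
# Color flow: same geometry but an 8-pulse Doppler ensemble per line
print(round(max_frame_rate(0.15, 128, pulses_per_line=8), 1))    # ~5.0 Hz
```

The second figure shows why firing multiple pulses per line for Doppler estimation, as noted above, sharply reduces the attainable frame rate.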
[005] As a consequence of these constraints, 3-D volume sizes and scan line densities can be increased only to a certain point, beyond which the decreasing frame rate becomes clinically unacceptable for diagnosing flow characteristics. That is, the frame rate becomes so slow that dynamic events in the body, such as the motion of a heart valve, cannot be viewed with the necessary temporal precision. To provide acceptable real-time 3-D imaging, the line density can be reduced compared with conventional 2-D imaging. This reduction in line density compromises image quality, which may not be acceptable in applications where grayscale quantification of brightness is critical, such as contrast imaging. Another compromise is to reduce the field of view, but an unacceptable depth of field for the targeted anatomy may result.
[006] There is, therefore, typically a trade-off between the display frame rate and the display resolution over the imaged volume at that rate. Prior art medical ultrasound systems typically permit a user to trade off these constraints during a real-time examination, but the ability to do so easily when seeking adequate 2-D images during fully sampled volumetric imaging is often deficient. For example, imaging of a human liver is typically done by scanning the entire volume over a time interval sufficient to store ultrasound image data for the entire volume. Slices of the images may then be formed and inspected retrospectively by viewing 2-D slices of the 3-D volume. A popular approach is the multi-slice display layout similar to that used in CT imaging and other modalities, in which a series of regularly spaced slices are displayed simultaneously in a matrix layout, e.g., a 2x2 or 3x3 grid, analogous to slices of bread laid out on a checkerboard. Although this method ensures sufficient resolution and permits inspection of arbitrary planes within the volume, it does not permit real-time display of such slices or variation in acquisition parameters that would produce optimal images.
[007] Prior art medical ultrasound imaging systems thus force the user to choose from among high frame-rate 2-D imaging, low line-density real-time 3-D imaging, or retrospective visualization of 3-D slice "snapshots." There is therefore a need for a medical ultrasound imaging system that permits trading off image quality, frame rate, and volume size for better diagnostic-quality imaging.
[008] In accordance with the principles of the present invention, an ultrasonic imaging system is provided which acquires a plurality of real-time slice images of a region of the body. The desired coverage of the region of interest is achieved by enabling user control of scanning parameters such as the number of image planes being acquired, their orientation, and their spacing. Scanning parameters such as the number of planes, the temporal resolution, and the line density of the slice images are adjustable by the user to achieve the desired real-time frame rate.
[009] In the drawings:
[010] Figure 1 is a block diagram of an ultrasound imaging system according to one example of the invention.
[011] Figure 2 is a flow chart of a real-time multi-slice image acquisition method in accordance with an embodiment of the invention.
[012] Figure 3 is an ultrasound imaging system user interface according to one embodiment of the invention.
[013] Embodiments of the invention enable a user to image a region of interest by 2-D multi-slice imaging and permit trade-offs of scanning parameters to achieve the desired real-time temporal resolution. It is recognized that in some clinical contexts in which imaging throughout a volume is desired, maintaining an adequate real-time frame rate for motion (tissue or flow) or adequate spatial resolution is more essential than fully sampling a large region of the volume. Fully electronic matrix transducer technology makes it possible for an imaging array to step among image planes spaced throughout a volume without the mechanical translation delay inherent in mechanically scanned 3-D probes. Slice sequences may typically be spaced over elevation angle, azimuth angle, or rotation angle. Embodiments of the invention utilize matrix arrays in which the plane locations may be spaced across the aperture to provide truly parallel acquisition planes, or planes of another chosen relative orientation, spaced as needed in elevation or azimuth. Acquiring planes in this manner allows, for example, the maintenance of a constant flow angle to a vessel and thus a consistent presentation among the slice planes.
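As an illustrative sketch of the plane spacing described above (not code from the patent), the helper below generates a centered set of per-plane positions; the same arithmetic applies whether the spacing is read as an elevation offset in millimetres for parallel planes or as a tilt angle in degrees for a fan of planes. The function name and numeric values are assumptions.

```python
# Illustrative sketch only: centered spacing of slice planes for electronic
# stepping of a matrix array among multiple image planes.
def plane_positions(num_planes, spacing):
    """Return a list of per-plane offsets (or tilts) centered on zero."""
    center = (num_planes - 1) / 2.0
    return [(i - center) * spacing for i in range(num_planes)]

print([round(p, 2) for p in plane_positions(4, 1.6)])   # e.g. [-2.4, -0.8, 0.8, 2.4] mm offsets
print(plane_positions(5, 10.0))                         # e.g. [-20.0, -10.0, 0.0, 10.0, 20.0] deg tilts
```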
[014] Unlike retrospective multi-slice display, in which parallel post-scan-converted planes are reconstructed and displayed using fixed, previously acquired image data, embodiments of the invention display image planes in real time as they are acquired. Only those planes that are selected by the user are scanned and displayed. Embodiments of the invention create the ability to image dynamic behavior, such as blood flow, contrast kinetics, or an interventional procedure, simultaneously in multiple planes throughout a tumor or organ. The ability to retrospectively inspect any arbitrary slice through the volume is preserved by saving the image data of a fully scanned volume for later post-processing. For imaging and visualization that is inherently qualitative, such as color power angio ('CPA') or intervention guidance, this presents an acceptable trade-off. Embodiments of the invention provide the additional benefit of accelerating some workflows by allowing a survey of an entire region or lesion to be quickly acquired and documented.
[015] Acquiring multiple planes throughout a volume will lower the frame rate of each acquired plane relative to conventional 2-D imaging by a factor equal to the number of slices being scanned. An embodiment of the present invention enables acquisition parameters such as the number of planes and the inter-plane spacing to be traded off so that an acceptable 2-D frame rate is achieved. In other cases, embodiments of the invention may provide the user with more flexibility to trade off image quality and frame rate while providing more dynamic clinical information than static volumetric imaging. Also, if many planes are displayed in a matrix, each individual image would be smaller on the display and not benefit from a high line density, and thus it may be acceptable to lower the line density in each plane and recover the necessary frame rate of display.
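A minimal sketch of this trade-off follows, under the stated assumption that the per-plane frame rate scales inversely with the number of slices and with line density; the baseline rate and the candidate densities are illustrative figures, not values from the patent.

```python
# Minimal sketch (assumed numbers): recovering an acceptable per-plane frame
# rate by lowering the line density when several slices are scanned.
def per_plane_rate(single_plane_rate_hz, num_planes, line_density=1.0):
    """Frame rate seen in each sub-pane; line_density scales acquisition time."""
    return single_plane_rate_hz / (num_planes * line_density)

def densities_meeting_target(single_plane_rate_hz, num_planes, target_hz,
                             candidates=(1.0, 0.75, 0.5)):
    """Return the candidate line-density settings that still meet the target rate."""
    return [d for d in candidates
            if per_plane_rate(single_plane_rate_hz, num_planes, d) >= target_hz]

# 40 Hz single-plane B-mode rate, 4 slices, 15 Hz target per sub-pane
print(densities_meeting_target(40.0, 4, 15.0))   # -> [0.5]: half density recovers the rate
```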
[016] Referring first to Figure 1, one embodiment of an ultrasound imaging system includes a transducer array 10a for transmitting ultrasonic waves and receiving echo signals. In this example the array shown is a two dimensional matrix array of transducer elements capable of scanning planes in a 3D volume through electronic beam steering and focusing over a volumetric region. The transducer array is coupled to a microbeamformer 12a which controls transmission and reception of signals by the matrix array elements. The microbeamformer is also capable of at least partial beamforming of the signals received by groups or "patches" of transducer elements as described in US Pats. 5,997,479 (Savord et al.), 6,013,032 (Savord), and 6,623,432 (Powers et al.), which are incorporated herein by reference. The transmission of ultrasonic beams from the transducer array 10a and the processing of received echo signals by the microbeamformer 12a and a main beamformer 116 are under control of a beamform controller 18, which receives input from the user's operation of the user interface or control panel 38. By use of the user interface the user can control parameters such as focal depth, beam steering for spectral Doppler, the number of image slices, line density, the image slice location and inter-slice spacing, and other scanning variables.
[017] The partially beamformed signals produced by the microbeamformer 12a are coupled to the main beamformer 116 where partially beamformed signals from the individual patches of elements are combined into a fully beamformed signal. For example, the main beamformer 116 may have 128 channels, each of which receives a partially beamformed signal from a patch of 12 transducer elements. In this way the signals received by over 1500 transducer elements of a two dimensional matrix array can contribute efficiently to a single beamformed signal. The beamformed echo signals are processed by a quadrature bandpass (QBP) filter 62. QBP filters are commonly used in ultrasound systems to filter received echo signals, produce I and Q quadrature signal components for Doppler and coherent image processing and provide sampling decimation. QBP filters are generally described in, for example, US Pat. 6,050,942 (Rust et al.), which is incorporated by reference herein. The beamformed signals are coupled to a signal processor 24 where they may undergo additional enhancement such as speckle removal, signal compounding, harmonic separation, filtering, multiline interpolation and processing, and noise elimination.
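The two-stage beamforming described above can be sketched as follows. This is an illustrative toy rather than the patented microbeamformer: patch-level delays are omitted, the main-beamformer delays are arbitrary, and the array sizes simply mirror the 128-channel, 12-element-per-patch example.

```python
# Illustrative sketch of patch-based ("microbeamformer") partial beamforming
# followed by a main beamformer that sums the patch channels coherently.
import numpy as np

rng = np.random.default_rng(0)
num_patches, elements_per_patch, num_samples = 128, 12, 256   # ~1536 elements in total
element_rf = rng.standard_normal((num_patches, elements_per_patch, num_samples))

# Stage 1: microbeamformer - sum within each patch (patch delays omitted here)
patch_signals = element_rf.sum(axis=1)                  # shape (128, 256)

# Stage 2: main beamformer - per-channel integer delays, then coherent sum
channel_delays = rng.integers(0, 8, size=num_patches)   # toy delays, in samples
aligned = np.stack([np.roll(patch_signals[c], -channel_delays[c])
                    for c in range(num_patches)])
beamformed_line = aligned.sum(axis=0)                   # one fully beamformed A-line
print(beamformed_line.shape)                            # (256,)
```

In a real system the QBP filter would then convert such beamformed lines into I and Q quadrature components; that step is assumed, not shown.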
[018] The processed signals are coupled to a B mode processor 26 and a Doppler processor 28. The B mode processor 26 employs amplitude detection for the imaging of structures in the body such as muscle, tissue, and blood cells. B mode images of structure of the body may be formed in either the harmonic mode or the fundamental mode. Tissues in the body and microbubble contrast agents both return signals of both types, and the harmonic returns of microbubbles enable microbubbles to be clearly segmented in an image in most applications. The Doppler processor processes temporally distinct signals from moving tissue and blood flow for the detection of motion of substances in the image field including blood cells, tissue, and microbubbles. The Doppler processor operates on ensembles of temporally distinct samples from each location in the slice being imaged to produce an estimate of Doppler power, velocity, acceleration, or variance at each location in the image plane. The Doppler processor may also operate on long ensembles from a sample volume location to produce a spectral Doppler display of velocity variation at the sample volume location as described in US Pat. 5,287,753 (Routh et al.), which is incorporated by reference herein. Different transmit signals may be used for B mode and Doppler returns, or the same signals may be used by both processors as described in US Pat. 6,139,501 (Roundhill et al.), which is incorporated by reference herein. The Doppler processor 28 can operate on the I,Q quadrature data typically produced by the QBP filter 62, and the B mode processor 26 can operate on the same data by performing amplitude detection in the form of (I² + Q²)^(1/2). The structural B mode signals are mapped to corresponding grayscale levels by the processor 26, which converts the B mode signals to a range of grayscale values. The flow signals from the Doppler processor 28 are mapped to different colors or hues by the processor 28, which similarly converts the flow signals to a range of color values. When the flow signals are velocity-related signals for color flow imaging, the range of color values corresponds to a range of flow velocities, for instance. Other Doppler modes such as power Doppler, acceleration, and variance may be used if desired. The mapping processes may implement grayscale and color ranges selected by the user and may be constructed as lookup tables. The Doppler processor 28 also includes a wall filter which passes relatively low velocity values when tissue motion is being imaged, and relatively high velocity values when blood flow is being imaged.
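A minimal sketch of the two detections described above: envelope (B mode) detection as (I² + Q²)^(1/2), and a mean-velocity estimate from an I/Q ensemble using the standard lag-one autocorrelation (Kasai) method, which is one conventional way of forming the Doppler estimates mentioned here. The PRF, transmit frequency, and synthetic ensemble are assumed example values.

```python
# Illustrative sketch of B-mode envelope detection and a Kasai autocorrelation
# Doppler velocity estimate operating on I/Q ensemble data.
import numpy as np

def b_mode_amplitude(iq):
    """Envelope detection on complex I/Q data: (I^2 + Q^2)^(1/2)."""
    return np.abs(iq)

def kasai_velocity(iq_ensemble, prf_hz, f0_hz, c_m_s=1540.0):
    """Mean axial velocity from the phase of the lag-one autocorrelation."""
    r1 = np.mean(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]))   # lag-1 autocorrelation
    mean_doppler_hz = np.angle(r1) * prf_hz / (2 * np.pi)
    return mean_doppler_hz * c_m_s / (2 * f0_hz)                # Doppler equation

# Synthetic 8-sample ensemble with a 500 Hz Doppler shift, PRF = 4 kHz, f0 = 3 MHz
n = np.arange(8)
ensemble = np.exp(2j * np.pi * 500.0 * n / 4000.0)
print(round(kasai_velocity(ensemble, 4000.0, 3e6), 4))   # ~0.1283 m/s
```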
[019] When harmonic imaging is being performed, the beamformed signals undergo fundamental/harmonic signal separation as previously mentioned. This separation acts to separate linear and nonlinear signals so as to enable the identification of the strongly nonlinear echo signals returned from microbubbles. Harmonic signal separation may be accomplished in a variety of ways, such as by bandpass filtering the received signals in fundamental frequency and harmonic frequency bands, or by a process known as pulse inversion harmonic separation. A suitable fundamental/harmonic signal separator is shown and described in international patent publication WO 2005/074805 (Bruce et al.), which is incorporated by reference herein. B mode images of structure of the body may be formed in either the harmonic mode or the fundamental mode or a combination of both as described in US Pat. 6,283,919 (Roundhill et al.) and US Pat. 6,458,083 (Jago et al.), which are incorporated by reference herein.
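Pulse-inversion harmonic separation, mentioned above, can be illustrated with a toy echo model (an assumption, not the patented separator): echoes from a normal and a polarity-inverted transmit are summed, cancelling the linear fundamental and leaving the even-harmonic component associated with microbubbles.

```python
# Illustrative sketch of pulse-inversion harmonic separation using a toy
# echo model with a linear fundamental and a small second-harmonic term.
import numpy as np

t = np.linspace(0.0, 4e-6, 400)
f0 = 2e6                                    # fundamental transmit frequency (assumed)

def echo(polarity):
    """Toy echo: the linear term follows transmit polarity, the 2nd harmonic does not."""
    linear = polarity * np.sin(2 * np.pi * f0 * t)
    second_harmonic = 0.2 * np.sin(2 * np.pi * 2 * f0 * t)
    return linear + second_harmonic

pi_sum = echo(+1) + echo(-1)                # pulse-inversion sum: fundamental cancels
print(round(float(np.max(np.abs(pi_sum))), 2))   # ~0.4, i.e. only the harmonic survives
```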
[020] The structural (tissue) and motion signals produced by the B mode and Doppler processors 26 and 28 are coupled to a scan converter 32 and a volume renderer 34, which produce image data of tissue structure, flow, or a combined image of both characteristics. The scan converter will convert echo signals with polar coordinates into image signals of the desired image format such as a sector image in Cartesian coordinates. The volume renderer 34 will convert a 3D data set into a projected 3D image as viewed from a given reference point as described in US Pat. 6,530,885 (Entrekin et al.), which is incorporated by reference herein. As described therein, when the reference point of the rendering is changed the 3D image can appear to rotate in what is known as a kinetic parallax display. This image manipulation is controlled by the user as indicated by the Display Control line between the user interface 38 and the volume renderer 34. Also described is the representation of a 3D volume by planar images of different image planes of the volume, a technique known as multiplanar reformatting, which can be used to retrospectively form and image selected image planes during post-processing. The volume renderer 34 can operate on image data in either rectilinear or polar coordinates as described in US Pat. 6,723,050 (Dow et al.), which is incorporated by reference herein. The 2D or 3D images are coupled from the scan converter 32 and the volume renderer 34 to an image processor 30 for further enhancement, buffering and temporary storage for display on an image display 40.
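A minimal sketch of the polar-to-Cartesian scan conversion described above, using nearest-neighbour lookup for brevity (a practical scan converter would typically interpolate); the sector geometry and grid sizes are assumptions.

```python
# Illustrative sketch: resampling echo data from a polar (range, angle) grid
# onto a Cartesian pixel grid to form a sector image.
import numpy as np

num_ranges, num_angles = 200, 64
polar = np.random.rand(num_ranges, num_angles)            # beamformed sector data
max_range = 0.15                                          # metres (assumed depth)
angles = np.linspace(-np.pi / 4, np.pi / 4, num_angles)   # +/- 45 degree sector

nx, nz = 256, 256
x = np.linspace(-max_range, max_range, nx)
z = np.linspace(0.0, max_range, nz)
xx, zz = np.meshgrid(x, z)
r = np.hypot(xx, zz)                                      # radius of each pixel
th = np.arctan2(xx, zz)                                   # angle from the array normal

ri = np.clip((r / max_range * (num_ranges - 1)).astype(int), 0, num_ranges - 1)
ti = np.clip(((th - angles[0]) / (angles[-1] - angles[0]) * (num_angles - 1)).astype(int),
             0, num_angles - 1)
image = np.where((r <= max_range) & (np.abs(th) <= np.pi / 4), polar[ri, ti], 0.0)
print(image.shape)                                        # (256, 256) Cartesian image
```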
[021] A graphics processor 36, which generates graphic overlays for display with the ultrasound images, is also coupled to the image processor 30. These graphic overlays can contain standard identifying information such as patient name, date and time of the image, imaging parameter settings, and the like. For these purposes the graphics processor 36 is coupled to receive input from the user interface 38. The user can also operate the user interface to store images in an image store 52, which may comprise an optical disk, hard drive, or other storage media. The user interface 38 may further be used to operate a QLab processor 50, which performs quantitative post-processing and image analysis of acquired and/or stored images.
[022] For most cardiology applications, the ultrasound display 40 will also preferably show an ECG trace drawn in response to reception of an R-wave signal. The R-wave is the electrical physiological signal produced to stimulate the heart's contraction, and is conventionally detected by an electrocardiograph (ECG). Figure 1 shows a set of ECG electrodes 180 which may be affixed to the chest of a patient to detect the R-wave signal. The signal is detected and processed by an ECG signal processor 182 and coupled to the graphics processor 36, which displays the ECG waveform in synchronism with a scrolling spectral Doppler display and/or an anatomical Doppler image. The ECG signal may also be coupled to the beamform controller 18 for the gating of image acquisition at desired phases of the heart cycle.
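The R-wave gating of acquisition mentioned above can be sketched as follows; the threshold detector, sampling rate, and trigger delay are illustrative assumptions rather than details from the patent.

```python
# Illustrative sketch: detect R-waves by thresholding a synthetic ECG, then
# schedule acquisition at a fixed delay after each R-wave (gated acquisition).
import numpy as np

fs = 500.0                                    # ECG sampling rate in Hz (assumed)
t = np.arange(0.0, 5.0, 1.0 / fs)
ecg = np.zeros_like(t)
ecg[(t % 0.8) < (1.0 / fs)] = 1.0             # synthetic R-waves roughly every 0.8 s

def r_wave_times(signal, sample_rate, threshold=0.5, refractory_s=0.3):
    """Times of threshold crossings, ignoring crossings inside a refractory period."""
    times, last = [], -refractory_s
    for i, v in enumerate(signal):
        ts = i / sample_rate
        if v >= threshold and ts - last >= refractory_s:
            times.append(ts)
            last = ts
    return times

trigger_delay_s = 0.2                         # acquire this long after each R-wave (assumed)
triggers = [rt + trigger_delay_s for rt in r_wave_times(ecg, fs)]
print([round(x, 2) for x in triggers[:3]])    # approximately [0.2, 1.0, 1.8]
```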
[023] Figure 2 is a flow chart depicting a real-time multi-slice acquisition and processing method 200 in accordance with an embodiment of the invention. The process depicted outlines typical steps a sonographer or other clinician may use to acquire and display multiple image slices of anatomy of interest in real time. The process 200 begins at step 210, where the sonographer obtains an initial scan of the patient's anatomy, preferably a 3D volumetric image. At step 220, this scan is used to produce an image of the volumetric region of interest in which the desired slice images are to be located. Within this region of interest, the sonographer may further select the number, orientation, and spacing of multiple parallel planes that are to be scanned. In some applications a parallel orientation is not desired, such as when a fan-shaped orientation of planes (differing plane tilts) or some other relative orientation of multiple planes is needed. At step 230, the sonographer selects a multi-slice display format size. This size may be, for example, 2x2, 2x3, or 3x3 image planes. The size of the display format determines the number of sub-panes available for displaying images of the scanned planes. Thus, in each of these example sizes, the system would display 4, 6, or 9 sub-panes, respectively. A larger format, of course, causes a larger number of planes to be acquired. The size of the display format may be adjusted to suit the required frame rate. For instance, a smaller format with fewer images would permit a greater sampling rate of each image plane and hence a higher frame rate of display.
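A small sketch relating the display format chosen at step 230 to the number of sub-panes and to the per-pane frame rate, under the simplifying assumption that only the displayed planes are scanned and that a single plane could otherwise be scanned at roughly 40 Hz (an assumed figure, not one from the patent).

```python
# Illustrative sketch: mapping the multi-slice layout to sub-pane count and to
# an estimated per-pane frame rate.
LAYOUTS = {"2x2": (2, 2), "2x3": (2, 3), "3x3": (3, 3)}

def sub_panes(layout):
    rows, cols = LAYOUTS[layout]
    return rows * cols

def per_pane_frame_rate(layout, single_plane_rate_hz=40.0):
    """Scanning only the displayed planes divides the single-plane rate by the pane count."""
    return single_plane_rate_hz / sub_panes(layout)

for layout in LAYOUTS:
    print(layout, sub_panes(layout), round(per_pane_frame_rate(layout), 1))
# 2x2 -> 4 panes, 10.0 Hz  |  2x3 -> 6 panes, 6.7 Hz  |  3x3 -> 9 panes, 4.4 Hz
```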
[024] After the display format size has been selected, the ultrasound system acquires echo data from each of the planes at step 240 as required to produce an image at step 250 from such data and to display the image in the appropriate sub-pane at step 260. Any plane that is not to be displayed in the selected format will not be scanned. This improves the overall frame rate. As the examination continues, it may be desirable to change the number of planes and/or their spacing so as to improve the frame rate of display or more precisely capture anatomy of interest at step 270. For instance, if a specific point in the volumetric region of interest contains the anatomy which is to be examined, such as a heart valve in a volumetric region of the heart, only a few image planes could be positioned at the heart valve location. If the temporal resolution of the heart valve images is unacceptable, the number of planes scanned or the line density can be reduced to improve the frame rate and the temporal resolution of the planar images.
[025] The method of Figure 2 will now be explained with reference to Figure 3 which is taken from a constructed embodiment of the present invention. Figure 3 is an ultrasound imaging system screen display 300 according to one embodiment of the invention. The screen display as shown can be presented after the steps through step 260 of Figure 2 have been completed and the volume of interest is available for further slice selection and refinement through the user control panel 38. Orienting image planes of the volumetric region of interest can be seen in the display regions 305-315 on the right side of the screen display 300.
[026] The display regions 305-315 represent planar views along each of the axes in the volumetric region of interest, which are usually presented to the user in a "preview" mode during which acquisition parameters are selected. The display region 315 is a planar view taken through one axis of the volumetric region. In the constructed embodiment the image of the display region 315 is outlined in blue, and there are vertical and horizontal lines intersecting in the middle of the image which are colored red and green. These lines represent the relative plane locations of the other two orienting images 305 and 310 of the region of interest, one of which is outlined in red and the other in green. These three images of the mutually orthogonal image planes are commonly used in the art to orient the user in a volumetric region being scanned. With these orienting images, it is relatively easy for the user to orient herself and position planes within the volumetric region of interest. The display region 315 also contains a set of numbers along the left edge of its bounding box. These numbers correspond to the number of planes, or slices, located above and below the center slice of the image. The locations of these slices are represented by parallel dashed lines positioned over the image of display region 315, as the slices are orthogonal to the plane of the image 315. Some or all of the planar images represented by the dashed lines are displayed in the multi-slice display region 347 of the screen display, as selected by the user.
In this example the four central images represented by the four center dashed lines over the image 315 have been selected by the user for display in the multi-slice display region 347. These image planes are orthogonal to the plane of the image 315 as previously mentioned, and in this example the multi- slice image planes are parallel to the image plane of orienting image 305.
[027] The slices 349-355 that are shown in the multi-slice display region 347 are determined by a combination of user inputs from the softkey controls shown at the bottom of the display screen 300, as selected by user manipulation of a pointing device such as a trackball or mouse of the control panel 38. The axis select softkey 345 of the display screen determines which of the axes is used for acquiring the parallel slices. In this example, the softkey 345 is set to '1', which is associated with the axis corresponding to the image plane of display region 305, as shown in the lower right hand corner of the display region 305. The Depth control 339 determines the location of the primary display pane of the multi-slice display matrix 347 in depth, elevation, or azimuth. The primary display pane is, in this example, display pane 353. This can be appreciated because the Depth control 339 is set to '36mm' and the display pane 353 is displaying a slice depth of '36mm', as can be seen in the lower left corner of the display pane 353. The relative positions of the planes of the images of the remaining panes in the multi-slice display 347 are chosen based on the setting of the Interval control 341. In this example, the inter-slice interval is set to '1.6mm.' As can be seen in the display panes 349-355, the slices in each display pane differ in depth from one another by this 1.6mm inter-slice spacing. Lastly, the multi-slice display format size is determined through the layout setting 343, as described in step 230 of Figure 2. Clicking on the layout setting 343 incrementally changes the format size of the multi-slice display. In this example, a 2x2 format is selected and, therefore, four display panes arranged 2x2 are used to form the multi-slice display 347. The described combination of inputs with the softkeys at the bottom of the screen 300 thus serves to define the slices that are displayed in the multi-slice display 347.
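The way the Depth and Interval softkeys define the displayed slices can be sketched directly from the values in this example (36 mm primary depth, 1.6 mm interval, 2x2 layout); the pane ordering and the index of the primary pane are assumptions made for illustration.

```python
# Illustrative sketch: slice depths implied by the Depth and Interval settings
# for a 2x2 multi-slice layout.
def slice_depths(primary_depth_mm, interval_mm, num_panes, primary_index):
    """Depth of the slice shown in each pane, offset from the primary pane."""
    return [primary_depth_mm + (i - primary_index) * interval_mm
            for i in range(num_panes)]

# 4 panes, with the primary 36 mm slice assumed to sit in the third pane
print([round(d, 1) for d in slice_depths(36.0, 1.6, num_panes=4, primary_index=2)])
# -> [32.8, 34.4, 36.0, 37.6]
```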
[028] A number of other scanning parameters can also be adjusted to achieve the desired balance between the temporal and spatial resolution of the multi-slice images. One is the lateral dimension of the slice images, which in the illustrated embodiment may be adjusted by clicking on the appropriate arrow of the Sector Width control 357. Increasing the width of the image increases the scanning time required to acquire each image and correspondingly reduces the frame rate of display. Narrowing the width increases the frame rate but may exclude some of the region of interest, so an acceptable balance should be obtained. Another scanning parameter which affects frame rate is the line density, which in this embodiment can be changed by clicking on the Density control 359. Each time the user clicks on this control, the Density setting toggles among low, medium and high line density. Decreasing the line density decreases the slice acquisition time and hence increases the frame rate of display. Other controls which may be adjusted to affect the frame rate are Doppler settings such as the size of the color box (the window in which flow is estimated) and the length of the sample ensemble (the number of samples) used to estimate flow velocity. A smaller color box and a shorter ensemble will each improve the frame rate. There may also be subjective controls which affect these parameters, such as a "resolution/speed" control which varies several parameters simultaneously to adjust the balance of temporal and spatial resolution.
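A first-order model makes this trade-off concrete. The sketch below assumes one transmit event per image line, a line time fixed by the imaging depth and the speed of sound in tissue, and additional ensemble transmits for every color-Doppler line inside the color box; the formula and all numeric values are illustrative assumptions rather than figures from this disclosure.

```python
# Illustrative first-order frame-rate model: wider sectors, higher line density,
# more slices, a larger color box and a longer Doppler ensemble all add transmit
# events per frame and therefore lower the displayed frame rate.

SPEED_OF_SOUND_M_S = 1540.0  # nominal value for soft tissue

def frame_rate_hz(depth_cm: float,
                  sector_width_deg: float,
                  lines_per_degree: float,
                  num_slices: int,
                  color_box_deg: float = 0.0,
                  ensemble_length: int = 0) -> float:
    """Estimate the displayed frame rate of a multi-slice acquisition."""
    line_time_s = 2.0 * (depth_cm / 100.0) / SPEED_OF_SOUND_M_S  # round trip per line
    bmode_lines = sector_width_deg * lines_per_degree
    color_lines = color_box_deg * lines_per_degree * ensemble_length
    return 1.0 / (num_slices * (bmode_lines + color_lines) * line_time_s)

if __name__ == "__main__":
    print(f"{frame_rate_hz(12, 75, 2.0, 4):.1f} Hz  (B-mode only)")
    print(f"{frame_rate_hz(12, 60, 1.5, 4):.1f} Hz  (narrower sector, lower density)")
    print(f"{frame_rate_hz(12, 75, 2.0, 4, color_box_deg=30, ensemble_length=8):.1f} Hz  (with color Doppler)")
```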
[029] Once the slices have been defined by these and other parameters, the ultrasound imaging system repeatedly acquires echo data along each of the slices, producing an image for each slice and displaying it in the appropriate display pane 349-355 of the multi-slice display 347. Embodiments of the invention then continuously update the display panes in real time, reflecting any changes in the number of slices selected by the Slice control 337, the depth setting of the Depth control 339, and the slice interval of the Interval control 341. In this manner the user can trade off image quality against frame rate according to the clinical needs at the time of the examination.
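The continuous-update behaviour can be pictured as a loop that re-reads the control settings on every pass, so that a change to the Slice, Depth or Interval control affects the next set of acquisitions. The Controls dataclass and the scanner, display and keep_running objects below are hypothetical stand-ins; the sketch illustrates the order of operations, not the system's actual software interface.

```python
# Sketch of a real-time multi-slice update loop with hypothetical component
# interfaces; settings are re-read each pass so user changes take effect
# immediately on the following acquisitions.

from dataclasses import dataclass

@dataclass
class Controls:
    num_slices: int = 4
    primary_depth_mm: float = 36.0
    interval_mm: float = 1.6

def acquisition_loop(controls: Controls, scanner, display, keep_running) -> None:
    """Continuously acquire and display one image per selected slice plane."""
    while keep_running():
        # Re-read the softkey settings so that changes to slice count, depth or
        # interval are reflected in the very next set of acquisitions.
        depths = [controls.primary_depth_mm + i * controls.interval_mm
                  for i in range(controls.num_slices)]
        for pane, depth_mm in enumerate(depths):
            echoes = scanner.acquire_slice(depth_mm)  # transmit/receive one slice plane
            image = scanner.form_image(echoes)        # beamform and scan convert
            display.update_pane(pane, image)          # paint the corresponding pane
```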
[030] Although the invention has been described with reference to the disclosed examples, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention. Such modifications may be well within the skill of those ordinarily skilled in the art. Accordingly, the invention is not limited except as by the appended claims.
Claims
1. A method of displaying a plurality of images on an ultrasound imaging system display, comprising:
specifying a volume of interest;
choosing a number of image planes of a desired relative orientation within the volume of interest;
repeatedly transmitting ultrasound pulses down a line of sight corresponding to each of the image planes;
receiving echoes of the transmitted ultrasound pulses from along each line of sight;
processing the echoes to create the plurality of images corresponding to the image planes; and
displaying the plurality of images on the ultrasound imaging system display.
2. The method of claim 1 further comprising updating the display of the plurality of images as the echoes from the repeated transmissions of ultrasound pulses are received and processed and according to changes in the number of image planes.
3. The method of claim 1 wherein choosing a number of image planes within the volume of interest comprises choosing the number, orientation and spacing of parallel planes.
4. The method of claim 1 wherein transmitting ultrasound pulses down a line of sight corresponding to each of the image planes comprises using a matrix array transducer to transmit ultrasound pulses down a line of sight corresponding to each of the image planes using electronic beam steering.
5. The method of claim 1 wherein specifying a volume of interest comprises specifying a volume of interest that spans an anatomical region of interest.
6. The method of claim 5 wherein the anatomical region of interest comprises a tissue or organ.
7. The method of claim 1 wherein the image planes are parallel planes which are equally spaced within the volume of interest.
8. An ultrasound imaging system comprising:
a display;
a processor coupled to the display;
a user interface coupled to the processor;
a transducer coupled to the processor and operable to transmit ultrasound pulses down a plurality of lines of sight corresponding to each of a number of image planes within a volume of interest; and
a display controller operatively connected to the processor, the display controller providing a user the ability to create a plurality of images corresponding to each of the image planes, the display controller being operable to display a plurality of images corresponding to respective repetitive ultrasound scans along the image planes through the volume of interest.
9. The ultrasound imaging system of claim 8 wherein the display controller is further operable to update the display of the plurality of images as the echoes from the repetitive ultrasound scans are received and processed.
10. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying the number of image planes to scan.
11. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying the spacing between the image planes.
12. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying the orientation of the image planes relative to the volume of interest.
13. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying the line density of the image planes.
14. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying the lateral width of the image planes.
15. The ultrasound imaging system of claim 8 wherein the display controller is further operable to accept user input specifying Doppler parameters of the image planes.
16. The ultrasound imaging system of claim 15 wherein the Doppler parameters include at least one of color box size and Doppler ensemble length.
17. The ultrasound imaging system of claim 8 wherein the transducer comprises a matrix array transducer operable to transmit ultrasound pulses using electronic beam steering.
18. The ultrasound imaging system of claim 8 wherein the image planes are parallel planes equally spaced within the volume of interest.
19. A system for presenting ultrasound data in real time comprising:
a matrix array transducer operable to repeatedly emit ultrasonic waves along a number of image planes of a target area within a subject, receive reflected waves from the target area, and generate output signals corresponding to the reflected waves;
a processor coupled to the transducer and operable to receive the output signals and generate echo data based on the output signals, the processor further operable to generate a plurality of images corresponding to the image planes based on the echo data; and
a display device coupled to the processor and operable to display the plurality of images simultaneously in real time.
20. The system of claim 19, further comprising a user input device coupled to the processor, wherein the input device provides adjustment of at least one of: the number, orientation and spacing of the image planes.
21. The ultrasound imaging system of claim 15 wherein the image planes are parallel planes equally spaced within the volume of interest and images of the planes are displayed simultaneously and adjacent to each other.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US97725107P | 2007-10-03 | 2007-10-03 | |
| US60/977,251 | 2007-10-03 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009044316A1 (en) | 2009-04-09 |
Family
ID=40260473
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2008/053906 WO2009044316A1 (en), Ceased | System and method for real-time multi-slice acquisition and display of medical ultrasound images | 2007-10-03 | 2008-09-25 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2009044316A1 (en) |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030018264A1 (en) * | 2001-06-19 | 2003-01-23 | Yoichi Suzuki | Ultrasonic imaging apparatus |
| WO2004106969A2 (en) * | 2003-05-30 | 2004-12-09 | Koninklijke Philips Electronics N.V. | Biplane ultrasonic imaging system |
| US20050049494A1 (en) * | 2003-08-29 | 2005-03-03 | Arthur Gritzky | Method and apparatus for presenting multiple enhanced images |
| US20050283078A1 (en) * | 2004-06-22 | 2005-12-22 | Steen Eric N | Method and apparatus for real time ultrasound multi-plane imaging |
| US20060004291A1 (en) * | 2004-06-22 | 2006-01-05 | Andreas Heimdal | Methods and apparatus for visualization of quantitative data on a model |
| DE102007018454A1 (en) * | 2006-04-20 | 2007-10-25 | General Electric Co. | System and method for automatically obtaining ultrasound image planes based on patient-specific data |
Cited By (48)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2433567A4 (en) * | 2009-05-20 | 2013-10-16 | Hitachi Medical Corp | DEVICE FOR DIAGNOSING MEDICAL IMAGES AND ASSOCIATED METHOD OF DETERMINING A ZONE INVESTIGATED |
| EP2281510A1 (en) * | 2009-07-30 | 2011-02-09 | Medison Co., Ltd. | Providing a plurality of slice images in an ultrasound system |
| WO2011055312A1 (en) * | 2009-11-06 | 2011-05-12 | Koninklijke Philips Electronics N.V. | Quantification results in multiplane imaging |
| CN102596054A (en) * | 2009-11-06 | 2012-07-18 | 皇家飞利浦电子股份有限公司 | Quantification results in multiplane imaging |
| WO2012003369A3 (en) * | 2010-06-30 | 2013-01-10 | Muffin Incorporated | Percutaneous, ultrasound-guided introduction of medical devices |
| US12383223B2 (en) | 2010-06-30 | 2025-08-12 | Muffin Incorporated | Percutaneous, ultrasound-guided introduction of medical devices |
| US9437043B2 (en) | 2010-07-30 | 2016-09-06 | Koninklijke Philips Electronics N.V. | Display and export of individual biplane images |
| CN103037773A (en) * | 2010-07-30 | 2013-04-10 | 皇家飞利浦电子股份有限公司 | Display and export of individual biplane images |
| CN103037774A (en) * | 2010-07-30 | 2013-04-10 | 皇家飞利浦电子股份有限公司 | Automated sweep and export of 2d ultrasound images of 3d volumes |
| WO2012014125A1 (en) * | 2010-07-30 | 2012-02-02 | Koninklijke Philips Electronics N.V. | Automated sweep and export of 2d ultrasound images of 3d volumes |
| CN103037773B (en) * | 2010-07-30 | 2016-08-24 | 皇家飞利浦电子股份有限公司 | Display and output of individual biplane images |
| RU2577938C2 (en) * | 2010-07-30 | 2016-03-20 | Конинклейке Филипс Электроникс Н.В. | Automated scanning and export of two-dimensional ultrasonic images of three-dimensional volumes |
| US10610198B2 (en) | 2010-07-30 | 2020-04-07 | Koninklijke Philips N.V. | Automated sweep and export of 2D ultrasound images of 3D volumes |
| WO2012014120A1 (en) * | 2010-07-30 | 2012-02-02 | Koninklijke Philips Electronics N.V. | Display and export of individual biplane images |
| CN103037774B (en) * | 2010-07-30 | 2015-11-25 | 皇家飞利浦电子股份有限公司 | Automatically scan and export the 2D ultrasonoscopy of 3D volume |
| WO2012066470A1 (en) * | 2010-11-19 | 2012-05-24 | Koninklijke Philips Electronics N.V. | A method for guiding the insertion of a surgical instrument with three dimensional ultrasonic imaging |
| CN103220981B (en) * | 2010-11-19 | 2016-05-18 | 皇家飞利浦电子股份有限公司 | Utilize the method for the insertion of 3-D supersonic imaging guided surgery instrument |
| US10624607B2 (en) | 2010-11-19 | 2020-04-21 | Koninklijke Philips N.V. | Method for guiding the insertion of a surgical instrument with three dimensional ultrasonic imaging |
| RU2627596C2 (en) * | 2010-11-19 | 2017-08-09 | Конинклейке Филипс Электроникс Н.В. | Method of controlling the introduction of surgical instrument through three-dimensional ultrasound visualization |
| US20130229504A1 (en) * | 2010-11-19 | 2013-09-05 | Koninklijke Philips Electronics N.V. | Three dimensional ultrasonic guidance of surgical instruments |
| CN103220981A (en) * | 2010-11-19 | 2013-07-24 | 皇家飞利浦电子股份有限公司 | A method for guiding the insertion of a surgical instrument with three dimensional ultrasonic imaging |
| WO2013068883A1 (en) * | 2011-11-10 | 2013-05-16 | Koninklijke Philips Electronics N.V. | Improving large volume three-dimensional ultrasound imaging |
| US20140018708A1 (en) * | 2012-07-16 | 2014-01-16 | Mirabilis Medica, Inc. | Human Interface and Device for Ultrasound Guided Treatment |
| CN104619263A (en) * | 2012-07-16 | 2015-05-13 | 米瑞碧利斯医疗公司 | Human interface and device for ultrasound guided treatment |
| US9675819B2 (en) * | 2012-07-16 | 2017-06-13 | Mirabilis Medica, Inc. | Human interface and device for ultrasound guided treatment |
| WO2014014965A1 (en) * | 2012-07-16 | 2014-01-23 | Mirabilis Medica, Inc. | Human interface and device for ultrasound guided treatment |
| US9492638B2 (en) | 2012-11-01 | 2016-11-15 | Muffin Incorporated | Implements for identifying sheath migration |
| US9305347B2 (en) | 2013-02-13 | 2016-04-05 | Dental Imaging Technologies Corporation | Automatic volumetric image inspection |
| CN103702032A (en) * | 2013-12-31 | 2014-04-02 | 华为技术有限公司 | Image processing method, device and terminal equipment |
| US9730675B2 (en) | 2014-01-27 | 2017-08-15 | Koninklijke Philips N.V. | Ultrasound imaging system and an ultrasound imaging method |
| WO2015114484A1 (en) * | 2014-01-28 | 2015-08-06 | Koninklijke Philips N.V. | Ultrasound systems for multi-plane acquisition with single- or bi-plane real-time imaging, and methods of operation thereof |
| US11382602B2 (en) | 2014-01-28 | 2022-07-12 | Koninklijke Philips N.V. | Ultrasound systems for multi-plane acquisition with single- or bi-plane real-time imaging, and methods of operation thereof |
| US10405835B2 (en) | 2014-01-28 | 2019-09-10 | Koninklijke Philips N.V. | Ultrasound systems for multi-plane acquisition with single- or bi-plane real-time imaging, and methods of operation thereof |
| EP2989984A1 (en) * | 2014-08-25 | 2016-03-02 | Samsung Medison Co., Ltd. | Ultrasonic imaging apparatus and control method thereof |
| US11083434B2 (en) | 2014-08-25 | 2021-08-10 | Samsung Medison Co., Ltd. | Ultrasonic imaging apparatus and control method thereof |
| CN109310399A (en) * | 2016-06-06 | 2019-02-05 | 皇家飞利浦有限公司 | Medical Ultrasound Image Processing Equipment |
| WO2017211774A1 (en) * | 2016-06-06 | 2017-12-14 | Koninklijke Philips N.V. | Medical ultrasound image processing device |
| CN109310399B (en) * | 2016-06-06 | 2022-12-06 | 皇家飞利浦有限公司 | Medical Ultrasound Image Processing Equipment |
| US11266380B2 (en) | 2016-06-06 | 2022-03-08 | Koninklijke Philips N.V. | Medical ultrasound image processing device |
| CN110636799B (en) * | 2017-03-16 | 2024-07-05 | 皇家飞利浦有限公司 | Optimal scan plane selection for organ viewing |
| US11696745B2 (en) | 2017-03-16 | 2023-07-11 | Koninklijke Philips N.V. | Optimal scan plane selection for organ viewing |
| CN110636799A (en) * | 2017-03-16 | 2019-12-31 | 皇家飞利浦有限公司 | Optimal scan plane selection for organ viewing |
| CN111372520B (en) * | 2017-10-16 | 2023-07-14 | 皇家飞利浦有限公司 | Ultrasound imaging systems and methods |
| CN111372520A (en) * | 2017-10-16 | 2020-07-03 | 皇家飞利浦有限公司 | Ultrasound imaging systems and methods |
| CN111970974A (en) * | 2018-02-26 | 2020-11-20 | 皇家飞利浦有限公司 | Providing three-dimensional ultrasound images |
| US20250041628A1 (en) * | 2021-12-07 | 2025-02-06 | Sonire Therapeutics Inc. | Display device for hifu therapy |
| EP4461230A1 (en) * | 2023-05-07 | 2024-11-13 | Pulsify Medical | Ultrasound system |
| WO2024231329A1 (en) * | 2023-05-07 | 2024-11-14 | Pulsify Medical | Ultrasound system |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| WO2009044316A1 (en) | System and method for real-time multi-slice acquisition and display of medical ultrasound images | |
| US6951543B2 (en) | Automatic setup system and method for ultrasound imaging systems | |
| US5865750A (en) | Method and apparatus for enhancing segmentation in three-dimensional ultrasound imaging | |
| JP3878343B2 (en) | 3D ultrasonic diagnostic equipment | |
| EP1697759B1 (en) | Ultrasonic diagnostic imaging method and system with an automatic control of resolution and frame rate | |
| US6210328B1 (en) | Ultrasonic diagnostic imaging system with variable spatial compounding | |
| EP1501419B1 (en) | Contrast-agent enhanced color-flow imaging | |
| JP4444108B2 (en) | Ultrasound diagnostic system with elevation biplane image | |
| US10194888B2 (en) | Continuously oriented enhanced ultrasound imaging of a sub-volume | |
| US20090203996A1 (en) | Ultrasound-imaging systems and methods for a user-guided three-dimensional volume-scan sequence | |
| US8425422B2 (en) | Adaptive volume rendering for ultrasound color flow diagnostic imaging | |
| US10682122B2 (en) | Image-based user interface for controlling medical imaging | |
| US9204862B2 (en) | Method and apparatus for performing ultrasound elevation compounding | |
| EP3537980B1 (en) | Triple mode ultrasound imaging for anatomical, functional, and hemodynamical imaging | |
| US8394027B2 (en) | Multi-plane/multi-slice processing for 2-D flow imaging in medical diagnostic ultrasound | |
| JP4413909B2 (en) | 3D ultrasonic diagnostic equipment | |
| JP2021509835A (en) | Ultrasound imaging system with tissue-specific presets for diagnostic testing | |
| US11607194B2 (en) | Ultrasound imaging system with depth-dependent transmit focus | |
| JP2007513727A (en) | Ultrasound diagnostic contrast image by spatial synthesis | |
| JP5627171B2 (en) | Ultrasonic diagnostic equipment | |
| US20070073152A1 (en) | Systems and methods for acquiring images simultaneously | |
| EP1402284A1 (en) | Ultrasonic diagnostic system for selectively developing ultrasound diagnostic data | |
| Kremkau et al. | General principles of echocardiography | |
| Badano et al. | Three-dimensional echocardiography | |
| Jakovljevic et al. | Clinical realization of short-lag spatial coherence imaging on 2D arrays |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08807802; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 08807802; Country of ref document: EP; Kind code of ref document: A1 |