
CN115317128A - Ablation simulation method and device - Google Patents

Info

Publication number
CN115317128A
Authority
CN
China
Prior art keywords
simulated
ablation
needle
space
position information
Prior art date
Legal status
Pending
Application number
CN202211091789.7A
Other languages
Chinese (zh)
Inventor
黄光亮
张晓儿
谢晓燕
于开欣
韩晓涛
丛龙飞
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
First Affiliated Hospital of Sun Yat Sen University
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
First Affiliated Hospital of Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd and First Affiliated Hospital of Sun Yat Sen University
Publication of CN115317128A

Classifications

    All classifications fall under section A (HUMAN NECESSITIES), class A61 (MEDICAL OR VETERINARY SCIENCE; HYGIENE), subclass A61B (DIAGNOSIS; SURGERY; IDENTIFICATION):
    • A61B 34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20 - Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 18/12 - Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by heating, by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B 18/14 - Probes or electrodes therefor
    • A61B 2018/00577 - Ablation (for achieving a particular surgical effect)
    • A61B 2018/1405 - Electrodes having a specific shape
    • A61B 2034/101 - Computer-aided simulation of surgical operations
    • A61B 2034/108 - Computer aided selection or customisation of medical implants or cutting guides
    • A61B 2034/2063 - Acoustic tracking systems, e.g. using ultrasound
    • A61B 2034/2065 - Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Otolaryngology (AREA)
  • Plasma & Fusion (AREA)
  • Robotics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

Embodiments of the present invention provide an ablation simulation method and device, including: transmitting ultrasonic waves to a target tissue containing a lesion through an ultrasonic probe, and generating an ultrasound image of the target tissue from the received ultrasonic echo data; acquiring three-dimensional image data of the target tissue; determining the spatial transformation matrix between the ultrasound image space in which the ultrasound image is located and the three-dimensional image space in which the three-dimensional image data is located; determining the position information of the simulated ablation needle and of the simulated lesion in a target space according to the spatial transformation matrix, the target space being either the ultrasound image space or the three-dimensional image space; and calculating and displaying, in real time, the relative position between the simulated ablation needle and the simulated lesion according to the two pieces of position information. The relative position between the simulated ablation needle and the simulated lesion enables quantitative evaluation of the needle insertion position and intuitively guides the doctor in inserting the needle.

(Figure of application 202211091789)

Description

Ablation simulation method and device
Technical Field
Embodiments of the invention relate to the technical field of medical ultrasound, and in particular to an ablation simulation method and device.
Background
A tumor is a new growth formed by the proliferation of local tissue cells. Tumors readily give rise to a variety of diseases, so their treatment is of major clinical significance. With the continuous development of medical technology, tumor ablation treatments that destroy the tumor directly by physical means such as radiofrequency, microwave, freezing, high-frequency electrocautery, laser and high-energy focused ultrasound offer high efficacy, minimal invasiveness and rapid postoperative recovery, and are therefore widely used in clinical practice.
At present, whether in the preoperative planning stage of tumor ablation or during the operation itself, tumor ablation is usually simulated by means of an ultrasound imaging device: the tumor and the ablation range of the ablation needle are simulated, and the proportion of the simulated lesion covered by the simulated ablation range is calculated to guide the doctor in performing the ablation. The insertion position of the ablation needle directly determines the ablation effect. The coverage proportion quantitatively evaluates the lesion ablation effect, but it cannot intuitively guide the doctor in inserting the needle.
Disclosure of Invention
Embodiments of the invention provide an ablation simulation method and device for quantitatively evaluating the insertion position of the ablation needle, so as to intuitively guide the doctor in inserting the needle.
In a first aspect, an embodiment of the present invention provides an ablation simulation method, including:
transmitting ultrasonic waves to a target tissue containing a lesion through an ultrasonic probe, and generating an ultrasound image of the target tissue from the received ultrasonic echo data;
acquiring three-dimensional image data of a target tissue;
determining a spatial transformation matrix between an ultrasonic image space in which an ultrasonic image is located and a three-dimensional image space in which three-dimensional image data is located;
determining position information of a simulated ablation needle and position information of a simulated focus in a target space according to the spatial transformation matrix, wherein the target space is an ultrasonic image space or a three-dimensional image space;
and calculating and displaying the relative position between the simulated ablation needle and the simulated focus in real time according to the position information of the simulated ablation needle and the position information of the simulated focus.
In one embodiment, determining a spatial transformation matrix between an ultrasound image space in which an ultrasound image is located and a three-dimensional image space in which three-dimensional image data is located includes:
acquiring a first transformation matrix of an ultrasonic image space to a position sensor space;
acquiring a second transformation matrix from the space of the position sensor to the space of world coordinates;
determining a third transformation matrix from the world coordinate space to the three-dimensional image space;
and determining a spatial transformation matrix according to the first transformation matrix, the second transformation matrix and the third transformation matrix.
In one embodiment, determining a third transformation matrix from world coordinate space to three-dimensional image space comprises:
registering the ultrasonic image and the three-dimensional image data based on the in-vitro marker or based on the image information to obtain a registration matrix;
a third transformation matrix is determined from the registration matrix.
In one embodiment, if there are a plurality of simulated ablation needles and they are all of the same type, determining the position information of the simulated ablation needles includes: determining position information of the center position of the plurality of simulated ablation needles. If there are a plurality of simulated ablation needles and their types are not all the same, determining the position information of the simulated ablation needles includes: determining the ablation center of the plurality of simulated ablation needles according to the simulated ablation range of each simulated ablation needle, and determining position information of that ablation center.
In one embodiment, determining position information of a simulated ablation needle and position information of a simulated lesion center in a target space according to a spatial transformation matrix comprises:
acquiring a needle inserting angle and a needle inserting depth of a simulated ablation needle;
determining position information of the simulated ablation needle in an ultrasonic image space according to the needle inserting angle and the needle inserting depth;
acquiring position information of a simulated focus center in a three-dimensional image space;
when the target space is the ultrasonic image space, the position information of the simulated focus center in the three-dimensional image space is converted into the position information of the simulated focus center in the ultrasonic image space according to the space transformation matrix,
or,
and when the target space is a three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasonic image space into the position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.
In one embodiment, acquiring three-dimensional image data of a target tissue includes:
acquiring at least one of computed tomography data, magnetic resonance imaging data, and three-dimensional ultrasound data of a target tissue,
or,
three-dimensional image data of the target tissue is acquired by performing three-dimensional reconstruction on an ultrasonic image of the target tissue acquired by an ultrasonic probe.
In one embodiment, the method further comprises:
fusing and displaying the ultrasonic image and the three-dimensional image data;
and performing three-dimensional visual display on the simulated ablation needle and the simulated focus in the three-dimensional image displayed in a fusion mode.
In one embodiment, a three-dimensional visualization of a simulated ablation needle and a simulated lesion includes:
performing three-dimensional visual display on the simulated ablation needle and the simulated focus from a target angle through a window, wherein the target angle is determined and/or adjusted according to operation information input by a user;
or,
and performing three-dimensional visual display on the simulated ablation needle and the simulated lesion from different angles through a plurality of windows, wherein the displayed angle in each window is determined and/or adjusted according to the operation information input by the user.
In one embodiment, the method further comprises:
determining a simulated ablation range of the simulated ablation needle;
and calculating and displaying the proportion of the simulated lesion covered by the simulated ablation range in real time according to the position information and the simulated ablation range of the simulated ablation needle and the position information and the size of the simulated lesion.
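To make the coverage calculation concrete, the following is a minimal sketch in Python/NumPy under assumptions not stated in this embodiment: the simulated lesion and the simulated ablation range are rasterized as boolean voxel masks on a common grid (here simple spheres), and the displayed proportion is the fraction of lesion voxels that fall inside the ablation mask. All names and numeric values are illustrative.

    import numpy as np

    def sphere_mask(shape, center, radius, spacing=1.0):
        """Boolean voxel mask of a sphere; a stand-in for any rasterized region."""
        zz, yy, xx = np.indices(shape)
        dist = np.sqrt((zz - center[0]) ** 2 +
                       (yy - center[1]) ** 2 +
                       (xx - center[2]) ** 2) * spacing
        return dist <= radius

    def coverage_ratio(lesion_mask, ablation_mask):
        """Fraction of simulated-lesion voxels covered by the simulated ablation range."""
        lesion_voxels = lesion_mask.sum()
        if lesion_voxels == 0:
            return 0.0
        return float(np.logical_and(lesion_mask, ablation_mask).sum()) / float(lesion_voxels)

    # Illustrative values only: a 15 mm lesion and a 20 mm ablation sphere offset by 5 voxels.
    shape = (64, 64, 64)
    lesion = sphere_mask(shape, center=(32, 32, 32), radius=15)
    ablation = sphere_mask(shape, center=(32, 32, 37), radius=20)
    print(f"covered proportion: {coverage_ratio(lesion, ablation):.1%}")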
In one embodiment, determining a simulated ablation range for a simulated ablation needle comprises:
acquiring ablation parameters of a simulated ablation needle, wherein the ablation parameters comprise an energy radiation range and a heat source position;
and determining the simulated ablation range of the simulated ablation needle according to the ablation parameters.
In one embodiment, if there are a plurality of ablation simulation needles, determining a simulated ablation range of the ablation simulation needle according to the ablation parameters includes:
determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle to obtain a plurality of simulated energy distributions corresponding to a plurality of simulated ablation needles;
determining a combined energy value in a combined simulated energy distribution range corresponding to the plurality of simulated ablation needles based on the plurality of simulated energy distributions;
determining an energy distribution area with a combined energy value larger than a first energy threshold value from the combined simulation energy distribution range;
and determining simulated ablation ranges corresponding to the plurality of simulated ablation needles according to the energy distribution areas with the combined energy value larger than the first energy threshold value.
In one embodiment, determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle comprises:
determining a three-dimensional coordinate point in the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating the energy value of the three-dimensional coordinate point and determining a target three-dimensional coordinate point with the energy value larger than a second energy threshold value;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
and determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy value of the target three-dimensional coordinate point.
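The per-needle energy model is not specified above, so the following Python/NumPy sketch assumes an illustrative quadratic fall-off from each heat source purely to demonstrate the sequence of steps: each per-needle distribution is kept only where it exceeds the second energy threshold, the distributions are summed into the combined distribution, and the combined values are thresholded at the first energy threshold. The function names, grid and threshold values are assumptions.

    import numpy as np

    def needle_energy(grid_pts, heat_source, radiation_radius, peak_energy=1.0):
        """Illustrative energy model: peak at the heat source, quadratic fall-off,
        zero outside the needle's energy radiation range."""
        d = np.linalg.norm(grid_pts - heat_source, axis=-1)
        return peak_energy * np.clip(1.0 - (d / radiation_radius) ** 2, 0.0, None)

    def combined_ablation_range(grid_pts, needles, first_threshold, second_threshold):
        """Per-needle distribution -> per-needle ablation range -> combined distribution
        -> combined simulated ablation range, following the steps of this embodiment."""
        per_needle = []
        for heat_source, radius in needles:
            e = needle_energy(grid_pts, np.asarray(heat_source), radius)
            e[e <= second_threshold] = 0.0           # per-needle simulated ablation range
            per_needle.append(e)
        combined = np.sum(per_needle, axis=0)        # combined simulated energy distribution
        return combined > first_threshold            # combined simulated ablation range

    # Illustrative grid and two needles (positions and radii are made-up values).
    axis = np.arange(0, 40.0, 1.0)
    grid_pts = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
    needles = [((18.0, 20.0, 20.0), 10.0), ((24.0, 20.0, 20.0), 8.0)]
    mask = combined_ablation_range(grid_pts, needles, first_threshold=0.5, second_threshold=0.1)
    print("voxels in combined simulated ablation range:", int(mask.sum()))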
In one embodiment, the method further comprises:
the distance between the boundary of the simulated ablation volume and the boundary of the at-risk tissue is determined and displayed on the display interface.
In one embodiment, the method further comprises:
when the simulated ablation range intersects with the dangerous tissue, warning information is output, and the warning information is used for prompting a user to touch the dangerous tissue.
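A minimal Python/NumPy sketch of these two embodiments, assuming the simulated ablation range and the at-risk tissue are available as voxel masks on a common grid; the brute-force distance search and all names are illustrative only (a KD-tree would be preferable for large masks).

    import numpy as np

    def min_boundary_distance(ablation_mask, risk_mask, spacing=1.0):
        """Minimum distance (in mm, given voxel spacing) between the simulated
        ablation range and the at-risk tissue; 0.0 means they intersect."""
        if np.logical_and(ablation_mask, risk_mask).any():
            return 0.0
        a = np.argwhere(ablation_mask).astype(float)
        b = np.argwhere(risk_mask).astype(float)
        # Pairwise distances between the two voxel sets; their minimum equals the
        # minimum boundary-to-boundary distance when the regions are disjoint.
        d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
        return float(d.min()) * spacing

    def check_risk(ablation_mask, risk_mask, spacing=1.0):
        dist = min_boundary_distance(ablation_mask, risk_mask, spacing)
        if dist == 0.0:
            print("WARNING: simulated ablation range touches the at-risk tissue")
        else:
            print(f"distance to at-risk tissue boundary: {dist:.1f} mm")
        return dist

    # Tiny illustrative masks: two separated 3-voxel cubes on a 20^3 grid (1 mm voxels).
    ablation = np.zeros((20, 20, 20), dtype=bool); ablation[2:5, 2:5, 2:5] = True
    risk = np.zeros((20, 20, 20), dtype=bool); risk[10:13, 2:5, 2:5] = True
    check_risk(ablation, risk, spacing=1.0)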
In a second aspect, an embodiment of the present invention provides an ultrasound imaging apparatus, including:
an ultrasonic probe;
the transmitting circuit is used for outputting the corresponding transmitting sequence to the ultrasonic probe according to a set mode so as to control the ultrasonic probe to transmit the corresponding ultrasonic wave;
the receiving circuit is used for receiving the ultrasonic echo signal output by the ultrasonic probe and outputting ultrasonic echo data;
a display for outputting visual information;
a processor for performing the ablation simulation method of any of the first aspects.
In a third aspect, embodiments of the present invention provide a computer-readable storage medium, in which computer-executable instructions are stored, and when executed by a processor, the computer-executable instructions are used to implement the ablation simulation method according to any one of the first aspect.
According to the ablation simulation method and device provided by the embodiments of the invention, an ultrasound image of the target tissue is generated and three-dimensional image data of the target tissue is acquired; the spatial transformation matrix between the ultrasound image space in which the ultrasound image is located and the three-dimensional image space in which the three-dimensional image data is located is then determined; the position information of the simulated ablation needle and of the simulated lesion is determined in the same target space according to the spatial transformation matrix; and finally the relative position between the simulated ablation needle and the simulated lesion is calculated and displayed in real time according to that position information, so that the needle insertion position is quantitatively evaluated and the doctor can be intuitively guided in inserting the needle.
Drawings
Fig. 1 is a block diagram of an ultrasound imaging apparatus according to an embodiment of the present invention;
FIG. 2 is a flow chart of an ablation simulation method provided in accordance with an embodiment of the present invention;
FIGS. 3A and 3B are schematic diagrams of an interface provided by an embodiment of the present invention;
FIG. 4 is a flowchart of determining a spatial transformation matrix according to an embodiment of the present invention;
FIG. 5 is a flow chart of an ablation simulation method according to yet another embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating a cube vertex determination according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of 15 basic cubes according to an embodiment of the present invention;
fig. 8 is a flowchart of an ablation simulation method according to another embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments have been given like element numbers associated therewith. In the following description, numerous specific details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in this specification in order not to obscure the core of the present application with unnecessary detail, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Also, the various steps or actions in the description of the methods may be exchanged or transposed in order, as will be apparent to a person skilled in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and are not intended to imply a required sequence unless otherwise indicated where such sequence must be followed.
The numbering of the components as such, e.g., "first", "second", etc., is used herein only to distinguish the objects as described, and does not have any sequential or technical meaning. The term "connected" and "coupled" as used herein includes both direct and indirect connections (couplings), unless otherwise specified.
As shown in fig. 1, the ultrasound imaging apparatus provided by the present invention may include: an ultrasound probe 20, a transmit/receive circuit 30 (i.e., a transmit circuit 310 and a receive circuit 320), a beam-forming module 40, an IQ demodulation module 50, a memory 60, a processor 70, and a human-computer interaction device. The processor 70 may include a control module 710 and an image processing module 720.
The ultrasonic probe 20 includes a transducer (not shown) composed of a plurality of array elements arranged in an array, the plurality of array elements are arranged in a row to form a linear array, or are arranged in a two-dimensional matrix to form an area array, and the plurality of array elements may also form a convex array. The array elements are used for emitting ultrasonic beams according to the excitation electric signals or converting the received ultrasonic beams into electric signals. Each array element can thus be used to perform a mutual transformation of the electrical impulse signal and the ultrasound beam, thereby performing an emission of ultrasound waves into a target region of human tissue (e.g. target tissue containing a lesion, such as a tumor in this embodiment) and also to receive echoes of ultrasound waves reflected back through the tissue. In performing ultrasonic detection, which array elements are used for transmitting ultrasonic beams and which array elements are used for receiving ultrasonic beams can be controlled by the transmitting circuit 310 and the receiving circuit 320, or the time slots of the array elements are controlled for transmitting ultrasonic beams or receiving echoes of ultrasonic beams. The array elements participating in ultrasonic wave transmission can be simultaneously excited by the electric signals, so that the ultrasonic waves are transmitted simultaneously; or the array elements participating in the ultrasonic wave transmission can be excited by a plurality of electric signals with certain time intervals, so that the ultrasonic waves with certain time intervals are continuously transmitted.
In this embodiment, the user selects an appropriate position and angle by moving the ultrasound probe 20 to transmit ultrasound waves to the target tissue 10 including the lesion and receive echoes of the ultrasound waves returned by the target tissue 10 including the lesion, and obtains and outputs electrical signals of the echoes, where the electrical signals of the echoes are channel analog electrical signals formed by using the receiving array elements as channels, and carry amplitude information, frequency information, and time information.
The transmitting circuit 310 is configured to generate a transmitting sequence according to the control of the control module 710 of the processor 70, where the transmitting sequence is configured to control some or all of the plurality of array elements to transmit ultrasonic waves to the biological tissue, and parameters of the transmitting sequence include the position of the array element for transmission, the number of array elements, and ultrasonic beam transmitting parameters (e.g., amplitude, frequency, number of transmissions, transmitting interval, transmitting angle, wave pattern, focusing position, etc.). In some cases, the transmit circuitry 310 is further configured to phase delay the transmitted beams to cause different transmit elements to transmit ultrasound at different times so that each transmitted ultrasound beam can be focused at a predetermined region of interest. In different operating modes, such as a B image mode, a C image mode, and a D image mode (doppler mode), the parameters of the transmitted sequence may be different, and the echo signals received by the receiving circuit 320 and processed by subsequent modules and corresponding algorithms may generate a B image reflecting the tissue anatomy, a C image reflecting the tissue anatomy and blood flow information, and a D image reflecting the doppler spectrum image.
The receiving circuit 320 is used for receiving the electrical signal of the ultrasonic echo from the ultrasonic probe 20 and processing the electrical signal of the ultrasonic echo. The receive circuit 320 may include one or more amplifiers, analog-to-digital converters (ADCs), and the like. The amplifier is used for amplifying the electric signal of the received ultrasonic echo after proper gain compensation, the analog-to-digital converter is used for sampling the analog echo signal according to a preset time interval so as to convert the analog echo signal into a digitized signal, and the digitized echo signal still retains amplitude information, frequency information and phase information. The data output from the receiving circuit 320 may be output to the beam forming module 40 for processing or may be output to the memory 60 for storage.
The beamforming module 40 is connected to the receiving circuit 320 and performs beamforming processing, such as delaying and weighted summation, on the signals output by the receiving circuit 320. Because the distances from an ultrasonic receiving point in the examined tissue to the different receiving array elements differ, the channel data for the same receiving point output by different receiving array elements have a delay difference; delay processing is therefore applied to align the phases, and the different channel data for the same receiving point are then weighted and summed to obtain the beamformed ultrasound image data. The ultrasound image data output by the beamforming module 40 is also called radio frequency data (RF data). The beamforming module 40 outputs the radio frequency data to the IQ demodulation module 50. In some embodiments, the beamforming module 40 may also output the RF data to the memory 60 for buffering or saving, or directly output it to the image processing module 720 of the processor 70 for image processing.
The beamforming module 40 may perform the above functions in hardware, firmware, or software; for example, it may include a central processing unit (CPU), one or more microprocessor chips, or any other electronic components capable of processing input data according to specific logic instructions. When implemented in software, it may execute instructions stored on a tangible and non-transitory computer-readable medium (e.g., memory 60) to perform the beamforming calculations using any suitable beamforming method.
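The delay-and-sum operation described above can be illustrated with a minimal Python/NumPy sketch; the single-point receive focusing, the two-way delay model and the linear interpolation are illustrative assumptions, not the device's actual beamforming implementation.

    import numpy as np

    def delay_and_sum(channel_data, element_x, focus_point, fs, c=1540.0, weights=None):
        """channel_data: (n_elements, n_samples) echo signals from the receive channels.
        element_x: lateral positions of the receive elements (m).
        focus_point: (x, z) receive focus in the image plane (m).
        Returns one beamformed sample for that point."""
        n_elements, n_samples = channel_data.shape
        if weights is None:
            weights = np.ones(n_elements)            # apodization weights
        x_f, z_f = focus_point
        # Two-way time of flight: transmit to the point plus return to each element.
        t = (z_f + np.sqrt((element_x - x_f) ** 2 + z_f ** 2)) / c
        idx = t * fs
        lo = np.clip(np.floor(idx).astype(int), 0, n_samples - 2)
        frac = idx - lo
        # Linear interpolation of each channel at its own delay, then weighted sum.
        rows = np.arange(n_elements)
        samples = (1 - frac) * channel_data[rows, lo] + frac * channel_data[rows, lo + 1]
        return float(np.sum(weights * samples))

    # Illustrative call with random data for a 64-element linear array (0.3 mm pitch).
    rng = np.random.default_rng(0)
    data = rng.standard_normal((64, 2048))
    elems = (np.arange(64) - 31.5) * 0.3e-3
    print(delay_and_sum(data, elems, focus_point=(0.0, 0.03), fs=40e6))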
The IQ demodulation module 50 removes the signal carrier by IQ demodulation, extracts the tissue structure information contained in the signal, and performs filtering to remove noise; the signal obtained at this point is called the baseband signal (IQ data pairs). The IQ demodulation module 50 outputs the IQ data pairs to the image processing module 720 of the processor 70 for image processing. In some embodiments, the IQ demodulation module 50 may also output the IQ data pairs to the memory 60 for buffering or saving, so that the image processing module 720 can read the data from the memory 60 for subsequent image processing.
The processor 70 may be a central processing unit (CPU), one or more microprocessors, a graphics processing unit (GPU), or any other electronic component capable of processing input data according to specific logic instructions. It may control peripheral electronic components according to input or predetermined instructions, read data from and/or save data to the memory 60, or process the input data by executing programs in the memory 60, for example performing one or more processing operations on the acquired ultrasound data according to one or more operating modes. These processing operations include, but are not limited to, adjusting or defining the form of the ultrasound waves emitted by the ultrasound probe 20, generating various image frames for display by the display 80 of the subsequent human-computer interaction device, adjusting or defining the content and form displayed on the display 80, and adjusting one or more image display settings (e.g., ultrasound images, interface components, positioning regions) displayed on the display 80.
The image processing module 720 is used to process the data output by the beam synthesis module 40 or the data output by the IQ demodulation module 50 to generate a gray-scale image of signal intensity variation within the scanning range, which reflects the anatomical structure inside the tissue, and is called B image. The image processing module 720 may output the B image to the display 80 of the human-computer interaction device for display.
The human-computer interaction device is used for human-computer interaction, i.e., receiving user input and outputting visual information. User input may be received through a keyboard, operating buttons, a mouse, a trackball, a touch screen integrated with the display, and the like; visual information is output using the display 80.
The memory 60 may be a tangible and non-transitory computer readable medium, such as a flash memory card, solid state memory, hard disk, etc., for storing data or programs, e.g., the memory 60 may be used to store acquired ultrasound data or temporarily not immediately displayed image frames generated by the processor 70, or the memory 60 may store a graphical user interface, one or more default image display settings, programming instructions for the processor, the beam-forming module, or the IQ decoding module.
It should be noted that the structure shown in fig. 1 is merely illustrative, and may include more or fewer components than those shown in fig. 1, or have a different configuration than that shown in fig. 1. The components shown in fig. 1 may be implemented in hardware and/or software. The ultrasound imaging device shown in fig. 1 may be used to perform the ablation simulation method provided by any of the embodiments of the present invention.
Referring to fig. 2, an ablation simulation method according to an embodiment of the present invention may include:
S201, transmitting ultrasonic waves to the target tissue containing the lesion through an ultrasonic probe, and generating an ultrasound image of the target tissue from the received ultrasonic echo data.
When performing ablation simulation with ultrasound, whether in the preoperative planning stage or the surgical stage, an ultrasound image of the target tissue needs to be acquired first. The target tissue in this embodiment is a tissue containing a lesion to be ablated, such as a tumor. Specifically, ultrasonic echo data may be generated by moving an ultrasonic probe of the ultrasonic imaging apparatus at an appropriate position, transmitting ultrasonic waves at an appropriate angle to target tissue, and receiving ultrasonic echoes returned from the target tissue, and then generating an ultrasonic image of the target tissue from the ultrasonic echo data.
S202, three-dimensional image data of the target tissue is obtained.
In this embodiment, the three-dimensional image data of the target tissue stored in advance may be acquired from a storage device, or it may be acquired directly by an acquisition device. The three-dimensional image data of the target tissue in this embodiment may be any one or more of computed tomography (CT) data, magnetic resonance imaging (MRI) data, and three-dimensional ultrasound (3D-US) data. For example, CT data of the target tissue may be acquired for the ablation simulation, or CT data and MRI data of the target tissue may both be acquired for the ablation simulation.
In an alternative embodiment, a 3D reconstruction technique may be further used to perform 3D reconstruction on the two-dimensional image data of the target tissue to obtain three-dimensional image data of the target tissue. Specifically, three-dimensional ultrasound image data of the target tissue may be acquired by three-dimensionally reconstructing a two-dimensional ultrasound image of the target tissue acquired by the ultrasound probe. The target tissue can be 3D reconstructed, for example, by an in situ Freehand scan of a section of an ultrasound movie with localization information to obtain 3D-US data of the target tissue.
It should be noted that this embodiment does not limit the execution order of S201 and S202; the three-dimensional image data of the target tissue may also be acquired first, followed by the two-dimensional ultrasound image of the target tissue.
S203, determining a space transformation matrix between the ultrasonic image space where the ultrasonic image is located and the three-dimensional image space where the three-dimensional image data is located.
The ultrasound image is located in an ultrasound image space and the three-dimensional image data is located in a three-dimensional image space, which may be converted to the same space by a spatial transformation matrix although they are located in different image spaces. The spatial transformation matrix may be a matrix capable of transforming data in the ultrasound image space to data in the three-dimensional image space, or may be a matrix capable of transforming data in the three-dimensional image space to data in the ultrasound image space, according to actual transformation requirements.
S204, determining the position information of the simulated ablation needle and the position information of the simulated focus in a target space according to the space transformation matrix, wherein the target space is an ultrasonic image space or a three-dimensional image space.
When the ultrasonic imaging equipment is used for ablation simulation, the puncture frame can be arranged on the ultrasonic probe and used for fixing the ablation needle. The position information of the simulated ablation needle in the ultrasonic image space can be acquired according to the information provided by the positioning sensor arranged on the ultrasonic probe. Optionally, the position information of the simulated ablation needle in the ultrasound image space can be acquired according to the information provided by the positioning sensor arranged at the tail part of the ablation needle. A doctor usually marks the position of a lesion in three-dimensional image data of a target tissue, and may obtain position information of a simulated lesion in a three-dimensional image space according to the marking of the doctor. In this embodiment, the position information of the simulated ablation needle may be position information of a heat source center of the simulated ablation needle, and the position information of the simulated lesion may be position information of a center of the simulated lesion.
In order to facilitate quantitative evaluation of the position of the insertion needle, position information of the simulated ablation needle and the simulated lesion needs to be determined in the same target space. When the target space is an ultrasonic image space, the position information of the simulated focus in the three-dimensional image space can be converted into the position information of the simulated focus in the ultrasonic image space according to the space transformation matrix; when the target space is a three-dimensional image space, the position information of the simulated ablation needle in the ultrasonic image space can be converted into the position information of the simulated ablation needle in the three-dimensional image space according to the spatial transformation matrix. The position information of the simulated ablation needle and the simulated focus in the target space can be obtained by converting the space transformation matrix.
When a single needle is adopted for ablation aiming at small tumors, the position information of the heat source center of the single simulated ablation needle is determined; when a plurality of needles are adopted for ablation of a large tumor, the central positions of a plurality of simulated ablation needles or the position information of an ablation center are determined. Specifically, if a plurality of ablation simulation needles are provided and the types of the ablation simulation needles are the same, determining the position information of the ablation simulation needles includes: determining position information of central positions of a plurality of simulated ablation needles; if the number of the simulated ablation needles is multiple and the types of the simulated ablation needles are not completely the same, determining the position information of the simulated ablation needles comprises the following steps: and determining ablation centers of the plurality of simulated ablation needles according to the simulated ablation range of each simulated ablation needle, and determining position information of the ablation centers of the plurality of simulated ablation needles. It should be noted that the above tumor size division can refer to the clinical practical requirement, and this embodiment does not limit this. For the case that the target space is a three-dimensional image space, the position information of the plurality of simulated ablation needles in the ultrasound image space can be respectively obtained, then the central positions of the plurality of simulated ablation needles or the position information of the ablation center in the ultrasound image space are determined, and finally the central positions of the plurality of simulated ablation needles or the position information of the ablation center in the three-dimensional image space is obtained through one-time conversion of a space transformation matrix; or after the position information of the plurality of simulated ablation needles in the ultrasound image space is respectively obtained, the position information of the plurality of simulated ablation needles in the three-dimensional image space is respectively obtained through multiple times of conversion of the spatial transformation matrix, and finally, the center positions of the plurality of simulated ablation needles or the position information of the ablation center is determined according to the position information of the plurality of simulated ablation needles in the three-dimensional image space.
Taking the insertion of two ablation needles as an example, suppose the position coordinates of the two simulated ablation needles are P_1 and P_2. If the two ablation needles are of the same type, the center position (P_1 + P_2)/2 of the two simulated ablation needles can be determined as the position information of the simulated ablation needles. If the two ablation needles are not of the same type, the ablation center of the two simulated ablation needles is determined, where the distance from the ablation center to each simulated ablation needle is inversely related to the size of that needle's simulated ablation range. Suppose the distance between P_1 and P_2 is 20 mm, the simulated ablation range of needle P_1 is a sphere with a radius of 4 mm, and the simulated ablation range of needle P_2 is a sphere with a radius of 1 mm; the ablation center can then be the point on the line connecting P_1 and P_2 that lies 16 mm from P_2 (i.e., 4 mm from P_1).
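A minimal Python/NumPy sketch of the two cases above. Extending the inverse-distance rule beyond two needles by weighting each needle position with its own ablation radius is an assumption that merely reproduces the two-needle example; it is not stated in the text.

    import numpy as np

    def needle_reference_point(positions, radii=None, same_type=True):
        """positions: (n, 3) simulated-ablation-needle coordinates.
        Same type: geometric center of the needles.
        Different types: ablation center whose distance to each needle is inversely
        related to that needle's simulated ablation range (weights = own radius)."""
        positions = np.asarray(positions, dtype=float)
        if same_type or radii is None:
            return positions.mean(axis=0)
        w = np.asarray(radii, dtype=float)
        return (w[:, None] * positions).sum(axis=0) / w.sum()

    # The example from the text: needles 20 mm apart, radii 4 mm and 1 mm.
    p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([20.0, 0.0, 0.0])
    center = needle_reference_point([p1, p2], radii=[4.0, 1.0], same_type=False)
    print(center, "->", np.linalg.norm(center - p2), "mm from P_2")   # 16 mm from P_2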
S205, calculating and displaying the relative position between the simulated ablation needle and the simulated focus in real time according to the position information of the simulated ablation needle and the position information of the simulated focus.
After the position information of the simulated ablation needle and the simulated focus is determined in the same space, the relative position between the simulated ablation needle and the simulated focus can be determined according to the position information. The relative position in the embodiment can be expressed in various ways, and no matter what way the relative position is displayed, the quantitative evaluation of the position of the inserted ablation needle can be realized through the relative position, and the doctor can be intuitively guided to insert the needle through displaying the relative position.
In an alternative embodiment, the distance between the simulated ablation needle and the simulated lesion can be calculated and displayed in real time; the distance can take the form of a Euclidean distance, as shown in fig. 3A and 3B. For a single-needle ablation scenario, fig. 3A shows, in the target space (303), a simulated tumor (301) and a single simulated ablation needle (302), and shows "the distance between the simulated ablation needle and the simulated lesion: 10 mm" in the form of a text box (304); this may be the distance between the heat-source center of the simulated ablation needle and the center of the simulated lesion. For a two-needle ablation scenario, fig. 3B shows, in the target space (303), a simulated tumor (301) and two simulated ablation needles (305), and shows "the distance between the simulated ablation needles and the simulated lesion: 5 mm" in the form of a text box (304); this may be the distance between the center of the two simulated ablation needles and the center of the simulated lesion. The needle insertion position can be quantitatively evaluated through this distance, and the doctor can adjust the needle position with the goal of reducing it, so displaying the distance intuitively guides the doctor in inserting the needle.
In another alternative embodiment, the relative position between the simulated ablation needle and the simulated lesion may also be represented by an angle. Taking the single-needle ablation scenario shown in fig. 3A as an example, an angle between a straight line where the needle bar of the simulated ablation needle is located and a straight line passing through the center point of the simulated lesion and parallel to the body position of the patient can be calculated; taking the two-needle ablation scenario shown in fig. 3B as an example, the angle between the straight line passing through the centers of the two simulated ablation needles and the straight line passing through the center of the simulated lesion and parallel to the patient position can be calculated. The quantitative evaluation of the position of the inserted ablation needle can be realized through the angle, and the doctor can adjust the position of the inserted ablation needle by taking the reduction of the angle as a target, so that the doctor can be intuitively guided to insert the needle by displaying the angle. Alternatively, the angle and distance may also be combined to represent the relative position between the simulated ablation needle and the simulated lesion.
In yet another alternative embodiment, a three-dimensional cartesian coordinate system may be established in the target space with the center of the simulated lesion as the coordinate origin, and then the position coordinates of the simulated ablation needle in the established three-dimensional cartesian coordinate system may be determined according to the position information of the simulated ablation needle and the simulated lesion in the target space, so as to represent the relative position between the simulated ablation needle and the simulated lesion with the position coordinates.
In order to more intuitively guide the insertion of the needle by the doctor, in addition to displaying the relative position between the simulated ablation needle and the simulated lesion on the display interface, the variation trend of the relative position between the simulated ablation needle at the current position compared with the former position and the simulated lesion can be determined. For example, it may be determined whether a distance between the simulated ablation needle at the current position and the simulated lesion is increased or decreased compared to a distance between the simulated ablation needle at the previous position and the simulated lesion, and the trend of the change may be displayed on the display interface. The variation trend can be displayed by means of characters, icons, colors and the like. For example, "increase" or "decrease" may be displayed directly to indicate a trend of change; a red icon may be used to alert that the distance is increased and a green icon may be used to indicate that the distance is decreased. After the doctor sees the indication information for indicating the distance increase, the doctor can know that the current adjustment direction deviates from the ablation target and needs to be adjusted again; after the doctor sees the instruction information for instructing the distance reduction, the doctor can know that the current adjustment direction is correct, and can continue to adjust. By displaying the variation trend of the relative position on the display interface, a doctor can be guided to insert the needle more intuitively.
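A minimal Python/NumPy sketch of the real-time feedback described above: the Euclidean distance in the target space plus a change-trend indicator relative to the previous needle position. All coordinates and names are illustrative.

    import numpy as np

    def needle_lesion_distance(needle_pos, lesion_center):
        """Euclidean distance between the simulated ablation needle (e.g. its heat-source
        center or the multi-needle center) and the simulated lesion center."""
        return float(np.linalg.norm(np.asarray(needle_pos) - np.asarray(lesion_center)))

    def distance_trend(current, previous):
        """'decrease' means the current adjustment moves toward the ablation target."""
        if previous is None:
            return "n/a"
        if current < previous:
            return "decrease"
        return "increase" if current > previous else "unchanged"

    lesion = (12.0, 30.0, 25.0)
    prev_d = needle_lesion_distance((20.0, 34.0, 25.0), lesion)
    curr_d = needle_lesion_distance((17.0, 32.0, 25.0), lesion)
    print(f"{curr_d:.1f} mm ({distance_trend(curr_d, prev_d)})")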
According to the ablation simulation method provided by this embodiment, an ultrasound image of the target tissue is generated and three-dimensional image data of the target tissue is acquired; the spatial transformation matrix between the ultrasound image space in which the ultrasound image is located and the three-dimensional image space in which the three-dimensional image data is located is determined; the position information of the simulated ablation needle and of the simulated lesion is determined in the same target space according to the spatial transformation matrix; and finally the relative position between the simulated ablation needle and the simulated lesion is calculated and displayed in real time according to that position information, so that the needle insertion position is quantitatively evaluated and the doctor can be intuitively guided in inserting the needle.
In order to quantitatively evaluate the needle inserting position of the ablation needle, the relative position between the simulated ablation needle and the simulated lesion needs to be calculated in the same space, and a spatial transformation matrix for performing spatial transformation is particularly important. Referring to fig. 4, on the basis of the foregoing embodiment, in the ablation simulation method provided in this embodiment, determining a spatial transformation matrix between an ultrasound image space in which an ultrasound image is located and a three-dimensional image space in which three-dimensional image data is located may include:
S2031, a first transformation matrix from the ultrasonic image space to the position sensor space is obtained.
The position sensor may be generally fixed to the ultrasound probe, and continuously provides position information as the ultrasound probe moves. The first transformation matrix in this embodiment is a transformation matrix from an ultrasound image space to a position sensor space, which may be denoted as a. When the position sensor is fixed on the ultrasonic probe and the model of the ultrasonic probe is unchanged, A is fixed and can be obtained by a calibration method.
S2032, a second transformation matrix from the position sensor space to the world coordinate space is obtained.
The second transformation matrix in this embodiment is the transformation matrix from the position sensor space to the world coordinate space, which may be denoted as R_probe. R_probe can be read directly from the magnetic positioning controller and changes continuously as the ultrasound probe moves.
S2033, determining a third transformation matrix from the world coordinate space to the three-dimensional image space.
The third transformation matrix in the present embodiment is a transformation matrix from the world coordinate space to the three-dimensional image space, and may be denoted as P. P may be determined by a registration result between the ultrasound image of the target tissue and the three-dimensional image data of the target tissue.
In an alternative embodiment, the ultrasound image of the target tissue and the three-dimensional image data of the target tissue may be registered based on the in-vitro markers or based on the image information to obtain a registration matrix M, and then the third transformation matrix P may be determined according to the registration matrix M.
S2034, a space transformation matrix is determined according to the first transformation matrix, the second transformation matrix and the third transformation matrix.
After the first transformation matrix A, the second transformation matrix R_probe and the third transformation matrix P are obtained, the spatial transformation matrix is determined from A, R_probe and P. When the target space is the three-dimensional image space, the spatial transformation matrix T may be expressed as T = P · R_probe · A; when the target space is the ultrasound image space, the spatial transformation matrix T may be expressed as T = (P · R_probe · A)^(-1).
In the embodiment, a first transformation matrix from an ultrasonic image space to a position sensor space, a second transformation matrix from the position sensor space to a world coordinate space, and a third transformation matrix from the world coordinate space to a three-dimensional image space are sequentially obtained, and then a space transformation matrix is determined according to the first transformation matrix, the second transformation matrix, and the third transformation matrix. The spatial transformation matrix can realize the mutual conversion of the position information between the ultrasonic image space and the three-dimensional image space, thereby obtaining the position information in the same target space.
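A minimal Python/NumPy sketch of this composition, writing A, R_probe and P as 4x4 homogeneous matrices; the numeric values are placeholders, since in practice A comes from calibration, R_probe from the magnetic positioning controller and P from registration.

    import numpy as np

    def spatial_transform(A, R_probe, P, target="3d"):
        """T maps ultrasound-image-space coordinates to three-dimensional-image-space
        coordinates when target == "3d"; its inverse maps the other way."""
        T = P @ R_probe @ A
        return T if target == "3d" else np.linalg.inv(T)

    def apply_h(T, point):
        """Apply a 4x4 homogeneous transform to a 3D point."""
        return (T @ np.append(np.asarray(point, dtype=float), 1.0))[:3]

    # Placeholder matrices: identity calibration, a pure translation for the probe pose,
    # and a pure translation for the registration result.
    A = np.eye(4)
    R_probe = np.eye(4); R_probe[:3, 3] = [5.0, 0.0, 0.0]
    P = np.eye(4); P[:3, 3] = [0.0, -2.0, 1.0]
    T = spatial_transform(A, R_probe, P, target="3d")
    print(apply_h(T, (10.0, 10.0, 0.0)))   # position in the three-dimensional image space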
In the following, how to determine the position information of the simulated ablation needle and the position information of the simulated lesion center in the target space according to the spatial transformation matrix will be described in detail respectively for the case where the target space is an ultrasound image space and the case where the target space is a three-dimensional image space. In one embodiment, determining the position information of the simulated ablation needle and the position information of the simulated lesion center in the target space according to the spatial transformation matrix may include:
acquiring a needle inserting angle and a needle inserting depth of a simulated ablation needle;
determining position information of the simulated ablation needle in an ultrasonic image space according to the needle inserting angle and the needle inserting depth;
acquiring position information of a simulated focus center in a three-dimensional image space;
when the target space is the ultrasonic image space, the position information of the simulated focus center in the three-dimensional image space is converted into the position information of the simulated focus center in the ultrasonic image space according to the space transformation matrix,
or,
and when the target space is a three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasonic image space into the position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.
Ablation parameters of different types of ablation needles are often different, for example, some ablation needles have spherical energy radiation ranges, and some ablation needles have ellipsoidal energy radiation ranges; some ablation needles have a heat source at the needle tip and some ablation needles have a heat source at a distance d from the needle tip hot At the location. Therefore, when in use, corresponding ablation parameters need to be set according to the model of the ablation needle actually used. For an ablation needle with a spherical energy radiation range, the radius, the needle tip distance, the needle inserting depth and the like are required to be set; for the ablation needle with the energy radiation range in an ellipsoidal shape, the long diameter, the short diameter and the needle tip distance need to be setPin depth, etc. The insertion needle depth d of the simulated ablation needle may be derived based on the set insertion needle depth. When ablation is carried out based on the puncture frame, the angle of the puncture frame needs to be set first, and the needle inserting angle beta of the simulated ablation needle can be obtained according to the set angle of the puncture frame. After the inserting needle angle beta and the inserting needle depth d of the analog ablation needle are obtained, the position information P of the analog ablation needle in the current ultrasonic sector, namely in the ultrasonic image space can be determined according to the angle beta and the depth d us_probe (x us_probe ,y us_probe ) Wherein x is us_probe =d·sinβ,y us_probe = d · cos β. When d is hot >At 0 time, x us_probe =(d-d hot )·sinβ,y us_probe =(d-d hot ) Cos β. Doctors usually mark the position of the focus in the three-dimensional image data of the target tissue, and can obtain the position information P of the simulated focus center in the three-dimensional image space according to the mark of the doctors sec_tumor
When the target space is the ultrasound image space, the position information P_sec_tumor of the simulated lesion center in the three-dimensional image space can be converted, according to the space transformation matrix T, into the position information P_us_tumor of the simulated lesion center in the ultrasound image space (P_us_tumor = T·P_sec_tumor). Having obtained the position information P_us_probe of the simulated ablation needle and the position information P_us_tumor of the simulated lesion in the ultrasound image space, the distance D_dist = |P_us_tumor − P_us_probe| between the simulated ablation needle and the simulated lesion can be used to express the relative position between them, so as to realize quantitative evaluation of the needle insertion position. Without considering whether there is at-risk tissue around the tumor, a smaller D_dist indicates a better needle position.
When the target space is the three-dimensional image space, the position information P_us_probe of the simulated ablation needle in the ultrasound image space can be converted, according to the space transformation matrix T, into the position information P_sec_probe of the simulated ablation needle in the three-dimensional image space (P_sec_probe = T·P_us_probe). Having obtained the position information P_sec_probe of the simulated ablation needle and the position information P_sec_tumor of the simulated lesion in the three-dimensional image space, the distance D_dist = |P_sec_tumor − P_sec_probe| between the simulated ablation needle and the simulated lesion can be used to express the relative position between them, so as to realize quantitative evaluation of the needle insertion position. Without considering whether there is at-risk tissue around the tumor, a smaller D_dist indicates a better needle position.
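As a minimal numerical sketch of these computations (the function names, the identity placeholder used for the transformation matrix T, and the example coordinates are illustrative assumptions, not values from this disclosure):

```python
import numpy as np

def needle_position_us(d, beta, d_hot=0.0):
    """In-plane position of the simulated ablation needle's heat source from
    insertion depth d (mm) and insertion angle beta (radians). With d_hot > 0
    the heat source sits d_hot behind the needle tip."""
    eff = d - d_hot
    return np.array([eff * np.sin(beta), eff * np.cos(beta)])

def to_3d_space(p_us, T):
    """Map a point from ultrasound image space to three-dimensional image
    space with a 4x4 homogeneous transform T (in-plane point, z = 0)."""
    p_h = np.array([p_us[0], p_us[1], 0.0, 1.0])
    return (T @ p_h)[:3]

# Example: 60 mm insertion depth, 30 degree puncture-frame angle.
T = np.eye(4)                                  # placeholder transform for this sketch
p_us_probe = needle_position_us(60.0, np.deg2rad(30.0), d_hot=5.0)
p_sec_probe = to_3d_space(p_us_probe, T)
p_sec_tumor = np.array([30.0, 45.0, 10.0])     # example marked lesion centre
d_dist = np.linalg.norm(p_sec_tumor - p_sec_probe)
print(f"needle-to-lesion distance D_dist = {d_dist:.1f} mm")
```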
Referring to fig. 5, in order to more intuitively guide the physician to perform the insertion during the ablation process, on the basis of any of the above embodiments, the ablation simulation method provided in this embodiment may further include:
s501, fusion display is carried out on the ultrasonic image and the three-dimensional image data.
And S502, performing three-dimensional visual display on the simulated ablation needle and the simulated focus in the three-dimensional image displayed in a fusion mode.
The image information and the positioning information can be utilized to carry out registration fusion on the ultrasonic image and the three-dimensional image data of the target tissue, and the fusion result is displayed on a display interface. By fusing and displaying the ultrasonic image and the three-dimensional image data, the relative position of the current ultrasonic sector image of the target tissue in the three-dimensional image data can be visually checked. The three-dimensional visual display of the simulated ablation needle and the simulated focus under the same space coordinate system can be realized in the three-dimensional image displayed in a fusion manner. When the ultrasonic probe moves, the relative position of the current ultrasonic sector and the three-dimensional image data changes along with the change of the coordinates of the position sensor.
In an alternative embodiment, the simulated ablation needle and the simulated lesion are displayed in three dimensions through a single window from a target angle, where the target angle is determined and/or adjusted according to operation information input by the user. That is, the relative position between the simulated ablation needle and the simulated lesion can be viewed from any angle through one display window on the display interface, which makes it convenient to gain a comprehensive understanding of their positional relationship and better guide the needle insertion process. For example, the simulated ablation needle and the simulated lesion can be rotated at a preset angular speed to provide a full 360-degree view, or they can be displayed in three dimensions from a user-specified viewing angle.
In another alternative embodiment, the simulated ablation needle and the simulated lesion may be displayed in three dimensions from different angles through a plurality of windows, where the angle displayed in each window is preset or is determined and/or adjusted according to operation information input by the user. That is, each window can display the simulated ablation needle and the simulated lesion from any angle. Displaying the relative position between the simulated ablation needle and the simulated lesion from several different angles through multiple windows facilitates comparison, makes it easy to gain a quick and comprehensive understanding of their positional relationship, and better guides the needle insertion process.
The three-dimensional visual display of the simulated ablation needle and the simulated lesion can be realized by referring to the following modes:
and marking the focus in the three-dimensional image, such as marking a tumor, and drawing the marked focus to realize three-dimensional visualization of the focus. In the three-dimensional visualization, a three-dimensional model stereo image of a target structure is reconstructed by using the target structure in a series of original two-dimensional images based on a drawing method. The rendering method can be divided into surface rendering and volume rendering.
Surface rendering applies segmentation techniques to a series of two-dimensional images for operations such as contour recognition and extraction, finally restoring a three-dimensional model of the detected object and displaying it as a surface. The prerequisite of lesion rendering is to segment the lesion in the volume data, and accurate segmentation of the lesion is an important prerequisite for subsequent stages such as treatment plan formulation, computer-assisted surgery, and verification of the needle insertion result. There are many ways to segment a lesion, such as manual segmentation, semi-automatic segmentation and automatic segmentation, and many lesion segmentation algorithms: level-set segmentation, graph cut, region growing and the like among traditional algorithms; FCN, U-Net, V-Net and the like among deep learning methods.
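As a minimal illustration of one of the traditional approaches just listed, the following Python sketch implements threshold-based region growing on a 3-D volume; the seed point and tolerance are illustrative assumptions, not values from this disclosure.

```python
import numpy as np
from collections import deque

def region_grow(volume, seed, tol=50.0):
    """Grow a lesion mask from a seed voxel: accept 6-connected neighbours
    whose intensity differs from the seed intensity by at most tol."""
    mask = np.zeros(volume.shape, dtype=bool)
    seed_val = float(volume[seed])
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]
                    and abs(float(volume[nz, ny, nx]) - seed_val) <= tol):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask
```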
After the lesion is accurately segmented, surface rendering can be performed based on the segmented boundary. The Marching Cubes (MC) algorithm is a classic surface rendering algorithm, also known as "iso-surface extraction". In essence, a series of two-dimensional slice data is treated as a three-dimensional data field, from which an iso-surface at a certain threshold is extracted and connected into triangular patches in a topologically consistent manner. The basic idea of the MC algorithm is to process each voxel in the volume data field one by one and to determine the structural form of the iso-surface inside the voxel based on the values at its vertices. In implementing the algorithm, two main calculations are performed on the iso-surface structure within a voxel: 1. approximating the iso-surface inside the voxel with triangular patches; 2. calculating the normal vectors at the vertices of the triangular patches.
Referring to fig. 6, after the vertex energy value is calculated, it is compared with the set energy threshold: if the vertex energy value is smaller than the energy threshold, the point is an exterior point (OUTSIDE) and is set to 1; if the vertex energy value is greater than the energy threshold, the point is an interior point (INSIDE) and is set to 0. An index value between 0 and 255 can be obtained from the states of the 8 vertices of a cube. The cube has rotational symmetry, and rotation does not affect the topological structure of the iso-surface. Referring to fig. 7, 15 basic cubes can be used to cover all 256 possible cases. From these 15 basic cubes, a lookup table of length 256 can be created, recording the iso-surface connection pattern for every case. The lookup table is queried with the index value to obtain the edge numbers of the three vertices of each iso-surface triangle. After the edge numbers are obtained, linear interpolation along each edge gives the coordinates of the triangle vertices.
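The vertex classification and edge interpolation steps described above can be sketched as follows; this is a minimal illustration only, and a complete implementation would also use the full 256-entry triangle table, which is omitted here.

```python
import numpy as np

def cube_index(vertex_values, threshold):
    """8-bit cube configuration: a vertex whose value is below the threshold
    is classified OUTSIDE (its bit is set to 1), otherwise INSIDE (bit 0)."""
    index = 0
    for bit, value in enumerate(vertex_values):
        if value < threshold:
            index |= 1 << bit
    return index

def edge_vertex(p0, p1, v0, v1, threshold):
    """Linearly interpolate along a cube edge to place the triangle vertex
    where the iso-surface crosses that edge."""
    p0, p1 = np.asarray(p0, dtype=float), np.asarray(p1, dtype=float)
    t = (threshold - v0) / (v1 - v0)
    return p0 + t * (p1 - p0)

# cube_index(...) would then index a 256-entry lookup table (built from the
# 15 basic cube configurations) listing which edges carry triangle vertices.
```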
Volume rendering is a technique for generating a two-dimensional image on the screen directly from a three-dimensional data field. A digital image corresponds to a two-dimensional array whose data elements, called pixels, describe color and intensity. Similarly, a three-dimensional data field can be described by a three-dimensional array whose elements are called voxels. Like the two-dimensional raster of a digital image, the volume data field can be seen as a three-dimensional raster. A typical three-dimensional data field is a medical-image three-dimensional data field: a series of medical image slices is obtained by computed tomography (CT) or magnetic resonance imaging (MRI) scanning, the slice data are regularized according to position and angle information, and a regular data field composed of a uniform grid in three-dimensional space is formed, where each node on the grid is a voxel describing attribute information such as the density of the object. The greatest advantage of volume rendering is that the internal structure of an object can be explored and ill-defined objects such as muscle can be described, whereas surface rendering is weaker in these respects; volume rendering, however, is somewhat slower. To improve rendering speed, a surface rendering method can be adopted: the surface of the three-dimensional target structure is first reconstructed from the three-dimensional data according to the segmentation result and the contour lines, and a realistic three-dimensional entity is then generated using a suitable illumination model and texture mapping.
Referring to fig. 8, in order to perform quantitative evaluation on the position of the needle more fully so as to guide the physician to perform needle insertion more accurately, the ablation simulation method provided in this embodiment may further include, on the basis of any of the above embodiments:
s801, determining a simulated ablation range of the simulated ablation needle.
The ablation range of different types of ablation needles is often different, for example, the ablation range of some ablation needles is spherical with a radius of 10mm, and the ablation range of some ablation needles is ellipsoidal with a major diameter of 20mm and a minor diameter of 15mm, so in order to determine the proportion of the simulated lesion covered by the simulated ablation range, the simulated ablation range of the simulated ablation needle needs to be determined first.
The ablation parameters of the simulated ablation needle can be obtained, and then the simulated ablation range of the simulated ablation needle can be determined according to the ablation parameters. The ablation parameters can include an energy radiation range and a heat source position, wherein the energy radiation range can be used for indicating the shape and size of the ablation range, and the heat source position can be used for indicating the distance from the center of the heat source to the needle tip. Specifically, the ablation parameters of the simulated ablation needle can be obtained according to the type of the simulated ablation needle and the mapping relationship between the preset type and the ablation parameters.
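A minimal sketch of such a model-to-parameter mapping is shown below; the needle model names and all parameter values are invented placeholders for illustration only.

```python
from dataclasses import dataclass

@dataclass
class AblationParams:
    shape: str                 # "sphere" or "ellipsoid" energy radiation range
    radius: float = 0.0        # mm, for spherical ranges
    long_axis: float = 0.0     # mm, for ellipsoidal ranges
    short_axis: float = 0.0    # mm, for ellipsoidal ranges
    d_hot: float = 0.0         # distance from needle tip to heat source, mm

# Hypothetical mapping between needle model and preset ablation parameters.
NEEDLE_TABLE = {
    "needle_A": AblationParams(shape="sphere", radius=10.0, d_hot=0.0),
    "needle_B": AblationParams(shape="ellipsoid", long_axis=20.0,
                               short_axis=15.0, d_hot=5.0),
}

def ablation_params_for(model: str) -> AblationParams:
    return NEEDLE_TABLE[model]
```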
S802, calculating and displaying the proportion of the simulated focus covered by the simulated ablation range in real time according to the position information and the simulated ablation range of the simulated ablation needle and the position information and the size of the simulated focus.
After the simulated ablation range of the simulated ablation needle is determined, three-dimensional volume data corresponding to the simulated ablation range can be determined in the target space according to the position information of the simulated ablation needle and the simulated ablation range. And determining three-dimensional volume data corresponding to the simulated focus in the target space according to the position information and the size of the simulated focus. And finally, calculating the proportion of the simulated focus covered by the simulated ablation range according to the three-dimensional volume data corresponding to the simulated ablation range and the three-dimensional volume data corresponding to the simulated focus, and displaying the proportion on a display interface for a user to view.
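One way to realise this on a voxel grid is sketched below: both the simulated ablation range and the simulated lesion are rasterised as boolean masks over the same target-space grid, and the coverage is the fraction of lesion voxels inside the ablation mask. The grid size, spacing, shapes and positions are example assumptions.

```python
import numpy as np

def ellipsoid_mask(grid_shape, spacing, center, semi_axes):
    """Boolean mask of an axis-aligned ellipsoid on a regular voxel grid."""
    zz, yy, xx = np.indices(grid_shape).astype(float)
    coords = np.stack([zz, yy, xx], axis=-1) * spacing
    d = (coords - np.asarray(center, dtype=float)) / np.asarray(semi_axes, dtype=float)
    return np.sum(d * d, axis=-1) <= 1.0

grid_shape, spacing = (120, 120, 120), 1.0          # 1 mm isotropic voxels
lesion  = ellipsoid_mask(grid_shape, spacing, (60, 60, 60), (12, 10, 10))
ablated = ellipsoid_mask(grid_shape, spacing, (62, 60, 58), (10, 10, 10))

coverage = np.count_nonzero(lesion & ablated) / np.count_nonzero(lesion)
print(f"simulated lesion covered by the simulated ablation range: {coverage:.1%}")
```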
Optionally, when the three-dimensional visualization display is performed on the simulated ablation needle and the simulated lesion in the three-dimensional image displayed in the fusion manner, the three-dimensional visualization display may be performed on the simulated ablation range of the simulated ablation needle in the three-dimensional image displayed in the fusion manner, so that the user may visually check the simulated ablation range of the simulated ablation needle.
On the basis of calculating and displaying the relative position between the simulated ablation needle and the simulated lesion in real time, the ablation simulation method provided by this embodiment further calculates and displays, in real time, the proportion of the simulated lesion covered by the simulated ablation range. Through the relative position and the coverage proportion together, the needle position can be evaluated quantitatively in a more comprehensive way, and the doctor can be guided more intuitively to adjust the needle position in the direction that increases the coverage proportion and brings the simulated ablation needle closer to the simulated lesion, making it easier to find a more suitable needle position.
When the tumor is small, the single needle is used for ablation, and the simulated ablation range of the simulated ablation needle can be accurately obtained according to the ablation parameters. When the tumor is large, multiple needles are needed for ablation, and at the moment, due to the heat conduction of tissues among the needles, the simulated ablation ranges of the multiple simulated ablation needles are no longer simple superposition of the simulated ablation ranges of the various simulated ablation needles. Accurately determining the simulated ablation range of the plurality of simulated ablation needles is a prerequisite for obtaining an accurate coverage ratio. In the following, how to determine the simulated ablation range in the case of multiple simulated ablation needles will be described in detail.
In an alternative embodiment, if there are a plurality of simulated ablation needles, determining a simulated ablation range of the simulated ablation needle according to the ablation parameters may include:
determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameter corresponding to each simulated ablation needle to obtain a plurality of simulated energy distributions corresponding to the plurality of simulated ablation needles;
determining a combined energy value in a combined simulated energy distribution range corresponding to the plurality of simulated ablation needles based on the plurality of simulated energy distributions;
determining an energy distribution area with a combined energy value larger than a first energy threshold value from the combined simulation energy distribution range;
and determining simulated ablation ranges corresponding to the plurality of simulated ablation needles according to the energy distribution areas of which the combined energy value is greater than the first energy threshold value.
Determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameter corresponding to each simulated ablation needle may include:
determining a three-dimensional coordinate point in the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating the energy values of the three-dimensional coordinate points and determining a target three-dimensional coordinate point with the energy value larger than a second energy threshold value from the energy values;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
and determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy value of the target three-dimensional coordinate point.
The energy distribution of the simulated ablation range of a single simulated ablation needle needs to be calculated first. If the shape of the simulated ablation range set by the user is an ellipsoid, the energy distribution of that ablation needle can be calculated based on the long and short diameters of the simulated ablation ellipsoid, using the following formula:
(x − x1)²/A1² + (y − y1)²/B1² + (z − z1)²/C1² ≤ 1
where x, y and z are the coordinates of a point in three-dimensional space, (x1, y1, z1) is the energy center, i.e. the heat source position, and A1, B1 and C1 are the semi-axis lengths defining the energy radiation range of the simulated ablation ellipsoid. Transforming the formula yields:
(x − x1)²·B1²·C1² + (y − y1)²·A1²·C1² + (z − z1)²·A1²·B1² ≤ A1²·B1²·C1²
Setting
tmp1 = (x − x1)²·B1²·C1² + (y − y1)²·A1²·C1² + (z − z1)²·A1²·B1²
tmp2 = A1²·B1²·C1²,
if tmp1 ≤ tmp2, the point lies within the radiation range, and the energy of the point is then calculated as
energy1 = 1 − tmp1/tmp2.
From this formula it can be seen that the energy is at its maximum of 1 when a coordinate point is at the center of the ellipsoid and at its minimum of 0 when the point is at the edge of the ellipsoid. The energy value energy1 of each coordinate point is compared with the energy threshold energyThre: if energy1 > energyThre, the energy at this point satisfies the ablation condition and the point can be included in the ablation range. energyThre can be obtained from the ablation range set by the user, i.e. it is the energy value at which the user-set ablation range is just reached under the current energy radiation.
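Assuming the reconstruction of the formulas above, the per-point energy of a single needle can be sketched as follows; the clipping to 0 outside the radiation range is a convenience of the sketch.

```python
import numpy as np

def needle_energy(points, center, semi_axes):
    """Per-point energy of one simulated ablation needle with an ellipsoidal
    energy radiation range: 1 at the energy centre (heat source position),
    falling to 0 at the ellipsoid boundary; points outside are clipped to 0."""
    x1, y1, z1 = center
    A1, B1, C1 = semi_axes
    dx = points[..., 0] - x1
    dy = points[..., 1] - y1
    dz = points[..., 2] - z1
    tmp1 = dx**2 * B1**2 * C1**2 + dy**2 * A1**2 * C1**2 + dz**2 * A1**2 * B1**2
    tmp2 = A1**2 * B1**2 * C1**2
    return np.clip(1.0 - tmp1 / tmp2, 0.0, None)

# A point belongs to the single-needle simulated ablation range when its
# energy exceeds the threshold energyThre derived from the user-set range.
```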
Next, the energy superposition between multiple ablation needles is calculated. Based on the above scheme, the energy distribution of a single ablation needle can be obtained. If there are two ablation needles, the energy distribution of ablation needle 1 is calculated as:
energy1 = 1 − [(x − x1)²·B1²·C1² + (y − y1)²·A1²·C1² + (z − z1)²·A1²·B1²] / (A1²·B1²·C1²)
energy distribution of the ablation needle 2:
energy2 = 1 − [(x − x2)²·B2²·C2² + (y − y2)²·A2²·C2² + (z − z2)²·A2²·B2²] / (A2²·B2²·C2²)
where (x2, y2, z2) is the heat source position of ablation needle 2 and A2, B2, C2 are its semi-axis lengths.
calculating the energy sum of coordinate points in the radiation range of the two needles under the radiation of the ablation needle 1 and the ablation needle 2, namely:
energy=energy1+energy2
Based on the fixed energy threshold, the simulated ablation range of the two needles for combined ablation can be obtained. The ablation range can be rendered with a surface rendering method, so as to realize three-dimensional visualization of the ablation range.
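Continuing the sketch above (the energy model is the same as in the earlier sketch, written compactly; the grid resolution, needle positions, semi-axes and the threshold value 0.3 are example assumptions):

```python
import numpy as np

def needle_energy(points, center, semi_axes):
    # Same ellipsoidal energy model as in the previous sketch, compact form.
    d = (points - np.asarray(center, dtype=float)) / np.asarray(semi_axes, dtype=float)
    return np.clip(1.0 - np.sum(d * d, axis=-1), 0.0, None)

# Sample the target space on a regular 1 mm grid of 3-D points.
zz, yy, xx = np.mgrid[0:100, 0:100, 0:100].astype(float)
points = np.stack([xx, yy, zz], axis=-1)

energy1 = needle_energy(points, center=(45.0, 50.0, 50.0), semi_axes=(10.0, 8.0, 8.0))
energy2 = needle_energy(points, center=(55.0, 50.0, 50.0), semi_axes=(10.0, 8.0, 8.0))

energy = energy1 + energy2             # heat superposition between the two needles
energy_thre = 0.3                      # example first energy threshold
combined_range = energy > energy_thre  # simulated range of joint two-needle ablation
```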
When ablating a lesion, there may be dangerous tissue around it. For example, when a tumor in the liver is ablated, the dangerous tissues that may be encountered include the large blood vessels around the tumor, the gall bladder, the kidney, the heart, and the like. Touching dangerous tissue during ablation may pose risks that are difficult to predict. Therefore, when ablation is performed, the ablation needle should not only be as close to the center of the lesion as possible, but its ablation range should also be kept as far away from the dangerous tissue as possible. In order to improve the safety of ablation, the ablation simulation method provided by this embodiment may further include, on the basis of any of the above embodiments: determining the distance between the boundary of the simulated ablation range and the boundary of the dangerous tissue and displaying it on the display interface.
When ablation simulation is performed in consideration of the dangerous tissue, since damage to any part of the dangerous tissue may cause serious consequences, the distance between the center of the dangerous tissue and the simulated ablation range should not be used as an evaluation criterion, and the distance between the boundary of the simulated ablation range and the boundary of the dangerous tissue should be used as an evaluation criterion, and the larger the distance between the two is, the better.
The boundary position of the simulated ablation range is denoted T_sec_probe and the boundary position of the dangerous tissue is denoted T_sec_danger. The distance D between the boundary of the simulated ablation range and the boundary of the dangerous tissue can then be quantitatively evaluated, for example, as
D = 1 / min‖T_sec_danger − T_sec_probe‖ when the two boundaries do not intersect, and D takes a negative value when they intersect,
where the minimum is taken over all pairs of boundary points.
Whether T_sec_danger and T_sec_probe intersect is determined by checking whether the coordinates of the simulated ablation range coincide with the coordinates of the dangerous tissue. If there is coincidence, T_sec_danger and T_sec_probe intersect, and the simulated ablation range of the simulated ablation needle would damage the dangerous tissue; otherwise, T_sec_danger and T_sec_probe do not intersect.
In the above expression, when the dangerous tissue and the simulated ablation range do not intersect, the farther apart they are, the closer the value of D is to 0, indicating a higher evaluation; when the dangerous tissue intersects the simulated ablation range, the value of D is negative, indicating that the dangerous tissue has been touched, which is relatively dangerous.
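A direct point-set rendering of this evaluation is sketched below; the brute-force pairwise distance and the fixed value −1 for the intersecting case are simplifying assumptions of the sketch (a practical implementation would typically use a spatial index, and the negative branch may be defined differently).

```python
import numpy as np

def boundary_distance_score(ablation_boundary, danger_boundary, tol=1e-6):
    """Evaluate the simulated ablation range against a dangerous tissue.
    Both inputs are (N, 3) arrays of boundary points in the same space.
    Returns a negative value when the boundaries touch or overlap; otherwise
    a positive value that approaches 0 as the minimum gap grows."""
    diff = ablation_boundary[:, None, :] - danger_boundary[None, :, :]
    min_dist = np.linalg.norm(diff, axis=-1).min()
    if min_dist <= tol:          # coinciding coordinates: boundaries intersect
        return -1.0              # negative D: dangerous tissue has been touched
    return 1.0 / min_dist        # disjoint: farther apart gives a value nearer 0
```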
When the simulated ablation range intersects the dangerous tissue, the dangerous tissue may be damaged, which is dangerous. In this case, warning information can be output to promptly remind the user that the dangerous tissue has been touched, so that the user can adjust the insertion position of the simulated ablation needle in time and safety is ensured.
Reference is made herein to various exemplary embodiments. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope hereof. For example, the various operational steps, as well as the components used to perform the operational steps, may be implemented in differing ways depending upon the particular application or consideration of any number of cost functions associated with operation of the system (e.g., one or more steps may be deleted, modified or incorporated into other steps).
Additionally, as will be appreciated by one skilled in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium having computer readable program code pre-loaded thereon. Any tangible, non-transitory computer-readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROMs, DVDs, blu Ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means for implementing the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been illustrated in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components particularly adapted to specific environments and operative requirements may be employed without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, one skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the disclosure is to be considered in all respects as illustrative and not restrictive, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. However, the benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims. As used herein, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "coupled," and any other variation thereof, as used herein, refers to a physical connection, an electrical connection, a magnetic connection, an optical connection, a communicative connection, a functional connection, and/or any other connection.
The present invention has been described in terms of specific examples, which are provided to aid in understanding the invention and are not intended to be limiting. For a person skilled in the art to which the invention pertains, several simple deductions, modifications or substitutions may be made according to the idea of the invention.

Claims (13)

1. An ablation simulation method, characterized in that it comprises:
transmitting ultrasonic waves to a target tissue containing a lesion through an ultrasonic probe, and generating an ultrasonic image of the target tissue according to received ultrasonic echo data;
acquiring three-dimensional image data of the target tissue;
determining a space transformation matrix between the ultrasound image space in which the ultrasound image is located and the three-dimensional image space in which the three-dimensional image data is located;
determining position information of a simulated ablation needle and position information of a simulated lesion in a target space according to the space transformation matrix, the target space being the ultrasound image space or the three-dimensional image space;
calculating and displaying the relative position between the simulated ablation needle and the simulated lesion in real time according to the position information of the simulated ablation needle and the position information of the simulated lesion;
determining a simulated ablation range of the simulated ablation needle;
calculating and displaying, in real time, the proportion of the simulated lesion covered by the simulated ablation range according to the position information and simulated ablation range of the simulated ablation needle and the position information and size of the simulated lesion;
wherein the determining the simulated ablation range of the simulated ablation needle comprises: acquiring ablation parameters of the simulated ablation needle, the ablation parameters comprising an energy radiation range and a heat source position; and determining the simulated ablation range of the simulated ablation needle according to the ablation parameters;
wherein, if there are a plurality of simulated ablation needles, the determining the simulated ablation range of the simulated ablation needle according to the ablation parameters comprises: determining a simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle, to obtain a plurality of simulated energy distributions corresponding to the plurality of simulated ablation needles; determining, based on the plurality of simulated energy distributions, combined energy values within a joint simulated energy distribution range corresponding to the plurality of simulated ablation needles; determining, from the joint simulated energy distribution range, an energy distribution area whose combined energy value is greater than a first energy threshold; and determining the simulated ablation range corresponding to the plurality of simulated ablation needles according to the energy distribution area whose combined energy value is greater than the first energy threshold.

2. The method according to claim 1, wherein the determining the space transformation matrix between the ultrasound image space in which the ultrasound image is located and the three-dimensional image space in which the three-dimensional image data is located comprises:
acquiring a first transformation matrix from the ultrasound image space to a position sensor space;
acquiring a second transformation matrix from the position sensor space to a world coordinate space;
determining a third transformation matrix from the world coordinate space to the three-dimensional image space;
determining the space transformation matrix according to the first transformation matrix, the second transformation matrix and the third transformation matrix.

3. The method according to claim 2, wherein the determining the third transformation matrix from the world coordinate space to the three-dimensional image space comprises:
registering the ultrasound image and the three-dimensional image data based on in vitro markers or based on image information, to obtain a registration matrix;
determining the third transformation matrix according to the registration matrix.

4. The method according to claim 1, wherein, if there are a plurality of simulated ablation needles of the same type, determining the position information of the simulated ablation needle comprises: determining position information of the center position of the plurality of simulated ablation needles; and if there are a plurality of simulated ablation needles whose types are not all the same, determining the position information of the simulated ablation needle comprises: determining ablation centers of the plurality of simulated ablation needles according to the simulated ablation range of each simulated ablation needle, and determining position information of the ablation centers of the plurality of simulated ablation needles.

5. The method according to claim 1, wherein the determining the position information of the simulated ablation needle and the position information of the simulated lesion center in the target space according to the space transformation matrix comprises:
acquiring an insertion angle and an insertion depth of the simulated ablation needle;
determining position information of the simulated ablation needle in the ultrasound image space according to the insertion angle and the insertion depth;
acquiring position information of the simulated lesion center in the three-dimensional image space;
when the target space is the ultrasound image space, converting the position information of the simulated lesion center in the three-dimensional image space into position information of the simulated lesion center in the ultrasound image space according to the space transformation matrix; or,
when the target space is the three-dimensional image space, converting the position information of the simulated ablation needle in the ultrasound image space into position information of the simulated ablation needle in the three-dimensional image space according to the space transformation matrix.

6. The method according to claim 1, wherein the acquiring the three-dimensional image data of the target tissue comprises:
acquiring at least one of computed tomography data, magnetic resonance imaging data and three-dimensional ultrasound data of the target tissue; or,
performing three-dimensional reconstruction on ultrasound images of the target tissue acquired by the ultrasonic probe, to obtain the three-dimensional image data of the target tissue.

7. The method according to claim 1, further comprising:
performing fusion display of the ultrasound image and the three-dimensional image data;
performing three-dimensional visualization display of the simulated ablation needle and the simulated lesion in the fusion-displayed three-dimensional image.

8. The method according to claim 7, wherein the performing three-dimensional visualization display of the simulated ablation needle and the simulated lesion comprises:
performing three-dimensional visualization display of the simulated ablation needle and the simulated lesion from a target angle through one window, the target angle being determined and/or adjusted according to operation information input by a user; or,
performing three-dimensional visualization display of the simulated ablation needle and the simulated lesion from different angles through a plurality of windows, the angle displayed in each window being preset, or determined and/or adjusted according to operation information input by the user.

9. The method according to claim 1, wherein the determining the simulated energy distribution corresponding to each simulated ablation needle according to the ablation parameters corresponding to each simulated ablation needle comprises:
determining three-dimensional coordinate points within the energy radiation range according to the energy radiation range of each simulated ablation needle and the heat source position of each simulated ablation needle;
calculating energy values of the three-dimensional coordinate points and determining therefrom target three-dimensional coordinate points whose energy values are greater than a second energy threshold;
determining the target three-dimensional coordinate points as the simulated ablation range of each simulated ablation needle;
determining the simulated energy distribution corresponding to each simulated ablation needle according to the simulated ablation range of each simulated ablation needle and the energy values of the target three-dimensional coordinate points.

10. The method according to claim 1, further comprising:
determining and displaying, on a display interface, the distance between the boundary of the simulated ablation range and the boundary of a dangerous tissue.

11. The method according to claim 1, further comprising:
outputting warning information when the simulated ablation range intersects a dangerous tissue, the warning information being used to prompt the user that the dangerous tissue has been touched.

12. An ultrasound imaging device, characterized in that it comprises:
an ultrasonic probe;
a transmitting circuit, configured to output a corresponding transmitting sequence to the ultrasonic probe according to a set mode, so as to control the ultrasonic probe to transmit corresponding ultrasonic waves;
a receiving circuit, configured to receive ultrasonic echo signals output by the ultrasonic probe and output ultrasonic echo data;
a display, configured to output visual information;
a processor, configured to execute the ablation simulation method according to any one of claims 1-11.

13. A computer-readable storage medium, characterized in that computer-executable instructions are stored in the computer-readable storage medium, and the computer-executable instructions, when executed by a processor, are used to implement the ablation simulation method according to any one of claims 1-11.
CN202211091789.7A 2022-01-04 2022-09-07 Ablation simulation method and device Pending CN115317128A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210001938X 2022-01-04
CN202210001938 2022-01-04

Publications (1)

Publication Number Publication Date
CN115317128A true CN115317128A (en) 2022-11-11

Family

ID=83930999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211091789.7A Pending CN115317128A (en) 2022-01-04 2022-09-07 Ablation simulation method and device

Country Status (1)

Country Link
CN (1) CN115317128A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115841873A (en) * 2022-11-21 2023-03-24 天津市鹰泰利安康医疗科技有限责任公司 Ablation simulation method and system
CN115841873B (en) * 2022-11-21 2024-01-16 天津市鹰泰利安康医疗科技有限责任公司 Ablation simulation method and system
CN115778545A (en) * 2022-12-22 2023-03-14 天津市鹰泰利安康医疗科技有限责任公司 Ablation positioning method and system
CN115778545B (en) * 2022-12-22 2023-11-14 天津市鹰泰利安康医疗科技有限责任公司 Ablation positioning method and system
CN116531089A (en) * 2023-07-06 2023-08-04 中国人民解放军中部战区总医院 Ultrasound-guided data processing method for block anesthesia based on image enhancement
CN116531089B (en) * 2023-07-06 2023-10-20 中国人民解放军中部战区总医院 Image-enhancement-based blocking anesthesia ultrasonic guidance data processing method
CN118303986A (en) * 2024-06-07 2024-07-09 北京智愈医疗科技有限公司 Method and device for determining execution parameters of vapor needle and vapor ablation system

Similar Documents

Publication Publication Date Title
US20200281662A1 (en) Ultrasound system and method for planning ablation
CN115317128A (en) Ablation simulation method and device
EP2064991B1 (en) Flashlight view of an anatomical structure
AU2006201451B2 (en) Registration of ultrasound data with pre-acquired image
AU2006201644B2 (en) Registration of electro-anatomical map with pre-acquired imaging using ultrasound
US20170095226A1 (en) Ultrasonic diagnostic apparatus and medical image diagnostic apparatus
JP5622995B2 (en) Display of catheter tip using beam direction for ultrasound system
CN115811961A (en) Three-dimensional display method and ultrasonic imaging system
JP7171168B2 (en) Medical image diagnosis device and medical image processing device
JP2006305359A (en) Software product for 3D cardiac imaging using ultrasound contour reconstruction
AU2006201645A1 (en) Display of two-dimensional ultrasound fan
WO2014031531A1 (en) System and method for image guided medical procedures
Singla et al. Intra‐operative ultrasound‐based augmented reality guidance for laparoscopic surgery
KR102439769B1 (en) Medical imaging apparatus and operating method for the same
CN112634191A (en) Medical image analysis method, ultrasonic imaging apparatus, and computer storage medium
CN112043377B (en) Method and system for ablation path planning assisted by ultrasound field simulation in any CT slice
US20240122653A1 (en) Parametric contour map visualization for needle guide system
WO2015087203A1 (en) Imaging systems and methods for monitoring treatment of tissue lesions
CN115530973A (en) Ablation visualization method and system
KR102850298B1 (en) Map of body cavity
CN115998423B (en) Display method for simulated ablation and ultrasonic imaging system
CN115998334B (en) Ablation effect display method and ultrasound imaging system
CN120770927B (en) A method and system for planning minimally invasive surgery based on image navigation
CN118593111A (en) Image processing method, device, system and storage medium
CN119745505A (en) Real-time monitoring puncture robot tumor ablation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination