AU5782301A - Stereotactic surgical procedure apparatus and method - Google Patents
- Legal status: Granted
Description
AUSTRALIA
Patents Act 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT
Name of Applicant: NORTHWESTERN UNIVERSITY
Actual Inventors: Michael A. Peshkin; Julio J. Santos-Munné
Address for Service: CULLEN & CO., Patent & Trade Mark Attorneys, 239 George Street, Brisbane, Qld. 4000, Australia.
Invention Title: Stereotactic Surgical Procedure Apparatus and Method
Details of Associated Convention Application: United States Application No. 08/648,313 filed 15 May 1996
The following statement is a full description of this invention, including the best method of performing it known to us:
This invention is a divisional of Australian Patent Application No. 30664/97.
Background and Summary of the Invention

The present invention relates to an apparatus and method for planning and guiding insertion of an object along a linear trajectory into a body. More particularly, the present invention relates to an apparatus and method for coordinating two captured fluoroscope images to permit effective three-dimensional planning of the trajectory using only two-dimensional images.
Numerous medical interventions involve placing a needle, drill, screw, nail, wire or other device in the body. In some cases the angle and position of the device are both of critical importance, for example in the drilling of a hole for a screw along the axis of a spinal pedicle. In other cases, it is primarily the positioning of the end-point of the device which is important, for example in placing a biopsy needle into a suspected tumor.
In still other cases, the objective is only to define a point rather than a line, for example in targeting a tumor for radiation therapy. Many other examples exist, especially in the field of orthopaedics.
The present invention is also relevant to the development of percutaneous technique. Executing a linear trajectory for the insertion of instrumentation into the body through the skin is more difficult than open surgical technique, but the reduced invasiveness and trauma of percutaneous placement makes it desirable.
Fluoroscopy is frequently used by surgeons to assist medical procedures.
Continuous fluoroscopy during a surgical procedure is undesirable because it exposes the surgeon's hands to radiation. Furthermore, regardless of whether intermittent or continuous fluoroscopy is used, the resulting images are two-dimensional while insertion of the surgical instrument requires three-dimensional awareness by the surgeon.
The apparatus and method of the present invention involve acquisition and storage of two separate fluoroscopic images of the body, taken from two different angles.
Typically, although not necessarily, these would be an anterior/posterior image taken front-to-back of the patient, and a sagittal image taken side-to-side. These two fluoroscopic images are displayed on two adjacent computer monitors. The surgeon uses a trackball or other computer input device to specify on the monitors an insertion point and an insertion trajectory.
A mechanical positioning device is then used to position a guide through which the surgeon performs the insertion of the surgical instrument. The positioning device may either be an active computer controlled manipulator such as a robot, or it may be a manually adjusted mechanical device which is set numerically in accordance with an output from the computer.
The apparatus and method of the present invention establish the projective geometric relationships relating each of two acquired fluoroscopic images to the three-dimensional workspace around and within the patient's body, despite essentially arbitrary positioning of the fluoroscope. The two images then become a coordinated pair, which permits three-dimensional planning that might otherwise be expected to require a computed tomography (CT) scan.
While the acquisition and display of two approximately orthogonal images may be expected to present the surgeon with the greatest ability to plan in three dimensions, two images are not strictly necessary. It is possible to use a single captured image for some procedures, particularly if the surgeon has adjusted the beam axis of the fluoroscope into alignment with the intended trajectory. Furthermore, more than two images could also be acquired and coordinated, should that be advantageous.

Several other approaches to stereotactic or robotic surgery, planned on a computer screen displaying medical images, have been described by other workers, and will be listed below. Some background is given here before discussing prior art. The method and apparatus of the present invention constitute a technique we call coordinated fluoroscopy. Coordinated fluoroscopy is a technique for REGISTRATION and for SURGICAL PLANNING. It allows registration based on the acquired fluoroscopic images themselves, without requiring any additional measuring devices. It allows three-dimensional surgical planning based on fluoroscopic views from two angles, without requiring three-dimensional imaging such as computed tomography and without requiring that the two fluoroscopic images be acquired from orthogonal fluoroscope poses.
REGISTRATION
Registration is a key step in any image-guided surgical system.
Registration is the determination of the correspondence between points of the image upon which a surgical plan is prepared, and points of the workspace in the vicinity of (and within) the patient. If a numerically controlled tool (whether robotic or manual) is to be used, the coordinate system of that device must also be brought into registry with the image.
It is common to accomplish registration with the help of a global positioning device, usually optical, which can measure the three-dimensional coordinates of markers placed anywhere over a large volume of space. Coordinated fluoroscopy avoids the necessity for this expensive and inconvenient device, instead deriving registration directly from the acquired fluoroscopic images themselves. Coordinated fluoroscopy uses a "registration artifact" which is held in a fixed position relative to the patient while one or more fluoroscopic images are acquired from different angles (poses).
There is no need to constrain the fluoroscope poses at which these various images are acquired, for instance to require that they be orthogonal, nor is there a need to instrument the fluoroscope so that the pose angles can be measured. Instead, pose information is extracted after-the-fact from the images. It is a substantial benefit of the present invention that surgeons can acquire fluoroscopic images using fluoroscope poses of their own choosing, as they are accustomed.
The registration artifact contains a plurality of features (fiducials) which are designed to be easily identifiable on a fluoroscopic image. The embodiment described here uses eight small steel spheres embedded in a radiolucent matrix. The positions of these fiducials are known relative to a coordinate system fixed in the artifact, either by design or by measurement.
From the two-dimensional locations of the projections of these fiducials in a fluoroscopic image, we can determine the geometric projections that carry a general three-dimensional point anywhere in the vicinity of the artifact into a projected point on the image. This establishes registration between image and workspace. Several images can each be registered relative to the same registration artifact, thus also bringing all the images into registry with one another.
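The projection identification described above is, in effect, camera resectioning. As an illustrative sketch only (the patent's actual numerical method is given in its Appendix and is not reproduced here), a Direct Linear Transform recovers a 3x4 projection matrix from the fiducials' known 3-D positions and their picked 2-D image locations; the coordinates in the example are invented:

```python
import numpy as np

def estimate_projection(points_3d, points_2d):
    """Direct Linear Transform: recover a 3x4 matrix P such that
    P @ [X, Y, Z, 1] ~ [u, v, 1], from >= 6 non-coplanar fiducials."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The right singular vector for the smallest singular value of A
    # is the flattened projection matrix, up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Project a 3-D point and dehomogenize to 2-D image coordinates."""
    h = P @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return h[:2] / h[2]
```

With all eight fiducials identified on an image, a fit of this kind yields that image's registration; repeating it for the second image brings both into registry with the artifact, and hence with each other.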
Identification of the geometric projections, as discussed above, would not be possible with raw fluoroscope images, which are highly nonlinear and distorted. It is necessary first to map and compensate for these distortions. It is useful to be aware of the necessity of distortion compensation when comparing the present invention to prior art.
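A common way to map and compensate such distortion, sketched here under the assumption of a smooth polynomial-like warp (the patent does not specify this particular method), is to image a calibration grid of known geometry and fit polynomial correction maps from distorted to true pixel coordinates:

```python
import numpy as np

def fit_dewarp(distorted, true, order=3):
    """Fit polynomial maps u' = f(u, v), v' = g(u, v) taking distorted
    pixel coordinates (from an imaged calibration grid) to their known
    undistorted positions."""
    u, v = distorted[:, 0], distorted[:, 1]
    A = np.column_stack([u**i * v**j
                         for i in range(order + 1)
                         for j in range(order + 1 - i)])
    cu, *_ = np.linalg.lstsq(A, true[:, 0], rcond=None)
    cv, *_ = np.linalg.lstsq(A, true[:, 1], rcond=None)
    return cu, cv

def dewarp(cu, cv, pts, order=3):
    """Apply the fitted correction to an (N, 2) array of (u, v) points."""
    u, v = pts[:, 0], pts[:, 1]
    A = np.column_stack([u**i * v**j
                         for i in range(order + 1)
                         for j in range(order + 1 - i)])
    return np.column_stack([A @ cu, A @ cv])
```

After this correction, fiducial projections behave as if produced by an ideal pinhole projection, which is what makes the geometric identification above tractable.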
SURGICAL PLANNING

Surgical planning is also a key step in image-guided surgery. Planning of three-dimensional surgical procedures might be expected to be done on a three-dimensional dataset, such as can be reconstructed from computed tomography (CT) data. However, surgeons are accustomed to planning on two-dimensional images: radiographs or fluoroscopic images. Indeed, even when CT data is available, planning is usually done on individual two-dimensional CT "slices" rather than on a three-dimensional reconstruction.

The coordinates of the endpoints of a line segment representing an intended screw, biopsy needle, or drilled hole are of course three-dimensional, as are the coordinates of a single point within the body marking the present location of a tumor or a fragment of shrapnel. In surgical planning such points can be specified on a two-dimensional image, or on each of several two-dimensional images. Each such two-dimensional image is a projection of the same three-dimensional space.
It is necessary to convert the two-dimensional coordinates of specified points on each of several images into a three-dimensional coordinate which can be used to guide a tool along a desired trajectory or to a desired point within the body. To do so one must have knowledge of the geometric relationship of the projections that created the images.
In the absence of such geometric knowledge a point specified on one image and a point independently specified on another image may in fact not correspond to any single point within the body. This is so because a point specified on a two-dimensional image is the projection of a LINE in space. The implied point in three-dimensions is the intersection of two such lines, one implied by the point specified on each image. Two such lines created independently may be skew, intersecting nowhere.
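This paragraph is the geometric heart of the planning constraint: a pixel back-projects to a line, and two independently picked pixels generally give skew lines. Both operations can be sketched as follows, assuming the registration step has produced a 3x4 projection matrix for each image (an illustrative model, not the patent's exact formulation):

```python
import numpy as np

def sight_line(P, uv):
    """Back-project pixel (u, v): the line of all 3-D points that P maps
    to that pixel.  Returns (source_position, direction)."""
    M_inv = np.linalg.inv(P[:, :3])
    source = -M_inv @ P[:, 3]                        # x-ray source position
    direction = M_inv @ np.array([uv[0], uv[1], 1.0])
    return source, direction

def line_midpoint(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two (possibly skew) lines
    p1 + t*d1 and p2 + s*d2: the natural 3-D point implied by one pick
    on each image."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b, w = np.dot(d1, d2), p1 - p2
    t = (b * np.dot(d2, w) - np.dot(d1, w)) / (1 - b * b)
    s = (np.dot(d2, w) - b * np.dot(d1, w)) / (1 - b * b)
    return 0.5 * ((p1 + t * d1) + (p2 + s * d2))
```

When the two sight lines genuinely intersect, the midpoint is the intersection; when they are skew, it is the point closest to both, which is one reasonable way to resolve the ambiguity the text describes.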
Similarly, line segments for an intended procedure cannot be chosen independently on two images; otherwise they will in general not correspond to a well-defined three-dimensional line segment.
In coordinated fluoroscopy, the geometric projections that relate the two images to a single three-dimensional coordinate system are established before planning commences. The points chosen by the surgeon on two (or more) images can therefore be constrained by the software such that they DO correspond to a well-defined point in three dimensions. In practice, as a surgeon adjusts an intended point or line segment on one image, the point or line segment displayed on the other image(s) continuously updates and adjusts as well. One cannot draw "arbitrary" points or line segments independently on the images; the software only allows one to draw points or line segments that correspond to a well-defined point or line segment in three dimensions.
The benefits of planning on geometrically coordinated images as described above are threefold:

1) Once the surgeon has selected a point or a line segment on two images, the three-dimensional point or line segment to which the selections correspond is fully defined and ready to be executed.
2) An axial view such as could be attained from a CT slice is generally unattainable fluoroscopically. The angle that is most easily visualized in axial view, known as the transverse angle, is therefore difficult to select or execute under fluoroscopy. In coordinated fluoroscopy the transverse angle is implicitly specified by the surgeon by selecting line segments on two images. This may assist the surgeon in visualizing and planning the transverse angle for a procedure.
3) In conventional fluoroscopy, image dilation due to beam divergence is of unknown extent, making accurate measurement of anatomic distances difficult. In coordinated fluoroscopy the actual in-situ length of an intended line segment can be determined by the software. This is useful for selecting appropriate screw length, as well as for other purposes.
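Once both endpoints of the intended segment are known in three-dimensional workspace coordinates, the in-situ length is a plain Euclidean distance, free of the magnification that beam divergence imposes on each 2-D image. A small sketch (the screw stock sizes are invented for illustration):

```python
import numpy as np

def in_situ_length(top_3d, bottom_3d):
    """True 3-D length of the planned segment in workspace units (mm),
    unaffected by fluoroscopic image dilation."""
    return float(np.linalg.norm(np.subtract(bottom_3d, top_3d)))

def nearest_screw(length_mm, stock=(30, 35, 40, 45, 50)):
    """Pick the longest stocked screw not exceeding the measured length
    (stock sizes here are hypothetical, for illustration only)."""
    fitting = [s for s in stock if s <= length_mm]
    return max(fitting) if fitting else min(stock)
```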
BACKGROUND
Lavalle et al. in Grenoble, France have developed a system for spinal surgery which uses computed tomography as an image source. The CT data is assembled into a three-dimensional data set which can then be resliced at will on orthogonal planes.
Surgical planning proceeds on three mutually orthogonal planes simultaneously. Registration is performed by using an optical tracking device to digitize arbitrary surface points of the vertebrae, and matching those surface points to the CT data set.
Nolte et al. in Bern, Switzerland have developed a spinal system very similar to that of Lavalle et al. Registration differs in that the optical tracking device is used to digitize specific anatomic landmarks rather than general surface contours. The features are then pointed out manually in CT data, allowing a match to be made.
P. Finlay in High Wycombe, England has developed a fluoroscopic system for head-of-femur (hip) fractures. Accuracy requirements in this procedure are not very great, so fluoroscope distortion compensation is not needed. Its absence also precludes identification of the geometric projections from images as is done in the present invention. Instead, the two fluoroscope poses are required to be orthogonal and the C-arm must not be moved along the floor in between the two images. Registration is accomplished by noting various features of a surgical tool which appears in the images, and by highlighting a marker wire which also appears in the field of view of the fluoroscope.
Potamianos et al. in London, England have developed a system for kidney biopsy and similar soft-tissue procedures. It incorporates a digitizing mechanical arm to which a biopsy needle is attached, and which can be moved about manually by the surgeon. Surgical planning per se is absent; instead a line segment representing the present position of the needle is displayed superimposed upon captured (static) fluoroscope images, as the needle is moved manually near and within the patient.
Phillips et al. in Hull, England have developed a system for orthopaedic procedures. It uses an optical tracking device as well as a fluoroscope. Registration is accomplished by instrumenting the fluoroscope with light emitting diodes and tracking them with the optical tracker. Surgical planning software is specific to the surgical procedure, and tends to offer medical opinion rather than just display a trajectory as in the present invention. For intramedullary nail placement, for instance, the surgeon outlines target holes in an intramedullary prosthetic, and software calculates a trajectory through them.
U.S. Patent 4,750,487 (Zanetti) describes a stereotactic frame which overlays a patient. A single anterior/posterior fluorograph is then acquired, in which a crosshairs affixed to the frame is visible. By measuring the displacement of the crosshairs from the desired target, a motion of the frame can be accomplished which brings the two into alignment. This invention does not facilitate three-dimensional stereotaxy as does the present invention.
U.S. Patent 5,078,140 (Kwoh) describes a stereotactic and robotic system for neurosurgery. It uses CT images.
ASPECTS OF THE INVENTION

According to the present invention, a method is provided for planning a stereotactic surgical procedure for a linear trajectory insertion of surgical instrumentation into a body, using a fluoroscope for generating images of the body. The method includes placing adjacent to the body a registration artifact containing a plurality of fiducials; displaying on a computer monitor an image of the patient's body and the registration artifact; receiving a user or automatic algorithmic input to identify two-dimensional coordinates of the fiducials of the registration artifact displayed on the first monitor; and registering the image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the image.
The method further includes displaying on a second computer monitor a second image, taken of the patient's body and the registration artifact but from an angle different from that of the first image, and receiving a user or automatic algorithmic input to identify two-dimensional coordinates of the fiducials displayed on the second computer monitor; and registering the second image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the second image.
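The registration step for each image, as claimed, is a numerical fit of a projection model's parameters to the picked fiducial coordinates. A minimal Gauss-Newton sketch, assuming an 11-parameter model (a 3x4 matrix with its last entry pinned to 1) rather than the patent's actual parameterization:

```python
import numpy as np

def refine_registration(p0, fiducials_3d, observed_2d, iters=30):
    """Gauss-Newton least-squares fit: adjust the 11 free parameters so
    the projected fiducials best match their picked image coordinates."""
    X = np.c_[np.asarray(fiducials_3d, dtype=float),
              np.ones(len(fiducials_3d))]
    obs = np.asarray(observed_2d, dtype=float)

    def residual(p):
        P = np.append(p, 1.0).reshape(3, 4)
        h = X @ P.T
        return (h[:, :2] / h[:, 2:3] - obs).ravel()

    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residual(p)
        J = np.empty((r.size, p.size))      # forward-difference Jacobian
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-7
            J[:, j] = (residual(p + dp) - r) / 1e-7
        p = p - np.linalg.lstsq(J, r, rcond=None)[0]
    return p
```

Any parameterization whose projections are differentiable in the parameters admits the same treatment; the choice of model and optimizer here is an assumption for illustration.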
The method, whether one or two images have been acquired, further includes the step of receiving a user input to select on a computer monitor an entry point for a surgical instrument. In the case of two images, it also includes receiving a user input to select on a computer monitor the position, length, and angles of a virtual guidewire representing the trajectory for the surgical instrument; and drawing a segment, to be known as a PROJECTED GUIDEWIRE, on the image(s). When there are two images, the projected guidewires are constrained to correspond geometrically to the same three-dimensional segment in space, to be known as the VIRTUAL GUIDEWIRE. The method further includes receiving a user input to move either end of a projected guidewire, revising the virtual guidewire of which the projected guidewire(s) are projections, and redrawing the projected guidewires in correspondence with the revised virtual guidewire.
The method further includes receiving a user input to change the length of the virtual guidewire, and redrawing the projected guidewire(s) in correspondence with the revised virtual guidewire. A special case is that the length is zero, so that what is planned is a virtual target point rather than a virtual guidewire.
The method further includes receiving a user input to change the sagittal, transverse, or coronal angle(s) of the virtual guidewire, updating the orientation of the virtual guidewire based on the new angles, and redrawing the projected guidewire(s) in correspondence with the revised virtual guidewire.
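How the angle inputs update the guidewire orientation is not spelled out in this passage; one plausible parameterization (an assumption, not the patent's definition) composes rotations about fixed axes and re-derives the bottom endpoint from the top point, length, and angles:

```python
import numpy as np

def guidewire_bottom(top, length, sagittal_deg, transverse_deg):
    """Start from straight down (-Z), rotate by the sagittal angle about
    X, then by the transverse angle about Y, and extend by the planned
    length.  Axis conventions here are assumed, not the patent's."""
    sa = np.radians(sagittal_deg)
    ta = np.radians(transverse_deg)
    d = np.array([0.0, 0.0, -1.0])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(sa), -np.sin(sa)],
                   [0, np.sin(sa), np.cos(sa)]])
    Ry = np.array([[np.cos(ta), 0, np.sin(ta)],
                   [0, 1, 0],
                   [-np.sin(ta), 0, np.cos(ta)]])
    return np.asarray(top, dtype=float) + length * (Ry @ Rx @ d)
```

After such an update, each projected guidewire would be redrawn by projecting the new top and bottom endpoints through that image's registered geometric model.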
The method further includes producing an output to adjust the coordinates of a tool guide such that the projection of the axis of the guide in an image is brought into correspondence with the entry point displayed on the computer monitor.
The method further includes producing an output to adjust the coordinates of a tool guide such that it is brought into correspondence with the virtual guidewire; or producing an output to adjust the coordinates of a tool guide such that the position of the guide along its axis is offset by a preselected distance from one endpoint of the virtual guidewire, in order to control the location within the body of the surgical instrument to be inserted.
The method further includes transmitting said coordinates to a robot or other automatic mechanical device, or displaying said coordinates such that a human operator may manually adjust a mechanical device.
Brief Description of the Drawings

The detailed description particularly refers to the accompanying figures in which:

Fig. 1 is a diagrammatic illustration of the stereotactic surgical apparatus of the present invention for coordinating images from a fluoroscope, planning a linear trajectory medical intervention, and controlling a robot to control the linear trajectory medical intervention;

Fig. 2 is a perspective view of a registration artifact and tool guide of the present invention;

Fig. 3a is a sample screen display of the user interface which includes an anterior/posterior image taken by the fluoroscope and displayed on a first computer monitor along with a number of the buttons and entry fields necessary to run the program;

Fig. 3b is a sample screen display which includes a sagittal image taken by the fluoroscope and displayed on a second computer monitor along with a number of the buttons and entry fields necessary to run the program;

Fig. 3c is a flow chart of the steps performed by the computer during a main program loop;

Fig. 4 is a flow chart illustrating the steps performed by the computer to acquire an A/P image from the fluoroscope;

Fig. 5 is a flow chart illustrating the steps performed by the computer to acquire a sagittal image from the fluoroscope;

Fig. 6 is a flow chart illustrating the steps performed by the computer and the user to select or identify A/P fiducials from the A/P image displayed in Fig. 3a;

Fig. 7 is a flow chart of the steps performed by the computer and the user to select or identify sagittal fiducials displayed on the sagittal image of Fig. 3b;

Fig. 8 is a flow chart illustrating the steps performed by the computer to register the A/P image;

Fig. 9 is a flow chart illustrating the steps performed by the computer to register the sagittal image;

Fig. 10 is a flow chart illustrating the steps performed by the computer for changing a transverse angle of the virtual guidewire;

Fig. 11 is a flow chart illustrating the steps performed by the computer to change the length of the virtual guidewire used in the stereotactic surgical procedure;

Fig. 12 is a flow chart illustrating the steps performed by the computer to change a sagittal angle of the virtual guidewire;

Fig. 13 is a flow chart illustrating the steps performed by the computer to change the approach angle of the robot;

Fig. 14 is a flow chart illustrating the steps performed by the computer to move the robot illustrated in Fig. 1 to the planned position and orientation;

Fig. 15 is a flow chart illustrating the steps performed by the computer to move the end effector of the robot along the axis of the tool guide;

Fig. 16 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the A/P image area of Fig. 3a;

Fig. 17 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the sagittal image area in Fig. 3b; and

Fig. 18 is a flow chart illustrating the steps performed by the computer when the computer receives a user input based on a cursor in the robot control areas of Figs. 3a-b.

Detailed Description of the Drawings

Referring now to the drawings, Fig. 1 illustrates the stereotactic system for linear trajectory medical interventions using calibrated and coordinated fluoroscopy.
The apparatus and method of the present invention are designed to utilize images from a fluoroscope 12, such as a standard C-arm, which generates fluoroscopic or x-ray images of a body on a surgical table 14. The imaging arm 16 is moveable so that both anterior/posterior and sagittal or side images of the body can be taken.
A robot 18 is situated adjacent the surgical table 14. Illustratively, the robot is a PUMA-560 robot. The robot 18 includes a movable arm assembly 20 having an end flange 22. An alignment or registration artifact 24 is coupled to the end flange 22 of robot 18.
The registration artifact 24 is best illustrated in Fig. 2. The artifact 24 is X-ray and visually transparent with the exception of eight opaque spheres or fiducials 26, and an aperture 30 to hold a tool guide 28 through the artifact 24. Initially, the artifact 24 is positioned roughly over the area of interest of body 32 and within the field of view of the fluoroscope 16. Therefore, the fiducials 26 show up as distinct dots on the A/P and sagittal images as discussed below. The shape of the artifact is designed so that the image dots from the fiducials 26 will not overshadow each other and so that the image is sensitive to any angular deviations. The robot arm 20 can adjust the artifact 24 in three dimensions about X-axis 34, Y-axis 36, or Z-axis 38 illustrated in Fig. 1.
The coordinated fluoroscopic control system of the present invention is controlled by computer 40, which includes a microprocessor 42, internal RAM 44, and a hard disk drive 46. Computer 40 is coupled to two separate graphics monitors 48 and 50. The first graphics monitor 48 displays a sagittal image taken by the C-arm 12.
The second monitor 50 displays an A/P image taken by the C-arm 12. Computer 40 further includes a serial communication port 52 which is coupled to a controller 53 of robot 18. Computer 40 is also coupled to the C-arm 12 for receiving the images from the C-arm 12 through an image acquisition card 54. Computer 40 is also coupled to an input device 56, which is illustratively a keyboard having a track ball input control 58. Track ball input 58 controls a cursor on both monitors 48 and 50. The displays on monitors 48 and 50 are illustrated in Figs. 3a and 3b.
Referring now to Fig. 3b, the sagittal image is displayed in area 62 on monitor 48. All eight fiducials 26 should appear in the sagittal image area 62. If not, the artifact 24 or the C-arm 12 should be adjusted. As discussed in detail below, computer 40 displays a top entry point 64 and a bottom point 66 of a projected guidewire 68. The projected guidewire 68 is a line segment which is displayed on the sagittal image area representing the position of the instrumentation to be inserted during the stereotactic surgical procedure. A line of sight 70 is also displayed in the sagittal image area 62.
Various user option buttons are displayed on monitor 48. The surgeon or operator can access these options by moving the cursor to the buttons and clicking or by selecting the appropriate function keys (Fl, F2, etc.) on the keyboard. The option buttons displayed on monitor 48 include button 72 (function F2) for acquiring the sagittal image, button 74 (F4) for selecting sagittal fiducials, and button 76 (F6) for registering the sagittal image. In addition, button 78 (F10) is provided for setting the sagittal angle, button 80 (F8) is provided for setting the screw length, and button 82 (F12) is provided for moving the robot along an axis of the tool guide. Finally, the display screen includes a robot control area 84. The operator can move the cursor and click in the robot control area 84 to control robot 18 as discussed below.
Referring to Fig. 3a, the A/P image displayed on the display screen of monitor 50 is illustrated. The A/P image is displayed in area 86 of the screen. Again, all eight fiducials 26 should appear within the A/P image area 86. The top insertion point of the virtual guidewire is illustrated at location 88, and the bottom point is located at location 90. The projection of the guidewire onto the A/P image is illustrated by line segment 92.
Computer 40 also displays various option buttons on monitor 50. Button 94 (F1) is provided for acquiring the A/P image. Button 96 (F3) is provided for selecting the A/P fiducials. Button 98 (F5) is provided for registering the A/P image. Button 100 (F7) is provided for setting a transverse angle of the virtual guidewire, and button 102 (F9) is provided for setting an approach angle for the robot. Button 104 (F11) is provided for moving the robot. Computer 40 also displays a robot control area 84. The operator can move the cursor and click in the robot control area 84 to control robot 18 as discussed in detail below.
The present invention allows the surgeon to select the point of entry for the surgical instrument by moving the top point of the projected guidewire 88 in the A/P image area 86. The operator can also adjust the bottom point of the projected guidewire 90 to specify the transverse and sagittal angles. In addition, the operator can adjust the top point of the projected guidewire 64 to specify the position on the line of sight, and the bottom point of the projected guidewire 66 to specify the sagittal and transverse angles, in the sagittal image area 62. Therefore, the surgeon can select the desired position and orientation of the surgical instrument in the body.
The computer 40 is programmed with software to correct spatial distortions from the optics of the fluoroscope 12. The system of the present invention permits effective three-dimensional planning of the stereotactic surgical procedure using only a pair of two-dimensional fluoroscopic images displayed on the adjacent monitors 48 and 50. It is not required to use a CT slice in order to fully specify the location of the surgical instrument. The computer 40 establishes the direct geometric relationship between the A/P and sagittal images, despite image distortions and the essentially random or free-hand positioning of the C-arm 12 used to acquire the A/P and sagittal images. The improved system of the present invention can establish this exact geometric relationship within sub-millimeter accuracy.
Once the sagittal and A/P images are registered, points or lines chosen by the surgeon on one of the A/P image or the sagittal image are immediately displayed by computer 40 as corresponding projections on the other image. Therefore, using the sagittal image on monitor 48 and the A/P image on monitor 50, the surgeon can stereotactically plan the linear trajectory without the requirement of a CT scan slice.
Accordingly, the procedure of the present invention can be performed without the very expensive CT scan devices which can cost in excess of $1 million.
Details of the operation of the software for controlling the system of the present invention are illustrated in Figs. 3c-18.
All of the notations, subscripts, mathematical formulae, equations, and explanations are included in the attached Appendix. Throughout the description of the flow charts of Figs. 4-18, reference will be made to the Appendix and to the numbered Sections [1] through [15] set forth in the Appendix.
The main program begins at block 110 of Fig. 3c. Computer 40 creates a parent window at block 112 and then draws buttons on a main window as illustrated at block 114. Computer 40 then creates a sagittal child window on monitor 48 as illustrated at block 116. Computer 40 also creates an A/P child window on monitor 50 as illustrated at block 118. Computer 40 then determines whether a button or key has been pressed at block 120. If not, computer 40 waits as illustrated at block 122 and then returns to block 120 to wait for a button or key to be pressed.
If a button or key was pressed at block 120, computer 40 determines whether the Acquire A/P Image button 94 or the F1 key was pressed at block 124. If so, computer 40 advances to block 166 of Fig. 4. If not, computer 40 determines whether the Acquire Sagittal Image button 72 or the F2 key was pressed at block 126. If so, the computer 40 advances to block 200 of Fig. 5. If not, computer 40 determines whether the Select A/P Fiducials button 96 or the F3 key was pressed at block 128. If so, computer 40 advances to block 234 of Fig. 6. If button 96 or the F3 key was not pressed at block 128, computer 40 determines whether the Select Sagittal Fiducials button 74 or the F4 key was selected as illustrated at block 130. If so, computer 40 advances to block 276 of Fig. 7. If not, computer 40 advances to block 132.
In block 132, computer 40 determines whether the Register A/P Image button 98 or the F5 key was pressed. If so, computer 40 advances to block 324 of Fig. 8.
If not, computer 40 determines whether the Register Sagittal Image button 76 or the F6 key was pressed as illustrated at block 134. If so, computer 40 advances to block 350 of Fig. 9. If not, computer 40 advances to block 136.
From block 136, computer 40 determines whether the Transverse Angle button 100 or the F7 key was pressed as illustrated at block 138. If so, computer 40 advances to block 376 of Fig. 10. If not, computer 40 determines whether the Screw Length button 80 or the F8 key was pressed as illustrated at block 140. If so, computer 40 advances to block 388 of Fig. 11. If not, computer 40 determines whether the Sagittal Angle button 78 or the F10 key was pressed as illustrated at block 142. If so, computer 40 advances to block 400 of Fig. 12. If not, computer 40 determines whether the Approach Angle button 102 or the F9 key was pressed as illustrated at block 144. If so, computer 40 advances to block 412 of Fig. 13. If not, computer 40 advances to block 146.
In block 146, computer 40 determines whether the Move Robot button 104 or the F11 key was pressed. If so, computer 40 advances to block 422 of Fig. 14. If not, computer 40 determines whether the Move Robot Along Axis button 82 or the F12 key was pressed as illustrated at block 148. If so, computer 40 advances to block 452 of Fig. 15. If not, computer 40 determines whether the A/P image area was selected by clicking when the cursor is in the A/P image area 86 as illustrated at block 150. If so, computer 40 advances to block 476 of Fig. 16. If not, computer 40 then determines whether the sagittal image area was selected by positioning the cursor in the sagittal image area 62 on monitor 48 and clicking. If so, computer 40 advances to block 506 of Fig. 17. If not, computer 40 advances to block 154.
From block 154, computer 40 determines whether the robot control area was selected by moving the cursor and clicking in the robot control area 84 on monitor 48 or the robot control area 106 on monitor 50. If the robot control was selected, computer 40 advances to block 536 of Fig. 18. If the robot control was not selected, computer 40 advances to block 158 to determine whether a key was pressed indicating that the operator desires to quit the main program. If so, computer 40 frees all allocated memory as illustrated at block 160 and ends the main program as illustrated at block 162. If not, computer 40 advances back to block 122 to wait for another button or key to be pressed.
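The dispatch structure of blocks 120 through 158 amounts to an event loop keyed on buttons and function keys. A minimal sketch in Python (the handler and event names are illustrative, not part of the described system):

```python
# Sketch of the main event loop of Fig. 3c. Each function key maps to a
# zero-argument handler, mirroring the chain of tests in blocks 124-158.
def main_loop(get_event, dispatch, quit_event="Quit"):
    """Block 120: wait for a button or key; route it to its handler.

    'get_event' blocks until the operator presses something; 'dispatch'
    maps an event name (e.g. "F1" for Acquire A/P Image) to a handler."""
    while True:
        event = get_event()          # blocks 120/122: wait for input
        if event == quit_event:      # block 158: operator wants to quit
            return                   # blocks 160-162: free memory, end
        handler = dispatch.get(event)
        if handler is not None:
            handler()                # e.g. F1 -> Acquire A/P Image (Fig. 4)
```

Unknown events are simply ignored and the loop returns to waiting, matching the fall-through back to block 122.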
The various functions performed by the system of the present invention will now be described in detail. If the Acquire A/P Image button 94 or the F1 key is pressed, computer 40 advances to block 166 of Fig. 4. Computer 40 then determines whether the image acquisition card is in a pass-through mode at block 168. Button 94 and the F1 key are toggle buttons. When the button 94 or the F1 key is initially pressed, the card is in pass-through mode and images from the C-arm 12 are transmitted directly to the monitor 50. Whatever image is being taken by the C-arm is seen on the monitor 50 in the A/P image area 86. Therefore, if the card is not in the pass-through mode at block 168, pressing button 94 or the F1 key sets the pass-through mode at block 170. Computer 40 then returns to wait for the next command as illustrated at block 172. When the button 94 or the F1 key is pressed again after the image acquisition card within the computer 40 is in pass-through mode, it freezes the live image and captures the A/P image as illustrated at block 174. This captured image is then displayed on monitor 50 as illustrated at block 176. Computer 40 then disables and dims buttons F11, F12 and F5, and enables and brightens button 96 and key F3 as illustrated at block 178. In other words, after the A/P image has been captured, computer 40 allows the operator the option to select the A/P fiducials through button 96 or key F3.
Computer 40 then assigns a NULL tool as illustrated at block 180. The NULL tool of the robot is the three-dimensional location of end flange 22 of robot 18. In other words, the end flange 22 establishes a three-dimensional position for the robot, without depending on the particular surgical instrumentation which may be attached to the end flange 22. Computer 40 determines whether the NULL tool was properly assigned at block 182. If not, computer 40 generates an error message "Tool Not Assigned!" as illustrated at block 184. Computer 40 then waits for the next command as illustrated at block 186. If the NULL tool is assigned properly at block 182, computer 40 gets the current position of the end flange from the robot controller 53 as illustrated at block 188. Computer 40 then determines whether the sagittal image is displayed on monitor 48 as illustrated at block 190. If not, computer 40 sends a message of "Acquire Sagittal Image" as illustrated at block 192, and then returns to wait for the next command at block 194. If the sagittal image is displayed at block 190, computer 40 sends the message "Select the Fiducials" as illustrated at block 196. Computer 40 then returns to wait for the next command at block 198.
If the Acquire Sagittal Image button 72 or the F2 key is pressed, computer 40 advances to block 200 of Fig. 5. Computer 40 then determines whether the image acquisition card is in a pass-through mode at block 202. Button 72 and the F2 key are toggle buttons. If the card is not in the pass-through mode at block 202, pressing button 72 or the F2 key sets the pass-through mode at block 204. Computer 40 then returns to wait for the next command as illustrated at block 206. When the button 72 or the F2 key is pressed again after the image acquisition card within the computer 40 is in pass-through mode, it freezes the live image and captures the sagittal image as illustrated at block 208.
This captured image is then displayed on monitor 48 as illustrated at block 210.
Computer 40 then disables and dims buttons F11, F12 and F6, and enables and brightens button 74 and key F4 as illustrated at block 212. In other words, after the sagittal image has been captured, computer 40 allows the operator the option to select the sagittal fiducials through button 74 or key F4.
Computer 40 then assigns a NULL tool as illustrated at block 214.
Computer 40 determines whether the NULL tool was properly assigned at block 216. If not, computer 40 generates an error message "Tool Not Assigned!" as illustrated at block 218. Computer 40 then waits for the next command as illustrated at block 220. If the NULL tool is assigned properly at block 216, computer 40 gets the current position of the end flange 22 from the robot controller 53 as illustrated at block 222. Computer 40 then determines whether the A/P image is displayed on monitor 50 as illustrated at block 224.
If not, computer 40 sends a message of "Acquire A/P Image" as illustrated at block 226, and then returns to wait for the next command at block 228. If the A/P image is displayed at block 224, computer 40 sends the message "Select the Fiducials" as illustrated at block 230. Computer 40 then returns to wait for the next command at block 232.
If the Select A/P Fiducials button 96 or the F3 key is pressed, computer 40 advances to block 234 of Fig. 6. Computer 40 first determines whether the A/P image is displayed on monitor 50 as illustrated at block 236. If not, computer 40 generates an error message, "Acquire A/P Image", as illustrated at block 238. Computer 40 then returns to wait for the next command as illustrated at block 240.
If the A/P image is displayed at block 236, computer 40 displays a square cursor on the display screen of monitor 50 as illustrated at block 242. Computer 40 then resets the number of located fiducials to zero as illustrated at block 244. Next, computer 40 waits for the trackball button to be clicked by the operator as illustrated at block 246.
Once the trackball button is clicked over a fiducial shadow, computer 40 generates a beep as illustrated at block 248. Computer 40 then performs edge detection around the selected cursor coordinate as illustrated at block 250. Such edge detection is performed using a gradient-based method developed by John Canny and described in the article referenced in Section [1] of the attached Appendix. Such article is hereby incorporated by reference and made a part of this detailed description.
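Canny's detector involves smoothing, non-maximum suppression and hysteresis; the sketch below keeps only the gradient-magnitude thresholding step to illustrate collecting edge pixels near the clicked coordinate. The window half-width and threshold fraction are illustrative assumptions, not values from this description:

```python
import numpy as np

def edge_pixels_near(img, u, v, win=10, frac=0.5):
    """Collect strong-gradient pixels in a window around the click (u, v).

    A simplified stand-in for the Canny detector cited in Appendix
    section [1]: only gradient-magnitude thresholding is kept. 'u' is the
    column and 'v' the row of the clicked pixel; 'win' is the half-width
    of the search window in pixels (both assumed conventions)."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)             # per-axis image gradients
    mag = np.hypot(gx, gy)                # gradient magnitude
    h, w = img.shape
    r0, r1 = max(v - win, 0), min(v + win + 1, h)
    c0, c1 = max(u - win, 0), min(u + win + 1, w)
    window = mag[r0:r1, c0:c1]
    rows, cols = np.nonzero(window >= frac * window.max())
    return [(c0 + c, r0 + r) for r, c in zip(rows, cols)]
```

The "at least 3 edge pixels" test of block 252 would then simply check the length of the returned list.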
Computer 40 then determines whether at least 3 edge pixels were found during the edge detection step as illustrated at block 252. If not, computer 40 generates an error message of "Try Again Closer to the Fiducial" as illustrated at block 254.
Computer 40 then returns to block 246 to wait for the trackball button to be clicked again. If at least three edge pixels were found at block 252, computer 40 maps the edge pixels to their calibrated image coordinates using equation [13] from the attached Appendix as illustrated at block 256.
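Equation [13] itself appears only in the Appendix; a common model for mapping raw image-intensifier pixels to calibrated coordinates is a low-order polynomial dewarp, sketched below. The quadratic form and coefficient layout are assumptions:

```python
def calibrate_pixel(u, v, a, b):
    """Map a raw pixel (u, v) to calibrated image coordinates (x, y).

    An assumed stand-in for equation [13] of the Appendix: a quadratic
    dewarp with one six-coefficient row per output coordinate. The
    coefficients would be fitted once from images of a known grid."""
    x = a[0] + a[1] * u + a[2] * v + a[3] * u * v + a[4] * u * u + a[5] * v * v
    y = b[0] + b[1] * u + b[2] * v + b[3] * u * v + b[4] * u * u + b[5] * v * v
    return x, y
```

With identity coefficients the mapping leaves pixels unchanged, which is a convenient sanity check when fitting.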
Computer 40 then finds the center of the fiducial shadow generated by the fiducials 26 using the calibrated edge pixels as set forth in equation [14] of the Appendix. This step is illustrated at block 258. Computer 40 then advances to block 262 of Fig. 6. From block 262, computer 40 draws a circle around the center of the fiducial shadow.
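Equation [14] is given only in the Appendix. One standard way to recover a fiducial center (and the radius of the circle drawn at block 262) from edge pixels is a least-squares circle fit; the Kasa form below is offered as an assumed stand-in:

```python
import numpy as np

def fit_circle(edge_points):
    """Least-squares (Kasa) circle fit to edge pixels: returns (cx, cy, r).

    A plausible stand-in for the center-finding of Appendix equation [14].
    (x-cx)^2 + (y-cy)^2 = r^2 rearranges to a system linear in
    (cx, cy, c), where c = r^2 - cx^2 - cy^2."""
    pts = np.asarray(edge_points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    b = x * x + y * y
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx * cx + cy * cy)
    return cx, cy, r
```

Three edge pixels are the minimum for this fit, which matches the "at least 3 edge pixels" check at block 252.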
Computer 40 then determines whether all eight of the fiducials 26 have been located in the A/P image as illustrated at block 264. If not, computer 40 returns to block 246 of Fig. 6 and then waits for the trackball button to be clicked again over a different fiducial shadow.
If all eight fiducials have been located at block 264, computer 40 then saves the established image coordinates of all the fiducials in the computer memory as illustrated at block 268. Computer 40 then enables and brightens the Register A/P Image button 98 and the F5 key as illustrated at block 270. Computer 40 then transmits the message "Register A/P Image" as illustrated at block 272.
Next, computer 40 automatically advances to location ENTRY1 of Fig. 8 as illustrated at block 274. Computer 40 does not wait for an operator to press a button to move to location ENTRY1 of Fig. 8.
If the Select Sagittal Fiducials button 74 or the F4 key is pressed, computer 40 advances to block 276 of Fig. 7. Computer 40 first determines whether the sagittal image is displayed on monitor 48 as illustrated at block 278. If not, computer 40 generates an error message, "Acquire Sagittal Image", as illustrated at block 280.
Computer 40 then returns to wait for the next command as illustrated at block 282.
If the sagittal image is displayed at block 278, computer 40 displays a square cursor on the display screen of monitor 48 as illustrated at block 290. Computer 40 then resets the number of located fiducials to zero as illustrated at block 292. Next, computer 40 waits for the trackball button to be clicked by the operator as illustrated at block 294. Once the trackball button is clicked, computer 40 generates a beep as illustrated at block 296. Computer 40 then performs edge detection around the selected trackball cursor coordinate as illustrated at block 298. Such edge detection is performed using a gradient-based method developed by John Canny and described in the article referenced in Section [1] of the attached Appendix.
Computer 40 then determines whether at least 3 edge pixels were found during the edge detection step as illustrated at block 300. If not, computer 40 generates an error message of "Try Again Closer to the Fiducial" as illustrated at block 302.
Computer 40 then returns to block 294 to wait for the trackball button to be clicked again. If at least three edge pixels were found at block 300, computer 40 maps the edge pixels to their calibrated image coordinates using equation [13] from the attached Appendix as illustrated at block 304.
Computer 40 then finds the center of the fiducial shadow generated by the fiducials 26 using the calibrated edge pixels as set forth in equation [14] of the Appendix. This step is illustrated at block 306. Computer 40 then advances to block 310. From block 310, computer 40 draws a circle around the center of the fiducial shadow.
Computer 40 then determines whether all eight of the fiducials 26 have been located in the sagittal image as illustrated at block 312. If not, computer 40 returns to block 294 and then waits for the trackball button to be clicked again.
If all eight fiducials have been located at block 312, computer 40 then saves the established image coordinates of all the fiducials in the computer memory as illustrated at block 316. Computer 40 then enables and brightens the Register Sagittal Image button 76 and the F6 key as illustrated at block 318. Computer 40 then transmits a message of "Register Sagittal Image" as illustrated at block 320.
Next, computer 40 automatically advances to location ENTRY2 of Fig. 9 as illustrated at block 322. Computer 40 does not wait for an operator to press a button to move to location ENTRY2 of Fig. 9.
If the Register A/P Image button 98 or the F5 key was pressed, computer 40 advances to block 324 of Fig. 8. Computer 40 first determines whether all of the A/P fiducials have been found as illustrated at block 326. If not, computer 40 generates an error message of "Haven't Selected All the Fiducials" as illustrated at block 328.
Computer 40 then returns to wait for the next command as illustrated at block 330.
If all the A/P fiducials have been found at block 326, computer 40 advances to block 332. As discussed above, computer 40 also automatically advances to block 332 from block 274 of Fig. 6 after all the fiducials have been selected.
In block 332, computer 40 first recalls all the two-dimensional coordinates of the A/P fiducial centers. Next, the computer 40 reads in data from a file of the three-dimensional coordinates of the centers of the fiducials 26 as illustrated at block 334. The three-dimensional coordinates of the fiducials 26 are obtained using a Coordinate Measurement Machine (CMM). Therefore, this data provides information related to the actual location of the fiducials 26. Typically, these CMMed coordinates are obtained from the manufacturer of the registration artifact 24.
Next, computer 40 optimizes the parameters of a geometric model which projects three-dimensional coordinates into corresponding image points. The optimized model is encapsulated in a registration matrix as set forth in section [4]. Optimization is performed by minimizing (in a least-squares sense) the deviation between the model's projections of the three-dimensional coordinates read at block 334 and the two-dimensional coordinates read at block 332. The Levenberg-Marquardt method is used for optimization, as described in the attached Appendix and as illustrated at block 336. Computer 40 then constructs a registration matrix as set forth in section [4] of the attached Appendix. This step is illustrated at block 338.
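A minimal Levenberg-Marquardt iteration of the kind described above can be sketched as follows. The solver is generic; in the registration step its residual would stack the deviations between the model's projections of the CMM coordinates and the picked image centers. This is an illustrative implementation with a numeric Jacobian, not the Appendix's formulation:

```python
import numpy as np

def levenberg_marquardt(residual, p0, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt least-squares solver.

    'residual(p)' returns a vector of deviations for parameter vector p;
    for registration it would compare projected 3-D fiducial coordinates
    (block 334) against the 2-D centers picked in the image (block 332)."""
    p = np.asarray(p0, dtype=float)
    r = residual(p)
    cost = float(r @ r)
    for _ in range(n_iter):
        eps = 1e-6
        J = np.empty((r.size, p.size))
        for j in range(p.size):          # forward-difference Jacobian
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (residual(p + dp) - r) / eps
        JtJ = J.T @ J
        g = J.T @ r
        # Damped normal equations: (JtJ + lam*diag(JtJ)) step = -g.
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), -g)
        r_new = residual(p + step)
        cost_new = float(r_new @ r_new)
        if cost_new < cost:
            p, r, cost = p + step, r_new, cost_new
            lam *= 0.5                   # accept: relax toward Gauss-Newton
        else:
            lam *= 10.0                  # reject: steepen toward gradient descent
    return p
```

The damping parameter interpolates between Gauss-Newton steps (small lambda) and gradient-descent-like steps (large lambda), which is what makes the method robust to poor initial guesses.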
Computer 40 next determines whether the sagittal image has been registered as illustrated at block 340. If not, computer 40 generates a message of "Perform Sagittal Registration" as illustrated at block 342. Computer 40 then returns to wait for the next command as illustrated at block 344.
If the sagittal image has been registered at block 340, computer 40 generates a display message of "Pick Entry Point" as illustrated at block 346. Computer 40 then returns to wait for the next command as illustrated at block 348.
If the Register Sagittal Image button 76 or the F6 key has been pressed, computer 40 advances to block 350 of Fig. 9. Computer 40 first determines whether all of the sagittal fiducials have been found as illustrated at block 352. If not, computer 40 generates an error message of "Haven't Selected All the Fiducials" as illustrated at block 354. Computer 40 then returns to wait for the next command as illustrated at block 356.
If all the sagittal fiducials have been found at block 352, computer 40 advances to block 358. As discussed above, computer 40 also automatically advances to block 358 from block 322 of Fig. 7 after all the fiducials have been selected.
In block 358, computer 40 first recalls all the two-dimensional coordinates of the sagittal fiducial centers. Next, the computer 40 reads in data from a file of the three-dimensional coordinates of the centers of the fiducials 26 as illustrated at block 360.
The coordinates of the fiducials 26 are obtained using a Coordinate Measurement Machine (CMM). Therefore, this data provides information related to the actual location of the fiducials 26. Typically, these coordinates are obtained from the manufacturer of the registration artifact 24.
Next, computer 40 optimizes the fit between the three-dimensional coordinates read at block 360 and the two-dimensional coordinates read at block 358 using the Levenberg-Marquardt method described in the attached Appendix, as illustrated at block 362. Computer 40 then constructs a registration matrix as set forth in section [4] of the attached Appendix. This step is illustrated at block 364. Computer 40 next determines whether the A/P image has been registered as illustrated at block 366. If not, computer 40 generates a message of "Perform A/P Registration" as illustrated at block 368. Computer 40 then returns to wait for the next command as illustrated at block 370.
If the A/P image has been registered at block 366, computer 40 generates a message of "Pick Entry Point" as illustrated at block 372. Computer 40 then returns to wait for the next command as illustrated at block 374.
If the Transverse Angle button 100 or the F7 key is pressed, computer 40 advances to block 376 of Fig. 10. The transverse angle is the angle determined by using the right hand rule about the X axis 34 of Fig. 1. To adjust the transverse angle, the operator places the cursor in the entry field 101 of Fig. 3a as illustrated at block 378 of Fig. 10. The operator then enters a numeric value for the transverse angle as illustrated at block 380. Computer 40 then reads the new transverse angle and updates the orientation of the virtual guidewire using the equations set forth in section [10] of the attached Appendix. This step is illustrated at block 382. Next, computer 40 redraws the virtual guidewire projection 92 in the A/P image area 86 and the projection 68 in the sagittal image area 62 based on the new transverse angle, using the equations set forth in the attached Appendix, as illustrated at block 384. Computer 40 then returns to wait for the next command as illustrated at block 386.
If the Screw Length button 80 or the F8 key was pressed, computer 40 advances to block 388 of Fig. 11. The cursor is then placed on the entry field 81 of Fig. 3b as illustrated at block 390. The operator then enters the numeric value for the new screw length as illustrated at block 392. Computer 40 reads the new screw length and updates the length of the virtual guidewire using the equations set forth in section [11] of the Appendix. This step is illustrated at block 394. Next, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations set forth in the Appendix. These steps are illustrated at block 396. Next, computer 40 returns to wait for the next command as illustrated at block 398.
If the Sagittal Angle button 78 or the F10 key is pressed, computer 40 advances to block 400 of Fig. 12 to adjust the sagittal angle. The sagittal angle is the angle about the Y-axis 36 of Fig. 1 using the right hand rule.
The cursor is placed in an entry field 79 of Fig. 3b as illustrated at block 402. Next, the operator enters a numeric value for the sagittal angle as illustrated at block 404. Computer 40 then reads the value of the new sagittal angle and updates the orientation of the virtual guidewire using the equations set forth in section [10] of the Appendix. These steps are illustrated at block 406. Next, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations set forth in the Appendix. These steps are illustrated at block 408. The computer 40 then returns to wait for the next instruction as illustrated at block 410. If the Approach Angle button 102 or the F9 key was pressed, computer 40 advances to block 412 of Fig. 13. The approach angle is the angle taken about the Z-axis 38 of Fig. 1 using the right hand rule.
The cursor is placed in the entry field 103 of Fig. 3a as illustrated at block 414. The operator then enters a numeric value for the new approach angle as illustrated at block 416. The computer 40 then reads the new approach angle as illustrated at block 418. Computer 40 then returns to wait for the next command as illustrated at block 420.
In order to plan a linear trajectory in space, only two angles are needed; for this particular procedure, the transverse angle and the sagittal angle are used. The approach angle permits the surgeon to control movement of the robot. In other words, the approach angle is not used in the planning of the trajectory.
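Reducing the two planning angles to a trajectory direction can be sketched as two right-hand-rule rotations. The rotation order and the reference direction of the unrotated guidewire are assumptions here; the exact composition is given in section [10] of the Appendix:

```python
import numpy as np

def rot_x(a):
    """Right-hand-rule rotation about the X axis (axis 34 of Fig. 1)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    """Right-hand-rule rotation about the Y axis (axis 36 of Fig. 1)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def guidewire_direction(transverse, sagittal):
    """Trajectory direction from the two planning angles (radians).

    Assumes the wire starts parallel to the Z axis (axis 38 of Fig. 1)
    and is rotated about X by the transverse angle, then about Y by the
    sagittal angle; the true composition is in the Appendix."""
    z = np.array([0.0, 0.0, 1.0])
    return rot_y(sagittal) @ rot_x(transverse) @ z
```

Two angles fully determine the line's direction; the entry point picked on the images then fixes its position, which is why the approach angle is free for controlling the robot.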
If the Move Robot button 104 or the F11 key is pressed, computer 40 advances to block 422 of Fig. 14. Computer 40 first recalls the approach angle from memory as illustrated at block 424. Next, computer 40 recalls the sagittal angle, the transverse angle and the three-dimensional coordinates of the top point of the virtual guidewire as illustrated at block 426. Next, computer 40 calculates the planned position and orientation using the equations in section [12] of the Appendix. This step is set forth at block 428. Next, computer 40 reads in data from a file related to the specific surgical end-effector being used for the surgical procedure as illustrated at block 430. This data includes the three-dimensional coordinates from the Coordinate Measurement Machine (CMM).
Computer 40 determines whether the surgical end-effector is properly assigned at block 434. If not, computer 40 generates an error message of "Surgical End-effector Not Assigned" as illustrated at block 436. Computer 40 then returns to wait for the next command as illustrated at block 438.
If the surgical end-effector is properly assigned at block 434, computer 40 sends a command through serial communication port 50 to the robot controller 53 to move the robot to the planned position and orientation as illustrated at block 440.
Computer 40 assigns the "NULL" end-effector as illustrated at block 442. Computer 40 determines whether the NULL end-effector was properly assigned at block 444. If not, computer 40 generates an error message of "NULL End-effector Not Assigned" as illustrated at block 446. Computer 40 then returns to wait for the next command at block 448. If the NULL end-effector is properly assigned at block 444, computer 40 returns to wait for the next command as illustrated at block 450. If the Move Robot Along Axis button 82 of Fig. 3b is selected, computer 40 advances to block 452 of Fig. 15. The computer 40 has already moved the robot to the proper orientation during the steps of Fig. 14. Therefore, the steps of Fig. 15 are designed to move the robot along the tool guide axis defined by the tool guide 28 of Fig. 2. The robot typically moves toward and away from the body on the table 14 along the tool guide axis. Computer 40 determines whether a thread entitled "Move Robot Axis" has been dispatched at block 454. This thread program runs by itself until it is stopped. If the program is not started at block 454, computer 40 starts this program as illustrated at block 456. Computer 40 then returns to wait for additional instructions at block 458. If the thread program has started at block 454, then computer 40 determines whether the Page Up button has been pressed at block 460. If not, computer 40 determines whether the Page Down button has been pressed at block 462. If not, computer 40 returns to block 464 to wait for the next command.
If the Page Up button was pressed at block 460, computer 40 determines whether the Page Up button is still being pressed at block 466. If not, computer 40 returns to wait for the next command as illustrated at block 468. If the Page Up button is still being pressed at block 466, computer 40 sends a VAL command from communication port 50 to robot controller 53 to move the robot in the positive tool guide axis direction as illustrated at block 470. The positive tool guide axis direction is up, away from the patient. Computer 40 then returns to block 466.
If the Page Down button has been pressed at block 462, computer 40 determines whether the Page Down button is still being pressed at block 472. If not, computer 40 returns at block 468 to wait for the next command. If the Page Down button is still being pressed at block 472, computer 40 sends a VAL command to move the robot in the negative tool guide axis direction as illustrated at block 474. The negative tool guide axis direction is down, toward the patient. Computer 40 then returns to block 472.
In other words, the control steps of Fig. 15 permit the operator to move the robot along its tool guide axis. Once the robot is moving in either the positive or negative direction, it keeps moving until the Page Up or Page Down button is released. The entire robot moves in order to maintain the end-effector 24 and the tool guide 28 in the same orientation along the planned axis. In other words, the end-effector 24 of robot 18 may be maintained in an orientation that is 45° relative to Z-axis 38 of Fig. 1. VAL is the program control language for the PUMA-560 controller 53. It is understood that other robots, controllers, and program languages may be used in accordance with the present invention.
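The hold-to-move behaviour of blocks 460 through 474 can be sketched as a polling loop. The key-polling and command-sending callables, and the command strings, are placeholders; actual VAL syntax for the PUMA-560 controller is not reproduced here:

```python
def jog_along_tool_axis(key_down, send_val):
    """Hold-to-move jog of Fig. 15 (blocks 460-474).

    'key_down(name)' polls whether a key is still held; 'send_val(cmd)'
    forwards a motion command toward the robot controller 53. Both, and
    the command strings, are illustrative stand-ins."""
    if key_down("PageUp"):
        while key_down("PageUp"):          # blocks 466-470: repeat while held
            send_val("MOVE +TOOL_AXIS")    # up, away from the patient
    elif key_down("PageDown"):
        while key_down("PageDown"):        # blocks 472-474: repeat while held
            send_val("MOVE -TOOL_AXIS")    # down, toward the patient
```

Releasing the key ends the loop, so motion stops exactly when the operator lets go, matching the description above.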
If a cursor is over the A/P image area 86 of Fig. 3a, computer 40 advances to block 476 of Fig. 16. Computer 40 waits for the trackball to be clicked in the A/P image area 86 as illustrated at block 478. Once the trackball has been clicked at block 478, computer 40 determines whether both the A/P image and the sagittal image have been registered as illustrated at block 480. If not, computer 40 does nothing and returns to block 482 to wait for the next command.
If the A/P and the sagittal images have been registered at block 480, computer 40 determines whether the projected guidewire is drawn as illustrated at block 484. If not, computer 40 assumes that the operator intends to draw the projected guidewire. Therefore, the computer 40 draws a cross hair at trackball coordinate (u, v) as illustrated at block 486. Next, computer 40 draws a curve representing the line of sight on the sagittal image using the equations of the attached Appendix as illustrated at block 488. A curve is drawn to represent the line of sight because of the distortion in the images. If an x-ray is taken of a straight line, its image will be a curve due to the distortions inherent in the fluoroscope's image intensifier. This is why a curve must be drawn to represent the line of sight. Once the line of sight indicator 70 is drawn on the sagittal image area 62 of Fig. 3b, computer 40 returns to wait for the next command as illustrated at block 490.
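Drawing the line-of-sight curve amounts to sampling the 3-D line and projecting each sample through the distortion-aware camera model, yielding a polyline. In this sketch 'project' stands in for the Appendix's model:

```python
def line_of_sight_curve(project, p0, p1, n=50):
    """Polyline approximation of the projected line of sight (block 488).

    'project' is any callable mapping a 3-D point to image (u, v); in the
    system it would be the registered, distortion-aware model from the
    Appendix. Because that model is nonlinear, the samples of a straight
    3-D line from p0 to p1 trace a curve on the image."""
    curve = []
    for i in range(n + 1):
        t = i / n
        p = tuple(a + t * (b - a) for a, b in zip(p0, p1))  # interpolate
        curve.append(project(p))
    return curve
```

With a distortion-free (linear) projection the samples would fall on a straight segment; the curvature seen on screen is entirely the image intensifier's doing.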
If the projected guidewire is already drawn at block 484, computer 40 determines whether the trackball coordinates are within five pixels of the top point 88 in the A/P image area 86. This step is illustrated at block 492. If the cursor coordinates are within five pixels of the top point 88, computer 40 erases the projected guidewire as illustrated at block 494 and returns to wait for the next command as illustrated at block 496.
If the trackball cursor coordinates are not within five pixels of the top point 88 at block 492, computer 40 determines whether the trackball coordinates are within five pixels of the bottom point 90 as illustrated at block 498. If not, computer 40 returns to wait for the next command as illustrated at block 490. If the trackball cursor coordinates are within five pixels of the bottom point 90 at block 498, computer 40 determines whether the trackball has been clicked again as illustrated at block 500. If so, computer 40 returns to block 490 to wait for the next command. If not, computer 40 updates the transverse or sagittal angle as illustrated at block 502 based on movement of the trackball. The transverse angle value is incremented if the trackball is being moved up. The transverse angle value is decreased if the trackball is moving down. The sagittal angle value is incremented if the trackball is being moved right. The sagittal angle value is decreased if the trackball is moving left. The incrementing factor is 0.1 per pixel. The equations for this step are set forth in the Appendix. After the transverse and/or sagittal angle have been updated at block 502, computer 40 redraws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations in the attached Appendix. These steps are illustrated at block 504. Computer 40 then returns to block 500.
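The trackball-to-angle mapping described above can be sketched directly; the sign conventions for the trackball deltas are assumptions:

```python
def update_angles(transverse, sagittal, du, dv, step=0.1):
    """Map trackball motion to the angle updates of block 502.

    'dv' is vertical motion in pixels (positive assumed to mean up) and
    'du' horizontal motion (positive assumed to mean right); each pixel
    changes the corresponding angle by the 0.1 factor stated above."""
    return transverse + step * dv, sagittal + step * du
```

Moving the trackball up or right therefore increments an angle, and moving it down or left decrements it, matching the description.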
If the cursor is over the sagittal image area 62 of Fig. 3b, computer 40 advances to block 506 of Fig. 17. Computer 40 determines whether the line of sight has been drawn at block 508. If not, computer 40 returns to wait for the next command at block 510. If the line of sight has been drawn at block 508, computer 40 draws the projected guidewire 92 in the A/P image area 86 and the projected guidewire 68 in the sagittal image area 62 using the equations set forth in the Appendix. These steps are illustrated at block 512. Computer 40 also checks whether the robot has been initialized at block 513; if it has, computer 40 enables and brightens buttons "Move Robot" 104 and "Move Along Drill Axis" 82, and keys F11 and F12, at block 513.5. Next, computer 40 waits for the trackball in the sagittal image area 62 to be clicked as illustrated at block 514. If the robot has not been initialized, computer 40 likewise waits for the trackball in the sagittal image area 62 to be clicked as illustrated at block 514. Next, computer 40 determines whether the trackball cursor coordinates are within five pixels of the top point 64 as illustrated at block 516. If not, computer 40 determines whether the trackball coordinates are within five pixels of the bottom point 66 as illustrated at block 518. If not, computer 40 returns at block 520 to wait for the next command.
If the trackball coordinates are within five pixels of the top point 64 at block 516, computer 40 determines whether the trackball has been clicked again at block 522. If so, computer 40 returns at block 524 to wait for the next command. If not, computer 40 updates the position of the virtual guidewire 68 by moving it along the line of sight in the same direction as the trackball movements. The incrementing ratio is 0.1 mm/pixel. This step is illustrated at block 526. The computer uses the equations set forth in the Appendix to update the virtual guidewire position. Computer 40 then redraws the projected guidewire 68 in the sagittal image area 62 and also redraws the projected guidewire 92 in the A/P image area 86 as illustrated at block 528 using the equations set forth in the Appendix. Computer 40 then returns to block 522.
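Sliding the guidewire along the line of sight, as described at block 526 and in the Appendix, amounts to advancing a depth parameter by 0.1 mm per pixel of trackball travel and interpolating between the two world-coordinate endpoints that bound the line of sight. The sketch below is a hedged illustration with hypothetical names, not the patent's implementation.

```python
def slide_along_line_of_sight(ls1, ls2, depth, pixels_moved, step_mm=0.1):
    """Return (new_depth, point) after moving along the line of sight.

    ls1 and ls2 are 3-tuples bounding the line of sight in world
    coordinates; depth parameterizes position along the segment.
    """
    depth += step_mm * pixels_moved
    # Linear interpolation between the two endpoints of the line of sight.
    point = tuple(p1 + depth * (p2 - p1) for p1, p2 in zip(ls1, ls2))
    return depth, point
```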
If the trackball coordinates are within five pixels of the bottom point 66 at block 518, computer 40 determines whether the trackball has been clicked again at block 530. If so, computer 40 returns at block 524 to wait for the next command. If not, computer 40 assumes that the operator wants to adjust the position of bottom point 66. Therefore, computer 40 updates the sagittal and/or transverse angle as illustrated at block 532 based on movement of the trackball. The transverse angle value is incremented if the trackball is being moved up. The transverse angle value is decreased if the trackball is moving down. The sagittal angle value is incremented if the trackball is being moved to the right. The sagittal angle value is decreased if the trackball is moving to the left. The incrementing ratio is 0.1/pixel. Computer 40 uses the equations of the Appendix for these steps as illustrated at block 532. Next, computer 40 redraws the projected guidewire 68 in the sagittal image area 62 and the projected guidewire 92 in the A/P image area 86 as illustrated at block 534 using the equations set forth in the Appendix. Computer 40 then returns to block 530.
If the Robot Control area 84 of Figs. 3a-b is selected, computer 40 advances to block 536 of Fig. 18. Computer 40 then displays a menu giving the user options at block 538. The first option is an "Initialize Robot" option. Computer 40 determines whether the Initialize Robot menu item was selected at block 540. If so, computer 40 opens the serial communication port 52 for communication with the robot controller 53 as illustrated at block 542. Computer 40 sends the VAL program language commands required to initialize the robot controller 53 as illustrated at block 544.
Computer 40 determines whether VAL was initialized properly at block 546. If VAL was not initialized properly, computer 40 issues a "VAL not initialized" message as illustrated at block 548. Computer 40 then returns at block 550.
If VAL was properly initialized at block 546, computer 40 transmits preestablished HOME and START positions to the robot controller 53 as illustrated at block 552. The HOME and START positions are two positions in the work space of the robot. In addition, computer 40 initializes the preestablished NULL end-effector and SURGICAL end-effector as illustrated at block 554. In other words, computer 40 sends specifications of the precise configurations of the specific surgical instrument that is going to be used. Therefore, the controller 53 is programmed to move the robot to these positions. During operation, computer 40 can instruct the controller 53 to move to the particular HOME or START positions. In addition, controller 53 will recognize instructions for the particular surgical end-effector which was initialized during step 554.
Next, the robot speed is set to a very slow speed as illustrated at block 556. For example, the robot speed is set to a speed of 5 out of 256. Next, computer 40 checks whether the virtual guidewire has been planned; if it has, computer 40 enables and brightens buttons "Move Robot" 104 and "Move Robot Along Tool Axis" 82 and keys F11, F12, as illustrated at block 557.5. Computer 40 then returns to wait for the next instruction as illustrated at block 559.
If the virtual guidewire has not been planned, computer 40 then returns to wait for the next instruction as illustrated at block 558.
If an option entitled "Move to a Predefined Location" was selected from the pop-up menu 538 and if the robot was already initialized as illustrated at block 560, then computer 40 displays a dialog box with options to move the robot to the predefined locations as illustrated at block 562. In other words, a dialog box with the options to move the robot to the HOME position or the START position is displayed. The operator can select one of these options at block 562. Computer 40 then sends a VAL command to controller 53 to move the robot 18 to the specified location as illustrated at block 564. Computer 40 then returns at block 568 to wait for the next command. If computer 40 determines that the option "Assigned Predefined Tool" was selected from the menu 538 and if the robot has already been initialized as illustrated at block 570, then computer 40 displays a dialog box with options to assign the predefined tools established during the initialization step at block 554. This step is illustrated at block 574. In other words, computer 40 displays a dialog box for assigning either the NULL end-effector or the SURGICAL end-effector at block 574. Once the desired tool is selected, computer 40 transmits to VAL the command to assign the specified end-effector to controller 53 as illustrated at block 576. Computer 40 then returns to wait for the next command at block 578. If the assigned predefined end-effector item was not selected or the robot was not initialized at block 570, computer 40 returns at block 572 to wait for the next command.
Although the invention has been described in detail with reference to a certain preferred embodiment, variations and modifications exist within the scope and spirit of the present invention as described and defined in the following claims.
APPENDIX
(Page 1 of 6)

Nomenclature:
WCS — World Coordinate System.
CCS — C-arm Coordinate System.
(x, y, z) — Used for 3D coordinates in the WCS and the CCS.
(x, y) — Used for calibrated image coordinates.
(u, v) — Used for real image coordinates.
α — Sagittal angle.
β — Transverse angle.
γ — Approach angle.
Subscripts: w = WCS, c = CCS — specifies the coordinate system (only used with 3D coordinate systems); t = top, b = bottom — specifies a point on the virtual guidewire; a = A/P, s = sagittal — specifies to which image the information pertains.

[1] Canny; "A Computational Approach to Edge Detection"; IEEE Transactions on Pattern Analysis and Machine Intelligence; Vol. 8, Nov. 1986, pp. 679-698.

[2] Mathematics involved in performing the Levenberg-Marquardt optimization method:

R = [ cosφcosθ    cosφsinθsinψ − sinφcosψ    cosφsinθcosψ + sinφsinψ
      sinφcosθ    sinφsinθsinψ + cosφcosψ    sinφsinθcosψ − cosφsinψ
      −sinθ       cosθsinψ                   cosθcosψ ]

û(x; a) = f (R1·x + tx) / (R3·x + tz),   v̂(x; a) = f (R2·x + ty) / (R3·x + tz)

where Ri denotes the i-th row of R.
χ²(a) = Σᵢ { [uᵢ − û(xᵢ; a)]² + [vᵢ − v̂(xᵢ; a)]² }

where (xᵢ, yᵢ, zᵢ) are the 3D coordinates of the fiducials, (uᵢ, vᵢ) are the 2D coordinates of the centers of the fiducials, and a = (φ, θ, ψ, tx, ty, tz) are the six parameters that define a six degree-of-freedom pose.

[3] Once the fit has been performed, construct the homogeneous transformation matrix that corresponds to the optimized parameters (a = φ, θ, ψ, tx, ty, tz) as follows:
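The chi-square reprojection objective above can be sketched directly. The fragment below is a hedged illustration: a simple pinhole projection with an assumed focal length f and a ZYX Euler rotation stands in for the patent's calibrated C-arm model, and all function names are hypothetical. In practice this residual would be minimized over the six pose parameters by Levenberg-Marquardt, as section [2] states.

```python
import math

def rotation_zyx(phi, theta, psi):
    """Rotation matrix matching the Appendix's R(phi, theta, psi)."""
    cf, sf = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    cp, sp = math.cos(psi), math.sin(psi)
    return [
        [cf*ct, cf*st*sp - sf*cp, cf*st*cp + sf*sp],
        [sf*ct, sf*st*sp + cf*cp, sf*st*cp - cf*sp],
        [-st,   ct*sp,            ct*cp],
    ]

def project(point, a, f=1000.0):
    """Pinhole projection of a 3D fiducial under pose a (assumed model)."""
    phi, theta, psi, tx, ty, tz = a
    R = rotation_zyx(phi, theta, psi)
    xc = [sum(R[r][c]*point[c] for c in range(3)) + t
          for r, t in zip(range(3), (tx, ty, tz))]
    return f*xc[0]/xc[2], f*xc[1]/xc[2]

def chi_square(a, fiducials_3d, centers_2d):
    """Sum of squared reprojection residuals over all fiducials."""
    total = 0.0
    for p, (u, v) in zip(fiducials_3d, centers_2d):
        uh, vh = project(p, a)
        total += (u - uh)**2 + (v - vh)**2
    return total
```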
(Page 2 of 6)
[4] REG = [ cosφcosθ    cosφsinθsinψ − sinφcosψ    cosφsinθcosψ + sinφsinψ    tx
            sinφcosθ    sinφsinθsinψ + cosφcosψ    sinφsinθcosψ − cosφsinψ    ty
            −sinθ       cosθsinψ                   cosθcosψ                   tz
            0           0                          0                          1  ]

[5] The line of sight is calculated in the following way. The line of sight is bound by (0, 0, 0) and (u1, v1, f) in the CCS. Note: (u1, v1) is the calibrated equivalent of (u'1, v'1); see [13].

[LSx_w1, LSy_w1, LSz_w1, 1]ᵀ = [REG_A]⁻¹ [0, 0, 0, 1]ᵀ
[LSx_w2, LSy_w2, LSz_w2, 1]ᵀ = [REG_A]⁻¹ [u1, v1, f, 1]ᵀ
[LSx_cs1, LSy_cs1, LSz_cs1, 1]ᵀ = [REG_s] [LSx_w1, LSy_w1, LSz_w1, 1]ᵀ
[LSx_cs2, LSy_cs2, LSz_cs2, 1]ᵀ = [REG_s] [LSx_w2, LSy_w2, LSz_w2, 1]ᵀ

u1 = f · LSx_cs1 / LSz_cs1        v1 = f · LSy_cs1 / LSz_cs1
u2 = f · LSx_cs2 / LSz_cs2        v2 = f · LSy_cs2 / LSz_cs2

[6] Due to the inherent distortion in the fluoroscopic image, the line of sight is drawn as a curve on the image. This is done by un-calibrating 50 points on the line bound by (u1, v1) and (u2, v2) as in [15] and drawing a polyline through them.
[7] Recall that the virtual guidewire is a 3D object bound by (0_wt, 0_wt, 0_wt) and (0_wb, 0_wb, −screwlength_wb).

α = α + 0.1 · (pixels moved by the trackball)

[Vx_wt, Vy_wt, Vz_wt, 1]ᵀ = [T] [0, 0, 0, 1]ᵀ
[Vx_wb, Vy_wb, Vz_wb, 1]ᵀ = [T] [0, 0, −screwlength_wb, 1]ᵀ

With (Vx_wt, Vy_wt, Vz_wt) and (Vx_wb, Vy_wb, Vz_wb), the virtual guidewire's projection is drawn on both the A/P and sagittal images using the following equations:
(Page 3 of 6)

[Xc_at, Yc_at, Zc_at, 1]ᵀ = [REG_A] [Vx_wt, Vy_wt, Vz_wt, 1]ᵀ
[Xc_ab, Yc_ab, Zc_ab, 1]ᵀ = [REG_A] [Vx_wb, Vy_wb, Vz_wb, 1]ᵀ

u_at = f · Xc_at / Zc_at        v_at = f · Yc_at / Zc_at
u_ab = f · Xc_ab / Zc_ab        v_ab = f · Yc_ab / Zc_ab

[Xc_st, Yc_st, Zc_st, 1]ᵀ = [REG_s] [Vx_wt, Vy_wt, Vz_wt, 1]ᵀ
[Xc_sb, Yc_sb, Zc_sb, 1]ᵀ = [REG_s] [Vx_wb, Vy_wb, Vz_wb, 1]ᵀ

u_st = f · Xc_st / Zc_st        v_st = f · Yc_st / Zc_st
u_sb = f · Xc_sb / Zc_sb        v_sb = f · Yc_sb / Zc_sb

Due to the distortion in fluoroscopic images, the projected guidewire is drawn as a curve. This is done by un-calibrating 20 points on the line bound by (u_at, v_at) and (u_ab, v_ab) as in [15] and drawing a polyline through them on the A/P image, and similarly for the sagittal image using (u_st, v_st) and (u_sb, v_sb).
To draw the virtual guidewire's projection, two points (0, 0, 0) and (0, 0, −screwlength) in the WCS are transformed so that the top point (0, 0, 0) lies on the line of sight. The virtual guidewire is initially set, and the projected guidewire is drawn, using the following math. The translation (tx, ty, tz) is constrained to lie on the line of sight bound by (LSx_w1, LSy_w1, LSz_w1) and (LSx_w2, LSy_w2, LSz_w2); thus:

tx = LSx_w1 + depth · (LSx_w2 − LSx_w1)
ty = LSy_w1 + depth · (LSy_w2 − LSy_w1)
tz = LSz_w1 + depth · (LSz_w2 − LSz_w1)

[Vx_wt, Vy_wt, Vz_wt, 1]ᵀ = [T] [0, 0, 0, 1]ᵀ
[Vx_wb, Vy_wb, Vz_wb, 1]ᵀ = [T] [0, 0, −screwlength_wb, 1]ᵀ

T is composed of the following transformations:

T = Trans(tx, ty, tz) · Rot(y, α) · Rot(x, β)
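The composition T = Trans(tx, ty, tz) · Rot(y, α) · Rot(x, β) applied to the guidewire endpoints can be sketched with small homogeneous matrices. This is a hedged illustration using the usual column-vector convention, which is an assumption about the patent's notation; the function names are hypothetical.

```python
import math

def mat_mul(A, B):
    """4x4 homogeneous matrix product."""
    return [[sum(A[i][k]*B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def trans(tx, ty, tz):
    return [[1,0,0,tx],[0,1,0,ty],[0,0,1,tz],[0,0,0,1]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c,0,s,0],[0,1,0,0],[-s,0,c,0],[0,0,0,1]]

def rot_x(b):
    c, s = math.cos(b), math.sin(b)
    return [[1,0,0,0],[0,c,-s,0],[0,s,c,0],[0,0,0,1]]

def apply(T, p):
    """Transform a 3D point by a homogeneous matrix."""
    v = [p[0], p[1], p[2], 1]
    return tuple(sum(T[i][k]*v[k] for k in range(4)) for i in range(3))

def guidewire_endpoints(tx, ty, tz, alpha, beta, screwlength):
    """World-space top and bottom points of the virtual guidewire."""
    T = mat_mul(mat_mul(trans(tx, ty, tz), rot_y(alpha)), rot_x(beta))
    return apply(T, (0, 0, 0)), apply(T, (0, 0, -screwlength))
```

With α = β = 0 the top point lands at (tx, ty, tz) and the bottom point screwlength below it along z, matching the untransformed bound points.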
(Page 4 of 6)

In order to draw the projected guidewire on the images, the points (Vx_wt, Vy_wt, Vz_wt) and (Vx_wb, Vy_wb, Vz_wb) are used in conjunction with the projection equations above.

Recall that the virtual guidewire is a 3D object bound by (0_wt, 0_wt, 0_wt) and (0_wb, 0_wb, −screwlength_wb).

depth = depth + 0.1 · (pixels moved by the trackball)
tx = LSx_w1 + depth · (LSx_w2 − LSx_w1)
ty = LSy_w1 + depth · (LSy_w2 − LSy_w1)
tz = LSz_w1 + depth · (LSz_w2 − LSz_w1)

[Vx_wt, Vy_wt, Vz_wt, 1]ᵀ = [T] [0, 0, 0, 1]ᵀ
[Vx_wb, Vy_wb, Vz_wb, 1]ᵀ = [T] [0, 0, −screwlength_wb, 1]ᵀ
α = α + 0.1 · (pixels moved by the trackball)

[Vx_wt, Vy_wt, Vz_wt, 1]ᵀ = [T] [0, 0, 0, 1]ᵀ
[Vx_wb, Vy_wb, Vz_wb, 1]ᵀ = [T] [0, 0, −screwlength_wb, 1]ᵀ

[11] Recall that the virtual guidewire is a 3D object bound by (0_wt, 0_wt, 0_wt) and (0_wb, 0_wb, −screwlength_wb).

[Vx_wt, Vy_wt, Vz_wt, 1]ᵀ = [T] [0, 0, 0, 1]ᵀ
[Vx_wb, Vy_wb, Vz_wb, 1]ᵀ = [T] [0, 0, −screwlength_wb, 1]ᵀ

[12] Given the final pose matrix [FP], composed from [Rot(z, −90)], the planned rotations [Rot(y, α)] and [Rot(x, β)], and the tool transform, and using the following constraints, the remaining two vectors that completely specify [FP] are determined. Note: the first vector (N) is maintained from the plan since it is the drill guide axis. These matrices are of the following form:

[ Nx Ox Ax ]
[ Ny Oy Ay ]
[ Nz Oz Az ]
(Page 5 of 6)

Constraints:
1) FPO_x² + FPO_y² + FPO_z² = 1
2) FPN · FPO = 0
3) FPA · FPO = 0

FPO is determined using FPO = FPA × FPN. Hence,

[Final Plan] = [ FPN_x  FPO_x  FPA_x
                 FPN_y  FPO_y  FPA_y
                 FPN_z  FPO_z  FPA_z ]

Since the PUMA 560 robot uses an Euler representation for specifying an orientation, the inverse solution of [FP] is determined in the following manner.

Euler representation = Rot(z, φ) Rot(y, θ) Rot(z, ψ); thus, from [FP]:

φ = arctan(FPA_y, FPA_x)
θ = arctan(FPA_x cos(φ) + FPA_y sin(φ), FPA_z)
ψ = arctan(−FPN_x sin(φ) + FPN_y cos(φ), −FPO_x sin(φ) + FPO_y cos(φ))

Adding a PUMA-specific offset to φ, θ and ψ, the final position and orientation is established:

Final pose = (φ + 90, θ − 90, ψ, tx, ty, tz)

[13] The calibrated coordinates (x, y) of the edge-pixels (u, v) are determined using a quartic polynomial equation as follows.
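The ZYZ Euler extraction used for the PUMA's orientation can be sketched as follows (this follows the convention in Paul's Robot Manipulators text, which the Appendix cites). It is a hedged illustration: given a rotation matrix with columns N, O, A, it recovers (φ, θ, ψ) such that R = Rot(z, φ)·Rot(y, θ)·Rot(z, ψ); the PUMA-specific offsets mentioned above are not applied here.

```python
import math

def euler_zyz(N, O, A):
    """Inverse ZYZ Euler solution from the N, O, A columns of a rotation."""
    phi = math.atan2(A[1], A[0])
    theta = math.atan2(A[0]*math.cos(phi) + A[1]*math.sin(phi), A[2])
    psi = math.atan2(-N[0]*math.sin(phi) + N[1]*math.cos(phi),
                     -O[0]*math.sin(phi) + O[1]*math.cos(phi))
    return phi, theta, psi
```

For example, the rotation Rot(y, 90°), whose columns are N = (0, 0, −1), O = (0, 1, 0), A = (1, 0, 0), yields (φ, θ, ψ) = (0, π/2, 0).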
x = Σ a_ij u^i v^j,   y = Σ b_ij u^i v^j   (summed over terms with i + j ≤ 4)

The sets of parameters a_ij and b_ij are previously determined using the image calibration program.
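Evaluating the quartic calibration map of [13] can be sketched generically. This is a hedged illustration: the coefficient layout (a dictionary of exponent pairs (i, j) with i + j ≤ 4) is an assumption standing in for the patent's explicitly expanded terms, and real coefficients would come from the calibration program.

```python
def quartic_map(u, v, coeffs):
    """Evaluate a bivariate quartic: coeffs maps (i, j) -> coefficient."""
    return sum(c * (u**i) * (v**j) for (i, j), c in coeffs.items())

def calibrate_point(u, v, a_coeffs, b_coeffs):
    """Map distorted pixel (u, v) to calibrated coordinates (x, y)."""
    return quartic_map(u, v, a_coeffs), quartic_map(u, v, b_coeffs)
```

With identity-like coefficient sets (a single unit coefficient on u for x and on v for y), the map passes points through unchanged, which is a convenient sanity check before loading calibrated values.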
[14] The center of the fiducial shadow is found by fitting the equation of a circle to the edge-pixels using a pseudoinverse approach. Writing the circle equation linearly for each edge-pixel gives the system

A = B P

which is solved using the pseudoinverse:

P = (BᵀB)⁻¹ Bᵀ A

Once P is established, the center of the fiducial (h, k) is determined as follows:

h = P1 / 2,   k = P2 / 2

[15] The un-calibrated (distorted) coordinates (u, v) corresponding to the calibrated coordinates (x, y) are determined using a quartic polynomial equation as follows:

u = Σ a_ij x^i y^j,   v = Σ b_ij x^i y^j   (summed over terms with i + j ≤ 4)

The sets of parameters a_ij and b_ij are previously determined using a separate calibration program.

Robot Manipulators: Mathematics, Programming, and Control; Richard P. Paul; The MIT Press, Cambridge, Massachusetts and London, England, 1981.
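The circle fit of [14] can be sketched end to end. Writing the circle as u² + v² = P1·u + P2·v + P3 makes the fit linear, so P follows from the normal equations P = (BᵀB)⁻¹BᵀA and the center is (P1/2, P2/2). This is a hedged illustration; the tiny Gaussian-elimination solver is a stand-in for a general pseudoinverse routine, and the function names are hypothetical.

```python
def solve3(M, y):
    """Solve a 3x3 linear system M x = y by Gauss-Jordan elimination."""
    M = [row[:] + [yi] for row, yi in zip(M, y)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f*b for a, b in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_circle_center(points):
    """Least-squares circle fit to edge pixels; returns the center (h, k)."""
    B = [[u, v, 1.0] for u, v in points]
    A = [u*u + v*v for u, v in points]
    # Normal equations: (B^T B) P = B^T A.
    BtB = [[sum(row[i]*row[j] for row in B) for j in range(3)]
           for i in range(3)]
    BtA = [sum(row[i]*a for row, a in zip(B, A)) for i in range(3)]
    P = solve3(BtB, BtA)
    return P[0]/2.0, P[1]/2.0
```

Four points lying exactly on a circle recover its center exactly; with noisy edge pixels the same code returns the least-squares center.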
Claims
1. A method for planning a stereotactic surgical procedure using a fluoroscope for generating images of the body, the method comprising the steps of: placing adjacent to the body a registration artifact including a plurality of fiducials at known positions relative to a known coordinate frame of the artifact; displaying on a computer monitor an image taken of the patient's body and the registration artifact; receiving an input to identify two-dimensional coordinates of the fiducials of the registration artifact displayed on the image; and registering the image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the image.
2. The method of claim 1, further comprising the steps of: displaying a second image taken of the patient's body and the registration artifact but from an angle different from that of the first image; receiving an input to identify two-dimensional coordinates of the fiducials of the registration artifact displayed on the second image; and registering the second image by creating a geometric model having parameters, said model projecting three-dimensional coordinates into image points, and numerically optimizing the parameters of the geometric model such that the projections of the known three-dimensional coordinates of the fiducials best fit the identified two-dimensional coordinates in the second image.
3. The method of claim 2, further comprising the step of receiving a user input to select a point upon the first image, said point partially designating a virtual guidewire.
4. The method of claim 3, further comprising the step of receiving an input specifying a position, a length, and angles of the virtual guidewire.
5. The method of claim 4, further comprising the step of drawing projected guidewire segments on the images, such that the projected guidewires are projections of the virtual guidewire onto the images.
6. The method of claim 5, further comprising the steps of receiving a user input to move either end of the projected guidewire on either image, by revising the virtual guidewire of which the two projected guidewires are projections, and by redrawing the two projected guidewires on their respective images in correspondence with the revised virtual guidewire.
7. The method of claim 5, further comprising the steps of receiving a user input to change the length of the virtual guidewire, and redrawing the two projected guidewires on their respective images in correspondence with the revised virtual guidewire.
8. The method of claim 5, further comprising the steps of receiving a user input to change the sagittal angle of the virtual guidewire, updating the orientation of the virtual guidewire based on the new sagittal angle, and redrawing the two projected guidewires on their respective images in correspondence with the revised virtual guidewire.
9. The method of claim 5, further comprising the steps of receiving a user input to adjust the transverse angle of the virtual guidewire, updating the orientation of the virtual guidewire based on the new transverse angle, and redrawing the two projected guidewires on their respective images in correspondence with the revised virtual guidewire.
10. The method of claim 5, further comprising the steps of receiving a user input to adjust the coronal angle of the virtual guidewire, updating the orientation of the virtual guidewire based on the new coronal angle, and redrawing the two projected guidewires on their respective images in correspondence with the revised virtual guidewire.
11. The method of claim 5, further comprising the step of producing an output to adjust the coordinates of a tool guide such that its axis is brought into alignment with the virtual guidewire.
12. The method of claim 11, further comprising the step of producing an output to adjust the coordinates of a tool guide such that the position of the guide along its axis is offset by a preselected distance from one endpoint of the virtual guidewire.
13. The method of claim 11, further comprising the step of transmitting said coordinates to an automatic mechanical device.
14. The method of claim 11, further comprising the step of displaying said coordinates with which a human operator may manually adjust a mechanical device.
15. The method of claim 11, wherein the registration artifact includes a tool guide.
16. The method of claim 2, further comprising the step of receiving an input to select a point upon the first image, said point partially designating a virtual targetpoint for a surgical instrument.
17. The method of claim 16, further comprising the step of drawing a projected targetpoint both on the first image and another on the second image, such that the projected targetpoints are projections of a virtual targetpoint onto the images.
18. The method of claim 17, further comprising the steps of receiving a user input to move the projected targetpoint on either image, by revising the virtual targetpoint of which the two projected targetpoints are projections, and by redrawing the two projected targetpoints on their respective images in correspondence with the revised virtual targetpoint.
19. The method of claim 18, further comprising the step of producing an output to adjust the coordinates of a tool guide such that its axis intersects the virtual targetpoint.
20. The method of claim 19, further comprising the step of producing an output to adjust the coordinates of a tool guide such that the position of the guide along its axis is offset by a preselected distance from the virtual targetpoint.
21. The method of claim 19, further comprising the step of transmitting said coordinates to an automatic mechanical device.
22. The method of claim 19, further comprising the step of displaying said coordinates with which a human operator may manually adjust a mechanical device.
22. The method of claim 19, further comprising the step of displaying said coordinates with which a human operatoir may manuzally adjust a mechanical device.
3023. The method of claim 19, wherein the registation artifact includes a tool guide. 38 24. The method of claim 1, further comprising the step of receiving an input to select a point upon the first image, said point partially designating a virtual guidewire representing a trajectory for the surgical instrument into the body. The method of claim 24, further comprising the step of producing an output to adjust the coordinates of a tool guide such that its axis is brought into alignment with the virtual guidewire. 26. The method of claim 25, further comprising the step of transmitting said coordinates to an automatic mechanical device. 27. The method of claim 25, further comprising the step of displaying said coordinates with which a human operator may manually adjust a mechanical device. 28, The method of claim 25, wherein the registration artifact includes a tool guide. 29. An apparatus for planning a steretactic surgical procedure using a fluoroscope for generating images of the body, the apparatus comprising: 15 means for placing adjacent to the body a registration artifact including a plurality of fiducials; means for displaying an image taken of the body and the fiducials; S." means for identifying two-dimensional coordinates of the fiducials in an image; 20 means for registering an image with respect to said fiducial artifact; means for receiving inputs to select and adjust a virtual guidewire or S targetpoint, while the projections of said guidewire or targetpoint are displayed superimposed upon the image; and means for producing an output to adjust the coordinates of a tool guide. 25 30. 
An apparatus for planning a stereotactic surgical procedure for a linear trajectory insertion of a surgical instrument into a body using a fluoroscope for generating images of the body, the apparatus comprising: a registration artifact located adjacent to the body, the registration artifact including a plurality offiducials located at known three-dimensional coordinates relative a 3o known coordinate frame; means for displaying at least one image taken of the body and the fiducials on at least one computer monitor, 39 means for identifying two-dimensional coordinates of the fiducials in each image; and means for numerically optimizing parameters of a geometric model, said model projecting three-dimensional coordinates into image points, such that the projections of the known three-dimensional coordinates of the fiducjals best fit the identified two-dimensional coordinates in the image. 31. The apparatus of claim 30, further comprising a means for receiving user input to select a position, a length, arnd the angles of a virtual guide-Aire; and means for displaying a projected guidewire segment on each registered image representing the location of the virtual guidewire. 32. The apparatus of claim 30, further comprising a tool guide, and mean& for producing an output to adjust the coordinates of the tool guide. 33. A computer-aided method for planning a surgical procedure comprising: registering to a known coordinate frame a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle; displaying the first image; and drawing in the displayed first image a representation of a surgical object to be placed in the body based the registration of the first image with the known coordinate frame. 34. The method of Claim 33 wherein drawing in the displayed first image the representation of the surgical object is in response to a user indicating at least one positioning parameter for the surgical object. *35. 
44. The method of Claim 33, wherein the representation of the surgical object is a virtual guidewire defining, at least in part, a trajectory of insertion of the surgical object into the body.
45. The method of Claim 33, wherein the representation of the surgical object is a virtual guidewire having a length corresponding to a dimension of the surgical object to be inserted into the body.
46.
The method of Claim 33 further comprising transmitting to a positioning mechanism coordinates for indicating the position of the surgical object represented in the first image.
47. The method of Claim 46 further comprising manipulating the positioning mechanism such that a guide coupled to the positioning mechanism is substantially aligned with the representation of the surgical object in the image.
48. The method of Claim 33 further comprising displaying information for indicating the position of the surgical object represented in the first image.
The computer readable storage medium of Claim 49 wherein the process further comprises: registering to the known coordinate frame a second two-dimensional, fluoroscopic image of the body's anatomy taken at a second observation angle; displaying the second image; and drawing in the displayed second image the representation of the surgical object based the registration of the second image with the known coordinate frame. 54. The computer readable storage medium of Claim 53 wherein drawing the representation of the surgical object in the second image is in response to an input 99 94" S received from a user indicating a position of the surgical object. 55. The computer readable storage medium of Claim 53 wherein drawing the representation of the surgical object in the second image is in response to an input fees received from a user indicating a position of the representation of the surgical object in the displayed first image. *o S 56. The computer readable storage medium of Claim 53 wherein drawing the *08 representation of the surgical object in the second image is in response to an input indicating a change in position of the representation of the surgical object in the first image. 57. The computer readable storage medium of Claim 53 wherein registering to the known coordinate frame the first image and the second image includes registering known coordinates of a plurality of fiducials within the reference frame with positions of the plurality of fiducials in the first and second images. 58. 
A computer-aided method for planning a surgical procedure comprising: registering a first two-dimensional, fluoroscopic image of a body's anatomy taken at a first observation angle with a second two-dimensional fluoroscopic image of the body's anatomy taken at a second observation angle; displaying the first image; drawing within the displayed first image drawing within the displayed first image a representation of a surgical object to be placed in the body based on an input indicating a position of the surgical object; displaying the second image; and drawing in the displayed second image the representation of the surgical object. 59. The method of Claim 5 8 wherein drawing the representation of the surgical object in the second image is based, at least in part, on positioning in the displayed first image of the representation of the surgical object in the first image. -46- The method of Claim 58 wherein drawing in the first image and drawing second image the representation the surgical object is at least in part in response to a user indicating at least one positioning parameter for the surgical object. 61. The method of Claim 60 wherein the at least one positioning parameter for the surgical object is defined in reference to a known coordinate frame to which the first and the second images are registered. 62. The method of Claim 60 wherein the at least one parameter includes an approach angle of the surgical object. 63. The method of Claim 60 wherein the at least one parameter includes a point in the body. 64. The method of Claim 60 wherein the user indicates the at least one parameter by positioning a reference displayed within the first or second images. The method of Claim 58 wherein a user indicates at least one parameter defining the surgical object. -47- 66. The method of Claim 58 further comprising transmitting to a positioning mechanism coordinates for indicating the position of the surgical object represented in the first image. 67. 
The method of Claim 66 further comprising manipulating the positioning mechanism such that a guide coupled to the positioning mechanism is substantially aligned with the representation of the surgical object in the image. 68. The method of Claim 58 further comprising displaying information for indicating the position within a known coordinate frame of reference for the surgical object for use in manually positioning a guide. o• 69. The method of Claim 58 wherein registering the first and second images includes registering a plurality of fiducials having known coordinates within a known i coordinate frame of reference with images of the plurality of fiducials within in the respective first and second images. A computer readable storage medium encoded with instructions, which, when read by a computer, enable a computer to undertake a process comprising: receiving a first two-dimensional, fluoroscopic image taken of a patient's body and a plurality of radio-opaque fiducials placed adjacent the body at known positions; and -48- registering the fluoroscopic image by optimizing parameters of a known geometric model such that projections of the plurality of fiducials into the first image best fit positions of the plurality of fiducials in the image. 71. The computer readable storage medium of Claim 70, wherein the process further comprises: receiving a second, two-dimensional fluoroscopic image taken of the patient's body and the plurality of fiducials from a position different from the first fluoroscopic image; and registering the second fluoroscopic image by optimizing parameters of the known geometric model such that projections of the plurality of fiducials into the second image best fit positions of the plurality of fiducials in the second image. 72. 
The computer readable storage medium of Claim 71, wherein the process further comprises: receiving input indicating on one of the first and second images a position of a representation of an imaginary object with respect to the body; and drawing on the other of the first and second images a corresponding representation of the imaginary object projected into said other of the first and second images.

73. The computer readable storage medium of Claim 72 further comprising: receiving input indicating a change to a second position of the representation of the imaginary object within said one of the first and second images; and redrawing within said other of the first and the second images the corresponding representation of the imaginary object in the second position.

74. The computer readable storage medium of Claim 72 wherein the imaginary object is a representation of a surgical object and the corresponding representation is also of the same surgical object.

75. The computer readable storage medium of Claim 71, wherein the process further comprises: receiving an input indicating a position of an imaginary object within the body; and drawing on the first and the second images a representation of the imaginary object in the indicated position.

76. The computer readable storage medium of Claim 75, wherein the process further comprises: receiving an input indicating a change in the position of the imaginary object to a second position; and redrawing in the first and the second images the representation of the imaginary object in the second position.

77. The computer readable storage medium of Claim 70, wherein registering the fluoroscopic image further comprises: displaying the fluoroscopic image; and receiving an input from a user indicating on the fluoroscopic image the position of each of the plurality of fiducials within the image.

78. 
The computer readable storage medium of Claim 70, wherein the process further comprises linearizing the fluoroscopic image before registering the image.

79. A method comprising: receiving a first two-dimensional, fluoroscopic image taken of a patient's body and a plurality of radio-opaque fiducials placed adjacent the body at known positions; and registering the fluoroscopic image by optimizing parameters of a known geometric model such that projections of the plurality of fiducials into the first image best fit positions of the plurality of fiducials in the image.

80. The method of Claim 79 further comprising: receiving a second, two-dimensional fluoroscopic image taken of the patient's body and the plurality of fiducials from a position different from the first fluoroscopic image; and registering the second fluoroscopic image by optimizing parameters of the known geometric model such that projections of the plurality of fiducials into the second image best fit positions of the plurality of fiducials in the second image.

81. The method of Claim 80 further comprising: receiving input indicating on one of the first and second images a position of a representation of an imaginary object with respect to the body; and drawing on the other of the first and second images a corresponding representation of the imaginary object projected into said other of the first and second images.

82. The method of Claim 81 further comprising: receiving input indicating a change to a second position of the representation of the imaginary object within said one of the first and second images; and redrawing within said other of the first and the second images the corresponding representation of the imaginary object in the second position.

83. The method of Claim 81 wherein the imaginary object is a representation of a surgical object and the corresponding representation is also of the same surgical object.

84. 
The method of Claim 80 further comprising: receiving an input indicating a position of an imaginary object within the body; and drawing on the first and the second images a representation of the imaginary object in the indicated position.

85. The method of Claim 84 further comprising: receiving an input indicating a change in the position of the imaginary object to a second position; and redrawing in the first and the second images the representation of the imaginary object in the second position.

86. The method of Claim 79 further comprising: displaying the fluoroscopic image; and receiving an input from a user indicating on the fluoroscopic image the position of each of the plurality of fiducials within the image.

87. The method of Claim 79 further comprising linearizing the fluoroscopic image before registering the image.

Dated this 6th Day of August 2001

Northwestern University

By their Patent Attorneys

CULLEN CO.
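The core technique the claims describe is: (a) register each fluoroscopic image to a common coordinate frame by fitting a geometric projection model so that the projected, known fiducial coordinates best match their observed positions in that image (Claims 70, 79), and (b) draw a planned "imaginary" surgical object into both registered images by projecting its frame coordinates through each fitted model (Claims 72, 75, 81, 84). The sketch below is a minimal, hypothetical illustration of that idea using a linear affine-camera model fitted by least squares; the patent does not specify the actual geometric model or optimizer in these claims, and all function names and numbers here are illustrative, not from the patent.

```python
import numpy as np

def register_view(fids_3d, fids_2d):
    """Fit a 2x4 affine projection mapping known 3-D fiducial
    coordinates (in the common frame of reference) to their observed
    2-D positions in one fluoroscopic image, by linear least squares.
    This stands in for 'optimizing parameters of a known geometric
    model' in the claims."""
    X = np.hstack([fids_3d, np.ones((len(fids_3d), 1))])  # homogeneous coords
    W, *_ = np.linalg.lstsq(X, fids_2d, rcond=None)       # solves X @ W ~= fids_2d
    return W.T                                            # 2x4 projection

def project(P, pts_3d):
    """Project 3-D frame coordinates through a registered view
    into 2-D image coordinates."""
    X = np.hstack([pts_3d, np.ones((len(pts_3d), 1))])
    return X @ P.T

# Radio-opaque fiducials at known coordinates in the common frame.
fids = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.],
                 [0., 0., 10.], [10., 10., 10.], [5., 2., 8.]])

# Two synthetic "observations" from different angles, standing in for
# fiducial positions detected in the two fluoroscopic images.
view1 = np.array([[1., 0., .3, 5.], [0., 1., .1, 2.]])
view2 = np.array([[.3, 1., 0., 1.], [0., .1, 1., 4.]])
obs1, obs2 = project(view1, fids), project(view2, fids)

# (a) Register each image to the common frame via its fiducials.
P1, P2 = register_view(fids, obs1), register_view(fids, obs2)

# (b) An "imaginary object" (e.g. a planned needle axis, given by two
# 3-D endpoints); draw it by projecting into both registered images.
needle = np.array([[2., 3., 4.], [6., 7., 1.]])
in_image1, in_image2 = project(P1, needle), project(P2, needle)

err = max(np.abs(project(P1, fids) - obs1).max(),
          np.abs(project(P2, fids) - obs2).max())
print(err < 1e-9)  # fiducial projections fit both images
```

Because both images are registered to the same frame, repositioning the planned object (Claims 73, 76, 82, 85) reduces to recomputing the two projections with the new coordinates and redrawing; a real fluoroscope would additionally require the image linearization of Claims 78 and 87 before this fit, since image-intensifier distortion is not affine.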
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US64831396A | 1996-05-15 | 1996-05-15 | |
| US08648313 | 1996-05-15 | ||
| US08649798 | 1996-05-17 | ||
| AU30664/97A AU3066497A (en) | 1996-05-15 | 1997-05-14 | Stereotactic surgical procedure apparatus and method |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU30664/97A Division AU3066497A (en) | 1996-05-15 | 1997-05-14 | Stereotactic surgical procedure apparatus and method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU5782301A true AU5782301A (en) | 2001-09-27 |
| AU773931B2 AU773931B2 (en) | 2004-06-10 |
Family
ID=32597810
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU57823/01A Ceased AU773931B2 (en) | 1996-05-15 | 2001-08-06 | Stereotactic surgical procedure apparatus and method |
Country Status (1)
| Country | Link |
|---|---|
| AU (1) | AU773931B2 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113211652A (en) * | 2020-05-08 | 2021-08-06 | 凯特利·凯姆 | Hole cutter with plug ejection and method thereof |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5099846A (en) * | 1988-12-23 | 1992-03-31 | Hardy Tyrone L | Method and apparatus for video presentation from a variety of scanner imaging sources |
| US5389101A (en) * | 1992-04-21 | 1995-02-14 | University Of Utah | Apparatus and method for photogrammetric surgical localization |
| JP2896305B2 (en) * | 1993-05-15 | 1999-05-31 | 株式会社東芝 | Semiconductor integrated circuit device |
-
2001
- 2001-08-06 AU AU57823/01A patent/AU773931B2/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| AU773931B2 (en) | 2004-06-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CA2255041C (en) | Stereotactic surgical procedure apparatus and method | |
| USRE40176E1 (en) | Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy | |
| EP3254621B1 (en) | 3d image special calibrator, surgical localizing system and method | |
| Hofstetter et al. | Fluoroscopy as an imaging means for computer-assisted surgical navigation | |
| JP7204663B2 (en) | Systems, apparatus, and methods for improving surgical accuracy using inertial measurement devices | |
| US5389101A (en) | Apparatus and method for photogrammetric surgical localization | |
| US6097994A (en) | Apparatus and method for determining the correct insertion depth for a biopsy needle | |
| USRE43952E1 (en) | Interactive system for local intervention inside a non-homogeneous structure | |
| US10433915B2 (en) | System and method for image localization of effecters during a medical procedure | |
| US8335553B2 (en) | CT-free spinal surgical imaging system | |
| US11974821B2 (en) | System and method for image localization of effecters during a medical procedure | |
| CN101268967B (en) | Method and apparatus for providing correction information | |
| US20160000518A1 (en) | Tracking apparatus for tracking an object with respect to a body | |
| JPH09511430A (en) | Three-dimensional data set registration system and registration method | |
| CN113017834B (en) | Joint replacement operation navigation device and method | |
| JPH09507131A (en) | Equipment for computer-assisted microscopic surgery and use of said equipment | |
| EP1204369A1 (en) | Method and system for displaying cross-sectional images of a body | |
| US6249713B1 (en) | Apparatus and method for automatically positioning a biopsy needle | |
| US12178523B2 (en) | Computer assisted surgical navigation system for spine procedures | |
| US6028912A (en) | Apparatus and method for point reconstruction and metric measurement on radiographic images | |
| Wesarg et al. | Accuracy of needle implantation in brachytherapy using a medical AR system: a phantom study | |
| AU5782301A (en) | Stereotactic surgical procedure apparatus and method | |
| CN115486937A (en) | 2D image surgical positioning navigation system and method | |
| JP2022521615A (en) | Intervention device tracking | |
| KR102612603B1 (en) | 2d-3d image registraion method and medical operating robot system thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| MK1 | Application lapsed section 142(2)(a) - no request for examination in relevant period | ||
| NA | Applications received for extensions of time, section 223 |
Free format text: AN APPLICATION TO EXTEND THE TIME FROM 20020325 TO 20020825 IN WHICH TO REQUEST EXAMINATION HAS BEEN LODGED |
|
| NB | Applications allowed - extensions of time section 223(2) |
Free format text: THE TIME IN WHICH TO REQUEST EXAMINATION HAS BEEN EXTENDED TO 20020825 |
|
| FGA | Letters patent sealed or granted (standard patent) |