
WO2025088616A1 - Method and apparatus for procedure navigation - Google Patents


Info

Publication number
WO2025088616A1
Authority
WO
WIPO (PCT)
Prior art keywords
zone
image
instrument
control module
surgical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/051035
Other languages
French (fr)
Inventor
Ido ZUCKER
Kfir AKONS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mazor Robotics Ltd
Original Assignee
Mazor Robotics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mazor Robotics Ltd filed Critical Mazor Robotics Ltd
Publication of WO2025088616A1 publication Critical patent/WO2025088616A1/en

Classifications

    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
                    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
                        • A61B2034/107 Visualisation of planned trajectories or target regions
                    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
                        • A61B2034/2046 Tracking techniques
                            • A61B2034/2055 Optical tracking systems
                    • A61B34/30 Surgical robots
                • A61B17/00 Surgical instruments, devices or methods
                    • A61B17/16 Instruments for performing osteoclasis; Drills or chisels for bones; Trepans
                        • A61B17/17 Guides or aligning means for drills, mills, pins or wires
                            • A61B17/1739 Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
                                • A61B17/1757 Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the spine
                    • A61B2017/00017 Electrical control of surgical instruments
                        • A61B2017/00115 Electrical control of surgical instruments with audible or visual output
                            • A61B2017/00119 Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
                • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
                    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
                        • A61B90/37 Surgical systems with images on a monitor during operation
                            • A61B2090/374 NMR or MRI
                            • A61B2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy

Definitions

  • the present disclosure relates to positioning an instrument during planning and execution of a procedure.
  • the present disclosure includes a planning and/or navigation system, such as a surgical planning and/or navigation system.
  • the system may incorporate or use a magnetic resonance imaging (MRI) image (including MRI image data) of a planned or actual surgical site.
  • An imaging device control module is configured to segment soft tissue in the MRI image and generate a segmented image of the MRI image, and further configured to identify a no-go zone for an instrument in the segmented image.
  • a tracking system control module is configured to plot an execution path or trajectory for the instrument to perform a procedure at the surgical site and determine whether the execution path includes (e.g., passes through) the no-go zone. The tracking system control module is further configured to modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone.
  • a memory module is configured to save the execution path for execution during the procedure.
  • the present disclosure further includes a planning and/or navigation system, such as a surgical planning and/or navigation system, with an MRI image of a surgical site and a projection or x-ray image or image data (such as a computed tomography (CT) image) of the surgical site.
  • the MRI and the CT image may be segmented.
  • the MRI may be segmented to segment and identify soft tissue and the CT image may be segmented to segment and identify hard (e.g., bony) tissue.
  • An imaging device control module is configured to merge the segmented MRI image and the CT image into a segmented image that includes segmented bony tissue and segmented soft tissue.
  • the imaging device control module may alternatively merge the images and then segment the selected portions, as noted above.
  • a tracking system control module is configured to track an instrument relative to a no-go zone identified in the segmented image including at least one of soft tissue and bony tissue, and configured to generate alerts of increasing intensity as the instrument approaches the no-go zone.
  • the tracking system control module is further configured to deactivate the instrument when the instrument enters the no-go zone.
  • the method includes the following: acquiring an MRI image of a surgical site, such as from an MRI imaging device; acquiring a CT image of the surgical site, such as from a CT imaging device; merging the MRI image and the CT image into a segmented image with an imaging device control module, including segmenting bony tissue and soft tissue in the segmented image with the imaging device control module; identifying a no-go zone for a surgical instrument in the segmented image; plotting an execution path for an instrument to perform a procedure at a surgical site and determining whether the execution path includes the no-go zone with a tracking system control module; modifying the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and saving the execution path to a memory module for execution during the procedure.
  • the MRI and/or CT image data may be segmented prior to merging.
  • the merged image (e.g., merged CT and MRI image) may also be segmented.
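  • As an illustrative aside that is not part of the original disclosure, the planning workflow summarized above can be sketched in a few lines of Python. All function names are hypothetical placeholders; this is a minimal outline of the order of operations, not the disclosed implementation.

```python
def plan_procedure(mri_volume, ct_volume, merge, segment_soft, segment_bone,
                   identify_no_go, plot_path, path_hits_zone):
    """Hypothetical planning pipeline mirroring the summarized method.

    Every callable argument is a placeholder for the merging, segmentation,
    zone-identification, and path-planning steps described in the text.
    """
    # Merge the MRI (soft tissue) and CT (bony tissue) data into one volume.
    merged = merge(mri_volume, ct_volume)

    # Segment soft tissue (e.g., cord, nerves) and bone in the merged image.
    soft_mask = segment_soft(merged)
    bone_mask = segment_bone(merged)

    # Identify the voxels the instrument must not enter (the no-go zone).
    no_go_mask = identify_no_go(soft_mask, bone_mask)

    # Plot an execution path and revise it until it excludes the no-go zone.
    path = plot_path(merged, avoid=None)
    while path_hits_zone(path, no_go_mask):
        path = plot_path(merged, avoid=no_go_mask)

    # The returned path would be saved to memory for use during the procedure.
    return path
```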
  • FIG. 1 is a diagrammatic view illustrating an overview of a robotic system and a navigation system in accordance with the present disclosure.
  • FIG. 2 is a detailed environmental view of the robotic system and a tracking system in accordance with the present disclosure.
  • FIG. 3 is a detailed view of additional features of the robotic system and the navigation system.
  • FIG. 4A illustrates an MRI image of a spine of a subject.
  • FIG. 4B illustrates a CT image of the spine of the subject.
  • FIG. 5 is a merged image of the MRI image and the CT image of the spine of the subject.
  • FIG. 6 is a cross-sectional view of the merged image of the spine.
  • FIG. 7A illustrates a method in accordance with the present disclosure for identifying a no-go zone during planning and execution of spine surgery.
  • FIG. 7B is a continuation of the method of FIG. 7A.
  • FIG. 1 is a diagrammatic view illustrating an overview of a procedure room or arena.
  • the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures.
  • the robotic system 20 may include a Mazor XTM robotic guidance system, sold by Medtronic, Inc., which is configured to apply restraint to movement of an instrument.
  • the robotic system 20 may also include a Mako robotic-assisted surgical robot offered by Stryker Corporation.
  • the robotic system 20 may be used to assist in guiding selected instruments during manual movement by the user 72, such as drills, screws, etc. relative to a subject 30.
  • the robotic system 20 may also be used to assist in moving selected instruments, such as drills, screws, etc. relative to a subject 30.
  • the robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30.
  • the robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44.
  • the end effector 44 may be any appropriate portion, such as a tube, guide, or passage member.
  • the end effector 44 may be moved relative to the base 38 with one or more motors.
  • the position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20.
  • the navigation system 26 can be used to track the location of one or more tracking devices, which may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, and/or a tool tracking device 66, for example.
  • a tool or instrument 68 may be any appropriate tool such as a drill, forceps, or other tool operated by a user 72.
  • the tool 68 may also include an implant, such as a spinal implant or orthopedic implant.
  • the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc.
  • the instruments may be used to navigate or map any region of the body.
  • the navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
  • An imaging device 80 may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject.
  • the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA.
  • the imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed.
  • the image capturing portion may include an x-ray source or emission portion and an x-ray receiving or image receiving portion located generally or as practically possible 180 degrees from each other and mounted on a rotor relative to a track or rail.
  • the image capturing portion can be operable to rotate 360 degrees during image acquisition.
  • the image capturing portion may rotate around a central point or axis, allowing image data of the subject 30 to be acquired from multiple directions or in multiple planes.
  • the imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof.
  • the imaging device 80 can utilize flat plate technology having a 1,720 by 1,024 pixel viewing area.
  • the position of the imaging device 80, and/or portions therein such as the image capturing portion can be precisely known relative to any other portion of the imaging device 80.
  • the imaging device 80 can know and recall precise coordinates relative to a fixed or selected coordinate system. This can allow the imaging system 80 to know its position relative to the patient 30 or other references.
  • the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30.
  • the imaging device 80 can also be tracked with a tracking device 62.
  • the image data defining an image space acquired of the patient 30 can, according to various embodiments, be inherently or automatically registered relative to an object space.
  • the object space can be the space defined by a patient 30 in the navigation system 26.
  • the automatic registration can be achieved by including the tracking device 62 on the imaging device 80 and/or the determinable precise location of the image capturing portion.
  • imageable portions, virtual fiducial points and other features can also be used to allow for registration, automatic or otherwise. It will be understood, however, that image data can be acquired of any subject which will define subject space.
  • Patient space is an exemplary subject space. Registration allows for a transformation map to be determined between patient space and image space to correlate points in each to one another.
  • the patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58.
  • the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration.
  • registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data.
  • a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84.
  • Various tracking systems such as one including an optical localizer 88 or an electromagnetic (EM) localizer 94 can be used to track the instrument 68.
  • More than one tracking system can be used to track the instrument 68 in the navigation system 26.
  • these can include an electromagnetic tracking (EM) system having the EM localizer 94 and/or an optical tracking system having the optical localizer 88.
  • EM electromagnetic tracking
  • optical tracking system having the optical localizer 88.
  • Either or both of the tracking systems can be used to track selected tracking devices, as discussed herein. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system.
  • a tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
  • the imaging device 80 may be an imaging device other than the O-arm® imaging device and may include in addition or alternatively a fluoroscopic C-arm.
  • Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc.
  • the imaging device 80 may be configured as any suitable imaging device, such as one or more of an MRI imaging device, CT imaging device, ultrasound imaging device, etc.
  • an imaging device controller 96 may control the imaging device 80 and can receive the image data generated at the image capturing portion and store the images for later use.
  • the controller 96 can also control the rotation of the image capturing portion of the imaging device 80.
  • the controller 96 need not be integral with the gantry housing 82, but may be separate therefrom.
  • the controller 96 may be portions of the navigation system 26 that may include a processing and/or control system 98 including a processing unit or processing portion 101.
  • the controller 96 may be integral with the gantry 82 and may include a second and separate processor, such as that in a portable computer.
  • the patient 30 can be fixed onto an operating table 104.
  • the table 104 can be an Axis Jackson® operating table sold by OSI, a subsidiary of Mizuho Ikakogyo Co., Ltd., having a place of business in Tokyo, Japan, or Mizuho Orthopedic Systems, Inc., having a place of business in California, USA.
  • Patient positioning devices can be used with the table, and include a Mayfield® clamp or those set forth in commonly assigned U.S. Pat. Appl. No. 10/405,068, published as U.S. Pat. App. Pub. No. 2004/0199072, entitled “An Integrated Electromagnetic Navigation And Patient Positioning Device”, filed April 1, 2003, which is hereby incorporated by reference.
  • the position of the patient 30 relative to the imaging device 80 can be determined by the navigation system 26.
  • the tracking device 62 can be used to track and locate at least a portion of the imaging device 80, for example the gantry or housing 82.
  • the patient 30 can be tracked with the dynamic reference frame 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 80 can be determined. Further, the location of the imaging portion can be determined relative to the housing 82 due to its precise position on the rail within the housing 82, substantially inflexible rotor, etc.
  • the imaging device 80 can include an accuracy of within ten microns, for example, if the imaging device 80 is an O-Arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado.
  • the imaging device 80 can generate and/or emit x-rays from the x-ray source that propagate through the patient 30 and are received by the x-ray imaging receiving portion.
  • the image capturing portion generates image data representing the intensities of the received x-rays.
  • the image capturing portion can include an image intensifier that first converts the x-rays to visible light and a camera (e.g., a charge-coupled device) that converts the visible light into digital image data.
  • the image capturing portion may also be a digital device that converts x-rays directly to digital image data for forming images, thus potentially avoiding distortion introduced by first converting to visible light.
  • Two dimensional and/or three dimensional fluoroscopic image data that may be taken by the imaging device 80 can be captured and stored in the imaging device controller 96. Multiple image data taken by the imaging device 80 may also be captured and assembled to provide a larger view or image of a whole region of a patient 30, as opposed to being directed to only a portion of a region of the patient 30. For example, multiple image data of the patient’s 30 spine may be appended together to provide a full view or complete set of image data of the spine.
  • the image data can then be forwarded from the image device controller 96 to the navigation computer and/or processor system 101 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98.
  • the work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data.
  • the user interface 106 which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84.
  • the work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
  • the navigation system 26 can further include the tracking system including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88.
  • the tracking systems may include a controller and interface portion 110.
  • the controller 110 can be connected to the processor portion 101, which can include a processor included within a computer.
  • the EM tracking system may include the STEALTHSTATION® AXIEMTM Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Patent Application Serial No. 10/941,782, filed Sept. 15, 2004, and entitled "METHOD AND APPARATUS FOR SURGICAL NAVIGATION"; U.S. Patent No.
  • the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7TM tracking systems having an optical localizer, that may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado.
  • Other tracking systems include an acoustic, radiation, radar, etc. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except when to clarify selected operation of the subject disclosure.
  • Wired or physical connections can interconnect the tracking systems, imaging device 80, etc.
  • various portions such as the instrument 68 may employ a wireless communications channel, such as that disclosed in U.S. Patent No. 6,474,341, entitled “Surgical Communication Power System,” issued November 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110.
  • the tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
  • the instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device.
  • the instrument 68 can include a graspable or manipulable portion at a proximal end and the tracking devices may be fixed near the manipulable portion of the instrument 68. Additional representative or alternative localization and tracking system is set forth in U.S. Patent No. 5,983,126, entitled “Catheter Location System and Method,” issued November 9, 1999, which is hereby incorporated by reference.
  • the navigation system 26 may be a hybrid system that includes components from various tracking systems.
  • the navigation system 26 can be used to track the instrument 68 relative to the patient 30.
  • the instrument 68 can be tracked with the tracking system, as discussed above.
  • Image data of the patient 30, or an appropriate subject can be used to assist the user 72 in guiding the instrument 68.
  • the image data is registered to the patient 30.
  • the image data defines an image space that is registered to the patient space defined by the patient 30.
  • registration allows a transformation map to be generated of the physical location of the instrument 68 relative to the image space of the image data.
  • the transformation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image data 108.
  • a graphical representation 68i also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
  • the tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly.
  • the fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58.
  • the fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy. As illustrated in Figs. 1 and 2, the fiducial assembly 120 can be interconnected with a portion of a spine F, such as a spinous process 130.
  • the fixation portion 124 can be interconnected with a spinous process 130 in any appropriate manner.
  • a pin or a screw can be driven into the spinous process 130.
  • a clamp portion 124 can be provided to interconnect the spinous process 130.
  • the fiducial portions 120 may be imaged with the imaging device 80. It is understood, however, that various portions of the subject (such as a spinous process) may also be used as a fiducial portion.
  • image data is generated that includes or identifies the fiducial portions 120.
  • the fiducial portions 120 can be identified in image data automatically (e.g., with a processor executing a program), manually (e.g., by selection and identification by the user 72), or combinations thereof (e.g., by selection and identification by the user 72 of a seed point and segmentation by a processor executing a program).
  • Methods of automatic imageable portion identification include those disclosed in U.S. Patent No. 8,150,494 issued on April 3, 2012, incorporated herein by reference.
  • Manual identification can include selecting an element (e.g. pixel) or region in the image data wherein the imageable portion has been imaged.
  • the fiducial portions 120 identified in the image data can be used as fiducial points or positions that can be used to register the image data or the image space of the image data with patient space.
  • the fiducial portions 120 that are identified in the image 108 may then be identified in the subject space defined by the subject 30, in an appropriate manner.
  • the user 72 may move the instrument 68 relative to the subject 30 to touch the fiducial portions 120, if the fiducial portions are attached to the subject 30 in the same position during the acquisition of the image data to generate the image 108.
  • the fiducial portions 120 may be attached to the subject 30 and/or may include anatomical portions of the subject 30.
  • a tracking device may be incorporated into the fiducial portions 120 and they may be maintained with the subject 30 after the image is acquired.
  • the registration or the identification of the fiducial portions 120 in a subject space may be made.
  • the user 72 may move the instrument 68 to touch the fiducial portions 120.
  • the tracking system such as with the optical localizer 88, may track the position of the instrument 68 due to the tracking device 66 attached thereto. This allows the user 72 to identify in the navigation space the locations of the fiducial portions 120 that are identified in the image 108.
  • the transformation map may be made between the subject space defined by the subject 30 in a navigation space and the image space defined by the image 108. Accordingly, identical or known locations allow for registration as discussed further herein.
  • a transformation map is determined between the image data coordinate system of the image data, such as the image 108 and the patient space defined by the patient 30.
  • the instrument 68 can be tracked with the tracking system that is registered to the image data to allow an identification and illustration of a position of the tracked instrument 68 as an icon superimposed on the image data. Registration of the image 108 (or any selected image data) to the subject 30 may occur at any appropriate time.
  • the instrument 68 can be tracked relative to the image 108.
  • the icon 68i representing a position or pose (which may include a 6 degree of freedom position (including 3D location and orientation)) of the instrument 68 can be displayed relative to the image 108 on the display 84. Due to the registration of the image space to the patient space, the position of the icon 68i relative to the image 108 can substantially identify or mimic the location of the instrument 68 relative to the patient 30 in the patient space. As discussed above, this can allow a navigated procedure to occur.
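  • As one concrete, hypothetical illustration of how a transformation map between image space and patient space might be computed from the fiducial portions 120 located in both spaces, the sketch below uses a standard least-squares rigid fit (Kabsch/Procrustes). The disclosure does not specify this particular algorithm; it is an assumed example only.

```python
import numpy as np

def rigid_registration(image_pts, patient_pts):
    """Least-squares rigid transform mapping image-space fiducials onto the
    same fiducials located in patient (navigation) space.

    image_pts, patient_pts: (N, 3) arrays of corresponding fiducial positions.
    Returns rotation R (3x3) and translation t such that patient ~= R @ image + t.
    """
    src = np.asarray(image_pts, dtype=float)
    dst = np.asarray(patient_pts, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)

    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def to_patient_space(R, t, point_in_image):
    """Map a point from image space into patient space using the transform."""
    return R @ np.asarray(point_in_image, dtype=float) + t
```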
  • the robotic system 20 having the robotic system coordinate system may be registered to the navigation space coordinate system, as discussed herein, due to the reference tracking device 54 (e.g., if fixed to a known position on or relative to the robotic system 20) and/or due to the tracking of a snapshot tracking device 160. Due to or during registration, a transformation map is determined between the robotic system coordinate system of the robotic system and the navigation space coordinate system patient space defined by the patient 30.
  • the snapshot tracking device 160 may include one or more trackable portions 164 that may be tracked with the localizer 88 or any appropriate localizer (e.g. optical, EM, radar). It is understood, however, that any appropriate tracking system may be used to track the snapshot tracking device 160.
  • a fixed reference tracking device may also be positioned within the navigation space.
  • the fixed navigation tracker may include the patient tracker 58, which may be connected to the patient 30 and/or the robot tracker 54 that may be fixed to the mount 34 of the robotic system 20.
  • the reference tracker therefore, may be any appropriate tracker that is positioned relative to the snapshot tracker 160 that is within the navigation coordinate space during the registration period.
  • the robot tracker 54 will be referred to, however, the patient tracker 58 may also be used as the reference tracker.
  • the reference tracker may be positioned within the coordinate system at any position relative to the snapshot tracker 160 as long as the snapshot tracker 160 may be tracked relative to the reference tracker.
  • the snapshot tracker 160 may be positioned at a known position relative to the end effector 44.
  • the snapshot tracker 160, as illustrated in FIG. 3, includes the trackable portions 164 and extends from a rod or connection member 168.
  • the connection member 168 may include a keyed portion, such as a projection 172 that may engage a slot 174 of the end effector 44.
  • the end effector 44 may form or define a cannula or passage 176 that may engage the connector 168.
  • the connector 168 may be positioned within the passage 176 of the end effector 44.
  • the connector 168 may then be fixed to the end effector 44, such as with a fixation member including a set screw or clamping of the end effector 44, such as with a set screw or clamping member 180.
  • the localizer 88 may then view or determine a position of the snapshot tracking device 160 relative to the reference tracking device 54 and/or the reference tracking device 58. As the localizer 88 defines, or may be used to define, the navigation space, determining or tracking a position of the snapshot tracking device 160 relative to the reference frame 54 may be used to determine a relationship between a position within the navigation space and the robotic space of the end effector 44.
  • the navigation space defined by the localizer 88 may include the full navigation space 170, which may include portions relative to the subject, such as the subject tracker 58 and other portions that may be moved therein, such as the instrument 68.
  • the robotic registration space may be smaller and may include a robotic registration space 174 that may include the reference frame 54 and the snapshot tracker 160.
  • the robot registration navigation space may include the snapshot tracker 160 and the patient tracker 58 for registration.
  • the exemplary registration navigation space 174 is merely for the current discussion. Both the robotic reference tracker 54 and the patient tracker 58 need not be used simultaneously. This is particularly true when the patient 30 is fixed in space, such as fixed relative to the robotic system 20.
  • the processor 101 is configured as a tracking system control module, and will subsequently be referred to herein as such.
  • the tracking system control module 102 is in communication with a robotic arm control module 152 (see FIG. 2) in any suitable manner.
  • the tracking system control module 102 tracks the robotic arm 40, and thus knows the orientation and position of the robotic arm 40 within the three-dimensional coordinate system of the arm 40.
  • the position of the snapshot tracking device 160 and the end effector 44 relative to the robotic arm 40 is input to the tracking system control module 102.
  • the tracking system control module 102 thus specifically knows the positions of the snapshot tracking device 160 and the end effector 44 within the three- dimensional coordinate system of the arm 40.
  • the robotic arm 40 may be tracked physically with any suitable encoders, etc.
  • encoders included with selected motors and/or included at various joints of the arm 40 may be used to determine position of the robotic arm 40, including the end effector.
  • the encoders may determine an amount of relative and/or total movement of the respective portions of the arm 40 to determine pose of the end effector at a selected time.
  • the encoders may produce a signal that is sent to the processor 101 to allow determination and navigation of the end effector of the arm 40.
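  • The encoder-based pose determination described above can be viewed as a forward-kinematics chain: each joint reports an angle, and the end-effector pose is the product of the joint transforms. The two-joint planar example below is purely illustrative and is not the kinematic model of the disclosed robotic arm.

```python
import numpy as np

def joint_transform(angle_rad, link_length):
    """Homogeneous 2D transform for one revolute joint followed by a link."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, link_length * c],
                     [s,  c, link_length * s],
                     [0,  0, 1.0]])

def end_effector_pose(encoder_angles_rad, link_lengths):
    """Chain the joint transforms (e.g., an elbow joint followed by a wrist
    joint) to obtain the end-effector pose relative to the robot base."""
    pose = np.eye(3)
    for angle, length in zip(encoder_angles_rad, link_lengths):
        pose = pose @ joint_transform(angle, length)
    return pose  # last column holds the end-effector position in the base frame

# Example: elbow encoder at 30 degrees, wrist at -15 degrees, 0.4 m and 0.3 m links.
pose = end_effector_pose(np.radians([30.0, -15.0]), [0.4, 0.3])
print(pose[:2, 2])  # x, y of the end effector relative to the base
```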
  • the location of the optical tracking device 160 (as well as the end effector 44 and the robotic arm 40 generally) within a coordinate system of the optical localizer 88 is known and input to the processor 101. That is, the tracking system control module 102 is configured to execute instructions to know the pose (including spatial coordinates and orientation) of the optical tracking device 160 when it is visible to the optical localizer 88.
  • the relative location of the portions attached or fixed to the optical tracking device 160 may also be known or recalled from a selected memory system, such as a portion of the arm 40 or instrument or portion thereof attached to the tracking device 160.
  • the pose of the distal end may also be tracked or navigated.
  • the tracking system control module 102 also knows the location of the snapshot tracking device 160 within the coordinate system of the robotic arm 40 at all times. Based on known locations of the snapshot optical tracking device 160 in both the coordinate system of the robotic arm 40 and the coordinate system of the optical localizer 88, the tracking system control module 102 is configured to execute instructions to determine the position of any appropriate portion associated with the arm 40 in the coordinate system of the robotic arm 40. This is, at least in part, due to the registration or correlation of the optical localizer 88 coordinate system and the coordinate system of the robotic arm 40 as discussed above.
  • the robotic system 20 includes the robotic arm control module 152 configured to control movement of the robotic arm 40.
  • the robotic arm control module 152 may receive the encoder signals, as discussed above.
  • the robotic arm control module 152, therefore, may know or determine a pose of the end effector 44 at a selected time based on inputs from its controls and sensors.
  • the pose may be relative to the base of the robotic system 20 and/or a pose in navigation space.
  • the tracking system control module 102 is configured to track movement of the tracking device 58 relative to the localizer 88.
  • the localizer 88 may include at least two of the receivers (e.g., at least two laser receivers, optical wavelength cameras, etc.) to generate a stereoscopic image or view.
  • the localizer 88 may be an optical localizer as discussed above and as used in various tracking systems, including various STEALTHSTATION® surgical navigation systems such as the TREON®, S7TM, or S8TM tracking systems having an optical localizer, sold by Medtronic Navigation, Inc. of Colorado. Other tracking systems that may require line-of-sight, at least for accuracy and/or speed, may include acoustic, radar, etc.
  • FIG. 4A is an exemplary soft tissue image, which may be and is also referred to herein as an MRI image 210, of the subject’s spine 126.
  • the MRI image 210 is a sagittal view.
  • the MRI image 210 may be captured with an appropriate imaging device; in various embodiments, the imaging device 80 may be a suitable MRI imaging device.
  • the MRI image 210 captures or includes various portions of the spine 126, particularly the soft tissue in and/or around the bony portions thereof.
  • the MRI image 210 includes the spinal cord 220, spinal nerves 222, ligaments 224, and intervertebral discs 226.
  • the MRI image 210 may be configured to capture and/or segment soft tissue structure, for example.
  • the MRI image 210 may also display bony areas of the spine 126, such as, but not limited to, vertebra, facet joints, and pars interarticularis, for example.
  • FIG. 4B is an exemplary CT image 250 of the subject’s spine 126.
  • the CT image 250 may be captured with an appropriate imaging device, such as the imaging device 80 in various embodiments, or any other suitable CT imaging device.
  • the CT image 250 captures various portions of the spine 126, particularly the bony areas thereof.
  • the CT image 250 may include detail or definition of various bony portions and may include vertebra 230, facet joints 232, and pars interarticularis 234.
  • Various other bony regions of or near the spine 126 may also be displayed such as a sacrum, pelvis, or ribs.
  • the image data may be acquired at any appropriate time.
  • the image data may be acquired in a planning or pre-operative phase or time period.
  • the image data may be acquired in a non-invasive manner to assist in planning a procedure, as discussed herein.
  • the image data may be acquired intra- or post-operatively to assist in further planning and/or confirmation of a procedure.
  • the imaging device controller 96 is configured to merge the MRI image 210 and the CT image 250 into a merged image including both soft tissue and bony tissue.
  • the image data such as the MRI image 210 and the CT image 250 may be individually acquired at any appropriate time, such as prior to (pre-), during (intra-), and/or following (post-) at least a portion of a procedure including an operation.
  • any merged image may include two or more of the types of image data.
  • FIG. 5 illustrates a merged image 310A, which is a sagittal view of the spine 126, formed by merging the MRI image 210 and the CT image 250.
  • FIG. 6 is a merged image 310B, which is a cross-sectional axial view of the spine 126, viewed from inferior to superior, formed by merging the MRI image 210 and the CT image 250.
  • the merged image 310B shows a spinous process 240, transverse processes 242, a first pedicle 244, a second pedicle 246, and a vertebral foramen 248 through which the spinal cord 220 extends.
  • the tracking system control module 102 is configured to display the merged images 310A, 310B on the display 84.
  • the merged images 310A and 310B are formed in an appropriate manner.
  • the MRI image 210 and the CT image 250 are merged and various views thereof may be made, such as the sagittal view 310A and the cross-section or slice view 310B.
  • the merged image is generated by registering and/or overlaying two or more images or image data sets. For example, identical points or portions of both the MRI image 210 and the CT image 250 may be identified and this may be used to register the images together.
  • the merging process may include the registration and/or additional processes that allow two or more data sets, such as images, to be related to one another such as on a point by point basis.
  • registration processes include a 3D-to-3D registration. Generally, one skilled in the art will understand that the registration is volume to volume and, therefore, the images can be cross-correlated.
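  • One way to realize the point-by-point relation described above, once a 3D-to-3D rigid transform between the CT and MRI volumes is known, is to resample one volume into the grid of the other and overlay them. The nearest-neighbor resampling and simple blend below are a hypothetical simplification; real systems typically use interpolation and more sophisticated registration.

```python
import numpy as np

def resample_into(reference, moving, R, t):
    """Resample `moving` into the voxel grid of `reference` using the rigid
    transform (R, t) that maps reference voxel coordinates into `moving`
    voxel coordinates. Nearest-neighbor lookup, purely illustrative."""
    out = np.zeros_like(reference, dtype=float)
    idx = np.indices(reference.shape).reshape(3, -1).T  # (N, 3) voxel coords
    mapped = np.rint(idx @ R.T + t).astype(int)
    # Keep only coordinates that land inside the moving volume.
    ok = np.all((mapped >= 0) & (mapped < np.array(moving.shape)), axis=1)
    out.reshape(-1)[ok] = moving[tuple(mapped[ok].T)]
    return out

def merge_volumes(ct_volume, mri_volume, R, t, alpha=0.5):
    """Blend the CT (bone detail) with the registered MRI (soft tissue)."""
    mri_in_ct_grid = resample_into(ct_volume, mri_volume, R, t)
    return alpha * ct_volume + (1.0 - alpha) * mri_in_ct_grid
```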
  • a soft tissue portion in a region of interest in the imaged area of the spine 126 is further processed to segment the soft tissue and the bony tissue.
  • a type of segmentation may occur for the entire image data set (e.g., the imaged area of the spine 126).
  • the spinal cord 220, the spinal nerves 222, the ligaments 224, and the intervertebral discs 226 may be segmented during soft tissue segmentation.
  • the vertebra 230, the facet joints 232, and the pars interarticularis 234 may be segmented during bony segmentation.
  • the segmentation may be within a manually selected region of interest, an entire image volume, a predefined or automatically determined region, or any other appropriate portion.
  • Segmentation may occur in various manners and/or according to various techniques. Segmentation may include edge detection, contrast variation determination, and the like. According to various embodiments, segmentation of the merged images 310A, 310B may include selected and appropriate techniques and processes, such as those understood by those skilled in the art. These may include a machine learning system, such as a neural network (artificial intelligence), that is trained to perform an appropriate segmentation of merged images. This may be done for either or both soft tissue and bone.
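  • As a toy illustration of how segmentation might produce bone and soft-tissue masks, the sketch below uses simple intensity thresholding; a clinical system would more likely use trained models (e.g., a neural network) or the edge/contrast techniques noted above. The threshold values here are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def segment_bone(ct_like, hu_threshold=300.0):
    """Bone tends to be bright in CT; a threshold stands in for a real
    segmentation model (e.g., a trained neural network)."""
    return ct_like >= hu_threshold

def segment_soft_tissue(mri_like, low=80.0, high=200.0):
    """Crude intensity band on MRI data as a placeholder for soft-tissue
    segmentation of the cord, nerves, discs, and ligaments."""
    return (mri_like >= low) & (mri_like <= high)

# Example on synthetic data: each mask is a boolean volume the planner can
# combine into a labeled, segmented image.
volume = np.random.default_rng(0).uniform(0, 1000, size=(64, 64, 64))
bone_mask = segment_bone(volume)
soft_mask = segment_soft_tissue(volume) & ~bone_mask
```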
  • FIGS. 7A and 7B illustrate an exemplary method 510 in accordance with the present disclosure for determining instrument no-go (also referred to herein as no-fly or stop-navigation) zone identification during planning and execution of spine surgery. It is understood, however, that the method 510 may be used to plan or assist in planning any appropriate procedure. For example, a partial or total knee replacement, laryngeal, wrist, or other appropriate procedure may have regions or zones that may be identified and selected as no-go zones.
  • the no-go zones are areas or volumes that instruments are selected not to enter, to protect sensitive portions of the spine 126, such as the spinal cord 220, the spinal nerves 222, and the ligaments 224.
  • the no-go zones may also include portions of the vertebra 230 as well.
  • the method 510 includes scanning the subject 30 at block 512 to obtain or capture the MRI image 210.
  • the subject 30 may be scanned to obtain or capture the CT image 250.
  • the CT image 250 is optional, and thus the method 510 may be performed using only the MRI image 210.
  • the images 210, 250 are merged at block 520 to obtain the merged images 310A and 310B, as described above.
  • Merged images may be registered volume to volume; each segmentation may occur independently, and the registration brings them together. In various embodiments, therefore, the segmentation may happen before or after the merging.
  • the volume of the selected image is segmented. As discussed above, only the MRI image 210 (acquired at block 512) may be selected; therefore, only the MRI image may be segmented in block 522.
  • the merged images 310A, 310B are segmented as described above.
  • the merged images 310A, 310B are further segmented at blocks 524 and 526.
  • bone is segmented.
  • soft tissue is segmented. It is further understood that the various images may be segmented separately. For example, soft tissue may be segmented and identified in the MRI image 210. That is, the spinal cord 220, the nerves 222, or other appropriate portions may be segmented in the MRI image 210.
  • the segmented portions may be identified automatically or manually in the segmented image.
  • the CT image 250 may be segmented to segment bone portions.
  • Various portions of the bone may also be identified in the segmented image and further segmented to delineate those portions, such as the facet joints 232 and/or the pars interarticularis 234.
  • cuts to the bony areas of the spine 126 are planned and/or saved in any appropriate manner such as manual identification with the merged image.
  • sensitive areas of the spine 126 are identified as no-go zones for instrumentation. Sensitive soft tissue areas include, but are not limited to, the spinal cord 220 and spinal nerves 222. Various bony areas may be designated as no-go zones as well. For example, areas of the vertebra 230 may need to be removed to relieve pressure on the spinal nerves 222 and/or the spinal cord 220. The bony areas must be removed without damaging the nerves 222 and the spinal cord 220.
  • Selecting or identifying no-go zones may be used to ensure or assist in ensuring to not remove too much bone, which may destabilize the spine 126. Depending on the particular surgery, it may also be desirable to remove no more than 50% of the facet joints 232, and/or to keep the pars interarticularis 234 intact. Otherwise, destabilization may occur, which may require implanting rods and screws.
  • FIGS. 5 and 6 illustrate exemplary no-go zones 610A - 610E identified in the merged images 310A and 310B.
  • the no-go zones 610A - 610E are sensitive areas into which, in accordance with the present disclosure, instruments are or should be selectively restricted from traveling.
  • Exemplary surgical instrumentation includes, but is not limited to, the robotic arm 40 and the instrument 68.
  • Exemplary no-go zone 610A includes the spinal cord 220.
  • No-go zone 610B includes the nerve endings 222.
  • No-go zone 610C includes intervertebral discs 226.
  • No-go zone 610D includes facet joints 232.
  • No-go zone 610E includes pars interarticularis 234.
  • Another no-go zone may include portions of the vertebra 230 to prevent over-resection of the vertebra 230. Any suitable additional no-go zones may be added as well.
  • the no-go zones 610 may be identified in any suitable manner.
  • the no-go zones 610 may be identified manually by way of the user interface 106 and/or the display 84. More specifically, using any suitable input device, such as a stylus, mouse, trackball, etc., or touching the display 84 configured as a touch display, personnel may designate one or more no-go zones, such as, but not limited to, one or more of the no-go zones 610A-E.
  • the tracking system control module 102 may be configured to automatically identify sensitive tissue areas and bony areas as no-go zones based on the anatomy in the area of the procedure. As discussed above, it may be selected to restrict or eliminate movement of various instruments into selected regions.
  • the selected regions may be identified or include regions that include selected soft tissue portions, such as nerves or the spinal cord. Selected regions may also include boney portions, such as spinal facets. Accordingly, the system may identify, such as via segmentation, volumes that are identified as nerves, other selected soft tissue, or selected bony regions. The volumes identified as nerves may be identified as no-go zones.
  • the no-go zones may be identified based upon the segmentation of the images, such as the MRI image 210 or the CT scan 250. Further, various volumes or dimensions of the boney portions may be identified and limitations may be made or identified relative thereto regarding an amount of removal.
  • the facets 232 may be identified and a no-go zone or limitation line 232a may be identified or selected as a no-go zone to limit the amount of material removed from the facets 232.
  • the no-go zones or at least portions thereof may be identified substantially automatically by the system by executing instructions regarding the segmented regions and/or a dimension relative to an exterior surface or external surface of a segmented region. Additional automatic no-go zones may be identified by portions that if removed may cause instability in the spine or are prone to cause issues in the rehabilitation process. These may be identified in general or relative to a specific patient, such as by the user 72.
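  • A minimal sketch of the automatic identification described above: volumes already labeled as sensitive structures (e.g., cord, nerves) become no-go voxels directly, and a resection limit such as the facet line 232a can be expressed as a cap on how much of a labeled structure may be removed. The label names, integer IDs, and the 50% limit used below are illustrative assumptions based on the surrounding text.

```python
import numpy as np

def build_no_go_mask(labels, sensitive_labels=("spinal_cord", "nerve"),
                     label_ids=None):
    """Mark every voxel belonging to a sensitive structure as no-go.

    labels: integer volume of segmentation labels.
    label_ids: mapping from structure name to integer label (assumed here).
    """
    label_ids = label_ids or {"spinal_cord": 1, "nerve": 2, "facet": 3}
    no_go = np.zeros(labels.shape, dtype=bool)
    for name in sensitive_labels:
        no_go |= labels == label_ids[name]
    return no_go

def facet_resection_limit(labels, facet_id=3, max_fraction=0.5):
    """Largest number of facet voxels that may be removed while keeping at
    least (1 - max_fraction) of the facet intact."""
    total = int(np.count_nonzero(labels == facet_id))
    return int(total * max_fraction)
```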
  • safety zones may optionally be added around one or more of the no-go zones 610A-E.
  • the safety zones are buffer zones for the no-go zones 610A-E.
  • the safety zones may extend any suitable distance beyond or from the no-go zones 610A-E, and may have any suitable width/depth.
  • each safety zone may be defined by a boundary that is a selected distance from the no-go zone such as about 3mm-5mm around one or more of the no-go zones 610A-E.
  • the safety zones may be used to provide a first feedback (e.g., audible or visual) and/or limited motion (e.g., slow motion of a robot) and the no-go zones may be used to provide a second feedback that may be different than the first feedback (e.g., audible or visual) and/or stop motion or navigation (e.g., remove tracked pose of instruments from a display or stop motion of a robot).
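  • The buffer behaviour described above can be sketched as a dilation of the no-go mask by the selected margin (about 3 mm to 5 mm in the text), with a tiered response depending on which zone a tracked point falls in. The voxel spacing and the toy dilation routine below are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def dilate(mask, iterations):
    """6-connected binary dilation, one voxel per iteration (edge wrap-around
    from np.roll is ignored for this toy example)."""
    out = mask.copy()
    for _ in range(iterations):
        grown = out.copy()
        for axis in range(out.ndim):
            grown |= np.roll(out, 1, axis=axis) | np.roll(out, -1, axis=axis)
        out = grown
    return out

def build_safety_zone(no_go_mask, margin_mm=4.0, voxel_mm=1.0):
    """Safety zone = voxels within `margin_mm` of the no-go zone, outside it."""
    grown = dilate(no_go_mask, iterations=int(round(margin_mm / voxel_mm)))
    return grown & ~no_go_mask

def zone_response(point_voxel, no_go_mask, safety_mask):
    """First feedback and slowed motion in the safety zone; stop in the no-go zone."""
    i, j, k = point_voxel
    if no_go_mask[i, j, k]:
        return "stop"      # halt robot motion / remove tracked pose from display
    if safety_mask[i, j, k]:
        return "warn"      # audible or visual alert, limited speed
    return "proceed"
```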
  • the tracking system control module 102 is configured to calculate an execution path for a procedure.
  • the execution path may be for movement of the robotic arm 40, the instrument 68, and/or any other instrumentation to execute a particular surgical procedure.
  • the details of the surgical procedure are entered by way of the user interface 106, or in any other suitable manner. Details of the surgical procedure include, but are not limited to, identification of bony areas of the spine 126 to be resected.
  • the procedure execution path may include various information, such as a trajectory planned or selected for an instrument to move along, limitations of movement of various portions of the robotic arm 40, or the like. For example, with reference to the figures, a planned trajectory may include a trajectory or path 541 to engage one or more of the facets 232.
  • the path 541 may include a trajectory of the end effector 44 and/or to guide the instrument 68, such as when the instrument is in the end effector 44 and generally positioned along an instrument axis 68A. Therefore, the trajectory or path 541 may be defined as a path along which a portion of the instrument 68, such as a distal or terminal end of the axis 68a defined by the instrument 68, may pass.
  • the path 541 may also include instructions for movement of the robotic arm 40, including movement of the various portions or joints thereof, such as the joint 48. Additionally, the path 541 may be illustrated on the display, such as a path graphical representation 541 i that may be displayed on the display 84. The user 72 may then move the instrument 68 along the path which may be confirmed by the illustration of the graphical representation 68i on the display 84 relative to the graphical representation of the path 541 i.
  • the path may be entirely manually identified, such as by the user 72.
  • the control system may execute instructions to calculate the path 541 between an entry point and an end point, such as at the facet 232.
  • the position or target of a procedure may be input, such as by the user 72, and one or more entry points may be identified and/or a region may be selected that includes a plurality of points.
  • the path may include at least a straight or curved line, or multiple portions thereof, from the entry point to the procedure target.
  • the path may be determined based on various parameters such as a time of procedure, length of the path, avoiding the no-go zones and/or safety zones, or the like.
  • Various parameters for defining the path may also take into consideration elements that may block the movement of the robotic arm and the robotic arm's ability to execute the path due to its physical limitations.
  • the tracking system control module 102 is configured to determine whether movement of the robotic arm 40, the instrument 68, etc. along the execution path will result in virtual contact with one or more of the no-go zones 610A-E, and thus actual contact with bone or soft tissue within the no-go zones 610A-E. If the tracking system control module 102 determines that contact will occur, then the method 510 returns to block 540, where a revised execution path is calculated. In other words, the tracking system control module may execute instructions to iterate the path calculation to ensure that no-go zones are avoided and/or safety zone passage is minimized, or other parameters met.
  • the no-go zones may include areas that are selected to be entirely restricted for movement of the instrument 68, such as a burr or other resection tool.
  • the safety zones may include zones that are near the no-go zones and may allow for restricted movement, speed of movement, or the like. Therefore, the determination of whether the path goes through a safety zone or no-go zone in block 542 may include a determination of whether a safety zone is passed through and/or no-go zone is planned to be passed through with the path 541 , together or separately.
  • the path 541 may go through a safety zone and the instructions may include limited speed of movement, limited speed of the instrument 68 (e.g., rotational speed of the burr) or other restrictions.
  • a determination that the path goes through a no-go zone may include a requirement that the path be redone. Therefore, a determination of whether the path enters or goes through a safety zone or a no-go zone in block 542 may include altering a portion or all of the path 541 and/or including instructions in the path determination and execution regarding a speed of movement, speed of an instrument, or the like. Additionally, various other parameters may be considered relative to the path, or in determining whether the path goes through a safety zone or no-go zone, such as anatomical maps, best known surgical technique, or bone density maps.
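  • The check at block 542, whether the planned path 541 passes through a safety or no-go zone, can be illustrated by sampling points along the trajectory and testing zone membership at each sample, as below. This is a generic sketch that assumes a straight-line path in voxel coordinates; the disclosed system may use different path representations and tests.

```python
import numpy as np

def sample_path(entry_voxel, target_voxel, step=0.5):
    """Evenly spaced samples along a straight trajectory in voxel coordinates."""
    entry = np.asarray(entry_voxel, dtype=float)
    target = np.asarray(target_voxel, dtype=float)
    n = max(2, int(np.linalg.norm(target - entry) / step) + 1)
    return entry + np.linspace(0.0, 1.0, n)[:, None] * (target - entry)

def path_zone_check(path_samples, no_go_mask, safety_mask):
    """Return 'no_go' if any sample lies in the no-go zone (path must be
    re-planned), 'safety' if it only grazes a safety zone (restricted speed),
    else 'clear'."""
    idx = np.rint(path_samples).astype(int)
    if no_go_mask[idx[:, 0], idx[:, 1], idx[:, 2]].any():
        return "no_go"
    if safety_mask[idx[:, 0], idx[:, 1], idx[:, 2]].any():
        return "safety"
    return "clear"
```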
  • the method 510 proceeds to block 544.
  • the tracking system control module 102 saves the execution path for use during the spinal procedure.
  • the planned path may be saved with the robotic system 20 and/or the navigation system 98.
  • the navigation system may include the processor or tracking system control module 102 and a related memory module 103.
  • the memory module 103 located and/or accessed appropriately, may store the planned path.
  • the method 510 protects sensitive tissue and bony areas.
  • the method 510 may be executed in real time, such as during a surgical procedure, in a real-time phase 545 that includes blocks 550-556 in FIG. 7B.
  • the real-time procedure may include as inputs the determined no-go and/or safety zones. The zones may be used to provide feedback to the user 72 and/or to limit and/or control motion of the robotic arm 40 or the instrument 68.
  • the tracking system control module 102 is configured to track the position of at least one of the robotic arm 40 or the instruments 68 in real time, such as described above with respect to the robotic arm 40 and the instrument 68.
  • the tracking system control module 102 is configured to identify the distance of the instruments from the no-go zones 610A-E.
  • the robotic system 20 may include the base 34.
  • the pose of the arm 40 including the end effector 44 may be known relative to the base 34 with appropriate systems.
  • encoders may be used to determine the pose of the arm 40 relative to the base.
  • the robotic coordinate system, including the pose of the arm 40 relative to the base, may be registered or correlated to the localizer navigation coordinate system, such as that of the localizer 88, and to the image data coordinate system that includes the no-go zones 610A-E. Accordingly, the no-go zones 610A-E may be known in both the robot coordinate system and the localizer 88 coordinate system.
  • the robotic system 20 may be registered or correlated to a selected coordinate system, such as via image-to-image registration or correlation or similar methods, such as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference.
  • the pose of the robotic arm 40 may be restricted to not move within the no-go zones 610A-E and/or alerts may be provided based on the coordinate system of the robotic system 20 itself.
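  • As a purely illustrative sketch of the coordinate correlation described above (the transforms below are assumed values, not a disclosed registration), mapping a no-go zone point from the image coordinate system into the robot coordinate system amounts to composing rigid transforms:

    import numpy as np

    def rigid_transform(R, t):
        """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    # Assumed registrations (identity rotations and example offsets, in mm):
    T_localizer_from_image = rigid_transform(np.eye(3), np.array([2.0, 0.0, 5.0]))
    T_robot_from_localizer = rigid_transform(np.eye(3), np.array([-30.0, 10.0, 0.0]))

    # A no-go zone vertex defined in image-data coordinates (e.g., from the segmented image).
    p_image = np.array([12.0, 7.0, 3.0, 1.0])

    # Composing the registrations makes the zone known in the robot coordinate system as well.
    p_robot = T_robot_from_localizer @ T_localizer_from_image @ p_image
    print(p_robot[:3])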
  • the tracking system control module 102 is configured to provide feedback, such as alerts of increasing intensity or restriction of movement (e.g., force feedback), as the robotic arm 40 or the instrument 68 moves through any of the safety zones towards one of the no-go zones 610A-E. Any suitable alerts may be generated, such as any suitable audible and/or tactile alerts that increase in intensity as the robotic arm 40 moves closer to a particular no-go zone 610A-E.
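  • One way to realize feedback that intensifies with proximity, sketched below for illustration, is to scale an alert level by the remaining distance across a safety margin around a no-go zone; the box-shaped zone, margin value, and 0-to-1 intensity scale are assumptions, not part of this disclosure.

    import numpy as np

    def distance_to_box(p, box):
        """Euclidean distance (mm) from point p to an axis-aligned box; 0 if inside."""
        lo, hi = box
        outside = np.maximum(np.maximum(lo - p, 0.0), p - hi)
        return float(np.linalg.norm(outside))

    def alert_intensity(tip, no_go_box, safety_margin_mm=10.0):
        """Return 0 outside the safety zone, rising toward 1 at the no-go boundary;
        the value can drive audible volume, display flashing, or force feedback."""
        d = distance_to_box(tip, no_go_box)
        if d >= safety_margin_mm:
            return 0.0
        return 1.0 - d / safety_margin_mm

    # Example: a tip 4 mm from the no-go boundary yields intensity 0.6.
    zone = (np.array([0.0, 0.0, 0.0]), np.array([5.0, 5.0, 5.0]))
    print(alert_intensity(np.array([9.0, 2.5, 2.5]), zone))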
  • the robotic arm 40 including the end effector 44, may be used by the user 72. That is, the robotic arm 40 may be manually moved by the user 72 to assist in guiding the procedure.
  • the robotic arm may be moved to position the end effector 44 to assist in positioning or moving the instrument 68 along a path, such as the path 541.
  • the path may not be a predetermined path based upon the acquired images or image data, as discussed above.
  • the identified no-go zones and/or safety zones may be determined substantially in real time based upon image data of the subject 30.
  • the robotic arm 40, while being manually moved, may provide feedback, such as a haptic or force feedback, to the user 72 when a trajectory of the instrument 68 through the end effector 44 may pass through one of the selected no-go zones and/or safety zones.
  • the end effector 44 may engage the instrument 68 in a selected manner, such as with a braking force, as the instrument 68 is moved along a trajectory toward a safety or no-go zone.
  • the brake may include the end effector 44 or a portion thereof compressing against the instrument 68 to slow or stop movement when being moved by the user 72.
  • the alerts may include audible or visual alerts, such as visual alerts displayed on the display device 84.
  • the display device 84 may flash a selected color, include a written notice or warning, or provide other visual output for the user 72.
  • when the tracking system control module 102 detects that the robotic arm 40 or instrument 68 has breached one of the no-go zones 610A-610E, the tracking system control module 102 is configured to stop or deactivate a procedure or navigation at block 556.
  • the deactivation may include stopping movement of the robotic arm 40, deactivating any instrument mounted to the robotic arm 40, and deactivating the instrument 68.
  • the robotic arm 40 may be locked in a selected position.
  • the end effector 44 may engage the instrument 68 to stop or restrict or resist movement of the instrument 68, even along a trajectory of the end effector 44.
  • the user 72 may feel or be provided with feedback that the instrument 68 is within and/or immediately adjacent to a no-go zone. Further, the instrument 68 may be deactivated or stopped, such as stopping a rotation of a burr tip of the instrument 68. Thus, the operation of the instrument 68 may be stopped when a no-go zone is encountered or reached.
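  • A compact sketch of one real-time control cycle corresponding to blocks 550 - 556 is given below; the distance function and the arm/burr interfaces (lock, stop, apply_resistance) are hypothetical names used only to illustrate the warn-then-stop behavior described above.

    def navigation_cycle(tip, no_go_zones, distance_mm, safety_margin_mm, arm, burr):
        """One tracking cycle: measure distance to the nearest no-go zone, provide
        increasing feedback inside the safety margin, and stop on a breach."""
        d = min(distance_mm(tip, zone) for zone in no_go_zones)
        if d <= 0.0:                         # no-go zone breached
            arm.lock()                       # hold the robotic arm in its current pose
            burr.stop()                      # deactivate the instrument (e.g., stop burr rotation)
            return "stopped"
        if d < safety_margin_mm:             # moving through a safety zone
            arm.apply_resistance(1.0 - d / safety_margin_mm)   # stronger feedback when closer
            return "warning"
        return "clear"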
  • the system may operate various portions in the operating theatre to ensure that the no-go zones are not breached or passed through with the instrument 68.
  • the instrument 68 may be used to operate on various portions of bone, such as a portion of the facet 232, but may be stopped when a no-go zone is reached.
  • the surgeon 72 may have access to additional information regarding the procedure and position of the instrument 68, such as a working end thereof, relative to the subject 30.
  • the no-go zones may be determined relative to various data acquired of the subject 30.
  • the image data may include the MRI image data 210 that is captured in block 512 of the method 510.
  • the MRI image data may include information regarding various soft tissue such as the spinal cord 220 and nerves 222, and of other selected soft tissue.
  • the soft tissue information may be used to identify various no-go zones that may not be easily identified in image data that may include x-ray image data or image data that better delineates hard tissue.
  • image data such as the CT image data 250 that is captured in block 514 may be used to identify additional no-go zones or to refine no-go zones.
  • the two different types of image data 210, 250 may be used separately or registered and/or merged together, such as at block 520, to assist in defining no-go zones and/or safety zones for a procedure. This may allow the user 72 to more efficiently and/or more definitively define no-go zones, such as an amount of bone to be removed and/or soft tissue to be avoided.
  • no-go zones may be defined with one or more data sets, including a combination of two types of image data.
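  • As a minimal illustration of combining the two segmentations into zones (the label conventions, margin, and use of SciPy are assumptions for this sketch, not the disclosed method), soft-tissue labels from the MRI data and bone labels from the CT data can be merged into a no-go mask, with a dilated rim serving as the surrounding safety zone:

    import numpy as np
    from scipy.ndimage import binary_dilation

    def build_zone_masks(mri_soft_labels, ct_bone_labels, voxel_mm, safety_margin_mm=3.0):
        """Combine soft-tissue (MRI) and bony (CT) segmentations, assumed to be on a
        common grid after registration, into boolean no-go and safety masks."""
        no_go = (mri_soft_labels > 0) | (ct_bone_labels > 0)
        grow_voxels = max(1, int(round(safety_margin_mm / voxel_mm)))
        safety = binary_dilation(no_go, iterations=grow_voxels) & ~no_go
        return no_go, safety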
  • a robotic arm 40 may be used to define a path that would avoid the no-go zones and/or the robotic system 20 may be used to limit manual movement of the instrument 68 relative to the subject 30. That is, the robotic arm 40 may be used as a guide for manual movement of the instrument 68 (such as via the user 72) and the robotic arm may resist movement into or near a no-go zone. Alternatively or additionally, the robotic arm 40 may move the instrument and may have the path 541 defined to avoid the no-go zones and selected movement in the safety zones.
  • the instrument 68 may be operated or controlled, at least in part, by the navigation system to cease operation of the instrument when the instrument 68 is moved toward a no-go zone. Accordingly, the system may be used to identify and/or limit operation with a selected or determined no-go zone and/or safety zone relative to the subject 30.
  • Example 1 A surgical system comprising: an imaging device control module [e.g., 96] configured to execute instructions to (1) segment soft tissue in an MRI image and generate a segmented image of the MRI image and (2) identify a no-go zone for an instrument [e.g., 68] in the segmented image; a tracking system control module [e.g., 102] configured to execute instructions to (1) plot an execution path for the instrument to perform a procedure at the surgical site and determine whether the execution path includes the no-go zone and (2) modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module [e.g., 103] configured to save the execution path for execution during the procedure.
  • Example 2 The surgical system of example 1, wherein the procedure is a spinal procedure.
  • Example 3 The surgical system of example 2, wherein the soft tissue includes at least one of spinal nerves, a spinal cord, and intervertebral discs.
  • Example 4 The surgical system of example 2, wherein the no-go zone further includes bony areas.
  • Example 5 The surgical system of example 4, wherein the bony areas include at least one of vertebral bodies, facet joints, and pars interarticularis.
  • Example 6 The surgical system of example 1, further comprising the imaging device control module further configured to execute instructions to merge a CT image with the MRI image, segment bony tissue, and generate the segmented image as a merged image based on both the MRI image and the CT image.
  • Example 7 The surgical system of example 1, wherein: at least one of the imaging device control module and the tracking system control module is configured to add a safety zone to the no-go zone; the tracking system control module is configured to track the instrument and generate alerts of increasing intensity as the instrument moves towards and to the no-go zone; and the tracking system control module is configured to deactivate the instrument when the instrument reaches the no-go zone.
  • Example 8 The surgical system of example 1, wherein the instrument is a robotic arm.
  • Example 9 The surgical system of example 1, wherein the instrument is a robotic-assisted instrument.
  • Example 10 A surgical system comprising: an imaging device control module [e.g., 102] configured to merge an MRI image of a surgical site and a CT image of the surgical site into a segmented image, segment bony tissue in the segmented image, and segment soft tissue in the segmented image; and a tracking system control module configured to track an instrument relative to a no-go zone identified in the segmented image including at least one of soft tissue and bony tissue, the tracking system control module configured to generate alerts of increasing intensity as the instrument approaches the no-go zone, and deactivate the instrument when the instrument enters the no-go zone.
  • Example 11 The surgical system of example 10, wherein: the surgical site includes a spine; the bony tissue includes vertebra, facet joints, and pars interarticularis; and the soft tissue includes a spinal cord and spinal nerves.
  • Example 12 The surgical system of example 11, wherein the no-go zone includes at least the spinal nerves.
  • Example 13 The surgical system of example 12, wherein the no-go zone further includes at least one of facet joints and pars interarticularis.
  • Example 14 The surgical system of example 10, wherein the instrument includes at least one of a robotic arm and a robotic-assisted instrument.
  • Example 15 The surgical system of example 10, wherein the tracking system control module is configured to track the instrument in real-time as a procedure is performed at the surgical site.
  • Example 16 The surgical system of example 10, wherein at least one of the imaging device control module and the tracking system control module is configured to add a safety zone around the no-go zone, and generate alerts of increasing intensity as the instrument moves through the safety zone towards the no-go zone.
  • Example 17 The surgical system of example 10, wherein: the tracking system control module is configured to identify an execution path for the instrument to perform a spinal procedure at the surgical site and determine whether the execution path includes the no-go zone, the tracking system control module further configured to modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module configured to save the execution path for execution during the procedure.
  • Example 18 A method of performing a surgical procedure, the method comprising: acquiring an MRI image of a surgical site from an MRI imaging device; acquiring a CT image of the surgical site from a CT imaging device; merging the MRI image and the CT image into a segmented image with an imaging device control module; segmenting bony tissue in at least one of the CT image or the segmented image with the imaging device control module; segmenting soft tissue in at least one of the MRI image or the segmented image with the imaging device control module; and identifying a no-go zone for a surgical instrument in the segmented image.
  • Example 19 The method of example 18, further comprising: calculating an execution path for an instrument to perform a procedure at a surgical site and determining whether the execution path includes the no-go zone with a tracking system control module; modifying the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and saving the execution path to a memory module for execution during the procedure.
  • Example 20 The method of example 18, further comprising: adding a safety zone to the no-go zone; tracking movement of the instrument in real-time during the surgical procedure with the tracking system control module; generating alerts of increasing intensity with the tracking system control module as the instrument moves through the safety zone towards the no-go zone; and deactivating the instrument with the tracking system control module when the instrument enters the no-go zone.
  • Example embodiments are provided so that this disclosure will be thorough and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
  • Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects.
  • the term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules.
  • the term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above.
  • the term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules.
  • the term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
  • the apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs.
  • the computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium.
  • the computer programs may also include or rely on stored data.
  • the computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
  • the computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler, (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc.
  • source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP, Perl, Javascript®, HTML5, Ada, ASP (active server pages), Perl, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
  • Communications, including the wireless communications described in the present disclosure, can be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008.
  • IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
  • the terms processor, processor module, module, or ‘controller’ may be used interchangeably herein (unless specifically disclosed otherwise) and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
  • processors or processor modules may include one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • processors or processor modules may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques.
  • the techniques could be fully implemented in one or more circuits or logic elements.
  • the processor or processors may operate entirely automatically and/or substantially automatically. In automatic operation, the processor may receive input and execute instructions in light thereof. Thus, various outputs may be made without further or any manual (e.g., user) input.


Abstract

An imaging device control module is configured to segment portions of an image, such as soft tissue in an MRI image, and generate a segmented image of the image. The imaging device control module is further configured to identify a no-go zone for an instrument in the segmented image. A tracking system control module is configured to plot an execution path for the instrument to perform a procedure at the surgical site, determine whether the execution path includes the no-go zone, and modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone.

Description

METHOD AND APPARATUS FOR PROCEDURE NAVIGATION
FIELD
[0001] The present disclosure relates to positioning an instrument during planning and execution of a procedure.
BACKGROUND
[0002] This section provides background information related to the present disclosure, which is not necessarily prior art.
[0003] Typically, during a surgical procedure care must be taken to avoid various tissues. With respect to spinal surgery, for example, care must be taken to avoid the spinal cord and nerve endings. Care must also be taken to avoid resecting various bony areas. While current surgical procedures are suitable for their intended use, they are subject to improvement. The present disclosure advantageously provides improved surgical systems and methods that address various needs in the art and provide numerous advantages, as explained in detail herein.
SUMMARY
[0004] This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
[0005] The present disclosure includes a planning and/or navigation system, such as a surgical planning and/or navigation system. The system may incorporate or use a magnetic resonance imaging (MRI) image (including MRI image data) of a planned or actual surgical site. An imaging device control module is configured to segment soft tissue in the MRI image and generate a segmented image of the MRI image, and further configured to identify a no-go zone for an instrument in the segmented image. A tracking system control module is configured to plot an execution path or trajectory for the instrument to perform a procedure at the surgical site and determine whether the execution path includes (e.g., passes through) the no-go zone; the tracking system control module is further configured to modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone. A memory module is configured to save the execution path for execution during the procedure.
[0006] The present disclosure further includes a planning and/or navigation system, such as a surgical planning and/or navigation system, with an MRI image of a surgical site and a projection or x-ray image or image data (such as a computed tomography (CT) image) of the surgical site. The MRI image and the CT image may be segmented. The MRI image may be segmented to segment and identify soft tissue, and the CT image may be segmented to segment and identify hard (e.g., bony) tissue. An imaging device control module is configured to merge the segmented MRI image and the segmented CT image into a segmented image that includes segmented bony tissue and segmented soft tissue. The imaging device control module may alternatively merge the images and then segment the selected portions, as noted above. A tracking system control module is configured to track an instrument relative to a no-go zone identified in the segmented image including at least one of soft tissue and bony tissue, and configured to generate alerts of increasing intensity as the instrument approaches the no-go zone. The tracking system control module is further configured to deactivate the instrument when the instrument enters the no-go zone. [0007] The present disclosure further includes a method of performing a surgical procedure. The method includes the following: acquiring an MRI image of a surgical site, such as from an MRI imaging device; acquiring a CT image of the surgical site, such as from a CT imaging device; merging the MRI image and the CT image into a segmented image with an imaging device control module; segmenting bony tissue in the segmented image with the imaging device control module; segmenting soft tissue in the segmented image with the imaging device control module; identifying a no-go zone for a surgical instrument in the segmented image; plotting an execution path for an instrument to perform a procedure at a surgical site and determining whether the execution path includes the no-go zone with a tracking system control module; modifying the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and saving the execution path to a memory module for execution during the procedure. Further, the MRI and/or CT image data may be segmented prior to merging. The merged image (e.g., merged CT and MRI image) may also be segmented in a selected manner.
[0008] Further areas of applicability will become apparent from the description provided herein. The description and specific examples in this summary are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
DRAWINGS
[0009] The drawings described herein are for illustrative purposes only of select embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure. [0010] FIG. 1 is a diagrammatic view illustrating an overview of a robotic system and a navigation system in accordance with the present disclosure;
[0011] FIG. 2 is a detailed environmental view of the robotic system and a tracking system in accordance with the present disclosure;
[0012] FIG. 3 is a detailed view of additional features of the robotic system and the navigation system;
[0013] FIG. 4A illustrates an MRI image of a spine of a subject;
[0014] FIG. 4B illustrates a CT image of the spine of the subject;
[0015] FIG. 5 is a merged image of the MRI image and the CT image of the spine of the subject;
[0016] FIG. 6 is a cross-sectional view of the merged image of the spine;
[0017] FIG. 7A illustrates a method in accordance with the present disclosure for identifying a no-go zone during planning and execution of spine surgery; and
[0018] FIG. 7B is a continuation of the method of FIG. 7A.
[0019] Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
[0020] Example embodiments will now be described more fully with reference to the accompanying drawings.
[0021] The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the system and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. [0022] FIG. 1 is a diagrammatic view illustrating an overview of a procedure room or arena. In various embodiments, the procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures. The robotic system 20 may include a Mazor X™ robotic guidance system, sold by Medtronic, Inc., which is configured to apply restraint to movement of an instrument. The robotic system 20 may also include a Mako robotic-assisted surgical robot offered by Stryker Corporation. The robotic system 20 may be used to assist in guiding selected instruments during manual movement by the user 72, such as drills, screws, etc. relative to a subject 30. The robotic system 20 may also be used to assist in moving selected instruments, such as drills, screws, etc. relative to a subject 30. The robotic system 20 may include a mount 34 that fixes a portion, such as a robotic base 38, relative to the subject 30. The robotic system 20 may include one or more arms 40 that are moveable or pivotable relative to the subject 30, such as including an end effector 44. The end effector 44 may be any appropriate portion, such as a tube, guide, or passage member. The end effector 44 may be moved relative to the base 38 with one or more motors. The position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20.
[0023] The navigation system 26 can be used to track the location of one or more tracking devices, which may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, and/or a tool tracking device 66, for example. A tool or instrument 68 may be any appropriate tool such as a drill, forceps, or other tool operated by a user 72. The tool 68 may also include an implant, such as a spinal implant or orthopedic implant. It should further be noted that the navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc. Moreover, the instruments may be used to navigate or map any region of the body. The navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
[0024] An imaging device 80 may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject. In the example shown, the imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado, USA. The imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed. The image capturing portion may include an x-ray source or emission portion and an x-ray receiving or image receiving portion located generally, or as near as practically possible, 180 degrees from each other and mounted on a rotor relative to a track or rail. The image capturing portion can be operable to rotate 360 degrees during image acquisition. The image capturing portion may rotate around a central point or axis, allowing image data of the subject 30 to be acquired from multiple directions or in multiple planes. The imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof. In one example, the imaging device 80 can utilize flat plate technology having a 1,720 by 1,024 pixel viewing area. [0025] The position of the imaging device 80, and/or portions therein such as the image capturing portion, can be precisely known relative to any other portion of the imaging device 80. The imaging device 80, according to various embodiments, can know and recall precise coordinates relative to a fixed or selected coordinate system. This can allow the imaging system 80 to know its position relative to the patient 30 or other references. In addition, the precise knowledge of the position of the image capturing portion can be used in conjunction with a tracking system to determine the position of the image capturing portion and the image data relative to the tracked subject, such as the patient 30.
[0026] The imaging device 80 can also be tracked with a tracking device 62. The image data defining an image space acquired of the patient 30 can, according to various embodiments, be inherently or automatically registered relative to an object space. The object space can be the space defined by a patient 30 in the navigation system 26. The automatic registration can be achieved by including the tracking device 62 on the imaging device 80 and/or the determinable precise location of the image capturing portion. According to various embodiments, as discussed herein, imageable portions, virtual fiducial points and other features can also be used to allow for registration, automatic or otherwise. It will be understood, however, that image data can be acquired of any subject which will define subject space. Patient space is an exemplary subject space. Registration allows for a transformation map to be determined between patient space and image space to correlate points in each to one another.
[0027] The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or tracker 58. Alternatively, or in addition thereto, the patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration. As discussed further herein, registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data. When navigating the instrument 68, a position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84. Various tracking systems, such as one including an optical localizer 88 or an electromagnetic (EM) localizer 94 can be used to track the instrument 68.
[0028] More than one tracking system can be used to track the instrument 68 in the navigation system 26. According to various embodiments, these can include an electromagnetic tracking (EM) system having the EM localizer 94 and/or an optical tracking system having the optical localizer 88. Either or both of the tracking systems can be used to track selected tracking devices, as discussed herein. It will be understood, unless discussed otherwise, that a tracking device can be a portion trackable with a selected tracking system. A tracking device need not refer to the entire member or structure to which the tracking device is affixed or associated.
[0029] It is further appreciated that the imaging device 80 may be an imaging device other than the O-arm® imaging device and may include in addition or alternatively a fluoroscopic C-arm. Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc. The imaging device 80 may be configured as any suitable imaging device, such as one or more of an MRI imaging device, CT imaging device, ultrasound imaging device, etc.
[0030] In various embodiments, an imaging device controller 96 may control the imaging device 80 and can receive the image data generated at the image capturing portion and store the images for later use. The controller 96 can also control the rotation of the image capturing portion of the imaging device 80. It will be understood that the controller 96 need not be integral with the gantry housing 82, but may be separate therefrom. For example, the controller 96 may be portions of the navigation system 26 that may include a processing and/or control system 98 including a processing unit or processing portion 101. The controller 96, however, may be integral with the gantry 82 and may include a second and separate processor, such as that in a portable computer.
[0031] The patient 30 can be fixed onto an operating table 104. According to one example, the table 104 can be an Axis Jackson® operating table sold by OSI, a subsidiary of Mizuho Ikakogyo Co., Ltd., having a place of business in Tokyo, Japan or Mizuho Orthopedic Systems, Inc. having a place of business in California, USA. Patient positioning devices can be used with the table, and include a Mayfield® clamp or those set forth in commonly assigned U.S. Pat. Appl. No. 10/405,068 published as U.S. Pat. App. Pub. No. 2004/0199072, entitled “An Integrated Electromagnetic Navigation And Patient Positioning Device”, filed April 1, 2003, which is hereby incorporated by reference.
[0032] The position of the patient 30 relative to the imaging device 80 can be determined by the navigation system 26. The tracking device 62 can be used to track and locate at least a portion of the imaging device 80, for example the gantry or housing 82. The patient 30 can be tracked with the dynamic reference frame 58, as discussed further herein. Accordingly, the position of the patient 30 relative to the imaging device 80 can be determined. Further, the location of the imaging portion can be determined relative to the housing 82 due to its precise position on the rail within the housing 82, substantially inflexible rotor, etc. The imaging device 80 can include an accuracy of within ten microns, for example, if the imaging device 80 is an O-Arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado. Precise positioning of the imaging portion is further described in U.S. Patent Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference.
[0033] According to various embodiments, the imaging device 80 can generate and/or emit x-rays from the x-ray source that propagate through the patient 30 and are received by the x-ray imaging receiving portion. The image capturing portion generates image data representing the intensities of the received x-rays. Typically, the image capturing portion can include an image intensifier that first converts the x-rays to visible light and a camera (e.g., a charge-coupled device) that converts the visible light into digital image data. The image capturing portion may also be a digital device that converts x-rays directly to digital image data for forming images, thus potentially avoiding distortion introduced by first converting to visible light.
[0034] Two dimensional and/or three dimensional fluoroscopic image data that may be taken by the imaging device 80 can be captured and stored in the imaging device controller 96. Multiple image data taken by the imaging device 80 may also be captured and assembled to provide a larger view or image of a whole region of a patient 30, as opposed to being directed to only a portion of a region of the patient 30. For example, multiple image data of the patient’s 30 spine may be appended together to provide a full view or complete set of image data of the spine.
[0035] The image data can then be forwarded from the image device controller 96 to the navigation computer and/or processor system 101 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98. The work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, saving, digitally manipulating, or printing a hard copy image of the received image data. The user interface 106, which may be a keyboard, mouse, touch pen, touch screen or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or adjust the display settings of the display 84. The work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
[0036] With continuing reference to FIG. 1, the navigation system 26 can further include the tracking system including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88. The tracking systems may include a controller and interface portion 110. The controller 110 can be connected to the processor portion 101, which can include a processor included within a computer. The EM tracking system may include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Patent Application Serial No. 10/941,782, filed Sept. 15, 2004, and entitled "METHOD AND APPARATUS FOR SURGICAL NAVIGATION"; U.S. Patent No. 5,913,820, entitled “Position Location System,” issued June 22, 1999; and U.S. Patent No. 5,592,939, entitled “Method and System for Navigating a Catheter Probe,” issued January 14, 1997; all of which are herein incorporated by reference. It will be understood that the navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, that may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado. Other tracking systems include acoustic, radiation, radar, etc. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except when needed to clarify selected operation of the subject disclosure.
[0037] Wired or physical connections can interconnect the tracking systems, imaging device 80, etc. Alternatively, various portions, such as the instrument 68, may employ a wireless communications channel, such as that disclosed in U.S. Patent No. 6,474,341, entitled “Surgical Communication Power System,” issued November 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110. Also, the tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
[0038] Various portions of the navigation system 26, such as the instrument 68, and others as will be described in detail below, can be equipped with at least one, and generally multiple, of the tracking devices 66. The instrument can also include more than one type or modality of tracking device 66, such as an EM tracking device and/or an optical tracking device. The instrument 68 can include a graspable or manipulable portion at a proximal end and the tracking devices may be fixed near the manipulable portion of the instrument 68. An additional representative or alternative localization and tracking system is set forth in U.S. Patent No. 5,983,126, entitled “Catheter Location System and Method,” issued November 9, 1999, which is hereby incorporated by reference. The navigation system 26 may be a hybrid system that includes components from various tracking systems.
[0039] According to various embodiments, the navigation system 26 can be used to track the instrument 68 relative to the patient 30. The instrument 68 can be tracked with the tracking system, as discussed above. Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68. The image data, however, is registered to the patient 30. The image data defines an image space that is registered to the patient space defined by the patient 30.
[0040] Generally, registration allows a transformation map to be generated of the physical location of the instrument 68 relative to the image space of the image data. The transformation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image data 108. A graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
[0041] With continuing reference to FIG. 1 and additional reference to FIG. 2 and FIG. 3, a subject registration system or method can use the tracking device 58. The tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly. The fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58. The fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy. As illustrated in Figs. 1 and 2, the fiducial assembly 120 can be interconnected with a portion of a spine F, such as a spinous process 130.
[0042] The fixation portion 124 can be interconnected with a spinous process 130 in any appropriate manner. For example, a pin or a screw can be driven into the spinous process 130. Alternatively, or in addition thereto, a clamp portion 124 can be provided to interconnect the spinous process 130. The fiducial portions 120 may be imaged with the imaging device 80. It is understood, however, that various portions of the subject (such as a spinous process) may also be used as a fiducial portion. [0043] In various embodiments, when the fiducial portions 120 are imaged with the imaging device 80, image data is generated that includes or identifies the fiducial portions 120. The fiducial portions 120 can be identified in image data automatically (e.g., with a processor executing a program), manually (e.g., by selection and identification by the user 72), or combinations thereof (e.g., by selection and identification by the user 72 of a seed point and segmentation by a processor executing a program). Methods of automatic imageable portion identification include those disclosed in U.S. Patent No. 8,150,494 issued on April 3, 2012, incorporated herein by reference. Manual identification can include selecting an element (e.g., pixel) or region in the image data wherein the imageable portion has been imaged. Regardless, the fiducial portions 120 identified in the image data can be used as fiducial points or positions that can be used to register the image data or the image space of the image data with patient space.
[0044] In various embodiments, to register an image space or coordinate system to another space or coordinate system, such as a navigation space, the fiducial portions 120 that are identified in the image 108 may then be identified in the subject space defined by the subject 30, in an appropriate manner. For example, the user 72 may move the instrument 68 relative to the subject 30 to touch the fiducial portions 120, if the fiducial portions are attached to the subject 30 in the same position during the acquisition of the image data to generate the image 108. It is understood that the fiducial portions 120, as discussed above in various embodiments, may be attached to the subject 30 and/or may include anatomical portions of the subject 30. Additionally, a tracking device may be incorporated into the fiducial portions 120 and they may be maintained with the subject 30 after the image is acquired. In this case, the registration or the identification of the fiducial portions 120 in a subject space may be made. Nevertheless, according to various embodiments, the user 72 may move the instrument 68 to touch the fiducial portions 120. The tracking system, such as with the optical localizer 88, may track the position of the instrument 68 due to the tracking device 66 attached thereto. This allows the user 72 to identify in the navigation space the locations of the fiducial portions 120 that are identified in the image 108. After identifying the positions of the fiducial portions 120 in the navigation space, which may include a subject space, the transformation map may be made between the subject space defined by the subject 30 in a navigation space and the image space defined by the image 108. Accordingly, identical or known locations allow for registration as discussed further herein.
[0045] During registration, a transformation map is determined between the image data coordinate system of the image data, such as the image 108 and the patient space defined by the patient 30. Once the registration occurs, the instrument 68 can be tracked with the tracking system that is registered to the image data to allow an identification and illustration of a position of the tracked instrument 68 as an icon superimposed on the image data. Registration of the image 108 (or any selected image data) to the subject 30 may occur at any appropriate time.
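As a generic illustration of computing such a transformation map from corresponding fiducial points (this is the standard SVD-based least-squares rigid fit, not necessarily the registration used by the described system), the rotation and translation mapping navigation-space points onto image-space points can be estimated as follows:

    import numpy as np

    def rigid_registration(points_subject, points_image):
        """Least-squares rigid transform (R, t) mapping subject/navigation-space
        fiducial points onto the corresponding image-space points. Inputs are
        two (N, 3) arrays with matching row order."""
        P = np.asarray(points_subject, dtype=float)
        Q = np.asarray(points_image, dtype=float)
        cP, cQ = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
        R = Vt.T @ D @ U.T
        t = cQ - R @ cP
        return R, t  # image_point ≈ R @ subject_point + t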
[0046] After the registration of the image space to the patient space, the instrument 68 can be tracked relative to the image 108. As illustrated in FIG. 1, the icon 68i representing a position or pose (which may include a 6 degree of freedom position (including 3D location and orientation)) of the instrument 68 can be displayed relative to the image 108 on the display 84. Due to the registration of the image space to the patient space, the position of the icon 68i relative to the image 108 can substantially identify or mimic the location of the instrument 68 relative to the patient 30 in the patient space. As discussed above, this can allow a navigated procedure to occur. [0047] The robotic system 20 having the robotic system coordinate system may be registered to the navigation space coordinate system, as discussed herein, due to the reference tracking device 54 (e.g., if fixed to a known position on or relative to the robotic system 20) and/or due to the tracking of a snapshot tracking device 160. Due to or during registration, a transformation map is determined between the robotic system coordinate system of the robotic system and the navigation space coordinate system, such as the patient space defined by the patient 30. The snapshot tracking device 160 may include one or more trackable portions 164 that may be tracked with the localizer 88 or any appropriate localizer (e.g., optical, EM, radar). It is understood, however, that any appropriate tracking system may be used to track the snapshot tracking device 160. A fixed reference tracking device may also be positioned within the navigation space. The fixed navigation tracker may include the patient tracker 58, which may be connected to the patient 30, and/or the robot tracker 54 that may be fixed to the mount 34 of the robotic system 20. The reference tracker, therefore, may be any appropriate tracker that is positioned relative to the snapshot tracker 160 within the navigation coordinate space during the registration period. For the discussion herein, the robot tracker 54 will be referred to; however, the patient tracker 58 may also be used as the reference tracker. Further, the reference tracker may be positioned within the coordinate system at any position relative to the snapshot tracker 160 as long as the snapshot tracker 160 may be tracked relative to the reference tracker.
[0048] In various embodiments, the snapshot tracker 160 may be positioned at a known position relative to the end effector 44. For example, the snapshot tracker 160, as illustrated in FIG. 3, which includes the trackable portions 164, extends from a rod or connection member 168. The connection member 168 may include a keyed portion, such as a projection 172 that may engage a slot 174 of the end effector 44. The end effector 44 may form or define a cannula or passage 176 that may engage the connector 168. The connector 168 may be positioned within the passage 176 of the end effector 44. The connector 168 may then be fixed to the end effector 44, such as with a fixation member including a set screw or clamping member 180.
[0049] The localizer 88 may then view or determine a position of the snapshot tracking device 160 relative to the reference tracking device 54 and/or the reference tracking device 58. As the localizer 88 defines, or may be used to define, the navigation space, determining or tracking a position of the snapshot tracking device 160 relative to the reference frame 54 may be used to determine a relationship between a position within the navigation space and the robotic space of the end effector 44.
[0050] With continuing reference to FIG. 3, therefore, the navigation space defined by the localizer 88 may include the full navigation space 170, which may include portions relative to the subject, such as the subject tracker 58 and other portions that may be moved therein, such as the instrument 68. The robotic registration space may be smaller and may include a robotic registration space 174 that may include the reference frame 54 and the snapshot tracker 160. As discussed above, however, the robot registration navigation space may include the snapshot tracker 160 and the patient tracker 58 for registration. Accordingly, the exemplary registration navigation space 174 is merely for the current discussion. Both the robotic reference tracker 54 and the patient tracker 58 need not be used simultaneously. This is particularly true when the patient 30 is fixed in space, such as fixed relative to the robotic system 20. [0051] The processor 101 is configured as a tracking system control module, and will subsequently be referred to herein as such. The tracking system control module 102 is in communication with a robotic arm control module 152 (see FIG. 2) in any suitable manner. The tracking system control module 102 tracks the robotic arm 40, and thus knows the orientation and position of the robotic arm 40 within the three-dimensional coordinate system of the arm 40. For example, the position of the snapshot tracking device 160 and the end effector 44 relative to the robotic arm 40 is input to the tracking system control module 102. The tracking system control module 102 thus specifically knows the positions of the snapshot tracking device 160 and the end effector 44 within the three-dimensional coordinate system of the arm 40. Alternatively, the robotic arm 40 may be tracked physically with any suitable encoders, etc. For example, encoders included with selected motors and/or included at various joints of the arm 40 may be used to determine position of the robotic arm 40, including the end effector. The encoders may determine an amount of relative and/or total movement of the respective portions of the arm 40 to determine pose of the end effector at a selected time. The encoders may produce a signal that is sent to the processor 101 to allow determination and navigation of the end effector of the arm 40.
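The role of the encoders can be illustrated with a toy planar forward-kinematics chain; the two-parameter joint model below is an assumption for illustration only, whereas a real arm uses its full three-dimensional kinematic model:

    import numpy as np

    def joint_transform(theta_rad, link_mm):
        """One joint: rotate about z by the encoder angle, then translate along the
        rotated x-axis by the link length (toy planar convention)."""
        c, s = np.cos(theta_rad), np.sin(theta_rad)
        rot = np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], float)
        trans = np.eye(4)
        trans[0, 3] = link_mm
        return rot @ trans

    def end_effector_pose(encoder_angles_rad, link_lengths_mm):
        """Chain the per-joint transforms to obtain the end-effector pose in the base frame."""
        T = np.eye(4)
        for theta, length in zip(encoder_angles_rad, link_lengths_mm):
            T = T @ joint_transform(theta, length)
        return T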
[0052] When the snapshot optical tracking device 160 is visible to the optical localizer 88, the location of the optical tracking device 160 (as well as the end effector 44 and the robotic arm 40 generally) within a coordinate system of the optical localizer 88 is known and input to the processor 101. That is, the tracking system control module 102 is configured to execute instructions to know the pose (including spatial coordinates and orientation) of the optical tracking device 160 when it is visible to the optical localizer 88. The relative location of the portions attached or fixed to the optical tracking device 160 may also be known or recalled from a selected memory system, such as a portion of the arm 40 or instrument or portion thereof attached to the tracking device 160. For example, if a distal end of the instrument is known to be 5 cm from the tracking device 160, the pose of the distal end may also be tracked or navigated. The tracking system control module 102 also knows the location of the snapshot tracking device 160 within the coordinate system of the robotic arm 40 at all times. Based on known locations of the snapshot optical tracking device 160 in both the coordinate system of the robotic arm 40 and the coordinate system of the optical localizer 88, the tracking system control module 102 is configured to execute instructions to determine the position of any appropriate portion associated with the arm 40 in the coordinate system of the robotic arm 40. This is, at least in part, due to the registration or correlation of the optical localizer 88 coordinate system and the coordinate system of the robotic arm 40 as discussed above.
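Because the snapshot tracker pose is known both in the robot coordinate system and in the localizer coordinate system, the correlation between the two frames reduces to one matrix product; the sketch below uses assumed 4x4 homogeneous poses and is illustrative only:

    import numpy as np

    def robot_to_localizer(T_tracker_in_robot, T_tracker_in_localizer):
        """Given the snapshot tracker pose expressed in the robot coordinate system
        (from the arm's known geometry/encoders) and in the localizer coordinate
        system (from tracking), return the transform mapping robot coordinates
        into localizer (navigation) coordinates."""
        return T_tracker_in_localizer @ np.linalg.inv(T_tracker_in_robot)

    # Any homogeneous point known in robot coordinates (e.g., the end effector tip)
    # can then be navigated: p_localizer = robot_to_localizer(...) @ p_robot.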
[0053] The robotic system 20 includes the robotic arm control module 152 configured to control movement of the robotic arm 40. The robotic arm control module 152 may receive the encoder signals, as discussed above. The robotic arm control module 152, therefore, may know or determine a pose of the end effector 44 at a selected time based on inputs from its controls and sensors. The pose may be relative to the base of the robotic system 20 and/or a pose in navigation space.
[0054] The tracking system control module 102 is configured to track movement of the tracking device 58 relative to the localizer 88. The localizer 88 may include at least two of the receivers (e.g., at least two laser receivers, optical wavelength cameras, etc.) to generate a stereoscopic image or view. The localizer 88 may be an optical localizer as discussed above and as used in various tracking systems, including various STEALTHSTATION® surgical navigation systems such as the TREON®, S7™, or S8™ tracking systems having an optical localizer, sold by Medtronic Navigation, Inc. of Colorado. Other tracking systems that may require line-of-sight, at least for accuracy and/or speed, may include acoustic, radar, etc.
[0055] FIG. 4A is an exemplary soft tissue image, which may be and is also referred to herein as an MRI image 210, of the subject’s spine 126. The MRI image 210 is a sagittal view. The MRI image 210 may be captured with an appropriate imaging device; in various embodiments the imaging device 80 may be a suitable MRI imaging device. The MRI image 210 captures or includes various portions of the spine 126, particularly the soft tissue in and/or around the bony portions thereof. For example, the MRI image 210 includes the spinal cord 220, spinal nerves 222, ligaments 224, and intervertebral discs 226. The MRI image 210 may be configured to capture and/or segment soft tissue structure, for example. The MRI image 210 may also display bony areas of the spine 126, such as, but not limited to, vertebra, facet joints, and pars interarticularis, for example.
[0056] FIG. 4B is an exemplary CT image 250 of the subject’s spine 126. The CT image 250 may be captured with an appropriate imaging device, such as the imaging device 80 in various embodiments, or any other suitable CT imaging device. The CT image 250 captures various portions of the spine 126, particularly the bony areas thereof. For example, the CT image 250 may include detail or definition of various bony portions and may include vertebra 230, facet joints 232, and pars interarticularis 234. Various other bony regions of or near the spine 126 may also be displayed, such as a sacrum, pelvis, or ribs.
[0057] The image data may be acquired at any appropriate time. For example, the image data may be acquired in a planning or pre-operative phase or time period. Thus, the image data may be acquired in a non-invasive manner to assist in planning a procedure, as discussed herein. Further, the image data may be acquired intra- or post-operatively to assist in further planning and/or confirmation of a procedure.
[0058] The imaging device controller 96 is configured to merge the MRI image 210 and the CT image 250 into a merged image including both soft tissue and bony tissue. The image data, such as the MRI image 210 and the CT image 250, may be individually acquired at any appropriate time, such as prior to (pre-), during (intra-), and/or following (post-) at least a portion of a procedure including an operation. Thus, any merged image may include two or more of the types of image data.
[0059] FIG. 5 illustrates a merged image 310A, which is a sagittal view of the spine 126, formed by merging the MRI image 210 and the CT image 250. FIG. 6 is a merged image 310B, which is a cross-sectional view of the spine 126 at a selected axial position, viewed from inferior to superior, formed by merging the MRI image 210 and the CT image 250. The merged image 310B shows a spinous process 240, transverse processes 242, a first pedicle 244, a second pedicle 246, and a vertebral foramen 248 through which the spinal cord 220 extends. The tracking system control module 102 is configured to display the merged images 310A, 310B on the display 84.
[0060] The merged images 310A and 310B are formed in an appropriate manner. Generally, the MRI image 210 and the CT image 250 are merged and various views thereof may be made, such as the sagittal view 310A and the cross-section or slice view 310B. In various embodiments, the merged image is generated by registering and/or overlaying two or more images or image data sets. For example, identical points or portions of both the MRI image 210 and the CT image 250 may be identified, and these may be used to register the images together. The merging process may include the registration and/or additional processes that allow two or more data sets, such as images, to be related to one another, such as on a point-by-point basis. For example, registration processes include a 3D-to-3D registration. Generally, it is understood by one skilled in the art that the registration is volume-to-volume, and therefore the images may be cross-correlated with one another.
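As one possible illustration of point-based registration of the two image data sets, a least-squares rigid fit (a standard SVD/Kabsch-style solution) may be computed from corresponding points identified in both images. This is only a sketch under the assumption of paired landmarks; the disclosure does not limit the registration to this technique, and the landmark values below are invented for the example.

```python
import numpy as np

def rigid_registration(points_moving, points_fixed):
    """Least-squares rigid transform (rotation R, translation t) mapping
    points_moving onto points_fixed; both are (N, 3) arrays of corresponding
    points (e.g. landmarks picked in the MRI and CT volumes)."""
    P = np.asarray(points_moving, dtype=float)
    Q = np.asarray(points_fixed, dtype=float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)              # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

# Illustrative corresponding landmarks (mm) picked in the MRI and CT images.
mri_pts = np.array([[10.0, 0.0, 0.0], [0.0, 20.0, 0.0], [0.0, 0.0, 30.0], [5.0, 5.0, 5.0]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
ct_pts = mri_pts @ true_R.T + np.array([2.0, 3.0, -1.0])

R, t = rigid_registration(mri_pts, ct_pts)
print("max residual (mm):", np.abs((mri_pts @ R.T + t) - ct_pts).max())
```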
[0061] In the merged images 310A and 310B, at least portions of the area of the spine 126 are segmented. For example, a soft tissue portion in a region of interest in the imaged area of the spine 126 is further processed to segment the soft tissue and the bony tissue. Moreover, a type of segmentation may occur for the entire image data set (e.g., the imaged area of the spine 126). For example, the spinal cord 220, the spinal nerves 222, the ligaments 224, and the intervertebral discs 226 may be segmented during soft tissue segmentation. The vertebra 230, the facet joints 232, and the pars interarticularis 234 may be segmented during bony segmentation. The segmentation may be within a manually selected region of interest, an entire image volume, a predefined or automatically determined region, or in any appropriate portion.
[0062] Segmentation may occur in various manners and/or according to various techniques. Segmentation may include edge detection, contrast variation determination, and the like. According to various embodiments, segmentation of the merged images 310A, 310B may include selected and appropriate techniques and processes, such as those understood by those skilled in the art. These may include or additionally include a machine learning system, such as a neural network (artificial intelligence), that is trained to perform an appropriate segmentation of merged images. This may be done for either or both soft tissue and bone.
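A very simple illustration of one possible bony segmentation approach is thresholding a CT volume in Hounsfield units and retaining sufficiently large connected components; soft tissue segmentation of the MRI image 210 would in practice typically use more elaborate methods or a trained model, as noted above. The threshold and size values below are illustrative assumptions, not parameters from the disclosure.

```python
import numpy as np
from scipy import ndimage

def segment_bone_ct(ct_volume_hu, threshold_hu=250, min_voxels=500):
    """Toy bone segmentation of a CT volume in Hounsfield units: threshold,
    then keep connected components above a minimum voxel count."""
    mask = ct_volume_hu >= threshold_hu
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    big_labels = np.nonzero(sizes >= min_voxels)[0] + 1
    return np.isin(labels, big_labels)

# Illustrative volume: soft-tissue background ~40 HU, a block of "bone" ~700 HU.
vol = np.full((60, 60, 60), 40.0)
vol[20:40, 20:40, 20:40] = 700.0
bone = segment_bone_ct(vol)
print("bone voxels:", int(bone.sum()))
```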
[0063] The merged images 310A and 310B are shown on the display 84 and used for planning and executing spinal surgery, and may further be used to assist in confirming a procedure. FIGS. 7A and 7B illustrate an exemplary method 510 in accordance with the present disclosure for determining instrument no-go (also referred to herein as no-fly or stop-navigation) zone identification during planning and execution of spine surgery. It is understood, however, that the method 510 may be used to plan or assist in planning any appropriate procedure. For example, a partial or total knee replacement, laryngeal, wrist, or other appropriate procedure may have regions or zones that may be identified and selected as no-go zones. The no-go zones are areas or volumes that instruments are selected not to enter, to protect sensitive portions of the spine 126, such as the spinal cord 220, the spinal nerves 222, and the ligaments 224. The no-go zones may also include portions of the vertebra 230.
[0064] With reference to FIG. 7A, the method 510 includes scanning the subject 30 at block 512 to obtain or capture the MRI image 210. At block 514, the subject 30 may be scanned to obtain or capture the CT image 250. The CT image 250 is optional, and thus the method 510 may be performed using only the MRI image 210. When both the MRI image 210 and the CT image 250 are taken, the images 210, 250 are merged at block 520 to obtain the merged images 310A and 310B, as described above. The images may be registered volume to volume, each segmentation may occur independently, and the registration brings the results together. In various embodiments, therefore, the segmentation may happen before or after the merging.
[0065] At block 522, the volume of the selected image is segmented. As discussed above, only the MRI image 210 may be selected; in that case, only the MRI image is segmented at block 522. Otherwise, the merged images 310A, 310B are segmented as described above. The merged images 310A, 310B are further segmented at blocks 524 and 526. At block 524, bone is segmented. At block 526, soft tissue is segmented. It is further understood that the various images may be segmented separately. For example, soft tissue may be segmented and identified in the MRI image 210. That is, the spinal cord 220, the nerves 222, or other appropriate portions may be segmented in the MRI image 210. The segmented portions may be identified automatically or manually in the segmented image. Similarly, the CT image 250 may be segmented to segment bone portions. Various portions of the bone may also be identified in the segmented image and further segmented to delineate those portions, such as the facet joints 232 and/or the pars interarticularis 234.
[0066] At block 530, cuts to the bony areas of the spine 126 are planned and/or saved in any appropriate manner, such as by manual identification with the merged image. At block 532, sensitive areas of the spine 126 are identified as no-go zones for instrumentation. Sensitive soft tissue areas include, but are not limited to, the spinal cord 220 and spinal nerves 222. Various bony areas may be designated as no-go zones as well. For example, areas of the vertebra 230 may need to be removed to relieve pressure on the spinal nerves 222 and/or the spinal cord 220. The bony areas must be removed without damaging the nerves 222 and the spinal cord 220. Selecting or identifying no-go zones may be used to ensure, or assist in ensuring, that too much bone is not removed, which may destabilize the spine 126. Depending on the particular surgery, it may also be desirable to remove no more than 50% of the facet joints 232, and/or to keep the pars interarticularis 234 intact. Otherwise, destabilization may occur, which may require implanting rods and screws.
[0067] FIGS. 5 and 6 illustrate exemplary no-go zones 610A-610E identified in the merged images 310A and 310B. The no-go zones 610A-610E are sensitive areas into which, in accordance with the present disclosure, instruments are or should be selectively restricted from traveling. Exemplary surgical instrumentation includes, but is not limited to, the robotic arm 40 and the instrument 60. Exemplary no-go zone 610A includes the spinal cord 220. No-go zone 610B includes the nerve endings 222. No-go zone 610C includes intervertebral discs 226. No-go zone 610D includes facet joints 232. No-go zone 610E includes pars interarticularis 234. Another no-go zone may include portions of the vertebra 230 to prevent over-resection of the vertebra 230. Any suitable additional no-go zones may be added as well.
[0068] The no-go zones 610 may be identified in any suitable manner. For example, the no-go zones 610 may be identified manually by way of the user interface 106 and/or the display 108. More specifically, using any suitable input device, such as a stylus, mouse, trackball, etc., or touching the display 84 configured as a touch display, personnel may designate one or more no-go zones, such as, but not limited to, one or more of the no-go zones 610A-E.
[0069] Rather than manually identifying the no-go zones, the tracking system control module 102 may be configured to automatically identify sensitive tissue areas and bony areas as no-go zones based on the anatomy in the area of the procedure. As discussed above, it may be selected to restrict or eliminate movement of various instruments into selected regions. The selected regions may be identified or include regions that include selected soft tissue portions, such as nerves or the spinal cord. Selected regions may also include bony portions, such as spinal facets. Accordingly, the system may identify, such as via segmentation, volumes that are identified as nerves, other selected soft tissue, or selected bony regions. The volumes identified as nerves may be identified as no-go zones. Therefore, the no-go zones may be identified based upon the segmentation of the images, such as the MRI image 210 or the CT image 250. Further, various volumes or dimensions of the bony portions may be identified and limitations may be made or identified relative thereto regarding an amount of removal. For example, the facets 232 may be identified and a no-go zone or limitation line 232a may be identified or selected as a no-go zone to limit the amount of material removed from the facets 232. The no-go zones, or at least portions thereof, may be identified substantially automatically by the system by executing instructions regarding the segmented regions and/or a dimension relative to an exterior or external surface of a segmented region. Additional automatic no-go zones may be identified as portions that, if removed, may cause instability in the spine or are prone to cause issues in the rehabilitation process. These may be identified in general or relative to a specific patient, such as by the user 72.
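One way such automatic identification could be expressed, assuming the segmentation assigns an integer label per structure, is to collect the labels of the sensitive structures into a boolean no-go mask. The label names and numbering below are purely illustrative assumptions.

```python
import numpy as np

# Illustrative label convention for a segmented volume; not from the disclosure.
LABELS = {"background": 0, "vertebra": 1, "facet": 2, "spinal_cord": 3, "nerve": 4, "disc": 5}
NO_GO_NAMES = {"spinal_cord", "nerve", "disc"}   # structures instruments must not enter

def no_go_mask(segmentation, no_go_names=NO_GO_NAMES, labels=LABELS):
    """Boolean volume that is True inside every structure designated a no-go zone."""
    ids = [labels[name] for name in no_go_names]
    return np.isin(segmentation, ids)

seg = np.zeros((40, 40, 40), dtype=np.uint8)
seg[18:22, 18:22, :] = LABELS["spinal_cord"]      # toy spinal cord running through the volume
mask = no_go_mask(seg)
print("no-go voxels:", int(mask.sum()))
```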
[0070] At block 534, safety zones may optionally be added around one or more of the no-go zones 610A-E. The safety zones are buffer zones for the no-go zones 610A-E. The safety zones may extend any suitable distance beyond or from the no-go zones 610A-E, and may have any suitable width/depth. For example, each safety zone may be defined by a boundary that is a selected distance from the no-go zone, such as about 3 mm-5 mm around one or more of the no-go zones 610A-E. As discussed herein, the safety zones may be used to provide a first feedback (e.g., audible or visual) and/or limited motion (e.g., slow motion of a robot) and the no-go zones may be used to provide a second feedback that may be different than the first feedback (e.g., audible or visual) and/or stop motion or navigation (e.g., remove tracked pose of instruments from a display or stop motion of a robot).
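A safety zone of the kind described (for example, an approximately 3 mm-5 mm buffer) could be generated from a boolean no-go mask by growing it outward by a physical distance while accounting for voxel spacing. The sketch below uses a Euclidean distance transform for this; the spacing and buffer values are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def add_safety_zone(no_go, buffer_mm=4.0, spacing_mm=(1.0, 1.0, 1.0)):
    """Grow a boolean no-go mask outward by roughly buffer_mm, using the voxel
    spacing to express the buffer in millimeters. Returns only the shell
    (the safety zone) around the no-go zone."""
    # Distance (mm) from every voxel outside the no-go zone to the nearest no-go voxel.
    dist = ndimage.distance_transform_edt(~no_go, sampling=spacing_mm)
    expanded = dist <= buffer_mm          # no-go zone plus the surrounding buffer
    return expanded & ~no_go              # keep the buffer shell only

no_go = np.zeros((50, 50, 50), dtype=bool)
no_go[20:30, 20:30, 20:30] = True
safety = add_safety_zone(no_go, buffer_mm=4.0, spacing_mm=(0.8, 0.8, 1.0))
print("safety-zone voxels:", int(safety.sum()))
```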
[0071] At block 540, the tracking system control module 102 is configured to calculate an execution path for a procedure. The execution path may be for movement of the robotic arm 40, the instrument 68, and/or any other instrumentation to execute a particular surgical procedure. The details of the surgical procedure are entered by way of the user interface 106, or in any other suitable manner. Details of the surgical procedure include, but are not limited to, identification of bony areas of the spine 126 to be resected. The procedure execution path may include various information, such as a trajectory planned or selected for an instrument to move along, limitations of movement of various portions of the robotic arm 40, or the like. For example, with reference to FIGS. 1, 2, and 5, the snapshot tracking device 160 may be removed and the end effector 44 (or a portion held relative thereto) may be operable as a guide for the instrument 68. Therefore, the instrument 68 may be positioned within the end effector 44. A planned trajectory may include a trajectory or path 541 to engage one or more of the facets 232. The path 541 may include a trajectory of the end effector 44 and/or guide the instrument 68, such as when the instrument is in the end effector 44 and generally positioned along an instrument axis 68A. Therefore, the trajectory or path 541 may be defined as a path along which a portion of the instrument 68, such as a distal or terminal end of the axis 68A defined by the instrument 68, may pass. The path 541 may also include instructions for movement of the robotic arm 40, including movement of the various portions or joints thereof, such as the joint 48. Additionally, the path 541 may be illustrated on the display, such as by a path graphical representation 541i that may be displayed on the display 84. The user 72 may then move the instrument 68 along the path, which may be confirmed by the illustration of the graphical representation 68i on the display 84 relative to the graphical representation of the path 541i.
[0072] The path may be entirely manually identified, such as by the user 72. Alternatively and/or additionally, the control system may execute instructions to calculate the path 541 between an entry point and an end point, such as at the facet 232. For example, the position or target of a procedure may be input, such as by the user 72, and one or more entry points may be identified and/or a region may be selected that includes a plurality of points. The path may include at least a straight or curved line, or multiple portions thereof, from the entry point to the procedure target. The path may be determined based on various parameters such as a time of procedure, length of the path, avoiding the no-go zones and/or safety zones, or the like. Various parameters for defining the path may also take into consideration elements that may block the movement of the robotic arm and the robotic arm's ability to execute the path due to its physical limitations.
[0073] At block 542, the tracking system control module 102 is configured to determine whether movement of the robotic arm 40, the instrument 68, etc. along the execution path will result in virtual contact with one or more of the no-go zones 610A-E, and thus actual contact with bone or soft tissue within the no-go zones 610A-E. If the tracking system control module 102 determines that contact will occur, then the method 510 returns to block 540, where a revised execution path is calculated. In other words, the tracking system control module may execute instructions to iterate the path calculation to ensure that no-go zones are avoided, safety zone passage is minimized, and/or other parameters are met.
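A minimal sketch of the check-and-iterate behavior of blocks 540 and 542 is to sample a candidate straight trajectory between an entry point and the target, test the samples against the no-go mask, and try further candidate entries if a collision is found. The straight-line model, step size, and coordinate/indexing conventions below are simplifying assumptions; a clinical planner would be considerably more involved.

```python
import numpy as np

def path_hits_zone(entry_mm, target_mm, zone_mask, spacing_mm, step_mm=0.5):
    """Sample a straight trajectory from entry to target and report whether any
    sample falls inside the given boolean zone mask (no-go or safety)."""
    entry, target = np.asarray(entry_mm, float), np.asarray(target_mm, float)
    n = max(2, int(np.linalg.norm(target - entry) / step_mm))
    for s in np.linspace(0.0, 1.0, n):
        p = entry + s * (target - entry)
        idx = tuple(np.round(p / np.asarray(spacing_mm)).astype(int))
        inside = all(0 <= i < d for i, d in zip(idx, zone_mask.shape))
        if inside and zone_mask[idx]:
            return True
    return False

def plan_straight_path(candidate_entries_mm, target_mm, no_go, spacing_mm):
    """Return the first candidate entry whose straight path avoids the no-go mask,
    mirroring the iterate-until-clear loop of blocks 540/542."""
    for entry in candidate_entries_mm:
        if not path_hits_zone(entry, target_mm, no_go, spacing_mm):
            return entry
    return None  # no straight path found; a curved or multi-segment path would be needed

no_go = np.zeros((60, 60, 60), dtype=bool)
no_go[25:35, 25:35, :] = True                         # toy no-go slab
entries = [(5.0, 30.0, 30.0), (30.0, 5.0, 30.0)]
print(plan_straight_path(entries, (30.0, 55.0, 30.0), no_go, (1.0, 1.0, 1.0)))
```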
[0074] As discussed above, the no-go zones may include areas in which movement of the instrument 68, such as a burr or other resection tool, is entirely restricted. The safety zones may include zones that are near the no-go zones and may allow for restricted movement, speed of movement, or the like. Therefore, the determination at block 542 of whether the path goes through a safety zone or no-go zone may include a determination of whether a safety zone and/or a no-go zone is planned to be passed through with the path 541, together or separately. For example, the path 541 may go through a safety zone and the instructions may include a limited speed of movement, a limited speed of the instrument 68 (e.g., rotational speed of the burr), or other restrictions. A determination that the path goes through a no-go zone may include a requirement that the path be redone. Therefore, a determination of whether the path enters or goes through a safety zone or a no-go zone at block 542 may include altering a portion or all of the path 541 and/or including instructions in the path determination and execution regarding a speed of movement, a speed of an instrument, or the like. Additionally, various other parameters may be considered relative to the path, or used to determine whether the path goes through a safety zone or no-go zone, such as anatomical maps, best known surgical technique, or bone density maps.
[0075] Once it is determined at block 542 that the execution path will avoid the no-go zones 610A-E, then the method 510 proceeds to block 544. At block 544, the tracking system control module 102 saves the execution path for use during the spinal procedure. The planned path may be saved with the robotic system 20 and/or the navigation system 98. For example, the navigation system may include the processor or tracking system control module 102 and a related memory module 103. The memory module 103, located and/or accessed appropriately, may store the planned path. Thus, during the spinal procedure the method 510 protects sensitive tissue and bony areas.
[0076] As an alternative to the preplanned procedure 539 that includes blocks 540, 542, and 544, the method 510 may be executed in real time, such as during a surgical procedure, in a real-time portion 545 that includes blocks 550-556 in FIG. 7B. The real-time procedure may include as inputs the determined no-go and/or safety zones. The zones may be used to provide feedback to the user 72 and/or limit and/or control motion of the robotic arm 40 or the instrument 68. [0077] At block 550, the tracking system control module 102 is configured to track the position of at least one of the robotic arm 40 or the instrument 68 in real time, such as described above with respect to the robotic arm 40 and the instrument 68. At block 552, the tracking system control module 102 is configured to identify the distance of the instruments from the no-go zones 610A-E. As discussed above, the robotic system 20 may include the base 34. The pose of the arm 40, including the end effector 44, may be known relative to the base 34 with appropriate systems. As noted above, encoders may be used to determine the pose of the arm 40 relative to the base. Further, because the robotic coordinate system, including the pose of the arm 40 relative to the base, may be registered or correlated to the localizer navigation coordinate system, such as that of the localizer 88, and to the image data coordinate system that includes the no-go zones 610A-E, the no-go zones 610A-E may be known in both the robot coordinate system and the localizer 88 coordinate system. As discussed herein, the robotic system 20 may be registered or correlated to a selected coordinate system, such as via image-to-image registration or correlation or similar methods, such as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference. Thus, the pose of the robotic arm 40 may be restricted to not move within the no-go zones 610A-E and/or alerts may be provided based on the coordinate system of the robotic system 20 itself.
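The distance identification of block 552 can be illustrated by precomputing, once the zones are defined, a map of the distance from every voxel to the nearest no-go voxel, and then looking up the tracked tip position in that map during the procedure. The spacing, volume size, and tip coordinates below are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

# Precompute (once, after the zones are defined): distance in mm from every voxel
# to the nearest no-go voxel, using the image spacing.
spacing_mm = (0.8, 0.8, 1.0)
no_go = np.zeros((64, 64, 64), dtype=bool)
no_go[30:40, 30:40, :] = True
dist_to_no_go_mm = ndimage.distance_transform_edt(~no_go, sampling=spacing_mm)

def tip_distance_mm(tip_image_mm, dist_map, spacing):
    """Look up how far the tracked instrument tip (already expressed in image
    coordinates, in mm) is from the nearest no-go voxel."""
    idx = np.round(np.asarray(tip_image_mm) / np.asarray(spacing)).astype(int)
    idx = tuple(np.clip(idx, 0, np.array(dist_map.shape) - 1))
    return float(dist_map[idx])

print("tip-to-no-go distance (mm):", tip_distance_mm((10.0, 10.0, 20.0), dist_to_no_go_mm, spacing_mm))
```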
[0078] At block 554, the tracking system control module 102 is configured to provide feedback, such as generating alerts of increasing intensity and/or restricting movement (e.g., force feedback), as the instrument, such as the robotic arm 40 or the instrument 68, moves through any of the safety zones towards one of the no-go zones 610A-E. Any suitable alerts may be generated, such as any suitable audible and/or tactile alerts that increase in intensity as the robotic arm 40 moves closer to a particular no-go zone 610A-E. The robotic arm 40, including the end effector 44, may be used by the user 72. That is, the robotic arm 40 may be manually moved by the user 72 to assist in guiding the procedure. The robotic arm 40, including the end effector 44, may be positioned to assist in positioning or moving the instrument 68 along a path, such as the path 541. The path, however, may not be a predetermined path based upon the acquired images or image data, as discussed above.
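The graded feedback described for block 554 can be sketched as a simple mapping from the tip-to-zone distance to a feedback tier whose intensity rises as the instrument moves through the safety zone. The buffer width and the returned fields are illustrative assumptions, not a specification of the actual alert behavior.

```python
def feedback_level(distance_mm, safety_buffer_mm=4.0):
    """Map the tip-to-no-go distance to a feedback tier: the closer the tip gets
    inside the safety zone, the stronger the alert; inside the no-go zone, stop."""
    if distance_mm <= 0.0:
        return {"state": "no_go", "action": "stop motion / deactivate instrument"}
    if distance_mm <= safety_buffer_mm:
        # Intensity ramps from 0 (outer safety boundary) to 1 (no-go boundary).
        intensity = 1.0 - distance_mm / safety_buffer_mm
        return {"state": "safety", "action": "audible/haptic alert", "intensity": round(intensity, 2)}
    return {"state": "clear", "action": "none"}

for d in (10.0, 3.0, 1.0, 0.0):
    print(d, feedback_level(d))
```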
[0079] Therefore, the identified no-go zones and/or safety zones may be determined substantially in real time based upon image data of the subject 30. The robotic arm 40, while being manually moved, may provide feedback, such as a haptic or force feedback, to the user 72 when a trajectory of the instrument 68 through the end effector 44 may pass through one of the selected no-go zones and/or safety zones. Further, the end effector 44 may engage the instrument 68 in a selected manner, such as with a braking force, as the instrument 68 is moved along a trajectory toward a safety or no-go zone. The brake may include the end effector 44, or a portion thereof, compressing against the instrument 68 to slow or stop movement when being moved by the user 72. The alerts may include audible or visual alerts, such as visual alerts displayed on the display device 84. In various embodiments, the display device 84 may flash a selected color, include a written notice or warning, or provide other visual output for the user 72.
[0080] Should the tracking system control module 102 detect that the robotic arm 40 or the instrument 68 has breached one of the no-go zones 610A-610E, the tracking system control module 102 is configured to stop or deactivate a procedure or navigation at block 556. The deactivation may include stopping movement of the robotic arm 40, deactivating any instrument mounted to the robotic arm 40, and deactivating the instrument 68. In the stop-instrument block 556, the robotic arm 40 may be locked in a selected position. Further, as noted above, the end effector 44 may engage the instrument 68 to stop, restrict, or resist movement of the instrument 68, even along a trajectory of the end effector 44. Thus, the user 72 may feel or be provided with feedback that the instrument 68 is within and/or immediately adjacent to a no-go zone. Further, the instrument 68 may be deactivated or stopped, such as by stopping a rotation of a burr tip of the instrument 68. Thus, the operation of the instrument 68 may be stopped when a no-go zone is encountered or reached.
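The block 556 response can be illustrated with a toy supervisor that, on detecting a breach, locks the arm and deactivates the powered instrument. The robot and instrument interfaces below are placeholder stubs, not an actual device API.

```python
class SafetySupervisor:
    """Toy supervisor sketching the block 556 response: once a breach is detected,
    lock the arm and deactivate the powered instrument."""
    def __init__(self, robot, instrument):
        self.robot = robot
        self.instrument = instrument

    def update(self, in_no_go: bool):
        if in_no_go:
            self.robot.lock()         # hold the arm in its current pose
            self.instrument.stop()    # e.g. stop burr rotation
            return "stopped"
        return "running"

class _Stub:
    """Placeholder standing in for real robot/instrument interfaces."""
    def lock(self): print("arm locked")
    def stop(self): print("instrument stopped")

sup = SafetySupervisor(_Stub(), _Stub())
print(sup.update(False))
print(sup.update(True))
```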
[0081] Accordingly, the system, such as through the tracking system control module 102 and/or the controller 110, may operate various portions in the operating theatre to ensure that the no-go zones are not breached or passed through with the instrument 68. The instrument 68 may be used to operate on various portions of bone, such as a portion of the facet 232, but may be stopped when a no-go zone is reached. Thus, the surgeon 72 may have access to additional information regarding the procedure and the position of the instrument 68, such as a working end thereof, relative to the subject 30.
[0082] Further, as noted above, the no-go zones may be determined relative to various data acquired of the subject 30. The image data may include the MRI image data 210 that is captured at block 512 of the method 510. The MRI image data may include information regarding various soft tissue, such as the spinal cord 220 and nerves 222, and other selected soft tissue. The soft tissue information may be used to identify various no-go zones that may not be easily identified in image data such as x-ray image data or other image data that better delineates hard tissue. In addition, image data such as the CT image data 250 that is captured at block 514 may be used to identify additional no-go zones or refine the no-go zones. Therefore, the two different types of image data 210, 250 may be used separately, or registered and/or merged together, such as at block 520, to assist in defining no-go zones and/or safety zones for a procedure. This may allow the user 72 to more efficiently and/or more definitively define no-go zones, such as an amount of bone to be removed and/or soft tissue to be avoided.
[0083] Thus, no-go zones may be defined, such as with one or more data sets, including a combination of two types of image data. Thereafter, the robotic arm 40 may be used to define a path that avoids the no-go zones, and/or the robotic system 20 may be used to limit manual movement of the instrument 68 relative to the subject 30. That is, the robotic arm 40 may be used as a guide for manual movement of the instrument 68 (such as by the user 72) and the robotic arm may resist movement into or near a no-go zone. Alternatively or additionally, the robotic arm 40 may move the instrument and may have the path 541 defined to avoid the no-go zones and selected movement in the safety zones. Further, the instrument 68 may be operated or controlled, at least in part, by the navigation system to cease operation of the instrument when the instrument 68 is moved toward a no-go zone. Accordingly, the system may be used to identify and/or limit operation with a selected or determined no-go zone and/or safety zone relative to the subject 30.
[0084] EXAMPLES:
[0085] Example 1. A surgical system comprising: an imaging device control module [e.g., 96] configured to execute instructions to (1) segment soft tissue in a MRI image and generate a segmented image of the MRI image and (2) identify a no-go zone for an instrument [e.g., 68] in the segmented image; a tracking system control module [e.g., 102] configured to execute instructions to (1) plot an execution path for the instrument to perform a procedure at the surgical site and determine whether the execution path includes the no-go zone and (2) modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module [e.g., 103] configured to save the execution path for execution during the procedure.
[0086] Example 2. The surgical system of example 1, wherein the procedure is a spinal procedure.
[0087] Example 3. The surgical system of example 2, wherein the soft tissue includes at least one of spinal nerves, a spinal cord, and intervertebral discs.
[0088] Example 4. The surgical system of example 2, wherein the no-go zone further includes bony areas.
[0089] Example 5. The surgical system of example 4, wherein the bony areas include at least one of vertebral bodies, facet joints, and pars interarticularis.
[0090] Example 6. The surgical system of example 1, further comprising the imaging device control module further configured to execute instructions to merge a CT image with the MRI image, segment bony tissue, and generate the segmented image as a merged image based on both the MRI image and the CT image.
[0091] Example 7. The surgical system of example 1, wherein: at least one of the imaging device control module and the tracking system control module is configured to add a safety zone to the no-go zone; the tracking system control module is configured to track the instrument and generate alerts of increasing intensity as the instrument moves towards and to the no-go zone; and the tracking system control module is configured to deactivate the instrument when the instrument reaches the no-go zone.
[0092] Example 8. The surgical system of example 1, wherein the instrument is a robotic arm.
[0093] Example 9. The surgical system of example 1, wherein the instrument is a robotic-assisted instrument.
[0094] Example 10. A surgical system comprising: an imaging device control module [e.g., 96] configured to merge a MRI image of a surgical site and a CT image of the surgical site into a segmented image, segment bony tissue in the segmented image, and segment soft tissue in the segmented image; and a tracking system control module configured to track an instrument relative to a no-go zone identified in the segmented image including at least one of soft tissue and bony tissue, the tracking system control module configured to generate alerts of increasing intensity as the instrument approaches the no-go zone, and deactivate the instrument when the instrument enters the no-go zone.
[0095] Example 11. The surgical system of example 10, wherein: the surgical site includes a spine; the bony tissue includes vertebra, facet joints, and pars interarticularis; and the soft tissue includes a spinal cord and spinal nerves.
[0096] Example 12. The surgical system of example 11, wherein the no-go zone includes at least the spinal nerves.
[0097] Example 13. The surgical system of example 12, wherein the no-go zone further includes at least one of facet joints and pars interarticularis.
[0098] Example 14. The surgical system of example 10, wherein the instrument includes at least one of a robotic arm and a robotic-assisted instrument.
[0099] Example 15. The surgical system of example 10, wherein the tracking system control module is configured to track the instrument in real-time as a procedure is performed at the surgical site.
[00100] Example 16. The surgical system of example 10, wherein at least one of the imaging device control module and the tracking system control module is configured to add a safety zone around the no-go zone, and generate alerts of increasing intensity as the instrument moves through the safety zone towards the no-go zone.
[00101] Example 17. The surgical system of example 10, wherein: the tracking system control module is configured to identify an execution path for the instrument to perform a spinal procedure at the surgical site and determine whether the execution path includes the no-go zone, the tracking system control module further configured to modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module configured to save the execution path for execution during the procedure.
[00102] Example 18. A method of performing a surgical procedure, the method comprising: acquiring an MRI image of a surgical site from an MRI imaging device; acquiring a CT image of the surgical site from a CT imaging device; merging the MRI image and the CT image into a segmented image with an imaging device control module; segmenting bony tissue in at least one of the CT image or the segmented image with the imaging device control module; segmenting soft tissue in at least one of the MRI image or the segmented image with the imaging device control module; and identifying a no-go zone for a surgical instrument in the segmented image.
[00103] Example 19. The method of example 18, further comprising: calculating an execution path for an instrument to perform a procedure at a surgical site and determining whether the execution path includes the no-go zone with a tracking system control module; modifying the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and saving the execution path to a memory module for execution during the procedure.
[00104] Example 20. The method of example 18, further comprising: adding a safety zone to the no-go zone; tracking movement of the instrument in real-time during the surgical procedure with the tracking system control module; generating alerts of increasing intensity with the tracking system control module as the instrument moves through the safety zone towards the no-go zone; and deactivating the instrument with the tracking system control module when the instrument enters the no-go zone.
[00105] Example embodiments are provided so that this disclosure will be thorough and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth such as examples of specific components, devices, and methods, to provide a thorough understanding of embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that example embodiments may be embodied in many different forms and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
[00106] Instructions may be executed by a processor and may include software, firmware, and/or microcode, and may refer to programs, routines, functions, classes, data structures, and/or objects. The term shared processor circuit encompasses a single processor circuit that executes some or all code from multiple modules. The term group processor circuit encompasses a processor circuit that, in combination with additional processor circuits, executes some or all code from one or more modules. References to multiple processor circuits encompass multiple processor circuits on discrete dies, multiple processor circuits on a single die, multiple cores of a single processor circuit, multiple threads of a single processor circuit, or a combination of the above. The term shared memory circuit encompasses a single memory circuit that stores some or all code from multiple modules. The term group memory circuit encompasses a memory circuit that, in combination with additional memories, stores some or all code from one or more modules.
[00107] The apparatuses and methods described in this application may be partially or fully implemented by a processor (also referred to as a processor module) that may include a special purpose computer (i.e., created by configuring a processor) and/or a general purpose computer to execute one or more particular functions embodied in computer programs. The computer programs include processor-executable instructions that are stored on at least one non-transitory, tangible computer-readable medium. The computer programs may also include or rely on stored data. The computer programs may include a basic input/output system (BIOS) that interacts with hardware of the special purpose computer, device drivers that interact with particular devices of the special purpose computer, one or more operating systems, user applications, background services and applications, etc.
[00108] The computer programs may include: (i) assembly code; (ii) object code generated from source code by a compiler; (iii) source code for execution by an interpreter; (iv) source code for compilation and execution by a just-in-time compiler, (v) descriptive text for parsing, such as HTML (hypertext markup language) or XML (extensible markup language), etc. As examples only, source code may be written in C, C++, C#, Objective-C, Haskell, Go, SQL, Lisp, Java®, ASP, Perl, Javascript®, HTML5, Ada, ASP (active server pages), Perl, Scala, Erlang, Ruby, Flash®, Visual Basic®, Lua, or Python®.
[00109] Communications, including wireless communications described in the present disclosure, may be conducted in full or partial compliance with IEEE standard 802.11-2012, IEEE standard 802.16-2009, and/or IEEE standard 802.20-2008. In various implementations, IEEE 802.11-2012 may be supplemented by draft IEEE standard 802.11ac, draft IEEE standard 802.11ad, and/or draft IEEE standard 802.11ah.
[00110] A processor, processor module, module or ‘controller’ may be used interchangeably herein (unless specifically disclosed otherwise) and each may be replaced with the term ‘circuit.’ Any of these terms may refer to, be part of, or include: an Application Specific Integrated Circuit (ASIC); a digital, analog, or mixed analog/digital discrete circuit; a digital, analog, or mixed analog/digital integrated circuit; a combinational logic circuit; a field programmable gate array (FPGA); a processor circuit (shared, dedicated, or group) that executes code; a memory circuit (shared, dedicated, or group) that stores code executed by the processor circuit; other suitable hardware components that provide the described functionality; or a combination of some or all of the above, such as in a system-on-chip.
[00111] Instructions may be executed by one or more processors or processor modules, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” or “processor module” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements. The processor or processors may operate entirely automatically and/or substantially automatically. In automatic operation the processor may execute instructions based on received input and execute instructions in light thereof. Thus, various outputs may be made without further or any manual (e.g., user) input.
[00112] The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not specifically shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the invention, and all such modifications are intended to be included within the scope of the invention.

Claims

What is claimed is:
1. A surgical system comprising: an imaging device control module [96] configured to execute instructions to (1) segment soft tissue in a MRI image and generate a segmented image of the MRI image and (2) identify a no-go zone for an instrument [68] in the segmented image; a tracking system control module [102] configured to execute instructions to (1) plot an execution path for the instrument to perform a procedure at the surgical site and determine whether the execution path includes the no-go zone and (2) modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module [103] configured to save the execution path for execution during the procedure.
2. The surgical system of claim 1, wherein the procedure is a spinal procedure, and the soft tissue includes at least one of spinal nerves, a spinal cord, and intervertebral discs.
3. The surgical system of claim 2, wherein the no-go zone further includes bony areas, wherein the bony areas include at least one of vertebral bodies, facet joints, and pars interarticularis.
4. The surgical system of claim 1, further comprising the imaging device control module further configured to execute instructions to merge a CT image with the MRI image, segment bony tissue, and generate the segmented image as a merged image based on both the MRI image and the CT image.
5. The surgical system of claim 1, wherein: at least one of the imaging device control module and the tracking system control module is configured to add a safety zone to the no-go zone; the tracking system control module is configured to track the instrument and generate alerts of increasing intensity as the instrument moves towards and to the no-go zone; and the tracking system control module is configured to deactivate the instrument when the instrument reaches the no-go zone.
6. The surgical system of claim 1, wherein the instrument is a robotic arm [40].
7. The surgical system of claim 1, wherein the instrument is a robotic-assisted instrument.
8. A surgical system comprising: an imaging device control module [96] configured to merge a MRI image of a surgical site and a CT image of the surgical site into a segmented image, segment bony tissue in the segmented image, and segment soft tissue in the segmented image; and a tracking system control module [102] configured to track an instrument relative to a no-go zone identified in the segmented image including at least one of soft tissue and bony tissue, the tracking system control module configured to generate alerts of increasing intensity as the instrument approaches the no-go zone, and deactivate the instrument when the instrument enters the no-go zone.
9. The surgical system of claim 8, wherein: the surgical site includes a spine; the bony tissue includes vertebra, facet joints, and pars interarticularis; the soft tissue includes a spinal cord and spinal nerves; the no-go zone includes at least the spinal nerves; and the no-go zone further includes at least one of facet joints and pars interarticularis.
10. The surgical system of claim 8, wherein the instrument includes at least one of a robotic arm and a robotic-assisted instrument.
11. The surgical system of claim 8, wherein the tracking system control module is configured to track the instrument in real-time as a procedure is performed at the surgical site.
12. The surgical system of claim 8, wherein at least one of the imaging device control module and the tracking system control module is configured to add a safety zone around the no-go zone, and generate alerts of increasing intensity as the instrument moves through the safety zone towards the no-go zone.
13. The surgical system of claim 8, wherein: the tracking system control module is configured to identify an execution path for the instrument to perform a spinal procedure at the surgical site and determine whether the execution path includes the no-go zone, the tracking system control module further configured to modify the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and a memory module configured to save the execution path for execution during the procedure.
14. A method of performing a surgical procedure, the method comprising: acquiring an MRI image of a surgical site from an MRI imaging device [80]; acquiring a CT image of the surgical site from a CT imaging device [80]; merging the MRI image and the CT image into a segmented image with an imaging device control module [96]; segmenting bony tissue in at least one of the CT image or the segmented image with the imaging device control module; segmenting soft tissue in at least one of the MRI image or the segmented image with the imaging device control module; and identifying a no-go zone for a surgical instrument in the segmented image.
15. The method of claim 14, further comprising: calculating an execution path for an instrument to perform a procedure at a surgical site and determining whether the execution path includes the no-go zone with a tracking system control module [102]; modifying the execution path to exclude the no-go zone upon determining that the execution path includes the no-go zone; and saving the execution path to a memory module for execution during the procedure.
PCT/IL2024/051035 2023-10-27 2024-10-25 Method and apparatus for procedure navigation Pending WO2025088616A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363593907P 2023-10-27 2023-10-27
US63/593,907 2023-10-27

Publications (1)

Publication Number Publication Date
WO2025088616A1 true WO2025088616A1 (en) 2025-05-01

Family

ID=93648487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/051035 Pending WO2025088616A1 (en) 2023-10-27 2024-10-25 Method and apparatus for procedure navigation

Country Status (1)

Country Link
WO (1) WO2025088616A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2025172998A1 (en) * 2024-02-12 2025-08-21 Mazor Robotics Ltd. Adaptive bone removal system and method

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913820A (en) 1992-08-14 1999-06-22 British Telecommunications Public Limited Company Position location system
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US5983126A (en) 1995-11-22 1999-11-09 Medtronic, Inc. Catheter location system and method
US6474341B1 (en) 1999-10-28 2002-11-05 Surgical Navigation Technologies, Inc. Surgical communication and power system
US6940941B2 (en) 2002-02-15 2005-09-06 Breakaway Imaging, Llc Breakable gantry apparatus for multidimensional x-ray based imaging
US7188998B2 (en) 2002-03-13 2007-03-13 Breakaway Imaging, Llc Systems and methods for quasi-simultaneous multi-planar x-ray imaging
US7108421B2 (en) 2002-03-19 2006-09-19 Breakaway Imaging, Llc Systems and methods for imaging large field-of-view objects
US7001045B2 (en) 2002-06-11 2006-02-21 Breakaway Imaging, Llc Cantilevered gantry apparatus for x-ray imaging
US7106825B2 (en) 2002-08-21 2006-09-12 Breakaway Imaging, Llc Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system
US20040199072A1 (en) 2003-04-01 2004-10-07 Stacy Sprouse Integrated electromagnetic navigation and patient positioning device
US8150494B2 (en) 2007-03-29 2012-04-03 Medtronic Navigation, Inc. Apparatus for registering a physical space to image space
US20160070436A1 (en) * 2013-03-15 2016-03-10 Monroe M. Thomas Planning, navigation and simulation systems and methods for minimally invasive therapy
US11135025B2 (en) 2019-01-10 2021-10-05 Medtronic Navigation, Inc. System and method for registration between coordinate systems and navigation
US20220280240A1 (en) * 2021-03-02 2022-09-08 Mazor Robotics Ltd. Devices, methods, and systems for screw planning in surgery
WO2022254436A1 (en) * 2021-06-02 2022-12-08 Xact Robotics Ltd. Closed-loop steering of a medical instrument toward a moving target
WO2023007029A1 (en) * 2021-07-30 2023-02-02 Medos International Sarl Imaging during a medical procedure

Similar Documents

Publication Publication Date Title
US11759272B2 (en) System and method for registration between coordinate systems and navigation
US12268454B2 (en) Method and apparatus for image-based navigation
US12295679B2 (en) Robotic positioning of a device
US11672607B2 (en) Systems, devices, and methods for surgical navigation with anatomical tracking
WO2016010719A1 (en) Surgical tissue recognition and navigation apparatus and method
CA2553842A1 (en) Methods, systems, and apparatuses for providing patient-mounted surgical navigational sensors
US20240164848A1 (en) System and Method for Registration Between Coordinate Systems and Navigation
WO2025088616A1 (en) Method and apparatus for procedure navigation
US20230190377A1 (en) Technique Of Determining A Scan Region To Be Imaged By A Medical Image Acquisition Device
US20240277415A1 (en) System and method for moving a guide system
US20240358466A1 (en) Surgical Cart With Robotic Arm
JP6656313B2 (en) Display system
CN120731053A (en) Systems and methods for mobile guidance systems
WO2025088617A1 (en) Path planning with collision avoidance
WO2025158394A1 (en) System and method for imaging
WO2024215874A1 (en) System and method for imaging and registration for navigation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24812248

Country of ref document: EP

Kind code of ref document: A1