WO2024224310A1 - Surgical cart with robotic arm - Google Patents
- Publication number: WO2024224310A1 (PCT/IB2024/054002)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords: robotic, robotic arm, operating table, actuator, relative
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
  - A61B34/30—Surgical robots
  - A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
  - A61B2034/2046—Tracking techniques
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
  - A61B90/50—Supports for surgical instruments, e.g. articulated arms
  - A61B90/57—Accessory clamps
  - A61B2090/571—Accessory clamps for clamping a support arm to a bed or other supports
- A61B50/00—Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
  - A61B50/10—Furniture specially adapted for surgical or diagnostic appliances or instruments
  - A61B50/13—Trolleys, e.g. carts
Definitions
- The subject disclosure is directed to a surgical cart including a robotic arm having a z-axis adjustment.
- Robotic arms are used during operating room procedures to perform a variety of different tasks.
- A robotic arm may be used to maneuver a surgical instrument and to assist with imaging.
- The robotic arm may be mounted to the operating table or on a mobile cart. While current robotic arms are suitable for their intended use, they are subject to improvement.
- The present disclosure includes a surgical system including an improved robotic arm that provides numerous advantages, such as those set forth herein.
- The present disclosure includes a surgical system including a base assembly having an actuator vertically movable along a z-axis.
- A motor cooperates with the actuator to vertically move the actuator along the z-axis.
- A support member is mounted to the actuator and configured to support a robotic arm. The support member is movable by the actuator to vertically move the robotic arm along the z-axis.
- A linear sensing device is mounted to the base assembly. The linear sensing device is configured to connect to an operating table and sense vertical movement of the operating table relative to the base assembly.
- A control module is configured to receive sensor signals from the linear sensing device identifying a vertical position of the operating table, and is configured to operate the motor to vertically move the actuator, the support member, and the robotic arm, when mounted to the support member, in unison with the operating table.
- The present disclosure further includes a surgical system with a surgical tracking system defining a navigation space having a navigation coordinate system.
- A robotic arm has a robotic coordinate system.
- A base assembly includes an actuator. The robotic arm is supported by the actuator. The actuator is vertically movable between a first position and a second position to vertically move the robotic arm along a z-axis.
- A linear sensing device is mounted to the base assembly. The linear sensing device is configured to connect to an operating table and sense vertical movement of the operating table relative to the base assembly.
- A base assembly control module is configured to receive sensor signals from the linear sensing device identifying a vertical position of the operating table, and is configured to operate the motor to vertically move the actuator, the support member, and the robotic arm in unison with the operating table.
- A navigation processor system is configured to register the robotic coordinate system to the navigation space. Operation of the motor by the base assembly control module to vertically move the robotic arm in unison with the operating table maintains registration of the robotic coordinate system to the navigation space throughout vertical movement of both the robotic arm and the operating table.
- The present disclosure also provides for a method of operating a surgical system.
- The method includes the following: positioning a robotic assembly relative to an operating table, the robotic assembly including a robotic arm, an actuator configured to vertically raise and lower the robotic arm, a motor configured to actuate the actuator, a sensing device configured to sense vertical movement of the operating table, and a control module in communication with the sensing device and the motor; connecting the robotic assembly to the operating table such that vertical movement of the operating table is sensed by the sensing device; and registering, with a navigation processor system, a robotic coordinate system of the robotic arm to a navigation space of a surgical tracking system.
- The control module of the mobile cart is configured to receive sensor signals from the linear sensing device identifying a vertical position of the operating table, and is configured to operate the motor to vertically move the robotic arm in unison with the operating table.
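The sense-and-follow behavior described above can be sketched as a small control class. This is an illustrative sketch only; the names (`ZFollower`, `read_table_z`, `command_base_z`) are assumptions for the example and do not appear in the disclosure.

```python
class ZFollower:
    """Illustrative sketch: hold a robotic arm's base at a fixed vertical
    offset from an operating table, mirroring the control module described
    in the disclosure. All names here are hypothetical."""

    def __init__(self, read_table_z, command_base_z, initial_base_z, initial_table_z):
        self.read_table_z = read_table_z      # stands in for the linear sensing device
        self.command_base_z = command_base_z  # stands in for the motor/actuator command
        # The operator-set "operating position" is the relative offset to hold.
        self.offset = initial_base_z - initial_table_z

    def update(self):
        """One control cycle: read the table height, move the base in unison."""
        target = self.read_table_z() + self.offset
        self.command_base_z(target)
        return target
```

Called periodically, or on each sensor event, `update()` moves the arm in unison with the table, so the relative operating position set by the operator is preserved.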
- FIG. 1 is a diagrammatic view illustrating an overview of a surgical system in accordance with the present disclosure including a mobile cart with a robotic arm;
- FIG. 2A illustrates an operating table and the robotic arm of FIG. 1, both in a lowered position;
- FIG. 2B illustrates the operating table and the robotic arm of FIG. 1, both in a raised position;
- FIG. 3 illustrates a method in accordance with the present disclosure for registering the robotic arm to a navigation space of the surgical system; and
- FIG. 4 illustrates a method in accordance with the present disclosure for using the mobile cart to maintain the relative height of the robotic arm to an operating room table as the table is raised and lowered along the z-axis.
- The subject disclosure is directed to an exemplary embodiment of a surgical procedure on a subject, such as a human patient. It is understood, however, that the systems and methods described herein are merely exemplary and not intended to limit the scope of the claims included herein. In various embodiments, it is understood that the systems and methods may be incorporated into and/or used on non-animate objects.
- FIG. 1 is a diagrammatic view illustrating an overview of a procedure room or arena.
- The procedure room may include a surgical suite in which may be placed a robotic system 20 and a navigation system 26 that can be used for various procedures.
- The robotic system 20 may include a Mazor X™ robotic guidance system, sold by Medtronic, Inc.
- The robotic system 20 may be used to assist in guiding selected instruments, such as drills, screws, etc., relative to a subject 30.
- The robotic system 20 may be registered or correlated to a selected coordinate system, such as via image-to-image registration or correlation or similar methods, such as disclosed in U.S. Pat. No. 11,135,025, incorporated herein by reference.
- The robotic system 20 will be described further herein.
- The robotic system 20 may include one or more arms 40 that are movable or pivotable relative to the subject 30, such as including an end effector 44, and mounted, such as movably mounted, to the base 38.
- The end effector 44 may be any appropriate portion, such as a tube, guide, or passage member.
- The end effector 44 may be moved relative to the base 38 with one or more motors.
- The position of the end effector 44 may be known or determined relative to the base 38 with one or more encoders at one or more joints, such as a wrist joint 48 and/or an elbow joint 52 of the robotic system 20.
- The robotic system 20 is arranged relative to an operating room table 104.
- The operating room table 104 may be any suitable operating table configured to be moved laterally along the X and Y axes and raised or lowered vertically along the Z-axis. Exemplary tables 104 are described further herein.
- The robotic system 20, particularly the base 38 and the arm 40, is also movable in the z-axis, as explained further herein. Based on user preference and the type of procedure to be performed, the height of the base 38 and the arm 40 is vertically set in the Z-axis relative to the table 104 by the operator at an operating position. Once the operating position is set, it is desirable to maintain the operating position throughout the procedure.
- The present disclosure includes a system for maintaining this relative position.
- The navigation system 26 can be used to track the location of one or more tracking devices.
- The tracking devices may include a robot tracking device 54, a subject tracking device 58, an imaging system tracking device 62, and/or a tool tracking device 66.
- A tool 68 may be any appropriate tool, such as a drill, forceps, or other tool operated by a user 72.
- The tool 68 may also include an implant, such as a spinal implant or orthopedic implant.
- The navigation system 26 may be used to navigate any type of instrument, implant, or delivery system, including: guide wires, arthroscopic systems, orthopedic implants, spinal implants, deep brain stimulation (DBS) probes, etc.
- The instruments may be used to navigate or map any region of the body.
- The navigation system 26 and the various instruments may be used in any appropriate procedure, such as one that is generally minimally invasive or an open procedure.
- An imaging device 80 may be used to acquire pre-, intra-, or post-operative or real-time image data of a subject, such as the subject 30. It will be understood, however, that any appropriate subject can be imaged and any appropriate procedure may be performed relative to the subject.
- The imaging device 80 comprises an O-arm® imaging device sold by Medtronic Navigation, Inc., having a place of business in Louisville, Colorado, USA.
- The imaging device 80 may have a generally annular gantry housing 82 in which an image capturing portion is moveably placed.
- The image capturing portion may include an x-ray source or emission portion and an x-ray receiving or image receiving portion located generally, or as practically possible, 180 degrees from each other and mounted on a rotor relative to a track or rail.
- The image capturing portion can be operable to rotate 360 degrees during image acquisition.
- The image capturing portion may rotate around a central point or axis, allowing image data of the subject 30 to be acquired from multiple directions or in multiple planes.
- The imaging device 80 can include those disclosed in U.S. Pat. Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and 6,940,941; all of which are incorporated herein by reference, or any appropriate portions thereof.
- The imaging device 80 can utilize flat plate technology having a 1,720 by 1,024 pixel viewing area.
- The imaging system 80 may be any appropriate imaging system, such as the O-arm imaging system, a C-arm imaging system, etc.
- The imaging device 80 can also be tracked with the tracking device 62.
- The image data defining an image space acquired of the patient 30 can, according to various embodiments, be inherently or automatically registered relative to an object space.
- The object space can be the space defined by the patient 30 in the navigation system 26.
- The automatic registration can be achieved by including the tracking device 62 on the imaging device 80 and/or by the determinable precise location of the image capturing portion.
- Imageable portions, virtual fiducial points, and other features can also be used to allow for registration, automatic or otherwise. It will be understood, however, that image data can be acquired of any subject, which will define subject space.
- Patient space is an exemplary subject space. Registration allows for a translation between patient space and image space.
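The "translation between patient space and image space" referred to here is commonly represented as a rigid transform. The sketch below is an assumption for illustration, not a representation prescribed by the disclosure, using 4x4 homogeneous matrices.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def map_point(T, p):
    """Translate a 3D point from patient space into image space via T."""
    return (T @ np.append(np.asarray(p, dtype=float), 1.0))[:3]
```

Under such a representation, registering the two spaces amounts to determining T; mapping tracked positions through T is then a single matrix multiply per point.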
- The patient 30 can also be tracked as the patient moves with a patient tracking device, DRF, or the tracker 58.
- The patient 30 may be fixed within navigation space defined by the navigation system 26 to allow for registration.
- Registration of the image space to the patient space or subject space allows for navigation of the instrument 68 with the image data.
- A position of the instrument 68 can be illustrated relative to image data acquired of the patient 30 on a display device 84.
- Various tracking systems, such as one including an optical localizer 88 or an electromagnetic (EM) localizer 92, can be used to track the instrument 68.
- The imaging device 80 may be an imaging device other than the O-arm® imaging device and may include, in addition or alternatively, a fluoroscopic C-arm.
- Other exemplary imaging devices may include fluoroscopes such as bi-plane fluoroscopic systems, ceiling-mounted fluoroscopic systems, cath-lab fluoroscopic systems, fixed C-arm fluoroscopic systems, isocentric C-arm fluoroscopic systems, 3D fluoroscopic systems, etc.
- Other appropriate imaging devices can also include MRI, CT, ultrasound, etc.
- An imaging device controller 96 may control the imaging device 80 and can receive the image data generated at the image capturing portion and store the images for later use.
- The controller 96 can also control the rotation of the image capturing portion of the imaging device 80.
- The controller 96 need not be integral with the gantry housing 82, but may be separate therefrom.
- The controller may be a portion of the navigation system 26 that may include a processing and/or control system 98 including a processing unit or processing portion 102.
- The controller 96 may be integral with the gantry 82 and may include a second and separate processor, such as that in a portable computer.
- The patient 30 can be fixed onto an operating table 104.
- The operating table 104 may be any suitable operating table configured to be raised and lowered along a z-axis by an operator.
- The operating table 104 may be raised or lowered using the foot pedals 130 of FIGS. 2A and 2B.
- The table 104 can be an Axis Jackson® operating table sold by OSI, a subsidiary of Mizuho Ikakogyo Co., Ltd., having a place of business in Tokyo, Japan, or Mizuho Orthopedic Systems, Inc., having a place of business in California, USA.
- Patient positioning devices can be used with the table, and include a Mayfield® clamp or those set forth in commonly assigned U.S. Pat. Appl. No.
- The position of the patient 30 relative to the imaging device 80 can be determined by the navigation system 26.
- The tracking device 62 can be used to track and locate at least a portion of the imaging device 80, for example the gantry or housing 82.
- The patient 30 can be tracked with the dynamic reference frame 58. Accordingly, the position of the patient 30 relative to the imaging device 80 can be determined. Further, the location of the imaging portion can be determined relative to the housing 82 due to its precise position on the rail within the housing 82, substantially inflexible rotor, etc.
- The imaging device 80 can include an accuracy of within 10 microns, for example, if the imaging device 80 is an O-Arm® imaging device sold by Medtronic Navigation, Inc. having a place of business in Louisville, Colorado. Precise positioning of the imaging portion is further described in U.S. Patent Nos. 7,188,998; 7,108,421; 7,106,825; 7,001,045; and
- The imaging device 80 can generate and/or emit x-rays from the x-ray source that propagate through the patient 30 and are received by the x-ray image receiving portion.
- The image capturing portion generates image data representing the intensities of the received x-rays.
- The image capturing portion can include an image intensifier that first converts the x-rays to visible light and a camera (e.g. a charge coupled device) that converts the visible light into digital image data.
- The image capturing portion may also be a digital device that converts x-rays directly to digital image data for forming images, thus potentially avoiding distortion introduced by first converting to visible light.
- Two-dimensional and/or three-dimensional fluoroscopic image data that may be taken by the imaging device 80 can be captured and stored in the imaging device controller 96. Multiple image data taken by the imaging device 80 may also be captured and assembled to provide a larger view or image of a whole region of a patient 30, as opposed to being directed to only a portion of a region of the patient 30. For example, multiple image data of the patient’s 30 spine may be appended together to provide a full view or complete set of image data of the spine.
- The image data can then be forwarded from the image device controller 96 to the navigation computer and/or processor system 102 that can be a part of a controller or work station 98 having the display 84 and a user interface 106. It will also be understood that the image data is not necessarily first retained in the controller 96, but may also be directly transmitted to the work station 98.
- The work station 98 can provide facilities for displaying the image data as an image 108 on the display 84, and for saving, digitally manipulating, or printing a hard copy image of the received image data.
- The user interface 106, which may be a keyboard, mouse, touch pen, touch screen, or other suitable device, allows the user 72 to provide inputs to control the imaging device 80, via the image device controller 96, or to adjust the display settings of the display 84.
- The work station 98 may also direct the image device controller 96 to adjust the image capturing portion of the imaging device 80 to obtain various two-dimensional images along different planes in order to generate representative two-dimensional and three-dimensional image data.
- The navigation system 26 can further include the tracking system, including either or both of the electromagnetic (EM) localizer 94 and/or the optical localizer 88.
- The tracking systems may include a controller and interface portion 110.
- The controller 110 can be connected to the processor portion 102, which can include a processor included within a computer.
- The EM tracking system may include the STEALTHSTATION® AXIEM™ Navigation System, sold by Medtronic Navigation, Inc., having a place of business in Louisville, Colorado; or can be the EM tracking system described in U.S. Patent Application Serial No. 10/941,782, filed Sept. 15, 2004, and entitled “METHOD AND APPARATUS FOR SURGICAL NAVIGATION”; U.S. Patent No.
- The navigation system 26 may also be or include any appropriate tracking system, including a STEALTHSTATION® TREON® or S7™ tracking system having an optical localizer, that may be used as the optical localizer 88, and sold by Medtronic Navigation, Inc. of Louisville, Colorado.
- Other tracking systems include acoustic, radiation, radar, etc. The tracking systems can be used according to generally known or described techniques in the above incorporated references. Details will not be included herein except to clarify selected operation of the subject disclosure.
- Wired or physical connections can interconnect the tracking systems, imaging device 80, etc.
- Various portions, such as the instrument 68, may employ a wireless communications channel, such as that disclosed in U.S. Patent No. 6,474,341, entitled “Surgical Communication Power System,” issued November 5, 2002, herein incorporated by reference, as opposed to being coupled directly to the controller 110.
- The tracking devices 62, 66, 54 can generate a field and/or signal that is sensed by the localizer(s) 88, 94.
- The navigation system 26 may be a hybrid system that includes components from various tracking systems.
- The navigation system 26 can be used to track the instrument 68 relative to the patient 30.
- The instrument 68 can be tracked with the tracking system, as discussed above.
- Image data of the patient 30, or an appropriate subject, can be used to assist the user 72 in guiding the instrument 68.
- The image data is registered to the patient 30.
- The image data defines an image space that is registered to the patient space defined by the patient 30.
- The registration can be performed as discussed herein, automatically, manually, or combinations thereof.
- Registration allows a translation map to be generated of the physical location of the instrument 68 relative to the image space of the image data.
- The translation map allows the tracked position of the instrument 68 to be displayed on the display device 84 relative to the image data 108.
- A graphical representation 68i, also referred to as an icon, can be used to illustrate the location of the instrument 68 relative to the image data 108.
- A registration system or method can use the tracking device 58.
- The tracking device 58 may include portions or members 120 that may be trackable, but may also act as or be operable as a fiducial assembly.
- The fiducial assembly 120 can include a clamp or other fixation portion 124 and the imageable fiducial body 120. It is understood, however, that the members 120 may be separate from the tracking device 58.
- The fixation portion 124 can be provided to fix any appropriate portion, such as a portion of the anatomy.
- The fiducial assembly 120 can be interconnected with a portion of a spine of the subject 30.
- The fiducial portions 120 may be imaged with the imaging device 80. It is understood, however, that various portions of the subject (such as a spinous process) may also be used as a fiducial portion.
- Image data is generated that includes or identifies the fiducial portions 120.
- The fiducial portions 120 can be identified in image data automatically (e.g. with a processor executing a program), manually (e.g. by selection and identification by the user 72), or combinations thereof (e.g. by selection and identification by the user 72 of a seed point and segmentation by a processor executing a program).
- Methods of automatic imageable portion identification include those disclosed in U.S. Patent No. 8,150,494, issued on April 3, 2012, incorporated herein by reference.
- Manual identification can include selecting an element (e.g. pixel) or region in the image data wherein the imageable portion has been imaged.
- The fiducial portions 120 identified in the image data can be used as fiducial points or positions that can be used to register the image data or the image space of the image data with patient space.
- The fiducial portions 120 that are identified in the image 108 may then be identified in the subject space defined by the subject 30, in an appropriate manner.
- The user 72 may move the instrument 68 relative to the subject 30 to touch the fiducial portions 120, if the fiducial portions are attached to the subject 30 in the same position during the acquisition of the image data to generate the image 108.
- The fiducial portions 120 may be attached to the subject 30 and/or may include anatomical portions of the subject 30.
- A tracking device may be incorporated into the fiducial portions 120, and they may be maintained with the subject 30 after the image is acquired.
- The registration or the identification of the fiducial portions 120 in a subject space may then be made.
- The user 72 may move the instrument 68 to touch the fiducial portions 120.
- The tracking system, such as with the optical localizer 88, may track the position of the instrument 68 due to the tracking device 66 attached thereto. This allows the user 72 to identify in the navigation space the locations of the fiducial portions 120 that are identified in the image 108.
- The translation map may be made between the subject space defined by the subject 30 in a navigation space and the image space defined by the image 108. Accordingly, identical or known locations allow for registration as discussed further herein.
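One common way to compute such a translation map from paired fiducial locations, touched in subject space and identified in image space, is a least-squares rigid fit (the Kabsch/SVD method). The sketch below is an illustrative assumption, not the specific algorithm claimed in the disclosure.

```python
import numpy as np

def register_point_pairs(subject_pts, image_pts):
    """Least-squares rigid transform (Kabsch/SVD) mapping subject-space
    fiducial locations onto their image-space counterparts.
    Returns (R, t) such that image_pt ~= R @ subject_pt + t."""
    P = np.asarray(subject_pts, dtype=float)
    Q = np.asarray(image_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Three or more non-collinear fiducial pairs suffice to determine the fit; more pairs average out touch and tracking noise.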
- A translation map is determined between the image data coordinate system of the image data, such as the image 108, and the patient space defined by the patient 30.
- The instrument 68 can be tracked with the tracking system that is registered to the image data to allow an identification and illustration of a position of the tracked instrument 68 as an icon superimposed on the image data. Registration of the image 108 (or any selected image data) to the subject 30 may occur at any appropriate time.
- The icon 68i representing a position (which may include a six degree of freedom position, including 3D location and orientation) of the instrument 68 can be displayed relative to the image 108 on the display 84. Due to the registration of the image space to the patient space, the position of the icon 68i relative to the image 108 can substantially identify or mimic the location of the instrument 68 relative to the patient 30 in the patient space. As discussed above, this can allow a navigated procedure to occur.
- The robotic system 20, having the robotic system coordinate system, may be registered to the navigation space coordinate system, as discussed herein, due to the reference tracking device 54 (e.g. if fixed to a known position on or relative to the robotic system 20) and/or due to the tracking of the snapshot tracking device 160 (FIGS. 2A and 2B).
- The snapshot tracking device 160 may include one or more trackable portions 164 that may be tracked with the localizer 88 or any appropriate localizer (e.g. optical, EM, radar). It is understood, however, that any appropriate tracking system may be used to track the snapshot tracking device 160.
- A fixed reference tracking device may also be positioned within the navigation space.
- The fixed navigation tracker may include the patient tracker 58, which may be connected to the patient 30, and/or the robot tracker 54, which may be fixed to the base 34 of the robotic system 20.
- The reference tracker, therefore, may be any appropriate tracker that is positioned relative to the snapshot tracker 160 and that is within the navigation coordinate space during the registration period.
- The robot tracker 54 will be referred to herein; however, the patient tracker 58 may also be used as the reference tracker.
- The reference tracker may be positioned within the coordinate system at any position relative to the snapshot tracker 160, as long as the snapshot tracker 160 may be tracked relative to the reference tracker.
- The snapshot tracker 160 may be positioned at a known position relative to the end effector 44.
- The snapshot tracker 160 extends from a rod or connection member.
- The localizer 88 may then view or determine a position of the snapshot tracking device 160 relative to the reference tracking device 54 and/or the reference tracking device 58.
- Determining or tracking a position of the snapshot tracking device 160 relative to the reference frame 54 may be used to determine a relationship between a position within the navigation space and the robotic space of the end effector 44.
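The relationship described above can be pictured as a composition of poses: the localizer reports the snapshot tracker's pose in navigation space, and the snapshot tracker sits at a known pose in the robot's coordinate system (via the end effector), so a robot-to-navigation transform follows by composition. The function below is a hypothetical sketch of that composition using 4x4 homogeneous matrices; the names are assumptions, not terms from the disclosure.

```python
import numpy as np

def register_robot_to_navigation(T_nav_snapshot, T_robot_snapshot):
    """Given the snapshot tracker's pose tracked in navigation space
    (T_nav_snapshot) and its known pose in the robot coordinate system
    (T_robot_snapshot), return T_nav_robot, the transform taking robot
    coordinates into navigation coordinates."""
    return T_nav_snapshot @ np.linalg.inv(T_robot_snapshot)
```

With T_nav_robot in hand, any end effector pose reported in robot coordinates can be expressed in navigation coordinates by a single matrix multiply.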
- The navigation space defined by the localizer 88 may include the full navigation space, which may include portions relative to the subject, such as the subject tracker 58, and other portions that may be moved therein, such as the instrument 68.
- The robotic registration space may be smaller and may include the reference frame 54 and the snapshot tracker 160.
- The robot registration navigation space may include the snapshot tracker 160 and the patient tracker 58 for registration.
- The exemplary registration navigation space is merely for the current discussion.
- Both the robotic reference tracker 54 and the patient tracker 58 need not be used simultaneously. This is particularly true when the patient 30 is fixed in space, such as fixed relative to the robotic system 20.
- The registration may include various portions or sub-parts, as discussed herein.
- The various parts may occur in any appropriate order, and the order discussed herein is merely exemplary.
- The co-registration may further allow only one of the robotic or navigation coordinate systems to be registered directly to a third coordinate system, such as an image coordinate system, while still allowing the registration of the other to the third coordinate system.
- The robotic system 20 is positioned relative to the subject 30 for various portions of a procedure.
- The robotic system 20 may be registered to the subject 30 and to the image 108 of the subject 30, which may be displayed on the display device 84 and/or a second or auxiliary display device 84’ that may be movable relative to the robotic system 20.
- The imaging system 80, or any appropriate imaging system, may be used to image the subject 30.
- The image may include a portion of the subject, such as one or more of the vertebrae, and a fiducial or robotic fiducial array 140 (FIGS. 2A, 2B) that may be fixed to the robotic system 20.
- The robotic fiducial 140 may be fixed to a selected portion of the robotic system 20, such as to the base 34 and/or the fixed portion 38.
- The robotic fiducial 140 may also and/or alternatively be connected to the end effector 44 (illustrated in phantom in FIGS. 2A and 2B).
- The robotic fiducial 140 may be positioned relative to the subject 30 for acquisition of images such that the fiducial 140 is apparent in the images.
- The position of the robotic fiducial 140 relative to the vertebrae may be determined. If the robotic fiducial 140 is fixed to the robotic system 20, the robotic coordinate system may be determined relative to the subject space coordinate system.
- The known position of the end effector in the robotic coordinate system allows for image registration to the robotic coordinate system of the robotic system 20.
- The robotic coordinate system may be registered to a subject space or coordinate system in the method 182 of FIG. 3.
- The registration may include positioning the robotic system 20 relative to a subject space in block 184. Positioning of the robotic system 20 relative to the subject space may include positioning the robotic system 20 relative to the subject. Further, positioning of the robotic system 20 may include positioning or removably positioning the robotic fiducial 140 relative to the subject 30. The robotic fiducial 140 may be removably placed in a position relative to the robotic system 20 for various procedures and may be substantially positioned in the same position for different or subsequent procedures. With the subject 30 positioned relative to the robotic system 20, fiducial images may be acquired of the subject 30 and the robotic fiducial 140 with the imaging system 80 in block 186.
- the acquisition of the fiducial images in block 186 allows for image data to be acquired of the subject 30, such as with the vertebrae, and the fiducial 140. It is desirable to maintain a relative position along the Z-axis between an upper surface 132 of the table 104 and a base 38 of the robotic arm 40 throughout an operating procedure.
- the relative position may be set based on operator preference, as well as based on the procedure being performed. Once the relative position is set, the relative position is maintained as the table 104 is raised or lowered along the Z-axis by sensing movement of the table 104, and then simultaneously (or nearly simultaneously) moving the base 38 along the Z-axis in the same direction as the table 104 to maintain the relative operating position as the table 104 is moved.
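The height-maintenance behavior described above can be sketched as a simple follower: when the sensed table height changes, the robot base is commanded to move by the same amount along the Z-axis so the operator-set offset X is preserved. This is an illustrative sketch only, not the disclosed control software; the class and attribute names (`HeightFollower`, `offset_x`, `base_z`, `table_z`) are assumptions made for the example.

```python
class HeightFollower:
    """Keeps the robot base at a fixed Z offset above the table surface."""

    def __init__(self, offset_x: float, base_z: float, table_z: float):
        self.offset_x = offset_x  # operator-set distance X above the table surface
        self.base_z = base_z      # current base height along the Z-axis
        self.table_z = table_z    # current table surface height along the Z-axis

    def on_table_move(self, new_table_z: float) -> float:
        """Return the commanded base height after a sensed table move."""
        delta = new_table_z - self.table_z   # signed Z movement of the table
        self.table_z = new_table_z
        self.base_z += delta                 # move base same direction, same distance
        return self.base_z


follower = HeightFollower(offset_x=0.10, base_z=0.90, table_z=0.80)
new_base = follower.on_table_move(0.85)                     # table raised 5 cm
assert abs(new_base - 0.95) < 1e-9                          # base follows
assert abs(follower.base_z - follower.table_z - follower.offset_x) < 1e-9
```

The offset is restored after every sensed move, so in a periodic control loop the base tracks the table within one sensing interval.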
- the processor 102 and/or the controller 96 is configured to save the position of the robotic arm 40 (including the Euclidean positions and orientations of the fiducial 140 and the robotic arm 40, including the end effector 44) at the same time that the image was acquired in block 186, so that the processor 102 and/or the controller 96 knows the precise location of the robotic system 20 during registration of the robotic coordinate system to the subject space described at block 192.
- the image acquired of the fiducial 140 may be used for registration of a coordinate reference frame of the robotic system 20 and the pre-acquired image, which may be any appropriate type of image.
- the pre-acquired image may be a two- and/or three-dimensional image (such as a CT image, MRI image, or the like).
- the image acquired of the fiducial 140 may also be any appropriate type of image, such as a two-dimensional fluoroscopic image, a two-dimensional image acquired with the O-arm® imaging system, a cone beam image acquired with the O-arm® imaging system, or other appropriate image data.
- identification of the robotic fiducial 140 in the acquired fiducial images occurs in block 188.
- Identification of the robotic fiducial in the robotic fiducial images may be manual, automatic, or a combination of automatic and manual.
- the user may identify the robotic fiducial in the image, a selected automatic system may segment the fiducial from the fiducial images, or the user may identify one or more seed pixels or voxels and the processor system may further segment the fiducial therefrom.
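The seed-based option above can be illustrated with a minimal region-growing sketch: starting from a user-selected seed pixel, the segmentation expands to connected pixels whose intensity exceeds a threshold. The function name, the 4-connected neighborhood, and the fixed intensity threshold are assumptions made for this example, not the disclosed segmentation algorithm.

```python
import numpy as np
from collections import deque

def grow_from_seed(image: np.ndarray, seed: tuple, threshold: float) -> np.ndarray:
    """Flood-fill from a seed pixel to all 4-connected pixels above threshold."""
    mask = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if mask[r, c] or image[r, c] < threshold:
            continue                          # already labeled or below threshold
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and not mask[rr, cc]:
                queue.append((rr, cc))
    return mask

img = np.zeros((5, 5))
img[1:4, 1:4] = 1.0                           # bright 3x3 fiducial-like blob
mask = grow_from_seed(img, (2, 2), 0.5)       # seed inside the blob
assert mask.sum() == 9                        # exactly the blob is segmented
```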
- the acquired images in block 186 may be used for planning and/or performing a procedure.
- the imaging system 80 may acquire image data sufficient for a selected procedure.
- the images acquired in block 186 may be used for planning and navigating a selected procedure relative to the subject 30.
- the image data may include two-dimensional image data, reconstructed three-dimensional image data, and/or image data acquired over time to illustrate movement or motion of the subject (which may be acquired in 2D or 3D).
- the fiducial image acquired in block 186 may optionally be registered to other-time or pre-acquired images in block 190, such as an MRI or a computed tomography scan of the subject 30 acquired prior to the acquisition of the fiducial images in block 186.
- the pre-acquired images may be acquired at any appropriate time prior to the acquisition of the fiducial images in block 186. It is understood, however, that the images may be acquired after the fiducial images and may be registered to the fiducial images in a similar manner as discussed herein.
- the registration of the fiducial images to the pre-acquired images may occur in any appropriate manner such as segmentation of selected vertebrae, identification in registration of selected fiducial elements in the images (e.g. anatomical fiducial portions and/or positioned or implanted fiducial members) or other appropriate procedures.
- the Mazor X® Robotic System generally allows for registration of a pre-acquired image to fiducial images and may be appropriate for registering the fiducial images acquired in block 186 to the pre-acquired images in block 190.
- the robotic coordinate system may also be registered to the subject space in block 192 with the identification of fiducials in the image in block 188 and the registration.
- the robotic fiducial 140, imaged with the fiducial images in block 186, is positioned in a known position relative to the robotic system 20, such as the base 34 and/or with the known position of the end effector 44 in the robotic coordinate system.
- the robotic coordinate system that is defined by the robotic system 20 relative to the base 34 and/or the fixed portion 38 may, therefore, also be predetermined or known relative to the robotic fiducial 140, as the robotic fiducial 140 is fixed relative to the robotic system 20.
- the position of the robotic fiducial 140 is known in the robotic coordinate system by being tracked (e.g., via encoders of the robotic system 20, as discussed herein).
- the fiducial image acquired in block 186 may also assist in defining the patient space; the space relative to which the robotic system 20, particularly the movable end effector 44, may move is then also known. As discussed above, the end effector 44 moves in the robotic coordinate system due to the robotic tracking system, which may include various mechanisms, such as encoders at the various movable portions, such as the wrist 48 or elbow 52, of the robotic system 20. If the fiducial images in block 186 are the images for performing the procedure, such as for navigation, and may serve as the displayed image 108, the registration may be substantially automatic, as the subject 30 may be substantially fixed relative to the robotic system 20 (e.g., fixed to the operating table 104 positioned relative to the robotic system 20).
- the robotic coordinate system can be registered to the subject space and/or image space according to the method 182. Given the registration of the robotic coordinate system to the image space, the robotic coordinate system registration may be used to determine a position of the end effector 44, and/or a member positioned through or with the end effector 44, relative to the image 108. Accordingly, the image 108 may be used to display a graphical representation, such as a graphical representation of the member or instrument 45 as an icon 45i superimposed on or positioned relative to the image 108.
- the fiducial image may acquire an image of the subject 30 and the fiducial 140 at substantially the same time and include portions of the subject 30 and the fiducial 140 in the same image or image data.
- the fiducial image that includes an image of a portion of the fiducial 140 and the subject 30 may be used to register or generate a translation map between the fiducial image and any pre-acquired image, such as of the subject 30.
- the translation map between the fiducial image and the pre-acquired image may allow for registration between the two images.
- the pose of the fiducial 140 may be determined relative to a pre-acquired image. This may allow for a pose of the robotic system 20 to be determined relative to the pre-acquired image. Thus, any planning that may have occurred in the pre-acquired image may be translated or registered to the current pose of the robotic system 20 relative to the subject 30.
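The pose chain described above can be sketched with homogeneous transforms: the fiducial's pose found in the image, composed with the fiducial's known fixed mounting on the robotic system, yields the robotic system's pose in image space, through which planned points can be mapped. The 2D translation-only transforms and variable names here are simplifying assumptions for illustration, not the disclosed registration math.

```python
import numpy as np

def translation(tx: float, ty: float) -> np.ndarray:
    """Build a 3x3 homogeneous 2D translation matrix."""
    T = np.eye(3)
    T[0, 2], T[1, 2] = tx, ty
    return T

T_image_fiducial = translation(5.0, 2.0)    # fiducial pose segmented from the image
T_fiducial_robot = translation(-1.0, 0.5)   # known fixed mounting of fiducial on robot

# Composing the two gives the robot's pose in image space.
T_image_robot = T_image_fiducial @ T_fiducial_robot

# A point at the robot origin maps to its location in the image.
plan_point_robot = np.array([0.0, 0.0, 1.0])
plan_point_image = T_image_robot @ plan_point_robot
assert np.allclose(plan_point_image[:2], [4.0, 2.5])
```

A full system would use 3D rigid transforms (rotation plus translation), but the composition pattern is the same.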
- the fiducial image that is inclusive of the image of the fiducial 140 and the subject 30 may, therefore, be registered to the pre-acquired image for use during a procedure using the robotic system 20 and relative to the subject 30.
- the robotic system 20 advantageously maintains registration between the robotic coordinate system and the subject space/navigation coordinate system even if the surgical table is raised or lowered after registration.
- the robotic system 20 includes a base assembly 510, which may be configured as a mobile cart including wheels 512.
- the base assembly 510 may be a stationary base, which may be arranged in an operating room, for example.
- the base assembly 510 is referred to herein as the cart or mobile cart.
- the mobile cart 510 includes an actuator 520 mounted thereto, which is movable between a lowered position as illustrated in FIG. 2A, and a raised position as illustrated in FIG. 2B.
- the actuator 520 may be stopped at any intermediate position as well.
- the actuator 520 may be any suitable actuator, such as a scissor jack, a telescoping actuator, screw drive, etc.
- a support member 522 is mounted to the actuator 520 and configured to support the robotic arm 40.
- the robotic arm 40 may be mounted to the support member 522 in any suitable manner.
- a motor 530 is in cooperation with the actuator 520, and configured to vertically move the actuator 520 along the z-axis between the lowered position of FIG. 2A and the raised position of FIG. 2B, and to any intermediate position between the lowered position and the raised position.
- the motor 530 may be any suitable motor, such as a stepper motor.
- the mobile cart 510 includes a linear sensing device 540.
- the linear sensing device 540 is configured to be connected to any suitable operating room table, such as the table 104. When connected to the table 104, the linear sensing device 540 is configured to sense when the table 104 is raised and lowered, and the distance that the table 104 is raised and lowered.
- the linear sensing device 540 may be any suitable device configured to identify vertical movement or z-axis movement of the table 104.
- the linear sensing device 540 may define a track 542 that extends vertically. In cooperation with the track 542 is a link, tab or post 544, which is configured to be coupled to the table 104.
- the post 544 moves vertically within the track 542 as the table 104 moves up and down when the post 544 is coupled to the table 104.
- the vertical movement of the post 544 within the track 542 is sensed by the linear sensing device 540 in any suitable manner, such as with a linear variable displacement transducer (LVDT) or an optical sensing device.
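Converting a raw sensor reading into a table height can be sketched as below. This is illustrative only: the counts-per-millimetre scale factor, zero offset, and function name are made-up calibration values for the example, not parameters of the disclosed sensing device.

```python
def reading_to_height_mm(raw_counts: int,
                         counts_per_mm: float = 40.0,
                         zero_offset_counts: int = 2000) -> float:
    """Convert a raw linear-sensor count into a table height in millimetres."""
    return (raw_counts - zero_offset_counts) / counts_per_mm

h0 = reading_to_height_mm(2000)   # sensor at its calibrated zero position
h1 = reading_to_height_mm(2400)   # table raised: 400 counts above zero
assert h0 == 0.0
assert h1 == 10.0                 # 400 counts / 40 counts-per-mm = 10 mm
```

The control module would difference successive height samples to obtain the signed Z movement of the table.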
- the linear sensing device 540 is in cooperation with a control module 550 of the mobile cart 510 to send signals to the control module 550 identifying the height of the table 104.
- the control module 550 is configured to operate the motor 530 for raising and lowering the actuator 520 based on the signals from the linear sensing device 540, as explained herein.
- the mobile cart 510 further includes any suitable user interface, such as the display 84' configured as a touch screen.
- the user interface may be configured in any other suitable manner as well, such as with buttons, knobs, a keyboard, etc.
- the user interface 84' provides controls for the user to operate the robotic arm 40, as well as set the vertical height thereof. More specifically, the user sets the height of the robotic arm 40 relative to the table 104 by inputting the desired height into the user interface.
- the base 38 of the arm 40 may be set at any suitable distance X (see FIGS. 2A, 2B) along the Z-axis relative to the upper surface 132 of the table 104.
- the distance X may be 0, for example, such that the base 38 is aligned with the upper surface 132 at the same vertical height along the Z-axis.
- the control module 550 is configured to then operate the motor 530 to raise or lower the actuator 520 to the requested height relative to the table 104 to set the base 38 at the distance X relative to the upper surface 132 of the table 104.
- the control module 550 is configured to receive sensor signals from the linear sensing device 540 identifying the vertical position of the operating table 104. In response to receipt of a signal that the table 104 is being raised, the control module 550 is configured to operate the motor 530 to raise the actuator 520 and the robotic arm 40 in unison with the operating table 104 to the same relative height.
- the robotic arm 40 may be set by the operator such that the base 38 of the arm 40 is at a distance X above an upper surface 132 of the table 104.
- the control module 550 is configured to operate the motor 530 to raise the actuator 520 from the position H1 of FIG. 2A to the position H2 of FIG. 2B to maintain the relative distance X of the robotic base 38 above the upper surface 132 of the table 104.
- the control module 550 is configured to operate the motor 530 to lower the actuator to maintain the same distance X between the base 38 of the arm 40 and the upper surface 132 of the table 104.
- FIG. 4 illustrates, at reference numeral 610, an exemplary method of operating a surgical system including the mobile cart 510 in accordance with the present disclosure.
- the mobile cart 510 is positioned relative to the operating table 104.
- the linear sensor 540 is configured to measure vertical or z-axis movement of the OR table 104.
- the post 544 in cooperation with the linear track 542 is coupled to the table 104 in any suitable manner.
- the height of the robotic base 38 is set relative to the table 104, such as at a first distance X relative to the upper surface 132 of the table 104.
- the control module 550 then operates the motor 530 to raise or lower the actuator 520 to achieve the distance X between the base 38 and the upper surface 132 of the table 104.
- the robotic arm 40 is registered to the navigation space as described above.
- in response to the table 104 being raised or lowered, as sensed by the linear sensor 540, the control module 550 is configured to operate the motor 530 to raise or lower the actuator 520 to maintain the height X between the robotic base 38 and the upper surface 132 of the table 104. Raising and lowering the robotic arm 40 in unison with the table 104, so that the base 38 remains at the same height at which the arm 40 was registered, maintains registration of the arm 40 to the navigation system and eliminates any need for re-registration during surgery.
- the control module 550 activates the motor 530 to move the actuator 520 in the same direction as the table 104 to maintain the distance X between the robotic base 38 and the upper surface 132.
- the movement of the actuator 520 is simultaneous with, or nearly simultaneous with, movement of the table 104 such that any difference in movement is not distinguishable by the user. Registration of the robotic arm 40 to the navigation system is advantageously maintained throughout the movement of the table 104 and the robotic arm 40.
- the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
- Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- the instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- the term "processor," as used herein, may refer to any of the foregoing structures or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480028492.7A CN121079051A (en) | 2023-04-26 | 2024-04-24 | Surgical trolley with robotic arm |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363462094P | 2023-04-26 | 2023-04-26 | |
| US63/462,094 | 2023-04-26 | ||
| US18/608,505 US20240358466A1 (en) | 2023-04-26 | 2024-03-18 | Surgical Cart With Robotic Arm |
| US18/608,505 | 2024-03-18 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024224310A1 true WO2024224310A1 (en) | 2024-10-31 |
Family
ID=91082089
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/054002 Pending WO2024224310A1 (en) | 2023-04-26 | 2024-04-24 | Surgical cart with robotic arm |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN121079051A (en) |
| WO (1) | WO2024224310A1 (en) |
Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5592939A (en) | 1995-06-14 | 1997-01-14 | Martinelli; Michael A. | Method and system for navigating a catheter probe |
| US5913820A (en) | 1992-08-14 | 1999-06-22 | British Telecommunications Public Limited Company | Position location system |
| US5983126A (en) | 1995-11-22 | 1999-11-09 | Medtronic, Inc. | Catheter location system and method |
| US6474341B1 (en) | 1999-10-28 | 2002-11-05 | Surgical Navigation Technologies, Inc. | Surgical communication and power system |
| US6940941B2 (en) | 2002-02-15 | 2005-09-06 | Breakaway Imaging, Llc | Breakable gantry apparatus for multidimensional x-ray based imaging |
| US7001045B2 (en) | 2002-06-11 | 2006-02-21 | Breakaway Imaging, Llc | Cantilevered gantry apparatus for x-ray imaging |
| US7106825B2 (en) | 2002-08-21 | 2006-09-12 | Breakaway Imaging, Llc | Apparatus and method for reconstruction of volumetric images in a divergent scanning computed tomography system |
| US7108421B2 (en) | 2002-03-19 | 2006-09-19 | Breakaway Imaging, Llc | Systems and methods for imaging large field-of-view objects |
| US7188998B2 (en) | 2002-03-13 | 2007-03-13 | Breakaway Imaging, Llc | Systems and methods for quasi-simultaneous multi-planar x-ray imaging |
| US20120029694A1 (en) * | 2010-08-02 | 2012-02-02 | Kuka Laboratories Gmbh | Medical Work Station |
| US8150494B2 (en) | 2007-03-29 | 2012-04-03 | Medtronic Navigation, Inc. | Apparatus for registering a physical space to image space |
| WO2016069660A1 (en) * | 2014-10-27 | 2016-05-06 | Intuitive Surgical Operations, Inc. | System and method for monitoring control points during reactive motion |
| US20170333275A1 (en) * | 2014-10-27 | 2017-11-23 | Intuitive Surgical Operations, Inc. | System and method for integrated surgical table icons |
| US20190328599A1 (en) * | 2018-04-27 | 2019-10-31 | Ormonde M. Mahoney | System and method for patient positioning in an automated surgery |
| US11135025B2 (en) | 2019-01-10 | 2021-10-05 | Medtronic Navigation, Inc. | System and method for registration between coordinate systems and navigation |
2024
- 2024-04-24 CN CN202480028492.7A patent/CN121079051A/en active Pending
- 2024-04-24 WO PCT/IB2024/054002 patent/WO2024224310A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN121079051A (en) | 2025-12-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11759272B2 (en) | System and method for registration between coordinate systems and navigation | |
| US8644570B2 (en) | System and method for automatic registration between an image and a subject | |
| US8737708B2 (en) | System and method for automatic registration between an image and a subject | |
| EP2629670B1 (en) | System and method for automatic registration between an image and a subject | |
| US20240164848A1 (en) | System and Method for Registration Between Coordinate Systems and Navigation | |
| EP2676627B1 (en) | System and method for automatic registration between an image and a subject | |
| US20240358466A1 (en) | Surgical Cart With Robotic Arm | |
| WO2024224310A1 (en) | Surgical cart with robotic arm | |
| US20240307131A1 (en) | Systems And Methods For An Image Guided Procedure | |
| US20240277415A1 (en) | System and method for moving a guide system | |
| US20240285348A1 (en) | Automated movement of optical localizer for optimal line of sight with optical trackers | |
| EP4580502A1 (en) | System and method for imaging | |
| CN114098971A (en) | An orthopedic surgery robot imaging, navigation and positioning method, device and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24725944 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024725944 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2024725944 Country of ref document: EP Effective date: 20251126 |
|