CN120476370A - Surgical robotic system and method for communication between surgeon console and bedside assistant - Google Patents

Surgical robotic system and method for communication between surgeon console and bedside assistant

Info

Publication number
CN120476370A
Authority
CN
China
Prior art keywords
instrument
virtual
screen
surgical
helper
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202480006426.XA
Other languages
Chinese (zh)
Inventor
费萨尔·I·巴希尔
迈尔·罗森贝格
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Covidien LP
Publication of CN120476370A

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B34/37 Leader-follower robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/74 Manipulators with manual electric input means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 Computer-aided simulation of surgical operations
    • A61B2034/102 Modelling of surgical devices, implants or prosthesis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2059 Mechanical position encoders
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3966 Radiopaque markers visible in an X-ray image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04 Constructional details of apparatus
    • A61B2560/0437 Trolley or cart-type apparatus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25 User interfaces for surgical systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Robotics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Manipulator (AREA)

Abstract

A surgical robotic system includes an assistant access port configured to receive an assistant instrument, an endoscopic camera configured to generate a video feed of a surgical site and the assistant instrument, a control tower having a first screen, and a surgeon console having a second screen and a handle controller and configured to receive user input. The system further includes a video processing device configured to render a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed, move the virtual instrument in the enhanced video feed in response to the received user input, output the enhanced video feed with the virtual instrument on the first screen and the second screen, confirm whether the assistant instrument is positioned at the location of the virtual instrument, and indicate on the first screen and the second screen whether the assistant instrument is positioned at the location of the virtual instrument.

Description

Surgical robotic system and method for communicating between a surgeon console and a bedside assistant
Cross Reference to Related Applications
The application claims the benefit of U.S. Provisional Patent Application Ser. No. 63/437,786, filed on January 9, 2023, the entire contents of which are incorporated herein by reference.
Background
Surgical robotic systems are used in a variety of surgical procedures, including minimally invasive medical procedures. Some surgical robotic systems include a surgeon console that controls a surgical robotic arm and a surgical instrument having an end effector (e.g., a clamping or grasping instrument) coupled to and actuated by the robotic arm. In operation, the robotic arm is moved to a position over the patient, and the surgical instrument is then guided through a small incision via a surgical access port, or through a natural orifice of the patient, to position the end effector at a work site within the patient.
Minimally invasive surgery and robotic-assisted surgery enable a surgeon to perform a surgical procedure in a teleoperational mode at a remote console. During the teleoperational mode, the surgeon remains scrubbed out of the sterile field for the duration of the surgical procedure. A bedside assistant is needed for various tasks during surgery, and some of those tasks require streamlined communication between the assistant and the surgeon. There is thus an unmet need to optimize communication between an assistant and a surgeon.
Disclosure of Invention
The present disclosure provides an augmented reality (AR) guided communication platform between a surgeon and an assistant, on which the surgeon can show the assistant how to perform complex maneuvers through tool poses. This avoids the need for the surgeon to interrupt the surgical workflow or to scrub in and perform laparoscopic maneuvers personally. Rather, the surgeon may communicate complex surgical maneuvers to the assistant by manipulating a virtual tool that both the surgeon and the assistant can visualize on one or more of their respective screens.
In accordance with one embodiment of the present disclosure, a surgical robotic system is disclosed. The surgical robotic system includes an assistant access port configured to receive an assistant instrument. The system also includes an endoscopic camera configured to generate a video feed of the surgical site and the assistant instrument. The system further includes a control tower having a first screen and a surgeon console having a second screen and a handle controller, the surgeon console configured to receive user input. The system further includes a video processing device configured to render a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed, and to move the virtual instrument in the enhanced video feed in response to the user input. The video processing device is further configured to output the enhanced video feed with the virtual instrument on the first screen and the second screen and to confirm whether the assistant instrument is positioned at the location of the virtual instrument. Further, the video processing device is configured to indicate on the first screen and the second screen whether the assistant instrument is positioned at the location of the virtual instrument.
Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the surgical robotic system may further include a robotic arm having a robotic instrument and a robotic access port configured to receive the robotic instrument. The surgeon console may be configured to switch between controlling the robotic instrument and controlling the virtual instrument. The surgical robotic system may further include a tracking unit configured to track a position of the assistant access port and a position of the robotic access port. The video processing device may be configured to determine a location of the assistant instrument based on the position of the assistant access port and the position of the robotic access port. The video processing device may be further configured to render the virtual instrument based on 3D model data of the assistant instrument. The endoscopic camera may be a stereoscopic camera, and the video processing device may be configured to generate a depth map of the surgical site. The video processing device may be further configured to generate a virtual boundary corresponding to a physical boundary of the surgical site based on the depth map. The surgeon console may also be configured to limit user input for controlling movement of the virtual instrument at the surgeon console based on the virtual boundary.
According to another embodiment of the present disclosure, a non-transitory computer-readable medium is disclosed. The medium stores instructions that, when executed by a processor, cause the processor to perform a computer-implemented method for communicating movement instructions using a virtual instrument. The method includes receiving a video feed of the surgical site and the assistant instrument, and rendering a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed that is displayed on a first screen of the control tower and a second screen of the surgeon console. The method further includes moving the rendered virtual instrument in the video feed in response to an input signal received from the surgeon console, and outputting the enhanced video feed with the virtual instrument on the first screen and the second screen. In addition, the method includes confirming whether the assistant instrument is positioned at the location of the virtual instrument, and indicating on the first screen and the second screen whether the assistant instrument is positioned at the location of the virtual instrument.
Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the method may further include switching between controlling a robotic instrument, coupled to the robotic arm and inserted through the robotic access port, and controlling the virtual instrument. The method may further include tracking a position of an assistant access port, through which the assistant instrument is inserted, and a position of the robotic access port. The method may also include determining a position of the assistant instrument based on the position of the assistant access port and the position of the robotic access port. Furthermore, the method may include rendering the virtual instrument based on 3D model data of the assistant instrument. The method may also include generating a depth map of the surgical site. The method may further include generating a virtual boundary corresponding to the physical boundary of the surgical site based on the depth map. The method may also include limiting movement of the virtual instrument at the surgeon console based on the virtual boundary.
In accordance with a further embodiment of the present disclosure, a surgical robotic system is disclosed. The surgical robotic system includes a robotic arm having a robotic instrument and a robotic access port configured to receive the robotic instrument. The system also includes an assistant access port configured to receive an assistant instrument. The system further includes an endoscopic camera configured to generate a video feed of the surgical site and the assistant instrument. The system also includes a control tower having a first screen and a surgeon console having a second screen and a handle controller, the surgeon console configured to receive user inputs to control the robotic instrument and the virtual instrument. The system further includes a video processing device configured to render a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed, and to move the virtual instrument in the enhanced video feed in response to the user inputs. The video processing device is further configured to output the enhanced video feed with the virtual instrument on the first screen and the second screen, confirm whether the assistant instrument is positioned at the location of the virtual instrument, and indicate on the first screen and the second screen whether the assistant instrument is positioned at the location of the virtual instrument.
Implementations of the above embodiment may include one or more of the following features. According to one aspect of the above embodiment, the endoscopic camera may be a stereoscopic camera, and the video processing device may be configured to generate a depth map of the surgical site. The video processing device may be configured to generate a virtual boundary corresponding to a physical boundary of the surgical site based on the depth map. The surgeon console may be configured to limit user input for controlling movement of the virtual instrument at the surgeon console based on the virtual boundary.
Drawings
Various embodiments of the present disclosure are described herein with reference to the accompanying drawings, in which:
FIG. 1 is a schematic illustration of a surgical robotic system including a control tower, a console, and one or more surgical robotic arms, each disposed on a mobile cart, according to an embodiment of the disclosure;
FIG. 2 is a perspective view of a surgical robotic arm of the surgical robotic system of FIG. 1, according to an embodiment of the present disclosure;
FIG. 3 is a perspective view of a mobile cart having a mounting arm that supports a surgical robotic arm of the surgical robotic system of FIG. 1, according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of a computer architecture of the surgical robotic system of FIG. 1, according to an embodiment of the present disclosure;
FIG. 5 is a schematic plan view of the surgical robotic system of FIG. 1 positioned around an operating table according to an embodiment of the present disclosure;
FIG. 6 is a view of a graphical user interface displayed on a control tower display and a surgeon console display, in accordance with an embodiment of the present disclosure; and
Fig. 7 is a flowchart illustrating a method for providing movement instructions to a bedside assistant in accordance with an embodiment of the present disclosure.
Detailed Description
Embodiments of the surgical robotic systems disclosed herein are described in detail with reference to the drawings, wherein like reference numerals designate identical or corresponding elements in each of the several views.
As will be described in detail below, the present disclosure is directed to a surgical robotic system including a surgeon console, a control tower, and one or more mobile carts having a surgical robotic arm coupled to a mounting arm. The surgeon console receives user inputs through one or more interface devices, which are processed by the control tower as movement commands for moving the surgical robotic arm and the instruments and/or cameras coupled to the surgical robotic arm. Thus, the surgeon console enables teleoperation of the surgical arm and the attached instruments/cameras. The surgical robotic arm includes a controller configured to process the movement commands and to generate torque commands for activating one or more actuators of the robotic arm, which in turn move the robotic arm in response to the movement commands.
The present disclosure provides a surgical robotic system and method that improve communication between a surgeon and an assistant. The system enables complex instrument maneuvers to be communicated to the assistant without disrupting the surgical flow, without the surgeon moving their gaze away from the screen of the surgeon console, and without the surgeon having to scrub in and perform the task personally. In particular, the system is configured to virtualize one or more surgical instruments that are controlled by the assistant, which may be any powered or manual instrument. The guided communication interface is an augmented reality (AR) interface that generates a virtual image of the instrument as an overlay on the endoscopic camera feed.
The workflow for optimal communication between the surgeon and the assistant may begin with the assistant inserting a laparoscopic instrument into the surgical site. The surgeon then moves the endoscopic camera to place the assistant's instrument in the field of view of the camera. The camera may be a stereoscopic camera configured to enable depth mapping of the surgical site and the instrument, and is calibrated to enable accurate perception of depth. The assistant's instrument is identified by machine vision. The system also includes an external vision system (e.g., one or more cameras or infrared sensors) for determining the port locations of all access ports. This allows the system to determine the position of the assistant port relative to the endoscope port in order to generate a virtual instrument in the camera feed, which is shown both on the screen of the surgeon console used by the surgeon and on the screen of the control tower used by the assistant.
The system is also configured to load a 3D model, a physics model, and the kinematics of the assistant instrument, which are used to render the virtual instrument. The virtual instrument is then rendered as an AR overlay on both screens. The initial placement of the virtual instrument is based on physical modeling of a stereo-reconstruction depth map from the stereoscopic endoscope.
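As a rough illustration of the AR compositing step just described, the following Python sketch blends a rendered virtual-instrument image onto an endoscope frame. This is not the patent's implementation; the function name, the mask-based compositing, and the alpha value are all assumptions.

```python
import cv2

def blend_overlay(frame, overlay, mask, alpha=0.6):
    """Composite a rendered virtual-instrument overlay onto an endoscope frame.

    `overlay` is the rendered instrument image (same size as `frame`), and
    `mask` is a boolean array marking the instrument's pixels. A partial
    alpha keeps the underlying tissue visible beneath the virtual tool.
    """
    out = frame.copy()
    blended = cv2.addWeighted(frame, 1 - alpha, overlay, alpha, 0)
    out[mask] = blended[mask]  # apply the blend only where the instrument is drawn
    return out
```

Keeping the tissue partially visible under the overlay is one plausible design choice, since it lets the assistant relate the virtual pose to the real anatomy.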
The virtual instrument is controlled from the surgeon console in the same manner as any actual robotically controlled instrument. The surgeon clutches out (disconnecting the handle controller from the physical instrument), gains control of the virtual instrument rendered on screen, and moves the virtual instrument using the handle controller. The virtual tool is associated with the assistant port, and the system is configured to move the virtual tool realistically in the video feed of the surgical site based on the kinematics of the assistant instrument. The surgeon's movement trajectory may be recorded so that it can be replayed for the assistant on the screen of the control tower. The assistant then moves the assistant instrument until it is co-located with the virtual instrument, i.e., the assistant uses the trajectory as a guide to align the assistant instrument with the virtual instrument. The system may verify, based on the depth map and image processing, that the assistant instrument is placed at the position of the virtual instrument and has the same orientation.
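The recording-and-replay behavior described above can be sketched as a small data structure. This is a hypothetical illustration only; the disclosure does not specify how trajectories are stored or timed.

```python
import time

class TrajectoryRecorder:
    """Record timestamped virtual-instrument poses while the surgeon
    demonstrates a maneuver, then replay them at the original pace on
    the control-tower screen. A minimal sketch, not the system's code."""

    def __init__(self):
        self.samples = []  # list of (timestamp, pose) pairs

    def record(self, pose):
        self.samples.append((time.monotonic(), pose))

    def replay(self, render):
        """Call `render(pose)` for each sample, preserving original timing."""
        t0 = self.samples[0][0]
        start = time.monotonic()
        for t, pose in self.samples:
            time.sleep(max(0.0, (t - t0) - (time.monotonic() - start)))
            render(pose)
```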
Referring to fig. 1, a surgical robotic system 10 includes a control tower 20 that is connected to all of the components of the surgical robotic system 10 (including a surgeon console 30 and one or more mobile carts 60). Each mobile cart 60 includes a robotic arm 40 to which a surgical instrument 50 is removably coupled. The robotic arm 40 is also coupled to a mobile cart 60. The robotic system 10 may include any number of mobile carts 60 and/or any number of robotic arms 40.
The surgical instrument 50 is configured for use during minimally invasive surgery. In an embodiment, the surgical instrument 50 may be configured for open surgery. In further embodiments, the surgical instrument 50 may be an electrosurgical clamp configured to seal tissue by compressing the tissue between jaw members and applying electrosurgical current thereto. In yet further embodiments, the surgical instrument 50 may be a surgical stapler including a pair of jaws configured to grasp and clamp tissue, deploy a plurality of tissue fasteners (e.g., staples) simultaneously, and cut stapled tissue. In yet further embodiments, surgical instrument 50 may be a clip applier that includes a pair of jaws configured to apply a clip to tissue.
One of the robotic arms 40 may include a laparoscopic camera 51 configured to capture video of the surgical site. The laparoscopic camera 51 may be a stereoscopic endoscopic camera configured to capture two side-by-side (i.e., left and right) images of the surgical site to generate a video stream of the surgical scene. The laparoscopic camera 51 is coupled to a video processing device 56, which may be disposed within the control tower 20. The video processing device 56 may be any computing device configured to receive the video feed from the laparoscopic camera 51 and output a processed video stream.
The surgeon console 30 includes a first screen 32 that displays a video feed of the surgical site provided by a camera 51 of a surgical instrument 50 disposed on the robotic arm 40 and a second screen 34 that displays a user interface for controlling the surgical robotic system 10. The first screen 32 and the second screen 34 may be touch screens that allow various graphical user inputs to be displayed.
The surgeon console 30 also includes a plurality of user interface devices, such as foot pedals 36 and a pair of handle controllers 38a and 38b, which are used by a user to remotely control the robotic arms 40. The surgeon console further includes an armrest 33 for supporting the clinician's arms while manipulating the handle controllers 38a and 38b.
The control tower 20 includes a screen 23, which may be a touch screen that outputs a graphical user interface (GUI). The control tower 20 also serves as an interface between the surgeon console 30 and the one or more robotic arms 40. In particular, the control tower 20 is configured to control the robotic arms 40 based on a set of programmable instructions and/or input commands from the surgeon console 30, such that the robotic arms 40 and the corresponding surgical instruments 50 perform a desired sequence of movements in response to inputs from the foot pedals 36 and the handle controllers 38a and 38b. The foot pedals 36 may be used to engage and lock the handle controllers 38a and 38b, reposition the camera, and activate/deactivate electrosurgical energy. In particular, the foot pedals 36 may be used to perform a clutching action on the handle controllers 38a and 38b. Clutching is activated by depressing one of the foot pedals 36, which disconnects the handle controller 38a and/or 38b from the robotic arm 40 and the corresponding instrument 50 or camera 51 attached thereto (i.e., prevents movement inputs). This allows the user to reposition the handle controllers 38a and 38b without moving the robotic arm(s) 40 or the instrument 50 and/or camera 51, which is useful when reaching the control boundaries of the surgical space.
Each of the control tower 20, surgeon console 30, and robotic arm 40 includes a respective computer 21, 31, 41. The computers 21, 31, 41 are interconnected using any suitable communication network based on wired or wireless communication protocols. The term "network," whether plural or singular, as used herein, denotes a data network including, but not limited to, the internet, an intranet, a wide area network, or a local area network, without limitation as to the full scope of the definition of communication networks encompassed by the present disclosure. Suitable protocols include, but are not limited to, transmission control protocol/internet protocol (TCP/IP), user datagram protocol/internet protocol (UDP/IP), and/or datagram congestion control protocol (DCCP). Wireless communication may be implemented by one or more wireless configurations, e.g., radio frequency, optical, Wi-Fi, Bluetooth (an open wireless protocol for exchanging data over short distances from fixed and mobile devices using short-wavelength radio waves, creating personal area networks (PANs)), and/or ZigBee (a set of specifications for high-level communication protocols using small, low-power digital radios based on the IEEE 802.15.4-2003 wireless personal area network (WPAN) standard).
The computer 21, 31, 41 may include any suitable processor (not shown) operatively coupled to a memory (not shown) that may include one or more of volatile, non-volatile, magnetic, optical, or electrical media, such as read-only memory (ROM), random-access memory (RAM), electrically-erasable programmable ROM (EEPROM), non-volatile RAM (NVRAM), or flash memory. The processor may be any suitable processor (e.g., control circuitry) adapted to perform the operations, computations, and/or instruction sets described in this disclosure, including but not limited to hardware processors, field Programmable Gate Arrays (FPGAs), digital Signal Processors (DSPs), central Processing Units (CPUs), microprocessors, and combinations thereof. Those skilled in the art will appreciate that the processors may be replaced by using any logical processor (e.g., control circuitry) adapted to perform the algorithms, calculations, and/or instruction sets described herein.
Referring to fig. 2, each robotic arm 40 may include a plurality of links 42a, 42b, 42c interconnected at joints 44a, 44b, 44c, respectively. Other configurations of links and joints may be used, as known to those skilled in the art. The joint 44a is configured to secure the robotic arm 40 to the mobile cart 60 and defines a first longitudinal axis. Referring to fig. 3, the mobile cart 60 includes a lift 67 and a mounting arm 61 that provides a base for mounting the robotic arm 40. The lift 67 allows the mounting arm 61 to move vertically. The mobile cart 60 also includes a screen 69 for displaying information about the robotic arm 40. In embodiments, the robotic arm 40 may include any type and/or any number of joints.
The mounting arm 61 includes a first link 62a, a second link 62b, and a third link 62c, which provide lateral maneuverability of the robotic arm 40. The links 62a, 62b, 62c are interconnected at joints 63a and 63b, each of which may include an actuator (not shown) for rotating the links 62a and 62b relative to each other and relative to the link 62c. In particular, the links 62a, 62b, 62c are movable in respective lateral planes that are parallel to each other, thereby allowing the robotic arm 40 to extend relative to the patient (e.g., the operating table). In embodiments, the robotic arm 40 may be coupled to the operating table (not shown). The mounting arm 61 includes a control device 65 for adjusting the movement of the links 62a, 62b, 62c and of the lift 67. In embodiments, the mounting arm 61 may include any type and/or any number of joints.
The third link 62c may include a rotatable base 64 having two degrees of freedom. In particular, the rotatable base 64 includes a first actuator 64a and a second actuator 64b. The first actuator 64a is rotatable about a first fixed arm axis perpendicular to the plane defined by the third link 62c, and the second actuator 64b is rotatable about a second fixed arm axis transverse to the first fixed arm axis. The first actuator 64a and the second actuator 64b allow for full three-dimensional orientation of the robotic arm 40.
The actuator 48b of the joint 44b is coupled to the joint 44c via a belt 45a, and the joint 44c is in turn coupled to the joint 46b via a belt 45b. The joint 44c may include a transfer case coupling the belts 45a and 45b, such that the actuator 48b is configured to rotate each of the links 42b, 42c and the holder 46 relative to one another. More specifically, the links 42b, 42c and the holder 46 are passively coupled to the actuator 48b, which enforces rotation about a pivot point "P" located at the intersection of the first axis defined by the link 42a and a second axis defined by the holder 46. In other words, the pivot point "P" is the remote center of motion (RCM) of the robotic arm 40. Thus, the actuator 48b controls the angle θ between the first axis and the second axis, allowing orientation of the surgical instrument 50. Because the links 42a, 42b, 42c and the holder 46 are interconnected via the belts 45a and 45b, the angles between the links 42a, 42b, 42c and the holder 46 are also adjusted to achieve the desired angle θ. In embodiments, some or all of the joints 44a, 44b, 44c may include actuators to eliminate the need for mechanical linkages.
The joints 44a and 44b include actuators 48a and 48b configured to drive the joints 44a, 44b, 44c relative to each other via a series of belts 45a and 45b or other mechanical linkages (e.g., drive rods, cables, or levers). In particular, the actuator 48a is configured to rotate the robotic arm 40 about a longitudinal axis defined by the link 42a.
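The angle θ between the two axes meeting at the RCM can be computed directly from the axis directions. A minimal sketch, assuming the axes are available as unit-length 3D vectors; the function name is illustrative.

```python
import numpy as np

def rcm_angle(axis_link, axis_holder):
    """Angle theta (degrees) between the first axis (defined by link 42a)
    and the second axis (defined by holder 46), meeting at pivot point P.
    Inputs are assumed to be unit-length 3D direction vectors."""
    cos_theta = np.clip(np.dot(axis_link, axis_holder), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))
```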
Referring to fig. 2, the holder 46 defines a second longitudinal axis and is configured to receive an instrument drive unit (IDU) 52 (fig. 1). The IDU 52 is configured to couple to an actuation mechanism of the surgical instrument 50 or the camera 51 and is configured to move (e.g., rotate) and actuate the instrument 50 and/or camera 51. The IDU 52 transfers actuation forces from its actuators to the surgical instrument 50 to actuate components of the end effector 49 of the surgical instrument 50. The holder 46 includes a sliding mechanism 46a configured to move the IDU 52 along the second longitudinal axis defined by the holder 46. The holder 46 also includes a joint 46b that rotates the holder 46 relative to the link 42c. During endoscopic procedures, the instrument 50 may be inserted through an endoscopic access port 55 (fig. 3) held by the holder 46. The holder 46 also includes a port lock 46c (fig. 2) for securing the access port 55 to the holder 46.
The robotic arm 40 also includes a plurality of manual override buttons 53 (fig. 1), disposed on the IDU 52 and the mounting arm 61, that may be used in a manual mode. The user may press one or more of the buttons 53 to move the component associated with the respective button 53.
Referring to fig. 4, each of the computers 21, 31, 41 of the surgical robotic system 10 may include a plurality of controllers, which may be implemented in hardware and/or software. The computer 21 of the control tower 20 includes a controller 21a and a safety observer 21b. The controller 21a receives data from the computer 31 of the surgeon console 30 regarding the current position and/or orientation of the handle controllers 38a and 38b and the state of the foot pedals 36 and other buttons. The controller 21a processes these input positions to determine the desired drive commands for each joint of the robotic arm 40 and/or the IDU 52 and communicates these commands to the computer 41 of the robotic arm 40. The controller 21a also receives the actual joint angles measured by encoders of the actuators 48a and 48b and uses this information to determine force feedback commands, which are transmitted back to the computer 31 of the surgeon console 30 to provide haptic feedback through the handle controllers 38a and 38b. The safety observer 21b performs validity checks on the data going into and out of the controller 21a and, if an error in the data transmission is detected, notifies a system fault handler to place the computer 21 and/or the surgical robotic system 10 into a safe state.
The controller 21a is coupled to a storage device 22a, which may be a non-transitory computer readable medium, configured to store any suitable computer data, such as software instructions executable by the controller 21 a. The controller 21a also includes a transient memory 22b for loading instructions and other computer readable data during execution of the instructions. In embodiments, other controllers of system 10 include similar configurations.
The computer 41 includes a plurality of controllers, namely, a main cart controller 41a, a mounting arm controller 41b, a robotic arm controller 41c, and an instrument drive unit (IDU) controller 41d. The main cart controller 41a receives and processes the joint commands from the controller 21a of the computer 21 and communicates them to the mounting arm controller 41b, the robotic arm controller 41c, and the IDU controller 41d. The main cart controller 41a also manages instrument exchanges and the overall state of the mobile cart 60, the robotic arm 40, and the IDU 52. The main cart controller 41a also communicates the actual joint angles back to the controller 21a.
Each of the joints 63a and 63b and the rotatable base 64 of the mounting arm 61 is a passive joint (i.e., no actuator is present therein) that allows manual adjustment by a user. The joints 63a and 63b and the rotatable base 64 include brakes that are disengaged by the user to configure the mounting arm 61. The mounting arm controller 41b monitors the slippage of each of the joints 63a and 63b and the rotatable base 64 of the mounting arm 61 when the brakes are engaged; when the brakes are disengaged, the joints may be moved freely by the operator, without affecting the control of the other joints. The robotic arm controller 41c controls each joint 44a and 44b of the robotic arm 40 and calculates the desired motor torques required for gravity compensation, friction compensation, and closed-loop position control of the robotic arm 40. The robotic arm controller 41c calculates movement commands based on the calculated torques. The calculated motor commands are then transmitted to one or more of the actuators 48a and 48b of the robotic arm 40. The actual joint positions are then transmitted back to the robotic arm controller 41c by the actuators 48a and 48b.
The IDU controller 41d receives the desired joint angles of the surgical instrument 50 (e.g., wrist angle and jaw angle) and calculates the desired currents for the motors of the IDU 52. The IDU controller 41d calculates the actual angles based on the motor positions and transmits the actual angles back to the main cart controller 41a.
The robotic arm 40 is controlled in response to the pose of the handle controller controlling it (e.g., the handle controller 38a), which is converted into a desired pose of the robotic arm 40 through a hand-eye transform function executed by the controller 21a. The hand-eye transform function, as well as the other functions described herein, is embodied in software executable by the controller 21a or any other suitable controller described herein. The pose of one of the handle controllers 38a may be embodied as a coordinate position and a roll-pitch-yaw (RPY) orientation relative to a coordinate reference frame fixed to the surgeon console 30. The desired pose of the instrument 50 is relative to a fixed frame on the robotic arm 40. The pose of the handle controller 38a is then scaled by a scaling function executed by the controller 21a. In embodiments, the coordinate position may be scaled down and the orientation may be scaled up by the scaling function. In addition, the controller 21a may also execute a clutching function, which disengages the handle controller 38a from the robotic arm 40. In particular, the controller 21a stops transmitting movement commands from the handle controller 38a to the robotic arm 40 if certain movement limits or other thresholds are exceeded, essentially acting like a virtual clutch mechanism, e.g., limiting mechanical input from effecting mechanical output.
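The scaling and clutching behavior just described can be summarized in a short sketch. This is a schematic of the behavior only, not the controller 21a's actual code; the scale factor and all names are assumptions.

```python
import numpy as np

MOTION_SCALE = 0.25  # hypothetical: translation scaled down 4:1


def scale_pose(delta_pos, delta_rpy, clutched):
    """Map a handle-controller pose increment to an instrument pose increment.

    Translation is scaled down for precision; the roll-pitch-yaw orientation
    is passed through. While the clutch is engaged, no motion is forwarded,
    so the handles can be repositioned freely.
    """
    if clutched:
        return np.zeros(3), np.zeros(3)
    return MOTION_SCALE * np.asarray(delta_pos), np.asarray(delta_rpy)


# Example: a 20 mm handle motion becomes a 5 mm instrument motion.
dp, drpy = scale_pose([0.02, 0.0, 0.0], [0.0, 0.1, 0.0], clutched=False)
```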
The desired pose of the robotic arm 40, based on the pose of the handle controller 38a, is then passed through an inverse kinematics function executed by the controller 21a. The inverse kinematics function calculates the angles for the joints 44a, 44b, 44c of the robotic arm 40 that achieve the scaled and adjusted pose input by the handle controller 38a. The calculated angles are then passed to the robotic arm controller 41c, which includes a joint axis controller having a proportional-derivative (PD) controller, a friction estimator module, a gravity compensator module, and a two-sided saturation block configured to limit the commanded torque of the motors of the joints 44a, 44b, 44c.
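A minimal sketch of one joint-axis control step follows, combining the PD law, gravity compensation, and the two-sided saturation block named above. The friction estimator is omitted, and all gains and limits are placeholders.

```python
import numpy as np

def joint_torque(q_des, q, qd, kp, kd, tau_gravity, tau_limit):
    """One PD joint-axis control step with gravity compensation.

    q_des: desired joint angle from inverse kinematics
    q, qd: measured joint angle and velocity (from the encoders)
    tau_gravity: gravity-compensation torque for the current configuration
    tau_limit: commanded-torque limit enforced by the saturation block
    """
    tau = kp * (q_des - q) - kd * qd + tau_gravity
    return np.clip(tau, -tau_limit, tau_limit)  # two-sided saturation
```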
Referring to fig. 5, the surgical robotic system 10 is set up around an operating table 90. The system 10 includes the mobile carts 60a-d, which may be numbered "1" through "4." During setup, each of the carts 60a-d is positioned around the operating table 90. The position and orientation of the carts 60a-d depend on a plurality of factors, such as the placement of a plurality of access ports 55a-d, which in turn depends on the procedure being performed. Once the port placement is determined, the access ports 55a-d are inserted into the patient, and the carts 60a-d are positioned so that the instruments 50 and the laparoscopic camera 51 can be inserted into the corresponding ports 55a-d.
During use, each of the robotic arms 40a-d is attached to one of the access ports 55a-d inserted into the patient by securing the port lock 46c (fig. 2) to the access port 55 (fig. 3). The IDU 52 is attached to the holder 46, and a sterile interface module (SIM) 43 is then attached to a distal portion of the IDU 52. Thereafter, the instrument 50 is attached to the SIM 43. The instrument 50 is then inserted through the access port 55 by moving the IDU 52 along the holder 46. The SIM 43 includes a plurality of drive shafts configured to transmit rotation of the corresponding motors of the IDU 52 to the instrument 50, thereby actuating the instrument 50. In addition, the SIM 43 provides a sterile barrier between the instrument 50 and the other components of the robotic arm 40, including the IDU 52. The SIM 43 is also configured to secure a sterile drape (not shown) to the IDU 52.
During setup, an assistant access port 55e is also inserted into the patient and is used to insert any suitable surgical instrument 70. The instrument 70 may be a manual, robotic, or otherwise powered surgical instrument, an additional endoscopic camera, an ultrasound probe, or any other surgical instrument that is not directly controlled by the robotic system 10. In embodiments, more than one assistant access port 55e may be used to accommodate additional instruments.
The system 10 includes an external vision tracking unit 80 (e.g., one or more cameras or infrared sensors) configured to track the positions of the access ports 55a-e, the robotic arms 40, etc. The tracking unit 80 may include one or more position sensors, which may be any suitable white-light or infrared camera, electromagnetic sensor, magneto-resistive sensor, radio-frequency sensor, or any other sensor suitable for sensing the position of a navigation marker. The tracking unit 80 may be configured to detect markers disposed on the access ports 55a-e. The markers may be passive tracking elements (e.g., reflectors) for transmitting an optical signal (e.g., reflecting light emitted by the tracking unit 80). Alternatively, the markers may include a radio-opaque material that is identifiable and trackable by the tracking unit 80. In other configurations, active tracking markers may be employed. An active tracking marker may be, for example, a light-emitting diode that emits light (such as infrared light). Both active and passive arrangements are possible. The markers may be arranged in defined or known positions and orientations relative to one another to allow the tracking unit 80 to determine the positions of the access ports 55a-e relative to each other. The access ports 55a-d may be registered with a particular robotic arm 40 and the corresponding instrument 50, and the access port 55e may be registered with the assistant instrument 70, to allow the surgical robotic system 10 to determine the position and/or orientation of the instruments 50, the camera 51, and the instrument 70 within a defined space, such as the surgical field.
Referring to fig. 6, the first screen 32 of the surgeon console 30 displays a GUI 100 that provides the video feed 102 of the camera 51. The video feed 102 shows the field of view of the camera 51 and may show the surgical site, the instrument 50, the instrument 70, etc. The video processing device 56 is configured to output the GUI 100, which may be displayed on any screen of the system 10, e.g., the first screen 32 of the surgeon console 30 and the screen 23 of the control tower 20.
Fig. 7 illustrates a method for providing instructions to a bedside assistant via the GUI 100 displayed on the screen 23 of the control tower 20. The method may be embodied in software instructions executable by a processor (e.g., the controller 21a), in particular as a software application having the GUI 100 for providing virtual instrument positions. The virtual instrument may be moved by the surgeon through the surgeon console 30 and used by the bedside assistant as a guide for moving the assistant instrument 70.
At step 200, the tracking unit 80 is used to determine the external locations of the access ports 55a-e and provide these locations to the system 10. The position data is provided to the video processing device 56 for outputting the virtual instrument 170 in the GUI 100 as an overlay on the video feed 102 (fig. 6). In particular, the access ports 55a-e are registered in a world coordinate system, which allows the camera 51 and the instrument(s) 50 to be registered relative to each other.
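With all ports registered in a shared world coordinate system, expressing one port's pose in the endoscope-camera frame is a composition of homogeneous transforms. A sketch under the assumption that the tracking unit 80 reports poses as 4x4 matrices; the function name is illustrative.

```python
import numpy as np

def pose_in_camera(T_world_camera, T_world_port):
    """Express an access-port pose in the endoscope-camera frame.

    T_world_camera, T_world_port: 4x4 homogeneous transforms of the camera
    port and another access port, both in the shared world frame reported
    by the external tracking unit.
    """
    return np.linalg.inv(T_world_camera) @ T_world_port
```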
At step 202, the controller 21a determines whether the instrument 70 is within the field of view of the camera 51. To do so, the video processing device 56 may use a machine learning image processing algorithm. In embodiments, the machine learning algorithm may include a convolutional neural network (CNN) and/or a support vector machine (SVM). The CNN may be trained on prior data (e.g., images of various instruments and devices). The video processing device 56 is configured to communicate with the controller 21a to inform the controller 21a whether the instrument 70 is present.
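The field-of-view check at step 202 reduces to gating on detector output. The sketch below assumes a generic trained detector (the disclosure mentions CNNs and SVMs but no specific model or API); the label string and confidence threshold are hypothetical.

```python
def instrument_in_view(frame, detector, min_confidence=0.8):
    """Return True if the assistant instrument is detected in the camera frame.

    `detector` stands in for any trained model that returns
    (label, confidence, bounding_box) tuples for a given image.
    """
    return any(label == "assistant_instrument" and conf >= min_confidence
               for label, conf, _box in detector(frame))
```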
If the instrument 70 is outside the field of view of the camera 51, the controller 21a may, at step 204, output a message on the GUI 100 displayed on the screen 23 of the control tower 20 and/or the first screen 32 of the surgeon console 30. The message may include directional arrows for moving the camera 51 and/or the instrument 70 such that the instrument 70 comes within the field of view of the camera 51.
Once the instrument 70 is visible to the camera 51, at step 206 the video processing device 56 renders the virtual instrument 170 in the video feed 102 to generate an augmented-reality video feed. The video processing device 56 is configured to load the 3D model, physics model, and kinematics of the assistant instrument 70, which are used to render the virtual instrument 170 and simulate its movement. The virtual instrument 170 may be rendered in a virtual 3D space of the field of view of the camera 51 based on a depth map of the surgical site. Image processing techniques may be applied to the stereoscopic video feed to perform the depth mapping. As noted above, the virtual instrument 170 is displayed both on the first screen 32 of the surgeon console 30 and on the screen 23 of the control tower 20, each of which receives the feed of the camera 51.
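Depth mapping from a calibrated, rectified stereoscopic feed can be sketched with a standard block-matching approach, e.g., OpenCV's semi-global matcher. This stands in for whatever stereo-reconstruction method the system actually uses; all parameter values are illustrative.

```python
import cv2
import numpy as np

def depth_map(left, right, focal_px, baseline_m):
    """Dense depth (meters) from a rectified stereo endoscope pair.

    focal_px: focal length in pixels, baseline_m: camera baseline in meters,
    both from the stereo calibration mentioned in the text.
    """
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    disp = matcher.compute(cv2.cvtColor(left, cv2.COLOR_BGR2GRAY),
                           cv2.cvtColor(right, cv2.COLOR_BGR2GRAY))
    disp = disp.astype(np.float32) / 16.0  # SGBM returns fixed-point disparity
    disp[disp <= 0] = np.nan               # mask invalid matches
    return focal_px * baseline_m / disp    # depth via triangulation
```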
At step 208, the surgeon may control the virtual instrument 170. This may be done in the same manner as controlling the robotic arm 40 and the instrument 50 and/or camera 51 coupled thereto. The surgeon selects the virtual instrument 170 through one of the handle controllers 38a or 38b, which then moves the virtual instrument 170 as if it were an actual instrument, such as the instrument 50. Movement of the handle controller 38a or 38b moves the virtual instrument 170 in the enhanced video feed 102 based on the depth map of the scene and the 3D rendering of the model.
The video processing device 56 is configured to render a physics-based virtual 3D space based on the stereoscopic reconstruction (i.e., the depth map) from the video feed 102. This enables the creation of virtual boundaries in the virtual 3D space through which the virtual instrument 170 cannot move. The virtual boundary may be indicated to the user via haptic feedback through the handle controllers 38a and 38b, by adjusting the resistance of the motors of the controllers 38a and 38b. Accordingly, the surgeon console 30 may limit movement of the virtual instrument 170 based on the virtual boundary. In embodiments, the virtual instrument 170 may also be used as a virtual tool to measure distances in the video feed 102, as sketched below. This may be accomplished by any number of 3D user interface methods, such as moving the virtual instrument 170 between two points to measure a distance, like a virtual ruler.
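Two of the uses just described, stopping the virtual instrument at a depth-derived boundary and measuring distance like a virtual ruler, can be sketched as follows. The boundary lookup, the camera-frame convention, and the margin are all assumptions.

```python
import numpy as np

def clamp_to_boundary(tip_cmd, surface_depth_at, margin=0.002):
    """Stop the virtual-instrument tip a small margin short of the tissue
    surface recovered from the depth map. `surface_depth_at` is a
    hypothetical lookup returning scene depth along the tip's viewing ray."""
    tip = np.asarray(tip_cmd, dtype=float).copy()
    limit = surface_depth_at(tip) - margin
    tip[2] = min(tip[2], limit)  # camera-frame z assumed positive into the scene
    return tip

def virtual_ruler(p0, p1):
    """Distance between two 3D points marked with the virtual instrument."""
    return float(np.linalg.norm(np.asarray(p1) - np.asarray(p0)))
```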
Movement of the virtual instrument 170 may also include multiple movements or maneuvers in one or more directions until the virtual instrument 170 is in a desired position. At step 210, the final position of the virtual instrument 170 is displayed on the GUI 100, in particular on the video feed 102. In addition, the movement sequence taken to reach the final position may also be replayed on the GUI 100. These aids provide guidance to the assistant, who moves the actual (i.e., physical) assistant instrument 70 to the final position indicated by the virtual instrument 170. The GUI 100 may display directional arrows indicating how the instrument 70 should be moved to reach the final position.
At step 212, the video processing device 56 checks whether the assistant instrument 70 is in the same, or substantially the same, position as the virtual instrument 170. The video processing device 56 compares a plurality of keypoints of the assistant instrument 70 with corresponding keypoints of the virtual instrument 170 to determine whether there is 3D alignment between the instrument 70 and the virtual instrument 170. At step 214, if there is a mismatch between the position of the instrument 70 and that of the virtual instrument 170, the GUI 100 may indicate the mismatch using an alert or by rendering the virtual instrument 170 in a particular color (e.g., red) on the video feed 102. If there is alignment, at step 216, the GUI 100 may indicate the alignment by changing the color (e.g., to green), by a message, or the like. In embodiments, the color of the virtual instrument 170 may transition from red to green as a heat map to indicate the closeness of the alignment.
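The keypoint comparison and red-to-green heat-map coloring at steps 212-216 might look like the following sketch, where both instruments' keypoints are assumed to be available as matched 3D arrays and the tolerance is a placeholder.

```python
import numpy as np

def alignment_color(real_kps, virtual_kps, tol=0.005):
    """Compare matched 3D keypoints of the physical and virtual instruments.

    Returns (aligned, (R, G, B)): red when misaligned, green when the mean
    keypoint error is within tolerance, interpolated in between, mimicking
    the heat-map behavior described above."""
    err = np.linalg.norm(np.asarray(real_kps) - np.asarray(virtual_kps), axis=1).mean()
    t = float(np.clip(1.0 - err / (4 * tol), 0.0, 1.0))  # 1.0 = fully aligned
    return err <= tol, (int(255 * (1 - t)), int(255 * t), 0)
```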
In embodiments, in addition to or as an alternative to the video processing device 56 checking whether the assistant instrument 70 is in the same or substantially the same position as the virtual instrument 170, the system 10 may display a prompt to the user on the surgeon console 30 to confirm whether the assistant instrument 70 is in the position indicated by the virtual instrument 170. The prompts may be generated periodically and/or in response to the assistant instrument 70 approaching the position of the virtual instrument 170. Once the user affirmatively confirms that the assistant instrument 70 is in the indicated position, the GUI 100 may indicate this by changing the color (e.g., to green) or by another message. In embodiments, the color of the virtual instrument 170 may transition from red to green as a heat map to indicate the closeness of the alignment.
The disclosed systems and methods may be used with a variety of different surgical instruments, and several exemplary embodiments are provided below. Endoscopic stapling involves a complex workflow that requires positioning the stapler end effector at a precise angle and position. The use of a virtual stapler allows the assistant to guide the assistant instrument 70 (e.g., a physical stapler) to the exact location and orientation of the tissue to be stapled.
In further embodiments, the virtual instrument 170 may be a virtual laparoscopic ultrasound probe, and the instrument 70 may be an endoscopic ultrasound probe. A robotic system may use a robotically inserted probe that is manipulated by a grasper instrument to obtain ultrasound images of tissue. However, such probes are expensive and inconvenient to use, and they also occupy one of the robotic arms. Accordingly, the assistant instrument 70 may be an endoscopic ultrasound probe that the assistant positions as indicated by the virtual instrument 170, which marks the location where ultrasound imaging is desired. This allows the surgeon to move the virtual instrument 170 (e.g., a virtual ultrasound probe) along an organ to trace or confirm a tumor boundary or other critical structures.
In additional embodiments, the virtual instrument 170 may be a virtual endoscope, and the assistant instrument 70 may be an additional endoscope that may be controlled by the assistant or controlled directly by the surgeon through the surgeon console 30. The surgeon may clutch out and maneuver the virtual endoscope without the assistance of an assistant. The virtual endoscope may be used to render views of the surgical site from different angles without moving the camera 51. In embodiments, the feed from the virtual endoscope may be combined with the feed from the camera 51 to generate a wrap-around view of the surgical site or to simulate port hopping, i.e., switching the video feed displayed on the first screen 32 of the surgeon console 30.
The virtual endoscope may also be used to render simulated views of the anatomy from multiple angles using a continuously updated depth map. Machine learning may be implemented in the virtual endoscope, for example, methods based on generative adversarial networks (GANs), to render views that are not directly visible to the camera 51. The GAN may be trained on the anatomy of other patients from previous videos of similar surgeries to generate such views.
It will be understood that various modifications may be made to the embodiments disclosed herein. Thus, the above description should not be construed as limiting, but merely as exemplifications of the various embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the claims appended hereto.

Claims (20)

1. A surgical robotic system, comprising:
an assistant access port configured to receive an assistant instrument;
an endoscopic camera configured to generate a video feed of a surgical site and the assistant instrument;
a control tower including a first screen;
a surgeon console including a second screen and a handle controller, the surgeon console configured to receive user input; and
a video processing device configured to:
render a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed;
move the virtual instrument in the enhanced video feed in response to the user input;
output the enhanced video feed with the virtual instrument on the first screen and the second screen;
confirm whether the assistant instrument is positioned at the location of the virtual instrument; and
indicate on the first screen and the second screen whether the assistant instrument is positioned at the location of the virtual instrument.
2. The surgical robotic system of claim 1, further comprising:
a robotic arm including a robotic instrument; and
a robotic access port configured to receive the robotic instrument.
3. The surgical robotic system according to claim 2, wherein the surgeon console is configured to switch between controlling the robotic instrument and controlling the virtual instrument.
4. The surgical robotic system of claim 2, further comprising a tracking unit configured to track a location of the assistant access port and a location of the robotic access port.
5. The surgical robotic system of claim 4, wherein the video processing device is configured to determine the location of the assistant instrument based on the location of the assistant access port and the location of the robotic access port.
6. The surgical robotic system according to claim 1, wherein the video processing device is configured to render the virtual instrument based on 3D model data of the assistant instrument.
7. The surgical robotic system of claim 1, wherein the endoscopic camera is a stereoscopic camera and the processing device is configured to generate a depth map of the surgical site.
8. The surgical robotic system of claim 7, wherein the processing device is configured to generate a virtual boundary corresponding to a physical boundary of the surgical site based on the depth map.
9. The surgical robotic system according to claim 8, wherein the surgeon console is configured to limit the user input for controlling movement of the virtual instrument at the surgeon console based on the virtual boundary.
10. A non-transitory computer readable medium storing instructions that, when executed by a processor, cause the processor to perform a computer-implemented method for transmitting movement instructions using a virtual instrument, the method comprising:
receiving a video feed of a surgical site and an assistant instrument;
rendering a virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed, the enhanced video feed being displayed on a first screen of a control tower and a second screen of a surgeon console;
moving the rendered virtual instrument in the video feed in response to an input signal received from the surgeon console;
outputting the enhanced video feed with the virtual instrument on the first screen and the second screen;
prompting a user to confirm whether the assistant instrument is positioned at a location of the virtual instrument; and
indicating on the first screen and the second screen whether the assistant instrument is placed at the location of the virtual instrument.
11. The non-transitory computer readable medium of claim 10, wherein the computer-implemented method further comprises:
switching between controlling the virtual instrument and a robotic instrument coupled to a robotic arm and inserted through a robotic access port.
12. The non-transitory computer readable medium of claim 11, wherein the computer-implemented method further comprises:
tracking a position of an assistant access port through which the assistant instrument is inserted and a position of the robotic access port.
13. The non-transitory computer readable medium of claim 12, wherein the computer-implemented method further comprises:
determining a position of the assistant instrument based on the position of the assistant access port and the position of the robotic access port.
14. The non-transitory computer readable medium of claim 10, wherein the computer-implemented method further comprises:
rendering the virtual instrument based on 3D model data of the assistant instrument.
15. The non-transitory computer readable medium of claim 10, wherein the computer-implemented method further comprises:
generating a depth map of the surgical site.
16. The non-transitory computer-readable medium of claim 15, wherein the computer-implemented method further comprises:
generating a virtual boundary corresponding to a physical boundary of the surgical site based on the depth map.
17. The non-transitory computer readable medium of claim 16, wherein the computer-implemented method further comprises:
restricting movement of the virtual instrument at the surgeon console based on the virtual boundary.
18. A surgical robotic system, comprising:
a robotic arm including a robotic instrument;
a robotic access port configured to receive the robotic instrument;
an assistant access port configured to receive an assistant instrument;
an endoscopic camera configured to generate a video feed of a surgical site and the assistant instrument;
a control tower including a first screen;
a surgeon console including a second screen and a handle controller, the surgeon console configured to receive user inputs to control the robotic instrument and a virtual instrument; and
a video processing device configured to:
render the virtual instrument of the assistant instrument in the video feed to generate an enhanced video feed;
move the virtual instrument in the enhanced video feed in response to the user inputs;
output the enhanced video feed with the virtual instrument on the first screen and the second screen;
confirm whether the assistant instrument is placed at a location of the virtual instrument; and
indicate on the first screen and the second screen whether the assistant instrument is placed at the location of the virtual instrument.
19. The surgical robotic system of claim 18, wherein the endoscopic camera is a stereoscopic camera and the video processing device is configured to generate a depth map of the surgical site.
20. The surgical robotic system of claim 19, wherein the video processing device is configured to generate a virtual boundary corresponding to a physical boundary of the surgical site based on the depth map, and the surgeon console is configured to limit, based on the virtual boundary, the user inputs for controlling movement of the virtual instrument.
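The claims leave the depth-map and boundary computations unspecified; one possible realization, sketched below in Python/NumPy under an assumed pinhole intrinsic matrix K and arbitrary safety margins, back-projects the depth map into a bounding point cloud (claims 8 and 16) and rejects commanded motions that approach it (claims 9, 17, and 20):

import numpy as np

def boundary_from_depth(depth, K, margin=0.005):
    # Back-project the depth map into a 3D point cloud bounding the surgical
    # site, pulled toward the camera by a safety margin (meters, assumed).
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)]).reshape(3, -1).astype(float)
    pts = (np.linalg.inv(K) @ (pix * depth.reshape(1, -1))).T  # Nx3, meters
    return pts - np.array([0.0, 0.0, margin])

def limit_motion(tip, step, boundary, min_dist=0.002):
    # Reject a commanded step for the virtual instrument if the resulting tip
    # position would come within min_dist of the virtual boundary.
    target = tip + step
    gaps = np.linalg.norm(boundary - target, axis=1)
    return np.zeros(3) if gaps.min() < min_dist else step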

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202363437786P 2023-01-09 2023-01-09
US63/437,786 2023-01-09
PCT/IB2024/050029 WO2024150077A1 (en) 2023-01-09 2024-01-02 Surgical robotic system and method for communication between surgeon console and bedside assistant

Publications (1)

Publication Number Publication Date
CN120476370A (en) 2025-08-12

Family

ID=89619647

Family Applications (1)

Application Number    Priority Date    Filing Date    Status
CN202480006426.XA    2023-01-09    2024-01-02    Pending (published as CN120476370A)

Country Status (3)

Country    Publication
EP (1) EP4649371A1 (en)
CN (1) CN120476370A (en)
WO (1) WO2024150077A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2026018118A1 (en) * 2024-07-19 2026-01-22 Covidien Lp Surgical robotic system and method for preoperative planning

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8398541B2 (en) * 2006-06-06 2013-03-19 Intuitive Surgical Operations, Inc. Interactive user interfaces for robotic minimally invasive surgical systems
CN102341046B (en) * 2009-03-24 2015-12-16 伊顿株式会社 Surgical robot system and control method using augmented reality technology
KR20140112207A * 2013-03-13 2014-09-23 Samsung Electronics Co., Ltd. Augmented reality imaging display system and surgical robot system comprising the same
CN119451638A (en) * 2022-06-27 2025-02-14 柯惠Lp公司 Accessory port placement for minimally invasive or robotic-assisted surgery

Also Published As

Publication number Publication date
EP4649371A1 (en) 2025-11-19
WO2024150077A1 (en) 2024-07-18

Legal Events

Code    Title
PB01    Publication
SE01    Entry into force of request for substantive examination