CN208573801U - Surgical robot system - Google Patents
- Publication number: CN208573801U
- Application number: CN201820047256.1U
- Authority: CN (China)
- Legal status: Active (assumed; Google has not performed a legal analysis)
Abstract
The utility model provides a surgical robot system comprising an image fusion processing device, a static imaging device, a dynamic imaging device, a robotic arm device, and a vision imaging device. The static imaging device transmits 3D image information to the image fusion processing device; the vision imaging device transmits visual space information to the image fusion processing device; the image fusion processing device fuses this information and extracts a planned scanning position and a planned puncture path. The dynamic imaging device transmits dynamic information to the image fusion processing device, which fuses it into the 3D image information. The robotic arm device drives the surgical instrument to perform the puncture interventional operation, while the dynamic imaging device acquires the instrument's motion trajectory inside the patient in real time. The position of the object to be scanned can thus be located accurately, so that the surgical instrument reaches the target area in a single puncture; at the same time, deviation of the instrument's trajectory is avoided.
Description
Technical Field
The utility model relates to the technical field of medical equipment, and in particular to a surgical robot system.
Background
Interventional radiology has developed continuously, and image-guided percutaneous puncture interventional surgery has become a common minimally invasive treatment. In the clinic, CT, ultrasound, C-arm (DSA), MRI, and other imaging devices are generally used to acquire human body image data; surgical puncture path planning is performed on a workstation, and the doctor is guided through the operation to complete the puncture. Each imaging modality used for interventional navigation has advantages and disadvantages. CT or CBCT guidance, for example, yields high-definition 3D image data of the human body, but the 3D data cannot be acquired continuously and dynamically, the puncture path and needle track cannot be monitored in real time, and repeated positioning scans produce a large radiation dose. Ultrasound used alone acquires human body image data flexibly and dynamically, but its positioning precision is relatively low, its scanning range is narrow, and it is difficult to assess the exact condition of a target area within a large region of the body. With these imaging devices, the doctor cannot position accurately during a puncture operation; multiple tentative punctures are needed to reach the target area, causing considerable mechanical damage to the patient and potentially accidents (such as pneumothorax, internal bleeding, or organ damage) leading to medical malpractice.
SUMMARY OF THE UTILITY MODEL
Based on this, it is necessary to provide a surgical robot system that can position accurately and guarantee an accurate puncture trajectory, so as to solve the problem that existing puncture surgery requires multiple punctures to reach the target area because of inaccurate positioning.
The above object is achieved by the following technical solution:
a surgical robot system comprises an image fusion processing device, a static imaging device, a dynamic imaging device, a mechanical arm device and a visual imaging device;
the static imaging equipment is used for imaging 3D image information of an object to be scanned and transmitting the 3D image information to the image fusion processing equipment;
the vision imaging equipment is used for acquiring vision space information of a patient and an operation area on the sickbed equipment and transmitting the vision space information to the image fusion processing equipment;
the image fusion processing equipment fuses the 3D image information and the visual space information and extracts a planned scanning position of the dynamic imaging equipment and a puncture planned path of a surgical instrument;
the dynamic imaging equipment is used for imaging the dynamic information of the object to be scanned at the planned scanning position in real time and transmitting the dynamic information to the image fusion processing equipment, and the image fusion processing equipment fuses the dynamic information into the 3D image information;
the mechanical arm device is used for carrying the surgical instrument and driving the surgical instrument to move along its planned puncture path to perform a puncture interventional operation.
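The roles of the five devices described above can be summarized as a simple data-flow sketch. Everything in the following Python snippet is a hypothetical stand-in for hardware and software the utility model describes only abstractly; none of the names come from the document itself:

```python
# Hypothetical orchestration of the five devices; each function and data
# structure is an illustrative stand-in, not part of the utility model.

def fuse(*sources):
    """Stand-in for multi-modal image fusion: merge labelled data sources."""
    merged = {}
    for src in sources:
        merged.update(src)
    return merged

def run_procedure():
    static_3d = {"ct_volume": "in-vivo 3D image"}       # static imaging device
    visual = {"scene": "patient + devices on the bed"}  # vision imaging device

    plan = fuse(static_3d, visual)                      # image fusion workstation
    scan_pose = ("planned scanning position", plan)     # for the dynamic probe
    puncture_path = ("planned puncture path", plan)     # for the robotic arm

    dynamic_2d = {"ultrasound": "real-time slice"}      # dynamic imaging device
    live_view = fuse(plan, dynamic_2d)                  # dynamic fusion image
    return scan_pose, puncture_path, live_view
```

The robotic arm would then execute `puncture_path` while `live_view` is refreshed continuously, mirroring the monitoring loop the claims describe.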
In one embodiment, the static imaging apparatus includes a static imaging device having a scanning cavity and a static imaging workstation, the static imaging device is configured to image 3D image information of the object to be scanned and feed the 3D image information back to the static imaging workstation, and the static imaging workstation transmits the 3D image information to the image fusion processing apparatus.
In one embodiment, the dynamic imaging apparatus includes an ultrasound imaging device and a dynamic imaging workstation, the ultrasound imaging device images dynamic information of the object to be scanned at the planned scanning position and feeds the dynamic information back to the dynamic imaging workstation, and the dynamic imaging workstation transmits the dynamic information to the image fusion processing apparatus.
In one embodiment, the mechanical arm device includes a first mechanical arm and a mechanical arm workstation, the first mechanical arm carries the surgical instrument, and the mechanical arm workstation receives the puncture planning path transmitted by the image fusion processing device, controls the first mechanical arm to move according to the puncture planning path, and executes a puncture interventional operation.
In one embodiment, the mechanical arm device further includes a second mechanical arm, the second mechanical arm carries an ultrasound probe of the ultrasound imaging apparatus, and the mechanical arm workstation receives the planned scanning position transmitted by the image fusion processing device and controls the second mechanical arm to move to the planned scanning position, so as to acquire dynamic information of the object to be scanned in real time.
In one embodiment, the vision imaging device is movably arranged near the hospital bed device via a support column and is independent of the mechanical arm device;
or the mechanical arm device further comprises a supporting portion and a third mechanical arm; the third mechanical arm is in transmission connection with the mechanical arm workstation, the visual imaging device is mounted on the third mechanical arm, and the first mechanical arm, the second mechanical arm, and the third mechanical arm are all mounted on the supporting portion.
In one embodiment, the image fusion processing device includes an image fusion display and an image fusion processing workstation; the image fusion processing workstation receives the 3D image information, the dynamic information, and the visual space information respectively, fuses them, displays the result on the image fusion display, and transmits the planned scanning position and the planned puncture path to the mechanical arm workstation.
In one embodiment, the mechanical arm device further includes a puncture pose detection piece and an ultrasonic pose detection piece;
the puncture pose detection piece is used for detecting the position and/or the inclination angle of the surgical instrument and transmitting the position and/or the inclination angle information to the mechanical arm workstation, and the image fusion processing workstation fuses the position and/or the inclination angle information with the surgical instrument information in the visual space information;
the ultrasonic pose detection piece is used for detecting the position and/or the inclination angle of the ultrasonic probe and transmitting the position and/or the inclination angle information to the mechanical arm workstation, and the image fusion processing workstation fuses the position and/or the angle information of the ultrasonic probe with the 3D image information and the ultrasonic probe information in the visual space information respectively.
In one embodiment, the surgical robot system further comprises a distance measuring device, and the information acquired by the distance measuring device is used for calculating the position information of the surgical instrument and the ultrasonic probe and acquiring the body surface information of the patient;
the distance measuring device is integrated with the vision imaging device; or,
the distance measuring equipment is arranged on the first mechanical arm or the second mechanical arm.
In one embodiment, the mechanical arm device further comprises a console in transmission connection with the mechanical arm workstation; the console can control the movement of the first mechanical arm and the second mechanical arm via the mechanical arm workstation;
and/or the surgical robot system is further integrated with a remote communication module, and the remote communication module is used for receiving remote control information so as to control the first mechanical arm and the second mechanical arm to move.
With the above technical solution, the beneficial effects of the utility model are as follows:
the utility model discloses a surgical robot system, static imaging device scans the 3D image information of the object to be scanned of patient and transmits the 3D image information to image fusion processing equipment, visual imaging device collects the visual space information of each equipment and patient in the operation area and transmits the visual space information to image fusion processing equipment, image fusion processing equipment fuses 3D image information and visual space to extract the position of the object to be scanned and generates the puncture planning path of surgical instrument for puncture intervention operation movement and the planning scanning position of dynamic imaging device, dynamic imaging device scans the dynamic information of the object to be scanned of patient in real time at the planning scanning position and transmits the dynamic information to image fusion processing equipment, image fusion processing equipment fuses the dynamic information into the 3D image information, thus, mechanical arm device controls the movement of surgical instrument according to the puncture planning path, the dynamic imaging equipment is adopted to monitor the motion track of the surgical instrument in the patient body in real time; the problem that the target area can be reached only by multiple times of puncture due to inaccurate positioning of the existing puncture surgery is effectively solved; therefore, the position of the object to be scanned can be accurately positioned, so that the surgical instrument can reach the target area by one-time puncture, and the large mechanical damage of a patient caused by multiple punctures of the surgical instrument is avoided; meanwhile, the dynamic imaging device can image the object to be scanned in real time, and when the surgical instrument enters the body of the patient, the dynamic imaging device can monitor the puncture planning path and the actual motion track of the surgical instrument, so that the 
track of the surgical instrument is prevented from deviating, the accuracy of the operation is improved, and the risk is reduced.
Drawings
Fig. 1 is a schematic front view of a surgical robot system according to an embodiment of the present invention;
FIG. 2 is a schematic top view of the surgical robotic system of FIG. 1;
FIG. 3 is a block diagram of a control portion of the surgical robotic system shown in FIG. 1;
fig. 4 is a schematic front view of a surgical robot system according to another embodiment of the present invention;
wherein:
100-a surgical robotic system;
120-an image fusion processing device;
121-image fusion processing workstation;
122-image fusion display;
130-a static imaging device;
131-a static imaging device;
132-a static imaging workstation;
133-static image display;
140-a visual imaging device;
150-a dynamic imaging device;
151-dynamic imaging workstation;
152-an ultrasound probe;
153-dynamic image display;
160-a robotic arm device;
161-mechanical arm workstation;
162-a first robot arm;
163-a second robotic arm;
164-robotic arm image display;
165-console;
200-patient;
300-a surgical instrument;
400-hospital bed equipment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clearly understood, the surgical robot system of the present invention will be further described in detail by the following embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the invention.
The numbering of components such as "first", "second", etc. is used herein only to distinguish the objects described, and does not carry any sequential or technical meaning. The terms "connected" and "coupled", when used in this application, unless otherwise indicated, include both direct and indirect connections (couplings). In the description of the present utility model, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations or positional relationships based on those shown in the drawings; they are used only for convenience and simplicity of description, do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting.
In the present application, unless expressly stated or limited otherwise, a first feature may be directly on or directly under a second feature, or in indirect contact with it via intermediate members. Also, a first feature "on", "over", or "above" a second feature may be directly or obliquely above the second feature, or may simply mean that the first feature is at a higher level than the second feature. A first feature "under", "below", or "beneath" a second feature may be directly or obliquely below the second feature, or may simply mean that the first feature is at a lower level than the second feature.
Referring to fig. 1, the present utility model provides a surgical robot system 100 for performing puncture interventional operations on a patient 200. The surgical robot system 100 fuses the visual space information of the operation area with the dynamic and static information of the patient 200 and the object to be scanned to guide the puncture interventional operation, realizing accurate positioning and accurate puncture, improving the success rate of the puncture interventional operation, and reducing the risk of medical accidents caused by accidental injury. It is understood that the object to be scanned includes, but is not limited to, a focal region, and may also include a non-focal region.
Referring to fig. 1 to 3, in an embodiment of the present utility model, the surgical robot system 100 includes an image fusion processing device 120, a static imaging device 130, a dynamic imaging device 150, a mechanical arm device 160, and a visual imaging device 140; the image fusion processing device 120 is in transmission connection with the static imaging device 130, the dynamic imaging device 150, the visual imaging device 140, and the mechanical arm device 160, respectively. The transmission connection here includes a communication connection and/or an electrical connection, and the communication connection may be wired or wireless; the type of transmission connection is not limited in principle, as long as information can be transmitted between the respective devices. The surgical robot system 100 of the utility model is used in conjunction with the hospital bed device 400, which may be a part of the surgical robot system 100 or independent of it. The hospital bed device 400 supports the patient 200 during examination, interventional puncture, and other operations. The static imaging device 130 images 3D image information of an object to be scanned and transmits it to the image fusion processing device 120. The vision imaging device 140 acquires the visual space information of the patient 200 and the operation region on the hospital bed device 400 and transmits it to the image fusion processing device 120. The image fusion processing device 120 fuses the 3D image information and the visual space information and extracts the planned scanning position of the dynamic imaging device 150 and the planned puncture path of the surgical instrument 300.
The dynamic imaging device 150 images the dynamic information of the object to be scanned at the planned scanning position in real time and transmits it to the image fusion processing device 120, which fuses the dynamic information into the 3D image information. The mechanical arm device carries the surgical instrument 300 and drives it along the planned puncture path to perform the puncture interventional operation, while the dynamic imaging device 150 acquires the motion trajectory of the surgical instrument 300 inside the body of the patient 200 in real time.
The static imaging device 130 performs a three-dimensional imaging scan of the object to be scanned of the patient 200 to obtain its 3D image information, that is, in-vivo 3D image information, and transmits it to the image fusion processing device 120. The vision imaging device 140 scans the patient 200 on the hospital bed device 400 as well as the various devices in the operation area (the image fusion processing device 120, the static imaging device 130, the dynamic imaging device 150, the mechanical arm device 160, the vision imaging device 140, and so on) to determine the positional relationships between these devices and the patient 200, generating extracorporeal visual space information, which it transmits to the image fusion processing device 120. The image fusion processing device 120 fuses the extracorporeal visual space information with the in-vivo 3D image information to form a multi-modal image, determining the position coordinates of the object to be scanned within the operation region and thereby realizing accurate positioning for surgical navigation. The image fusion processing device 120 can also plan the puncture interventional path automatically or manually from these position coordinates, extracting the preset planned scanning position of the dynamic imaging device 150 and the planned puncture path of the surgical instrument 300 on the robotic arm device 160. The dynamic imaging device 150 moves to the planned scanning position, performs a real-time dynamic imaging scan of the object to be scanned to obtain its dynamic information, and transmits that information to the image fusion processing device 120.
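One common way to realize the fusion of extracorporeal visual space information with in-vivo 3D image information is rigid point-based registration: corresponding landmarks (for example, body-surface features visible both to the vision system and in the 3D volume) yield a rotation and translation between the two coordinate spaces. The utility model does not specify the registration method, so the following Kabsch-style sketch is an illustrative assumption:

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch rigid registration: find R, t such that dst ~= R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. landmarks seen
    both by the vision system (src) and in the 3D image volume (dst).
    Point correspondences are an assumption; the utility model does not
    say how the two spaces are brought into register.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

With `R` and `t` in hand, any target located in the 3D volume can be expressed in the operating-room frame, which is what allows the planned scanning position and puncture path to be handed to the mechanical arms as physical coordinates.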
The image fusion processing device 120 fuses the dynamic information into the 3D image information according to the dynamic information and the corresponding position and angle of the dynamic imaging device 150, displaying a dynamic simulated fusion image (each cross section and the overall position) within the 3D image information and thereby forming a dynamic fusion image. The dynamic fusion image improves the definition of the object to be scanned; fusing the 2D dynamic information of the dynamic imaging device 150 into the 3D image information adds stereoscopic depth, helps the doctor judge spatial positions inside the patient's body during the operation, and makes it convenient for medical staff to observe the condition of the object to be scanned.
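Once the pose of the dynamic probe is known in the 3D image frame, each pixel of a 2D slice can be placed into the volume, which is the geometric core of fusing 2D dynamic information into 3D image information. The frame conventions and names below are illustrative assumptions, not taken from the utility model:

```python
import numpy as np

def slice_pixel_to_volume(u, v, pixel_mm, probe_pose):
    """Map an ultrasound-slice pixel (u, v) into the 3D image frame.

    probe_pose: 4x4 homogeneous transform of the probe's image plane in
    the static 3D image frame (e.g. derived from the fused probe
    position/inclination). pixel_mm: physical pixel size. The choice of
    axes (x = lateral, y = depth, z = 0 in-plane) is an assumption.
    """
    p_probe = np.array([u * pixel_mm, v * pixel_mm, 0.0, 1.0])
    return (probe_pose @ p_probe)[:3]
```

Applying this mapping to every pixel of each incoming slice is what lets the workstation paint the live 2D image at its true location inside the 3D volume.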
Meanwhile, the dynamic imaging device 150 also images the object to be scanned in real time and transmits the images to the image fusion processing device 120, which performs real-time fusion processing. The robotic arm device 160 moves the surgical instrument 300 to a safe region above the body surface of the patient 200 where the planned puncture path lies. When the mechanical arm device 160 drives the surgical instrument 300 along the planned puncture path to perform the puncture interventional operation on the object to be scanned, the dynamic imaging device 150 images the object in real time, acquires the motion trajectory of the surgical instrument 300 entering the body of the patient 200 in real time, and feeds the trajectory back to the image fusion processing device 120 for real-time fusion; the image fusion processing device 120 then compares the motion trajectory of the surgical instrument 300 with the planned puncture path to determine whether the instrument is following the preset path.
If the motion trajectory of the surgical instrument 300 coincides with the planned puncture path, the mechanical arm device 160 continues to drive the surgical instrument 300 through the puncture interventional operation. If the trajectory deviates from the planned path by a small angle, the image fusion processing device 120 transmits the deviation angle to the mechanical arm device 160, which finely adjusts the angle of the surgical instrument 300 so that its trajectory returns to the preset planned puncture path. If the trajectory deviates by a large angle, the image fusion processing device 120 transmits the deviation angle to the mechanical arm device 160, which either withdraws the surgical instrument 300 from the body of the patient 200 so the puncture interventional operation can be performed again, or suspends the movement of the surgical instrument 300 and outputs a prompt or alarm so that the medical staff can choose a follow-up plan.
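The three-way decision described above (continue, fine-adjust, or withdraw/alarm) reduces to comparing the angle between the planned and actual trajectory directions against two thresholds. The utility model only distinguishes "small" from "large" deviations, so the numeric thresholds below are placeholders:

```python
import numpy as np

def deviation_action(planned_dir, actual_dir, fine_deg=2.0, abort_deg=10.0):
    """Classify a trajectory deviation into the three responses above.

    fine_deg / abort_deg are illustrative placeholder thresholds; the
    utility model does not specify numeric values.
    """
    cos = np.dot(planned_dir, actual_dir) / (
        np.linalg.norm(planned_dir) * np.linalg.norm(actual_dir))
    angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
    if angle <= fine_deg:
        return "continue"          # trajectory is on the planned path
    if angle <= abort_deg:
        return "fine_adjust"       # robotic arm corrects the instrument angle
    return "withdraw_or_alarm"     # withdraw/re-plan, or pause and alert staff
```

In a real controller this check would run on every fused frame, so a drift is caught while the deviation angle is still in the fine-adjustment range.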
It should be noted that withdrawing the surgical instrument 300 from the body of the patient 200 causes certain damage to the patient 200. Through the image fusion processing device 120, the dynamic imaging device 150, and the static imaging device 130, the surgical robot system 100 of the utility model extracts the real-time motion trajectory of the surgical instrument 300 from the dynamic information and fuses it in simulation with the static 3D image information, so that the planned puncture path of the surgical instrument 300 is displayed in real time on the dynamic fusion image and deviation of the instrument's motion trajectory is avoided. In addition, in this embodiment, using the mechanical arm device 160 further ensures that the surgical instrument 300 moves along the planned puncture path, avoiding trajectory deviation. Even when medical staff perform the puncture interventional operation with the surgical instrument 300 held by hand, displaying its planned puncture path in real time on the dynamic fusion image still guarantees the precision of the operation and reduces the risk of the instrument's trajectory deviating.
Alternatively, the static imaging device 130, the dynamic imaging device 150, the robotic arm device 160, and the vision imaging device 140 may be moved near the hospital bed device 400 to image the patient 200 and the object to be scanned, perform the interventional puncture operation, and so on. Of course, in other embodiments of the utility model, the hospital bed device 400 can instead be moved to the static imaging device 130, the dynamic imaging device 150, the mechanical arm device 160, and the visual imaging device 140, as long as these devices can image the patient 200 and the object to be scanned on the hospital bed device 400 and perform the puncture interventional operation and other processing.
Optionally, the vision imaging device 140 also has a video recording function and records the working state of each device in the operation area in real time, especially the operation performed by the robotic arm device 160. The vision imaging device 140 transmits the video data to the image fusion processing device 120 for storage; after the operation, the video data can be read from the image fusion processing device 120 and used for later clinical teaching practice and the like.
Through the image fusion processing device 120, the surgical robot system 100 of the utility model fuses the information of the static imaging device 130 and the visual imaging device 140 in simulation, providing accurate navigation and positioning for the puncture interventional operation of the surgical instrument 300. The dynamic information of the dynamic imaging device 150 is fused into the 3D image information of the static imaging device 130 by the image fusion processing device 120, so the real-time motion trajectory of the surgical instrument 300 is extracted from the dynamic information and fused in simulation with the static 3D image information to form a dynamic fusion image; the planned puncture path of the surgical instrument 300 can be displayed on this image in real time, the planned path and the actual motion trajectory are monitored, and deviation of the trajectory is avoided. This effectively solves the problem that existing puncture surgery reaches the target area only after multiple punctures because of inaccurate positioning: the position of the object to be scanned is located accurately, the surgical instrument 300 reaches the target area in a single puncture, and the considerable mechanical damage caused by repeated punctures of the patient 200 is avoided. Moreover, navigating with the combined scanning of the static imaging device 130 and the dynamic imaging device 150 exploits the advantages of both devices, improving the precision of the dynamic fusion image and reducing the radiation dose from repeated scans by the static imaging device 130.
As an implementation manner, the static imaging apparatus 130 includes a static imaging device 131 having a scanning chamber and a static imaging workstation 132. The static imaging device 131 is communicatively connected to the static imaging workstation 132; it images the 3D image information of the object to be scanned and feeds the 3D image information back to the static imaging workstation 132, which transmits it to the image fusion processing apparatus 120. The static imaging device 131 performs three-dimensional static imaging on the object to be scanned of the patient 200 to acquire its 3D image information. The static imaging workstation 132 can process the 3D image information and transmit it to the image fusion processing apparatus 120. Meanwhile, the static imaging workstation 132 can also control the static imaging device 131 to perform the three-dimensional static imaging operation on the object to be scanned.
Further, the static imaging device 131 is movably disposed near the patient bed apparatus 400; during imaging, the static imaging device 131 moves to the patient bed apparatus 400 and aligns its scanning chamber with the object to be scanned. It can be understood that when the static imaging device 131 performs three-dimensional static imaging on the object to be scanned, it moves to the vicinity of the patient bed apparatus 400, and the 3D image information of the object to be scanned is acquired through the scanning chamber of the static imaging device 131. After the three-dimensional static imaging is completed, the static imaging device 131 needs to be moved away from the patient 200, so that the dynamic imaging apparatus 150 and the robotic arm apparatus 160 can operate on the object to be scanned; interference between the static imaging device 131 and the dynamic imaging apparatus 150 or robotic arm apparatus 160 is avoided, and the accuracy of the puncture interventional operation is ensured.
Illustratively, a guide rail may be arranged on the floor of the operation area near the patient bed apparatus 400, with the static imaging device 131 slidably arranged on the rail. During imaging, the static imaging device 131 slides along the guide rail to the vicinity of the patient bed apparatus 400; after imaging is completed, it slides along the guide rail away from the patient bed apparatus 400. Rollers may also be arranged at the bottom of the static imaging device 131, whose rolling likewise allows the static imaging device 131 to approach or move away from the patient bed apparatus 400. Of course, other configurations that enable movement of the static imaging device 131 may be used. For example, the static imaging device 131 may be a CT (Computed Tomography) scanner. Alternatively, the static imaging device 131 may be a CBCT (Cone Beam CT) scanner: during imaging, the CBCT is moved to the vicinity of the patient bed apparatus 400 and its C-arm is rotated so that the scanning cavity of the C-arm is aligned with the object to be scanned of the patient 200; after imaging is complete, the C-arm is rotated so that its scanning cavity is moved away from the patient 200, at which point the CBCT may be removed. Of course, in other embodiments of the present invention, the static imaging device 131 may also be an MR (Magnetic Resonance Imaging) device.
Optionally, the static imaging apparatus 130 further includes a static image display 133 communicatively connected to the static imaging workstation 132. The 3D image information of the object to be scanned obtained by the static imaging device 131 can be transmitted to the static image display 133 through the static imaging workstation 132 and displayed there, so that the medical staff can conveniently view it.
As an implementation manner, the dynamic imaging apparatus 150 includes an ultrasound imaging device and a dynamic imaging workstation 151 communicatively connected to it. The ultrasound imaging device images the dynamic information of the object to be scanned at the planned scanning position and feeds it back to the dynamic imaging workstation 151, which transmits it to the image fusion processing apparatus 120. The ultrasound imaging device dynamically images the object to be scanned of the patient 200 to acquire its dynamic information. The dynamic imaging workstation 151 can process the dynamic information and transmit it to the image fusion processing apparatus 120; it can also control the ultrasound imaging device to perform the dynamic imaging operation on the object to be scanned.
Moreover, before and after the puncture interventional operation, the ultrasonic imaging device images the object to be scanned of the patient 200 at the planned scanning position in real time, so that accurate positioning and accurate puncture of the surgical instrument 300 are achieved and the success rate of the puncture interventional operation is improved. It is understood that the ultrasound imaging device includes an ultrasound probe 152, which dynamically images the object to be scanned of the patient 200 to acquire its two-dimensional dynamic information. The image fusion processing device 120 can blend the dynamic information into the 3D image information to form a dynamic fusion image. The dynamic fusion image improves the definition of the object to be scanned; blending the two-dimensional dynamic information acquired by the ultrasonic probe into the 3D image information adds a stereoscopic impression, helps the doctor judge spatial positions inside the patient's body during the operation, and makes it convenient for the medical staff to observe the condition of the object to be scanned.
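The patent does not give the mathematics of blending the 2D ultrasound plane into the 3D image information. As a minimal illustrative sketch (not the patented method), assuming the probe pose is available as a 4x4 homogeneous transform from the probe's local frame into the 3D image frame, each ultrasound pixel could be mapped into 3D like this; the function name, pixel-spacing parameter, and plane convention are all hypothetical:

```python
import numpy as np

def ultrasound_pixel_to_3d(u, v, pixel_spacing_mm, probe_pose):
    """Map pixel (u, v) of the 2D ultrasound image into the 3D image frame.

    Assumed convention: the ultrasound plane is the probe's local x-y plane
    with its origin at the transducer face; probe_pose is a 4x4 homogeneous
    transform from that local frame to the 3D image frame.
    """
    # Pixel indices -> millimetres in the probe's local plane (z = 0).
    p_local = np.array([u * pixel_spacing_mm, v * pixel_spacing_mm, 0.0, 1.0])
    # Apply the pose to land in 3D image coordinates.
    p_world = probe_pose @ p_local
    return p_world[:3]

# Example: probe translated 10 mm along x and 5 mm along z, no rotation.
pose = np.eye(4)
pose[:3, 3] = [10.0, 0.0, 5.0]
pt = ultrasound_pixel_to_3d(u=4, v=2, pixel_spacing_mm=0.5, probe_pose=pose)
```

Doing this for every pixel of a frame places the whole ultrasound slice inside the 3D volume, which is the geometric core of the dynamic fusion image described above.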
Optionally, the dynamic imaging apparatus 150 further includes a dynamic image display 153 communicatively connected to the dynamic imaging workstation 151. The dynamic information of the object to be scanned obtained by the ultrasound imaging device can be transmitted to the dynamic image display 153 through the dynamic imaging workstation 151 and displayed there, so that the medical staff can conveniently view it.
As an implementation manner, the robotic arm apparatus 160 includes a first robotic arm 162 and a robotic arm workstation 161. The first robotic arm 162 is movable relative to the patient bed apparatus 400 and carries the surgical instrument 300; the robotic arm workstation 161 is communicatively connected to the first robotic arm 162, receives the puncture planning path transmitted by the image fusion processing apparatus 120, and controls the first robotic arm 162 to move and perform the puncture interventional operation according to the puncture planning path. The first robotic arm 162 has multiple degrees of freedom and can drive the surgical instrument 300 to any desired position, meeting the requirements of the puncture interventional operation. The robotic arm workstation 161 controls the movement of the first robotic arm 162, including but not limited to moving the surgical instrument 300 to a safe area above the body surface of the patient 200 where the puncture planning path is located, and moving the surgical instrument 300 to perform the puncture interventional procedure. After the robotic arm workstation 161 receives the puncture planning path transmitted by the image fusion processing apparatus 120, it controls the first robotic arm 162 to drive the surgical instrument 300 to the safe area above the body surface of the patient 200 where the puncture planning path is located; then, the robotic arm workstation 161 controls the first robotic arm 162 to drive the surgical instrument 300 along the puncture planning path and performs the puncture interventional operation.
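The patent does not specify how the workstation represents the puncture planning path for the arm controller. One simple sketch, under the assumption that the path is a straight line from a body-surface entry point to the target, is to discretize it into waypoints the arm can be commanded through (function and parameter names are hypothetical):

```python
import numpy as np

def interpolate_puncture_path(entry_point, target_point, n_steps):
    """Discretize a straight-line puncture path into n_steps waypoints
    from the body-surface entry point to the in-body target."""
    entry = np.asarray(entry_point, dtype=float)
    target = np.asarray(target_point, dtype=float)
    # Linear interpolation between entry (t=0) and target (t=1).
    return [tuple(entry + (target - entry) * t)
            for t in np.linspace(0.0, 1.0, n_steps)]

# Example: a 50 mm insertion straight along the z axis, in 6 waypoints.
waypoints = interpolate_puncture_path((0.0, 0.0, 0.0), (0.0, 0.0, 50.0), n_steps=6)
```

The controller would then command the arm through these waypoints in order, with the inverse kinematics and safety interlocks of the real system omitted here.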
It should be noted that the surgical instrument 300 of the present invention includes, but is not limited to, a puncture needle, a biopsy needle, an ablation needle, a scalpel for superficial surgery, and the like. The surgical instrument 300 may be mounted at any position on the first robotic arm 162; preferably, it is mounted at the end of the first robotic arm 162.
Further, the robotic arm apparatus 160 includes a second robotic arm 163 movable relative to the patient bed apparatus 400. The second robotic arm 163 carries the ultrasonic probe 152 of the ultrasound imaging device and is communicatively connected to the robotic arm workstation 161; the robotic arm workstation 161 receives the planned scanning position transmitted by the image fusion processing apparatus 120 and controls the second robotic arm 163 to move to the planned scanning position, so as to acquire the dynamic information of the object to be scanned in real time. The second robotic arm 163 has multiple degrees of freedom and can drive the ultrasonic probe 152 to any desired position, so that the ultrasonic probe 152 scans the object to be scanned at the planned scanning position. The robotic arm workstation 161 controls the movement of the second robotic arm 163, including but not limited to moving the ultrasound probe 152 to the planned scanning position on the body surface of the patient 200 and moving the ultrasound probe 152 to perform the dynamic scanning operation. The ultrasonic probe 152 may be mounted at any position on the second robotic arm 163; preferably, it is mounted at the end of the second robotic arm 163.
The first robotic arm 162 drives the surgical instrument 300 to perform the puncture interventional operation while the second robotic arm 163 drives the ultrasonic probe 152 to perform the dynamic imaging operation; the two arms can move simultaneously, closely simulating the two-handed operation of actual medical staff and making the system easy for medical staff to learn and use. Moreover, using the first robotic arm 162 to manipulate the surgical instrument 300 and the second robotic arm 163 to manipulate the ultrasonic probe 152 for imaging allows the puncture interventional operation to be performed automatically, reducing errors caused by manual operation and improving its success rate. For example, the first robotic arm 162 and the second robotic arm 163 may each be mounted on a movable base, facilitating their movement to the vicinity of the patient bed apparatus 400; of course, they may also be mounted on the same movable base.
Optionally, the robotic arm apparatus 160 further includes a robotic arm image display 164 connected to the robotic arm workstation 161. The robotic arm workstation 161 can transmit the motion status of the first robotic arm 162 and the second robotic arm 163 to the robotic arm image display 164 for display, so that the medical staff can conveniently observe the movement of both arms.
Optionally, the robotic arm apparatus 160 further comprises a console 165 communicatively connected to the robotic arm workstation 161; through the robotic arm workstation 161, the console 165 can control the movement of the first robotic arm 162 and the second robotic arm 163. The console 165 is provided with controls for the first robotic arm 162 and the second robotic arm 163. The console 165 supports automatic, semi-automatic, and manual operation modes and can be adapted to different clinical application scenarios, such as surgery on an infection-isolated patient.
Still optionally, the surgical robot system 100 further integrates a remote communication module for receiving remote control information to control the movement of the first robotic arm 162 and the second robotic arm 163. The remote communication module can receive remote control information and, through the robotic arm workstation 161, control the first robotic arm 162 and the second robotic arm 163 to move and perform the puncture interventional operation. This suits different clinical application scenarios, such as surgery on infection-isolated patients, remote surgery in remote areas, remote surgery in battlefield environments, remote surgery on spacecraft, and the like. The remote communication module may be integrated on the robotic arm workstation 161, on the console 165, on the image fusion processing apparatus 120, or in other devices.
As one possible embodiment, the visual imaging apparatus 140 is movably disposed near the patient bed apparatus 400 on a support column and is disposed independently of the robotic arm apparatus 160. The visual imaging apparatus 140 is movably disposed in the operating field to capture the positions of the various devices there, facilitating the fusion performed by the image fusion processing apparatus 120. Rollers or the like may be provided at the bottom of the support column to allow it to move; the support column can also be moved by means of a sliding rail; alternatively, the support column may itself be a multi-degree-of-freedom robotic arm. Of course, in another embodiment of the present invention, the robotic arm apparatus 160 further comprises a supporting portion and a third robotic arm; the third robotic arm is communicatively connected to the robotic arm workstation 161, the visual imaging apparatus 140 is installed on the third robotic arm, and the first robotic arm 162, the second robotic arm 163, and the third robotic arm are all installed on the supporting portion. The third robotic arm drives the visual imaging apparatus 140 to a desired position to collect the positions of the devices in the operating area, again facilitating the fusion performed by the image fusion processing apparatus 120. It is understood that the supporting portion may be a movable base; the supporting portion may further include a multi-degree-of-freedom robotic arm that drives the first robotic arm 162, the second robotic arm 163, and the third robotic arm to desired positions.
As an implementation manner, the image fusion processing apparatus 120 includes an image fusion display 122 and an image fusion processing workstation 121, which are communicatively connected; the image fusion processing workstation 121 is also communicatively connected to the static imaging workstation 132, the dynamic imaging workstation 151, and the robotic arm workstation 161, respectively. The image fusion processing workstation 121 receives the 3D image information, the dynamic information, and the visual space information, fuses them, and displays the result through the image fusion display 122; it also transmits the planned scanning position and the puncture planning path to the robotic arm workstation 161.
The image fusion processing workstation 121 fuses the visual space information and the 3D image information to form a multi-modal image, so as to accurately navigate and position the operation. The image fusion processing workstation 121 further fuses the dynamic information into the 3D image information, extracts the actual motion trajectory of the surgical instrument 300 from the dynamic information, and fuses it with the 3D image information to form a dynamic fusion image, so that the puncture planning path and the motion path of the surgical instrument 300 are monitored in real time; accurate positioning and accurate puncture are achieved, and the success rate of the puncture interventional operation is improved. Moreover, the image fusion processing workstation 121 can display the multi-modal image and the dynamic fusion image on the image fusion display 122, separately or fused; the dynamic fusion image on the image fusion display 122 can also display the puncture planning path accurately in real time, so as to prevent the motion trajectory of the surgical instrument 300 from deviating.
Further, the image fusion processing workstation 121 plans the planned scanning position of the ultrasound probe 152 and the puncture planning path of the surgical instrument 300 according to the 3D image information and the visual space information of the object to be scanned. That is, the image fusion processing workstation 121 can automatically plan the planned scanning position of the ultrasound probe 152 and the puncture planning path of the surgical instrument 300 from the information acquired by the respective devices. Of course, in other embodiments of the present invention, the medical personnel may also plan both manually.
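The patent leaves the planning criterion unspecified. As one hedged sketch of what automatic planning could look like, assume the fused images yield a target coordinate and a set of feasible body-surface entry points, and the planner simply prefers the shortest straight-line path (all names and the selection rule are assumptions, not the patented algorithm):

```python
import math

def plan_puncture(target, candidate_entries):
    """Pick the body-surface entry point giving the shortest straight-line
    path to the target; return (chosen entry, path length).

    In a real planner the candidates would already be filtered for
    obstacles (bone, vessels) extracted from the 3D image information.
    """
    entry = min(candidate_entries, key=lambda e: math.dist(e, target))
    return entry, math.dist(entry, target)

# Example: target 30 mm below the surface; two candidate entry points.
best_entry, length = plan_puncture(
    target=(0.0, 0.0, 30.0),
    candidate_entries=[(0.0, 0.0, 0.0), (20.0, 0.0, 0.0)],
)
```

The chosen entry point then fixes both the puncture planning path and, nearby, a planned scanning position from which the ultrasound probe 152 can view the needle trajectory.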
Further, the surgical robot system further includes a distance measuring device communicatively connected to the image fusion processing workstation 121; the information it acquires is used to calculate the position information of the surgical instrument 300 and the ultrasonic probe 152 and to acquire the body surface information of the patient 200. The distance measuring device can measure the position of each device in the operation region and transmit this position information to the image fusion processing workstation 121, ensuring that the 3D image information and the visual space information are accurately fused and that the dynamic information is accurately fused into the 3D image information. Accurate positioning and accurate puncture are thereby achieved, the success rate of the puncture interventional operation is improved, and the risk of medical malpractice caused by accidental injury is reduced. Illustratively, the distance measuring device is integrated with the visual imaging apparatus 140; of course, in other embodiments of the present invention, it may be disposed on the first robotic arm 162 or the second robotic arm 163.
Optionally, the robotic arm apparatus 160 further includes a puncture pose detection piece for detecting the position and/or inclination angle of the surgical instrument 300. The puncture pose detection piece is communicatively connected to the robotic arm workstation 161 and transmits the position and/or inclination angle information to it; this information is fused with the information of the surgical instrument 300 in the visual space information by the image fusion processing workstation 121. The puncture pose detection piece can detect the position and/or angle of the surgical instrument 300 in real time and transmit it to the image fusion processing workstation 121 via the robotic arm workstation 161. The image fusion processing workstation 121 can fuse the position and/or angle information of the surgical instrument 300 with the information of the surgical instrument 300 in the visual space information collected by the visual imaging apparatus 140 to obtain the pose of the surgical instrument 300 outside the patient; this can be combined with the pose of the surgical instrument 300 inside the patient collected by the ultrasonic probe 152 to display the complete pose of the surgical instrument 300 on the image fusion display 122, so that the medical staff can conveniently observe the motion trajectory of the surgical instrument 300.
Still optionally, the robotic arm apparatus 160 further includes an ultrasonic pose detection piece for detecting the position and/or inclination angle of the ultrasonic probe 152. The ultrasonic pose detection piece is communicatively connected to the robotic arm workstation 161 and transmits the position and/or inclination angle information to it; this information is fused with the ultrasonic probe 152 information in the 3D image information and in the visual space information, respectively, by the image fusion processing workstation 121. The ultrasonic pose detection piece can detect the position and/or angle of the ultrasonic probe 152 in real time and transmit it to the image fusion processing workstation 121 via the robotic arm workstation 161. The image fusion processing workstation 121 can fuse the position and/or angle information of the ultrasonic probe 152, together with the corresponding dynamic information, with the 3D image information to form a dynamic fusion image displayed on the image fusion display 122, showing the condition of the lesion position and the motion trajectory of the surgical instrument 300 in real time; the puncture planning path and the motion trajectory of the surgical instrument 300 can also be monitored, preventing the surgical instrument 300 from deviating. Moreover, the image fusion processing workstation 121 fuses the position and/or angle information of the ultrasound probe 152 with the information of the ultrasound probe 152 in the visual space information collected by the visual imaging apparatus 140 and displays it on the image fusion display 122 for the medical staff to view.
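The combination of the externally tracked pose and the ultrasound-tracked in-body pose into one "complete pose" can be sketched geometrically: if the visual space information and pose sensors locate the needle hub outside the body, and the ultrasound image locates the tip inside the body, the full needle axis and inserted length follow directly. The function below is an illustrative assumption, not the patented fusion:

```python
import math

def needle_full_pose(hub_xyz, tip_xyz):
    """Combine the externally tracked needle hub with the ultrasound-tracked
    tip into one needle axis: returns (unit direction, hub-to-tip length)."""
    # Vector from hub (outside the patient) to tip (inside the patient).
    d = [t - h for t, h in zip(tip_xyz, hub_xyz)]
    length = math.sqrt(sum(c * c for c in d))
    direction = tuple(c / length for c in d)
    return direction, length

# Example: hub at the origin, tip 80 mm straight down the z axis.
direction, length = needle_full_pose((0.0, 0.0, 0.0), (0.0, 0.0, 80.0))
```

Displaying this axis on the fused image is what lets the medical staff see the complete posture of the surgical instrument 300 at a glance.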
In the surgical robot system 100 of the present utility model, the poses of the surgical instrument 300 and the ultrasonic probe 152 are acquired in real time by the puncture pose detection piece and the ultrasonic pose detection piece and transmitted to the image fusion processing workstation 121, so as to achieve accurate dynamic fusion. For example, the puncture pose detection piece may be provided on the surgical instrument 300 or on the first robotic arm 162; it may be a sensor or the like capable of detecting the position and/or inclination of the surgical instrument 300. The ultrasonic pose detection piece may be provided on the ultrasonic probe 152 or on the second robotic arm 163; it may be a sensor or the like capable of detecting the position and/or inclination of the ultrasonic probe 152.
In another embodiment of the present invention, the surgical robot system 100 includes an image fusion processing device 120, a static imaging device 130, a dynamic imaging device 150, and a visual imaging device 140; the image fusion processing device 120 is communicatively connected to each of the other three. The static imaging device 130 images the 3D image information of the object to be scanned and transmits it to the image fusion processing device 120. The visual imaging device 140 acquires the visual space information of the patient 200 and the operation region on the patient bed device 400 and transmits it to the image fusion processing device 120. The image fusion processing device 120 fuses the 3D image information and the visual space information and extracts the planned scanning position of the dynamic imaging device 150 and the puncture planning path of the surgical instrument 300. The dynamic imaging device 150 images the dynamic information of the object to be scanned at the planned scanning position in real time and transmits it to the image fusion processing device 120, which fuses the dynamic information into the 3D image information; the dynamic imaging device 150 can also acquire the motion trajectory of the surgical instrument 300 within the patient 200 in real time. The surgical robot system 100 of this embodiment does not include the robotic arm apparatus 160; the medical personnel hold the ultrasonic probe 152 for dynamic scanning and hold the surgical instrument 300 for the interventional procedure.
Other structures of the surgical robot system 100 of the present embodiment are completely the same as the structures of the surgical robot system 100 of the above embodiments, and are not repeated herein.
The present utility model further provides a control method of the surgical robot system 100, applied to the surgical robot system 100 in any of the above embodiments. The control method includes the following steps:
acquiring visual space information of a patient 200 and an operation area;
acquiring 3D image information of an object to be scanned of a patient 200;
fusing visual space information and 3D image information, and extracting position information of an object to be scanned;
planning an operation path according to the position information of the object to be scanned, and extracting a planned scanning position and a puncture planned path;
imaging dynamic information of an object to be scanned at a planned scanning position in real time;
fusing the dynamic information into the 3D image information, and displaying the dynamic fusion image;
controlling the surgical instrument 300 to move along the puncture planned path, and performing a puncture interventional operation;
acquiring the motion track of the surgical instrument 300 in the body of the patient 200 in real time, and comparing the motion track with the puncture planning path;
after the puncture operation is completed, controlling the surgical instrument 300 to return along the original path.
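The steps above form a fixed sequence, which can be sketched as a simple driver that invokes one callable per step in order. The step names and the `steps` mapping are hypothetical stand-ins for the real subsystem interfaces; the point is only the ordering of the control method:

```python
# Fixed order of the control-method steps listed above.
STEP_ORDER = [
    "acquire_visual_space",   # visual space information of patient and area
    "acquire_3d_image",       # 3D image information of object to be scanned
    "fuse_and_locate",        # fuse and extract position information
    "plan_path",              # planned scanning position + puncture path
    "dynamic_scan",           # real-time dynamic imaging at scan position
    "fuse_dynamic",           # fuse dynamic info, display fusion image
    "puncture",               # move instrument along planned path
    "monitor_trajectory",     # compare motion track with planned path
    "retract",                # return instrument along original path
]

def run_procedure(steps):
    """Execute the control-method steps in order; return the log of what ran."""
    log = []
    for name in STEP_ORDER:
        steps[name]()          # each entry is a callable for that subsystem
        log.append(name)
    return log

# Example with no-op stand-ins for every subsystem call.
steps = {name: (lambda: None) for name in STEP_ORDER}
log = run_procedure(steps)
```

A real implementation would interleave monitoring with the puncture step rather than run strictly sequentially; this sketch only mirrors the enumerated order.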
During an operation, the visual imaging device 140 of the surgical robot system 100 of the present utility model acquires the external visual space information of the patient 200 and the operation region in real time and transmits it to the image fusion processing workstation 121. The static imaging apparatus scans the object to be scanned of the patient 200 to obtain the in-body 3D image information and transmits it to the image fusion processing workstation 121. The image fusion processing workstation 121 fuses the visual space information and the 3D image information and extracts the position coordinate information of the object to be scanned. The image fusion processing workstation 121 then plans, automatically or manually, the puncture surgical path of the surgical instrument 300, extracting the planned scanning position coordinates of the ultrasonic probe 152 and the start and end point (motion trajectory) coordinates of the surgical instrument 300. In a manual or automatic mode, the robotic arm workstation 161 directs the second robotic arm 163 to move the ultrasound probe 152 to the planned scanning position on the body surface of the patient 200. Under the control of the dynamic imaging workstation 151, the ultrasound probe 152 dynamically scans the object to be scanned of the patient 200, and the resulting dynamic information is transmitted in real time to the dynamic imaging workstation 151 and the image fusion processing workstation 121. The pose information of the ultrasound probe 152 is transmitted in real time to the image fusion processing workstation 121 by the ultrasonic pose detection piece.
In the image fusion processing workstation 121, the scanned range of the dynamic information is fused into the 3D image information according to the pose of the ultrasound probe 152 received in real time, and the dynamic fusion image (each cross section and the overall position) is displayed on the image fusion display 122. In the automatic or semi-automatic mode, the robotic arm workstation 161 moves the first robotic arm 162 to bring the surgical instrument 300 to the safe area above the body surface location of the patient 200 where the puncture planning path is located. The medical staff can select a fully automatic, remotely controlled semi-automatic, or manual method to execute the puncture operation. The ultrasonic probe 152 acquires dynamic information of the surgical instrument 300 penetrating into the body in real time and transmits it to the dynamic imaging workstation 151 and the image fusion processing workstation 121. The image fusion processing workstation 121 displays, in real time through the image fusion display 122, the dynamic trajectory of the needle insertion fused onto the 3D image information, and compares it with the puncture planning path. After the puncture sampling, drainage, or ablation treatment is completed, the first robotic arm 162 and the second robotic arm 163 respectively drive the surgical instrument 300 and the ultrasonic probe 152 back to the safe area along the original path.
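The comparison of the tracked needle trajectory against the puncture planning path reduces, at its simplest, to the perpendicular distance of the needle tip from the planned straight line; if that distance exceeds a threshold, the trajectory is deviating. This is an assumed formulation for illustration, not the patented monitoring logic:

```python
import math

def deviation_from_path(point, entry, target):
    """Perpendicular distance of a tracked needle-tip position from the
    straight puncture planning path running from entry to target."""
    ax = [t - e for t, e in zip(target, entry)]   # path axis vector
    ap = [p - e for p, e in zip(point, entry)]    # entry -> tip vector
    norm = math.sqrt(sum(c * c for c in ax))
    # Scalar projection of the tip onto the path axis.
    s = sum(a * b for a, b in zip(ap, ax)) / norm
    # Perpendicular component magnitude via Pythagoras.
    ap_len = math.sqrt(sum(c * c for c in ap))
    return math.sqrt(max(ap_len ** 2 - s ** 2, 0.0))

# Example: 10 mm path along z; tip sits 3 mm off-axis at mid-depth.
dev = deviation_from_path((3.0, 0.0, 5.0), (0.0, 0.0, 0.0), (0.0, 0.0, 10.0))
```

Evaluating this on every ultrasound frame gives a running deviation signal the workstation could use to alert the medical staff or halt the arm.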
The technical features of the embodiments described above can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above-mentioned embodiments only represent some embodiments of the present invention, and the description thereof is specific and detailed, but not to be construed as limiting the scope of the present invention. It should be noted that, for those skilled in the art, without departing from the spirit of the present invention, several variations and modifications can be made, which are within the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the appended claims.
Claims (10)
1. A surgical robot system, characterized by comprising an image fusion processing device, a static imaging device, a dynamic imaging device, a robotic arm device, and a visual imaging device;
the static imaging device is configured to image 3D image information of an object to be scanned and transmit the 3D image information to the image fusion processing device;
the visual imaging device is configured to acquire visual space information of a patient and an operation area on a bed device and transmit the visual space information to the image fusion processing device;
the image fusion processing device fuses the 3D image information with the visual space information and extracts a planned scanning position for the dynamic imaging device and a planned puncture path for a surgical instrument;
the dynamic imaging device is configured to image dynamic information of the object to be scanned at the planned scanning position in real time and transmit the dynamic information to the image fusion processing device, which fuses the dynamic information into the 3D image information;
the robotic arm device is configured to carry the surgical instrument and drive the surgical instrument to move according to the planned puncture path so as to perform a puncture intervention.
2. The surgical robotic system of claim 1, wherein the static imaging device comprises a static imaging unit having a scanning cavity and a static imaging workstation; the static imaging unit images the 3D image information of the object to be scanned and feeds it back to the static imaging workstation, which transmits the 3D image information to the image fusion processing device.
3. The surgical robotic system of claim 2, wherein the dynamic imaging device comprises an ultrasound imaging device and a dynamic imaging workstation, the ultrasound imaging device images dynamic information of the object to be scanned at the planned scanning position and feeds the dynamic information back to the dynamic imaging workstation, and the dynamic imaging workstation transmits the dynamic information to the image fusion processing device.
4. The surgical robotic system of claim 3, wherein the robotic arm device comprises a first robotic arm and a robotic arm workstation; the first robotic arm carries the surgical instrument, and the robotic arm workstation receives the planned puncture path transmitted by the image fusion processing device and controls the first robotic arm to move along the planned puncture path to perform the puncture intervention.
5. The surgical robotic system according to claim 4, wherein the robotic arm device further comprises a second robotic arm, the second robotic arm carries an ultrasound probe of the ultrasound imaging device, and the robotic arm workstation receives the planned scanning position transmitted by the image fusion processing device and controls the second robotic arm to move to the planned scanning position so as to acquire dynamic information of the object to be scanned in real time.
6. The surgical robotic system as claimed in claim 5, wherein the visual imaging device is movably disposed adjacent to the bed device via a support column and is arranged independently of the robotic arm device;
or the robotic arm device further comprises a supporting portion and a third robotic arm, the third robotic arm being in transmission connection with the robotic arm workstation; the visual imaging device is mounted on the third robotic arm, and the first, second, and third robotic arms are all mounted on the supporting portion.
7. The surgical robotic system of claim 5, wherein the image fusion processing device comprises an image fusion display and an image fusion processing workstation; the image fusion processing workstation receives the 3D image information, the dynamic information, and the visual space information, fuses them, and displays the result on the image fusion display; the image fusion processing workstation also transmits the planned scanning position and the planned puncture path to the robotic arm workstation.
8. The surgical robotic system according to claim 7, wherein the robotic arm device further comprises a puncture pose detector and an ultrasound pose detector;
the puncture pose detector detects the position and/or inclination angle of the surgical instrument and transmits the position and/or inclination angle information to the robotic arm workstation, and the image fusion processing workstation fuses this information with the surgical-instrument information in the visual space information;
the ultrasound pose detector detects the position and/or inclination angle of the ultrasound probe and transmits the position and/or inclination angle information to the robotic arm workstation, and the image fusion processing workstation fuses the position and/or inclination angle information of the ultrasound probe with the 3D image information and with the ultrasound-probe information in the visual space information, respectively.
9. The surgical robotic system of claim 5, further comprising a distance measuring device, wherein information collected by the distance measuring device is used to calculate position information of the surgical instrument and the ultrasound probe and to collect body surface information of the patient;
the distance measuring device is integrated with the visual imaging device; or,
the distance measuring device is disposed on the first robotic arm or the second robotic arm.
10. The surgical robotic system as claimed in claim 5, wherein the robotic arm device further comprises a console in transmission connection with the robotic arm workstation, the console being capable of controlling movement of the first and second robotic arms via the robotic arm workstation;
and/or the surgical robot system further integrates a remote communication module configured to receive remote control information so as to control movement of the first and second robotic arms.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201820047256.1U CN208573801U (en) | 2018-01-11 | 2018-01-11 | Surgical robot system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN208573801U true CN208573801U (en) | 2019-03-05 |
Family
ID=65500015
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201820047256.1U Active CN208573801U (en) | 2018-01-11 | 2018-01-11 | Surgical robot system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN208573801U (en) |
Cited By (9)
| Publication number | Priority date | Publication date | Title |
|---|---|---|---|
| CN107970060A (en) * | 2018-01-11 | 2018-05-01 | Surgical robot system and its control method |
| CN110522473A (en) * | 2019-09-26 | 2019-12-03 | Fully automatic examination and imaging system for obstetrics and gynecology |
| CN111513849A (en) * | 2020-04-30 | 2020-08-11 | Surgical system for puncture, control method, and control device |
| CN111513849B (en) * | 2020-04-30 | 2022-04-19 | Surgical system for puncture, control method and control device |
| WO2021253867A1 (en) * | 2020-06-17 | 2021-12-23 | Framework, method, and system for unmanned smart hospital |
| CN117274506A (en) * | 2023-11-20 | 2023-12-22 | Three-dimensional reconstruction method and system for interventional target scene under catheter |
| CN117274506B (en) * | 2023-11-20 | 2024-02-02 | Method and system for three-dimensional reconstruction of target scenes under catheter intervention |
| CN118490364A (en) * | 2024-05-09 | 2024-08-16 | Full-platform orthopedic surgical robot and navigation method thereof |
| CN118490364B (en) * | 2024-05-09 | 2025-10-17 | Full-platform orthopedic surgical robot and navigation method thereof |
Similar Documents
| Publication | Title |
|---|---|
| CN107970060A (en) | Surgical robot system and its control method |
| CN208573801U (en) | Surgical robot system |
| US12514657B2 | Surgical robot platform |
| JP7663356B2 | Robotic devices for minimally invasive soft tissue interventions |
| CN104274194B | Insertion type imaging system |
| US11576746B2 | Light and shadow guided needle positioning system and method |
| US20170065248A1 | Device and Method for Image-Guided Surgery |
| US20090082784A1 | Interventional medical system |
| KR101758741B1 | Guiding method of interventional procedure using medical images and system for interventional procedure for the same |
| AU2019412420A1 | Surgical navigation system |
| JP2010269067A | Treatment support device |
| US10849602B2 | Method of conducting a minimally invasive surgical procedure and rkh-i apparatus for the implementation thereof |
| CN113558735A | Robot puncture positioning method and device for biliary tract puncture |
| CN116019558B | Electromagnetic navigation puncture robot system and positioning method thereof |
| EP3254627A1 | Fluoroscopic guidance system with offset light source and method of use |
| CN110584782B | Medical image processing method, medical image processing apparatus, medical system, computer, and storage medium |
| KR20140032919A | Medical technical assembly and method for producing images |
| CN215874870U | A robotic puncture positioning device for biliary puncture |
| CN110537985A | Spine space coordinate system positioning device and method for augmented reality surgery system |
| CN116999129A | Positioning navigation system and method for neurosurgery puncture operation |
| WO2023019479A1 | Robot puncture positioning method and apparatus for biliary tract puncture |
| EP3500159B1 | System for the use of soft-point features to predict respiratory cycles and improve end registration |
| KR101758740B1 | Guiding method of interventional procedure using medical images and system for interventional procedure for the same |
| US20210219947A1 | Intraoperative Ultrasound Probe System and Related Methods |
| CN113729941A | VR-based surgery auxiliary positioning system and control method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | GR01 | Patent grant | |
| | TR01 | Transfer of patent right | Effective date of registration: 20191230. Address after: No. 549-1, building B1, Optical Valley Biological City, No. 666, Gaoxin Avenue, Donghu New Technology Development Zone, Wuhan City, Hubei Province, 430000. Patentee after: Wuhan Lianying Zhirong Medical Technology Co., Ltd. Address before: No. 2258, Chengbei Road, Jiading District, Shanghai, 201807. Patentee before: Shanghai United Imaging Healthcare Co., Ltd. |