
WO2012089529A1 - Dispositif d'affichage optique pour un système d'assistance chirurgicale - Google Patents

Dispositif d'affichage optique pour un système d'assistance chirurgicale

Info

Publication number
WO2012089529A1
WO2012089529A1 (PCT/EP2011/072934, EP2011072934W)
Authority
WO
WIPO (PCT)
Prior art keywords
pointer
imaging
image
light sources
computer unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2011/072934
Other languages
German (de)
English (en)
Inventor
Johannes Reinschke
Lydia Reinschke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Siemens Corp
Original Assignee
Siemens AG
Siemens Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG, Siemens Corp filed Critical Siemens AG
Publication of WO2012089529A1 publication Critical patent/WO2012089529A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03Computed tomography [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/10Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis
    • A61B90/11Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints
    • A61B90/13Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges for stereotaxic surgery, e.g. frame-based stereotaxis with guides for needles or instruments, e.g. arcuate slides or ball joints guided by light, e.g. laser pointers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06Measuring instruments not otherwise provided for
    • A61B2090/061Measuring instruments not otherwise provided for for measuring dimensions, e.g. length
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/366Correlation of different images or relation of image positions in respect to the body using projection of images directly onto the body

Definitions

  • the present invention relates to a device with an optical pointer.
  • an arrangement and its use for robot-assisted surgery are also given.
  • 3D imaging techniques are used to assist surgeons during manual surgical procedures.
  • a computed tomography scanner is used pre- and intraoperatively in the operating room.
  • robot-assisted surgical assist systems are known in which a robot-guided guide laser is used.
  • the display of process steps by such a guide laser cannot, however, be checked for errors.
  • the surgeon, as the user of such a surgical assistance system, cannot verify that the patient is correctly positioned relative to the guide laser.
  • the correct position of the examination object relative to the pointer must be checked at regular time intervals.
  • the device according to the invention serves to display a position, in particular the position of an operating step during a surgical procedure. The surgical intervention itself can be performed manually or robotically.
  • the device according to the invention has an optical pointer, wherein the pointer has at least two light sources of different colors.
  • the light sources are designed to emit light beams, and the light beams of the at least two light sources can be directed onto at least one predeterminable point on a surface of an examination object. The examination object is at the same time the surgical object.
  • the apparatus is designed such that a point can be displayed in a three-dimensional coordinate system. This has the advantage that the visual display of the next operating step directly includes a depth control, which consists in a surgeon or an imaging control unit being able to detect a shift between the at least two images of the light sources on the surface of the examination or operating object.
  • the light sources have colors that are not naturally present in the human body, such as green and blue light.
  • each of the light sources is independently switchable.
  • the intensity of each of the light sources can be modulated.
  • the light sources are designed such that they generate lines as a pointer image on the surface of the examination object. In particular, the light sources generate mutually parallel lines as a pointer image. The lines can thus directly indicate a desired cutting profile.
  • the generated line images on the surface of the examination subject are broken lines or dotted lines.
  • Individual points arranged along a line as a pointer image offer the advantage of an even easier visual check of the match of the pointer images, a match being present when the pointer images are superposed so that they appear as a single image.
  • the arrangement according to the invention for robot-assisted process step display and / or automated visual inspection of the process step display during a surgical procedure comprises a table for depositing an examination object and a device with an optical pointer according to the invention.
  • the device can be designed such that the at least two light sources are movable relative to the table independently of one another.
  • alternatively, the light sources are arranged in a fixed position relative to one another, for example on a manipulator, the manipulator being movable relative to the table in three spatial directions so that any predetermined point in a three-dimensional workspace can be displayed.
  • the manipulator can also be pivotable about three axes, so that a total of six degrees of freedom of motion arise.
  • the device with the optical pointer is brought, for example, into a known position and orientation relative to the table on which the subject to be examined is placed. In particular, a basic position of the device with the optical pointer relative to the table is set on the basis of the generated image.
  • actuators for moving the light sources or the manipulator may be included.
  • At least one first imaging device is included.
  • this may be a device for 3D imaging.
  • a picture of the examination object is generated.
  • the device for 3D imaging may be a CT scanner.
  • the means for 3D imaging is a C-arm.
  • a stereo camera can be used for 3D imaging.
  • the examination subject is positioned on the intended table.
  • a computer unit is comprised, said computer unit being designed to perform image processing in order to calculate, taking into account the image information from the first means for imaging, a process step, in particular of a surgical procedure.
  • the computer unit is further configured to control the pointer of the position display device. For example, an actuation signal is sent from the computer unit to the pointer.
  • the computer unit is in particular part of an expert system that supports the work of surgeons in traditional open surgery, in minimally invasive surgery and in robot-assisted minimally invasive surgical procedures.
  • a depth control can additionally be carried out.
  • guidance or navigation through an expert system can enable the surgical procedure to be carried out faster, avoid errors and ensure automatic documentation of the course of the surgery.
  • An expedient embodiment of the arrangement additionally has a monitor.
  • preferably, at least a second device for imaging is comprised.
  • this is in particular a camera and serves to detect the surface of the examination object together with the pointer image of the optical pointer.
  • the computer unit is designed to display on the monitor an overlay of at least one image of the first imaging device and one image of the second imaging device.
  • the camera for detecting the surface of the examination subject detects a surface section of the examination object which comprises the pointer image of the optical pointer.
  • in the arrangement, the computer unit is configured to perform a position control, in particular in the depth direction.
  • the depth control comprises at least a simple check of whether the pointer images on the surface of the examination subject match or do not match.
  • a match means that the pointer images lie one above the other so that they appear as a single pointer image, and non-coincidence means that the two pointer images are at a distance greater than zero. If the pointer images of the light sources of the optical pointer on the surface of the examination object do not match, that is, if the pointer images do not lie one above the other so that they appear as a single pointer image, a shift of the examination subject can be determined from the distance between the pointer images of the two light sources on the surface (a minimal automated detection sketch is given after this list).
  • the computer unit is designed such that, in the case of non-coincidence of the pointer images of the light sources of the optical pointer on the surface of the examination object, a warning signal is given.
  • the second means for imaging the object under examination is, for example, a mono camera or a stereo camera.
  • the optical pointer is integrated together with the laparoscopic camera in a laparoscopic instrument.
  • the laparoscopic camera can be designed as a monocamera or as a stereo camera.
  • one, in particular the first, of the devices for imaging an examination subject is a computed tomography scanner.
  • this imaging device may be a C-arm.
  • the arrangement comprises two imaging means, one of the imaging means detecting the surface of the examination subject and one serving for 3D imaging, such as a stereo camera and a computed tomography scanner.
  • instead of a computed tomography scanner, a magnetic resonance tomograph or a stereo camera can also be used for 3D imaging.
  • any 3D imaging method can be used, in particular medical imaging methods such as X-ray computed tomography, magnetic resonance imaging or positron emission tomography.
  • a method for physician-supervised computer-assisted surgery may comprise a first step for planning a surgical procedure, in which a 3D image of an examination object is generated and in which, taking into account the image information of the 3D image, process steps of an intervention on the examination object can be virtually determined and/or calculated.
  • the method conveniently includes a second step for the preparation of the surgical procedure, in which the examination object is prepared and positioned and in which - for a fixed positioning of the examination object - a second 3D image of the examination object is generated and then the agreement of the second 3D image with the first 3D image is checked, or a correspondence relation between the two images is established. Based on this correspondence relation, the virtual process steps planned in the first step are adapted to the second 3D image (a minimal registration sketch is given after this list). Furthermore, the method comprises a third step for carrying out the surgical intervention, in which a defined and/or calculated process step is first displayed, namely as a virtual process step, for example on a monitor, and as a real process step on the examination object with the aid of the pointer. The validity of the real process step can be monitored and/or controlled by checking the correspondence of the position of the virtual process step with the position actually displayed on the examination object. Furthermore, the third step comprises carrying out the process step after confirmation by the doctor.
  • the first step has the advantage that the planning of the course of the operation can be carried out outside the operating room, in particular the identification of the type of operation to be performed and the detailed planning of all successive process steps of the surgical procedure.
  • the planning involves the use of an expert system.
  • the second step has the advantage that the surgical planning is adapted to the actual positioning of the examination subject during the operation.
  • the third step has the advantage that each individual process step can be monitored and/or controlled by means of the position display device with the pointer before the process step is carried out. In particular, the verification of the coincidence of the position is carried out automatically, e.g. by means of image evaluation software. In particular, a warning can be given in the case of non-coincidence.
  • the third step comprises a camera monitoring of the surgical intervention, wherein the camera image can be output to a monitor.
  • the individual process steps are also displayed on the monitor in addition to the display on the examination subject by means of the optical pointer.
  • the monitor can display a camera image, in particular a stereo camera image or a 3D image. If the process step consists of an incision into the examination subject, the position display can indicate, on the object under examination or on the monitor, the position and the cutting depth of the incision to be made into the examination object.
  • the monitor used to monitor the surgical procedure shows a three-dimensional image, in particular a kind of virtual reality in the form of a superimposition of a 3D image from a device for 3D imaging and a second image of the examination object, which also includes the pointer image on the surface of the examination object (a minimal overlay sketch is given after this list).
  • this overlay of images also includes an indication of the next process step to be performed.
  • an instruction can be given, for example in the form of text which is additionally output on the monitor, or alternatively as audio information. That is, for example, an audio guide may be included that provides the information and instructions necessary for the process step to be performed.
  • the instructions may also include which surgical instruments to use.
  • a video, for example a stereo video
  • the process step is, for example, carried out automatically.
  • a plurality of predetermined and / or calculated process steps are included, which are carried out automatically.
  • the process steps are performed by the robot-controlled manipulator, which can be controlled by means of the computer unit. In particular, the guidance relates to the process steps calculated by the computer unit.
  • an automatically performed process step can be captured by a video camera and displayed on a monitor.
  • the video camera can be a stereo video camera.
  • prior to the automatic execution of a process step by the robot or manipulator, a query must necessarily go to the user of the arrangement, in particular to the surgeon, as to whether he agrees with the displayed next process step. Only after confirmation by the user is the process step performed by the robot (a minimal confirmation sketch is given after this list).
  • a use of the arrangement for physician-supervised robotic surgery thus, for example, comprises a first step for planning a surgical procedure, in which a 3D image of an examination object is generated by means of a device for 3D imaging and in which, by means of a computer unit designed for image processing and taking into account the image information from the 3D image, a process step of an intervention on the examination object is determined and/or calculated.
  • a second step for the preparation of the surgical procedure comprises preparing the object under examination, bringing it onto a table and positioning it there. Thereafter, a second 3D image of the examination object is generated by means of the device for 3D imaging and its relation to the first 3D image is calculated. Furthermore, the current positions of the process steps from the first step are determined by means of the computer unit.
  • furthermore, a third step for carrying out the surgical procedure is comprised, wherein each process step, in particular its position on or in the examination subject, is first indicated as a pointer image by means of the optical pointer on the surface of the examination subject, the optical pointer being controlled by the computer unit.
  • the position of the process step on the object under examination is monitored and/or controlled in that a second device for imaging detects a surface portion of the examination object with the pointer image, and the computer unit checks the consistency of the image of the surface section with the 3D image and/or performs a depth control on the basis of the pointer image. Furthermore, in the third step, the process step is performed.
  • the use further comprises outputting a video film on a monitor to a user, e.g. a surgeon, the video film in particular reproducing a virtual reality in the sense that a superposition of at least two images is output on the monitor, these in particular being a 3D image and an image of the surface of the examination object.
  • the image of the surface of the object to be examined comprises a surface portion which is affected by the operation to be performed.
  • in particular, a surface portion is comprised which contains the pointer image for displaying the process step, which is visible on the surface of the examination subject.
  • guidance for the next process step is given, which can in particular be additionally displayed as text on the monitor.
  • alternatively, the instructions can be played back as audio information.
  • the robot is, for example, realized by the computer unit and the manipulator.
  • in particular, the manipulator may comprise the optical pointer and the second means for imaging.
  • the manipulator may include robotic surgical instruments.
  • a review of all robot-controlled process steps by the user, in particular the surgeon, is provided in that the user's agreement is queried before each process step.
  • the arrangement for robot-assisted process step display and / or automated visual inspection of the process step display during a surgical procedure therefore initially expediently comprises a device with an optical pointer for position indication.
  • the operation of the position and/or process step check is in particular automated.
  • the position and process step display is carried out, for example, robotically.
  • the visual check is automated in that the second imaging device is either designed to robotically track the optical pointer or is mechanically connected to the optical pointer.
  • FIG. 1 shows an embodiment of the arrangement for robot-assisted operation of a surgical procedure.
  • Figure 2 shows a side view of the light sources, the light beams and the surface of the examination object.
  • Figure 3 shows a plan view of the light sources
  • a device with an optical pointer 10 is arranged together with a plurality of system components to form a CT-based surgical assistance system.
  • shown are a manipulator 11 and two light sources 12, a table 20 on which a patient can be positioned, as well as a computer unit 40, a monitor 41 and a CT scanner 30 for 3D imaging.
  • the pointer images 13 generated by the two light sources 12 are shown on the table 20 and can be seen on the patient during use of the system, in particular on the organs affected by the surgical procedure.
  • FIG. 1 shows an embodiment of the device for position display 10, in which the manipulator 11 and both light sources 12 are movable.
  • the illustrated table 20 is retractable into the CT scanner 30 for 3D imaging.
  • the monitor 41 is mounted so that the surgeon can see the monitor 41 from his working position.
  • the computer unit 40 is connected to the CT scanner 30 for 3D imaging, to the monitor 41 and to the position indicator 10.
  • the computer unit 40 is arranged outside the operating room.
  • the computer unit 40 is in particular part of a workstation of an expert system with an input device and a second monitor.
  • there, a generated 3D image can be evaluated.
  • surgical assistance systems are particularly improved by the device according to the invention with an optical pointer 10, since a position control and in particular a depth control via the two pointer images can be directly perceived by the user.
  • in addition, the visual check can be automated by means of a second imaging device.
  • the table 20, on which the patient is positioned, is adjustable in height and horizontally movable relative to the CT scanner 30 and the manipulator 11. Alternatively or additionally, the CT scanner 30 can also be moved.
  • the positioning and orientation of table 20 and CT scanner 30 are registered and evaluated by the computer unit 40, so that this position information is available for the position display.
  • the position indicator 10 is calibratable to a basic setting relative to the table 20.
  • Figure 2 shows a side view of the two light sources 12, the emitted light rays and the surface 50, such as the skin of the patient or the surface of an organ or bone.
  • Fig. 2 shows the case of coincidence of the images.
  • the light beams of the two light sources intersect exactly at a point on the surface 50 and are superimposed on the surface 50 so that they appear as a point on the surface 50.
  • incorrect calibration or a movement of the patient relative to the light sources 12 shifts the surface 50, so that two dots are visible on the surface 50 instead of one dot. From the different colors of the two light sources 12 it can thus also be read whether the surface 50 has been moved vertically upwards or downwards.
  • Figure 3 shows a plan view of the two light sources 12, the connecting line 51 and the line images 13 generated by the two light sources 12, which in the case shown in Figure 3 do not coincide but hit the surface 50 parallel to each other at an evaluable distance from the center line 52. If the light rays intersect above the surface 50, the green line image 13 lies, as seen by the observer, to the left of the blue line image 13. From this it can already be recognized qualitatively whether the surface 50 was moved upwards or downwards from the home position (a minimal geometric sketch is given after this list).
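
The geometric reading of Figures 2 and 3 can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not an implementation from the patent: it assumes the two beams lie in one vertical plane, nominally intersect on the surface and are inclined at known angles to the vertical; the function name depth_shift_from_separation, the example angles and the sign convention (green seen to the left of blue meaning the beams cross above the surface) are all assumptions made for illustration.

```python
import math

def depth_shift_from_separation(separation_mm, angle1_deg, angle2_deg, green_left_of_blue):
    """Estimate the vertical displacement of the surface from the lateral
    separation of the two pointer images (simplified Figure 2/3 geometry).

    Assumes two beams in a common vertical plane, converging from opposite
    sides and nominally intersecting on the surface; angle1/angle2 are the
    beam angles measured from the vertical."""
    geometric_factor = math.tan(math.radians(angle1_deg)) + math.tan(math.radians(angle2_deg))
    magnitude = separation_mm / geometric_factor
    # Illustrative sign convention: green left of blue -> beams cross above the surface.
    return -magnitude if green_left_of_blue else magnitude

# Example: 3 mm separation, beams at 20 and 25 degrees from the vertical.
dz = depth_shift_from_separation(3.0, 20.0, 25.0, green_left_of_blue=True)
print(f"estimated surface displacement: {dz:+.1f} mm")
```

In the described arrangement the computer unit would instead use the registered poses of the light sources; the sketch only shows that the spot separation together with the beam angles suffices for both a qualitative and a quantitative depth estimate.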
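
The automated coincidence check of the two pointer images referenced in the description above could rest on simple image evaluation software. The following sketch is hedged accordingly: the channel-threshold segmentation, the pixel tolerance and the helper names spot_centroid and check_coincidence are illustrative assumptions, not part of the patent, and a real system would operate on calibrated camera frames.

```python
import numpy as np

def spot_centroid(rgb, channel, threshold=200):
    """Centroid (row, col) of all pixels whose given color channel exceeds the
    threshold, or None if no such pixels are found."""
    mask = rgb[:, :, channel] > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return np.array([rows.mean(), cols.mean()])

def check_coincidence(rgb, tolerance_px=3.0):
    """Return (coincide, distance) for the green and blue pointer images in an RGB frame."""
    green = spot_centroid(rgb, channel=1)   # green channel
    blue = spot_centroid(rgb, channel=2)    # blue channel
    if green is None or blue is None:
        return False, None
    distance = float(np.linalg.norm(green - blue))
    return distance <= tolerance_px, distance

# Synthetic camera frame: a green and a blue spot about 12 pixels apart (non-coincidence).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[200:205, 300:305, 1] = 255
frame[200:205, 312:317, 2] = 255

coincide, dist = check_coincidence(frame)
if not coincide:
    print(f"WARNING: pointer images do not coincide (separation {dist:.1f} px)")
```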
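
For the monitor output described above, in which a 3D image and a camera image of the surface (including the pointer image) are superimposed, a plain alpha blend conveys the idea. The sketch assumes both images have already been rendered into the same view and resolution; the function name overlay and the blending weight are illustrative choices only.

```python
import numpy as np

def overlay(rendered_3d, camera, alpha=0.5):
    """Alpha-blend a rendered view of the 3D data set with the camera image of
    the surface (both HxWx3 uint8 arrays, already brought into the same view)."""
    blended = alpha * rendered_3d.astype(np.float32) + (1.0 - alpha) * camera.astype(np.float32)
    return blended.clip(0, 255).astype(np.uint8)

# Dummy frames standing in for a rendered CT view and the surface camera image.
ct_view = np.full((480, 640, 3), 90, dtype=np.uint8)
cam_view = np.zeros((480, 640, 3), dtype=np.uint8)
cam_view[200:205, 300:305, 1] = 255   # green pointer image visible on the surface

monitor_frame = overlay(ct_view, cam_view, alpha=0.6)
print(monitor_frame.shape, monitor_frame.dtype)
```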
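
Adapting the process steps planned on the first 3D image to the second, intraoperative 3D image, as referenced above, amounts to establishing a correspondence relation between the two image coordinate frames. The sketch below estimates a rigid transform from corresponding landmark points (Kabsch algorithm) and maps one planned target point; the landmark coordinates, the name rigid_transform and the restriction to a rigid, non-deformable correspondence are assumptions made for illustration.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping the
    Nx3 point set `src` onto `dst` (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Corresponding landmarks picked in the preoperative and intraoperative 3D images
# (coordinates are made up for illustration).
pre_op = np.array([[0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60]], dtype=float)
rotation_90z = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
intra_op = pre_op @ rotation_90z.T + np.array([5.0, -2.0, 10.0])

R, t = rigid_transform(pre_op, intra_op)

# A planned incision point from the planning step, expressed in the preoperative
# image frame, mapped into the intraoperative frame for the pointer.
planned_point = np.array([40.0, 20.0, 30.0])
current_point = R @ planned_point + t
print("planned point in intraoperative coordinates:", np.round(current_point, 1))
```

A clinical registration would typically use many more correspondences or an intensity-based method, but the planned coordinates would be transformed in the same way before being handed to the pointer.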
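
The confirmation rule for robot-executed steps referenced above can be written as a small control loop: each step is first displayed by the pointer, the surgeon is queried, and execution proceeds only after explicit confirmation. ProcessStep, point_to and execute are hypothetical placeholders; the patent does not prescribe any particular software interface.

```python
from dataclasses import dataclass

@dataclass
class ProcessStep:
    description: str
    target: tuple  # position in the intraoperative coordinate frame

def point_to(step):
    # Placeholder: in the real arrangement the computer unit would send an
    # actuation signal so the optical pointer marks step.target on the patient.
    print(f"[pointer] marking {step.target} for: {step.description}")

def execute(step):
    # Placeholder for the robot-controlled manipulator carrying out the step.
    print(f"[robot] executing: {step.description}")

def run_plan(steps):
    for step in steps:
        point_to(step)                                  # display the next step first
        answer = input(f"Perform '{step.description}'? [y/N] ").strip().lower()
        if answer != "y":                               # no confirmation -> no execution
            print("[system] step not confirmed, pausing for the surgeon")
            break
        execute(step)

if __name__ == "__main__":
    run_plan([ProcessStep("skin incision", (40.0, 20.0, 30.0)),
              ProcessStep("place trocar", (42.0, 25.0, 28.0))])
```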

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Optics & Photonics (AREA)
  • Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Endoscopes (AREA)

Abstract

The invention relates to a device which comprises an optical display device and which serves to display the position of an operating step. To this end, the display device is provided with at least two light sources of different colors, the light beams of which can be directed onto at least one predefinable point on a surface of an examination object, so that a point can be displayed in a three-dimensional coordinate system. The device according to the present invention can be included in an arrangement for carrying out a robot-assisted surgical intervention, together with at least one image display device, in particular for displaying a 3D image, as well as with a computer unit, the computer unit being configured to carry out image processing, to calculate a process step, in particular of a surgical intervention, and to control the display device of the device.
PCT/EP2011/072934 2010-12-29 2011-12-15 Dispositif d'affichage optique pour un système d'assistance chirurgicale Ceased WO2012089529A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102010064320.3A DE102010064320B4 (de) 2010-12-29 2010-12-29 Optischer Zeiger für ein Chirurgieassistenzsystem
DE102010064320.3 2010-12-29

Publications (1)

Publication Number Publication Date
WO2012089529A1 true WO2012089529A1 (fr) 2012-07-05

Family

ID=45373711

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/072934 Ceased WO2012089529A1 (fr) 2010-12-29 2011-12-15 Dispositif d'affichage optique pour un système d'assistance chirurgicale

Country Status (2)

Country Link
DE (1) DE102010064320B4 (fr)
WO (1) WO2012089529A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115192051A (zh) * 2021-04-13 2022-10-18 佳能医疗系统株式会社 医用影像装置、医用影像系统以及医用影像装置中辅助检查方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040092958A1 (en) * 2001-11-15 2004-05-13 Limonadi Farhad M. Stereotactic wands, endoscopes and methods using such wands and endoscopes
US20060036264A1 (en) * 2004-08-06 2006-02-16 Sean Selover Rigidly guided implant placement
WO2009011643A1 (fr) * 2007-07-13 2009-01-22 C-Rad Positioning Ab Surveillance de patient au niveau d'appareils de radiothérapie
DE102008013615A1 (de) * 2008-03-11 2009-09-24 Siemens Aktiengesellschaft Verfahren und Markierungsvorrichtung zur Markierung einer Führungslinie eines Eindringungsinstruments, Steuerungseinrichtung und Aufnahmesystem

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE438091B (sv) * 1983-06-08 1985-04-01 Gote Palsgard Anordning for koordinatregistrering
US5198877A (en) * 1990-10-15 1993-03-30 Pixsys, Inc. Method and apparatus for three-dimensional non-contact shape sensing
GB9716240D0 (en) * 1997-07-31 1997-10-08 Tricorder Technology Plc Scanning apparatus and methods
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
EP1142536B1 (fr) * 2000-04-05 2002-07-31 BrainLAB AG Référencement d'un patient dans un système de navigation médical utilisant des points lumineux projetés
US7385708B2 (en) * 2002-06-07 2008-06-10 The University Of North Carolina At Chapel Hill Methods and systems for laser based real-time structured light depth extraction
DE10308383A1 (de) * 2003-02-27 2004-09-16 Storz Endoskop Produktions Gmbh Verfahren und optisches System zur Vermessung der Topographie eines Meßobjekts

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040092958A1 (en) * 2001-11-15 2004-05-13 Limonadi Farhad M. Stereotactic wands, endoscopes and methods using such wands and endoscopes
US20060036264A1 (en) * 2004-08-06 2006-02-16 Sean Selover Rigidly guided implant placement
WO2009011643A1 (fr) * 2007-07-13 2009-01-22 C-Rad Positioning Ab Surveillance de patient au niveau d'appareils de radiothérapie
DE102008013615A1 (de) * 2008-03-11 2009-09-24 Siemens Aktiengesellschaft Verfahren und Markierungsvorrichtung zur Markierung einer Führungslinie eines Eindringungsinstruments, Steuerungseinrichtung und Aufnahmesystem

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115192051A (zh) * 2021-04-13 2022-10-18 佳能医疗系统株式会社 医用影像装置、医用影像系统以及医用影像装置中辅助检查方法

Also Published As

Publication number Publication date
DE102010064320A1 (de) 2012-07-05
DE102010064320B4 (de) 2019-05-23

Similar Documents

Publication Publication Date Title
EP3558599B1 (fr) Procédé d'étalonnage d'un manipulateur d'un système de manipulateur diagnostique et/ou thérapeutique
EP1361829B1 (fr) Dispositif pour piloter des instruments chirurgicaux
EP3076369B1 (fr) Procede et dispositif destines a la representation d'un objet
EP2449997B1 (fr) Poste de travail médical
DE102010020781B4 (de) Bestimmung und Überprüfung der Koordinatentransformation zwischen einem Röntgensystem und einem Operationsnavigationssystem
EP3626176B1 (fr) Procédé d'assistance d'un utilisateur, produit programme informatique, support de données et système d'imagerie
EP3412242A1 (fr) Émission de données de position d'un instrument technique médical
DE102010020284A1 (de) Bestimmung von 3D-Positionen und -Orientierungen von chirurgischen Objekten aus 2D-Röntgenbildern
EP4213755B1 (fr) Système d'assistance chirurgicale
EP0685088A1 (fr) Procede permettant de programmer et de controler le deroulement d'une intervention chirurgicale
DE102007013807A1 (de) Bildakquisitions-, Bildverarbeitungs- und Bildvisualisierungssystem zur Unterstützung der Navigation interventioneller Werkzeuge bei Durchführung von CT- bzw. MRT-gesteuerten Interventionen
EP1319368A2 (fr) Méthode pour déterminer l'orientation et la position relative d'un instrument médical
DE102005029242B4 (de) Verfahren zur Aufnahme und Auswertung von Bilddaten eines Untersuchungsobjekts und dazugehörige Einrichtung
DE102011006537A1 (de) Verfahren zur Registrierung eines ersten Koordinatensystems einer ersten medizinischen Bildgebungseinrichtung mit einem zweiten Koordinatensystem einer zweiten medizinischen Bildgebungseinrichtung und/oder einem dritten Koordinatensystem eines medizinischen Instruments bezüglich einer medizinischen Navigationseinrichtung und medizinisches Untersuchungs- und/oder Behandlungssystem
DE102022125798A1 (de) Verfahren zur medizintechnischen Kalibrierung
EP3499461B1 (fr) Représentation de marqueurs en imagerie médicale
DE69929026T2 (de) Bildgebungsverfahren für bildgesteuerte chirurgie
DE102011050240A1 (de) Vorrichtung und Verfahren zur Bestimmung der relativen Position und Orientierung von Objekten
EP1464285B1 (fr) Recalage en perspective et visualisation des régions corporelles internes
WO2012089529A1 (fr) Dispositif d'affichage optique pour un système d'assistance chirurgicale
DE10235795B4 (de) Medizinische Vorrichtung
DE102005051102B4 (de) System zur medizinischen Navigation
DE102024115529B3 (de) Medizinisches AR-System für einen chirurgischen Eingriff und Verfahren zur Überprüfung einer Navigationsgenauigkeit
DE102004048066A1 (de) Vorrichtung und Verfahren zur geometrischen Kalibrierung unterschiedlicher Meßeinrichtungen, insbesondere bei der Anwendung bildgebender Operations-, Therapie- oder Diagnostikmethoden
DE102005012295A1 (de) Verfahren zu endoskopischen Navigation und zur Eichung von Endoskopsystemen sowie System

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11799164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11799164

Country of ref document: EP

Kind code of ref document: A1