
WO2009009515A2 - Device, system and method for aligning images - Google Patents

Device, system and method for aligning images

Info

Publication number
WO2009009515A2
WO2009009515A2 (PCT/US2008/069385)
Authority
WO
WIPO (PCT)
Prior art keywords
orientation
sensor
imaging
imaging device
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2008/069385
Other languages
French (fr)
Other versions
WO2009009515A3 (en)
Inventor
David P. Harris
Michael G. Lyttle
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/167,752 (published as US20090015680A1)
Application filed by Individual filed Critical Individual
Publication of WO2009009515A2
Publication of WO2009009515A3
Anticipated expiration
Current legal status: Ceased

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 Devices, other than using radiation, for detecting or locating foreign bodies; Determining position of diagnostic devices within or on the body of the patient
    • A61B5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B5/064 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body using markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/12 Arrangements for detecting or locating foreign bodies
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5247 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/54 Control of apparatus or devices for radiation diagnosis
    • A61B6/547 Control of apparatus or devices for radiation diagnosis involving tracking of position of the device or parts of the device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Endoscopes (AREA)

Abstract

An imaging alignment device, system and method. A sensor unit is provided that mounts to an imaging device and transmits orientation and perspective data to a base unit. The base unit provides information to facilitate placement of the imaging devices.

Description

DEVICE, SYSTEM AND METHOD FOR ALIGNING IMAGES
DESCRIPTION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit of U.S. Provisional Patent Application No. 60/958,910 filed July 10, 2007 and U.S. Nonprovisional Patent Application No. 12/167,752 filed July 3, 2008, the contents of which are incorporated herein by reference.
TECHNICAL FIELD
[0002] The invention generally relates to a device, system and method for aligning imaging devices to obtain images from a common perspective and orientation; and more particularly to a device, system and method including one or more sensors which generate and transmit orientation data of one or more imaging devices and a processing unit configured to process the data and facilitate alignment of the imaging devices to a common perspective and orientation.
BACKGROUND OF THE INVENTION
[0003] Today's complex surgical operating rooms typically utilize ten or more video and data sources to display during surgery. Visual access to these images is becoming critically important during the clinical process, particularly in the case of Minimally Invasive Surgery (MIS), where surgery is performed with small incisions, using endoscopes. When captured (digitized) these images also become an essential part of the medical record.
[0004] A typical operating room is a crowded environment, and likely to have limited video display monitors to show the resulting images. As such, a control system for routing these "many" sources to the "few" displays is required.
[0005] There is great value of using non-invasive Imaging Devices such as a C-Arm (i.e., a live X-Ray imager), during surgery. For example, because it reveals underlying structure, use of a C-Arm might facilitate the surgeon's decision on how to approach the proposed surgical site. However, it is often difficult to relate two images, particularly if the images are created from different imaging devices, having a different perspective and orientation. Similarly, it is also difficult to relate two images created from the same device if the images generated do not have the same perspective and orientation.
[0006] That is, comparing images from very different sources, particularly a comparison of optical and non-optical images, is difficult. Comparison of non-optical sources, such as CAT scans and C-Arms, can be problematic. Ensuring that the images are from the same orientation and perspective is critical in the operating room environment.
[0007] With the evolving use of three dimensional ("3D") scans and associated 3D digitized models, surgery is now migrating from MIS towards Image Guided Surgery (IGS) in an effort to increase surgical accuracy. For example, such techniques can be utilized where the results of an MRI scan, stored as a database-driven 3D model, have identified the location of a tumor in the brain. In this case, lasers are guided to excise the cancerous tissue.
[0008] These IGS systems are provided by companies such as Brainlab (BrainSuite®), Medtronic (StealthStation®) and Accuray (CyberKnife®). They rely on a number of factors: first, the generation of a 3D scan, with the patient locked into a specific orientation; second, the ability to reposition that patient in an identical position during surgery; and third, the ability to direct the robot, laser, or other surgical instrument, using that 3D database, with great accuracy to a location in 3D space, and therefore perform surgery without damaging healthy tissue.
[0009] These automated, data driven solutions require information on absolute position in 3D space, and require use of technologies such as Micro GPS, or infrared optical tracker systems that rely on fiducial markers.
[0010] In contrast, the present system and method is designed to help solve the issue of image management in operating rooms where the surgical team utilizes multiple, disparate imaging devices (fluoroscopes, ultrasound, microscopes, endoscopes and video cameras) during surgical procedures, without the high cost of full IGS systems. The present system does not require the use of a historical 3D Scan database, or automated surgical instruments. Neither does it rely on a Micro GPS, or equivalent technology to provide information on a location in 3D space. It is used to help the surgeon reposition an imaging device, or position a secondary imaging device, in the same orientation in 3D space, relative to a visually acquired "Target" location on the patient, so that the resulting images may be usefully compared, or overlaid in real time.
[0011] The present invention is provided to solve the problems discussed above and other problems, and to provide advantages and features not provided by prior alignment systems. A full discussion of the features and advantages of the present invention is deferred to the following detailed description, which proceeds with reference to the accompanying drawings.
SUMMARY OF THE INVENTION
[0012] The present invention provides a device, system and method for aligning images taken from one or more image devices. The invention can be utilized in connection with surgical and other medical procedures to provide two or more images from the various image devices with the same orientation and perspective. This greatly enhances a medical practitioner's ability to determine the appropriate course of action.
[0013] Typical imaging devices, i.e. video and data sources, include: electronic patient records, digitized radiological images, endoscopic camera images, patient vitals, surgical robot images, frozen section lab specimens, live radiological images, three dimensional navigation aid images, microscopy images, surgical light camera images, wall camera images, ultrasound device images, DVD playback, and videoconferences with outside clinicians.
[0014] In accordance with one embodiment of the invention, an imaging alignment system comprises a first sensor having a mounting for attachment to a first imaging device. The imaging device can be an optical device or a non-optical device. The first sensor is configured to generate and transmit orientation data, and/or other related data such as positional data or perspective data, of the first imaging device. The first sensor can include a wireless transmitter for transmitting the orientation data to the base unit.
[0015] The system further includes a base processing unit, such as a computer or other microprocessor based device, configured to receive the orientation data from the first sensor and to provide feedback to facilitate positioning of the first imaging device to a first orientation. The processing unit can use a visual and/or audio display to facilitate the positioning of the image device.
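As an illustration of the data flow described above, the sketch below shows one way a sensor unit's reading could be packaged for wireless transmission to the base unit. The application does not specify a wire format; the field names, the roll/pitch/yaw angle representation, and the JSON encoding are all assumptions for illustration.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class OrientationReading:
        """One report from a sensor unit (hypothetical wire format)."""
        sensor_id: str    # identifies the sensor, and hence the imaging device
        roll_deg: float   # rotation about the device's viewing axis
        pitch_deg: float  # tilt of the viewing axis up/down
        yaw_deg: float    # rotation of the viewing axis left/right
        timestamp: float  # seconds since the epoch

        def to_packet(self) -> bytes:
            """Serialize the reading for wireless transmission to the base unit."""
            return json.dumps(asdict(self)).encode("utf-8")

    # Example: a reading from a sensor mounted on a C-arm.
    reading = OrientationReading("c-arm-01", roll_deg=1.5, pitch_deg=-30.0,
                                 yaw_deg=12.0, timestamp=time.time())
    packet = reading.to_packet()

On the receiving side, decoding the packet with json.loads() recovers the same fields for the base unit to process.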
[0016] The system can further comprise a second sensor having a mounting attachment for attachment to a second imaging device. The second sensor is also configured to generate and transmit orientation data of the second imaging device. In fact, the system can utilize a plurality of such sensors for attachment to a plurality of imaging devices. The base unit is configured to provide feedback to facilitate positioning of the second imaging device (or others of the plurality of devices) to the first orientation.
[0017] The first sensor can include a rechargeable battery, an activation button and an indicator light. Additionally, the first sensor can be configured to receive alignment data from the base unit. The indicator light can be activated when the imaging device is in the first orientation.
[0018] The base unit can be utilized to provide calibration for the sensor units. It can also be used to recharge the sensor units.
[0019] In accordance with another embodiment of the invention an imaging alignment system comprises a first imaging device having a first sensor incorporated in the first imaging device and configured to generate and transmit orientation data of the first imaging device, and a base processing unit configured to receive the orientation data from the first sensor and to provide feedback to facilitate positioning of the first imaging device to a first orientation. The system further comprises a second sensor incorporated in a second imaging device. The second sensor is also configured to generate and transmit orientation data of the second imaging device to the base unit.
[0020] In accordance with yet a further embodiment of the invention, a method of aligning a first and second imaging device to a same orientation is provided. The method comprises providing a first sensor to a first imaging device; positioning the device to obtain an image of an object at a first orientation; transmitting positional and orientation data of the first imaging device to a processing unit; providing a second sensor to a second imaging device; transmitting positional and orientation data of the second imaging device to the processing unit; and, providing positioning information to facilitate positioning of the second imaging device to the first orientation. The step of providing positioning information to facilitate positioning of the second imaging device to the first orientation comprises displaying the positional information on a display.
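A minimal software sketch of this method, reusing the hypothetical OrientationReading above: the base unit stores the first device's orientation as a reference, then converts each reading from the second device into per-axis guidance for the display. The two-degree tolerance and the guidance strings are assumptions for illustration, not taken from the application.

    def align_to_reference(reference, live_readings, tolerance_deg=2.0):
        """Yield display guidance until the second imaging device matches
        the first device's recorded orientation (a simplified sketch).

        reference     -- OrientationReading captured from the first device
        live_readings -- iterable of OrientationReading from the second device
        """
        for reading in live_readings:
            deltas = {
                "roll": reference.roll_deg - reading.roll_deg,
                "pitch": reference.pitch_deg - reading.pitch_deg,
                "yaw": reference.yaw_deg - reading.yaw_deg,
            }
            if all(abs(d) <= tolerance_deg for d in deltas.values()):
                yield "ALIGNED"  # the base unit could now light the indicator
                return
            # e.g. "rotate pitch +4.5 deg" shown on the base unit display
            yield "; ".join(f"rotate {axis} {d:+.1f} deg"
                            for axis, d in deltas.items()
                            if abs(d) > tolerance_deg)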
[0021] Other features and advantages of the invention will be apparent from the following specification taken in conjunction with the following drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] To understand the present invention, it will now be described by way of example, with reference to the accompanying drawings in which:
FIG. 1 is a touch panel control interface for an operating control room;
FIG. 2 is a perspective view of an operating control room;
FIG. 3 is an image of a chest X-ray of a patient;
FIG. 4 is an external image from a light camera of the patient of FIG. 3;
FIG. 5 is an endoscopic camera image of a patient from a first perspective and orientation;
FIG. 6 is an endoscopic camera image of the patient of FIG. 5 from a second perspective and orientation;
FIG. 7 is an isometric image of a honeycomb article;
FIG. 8 is a front plan view of the article of FIG. 7;
FIG. 9 is a side plan view of the article of FIG. 7;
FIG. 10 is a perspective view of an operating room with a patient positioned below an imaging device;
FIG. 11 is the image of FIG. 3 overlaid over the image of FIG. 4;
FIG. 12 is a perspective view of a sensor for attaching to an imaging device in accordance with the present invention; and,
FIG. 13 is an image of a display from a processing unit in accordance with the present invention.
DETAILED DESCRIPTION
[0023] While this invention is susceptible of embodiments in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated.
[0024] Figure 1 shows a typical touchpanel control interface 10 for a control system in an operating room. The control interface includes controls 12 for displaying images from a number of image sources.
[0025] Referring to Figure 2, the image sources utilized in a typical operating room environment include optical image sources such as endoscopes, microscopes and light cameras, along with non-optical image sources such as C-arms, ultrasound, PCs and MRI. The images utilized may be produced live, or recorded from a different environment, such as a Cardiac Catheterization Lab, or Pathology Lab.
[0026] Figures 3 and 4 provide an example of two images of a Patient's chest. Specifically, Figure 4 shows an external image of the Patient's chest using a camera in the surgical light (i.e., a light camera) and Figure 3 is an X-ray of the Patient's chest using a C-arm (i.e., a live X-ray imager). Unlike the light camera image, the C-arm image only shows a portion of the chest and includes a circular image boundary.
[0027] Figures 5 and 6 show an endoscopic camera image projected onto flat discs oriented at different angles. The different views provided by each image illustrate how much distortion can be created when viewing the same region or area of a Patient from different orientations.
[0028] The problem of images having different perspectives and/or orientations is further illustrated in Figures 7-9. Figures 7-9 show an image of the same article 14 (i.e., a honeycombed rectangular object) from three different perspectives and orientations.
[0029] Figure 7 provides an isometric view of the article 14 from above the article. In sharp contrast to this view, Figures 8 and 9 provide front and side plan views, respectively. It is evident from these views that the article 14 looks entirely different depending on the perspective and orientation of the image.
[0030] Referring to Figure 12, the present invention utilizes an image alignment sensor unit 16 that can be attached to existing image devices in an operating room. Alternatively, new imaging devices can be made incorporating a sensor unit 16 directly into the devices.
[0031] The sensor unit 16 includes a fixed mounting 18 for attachment to the image devices. Referring to Figure 10, a sensor unit 16 is shown attached to a C-arm device and another unit 16 is attached to a Light Camera device in an operating room.
[0032] The C-Arm and the light camera are not physically connected and can be positioned (oriented) independently from different angles and perspectives, potentially creating disparate images for use by the surgical team. However, by correctly aligning these two disparate imaging devices, the images created generate outputs that are similar in orientation and perspective. This allows the surgical team to view different anatomical structures (internal and external, as an example) from different imaging devices at the exact same perspective, thus providing them more accurate comparative information for making decisions on treatment and surgical approach.
[0033] The sensor units 16 are utilized to provide position and orientation feedback (i.e., data) to a base unit or main image alignment system processing unit (e.g., a computer or other microprocessor based device). Figure 13 shows a screen shot 20 of the processing unit. Positional information is displayed on the screen 20.
[0034] The sensor units 16 are provided with a wireless transmission device 22 and are configured to wirelessly transmit the position and orientation data to the processing unit. Additionally, the sensors 16 can include an indicator light 24 and an activation switch 26. The sensor units 16 can also include a rechargeable battery.
[0035] The processing unit is configured to wirelessly receive the data from each sensor unit 16. The processing unit then processes the received data, and displays orientation and perspective feedback with the visual display 20 (the processing unit can also utilize an audio display) to help direct positioning of the imaging device or devices, and to supplement the sensor unit's onboard indicator light. The indicator light can be configured to go on when placed in the proper position. The processing unit is used to position the imaging devices to an appropriate location so that the image generated by the device is either consistent in perspective and orientation with prior images from the device, or with images from other devices.
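One way the receive side of the processing unit could be organized is sketched below, assuming the JSON packet format shown earlier and UDP transport; the port number and transport choice are assumptions, as the application does not name a protocol.

    import json
    import socket

    def base_unit_receive_loop(port=9999):
        """Keep the latest reading per sensor unit as the basis for the
        on-screen feedback (an illustrative sketch; the port and UDP
        transport are assumptions, not taken from the application)."""
        latest = {}  # sensor_id -> most recent reading (as a dict)
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", port))
        while True:
            packet, _addr = sock.recvfrom(1024)
            reading = json.loads(packet.decode("utf-8"))
            latest[reading["sensor_id"]] = reading
            # The display feedback (and the command to light a sensor's
            # onboard indicator) would be derived from `latest` here.

Keeping only the most recent reading per sensor_id also gives the base unit a natural way to identify and track each imaging device individually.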
[0036] The processing unit can perform multiple functions. These can include: calibration of sensor units 16, recharging of batteries, receiving and processing orientation signals, identifying specific imaging devices, display of resulting device orientation feedback, and so on. Additionally, the processing unit can be configured to consider additional information relating to the imaging devices to facilitate proper positioning. Such information can include, for example, the size and shape of the image device, and/or the distance of sensor unit 16 from the lens or focal point of the device.
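For instance, compensating for the sensor's distance from the lens could look like the following sketch, which rotates a known mounting offset by the reported yaw and pitch to estimate the lens position in room coordinates. This is a simplified model under stated assumptions (the offset lies on the viewing axis, so roll is ignored), and all names are illustrative.

    import math

    def estimate_lens_position(sensor_xyz, yaw_deg, pitch_deg, lens_offset_m):
        """Estimate the lens/focal-point position from the sensor's pose,
        assuming the sensor sits lens_offset_m metres from the lens along
        the device's viewing axis (a simplified illustrative model)."""
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        dx = lens_offset_m * math.cos(pitch) * math.cos(yaw)
        dy = lens_offset_m * math.cos(pitch) * math.sin(yaw)
        dz = lens_offset_m * math.sin(pitch)
        x, y, z = sensor_xyz
        return (x + dx, y + dy, z + dz)

    # Example: a sensor unit 0.25 m from the lens of a light camera.
    print(estimate_lens_position((1.0, 2.0, 2.5), yaw_deg=12.0,
                                 pitch_deg=-30.0, lens_offset_m=0.25))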
[0037] Using the image alignment system with existing imaging devices (C-arm and Light Camera as an example), it is possible to facilitate orienting any imaging device accurately and to the same perspective, every time. This process ensures that image outputs can be easily compared live or captured (digitized) on separate displays. Moreover, the images can even be superimposed on a single display, such as shown in Figure 11. The superimposing can be performed manually (e.g., with use of a computer mouse) or with software.
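A minimal sketch of the software route for such superimposition is shown below; the use of the Pillow library and the fifty-percent blend are assumptions for illustration, not the application's method.

    from PIL import Image  # pip install pillow

    def superimpose(xray_path, camera_path, alpha=0.5):
        """Overlay a C-arm image on a light-camera image of the same region.
        Assumes the two images were captured at the same orientation and
        perspective, i.e. after alignment with the sensor system."""
        base = Image.open(camera_path).convert("RGBA")
        overlay = Image.open(xray_path).convert("RGBA").resize(base.size)
        return Image.blend(base, overlay, alpha)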
[0038] Alignment of the images using the present system in the operating room, and potentially other acute clinical areas of the hospital, provides a clinical team more accurate comparative information for making decisions on the appropriate treatment and surgical approach.
[0039] While the specific embodiments have been illustrated and described, numerous modifications come to mind without significantly departing from the spirit of the invention, and the scope of protection is only limited by the scope of the accompanying Claims.

Claims

CLAIMS
What is claimed is:
1. An imaging alignment system comprising: a first sensor having a mounting for attachment to a first imaging device, the first sensor configured to generate and transmit orientation data of the first imaging device; a base processing unit configured to receive the orientation data from the first sensor and to provide feedback to facilitate positioning of the first imaging device to a first orientation.
2. The imaging alignment system of claim 1 wherein the first sensor includes a wireless transmitter for transmitting the orientation data to the base unit.
3. The imaging alignment system of claim 1 wherein the base unit includes a visual display to provide the feedback.
4. The imaging alignment system of claim 1 further comprising a second sensor having a mounting attachment for attachment to a second imaging device, the second sensor configured to generate and transmit orientation data of the second imaging device.
5. The imaging alignment system of claim 4 wherein the base unit is configured to provide feedback to facilitate positioning of the second imaging device to the first orientation.
6. The imaging alignment system of claim 1 wherein the first sensor includes a rechargeable battery.
7. The imaging alignment system of claim 1 wherein the first sensor is configured to receive alignment data from the base unit.
8. The imaging alignment system of claim 7 wherein the first sensor includes an indicator light that is activated when the imaging device is in the first orientation.
9. The imaging alignment system of claim 1 wherein the base unit is configured to provide calibration for the first sensor unit.
10. The imaging alignment system of claim 1 wherein the first sensor is mounted on a C- arm X-ray device.
11. The imaging alignment system of claim 10 wherein the second sensor is mounted on a Light Camera.
12. An imaging alignment system comprising: a first imaging device having a first sensor incorporated in the first imaging device, the first sensor configured to generate and transmit orientation data of the first imaging device; a base processing unit configured to receive the orientation data from the first sensor and to provide feedback to facilitate positioning of the first imaging device to a first orientation.
13. The system of claim 12 further comprising a second sensor incorporated in a second imaging device, the second sensor configured to generate and transmit orientation data of the second imaging device to the base unit.
14. A method of aligning a first and second imaging device to a same orientation comprising: providing a first sensor to a first imaging device; positioning the device to obtain an image of an object at a first orientation; transmitting positional and orientation data of the first imaging device to a processing unit; providing a second sensor to a second imaging device; transmitting positional and orientation data of the second imaging device to the processing unit; and, providing positioning information to facilitate positioning of the second imaging device to the first orientation.
15. The method of claim 14 wherein the step of providing positioning information to facilitate positioning of the second imaging device to the first orientation comprises: displaying the positional information on a display.
16. A system comprising a plurality of sensing devices that can be attached to mobile imaging devices such that the imaging devices can be oriented similarly, such that the resulting images are aligned, facilitating the comparison of those images.
17. A sensor that can be attached to a mobile or articulated imaging device for the purpose of accurately positioning or repositioning that device in three dimensional space.
18. A system comprising a plurality of sensing devices that have means to communicate their orientation wirelessly to a base station that can process and display the orientation information, to facilitate the positioning of the mobile imaging devices.
19. A system comprising a plurality of sensing devices that can be individually identified and tracked by the base station.
20. A base station that is capable of identifying a specific sensor, receiving data regarding the sensor's orientation in space from the sensor, and storing the data.
21. The base station of claim 20 that is capable of transmitting the data to other, specific sensors to facilitate the physical orientation of those sensors, such that they might conform to, or be specifically offset from the orientation of the first sensor.
22. A base station that is capable of exchanging data with a three dimensional computer model regarding a specific orientation, or view, and transmitting the data to other, specific sensors to facilitate the physical orientation of those sensors, such that they might conform to, or be specifically offset from the orientation of that view.
23. An imaging alignment device substantially as shown and described in the present application.
24. An imaging alignment system substantially as shown and described in the present application.
25. An imaging alignment method substantially as shown and described in the present application.
PCT/US2008/069385 2007-07-10 2008-07-08 Device, system and method for aligning images Ceased WO2009009515A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US95891007P 2007-07-10 2007-07-10
US60/958,910 2007-07-10
US12/167,752 US20090015680A1 (en) 2007-07-10 2008-07-03 Device, System and Method for Aligning Images
US12/167,752 2008-07-03

Publications (2)

Publication Number Publication Date
WO2009009515A2 true WO2009009515A2 (en) 2009-01-15
WO2009009515A3 WO2009009515A3 (en) 2009-03-26

Family

ID=40229440

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/069385 Ceased WO2009009515A2 (en) 2007-07-10 2008-07-08 Device, system and method for aligning images

Country Status (1)

Country Link
WO (1) WO2009009515A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9623575D0 (en) * 1996-11-13 1997-01-08 Univ Glasgow Medical imaging systems
US6764217B2 (en) * 2000-10-30 2004-07-20 Kabushiki Kaisha Toshiba X-ray diagnosis apparatus
US7581885B2 (en) * 2004-11-24 2009-09-01 General Electric Company Method and system of aligning x-ray detector for data acquisition

Also Published As

Publication number Publication date
WO2009009515A3 (en) 2009-03-26


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 08772451; Country of ref document: EP; Kind code of ref document: A2)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 08772451; Country of ref document: EP; Kind code of ref document: A2)