
US20030132936A1 - Display of two-dimensional and three-dimensional views during virtual examination - Google Patents


Info

Publication number
US20030132936A1
Authority
US
United States
Prior art keywords
dimensional
path
organ
volume
representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/301,034
Other languages
English (en)
Inventor
Kevin Kreeger
Bin Li
Frank Dachille
Jeff Meade
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VIATRONIX
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/301,034 priority Critical patent/US20030132936A1/en
Assigned to VIATRONIX reassignment VIATRONIX ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DACHILLE, FRANK C. IX, KREEGER, KEVIN, MEADE, JEFF, LI, BIN
Publication of US20030132936A1 publication Critical patent/US20030132936A1/en
Assigned to BOND, WILLIAM, AS COLLATERAL AGENT reassignment BOND, WILLIAM, AS COLLATERAL AGENT SECURITY AGREEMENT Assignors: VIATRONIX, INC.
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/08Volume rendering
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38Registration of image sequences
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10081Computed x-ray tomography [CT]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10088Magnetic resonance imaging [MRI]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20036Morphological image processing
    • G06T2207/20044Skeletonization; Medial axis transform
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30028Colon; Small intestine
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S128/00Surgery
    • Y10S128/92Computer assisted medical diagnostics

Definitions

  • the present disclosure relates to a system and method for performing a volume based three-dimensional virtual examination. More particularly, the disclosure relates to a virtual examination system and method providing enhanced visualization and navigational properties.
  • Two-dimensional (“2D”) visualization of human organs using medical imaging devices has been widely used for patient diagnosis.
  • medical imaging devices include computed tomography (“CT”) and magnetic resonance imaging (“MRI”), for example.
  • Three-dimensional (“3D”) images can be formed by stacking and interpolating between two-dimensional pictures produced from the scanning machines. Imaging an organ and visualizing its volume in three-dimensional space would be beneficial due to the lack of physical intrusion and the ease of data manipulation. However, the exploration of the three-dimensional volume image must be properly performed in order to fully exploit the advantages of virtually viewing an organ from the inside.
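  • The stacking-and-interpolation step described above can be sketched as follows. This is a minimal NumPy illustration under assumed array shapes, not the patent's implementation:

```python
import numpy as np

def slices_to_volume(slices, slice_spacing_mm, pixel_size_mm):
    """Stack 2D axial slices into a (z, y, x) volume, then linearly
    interpolate along z so the slice spacing matches the in-plane
    pixel size, giving roughly isotropic voxels."""
    volume = np.stack(slices, axis=0).astype(float)
    nz = volume.shape[0]
    new_nz = int(round((nz - 1) * slice_spacing_mm / pixel_size_mm)) + 1
    new_z = np.linspace(0.0, nz - 1, new_nz)   # fractional slice indices
    lo = np.floor(new_z).astype(int)
    hi = np.minimum(lo + 1, nz - 1)
    w = (new_z - lo)[:, None, None]            # interpolation weights
    return (1.0 - w) * volume[lo] + w * volume[hi]

# e.g. 10 slices acquired 2 mm apart with 1 mm pixels resample to 19 slices
volume = slices_to_volume([np.zeros((4, 4))] * 10, 2.0, 1.0)
```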
  • When viewing the 3D volume virtual image of an environment, a functional model must be used to explore the virtual space.
  • One possible model is a virtual “camera” that can be used as a point of reference for the viewer to explore the virtual space.
  • Camera control in the context of navigation within a general 3D virtual environment has been previously studied.
  • the first technique gives the operator complete control of the camera; however, complete control of a camera in a large domain would be tedious and tiring, and an operator might not view all the important features between the start and finishing points of the exploration.
  • the second technique of camera control is a planned navigational method, which assigns the camera a predetermined path to take and which cannot be accidentally changed by the operator. This is akin to having an engaged “autopilot”. This allows the operator to concentrate on the virtual space being viewed, and not have to worry about steering into walls of the environment being examined. However, this second technique does not give the viewer the flexibility to alter the course or investigate an interesting area viewed along the flight path.
  • Radiologists and other specialists have historically been trained to analyze scan data consisting of two-dimensional slices. However, while stacks of such slices may be useful for analysis, they do not provide an efficient or intuitive means to navigate through a virtual organ, especially one as tortuous and complex as the colon. There remains a need for a virtual examination system providing data in a conventional format for analysis while, in addition, allowing an operator to easily navigate a virtual organ.
  • a preferred embodiment of the present disclosure generates a three-dimensional visualization image of an object such as a human organ using volume visualization techniques and explores the virtual image using a guided navigation system, which allows the operator to travel along a predefined flight path and to adjust both the position and viewing angle to a particular portion of interest in the image away from the predefined path in order to identify polyps, cysts or other abnormal features in the organ.
  • An aspect of the present disclosure relates to a method for performing a three-dimensional internal virtual examination of at least one organ.
  • the organ is scanned with a radiological scanning device to produce scan data representative of the organ which is then used to create a three-dimensional volume representation of the organ that includes volume elements.
  • the scan data includes a sequence of axial images.
  • a defined flight path is generated and guided navigation through the three-dimensional representation is performed.
  • one of the series of axial images is displayed, wherein the displayed image corresponds to the current location along the defined path.
  • Another aspect of the present disclosure relates to an operator interface for a three-dimensional virtual examination system of an object wherein the virtual examination includes a guided navigation along a defined path within a three-dimensional volume representation of the object created from scanning data comprising a sequence of two-dimensional axial images of the object and then generating volume elements of the representation based on these axial images.
  • the operator interface includes a display screen having a plurality of sub-windows simultaneously visible. Within a first of these sub-windows volume elements responsive to the defined path and an operator's input during the guided navigation are displayed in real-time. In a second of these sub-windows one of the two-dimensional images corresponding to a current location along the defined path is displayed.
  • This operator interface can also be stored as instructions on a computer-readable medium that, upon execution, cause a processor to provide the interface.
  • system and method embodiments are provided for generating a three-dimensional visualization image of an object such as an organ using volume visualization techniques and exploring the image using a guided navigation system, which allows the operator to travel along a flight path and to adjust the view to a particular portion of the image of interest in order, for example, to identify polyps, cysts or other abnormal features in the visualized organ.
  • One or more series of two-dimensional renditions of the organ, correlated to the flight path location, can be provided to an operator to assist in analyzing the organ. The three-dimensional representation, a display of the flight path, and the two-dimensional slices are simultaneously displayed to the operator.
  • FIG. 1 shows a flow chart of the steps for performing a virtual examination of an object, specifically a colon, in accordance with the disclosure.
  • FIG. 2 shows an illustration of a “submarine” camera model which performs guided navigation in the virtual organ.
  • FIG. 3 shows a diagram illustrating a two-dimensional cross-section of a volumetric colon which contains the flight path.
  • FIG. 4 shows a diagram of a system used to perform a virtual examination of a human organ in accordance with the disclosure.
  • FIG. 5 shows an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure.
  • FIG. 6 shows an exemplary display of a two-dimensional slice of scan data according to an embodiment of the present disclosure.
  • FIG. 7 shows the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path.
  • FIG. 8 shows an exemplary operator interface screen according to embodiments of the present disclosure.
  • FIG. 9 shows a block diagram of a system embodiment based on a personal computer bus architecture.
  • the preferred embodiment to be described is the examination of an organ in the human body, specifically the colon.
  • the colon is long and twisted, which makes it especially suited to a virtual examination, sparing the patient the monetary expense, discomfort, and increased hazard of a physical probe.
  • organs that can be examined include the lungs, stomach and portions of the gastrointestinal system, the heart and blood vessels.
  • a method for performing a virtual examination of an object such as a colon is indicated generally by the reference numeral 100 .
  • the method 100 illustrates the steps necessary to perform a virtual colonoscopy using volume visualization techniques.
  • Step 101 prepares the colon to be scanned in order to be viewed for examination if required by either the doctor or the particular scanning instrument.
  • This preparation could include cleansing the colon with a “cocktail” or liquid, which enters the colon after being orally ingested and passed through the stomach.
  • the cocktail forces the patient to expel waste material that is present in the colon.
  • a substance used is Golytely.
  • air or carbon dioxide can be forced into the colon in order to expand it to make the colon easier to scan and examine.
  • Step 101 does not need to be performed in all examinations as indicated by the dashed line in FIG. 1.
  • Step 103 scans the organ that is to be examined.
  • the scanner can be an apparatus well known in the art, such as a spiral CT-scanner for scanning a colon or a Zenith MRI machine for scanning a lung labeled with xenon gas, for example.
  • the scanner must be able to take multiple images from different positions around the body during suspended respiration, in order to produce the data necessary for the volume visualization.
  • data can be acquired using a GE/CTI spiral mode scanner operating in a helical mode of 5 mm, 1.5-2.0:1 pitch, reconstructed in 1 mm slices, where the pitch is adjusted based upon the patient's height in a known manner.
  • a routine imaging protocol of 120 kVp and 200-280 ma can be utilized for this operation.
  • the data can be acquired and reconstructed as 1 mm thick slice images having an array size of 512×512 pixels in the field of view, which varies from 34 to 40 cm depending on the patient's size.
  • the number of such slices generally varies under these conditions from 300 to 450, depending on the patient's height.
  • the image data set is converted to volume elements or voxels.
  • An example of a single CT-image would use an X-ray beam of 5 mm width, 1:1 to 2:1 pitch, with a 40 cm field-of-view being performed from the top of the splenic flexure of the colon to the rectum.
  • Discrete data representations of the object can be produced by other methods besides scanning.
  • Voxel data representing an object can be derived from a geometric model by techniques described in U.S. Pat. No. 5,038,302 entitled “Method of Converting Continuous Three-Dimensional Geometrical Representations into Discrete Three-Dimensional Voxel-Based Representations Within a Three-Dimensional Voxel-Based System” by Kaufman, issued Aug. 8, 1991, filed Jul. 26, 1988, which is hereby incorporated by reference in its entirety. Additionally, data can be produced by a computer model of an image, which can be converted to three-dimensional voxels and explored in accordance with this disclosure.
  • Step 104 converts the scanned images into three-dimensional volume elements (“voxels”).
  • the scan data is reformatted into 5 mm thick slices at increments of 1 mm or 2.5 mm and reconstructed in 1 mm slices, with each slice represented as a matrix of 512 by 512 pixels. By doing this, voxels of approximately 1 cubic mm are created. Thus a large number of 2D slices are generated depending upon the length of the scan. The set of 2D slices is then reconstructed to 3D voxels.
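  • The voxel-size arithmetic implied by these parameters can be checked directly (a worked example using the figures quoted above):

```python
# A 512 x 512 matrix over a 34-40 cm field of view gives sub-millimeter
# in-plane pixels, so 1 mm slice spacing yields voxels of roughly 1 mm^3.
def pixel_size_mm(fov_cm, matrix=512):
    return fov_cm * 10.0 / matrix   # cm -> mm, divided across the matrix

smallest = pixel_size_mm(34)   # ~0.66 mm per pixel
largest = pixel_size_mm(40)    # ~0.78 mm per pixel
```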
  • the conversion process of 2D images from the scanner into 3D voxels can either be performed by the scanning machine itself or by a separate machine such as a computer implementing techniques that are well known in the art (see, e.g., U.S. Pat. No. 4,985,856 entitled “Method and Apparatus for Storing, Accessing, and Processing Voxel-based Data” by Kaufman et al.; issued Jan. 15, 1991, filed Nov. 11, 1988; which is hereby incorporated by reference in its entirety).
  • Step 105 allows the operator to define the portion of the selected organ to be examined.
  • a physician may be interested in a particular section of the colon likely to develop polyps.
  • the physician can view a two dimensional slice overview map to indicate the section to be examined.
  • a starting point and finishing point of a path to be viewed can be indicated by the physician/operator.
  • a conventional computer and computer interface (e.g., keyboard, mouse or spaceball) can be used to indicate these points.
  • a grid system with coordinates can be used for keyboard entry or the physician/operator can “click” on the desired points.
  • the entire image of the colon can also be viewed if desired.
  • Step 107 performs the planned or guided navigation operation of the virtual organ being examined.
  • Performing a guided navigation operation is defined as navigating through an environment along a predefined or automatically predetermined flight path, which can be manually adjusted by an operator at any time.
  • the virtual examination is modeled on having a tiny viewpoint or “camera” traveling through the virtual space with a view direction or “lens” pointing towards the finishing point.
  • the guided navigation technique provides a level of interaction with the camera, so that the camera can navigate through a virtual environment automatically in the case of no operator interaction, and at the same time, allow the operator to manipulate the camera when necessary.
  • the preferred embodiment of achieving guided navigation is to use a physically based camera model that employs potential fields to control the movement of the camera, as is further detailed with respect to FIG. 2.
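  • A potential-field camera update of the kind referred to above might be sketched as follows; the force terms and gain names here are illustrative assumptions, not the patent's equations:

```python
import numpy as np

def camera_step(pos, path_target, wall_dist, away_from_wall,
                dt=0.1, k_attract=1.0, k_repel=0.5):
    """One camera update: an attractive force pulls the camera toward
    the next flight-path point, while a repulsive force that grows as
    the wall gets close keeps it from colliding with the colon surface."""
    attract = k_attract * (np.asarray(path_target) - np.asarray(pos))
    repel = (k_repel / max(wall_dist, 1e-3) ** 2) * np.asarray(away_from_wall)
    return np.asarray(pos) + dt * (attract + repel)
```

Far from the wall the attraction dominates and the camera follows the flight path; near the wall the inverse-square repulsion dominates and pushes the camera back toward the lumen.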
  • Step 109 which can be performed concurrently with step 107 , displays the inside of the organ from the viewpoint of the camera model along the selected pathway of the guided navigation operation.
  • Three-dimensional displays can be generated using techniques well known in the art such as the marching cubes technique, for example.
  • a technique is used that reduces the vast number of data computations necessary for the display of the virtual organ.
  • the method described in FIG. 1 can also be applied to scanning multiple organs in a body at the same time.
  • a patient may be examined for cancerous growths in both the colon and lungs.
  • the method of FIG. 1 would be modified to scan all the areas of interest in step 103 and to select the current organ to be examined in step 105 .
  • the physician/operator may initially select the colon to virtually explore and later explore the lung.
  • two different doctors with different specialties may virtually explore different scanned organs relating to their respective specialties.
  • the next organ to be examined is selected and its portion will be defined and explored. This continues until all organs that need examination have been processed.
  • a “submarine camera” model that performs guided navigation in a virtual organ is indicated generally by the reference numeral 200 .
  • the model 200 depicts a viewpoint control model that performs the guided navigation technique of step 107 .
  • the default navigation is similar to that of planned navigation that automatically directs the camera along a flight path from one selected end of the colon to another.
  • the camera stays at the center of the colon for obtaining better views of the colonic surface.
  • the operator of the virtual camera using guided navigation can interactively bring the camera close to a specific region and direct the motion and angle of the camera to study the interesting area in detail, without unwillingly colliding with the walls of the colon.
  • the operator can control the camera with a standard interface device such as a keyboard, mouse or nonstandard device such as a spaceball.
  • six degrees of freedom for the camera are required.
  • the camera must be able to move in the horizontal, vertical, and depth or Z direction (axes 217 ), as well as being able to rotate in another three degrees of freedom (axes 219 ) to allow the camera to move and scan all sides and angles of a virtual environment.
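  • The six degrees of freedom can be represented by a minimal camera state (a sketch with assumed names, not the patent's model): three translational axes and three rotational axes.

```python
from dataclasses import dataclass, field

@dataclass
class SubmarineCamera:
    # three translational degrees of freedom (x, y, z)
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    # three rotational degrees of freedom (yaw, pitch, roll)
    orientation: list = field(default_factory=lambda: [0.0, 0.0, 0.0])

    def translate(self, dx, dy, dz):
        self.position = [p + d for p, d in zip(self.position, (dx, dy, dz))]

    def rotate(self, dyaw, dpitch, droll):
        self.orientation = [a + d for a, d in
                            zip(self.orientation, (dyaw, dpitch, droll))]

cam = SubmarineCamera()
cam.translate(0.0, 0.0, 1.0)   # advance along the depth (Z) axis
cam.rotate(0.1, 0.0, 0.0)      # small yaw toward a feature of interest
```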
  • a two dimensional cross-section of a volumetric colon containing a flight path is indicated generally by the reference numeral 300 .
  • the cross-section 300 includes the final flight path for the camera model down the center of the colon, as indicated by “x”s, and at least one starting location 301 or 303 near one end of the colon.
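  • One crude way to obtain such a center path is to take the centroid of the colon lumen in each axial slice of a segmented binary mask; this is an illustrative simplification, not the path-generation method of the patent, and the centroid can leave the lumen in sharply curved sections:

```python
import numpy as np

def rough_centerline(mask):
    """mask: boolean (z, y, x) array, True inside the colon lumen.
    Returns a list of (x, y, z) centroids, one per non-empty slice."""
    path = []
    for z in range(mask.shape[0]):
        ys, xs = np.nonzero(mask[z])
        if len(xs):
            path.append((float(xs.mean()), float(ys.mean()), float(z)))
    return path
```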
  • a system used to perform a virtual examination of a human organ in accordance with the disclosure is indicated generally by the reference numeral 400 in FIG. 4.
  • the system 400 is for performing the virtual examination of an object such as a human organ using the techniques described herein.
  • a patient 401 lies on a platform 402 , while a scanning device 405 scans the area that contains the organ or organs to be examined.
  • the scanning device 405 contains a scanning portion 403 that takes images of the patient and an electronics portion 406 .
  • the electronics portion 406 includes an interface 407 , a central processing unit 409 , a memory 411 for temporarily storing the scanning data, and a second interface 413 for sending data to a virtual navigation platform or terminal 416 .
  • the interfaces 407 and 413 may be included in a single interface component or may be the same component.
  • the components in the portion 406 are connected together with conventional connectors.
  • the data provided from the scanning portion 403 of the device 405 is transferred to unit 409 for processing and is stored in memory 411 .
  • the central processing unit 409 converts the scanned 2D data to 3D voxel data and stores the results in another portion of the memory 411 .
  • the converted data may be directly sent to the interface unit 413 to be transferred to the virtual navigation terminal 416 .
  • the conversion of the 2D data could also take place at the virtual navigation terminal 416 after being transmitted from the interface 413 .
  • the converted data is transmitted over a carrier 414 to the virtual navigation terminal 416 in order for an operator to perform the virtual examination.
  • the data may also be transported in other conventional ways, such as storing the data on a storage medium and physically transporting it to terminal 416 or by using satellite transmissions, for example.
  • the scanned data need not be converted to its 3D representation until the visualization-rendering engine requires it to be in 3D form. This saves computational steps and memory storage space.
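  • This deferral can be sketched as a lazily built volume (class and attribute names are assumptions for illustration):

```python
import numpy as np

class ScanData:
    """Holds raw 2D axial images; the 3D voxel volume is built only
    when the rendering engine first asks for it."""
    def __init__(self, slices):
        self._slices = slices
        self._volume = None            # nothing converted yet

    @property
    def volume(self):
        if self._volume is None:       # convert on first access, once
            self._volume = np.stack(self._slices, axis=0)
        return self._volume
```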
  • the virtual navigation terminal 416 includes a screen for viewing the virtual organ or other scanned image, an electronics portion 415 and an interface control 419 such as a keyboard, mouse or spaceball.
  • the electronics portion 415 includes an interface port 421 , a central processing unit 423 , optional components 427 for running the terminal and a memory 425 .
  • the components in the terminal 416 are connected together with conventional connectors.
  • the converted voxel data is received in the interface port 421 and stored in the memory 425 .
  • the central processing unit 423 then assembles the 3D voxels into a virtual representation and runs the submarine camera model as described for FIG. 2 to perform the virtual examination.
  • the visibility technique is used to compute only those areas that are visible from the virtual camera, and display them on the screen 417 .
  • a graphics accelerator can also be used in generating the representations.
  • the operator can use the interface device 419 to indicate which portion of the scanned body is desired to be explored.
  • the interface device 419 can further be used to control and move the submarine camera as desired as detailed for FIG. 2.
  • the terminal portion 415 can be, for example, the Cube-4 dedicated system box, generally available from the Department of Computer Science at the State University of New York at Stony Brook.
  • the scanning device 405 and terminal 416 can be part of the same unit.
  • a single platform would be used to receive the scan image data, convert it to 3D voxels if necessary and perform the guided navigation.
  • An important feature in system 400 is that the virtual organ can be examined at a later time without the presence of the patient. Additionally, the virtual examination could take place while the patient is being scanned.
  • the scan data can also be sent to multiple terminals, which would allow more than one doctor to view the inside of the organ simultaneously. Thus a doctor in New York could be looking at the same portion of a patient's organ at the same time with a doctor in California while discussing the case. Alternatively, the data can be viewed at different times. Two or more doctors could perform their own examination of the same data in a difficult case. Multiple virtual navigation terminals could be used to view the same scan data.
  • An improved electronic colon cleansing technique employs modified bowel preparation operations followed by image segmentation operations, such that fluid and stool remaining in the colon during a computed tomographic (“CT”) or magnetic resonance imaging (“MRI”) scan can be detected and removed from the virtual colonoscopy images.
  • volume-rendering techniques may be used in connection with virtual colonoscopy procedures to further enhance the fidelity of the resulting image.
  • Methods for volume rendering are well known to those of ordinary skill in the pertinent art.
  • an exemplary representation of a colon and accompanying flight-path generated according to an embodiment of the present disclosure is indicated generally by the reference numeral 500 .
  • the representation 500 depicts a human colon 502 showing a centerline flight path 504 . As the operator travels through the virtual organ along this flight path, two-dimensional images of the current position are displayed.
  • an exemplary display of a two-dimensional slice of scan data is indicated generally by the reference numeral 600 .
  • the slice 600 is shown while advancing along the flight path, and the operator interface displays the virtual organ along with the slice for the current “z” coordinate and pans the image of that slice so that the current “x, y” position is in the center of the image.
  • the two-dimensional slices are axial slices, where convention has the z-axis pointing towards the head.
  • two-dimensional slices oriented on other planes can be generated and viewed as well.
  • the two-dimensional images displayed to the operator can be oriented on the sagittal plane, the coronal plane, or perpendicular to the flight path.
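  • The scroll-and-pan behavior for the axial case can be sketched as follows (function and parameter names are assumptions): pick the slice for the camera's z coordinate and crop a window that keeps (x, y) centered, clamped at the image borders:

```python
import numpy as np

def slice_for_position(volume, x, y, z, view=128):
    """volume: (z, y, x) voxel array. Returns a view x view crop of the
    axial slice at z, centered on (x, y) where possible."""
    img = volume[int(round(z))]
    h, w = img.shape
    half = view // 2
    top = min(max(int(round(y)) - half, 0), max(h - view, 0))
    left = min(max(int(round(x)) - half, 0), max(w - view, 0))
    return img[top:top + view, left:left + view]
```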
  • the colon of FIG. 5 intersected by a plane oriented perpendicular to the flight-path is indicated generally by the reference numeral 700 in FIG. 7.
  • the intersected colon 700 depicts a plane 704 indicating a two-dimensional image perpendicular to the flight path 702 through the colon 703 .
  • an exemplary operator interface screen is indicated generally by the reference numeral 2100 .
  • the screen 2100 includes a number of sub-windows that simultaneously provide an operator with graphical information from a number of different perspectives.
  • the center sub-window 2104 displays the view from inside the virtual organ.
  • An arrow or marker 2105 helps orient the operator along the projected flight path.
  • a complete view of this flight path, along with the entire organ is depicted in sub-window 2102 .
  • Operator controls 2108 are near the bottom of the screen 2100 and are useful to control the travel through the virtual organ. The rendering of the virtual organ, as well as the control of flight through the organ, have been described earlier and are not repeated here.
  • Each of these windows can include a marker, for example 2115 , 2113 and 2111 , to help orient the operator along the flight path.
  • each window 2106 , 2114 , 2112 , and 2110 has a respective control for scrolling through two-dimensional images such as scroll bar 2115 . Accordingly, the operator can traverse the flight path manually, in either direction, using this scroll bar.
  • the screen 2100 is exemplary in nature and a skilled artisan would recognize many equivalent alternatives within the scope of the present disclosure. For example, not all sub-windows 2106 , 2114 , 2112 and 2110 need to be displayed.
  • as illustrated in FIG. 9, the system 900 is an alternate hardware embodiment suitable for deployment on a personal computer (“PC”).
  • the system 900 includes a processor 910 that preferably takes the form of a high speed, multitasking processor, such as, for example, a Pentium III processor operating at a clock speed in excess of 400 MHz.
  • the processor 910 is coupled to a conventional bus structure 920 that provides for high-speed parallel data transfer.
  • Also coupled to the bus structure 920 are a main memory 930 , a graphics board 940 , and a volume rendering board 950 .
  • the graphics board 940 is preferably one that can perform texture mapping, such as, for example, a Diamond Viper v770 Ultra board manufactured by Diamond Multimedia Systems.
  • the volume rendering board 950 can take the form of the VolumePro board from Mitsubishi Electric, for example, which is based on U.S. Pat. Nos. 5,760,781 and 5,847,711, which are hereby incorporated by reference in their entirety.
  • a display device 945 such as a conventional SVGA or RGB monitor, is operably coupled to the graphics board 940 for displaying the image data.
  • a scanner interface board 960 is also provided for receiving data from an imaging scanner, such as an MRI or CT scanner, for example, and transmitting such data to the bus structure 920 .
  • the scanner interface board 960 may be an application specific interface product for a selected imaging scanner or can take the form of a general-purpose input/output card.
  • the PC based system 900 will generally include an I/O interface 970 for coupling I/O devices 980 , such as a keyboard, digital pointer or mouse, and the like, to the processor 910 .
  • the I/O interface can be coupled to the processor 910 via the bus 920 .
  • Embodiments of the present disclosure provide a user interface displaying both two-dimensional and three-dimensional data. Organs within the body are, by nature, three-dimensional. Conventional medical imaging devices, however, as explained herein, create stacks of two-dimensional images when acquiring scan data. Radiologists and other specialists, therefore, have historically been trained to review and analyze these two-dimensional images. As a result, most doctors are comfortable viewing two-dimensional images even if three-dimensional reconstructions or virtualizations are available.
  • Three-dimensional flight paths are intuitive, efficient tools for traveling virtually through volumetric renderings of human organs, either automatically or manually.
  • Each point along the flight path is represented by a coordinate (x, y, z).
  • These coordinates are used to automatically scroll and pan the series of two-dimensional images that doctors are used to analyzing.
  • The operator does not have to manually navigate through an organ in two dimensions but, instead, can let the present virtualization system advance along the organ while the operator concentrates on analyzing each two-dimensional image.
  • The methods and systems described herein could also be applied to virtually examine an animal, a fish, or an inanimate object.
  • The technique could be used, for example, to detect the contents of sealed objects that cannot be opened.
  • The technique could also be applied inside an architectural structure, such as a building or cavern, enabling the operator to navigate through the structure.
  • The teachings of the present disclosure are implemented as a combination of hardware and software.
  • The software is preferably implemented as an application program tangibly embodied on a program storage unit.
  • The application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
  • The machine is implemented on a computer platform having hardware such as one or more central processing units (“CPU”), a random access memory (“RAM”), and input/output (“I/O”) interfaces.
  • The computer platform may also include an operating system and microinstruction code.
  • The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU.
  • Various other peripheral units may be connected to the computer platform, such as an additional data storage unit and a printing unit.
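The flight-path-to-2D synchronization described above (a path point (x, y, z) driving the scroll and pan of the two-dimensional image stacks) can be sketched in a few lines. This is an illustrative sketch only; the function names, the returned dictionary layout, and the voxel-spacing convention are assumptions for the example, not details from the patent.

```python
# Illustrative sketch of driving 2D slice viewers from a 3D flight path.
# All names and conventions here are hypothetical, not from the patent.

def slice_views_for_path_point(point, spacing):
    """Map a flight-path point (x, y, z), in voxel coordinates, to the
    slice index and in-plane pan offset (in mm, given the voxel spacing)
    for each of the three orthogonal 2D views."""
    x, y, z = point
    return {
        "axial":    {"slice": round(z), "pan": (x * spacing[0], y * spacing[1])},
        "sagittal": {"slice": round(x), "pan": (y * spacing[1], z * spacing[2])},
        "coronal":  {"slice": round(y), "pan": (x * spacing[0], z * spacing[2])},
    }

def autoscroll(path, spacing):
    """Advance automatically along the flight path, yielding the 2D views
    to display at each step, so the operator can concentrate on reading
    each two-dimensional image rather than navigating manually."""
    for point in path:
        yield slice_views_for_path_point(point, spacing)
```

In such a scheme, the display would redraw the orthogonal 2D viewers from each yielded record as the virtual camera advances along the path, which is the automatic scroll-and-pan behavior the description attributes to the flight-path coordinates.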

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Graphics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Processing Or Creating Images (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
US10/301,034 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination Abandoned US20030132936A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/301,034 US20030132936A1 (en) 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US33171201P 2001-11-21 2001-11-21
US10/301,034 US20030132936A1 (en) 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination

Publications (1)

Publication Number Publication Date
US20030132936A1 true US20030132936A1 (en) 2003-07-17

Family

ID=23295049

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/301,034 Abandoned US20030132936A1 (en) 2001-11-21 2002-11-21 Display of two-dimensional and three-dimensional views during virtual examination
US10/496,430 Abandoned US20050169507A1 (en) 2001-11-21 2002-11-21 Registration of scanning data acquired from different patient positions
US11/273,430 Expired - Fee Related US7372988B2 (en) 2001-11-21 2005-11-14 Registration of scanning data acquired from different patient positions

Family Applications After (2)

Application Number Title Priority Date Filing Date
US10/496,430 Abandoned US20050169507A1 (en) 2001-11-21 2002-11-21 Registration of scanning data acquired from different patient positions
US11/273,430 Expired - Fee Related US7372988B2 (en) 2001-11-21 2005-11-14 Registration of scanning data acquired from different patient positions

Country Status (5)

Country Link
US (3) US20030132936A1 (fr)
EP (1) EP1456805A1 (fr)
AU (1) AU2002365560A1 (fr)
CA (1) CA2467646A1 (fr)
WO (1) WO2003046811A1 (fr)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050048456A1 (en) * 2003-08-14 2005-03-03 Christophe Chefd'hotel Method and apparatus for registration of virtual endoscopic images
US20050116957A1 (en) * 2003-11-03 2005-06-02 Bracco Imaging, S.P.A. Dynamic crop box determination for optimized display of a tube-like structure in endoscopic view ("crop box")
US20060103678A1 (en) * 2004-11-18 2006-05-18 Pascal Cathier Method and system for interactive visualization of locally oriented structures
US20070116332A1 (en) * 2003-11-26 2007-05-24 Viatronix Incorporated Vessel segmentation using vesselness and edgeness
US20090040221A1 (en) * 2003-05-14 2009-02-12 Bernhard Geiger Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US20100215226A1 (en) * 2005-06-22 2010-08-26 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US20100283781A1 (en) * 2008-01-04 2010-11-11 Kriveshko Ilya A Navigating among images of an object in 3d space
US20110122068A1 (en) * 2009-11-24 2011-05-26 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
CN108210066A (zh) * 2016-12-22 2018-06-29 Biosense Webster (Israel) Ltd. Two-dimensional pulmonary vein display
CN108701492A (zh) * 2016-03-03 2018-10-23 Koninklijke Philips N.V. Medical image navigation system
US10517690B2 (en) * 2014-10-31 2019-12-31 Scopis Gmbh Instrument guidance system for sinus surgery
US11191423B1 (en) * 2020-07-16 2021-12-07 DOCBOT, Inc. Endoscopic system and methods having real-time medical imaging
US11423318B2 (en) * 2019-07-16 2022-08-23 DOCBOT, Inc. System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
US11684241B2 (en) 2020-11-02 2023-06-27 Satisfai Health Inc. Autonomous and continuously self-improving learning system
US11694114B2 (en) 2019-07-16 2023-07-04 Satisfai Health Inc. Real-time deployment of machine learning systems
US12062427B2 (en) * 2017-06-30 2024-08-13 Shanghai United Imaging Healthcare Co., Ltd. Method and system for tissue density analysis

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003046811A1 (fr) * 2001-11-21 2003-06-05 Viatronix Incorporated Registration of scanning data acquired from different patient positions
US20060071932A1 (en) * 2002-11-21 2006-04-06 Koninklijke Philips Electronics N.V. Method and apparatus for visualizing a sequence of volume images
US20050152588A1 (en) * 2003-10-28 2005-07-14 University Of Chicago Method for virtual endoscopic visualization of the colon by shape-scale signatures, centerlining, and computerized detection of masses
US7274811B2 (en) * 2003-10-31 2007-09-25 Ge Medical Systems Global Technology Company, Llc Method and apparatus for synchronizing corresponding landmarks among a plurality of images
US7574032B2 (en) * 2003-10-31 2009-08-11 General Electric Company Method and apparatus for virtual subtraction of stool from registration and shape based analysis of prone and supine scans of the colon
US20050256400A1 (en) * 2003-12-03 2005-11-17 Bhargav Raman Method to identify arterial and venous vessels
US7516416B2 (en) * 2004-06-04 2009-04-07 Stereotaxis, Inc. User interface for remote control of medical devices
ATE484811T1 (de) * 2004-06-23 2010-10-15 Koninkl Philips Electronics Nv Virtual endoscopy
US20060034513A1 (en) * 2004-07-23 2006-02-16 Siemens Medical Solutions Usa, Inc. View assistance in three-dimensional ultrasound imaging
BRPI0419168B1 (pt) * 2004-09-24 2017-05-16 Nokia Corp Electronic device comprising detection of a user input during an inactive operating mode
WO2006042077A2 (fr) * 2004-10-09 2006-04-20 Viatronix Incorporated Sampling of medical images for virtual histology
CN101138009A (zh) * 2005-03-07 2008-03-05 Koninklijke Philips Electronics N.V. Apparatus and method for relating first and second 3D images of a tubular object
JP5231210B2 (ja) * 2005-04-13 2013-07-10 Koninklijke Philips Electronics N.V. Method, system and computer program for segmenting a surface in a multidimensional dataset
US7379062B2 (en) * 2005-08-01 2008-05-27 Barco Nv Method for determining a path along a biological object with a lumen
WO2007015187A2 (fr) * 2005-08-01 2007-02-08 Koninklijke Philips Electronics N.V. Method and apparatus for matching two digital images of an object
WO2007023423A2 (fr) * 2005-08-24 2007-03-01 Koninklijke Philips Electronics N.V. Apparatus and method for labeling anatomical image data
WO2007023450A2 (fr) * 2005-08-24 2007-03-01 Koninklijke Philips Electronics N.V. Apparatus and method for identifying sections of an anatomical object
US20080285822A1 (en) 2005-11-09 2008-11-20 Koninklijke Philips Electronics N. V. Automated Stool Removal Method For Medical Imaging
US20070109299A1 (en) * 2005-11-15 2007-05-17 Vital Images, Inc. Surface-based characteristic path generation
US7570986B2 (en) * 2006-05-17 2009-08-04 The United States Of America As Represented By The Secretary Of Health And Human Services Teniae coli guided navigation and registration for virtual colonoscopy
US8023703B2 (en) * 2006-07-06 2011-09-20 The United States of America as represented by the Secretary of the Department of Health and Human Services, National Institues of Health Hybrid segmentation of anatomical structure
US7853058B2 (en) 2006-11-22 2010-12-14 Toshiba Medical Visualization Systems Europe, Limited Determining a viewpoint for navigating a virtual camera through a biological object with a lumen
CN101889284B (zh) * 2007-12-07 2014-03-12 Koninklijke Philips Electronics N.V. Navigation guidance
US8144964B1 (en) * 2008-05-30 2012-03-27 Ellis Amalgamated LLC Image feature analysis
EP2409280A1 (fr) * 2009-03-20 2012-01-25 Koninklijke Philips Electronics N.V. Visualization of a view of a scene
US8213700B2 (en) * 2009-03-31 2012-07-03 Icad, Inc. Systems and methods for identifying suspicious anomalies using information from a plurality of images of an anatomical colon under study
DE102009035441B4 (de) * 2009-07-31 2016-11-24 Siemens Healthcare Gmbh Method and image processing system for generating a volume view image of the interior of a body
JP5551955B2 (ja) * 2010-03-31 2014-07-16 Fujifilm Corporation Projection image generation apparatus, method, and program
US10679365B1 (en) * 2010-11-24 2020-06-09 Fonar Corporation Method of correlating a slice profile
US8379955B2 (en) 2010-11-27 2013-02-19 Intrinsic Medical Imaging, LLC Visualizing a 3D volume dataset of an image at any position or orientation from within or outside
US10373375B2 (en) * 2011-04-08 2019-08-06 Koninklijke Philips N.V. Image processing system and method using device rotation
US9060672B2 (en) * 2013-02-11 2015-06-23 Definiens Ag Coregistering images of needle biopsies using multiple weighted landmarks
DE102014203113B4 (de) * 2014-02-20 2019-01-10 Siemens Healthcare Gmbh Generation of image data of an examination object by means of a magnetic resonance tomograph
RU2706231C2 (ru) * 2014-09-24 2019-11-15 Koninklijke Philips N.V. Visualization of a volumetric image of an anatomical structure
CA3020507A1 (fr) 2015-04-13 2016-10-20 Accumetra, Llc Systeme de surveillance de qualite de balayage automatique

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5839440A (en) * 1994-06-17 1998-11-24 Siemens Corporate Research, Inc. Three-dimensional image registration method for spiral CT angiography
US5920319A (en) * 1994-10-27 1999-07-06 Wake Forest University Automatic analysis in virtual endoscopy
US6161211A (en) * 1996-10-28 2000-12-12 Altera Corporation Method and apparatus for automated circuit design
US6181320B1 (en) * 1998-08-19 2001-01-30 International Business Machines Corporation Method for converting timing diagram into timing graph and vice versa
US6331116B1 (en) * 1996-09-16 2001-12-18 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual segmentation and examination
US20020015517A1 (en) * 2000-03-29 2002-02-07 Hwang Scott N. Subvoxel processing: a method for reducing partial volume blurring
US6448970B1 (en) * 1997-07-25 2002-09-10 Namco Ltd. Image generation apparatus for causing movement of moving body based on flow data for a fluid set on a course, and information storage medium
US20030038798A1 (en) * 2001-02-28 2003-02-27 Paul Besl Method and system for processing, compressing, streaming, and interactive rendering of 3D color image data
US20030052875A1 (en) * 2001-01-05 2003-03-20 Salomie Ioan Alexandru System and method to obtain surface structures of multi-dimensional objects, and to represent those surface structures for animation, transmission and display
US6597359B1 (en) * 2000-05-17 2003-07-22 Raychip, Inc. Hierarchical space subdivision hardware for ray tracing
US20040109603A1 (en) * 2000-10-02 2004-06-10 Ingmar Bitter Centerline and tree branch skeleton determination for virtual objects
US6928314B1 (en) * 1998-01-23 2005-08-09 Mayo Foundation For Medical Education And Research System for two-dimensional and three-dimensional imaging of tubular structures in the human body

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07120621B2 (ja) * 1989-05-08 1995-12-20 Canon Kabushiki Kaisha Alignment method
US5937083A (en) * 1996-04-29 1999-08-10 The United States Of America As Represented By The Department Of Health And Human Services Image registration using closest corresponding voxels with an iterative registration process
US5971767A (en) * 1996-09-16 1999-10-26 The Research Foundation Of State University Of New York System and method for performing a three-dimensional virtual examination
WO2003046811A1 (fr) * 2001-11-21 2003-06-05 Viatronix Incorporated Registration of scanning data acquired from different patient positions

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090040221A1 (en) * 2003-05-14 2009-02-12 Bernhard Geiger Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US8059877B2 (en) * 2003-05-14 2011-11-15 Siemens Corporation Method and apparatus for fast automatic centerline extraction for virtual endoscopy
US7300398B2 (en) * 2003-08-14 2007-11-27 Siemens Medical Solutions Usa, Inc. Method and apparatus for registration of virtual endoscopic images
US20050048456A1 (en) * 2003-08-14 2005-03-03 Christophe Chefd'hotel Method and apparatus for registration of virtual endoscopic images
US20050116957A1 (en) * 2003-11-03 2005-06-02 Bracco Imaging, S.P.A. Dynamic crop box determination for optimized display of a tube-like structure in endoscopic view ("crop box")
US20070116332A1 (en) * 2003-11-26 2007-05-24 Viatronix Incorporated Vessel segmentation using vesselness and edgeness
US20060103678A1 (en) * 2004-11-18 2006-05-18 Pascal Cathier Method and system for interactive visualization of locally oriented structures
US8600125B2 (en) * 2005-06-22 2013-12-03 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100215226A1 (en) * 2005-06-22 2010-08-26 The Research Foundation Of State University Of New York System and method for computer aided polyp detection
US20100259542A1 (en) * 2007-11-02 2010-10-14 Koninklijke Philips Electronics N.V. Automatic movie fly-path calculation
US10217282B2 (en) 2007-11-02 2019-02-26 Koninklijke Philips N.V. Automatic movie fly-path calculation
US20100283781A1 (en) * 2008-01-04 2010-11-11 Kriveshko Ilya A Navigating among images of an object in 3d space
US11163976B2 (en) 2008-01-04 2021-11-02 Midmark Corporation Navigating among images of an object in 3D space
US9937022B2 (en) * 2008-01-04 2018-04-10 3M Innovative Properties Company Navigating among images of an object in 3D space
US10503962B2 (en) * 2008-01-04 2019-12-10 Midmark Corporation Navigating among images of an object in 3D space
US20180196995A1 (en) * 2008-01-04 2018-07-12 3M Innovative Properties Company Navigating among images of an object in 3d space
US8692774B2 (en) * 2009-11-24 2014-04-08 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US20110122068A1 (en) * 2009-11-24 2011-05-26 General Electric Company Virtual colonoscopy navigation methods using a mobile device
US10517690B2 (en) * 2014-10-31 2019-12-31 Scopis Gmbh Instrument guidance system for sinus surgery
US11324566B2 (en) 2014-10-31 2022-05-10 Stryker European Operations Limited Instrument guidance system for sinus surgery
US9892506B2 (en) * 2015-05-28 2018-02-13 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
US20160350979A1 (en) * 2015-05-28 2016-12-01 The Florida International University Board Of Trustees Systems and methods for shape analysis using landmark-driven quasiconformal mapping
CN108701492A (zh) * 2016-03-03 2018-10-23 Koninklijke Philips N.V. Medical image navigation system
CN108210066A (zh) * 2016-12-22 2018-06-29 Biosense Webster (Israel) Ltd. Two-dimensional pulmonary vein display
US12062427B2 (en) * 2017-06-30 2024-08-13 Shanghai United Imaging Healthcare Co., Ltd. Method and system for tissue density analysis
US11423318B2 (en) * 2019-07-16 2022-08-23 DOCBOT, Inc. System and methods for aggregating features in video frames to improve accuracy of AI detection algorithms
US11694114B2 (en) 2019-07-16 2023-07-04 Satisfai Health Inc. Real-time deployment of machine learning systems
US11191423B1 (en) * 2020-07-16 2021-12-07 DOCBOT, Inc. Endoscopic system and methods having real-time medical imaging
US11684241B2 (en) 2020-11-02 2023-06-27 Satisfai Health Inc. Autonomous and continuously self-improving learning system

Also Published As

Publication number Publication date
US20050169507A1 (en) 2005-08-04
US7372988B2 (en) 2008-05-13
EP1456805A1 (fr) 2004-09-15
WO2003046811A1 (fr) 2003-06-05
US20060062450A1 (en) 2006-03-23
CA2467646A1 (fr) 2003-06-05
AU2002365560A1 (en) 2003-06-10

Similar Documents

Publication Publication Date Title
US20030132936A1 (en) Display of two-dimensional and three-dimensional views during virtual examination
US7012603B2 (en) Motion artifact detection and correction
EP1012812B1 (fr) System and method for performing a three-dimensional virtual examination
US6343936B1 (en) System and method for performing a three-dimensional virtual examination, navigation and visualization
US6331116B1 (en) System and method for performing a three-dimensional virtual segmentation and examination
US7194117B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
US7477768B2 (en) System and method for performing a three-dimensional virtual examination of objects, such as internal organs
WO2006042077A2 (fr) Sampling of medical images for virtual histology
WO1998032371A9 (fr) System for two-dimensional and three-dimensional imaging of tubular structures in the human body
US20050197558A1 (en) System and method for performing a virtual endoscopy in a branching structure
IL145516A (en) System and method for segmentation and simulated three-dimensional tests
MXPA01009387A (en) System and method for performing a three-dimensional virtual examination, navigation and visualization
MXPA01009388A (en) System and method for performing a three-dimensional virtual segmentation and examination

Legal Events

Date Code Title Description
AS Assignment

Owner name: VIATRONIX, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KREEGER, KEVIN;LI, BIN;DACHILLE, FRANK C. IX;AND OTHERS;REEL/FRAME:013858/0007;SIGNING DATES FROM 20030305 TO 20030312

AS Assignment

Owner name: BOND, WILLIAM, AS COLLATERAL AGENT, FLORIDA

Free format text: SECURITY AGREEMENT;ASSIGNOR:VIATRONIX, INC.;REEL/FRAME:018515/0169

Effective date: 20060721

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION