
CN110893095A - System and method for visible light and excited fluorescence real-time imaging - Google Patents


Info

Publication number: CN110893095A
Application number: CN201811063689.7A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 胡文忠, 张宇, 戴玉蓉, 陈继东, 聂红林
Current Assignee: Shanghai Yisi Medical Imaging Equipment Co Ltd
Original Assignee: Shanghai Yisi Medical Imaging Equipment Co Ltd
Application filed by Shanghai Yisi Medical Imaging Equipment Co Ltd
Priority to CN201811063689.7A
Priority to PCT/CN2019/105570 (WO2020052623A1)
Publication of CN110893095A
Prior art keywords: light intensity, visible, intensity value, light, fluorescence
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/0071 Measuring for diagnostic purposes; Identification of persons using light, by measuring fluorescence emission

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Endoscopes (AREA)

Abstract

The application discloses a multispectral imaging system comprising: a light source module that provides visible light and generates excitation light, an illuminated region being irradiated by the combination of the visible light and the excitation light and being excited by the excitation light to release fluorescence; and a camera module comprising one or more image sensors that receive the visible light and fluorescence in the light reflected from the illuminated region to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information. The light intensity value of the visible spectrum band is changed periodically in successive video frame images, so that the bright and dark portions of the visible-light video image are interleaved across consecutive frames. The system keeps the visible-light background image clear and bright while providing high fluorescence imaging contrast and low noise.

Description

System and method for visible light and excited fluorescence real-time imaging
Technical Field
The present application relates to the optical imaging arts, and more particularly to the field of endoscopic multispectral imaging.
Background
Optical imaging is widely used in clinical medicine: it is harmless to the human body, non-invasive, highly sensitive, and capable of in-vivo multi-target imaging. With the continuous development of fluorescence imaging agents, fluorescence imaging has become an important tool for intraoperative navigation. Technology based on the near-infrared fluorescent developer indocyanine green (ICG) is now mature. After intravenous injection, indocyanine green binds to plasma proteins and, when excited by near-infrared light around 805 nm, emits fluorescence around 835 nm, a wavelength longer than that of the excitation light. The imaging system processes the captured fluorescence signal algorithmically to display white-light, fluorescence, or superimposed images on a screen in real time. Compared with traditional organic dyes, real-time fluorescence imaging offers higher contrast and better identification of target tissue, and it avoids dye dispersion late in the operation.
For simultaneous imaging of visible light and fluorescence, each output image frame is a multispectral fused image that can be separated by spectral wavelength into visible spectrum band image information and fluorescence spectrum band image information. When fluorescence imaging is used for surgical tumor resection, the surgeon usually wants to resect the fluorescently labeled tissue regions precisely, which requires the fluorescence regions to be clear and their boundaries well defined. At the same time, the surgeon also wants the background tissue in the visible spectrum band to be clear, so that the tissue can be better identified. However, the labeled fluorescence region contains information from both the fluorescence and visible bands, and the light intensity of the visible band is typically much greater than that of the fluorescence band. As a result, the fluorescence is usually overwhelmed by the visible light, so that in the multispectral image the fluorescence appears dark, unclear and low in contrast; moreover, the tissue at the border of a fluorescence region usually carries fewer lesions and emits weaker fluorescence, so under high visible-band intensity the borders of the fluorescence region become even more blurred and indistinct.
Accordingly, there is a need in the art for improved multispectral imaging techniques to improve the image quality of simultaneous imaging of visible light and excited fluorescence.
Disclosure of Invention
To achieve the above object, the present application provides a multispectral imaging system comprising: a light source module that provides visible light and generates excitation light, an illuminated region being irradiated by the combination of the visible light and the excitation light and being excited by the excitation light to release fluorescence; and a camera module comprising one or more image sensors that receive the visible light and fluorescence in the light reflected from the illuminated region to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information. The light intensity value of the visible spectrum band is changed periodically in successive video frame images, so that the bright and dark portions of the visible-light video image are interleaved across consecutive frames.
According to one aspect of the application, the light intensity values of the visible spectrum band include a first light intensity value of the visible spectrum band and a second light intensity value of the visible spectrum band, the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band respectively occurring in each periodic frame of the successive video frame images in an alternating periodic cycle.
According to one embodiment of the application, when the video image uses a progressive scanning mode, the one or more image sensors acquire the visible spectrum band image information and fluorescence spectrum band image information of the multispectral image; each period frame consists of two multispectral fused image frames, the light intensity value of the fluorescence spectrum band is kept unchanged over consecutive image frames, the light intensity value of the visible spectrum band is set to the first light intensity value of the visible spectrum band in the first half of the period frame and to the second light intensity value of the visible spectrum band in the second half of the period frame, and the first light intensity value of the visible spectrum band is different from the second light intensity value of the visible spectrum band.
According to one aspect of the application, the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment are set based on a preset reference light intensity value of the visible spectrum segment.
According to one aspect of the application, the first light intensity value of the visible spectrum and the second light intensity value of the visible spectrum are set in dependence on the light intensity value of the fluorescent spectrum.
According to one aspect of the application, the light intensity values in the visible spectrum range vary as a periodic function in successive video frame images.
According to one aspect of the application, the light intensity difference of the visible spectrum segments in the adjacent frame images of the image is adjusted by adjusting the maximum amplitude value between the peaks and the troughs of the periodic function.
According to one aspect of the application, the light intensity values in the visible spectrum are controlled by adjusting the light source module.
According to another embodiment of the present application, in the video progressive scanning mode, the light intensity values of the fluorescence spectrum section include a first light intensity value of the fluorescence spectrum section and a second light intensity value of the fluorescence spectrum section, the first light intensity value of the fluorescence spectrum section and the second light intensity value of the fluorescence spectrum section respectively occur in each periodic frame of the video image in an alternating periodic cycle, and the periodic variation of the light intensity values of the fluorescence spectrum section is opposite to the periodic variation of the light intensity values of the visible spectrum section.
According to one aspect of the application, the light intensity value of the fluorescence spectrum section and the light intensity value of the visible spectrum section vary in a periodic function in the video frame image, and the variation of the light intensity value of the fluorescence spectrum section corresponds to the variation of the current or voltage of the fluorescence light source, and the variation of the light intensity value of the visible spectrum section corresponds to the variation of the current or voltage of the visible light source.
According to another embodiment of the present application, in the video progressive scanning mode, each periodic frame is a four-frame image, and the first light intensity value in the visible spectrum band or the second light intensity value in the visible spectrum band appears in two consecutive frames of images.
According to one aspect of the application, the light intensity value of the visible spectrum segment is set to a first light intensity value of visible light in the first frame, the light intensity value of the visible spectrum segment is set to a second light intensity value of visible light in the second frame, the light intensity value of the visible spectrum segment is set to the second light intensity value of visible light in the third frame, and the light intensity value of the visible spectrum segment is set to the first light intensity value of visible light in the fourth frame.
According to one aspect of the application, the light intensity values in the visible spectrum range vary as a periodic function in the video frame image, and the variation in the light intensity values in the visible spectrum range corresponds to a variation in the current or voltage of the visible light source.
According to another embodiment of the present application, in the video interlaced scanning mode, each frame image of the video is divided into two field images, and the first light intensity value in the visible light spectrum range and the second light intensity value in the visible light spectrum range respectively appear in different field images alternately and periodically.
According to one aspect of the application, each frame of multispectral image of the video is divided into two field images for interlaced video output, and the two field images are complementary to form the multispectral image.
According to one aspect of the application, the first light intensity value in the visible spectrum is different from the second light intensity value in the visible spectrum.
According to one aspect of the application, the light intensity values in the visible spectrum range vary as a periodic function in the video frame image, and the variation in the light intensity values in the visible spectrum range corresponds to a variation in the current or voltage of the visible light source.
According to one aspect of the application, the second light intensity value of the visible spectrum band is set as follows: calculating the minimum value of the active fluorescence pixels, the maximum value of the active fluorescence pixels, and the average value of all active fluorescence pixels; calculating the median of the active fluorescence pixels from the calculated minimum and maximum values; selecting the smaller of the median of the active fluorescence pixels and the average of the active fluorescence pixels; and determining a preset reference light intensity value of the visible spectrum band from the selected value, this reference light intensity value being used as the second light intensity value of the visible spectrum band.
According to one aspect of the application, the first light intensity value for the visible spectrum band is set according to an imaging quality of the visible spectrum band image.
According to one aspect of the application, the maximum amplitude used to control the light source current or voltage for adjusting the light intensity is determined from the maximum difference between the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band.
The application also provides a multispectral imaging method, comprising: irradiating an illuminated region with the combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence; receiving the visible light and fluorescence in the light reflected from the illuminated region to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information; and changing the light intensity value of the visible spectrum band periodically in successive video frame images, so that the bright and dark portions of the visible-light video image are interleaved across consecutive frames.
According to one aspect of the application, the light intensity values of the visible spectrum band include a first light intensity value of the visible spectrum band and a second light intensity value of the visible spectrum band, the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band respectively occurring in each periodic frame of the successive video frame images in an alternating periodic cycle.
According to one embodiment of the application, when the video image is scanned progressively, each period frame consists of two multispectral fused image frames, the light intensity value of the fluorescence spectrum band is kept unchanged over consecutive image frames, the light intensity value of the visible spectrum band is set to the first light intensity value of the visible spectrum band in the first half of the period frame and to the second light intensity value of the visible spectrum band in the second half of the period frame, and the first light intensity value of the visible spectrum band is different from the second light intensity value of the visible spectrum band.
According to another embodiment of the present application, in the video progressive scanning mode, the light intensity values of the fluorescence spectrum section include a first light intensity value of the fluorescence spectrum section and a second light intensity value of the fluorescence spectrum section, the first light intensity value of the fluorescence spectrum section and the second light intensity value of the fluorescence spectrum section respectively occur in each periodic frame of the video image in an alternating periodic cycle, and the periodic variation of the light intensity values of the fluorescence spectrum section is opposite to the periodic variation of the light intensity values of the visible spectrum section.
According to another embodiment of the present application, in the video progressive scanning mode, each periodic frame is a four-frame image, and the first light intensity value in the visible spectrum band or the second light intensity value in the visible spectrum band appears in two consecutive frames of images.
According to another embodiment of the present application, in the video interlaced scanning mode, each frame image of the video is divided into two field images, and the first light intensity value in the visible light spectrum range and the second light intensity value in the visible light spectrum range respectively appear in different field images alternately and periodically.
According to yet another embodiment of the present application, the visible spectrum band image and the fluorescence spectrum band image are fused into the multispectral image before the multispectral image is output.
The present application further provides a computer-readable storage medium storing computer-executable instructions for: irradiating an illuminated region with the combination of visible light and excitation light, the illuminated region being excited by the excitation light to release fluorescence; receiving the visible light and fluorescence in the light reflected from the illuminated region to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information; and changing the light intensity value of the visible spectrum band periodically in successive video frame images, so that the bright and dark portions of the visible-light video image are interleaved across consecutive frames.
The application also provides a light intensity adjustment method for use in multispectral imaging, comprising the following steps: calculating the minimum value of the active fluorescence pixels, the maximum value of the active fluorescence pixels, and the average value of all active fluorescence pixels; calculating the median of the active fluorescence pixels from the calculated minimum and maximum values; selecting the smaller of the median of the active fluorescence pixels and the average of the active fluorescence pixels; determining a preset reference light intensity value of the visible spectrum band from the selected value and using it as the second light intensity value of the visible spectrum band; setting the first light intensity value of the visible spectrum band according to the imaging quality of the visible spectrum band image; calculating the maximum difference between the first light intensity value and the second light intensity value of the visible spectrum band; and determining from it the maximum amplitude used to control the light source current or voltage for adjusting the light intensity.
According to one embodiment of the present application, calculating the median of the active fluorescence pixels may also incorporate the pixel maximum gray level.
Compared with the prior art, the system and method for real-time visible-light and excited-fluorescence imaging described here keep the visible-light background image clear and bright while providing high fluorescence imaging contrast and low noise.
The method can be widely applied to the technical field of medical imaging, including various fields such as medical endoscopes and fluorescence microscopes.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a schematic structural diagram of a multispectral imaging system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a multi-spectral imaging system employing a single image sensor according to one embodiment of the present application;
FIG. 3 is a schematic diagram of a light source module according to one embodiment of the present application employing a combination of multiple monochromatic light sources;
FIG. 4 is a schematic diagram of a camera module employing two image sensors according to one embodiment of the present application;
FIG. 5 is a schematic diagram of a camera module according to one embodiment of the present application employing four image sensors;
FIG. 6 is a schematic diagram of an imaging method in a progressive output mode according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another imaging method in a progressive output mode according to one embodiment of the present application;
FIG. 8 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application;
FIG. 10 is a diagram illustrating a method for determining the preset reference light intensity value according to an embodiment of the present application; and
FIG. 11 is a block diagram of an exemplary computer system, according to one embodiment of the present application.
Detailed Description
Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to specific examples and implementations are for illustrative purposes, and are not intended to limit the scope of the application or claims.
The word "exemplary" is used herein to mean "serving as an example, instance, or illustration. Any implementation described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other implementations.
FIG. 1 illustrates an exemplary structural schematic of a multispectral imaging system according to one embodiment of the present application. In this example, the multispectral imaging system may be used for fluorescence endoscopy. As shown in fig. 1, the fluorescence endoscope camera system may include a light source module 1, a camera module 2 for imaging, a data acquisition and preprocessing module 3 for image acquisition and preprocessing, a digital image processing module 4, a system controller 5, and a light source controller 6. The camera module 2 may include, among other things, an optical lens assembly and one or more image sensors (which will be described in more detail below). The system controller 5 may generate a corresponding control signal according to a feedback processing result of the digital image processing module 4. The light source controller 6 may perform lighting control on the light source module 1 according to a control signal of the system controller 5. The light source module can be adjusted to control the light intensity value in the visible spectrum.
FIG. 2 illustrates an exemplary structural schematic of a multispectral imaging system employing a single image sensor according to one embodiment of the present application. As shown in fig. 2, the light source module 1 may include an incident visible light LED 11, a near-infrared (NIR) exciter 12, and a combining beam splitter 13. In this example, the camera module 2 may include an optical lens assembly 9 and a single image sensor 10 (e.g., a CCD or CMOS image sensor). Optionally, the optical lens assembly 9 may further include an optical filter 8. The incident visible light LED 11 can provide illumination in the visible wavelength band (400 nm to 760 nm). The near-infrared (NIR) exciter 12 can generate NIR excitation light in the near-infrared band (790 nm to 820 nm), particularly around 800 nm. The visible light and NIR excitation light may be combined by the combining beam splitter 13 to illuminate the illuminated region (e.g., tissue). When tissue injected with the fluorescence developer is irradiated with the NIR excitation light from the light source module 1, fluorophores in the tissue are excited and release near-infrared fluorescence at wavelengths longer than the excitation light, for example 830 nm to 900 nm. The reflected light from the illuminated region (e.g., tissue) can thus include three spectral bands: visible light, excitation light (e.g., NIR excitation light), and fluorescence (e.g., near-infrared fluorescence). The NIR excitation light is completely blocked from entering the image sensor by the optional optical filter 8, while the visible light and fluorescence pass through the optical lens assembly 9 into the single image sensor 10, which has NIR pixels. Data from the image sensor 10 may be passed through the data acquisition and pre-processing module 3 to the digital image processing module 4 to form images in the visible and fluorescence bands. The system controller 5 may output a signal to the light source controller 6 according to the feedback of the digital image processing module 4. The light source controller 6 may control the light source module 1 by controlling a driving current or voltage.
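For concreteness, the wavelength bands involved in this single-sensor scheme can be written down as a small configuration. The following Python sketch is illustrative only: the band limits follow the figures quoted above (visible 400 to 760 nm, NIR excitation 790 to 820 nm, fluorescence 830 to 900 nm), while the class and helper names are assumptions introduced here and are not part of the application.

```python
# Minimal sketch of the spectral bands used in the single-sensor scheme (FIG. 2).
# Band limits follow the text above; names and the helper are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Band:
    name: str
    low_nm: float
    high_nm: float

    def contains(self, wavelength_nm: float) -> bool:
        return self.low_nm <= wavelength_nm <= self.high_nm

VISIBLE      = Band("visible", 400.0, 760.0)          # white-light illumination
NIR_EXCITE   = Band("nir_excitation", 790.0, 820.0)   # blocked by optical filter 8
FLUORESCENCE = Band("fluorescence", 830.0, 900.0)     # reaches the NIR pixels

def reaches_sensor(wavelength_nm: float) -> bool:
    """Filter 8 rejects the excitation band; visible light and fluorescence pass."""
    return not NIR_EXCITE.contains(wavelength_nm)

if __name__ == "__main__":
    for wl in (550.0, 805.0, 835.0):
        print(wl, "nm ->", "sensor" if reaches_sensor(wl) else "blocked by filter")
```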
FIG. 3 is an exemplary diagram of a light source module employing a combination of multiple monochromatic light sources according to one embodiment of the present application. In this example, the light source module 1 may include a combination of three monochromatic light sources as shown in fig. 3. The visible light source is composed of a red LED 18 (620 nm to 760 nm), a green LED 19 (492 nm to 577 nm) and a blue LED 20 (400 nm to 450 nm), combined via the corresponding combining beam splitters 15, 16 and 17. In an example with a combination of multiple monochromatic light sources, the camera module 2 may include one or more image sensors. For example, the camera module 2 may use a single image sensor as shown in fig. 2, or a dual image sensor combination as shown in fig. 4.
FIG. 4 is an exemplary diagram of a camera module employing two image sensors according to one embodiment of the present application. The camera module 2 may include a plurality of image sensors, and corresponding beam splitting prisms. In this example, the camera module 2 may employ an image sensor 32 (e.g., a CCD/CMOS image sensor) of visible light and an image sensor 33 (e.g., a CCD/CMOS image sensor) of fluorescence. Visible light and fluorescent light from the tissue enter the corresponding image sensors through the dichroic prism 31, respectively.
FIG. 5 is an exemplary diagram of a camera module employing four image sensors according to one embodiment of the present application. The camera module 2 may include four image sensors and corresponding three beam splitting prisms. When a 4-CCD/CMOS image sensor combination is employed, as shown in fig. 5, prism spectroscopy (e.g., a combination of a beam splitter prism 34, a beam splitter prism 35, and a beam splitter prism 36) is employed to direct light into three RGB monochrome image sensors (e.g., a red image sensor 38, a green image sensor 39, and a blue image sensor 40) and one NIR fluorescence image sensor 37, respectively, each of which receives reflected light of a corresponding wavelength band to form a multispectral image.
It should be understood that the periodic light intensity adjustment of the present application over successive video frames can be implemented with a single image sensor, with two image sensors, or with a four-sensor scheme. The multispectral fused image can be obtained from a single image sensor that integrates fluorescence pixels, or, in the 4-CCD/CMOS scheme, the visible-light image can be synthesized from the three RGB image sensors and fused with the fluorescence image produced by the fourth sensor. Thus, the present application uses one or more image sensors to generate fused visible-light and fluorescence image frames; the number of image sensors in the various embodiments is merely illustrative and not limiting.
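As an illustration of how such a fused frame might be assembled from the sensor outputs, the Python sketch below overlays a fluorescence channel onto a visible RGB image as a green pseudo-color. This particular blending rule, and the function and parameter names, are assumptions made for illustration; the application does not prescribe a specific fusion algorithm.

```python
import numpy as np

def fuse_frames(rgb: np.ndarray, nir: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """Overlay a fluorescence (NIR) channel onto a visible RGB frame.

    rgb: HxWx3 uint8 visible-band image (from a single sensor with NIR pixels,
         or synthesized from three RGB sensors).
    nir: HxW uint8 fluorescence-band image (NIR pixels or a fourth sensor).
    The green pseudo-color overlay used here is only one possible fusion rule.
    """
    out = rgb.astype(np.float32)
    overlay = gain * nir.astype(np.float32)
    out[..., 1] = np.clip(out[..., 1] + overlay, 0, 255)  # add fluorescence to the green channel
    return out.astype(np.uint8)

# Example: a dim visible frame with a bright fluorescent region in the center.
rgb = np.full((480, 640, 3), 60, dtype=np.uint8)
nir = np.zeros((480, 640), dtype=np.uint8)
nir[200:280, 280:360] = 180
fused = fuse_frames(rgb, nir)
```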
Fig. 6 is a schematic diagram of an imaging method in a progressive output mode according to an embodiment of the present application, showing one option for periodic light intensity variation over successive frame images. In the progressive output mode, each image frame output by the system is a multispectral fused image containing visible spectrum band image information and fluorescence spectrum band image information. The intensity of the fluorescence band may be set to remain constant across all successive frames. In the first frame of the system output, the visible band can be set to the high light intensity value; in the second frame it can be set to the low value; in the third frame back to the high value; in the fourth frame to the low value; and so on for subsequent frames. The high and low visible-band intensity values thus alternate periodically from frame to frame. In the figure, one period of the high/low variation spans two image frames, i.e. one period frame is two image frames: the first half of the period frame has the high visible-band intensity and the second half has the low visible-band intensity. It should be understood that the high and low visible-band intensity values are defined relative to the preset reference visible-band intensity, or relative to the fluorescence-band intensity. Although adjacent frames differ in visible-band brightness, because of the persistence of vision and the eye's low sensitivity to brightness changes above a certain intensity, the bright and dark portions of the visible-light video are interleaved over consecutive frames and the rapid visible-band brightness changes are not easily perceived. Moreover, among the information carried by the individual frames, the frames with high visible-band intensity mainly bring out the image detail and clarity of the background tissue, while the frames with low visible-band intensity bring out the image detail of the fluorescence region and its contrast against the background tissue. Therefore, over the sequence of frames, persistence of vision lets the viewer see both the structural detail of the background tissue under high intensity and the specific detail of the fluorescence region.
In addition, the light intensity value of the visible band is varied in the successive video frame images according to a periodic function. The light intensity may be controlled through the light source drive current or voltage. To produce continuously alternating bright and dark visible-band frames, the light source drive current may follow a continuous periodic function: the larger the drive current, the higher the light source intensity, and the drive current varies as a periodic function such as a sine, cosine, tangent, square-wave, triangular, or stepped periodic function. For example, the sine function shown in fig. 6 has a half period equal to the exposure time of one image frame.
The maximum amplitude between the peak and trough of the function is adjusted to set the visible-light intensity difference between adjacent frames, so as to match the eye's sensitivity to intensity changes and achieve a better visual effect. Further, in implementations, the image frame rate may be adjusted in the range of 60 fps (frames per second) to 120 fps. Raising the frame rate to 60 to 120 fps can eliminate the screen flicker caused by intensity changes that are too slow or too large, and thus the visual discomfort such changes would otherwise cause.
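The relationship between the drive current, the peak-to-trough amplitude and the frame timing described above can be made concrete with a short sketch. The following Python code is a minimal illustration under the assumptions that the drive is sinusoidal as in fig. 6, that a half period equals one frame exposure time, and that a reference current plus a peak-to-trough swing describes the waveform; the parameter names and the numeric values are placeholders, not values taken from the application.

```python
import math

def drive_current(t: float, frame_time: float, i_ref: float, amplitude: float) -> float:
    """Sinusoidal light-source drive current.

    Half a period equals one frame exposure time, so consecutive frames
    alternate between a bright (high-current) and a dark (low-current) state.
    amplitude is the peak-to-trough swing; i_ref is the reference current.
    """
    period = 2.0 * frame_time  # one period frame = two image frames
    return i_ref + 0.5 * amplitude * math.sin(2.0 * math.pi * t / period)

# Example at 60 fps: sample the current at the middle of each of 4 frames.
fps = 60.0
frame_time = 1.0 / fps
for k in range(4):
    t_mid = (k + 0.5) * frame_time
    print(f"frame {k}: I = {drive_current(t_mid, frame_time, i_ref=1.0, amplitude=0.4):.2f}")
```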
Fig. 7 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application, showing another option for periodic light intensity variation over successive frame images. In the progressive video output mode, when the contrast between the visible-band image and the fluorescence region is to be emphasized further within the period frames, the light intensity within the system's period frames can vary as shown in fig. 7: the periodic variation of the fluorescence-band intensity is opposite in direction to that of the visible-band intensity. When the visible band is at its high intensity value, the fluorescence band is at its low intensity; when the visible band is at its low intensity value, the fluorescence band is at its high intensity. The drive current of each light source changes accordingly, as shown in fig. 7. Consequently, within a single multispectral fused frame the intensity contrast between the visible-band and fluorescence-band images is more pronounced, so over consecutive period frames the visible-band image and the fluorescence-band image are more clearly distinguished. In this example, both the fluorescence-band and the visible-band intensity values are adjusted as periodic functions over the video frames.
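A per-frame view of this opposite-phase modulation is sketched below. The high/low values are placeholders chosen only to show that when the visible band is bright the fluorescence band is dim, and vice versa; the function name and the two-frame period convention are assumptions for illustration.

```python
def frame_intensities(frame_index: int,
                      vis_high: float, vis_low: float,
                      fluo_high: float, fluo_low: float) -> tuple[float, float]:
    """Return (visible, fluorescence) intensity targets for one frame.

    The two bands are modulated in opposite phase over a two-frame period:
    even frames are visible-bright / fluorescence-dim, odd frames the reverse.
    """
    if frame_index % 2 == 0:
        return vis_high, fluo_low
    return vis_low, fluo_high

for k in range(4):
    vis, fluo = frame_intensities(k, vis_high=1.0, vis_low=0.4, fluo_high=1.0, fluo_low=0.6)
    print(f"frame {k}: visible={vis}, fluorescence={fluo}")
```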
Fig. 8 is a schematic diagram of another imaging method in a progressive output mode according to an embodiment of the present application, showing yet another option for periodic light intensity variation over successive frame images. In the progressive video output mode, when the light source drive current varies periodically like a cosine function, the visible-band intensity in the system image frames changes as shown in fig. 8. In this case one period frame consists of 4 image frames, and the high or low visible-band intensity value appears in two consecutive frames, i.e. the time for which a given intensity state is held over consecutive frames is lengthened. The different intensity states of the visible-band image and the fluorescence-band image are thereby distinguished even more clearly.
Fig. 9 is a schematic diagram of an imaging method in an interlaced output mode according to an embodiment of the present application, showing how the light intensity can be adjusted periodically over successive frame images in this mode. When the image is output in interlaced mode, as shown in fig. 9, the light source drive current varies as a continuous periodic function, for example a sinusoidal one. In interlaced output, each image frame is divided into an odd-field image and an even-field image, and the high and low visible-band intensity values appear alternately and periodically in different fields: the odd field carries the high visible-band intensity and the even field the low one, or vice versa. The image is then displayed using interlaced output: the odd lines of the image are scanned first (top field) and then the even lines (bottom field), the two fields complementing each other to form a complete picture. Although the brightness of the top field has decayed by the time the bottom field is scanned, this is not easily perceived, because the bright and dark visible-band portions are interwoven and the visual characteristics of the human eye are exploited. With interlaced output, the image information is split into two field images that are output separately, which gives the visual effect of the present application more continuity. In addition, interlaced output can support higher-resolution image output.
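In the interlaced case the alternation happens per field rather than per frame. The sketch below splits a frame into its odd-line and even-line fields and tags each field with an illumination level; the field-splitting helper and the specific assignment (top/odd field bright, bottom/even field dim) follow the example in the text above and could equally be reversed, and the function names are assumptions for illustration.

```python
import numpy as np

def split_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split a full frame into top (odd-line) and bottom (even-line) fields."""
    top_field = frame[0::2, :]     # lines 1, 3, 5, ... (odd lines, top field)
    bottom_field = frame[1::2, :]  # lines 2, 4, 6, ... (even lines, bottom field)
    return top_field, bottom_field

def field_intensity(is_top_field: bool, vis_high: float, vis_low: float) -> float:
    """Top/odd field gets the high visible intensity, bottom/even field the low one."""
    return vis_high if is_top_field else vis_low

frame = np.arange(12, dtype=np.uint8).reshape(6, 2)  # toy 6-line frame
top, bottom = split_fields(frame)
print("top field lines:", top.ravel().tolist())
print("bottom field lines:", bottom.ravel().tolist())
print("top field drive level:", field_intensity(True, 1.0, 0.4))
```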
It should be understood that the visible-band intensity values may, as in the illustrated embodiments, go high first and then low, but the application is not limited to this. Alternatively, the visible-band intensity value may be set low first and then high within each period frame, and likewise the fluorescence-band intensity value may be set high first and then low.
Further, in various embodiments of the present application, a change in the light intensity value of the fluorescence spectrum band may correspond to a change in the current or voltage of the fluorescent light source. Variations in the light intensity values in the visible spectrum may correspond to variations in the current or voltage of the visible light source.
In addition, optionally, before outputting the multispectral image, image fusion may be performed on the visible spectrum segment image and the fluorescent spectrum segment image, and after image processing, the output multispectral fusion image has a better visual effect.
Fig. 10 is a schematic diagram of a method for determining the preset reference light intensity value according to an embodiment of the present application. As mentioned above, the high and low intensity values of the visible band are defined relative to a preset reference visible-band intensity or relative to the fluorescence-band intensity. The feedback adjustment of the reference intensity for the visible-band intensity variation is described below. As shown in fig. 10, in the image frames in which the visible band is at low intensity and serves as the background of the fluorescence region, the digital image processing module continuously calculates the minimum, maximum and average values of all active fluorescence pixels. The median of the active fluorescence pixels is calculated from their minimum and maximum values; optionally, the pixel maximum gray level may also be incorporated into this calculation. The maximum gray level of a pixel here is the maximum dynamic range of a fluorescence pixel, i.e. the maximum light intensity that the pixel sensor unit can register. The median and the average of the active fluorescence pixels are then compared and the smaller of the two is selected. From this selected value, the optimal preset light intensity of the visible light source in the low-intensity frames is determined. The high intensity value of the visible band is determined by the imaging quality of the visible-light image. Finally, the maximum difference between the high and low visible-light intensity values is calculated to determine the maximum amplitude A of the light source control current, completing the light intensity adjustment for the system's fluorescence mode.
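The feedback calculation of fig. 10 can be expressed compactly in code. The following Python sketch is a minimal interpretation of that procedure under stated assumptions: "active fluorescence pixels" are taken to be pixels above a threshold, the median is estimated from the minimum and maximum, and the selected value is mapped linearly onto a low visible-band level. The threshold, the linear mapping and the function name are assumptions for illustration, not part of the application.

```python
import numpy as np

def visible_reference_levels(fluo_frame: np.ndarray,
                             active_threshold: int = 10,
                             max_gray_level: int = 255,
                             vis_high: float = 1.0) -> tuple[float, float, float]:
    """Determine (vis_high, vis_low, max_amplitude_A) following the steps of FIG. 10.

    fluo_frame: fluorescence-band image captured while the visible band is at
                low intensity. Pixels above active_threshold are treated as
                "active" fluorescence pixels (an assumption of this sketch).
    """
    active = fluo_frame[fluo_frame > active_threshold]
    if active.size == 0:
        return vis_high, vis_high, 0.0            # no fluorescence: no modulation needed

    f_min, f_max = int(active.min()), int(active.max())
    f_mean = float(active.mean())
    f_median = 0.5 * (f_min + f_max)              # median estimated from min and max
    selected = min(f_median, f_mean)              # the smaller of median and average

    # Assumed mapping: the brighter the fluorescence, the more visible light
    # can be tolerated in the low-intensity frames.
    vis_low = vis_high * (selected / max_gray_level)
    max_amplitude = vis_high - vis_low            # maximum swing A for the drive current
    return vis_high, vis_low, max_amplitude

# Example with a synthetic fluorescence frame containing one labeled region.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:200, 100:200] = 120
print(visible_reference_levels(frame))
```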
According to the scheme of the present application, by exploiting the persistence of vision of the human eye and combining it with different video output modes, the visible-band light intensity of different frames in the video is adjusted so that high and low visible-band intensities are output alternately and periodically in different frames. Over a sequence of consecutive frames, the system can therefore output both bright visible-band images that highlight the visible-light detail of the background tissue and high-contrast fluorescence image information against a low-visible-intensity background. In this way, the sequence of frames conveys both the clear detail of the visible-band image and the detail of the near-infrared fluorescence image at high contrast. By outputting frames with different light intensities and contrasts in alternation, the method guarantees high contrast of the fluorescence information while simultaneously outputting clear visible-light image information.
The scheme of the present application thus overcomes two shortcomings of existing simultaneous imaging of the fluorescence and visible bands: the low contrast of the fluorescence region against a background image with high visible-light intensity, and the poor visible-light image quality against a background image with low visible-light intensity.
It should be understood that the above embodiments are only examples and not limitations, and that besides the above progressive and interlaced scanning schemes, those skilled in the art can also conceive other similar methods, which can embody both the sharp details of the visible spectrum band image and the image details of the fluorescent near infrared band image at high contrast. Such implementations should not be read as resulting in a departure from the scope of the present application.
Referring to FIG. 11, an exemplary computer system 1100 is shown. Computer system 1100 may include a logical processor 1102, such as an execution core. Although one logical processor 1102 is illustrated, in other embodiments, the computer system 1100 may have multiple logical processors, e.g., multiple execution cores per processor substrate, and/or multiple processor substrates, each of which may have multiple execution cores. As shown, the various computer-readable storage media 1110 may be interconnected by one or more system buses that couple the various system components to the logical processor 1102. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. In an exemplary embodiment, the computer-readable storage media 1110 can include, for example, Random Access Memory (RAM)1104, storage 1106 (e.g., electromechanical hard drive, solid state hard drive, etc.), firmware 1108 (e.g., flash RAM or ROM), and a removable storage device 1118 (such as, for example, a CD-ROM, a floppy disk, a DVD, a flash drive, an external storage device, etc.). Those skilled in the art will appreciate that other types of computer-readable storage media can be used, such as magnetic cassettes, flash memory cards, and/or digital video disks. Computer-readable storage media 1110 may provide non-volatile and volatile storage of computer-executable instructions 1122, data structures, program modules, and other data for computer system 1100. A basic input/output system (BIOS)1120, containing the basic routines that help to transfer information between elements within computer system 1100, such as during start-up, may be stored in firmware 1108. A number of programs may be stored on firmware 1108, storage device 1106, RAM 1104, and/or removable storage device 1118 and executed by logical processor 1102, logical processor 1102 including an operating system and/or application programs. Commands and information may be received by computer system 1100 through input devices 1116, which may include, but are not limited to, keyboards and pointing devices. Other input devices may include a microphone, joystick, game pad, scanner, or the like. These and other input devices are often connected to the logical processor 1102 through a serial port interface that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port, or Universal Serial Bus (USB). A display or other type of display device can also be connected to the system bus via an interface, such as a video adapter, which can be part of graphics processing unit 1112 or connected to graphics processing unit 1112. In addition to the display, computers typically include other peripheral output devices, such as speakers and printers (not shown). The exemplary system of FIG. 11 can also include a host adapter, Small Computer System Interface (SCSI) bus, and an external storage device connected to the SCSI bus. The computer system 1100 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer. The remote computer may be another computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 1100. 
When used in a LAN or WAN networking environment, computer system 1100 can be connected to the LAN or WAN through network interface card 1114. A network card (NIC)1114 (which may be internal or external) may be connected to the system bus. In a networked environment, program modules depicted relative to the computer system 1100, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections described herein are exemplary and other means of establishing a communications link between the computers may be used.
In one or more exemplary embodiments, the functions and processes described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, the software codes may be stored in a memory, such as a memory of a mobile station, and executed by a processor, such as a desktop computer, a laptop computer, a server computer, a microprocessor of a mobile device, or the like. The memory may be implemented within the processor or external to the processor. As used herein, the term "memory" refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
The foregoing description and drawings are provided by way of illustrative example only. Any reference to claim elements in the singular, for example using the articles "a," "an," or "the," is not to be construed as limiting the element to the singular. Any reference to elements using designations such as "first," "second," etc., as used herein, does not generally limit the number or order of those elements. Reference to first and second elements does not imply that only two elements are used, nor that the first element must in some way precede the second element. A set of elements may include one or more elements unless otherwise specified. Skilled artisans may implement the described structure in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.

Claims (30)

1. A multispectral imaging system, comprising:
a light source module that provides visible light and generates excitation light, an illuminated region being irradiated by the combination of the visible light and the excitation light and being excited by the excitation light to release fluorescence;
a camera module comprising one or more image sensors that receive the visible light and fluorescence in the light reflected from the illuminated region to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information;
wherein the light intensity value of the visible spectrum band is changed periodically in successive video frame images, so that the bright and dark portions of the visible-light video image are interleaved across consecutive frames.
2. The multispectral imaging system of claim 1, wherein:
the light intensity values of the visible spectrum band include a first light intensity value of the visible spectrum band and a second light intensity value of the visible spectrum band, which appear alternately and periodically in each period frame of the successive video frame images.
3. The multispectral imaging system of claim 2, wherein:
under the condition that the video image adopts a progressive scanning mode, the one or more image sensors acquire the visible spectrum band image information and fluorescence spectrum band image information of the multispectral image; each period frame consists of two multispectral fused image frames, the light intensity value of the fluorescence spectrum band is kept unchanged over consecutive image frames, the light intensity value of the visible spectrum band is set to the first light intensity value of the visible spectrum band in the first half of the period frame and to the second light intensity value of the visible spectrum band in the second half of the period frame, and the first light intensity value of the visible spectrum band is different from the second light intensity value of the visible spectrum band.
4. The multispectral imaging system of claim 2, wherein:
the first light intensity value of the visible spectrum segment and the second light intensity value of the visible spectrum segment are set based on a preset reference light intensity value of the visible spectrum segment.
5. The multispectral imaging system of claim 2, wherein:
the first light intensity value in the visible range and the second light intensity value in the visible range are set in dependence on the light intensity value in the fluorescent range.
6. The multispectral imaging system of claim 3, wherein:
the light intensity values in the visible spectrum range vary as a periodic function in successive video frame images.
7. The multispectral imaging system of claim 6, wherein:
the light intensity difference of the visible spectrum band between adjacent frame images is adjusted by adjusting the maximum amplitude between the peak and the trough of the periodic function.
8. The multispectral imaging system of claim 1, wherein:
the light source module is adjusted to control the light intensity value in the visible spectrum range.
9. The multispectral imaging system of claim 2, wherein:
in the video progressive scanning mode, the light intensity values of the fluorescence spectrum section comprise a first light intensity value of the fluorescence spectrum section and a second light intensity value of the fluorescence spectrum section, the first light intensity value of the fluorescence spectrum section and the second light intensity value of the fluorescence spectrum section alternately and periodically appear in each periodic frame of the video image respectively, and the periodic change of the light intensity values of the fluorescence spectrum section is opposite to the periodic change of the light intensity values of the visible spectrum section.
10. The multispectral imaging system of claim 9, wherein:
the light intensity value of the fluorescence spectrum section and the light intensity value of the visible spectrum section are changed in a periodic function mode in the video frame image, the change of the light intensity value of the fluorescence spectrum section corresponds to the change of the current or the voltage of the fluorescence light source, and the change of the light intensity value of the visible spectrum section corresponds to the change of the current or the voltage of the visible light source.
11. The multispectral imaging system of claim 2, wherein:
in the video progressive scanning mode, each periodic frame is four frames of images, and the first light intensity value of the visible light spectrum section or the second light intensity value of the visible light spectrum section appears in two continuous frames of images.
12. The multispectral imaging system of claim 11, wherein:
the light intensity value of the visible spectrum band is set to the first visible light intensity value in the first frame, to the second visible light intensity value in the second frame, to the second visible light intensity value in the third frame, and to the first visible light intensity value in the fourth frame.
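
A minimal sketch of this four-frame sequence (first, second, second, first), with placeholder intensity values:

    # Illustrative sketch of the four-frame period I1, I2, I2, I1.
    # The concrete intensity values are placeholders.
    PATTERN = ("I1", "I2", "I2", "I1")
    VALUES = {"I1": 0.9, "I2": 0.3}   # assumed normalized intensities

    def visible_intensity(frame_index: int) -> float:
        return VALUES[PATTERN[frame_index % 4]]

    print([visible_intensity(k) for k in range(8)])
    # -> [0.9, 0.3, 0.3, 0.9, 0.9, 0.3, 0.3, 0.9]
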
13. The multispectral imaging system of claim 11, wherein:
the light intensity value of the visible spectrum band varies as a periodic function across the video frame images, and the variation in the light intensity value of the visible spectrum band corresponds to a variation in the current or voltage of the visible light source.
14. The multispectral imaging system of claim 2, wherein:
in the interlaced-scan video mode, each frame image of the video is divided into two field images, and the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band alternately and periodically appear in different field images.
15. The multispectral imaging system of claim 14, wherein:
each multispectral frame image of the video is divided into two field images for interlaced video output, and the two field images are complementary and together form the multispectral image.
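
As an illustrative sketch only (the even/odd-row field split and the intensity values are assumptions), each frame can be separated into two complementary fields, with the two visible-band intensity values alternating per field:

    # Illustrative sketch: split each multispectral frame into two
    # complementary fields and alternate the visible-band intensity per field.
    # The row-based split and the numeric values are assumptions.
    import numpy as np

    VISIBLE_I1, VISIBLE_I2 = 0.9, 0.3   # assumed normalized intensities

    def split_fields(frame: np.ndarray):
        """Split a frame into its even-row and odd-row fields."""
        return frame[0::2, :], frame[1::2, :]

    def field_intensity(field_index: int) -> float:
        """First field of each frame uses I1, second field uses I2."""
        return VISIBLE_I1 if field_index % 2 == 0 else VISIBLE_I2

    frame = np.ones((8, 8))
    even_field, odd_field = split_fields(frame)
    print(even_field.shape, odd_field.shape)        # (4, 8) (4, 8): complementary fields
    print(field_intensity(0), field_intensity(1))   # 0.9 0.3
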
16. The multispectral imaging system of claim 14, wherein:
the first light intensity value of the visible spectrum band is different from the second light intensity value of the visible spectrum band.
17. The multispectral imaging system of claim 14, wherein:
the light intensity value of the visible spectrum band varies as a periodic function across the video frame images, and the variation in the light intensity value of the visible spectrum band corresponds to a variation in the current or voltage of the visible light source.
18. The multispectral imaging system according to claim 4 or 5, wherein the second light intensity value of the visible spectrum band is set by:
calculating the minimum value of the active fluorescence pixels, the maximum value of the active fluorescence pixels and the average value of all the active fluorescence pixels;
calculating a median value of the active fluorescence pixels according to the calculated minimum value and maximum value of the active fluorescence pixels;
selecting the smaller of the median value of the active fluorescence pixels and the average value of the active fluorescence pixels; and
determining a preset reference light intensity value of the visible spectrum band according to the selected value, and taking the reference light intensity value as the second light intensity value of the visible spectrum band.
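
A minimal sketch of this calculation (illustrative only; the threshold that defines an "active" fluorescence pixel, the midpoint interpretation of the median, and the normalization by the maximum gray level are assumptions not stated in the claim):

    # Illustrative sketch: derive the second visible-band intensity from
    # statistics of the active fluorescence pixels.
    import numpy as np

    def second_visible_intensity(fluor_gray: np.ndarray,
                                 active_threshold: int = 16,
                                 max_gray: int = 255) -> float:
        active = fluor_gray[fluor_gray >= active_threshold]   # assumed activity test
        if active.size == 0:
            return 0.0
        f_min, f_max = float(active.min()), float(active.max())
        f_mean = float(active.mean())
        f_median = (f_min + f_max) / 2.0   # "median" from min and max (assumed midpoint)
        selected = min(f_median, f_mean)   # smaller of median and mean
        return selected / max_gray         # assumed normalization to [0, 1]

    rng = np.random.default_rng(0)
    print(second_visible_intensity(rng.integers(0, 256, size=(480, 640))))
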
19. The multispectral imaging system of claim 18, wherein:
the first light intensity value of the visible spectrum band is set according to an imaging quality of the visible spectrum band image.
20. The multispectral imaging system of claim 19, wherein:
the maximum amplitude for controlling the current or voltage of the light source is determined from the maximum difference between the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band, so as to adjust the light intensity.
21. A method of multispectral imaging, comprising:
irradiating an area with a combination of visible light and exciting light, the irradiated area being excited by the exciting light to release fluorescence;
receiving visible light and fluorescence in the reflected light from the irradiated area to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information; and
periodically varying the light intensity value of the visible spectrum band across consecutive video frame images, so that bright and dark visible-light video images alternate in the consecutive frames.
22. The multispectral imaging method of claim 21, wherein:
the light intensity values of the visible spectrum band include a first light intensity value of the visible spectrum band and a second light intensity value of the visible spectrum band, which alternately and periodically appear in each period of the consecutive video frame images.
23. The method of multispectral imaging according to claim 22, wherein:
when the video image is scanned progressively, each period spans two multispectral fused image frames, the light intensity value of the fluorescence spectrum band is kept unchanged across the consecutive image frames, the light intensity value of the visible spectrum band is set to the first light intensity value of the visible spectrum band in the first half of the period and to the second light intensity value of the visible spectrum band in the second half of the period, and the first light intensity value of the visible spectrum band is different from the second light intensity value of the visible spectrum band.
24. The method of multispectral imaging according to claim 22, wherein:
in the progressive-scan video mode, the light intensity values of the fluorescence spectrum band comprise a first light intensity value of the fluorescence spectrum band and a second light intensity value of the fluorescence spectrum band, which alternately and periodically appear in each period of the video image, and the periodic variation of the light intensity value of the fluorescence spectrum band is opposite to the periodic variation of the light intensity value of the visible spectrum band.
25. The method of multispectral imaging according to claim 22, wherein:
in the progressive-scan video mode, each period spans four image frames, and the first light intensity value of the visible spectrum band or the second light intensity value of the visible spectrum band appears in two consecutive frames.
26. The method of multispectral imaging according to claim 22, wherein:
in the interlaced-scan video mode, each frame image of the video is divided into two field images, and the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band alternately and periodically appear in different field images.
27. The method of multispectral imaging according to claim 22, wherein:
the visible spectrum band image and the fluorescence spectrum band image are fused into the multispectral image before the multispectral image is output.
28. A computer-readable storage medium storing computer-executable instructions for:
irradiating an area with a combination of visible light and exciting light, the irradiated area being excited by the exciting light to release fluorescence;
receiving visible light and fluorescence in the reflected light from the irradiated area to form a multispectral image comprising visible spectrum band image information and fluorescence spectrum band image information; and
periodically varying the light intensity value of the visible spectrum band across consecutive video frame images, so that bright and dark visible-light video images alternate in the consecutive frames.
29. A method of light intensity adjustment for multi-spectral imaging, comprising:
calculating the minimum value of the active fluorescence pixels, the maximum value of the active fluorescence pixels and the average value of all the active fluorescence pixels;
calculating a median value of the active fluorescence pixels according to the calculated minimum value and the maximum value of the active fluorescence pixels;
selecting the smaller of the median value of the active fluorescence pixels and the average value of the active fluorescence pixels;
determining a preset reference light intensity value of the visible spectrum band according to the selected value, and taking the reference light intensity value as the second light intensity value of the visible spectrum band;
setting a first light intensity value of the visible spectrum band according to the imaging quality of the visible spectrum band image;
calculating the maximum difference between the first light intensity value of the visible spectrum band and the second light intensity value of the visible spectrum band; and
determining, from the maximum difference, a maximum amplitude for controlling the light source current or voltage so as to adjust the light intensity.
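
As a rough illustration of the final steps (the linear intensity-to-current scaling and all numeric values are assumptions), the maximum visible-band intensity difference can be converted into a drive-amplitude limit; the second intensity value would come from a statistics-based calculation such as the one sketched after claim 18:

    # Illustrative sketch: convert the maximum visible-band intensity
    # difference into a maximum light-source drive amplitude.
    MAX_CURRENT_MA = 700.0   # hypothetical full-scale light-source current

    def drive_amplitude(first_visible: float, second_visible: float) -> float:
        """Maximum current swing from the maximum visible-intensity difference."""
        max_diff = abs(first_visible - second_visible)
        return max_diff * MAX_CURRENT_MA   # assumed linear intensity-to-current map

    first_visible = 0.85    # assumed value chosen for visible-image quality
    second_visible = 0.30   # e.g. the statistics-derived value
    print(f"maximum drive amplitude ~ {drive_amplitude(first_visible, second_visible):.0f} mA")
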
30. The method of claim 29, wherein:
the median of the active fluorescence pixels is also calculated in combination with the pixel maximum gray level.
CN201811063689.7A 2018-09-12 2018-09-12 System and method for visible light and excited fluorescence real-time imaging Pending CN110893095A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811063689.7A CN110893095A (en) 2018-09-12 2018-09-12 System and method for visible light and excited fluorescence real-time imaging
PCT/CN2019/105570 WO2020052623A1 (en) 2018-09-12 2019-09-12 System and method for real-time imaging by visible light and excited fluorescent light

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811063689.7A CN110893095A (en) 2018-09-12 2018-09-12 System and method for visible light and excited fluorescence real-time imaging

Publications (1)

Publication Number Publication Date
CN110893095A (en) 2020-03-20

Family

ID=69777319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811063689.7A Pending CN110893095A (en) 2018-09-12 2018-09-12 System and method for visible light and excited fluorescence real-time imaging

Country Status (2)

Country Link
CN (1) CN110893095A (en)
WO (1) WO2020052623A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111887810A (en) * 2020-07-24 2020-11-06 西北大学 Near-infrared two-region co-radial off-axis optical-CT dual-mode imaging system and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060004292A1 (en) * 2002-12-13 2006-01-05 Alexander Beylin Optical examination method and apparatus particularly useful for real-time discrimination of tumors from normal tissues during surgery
CN102099671A (en) * 2008-05-20 2011-06-15 大学健康网络 Devices and methods for fluorescence-based imaging and monitoring
CN103300812A (en) * 2013-06-27 2013-09-18 中国科学院自动化研究所 Endoscope-based multispectral video navigation system and method
CN107518879A (en) * 2017-10-11 2017-12-29 北京数字精准医疗科技有限公司 A kind of fluoroscopic imaging device and method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2359637A1 (en) * 1999-01-26 2000-07-27 Stephen F. Fulghum, Jr. Autofluorescence imaging system for endoscopy
US7697975B2 (en) * 2003-06-03 2010-04-13 British Colombia Cancer Agency Methods and apparatus for fluorescence imaging using multiple excitation-emission pairs and simultaneous multi-channel image detection
US20090131800A1 (en) * 2007-11-15 2009-05-21 Carestream Health, Inc. Multimodal imaging system for tissue imaging
JPWO2011007435A1 (en) * 2009-07-16 2012-12-20 株式会社山野光学 Aperture stop
CN105496354B (en) * 2014-09-23 2019-06-04 岩崎电气株式会社 camera system
US11206987B2 (en) * 2015-04-03 2021-12-28 Suzhou Caring Medical Co., Ltd. Method and apparatus for concurrent imaging at visible and infrared wavelengths

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022101823A1 (en) * 2020-11-13 2022-05-19 International Business Machines Corporation Identification of a section of bodily tissue for pathology tests
WO2022105902A1 (en) * 2020-11-20 2022-05-27 上海微创医疗机器人(集团)股份有限公司 Fluorescence endoscope system, control method and storage medium
CN113749772A (en) * 2021-04-22 2021-12-07 上海格联医疗科技有限公司 Enhanced near-infrared 4K fluorescence navigation system
CN113749771A (en) * 2021-04-30 2021-12-07 上海格联医疗科技有限公司 Molecular image near-infrared two-zone fluorescence navigation system
CN113208567A (en) * 2021-06-07 2021-08-06 上海微创医疗机器人(集团)股份有限公司 Multispectral imaging system, imaging method and storage medium
CN113610823A (en) * 2021-08-13 2021-11-05 南京诺源医疗器械有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN113610823B (en) * 2021-08-13 2023-08-22 南京诺源医疗器械有限公司 Image processing method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2020052623A1 (en) 2020-03-19

Similar Documents

Publication Publication Date Title
CN110893095A (en) System and method for visible light and excited fluorescence real-time imaging
CN110893096A (en) Multispectral imaging system and method based on image exposure
US9872610B2 (en) Image processing device, imaging device, computer-readable storage medium, and image processing method
US9486123B2 (en) Endoscope system which enlarges an area of a captured image, and method for operating endoscope system
US10335014B2 (en) Endoscope system, processor device, and method for operating endoscope system
CN110198653B (en) Simultaneous visual and fluorescence endoscopic imaging
US8723937B2 (en) Endoscope system, imaging apparatus, and control method
JP5814698B2 (en) Automatic exposure control device, control device, endoscope device, and operation method of endoscope device
US10694117B2 (en) Masking approach for imaging multi-peak fluorophores by an imaging system
US9271635B2 (en) Fluorescence endoscope apparatus
US12171396B2 (en) Endoscope apparatus, operating method of endoscope apparatus, and information storage medium
JPWO2018180631A1 (en) Medical image processing apparatus, endoscope system, and method of operating medical image processing apparatus
JP5698476B2 (en) ENDOSCOPE SYSTEM, ENDOSCOPE SYSTEM OPERATING METHOD, AND IMAGING DEVICE
CN110087528B (en) Endoscope system and image display device
CN102843951A (en) Fluorescence observation device
CN109475283B (en) Endoscope system
US12052526B2 (en) Imaging system having structural data enhancement for non-visible spectra
CN114027765B (en) Fluorescence endoscope system, control method, and storage medium
JP7721989B2 (en) Ophthalmological image processing program
CN111345902B (en) Systems and methods for creating HDR monochromatic images of fluorescing fluorophores
JP7584983B2 (en) Medical image processing apparatus, its control method, and program
US20240335091A1 (en) Systems and methods for providing medical fluorescence imaging with a modulated fluorescence excitation illumination source

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201224

Address after: 3 / F, building 10, 2388 xiupu Road, Pudong New Area, Shanghai, 201315

Applicant after: Shanghai Yisi medical imaging equipment Co.,Ltd.

Applicant after: Zhejiang Yijing Medical Equipment Co.,Ltd.

Address before: 3 / F, building 10, 2388 xiupu Road, Pudong New Area, Shanghai, 201315

Applicant before: Shanghai Yisi medical imaging equipment Co.,Ltd.

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200320