WO2019230095A1 - Camera device, image processing method, and camera system - Google Patents
Camera device, image processing method, and camera system
- Publication number
- WO2019230095A1 (PCT/JP2019/007801, JP2019007801W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- video
- visible
- camera
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/002—Scanning microscopes
- G02B21/0024—Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
- G02B21/0052—Optical details of the image generation
- G02B21/0076—Optical details of the image generation arrangements using fluorescence or luminescence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000095—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/043—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for fluorescence imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/046—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances for infrared imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0033—Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room
- A61B5/0035—Features or image-related aspects of imaging apparatus, e.g. for MRI, optical tomography or impedance tomography apparatus; Arrangements of imaging apparatus in a room adapted for acquisition of images from more than one imaging mode, e.g. combining MRI and optical tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0071—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/002—Scanning microscopes
- G02B21/0024—Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
- G02B21/008—Details of detection or image processing, including general computer control
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/06—Means for illuminating specimens
- G02B21/08—Condensers
- G02B21/12—Condensers affording bright-field illumination
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/16—Microscopes adapted for ultraviolet illumination ; Fluorescence microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/361—Optical details, e.g. image relay to the camera or image sensor
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2423—Optical details of the distal end
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2446—Optical details of the image relay
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2453—Optical details of the proximal end
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2476—Non-optical details, e.g. housings, mountings, supports
- G02B23/2484—Arrangements in relation to a camera or imaging device
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/26—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/042—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by a proximal camera, e.g. a CCD camera
Definitions
- the present disclosure relates to a camera device, an image processing method, and a camera system that process an image captured during a medical practice, for example.
- By displaying an observation image (for example, a normal visible image or a fluorescence image excited by IR excitation light) on a monitor, a doctor or the like can confirm the situation of the surgical target part in detail and can grasp the situation of the surgical target part in real time.
- Patent Document 1 discloses an endoscope apparatus in which the brightness of the fluorescence image and the brightness of the reference light image, calculated by a fluorescence image brightness calculation circuit and a reference light image brightness calculation circuit respectively, are multiplied by a first coefficient and a second coefficient stored in a coefficient storage memory and then added to calculate the brightness of a dimming target, and a gain for reaching the dimming target value is calculated and adjusted via a gain calculation circuit.
- When a target site where surgery or treatment is performed (for example, a site where a fluorescent drug has been administered to the subject by injection before surgery) is imaged, the visibility of the output video displayed on the monitor is important for a doctor or the like to grasp the details of the situation of the target part (for example, the affected part in the subject).
- The present disclosure has been devised in view of the above-described conventional circumstances, and an object thereof is to provide a camera device, an image processing method, and a camera system that suppress deterioration of the image quality of the fluorescence image, make the fluorescent portion in the fluorescence image easy to see when the fluorescence image and a normal visible image are superimposed, and support the output of captured images that enable a user such as a doctor to clearly determine the state of a target region of a subject.
- The present disclosure provides a camera device including: a camera head capable of both imaging based on visible light incident on a medical optical instrument from a target region of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target region; and an image processing unit that amplifies the intensity of the fluorescence image input from the camera head to make the black and white portions of the fluorescence image stand out, performs a nonlinear conversion process on the amplified fluorescence image, superimposes the fluorescence image after the nonlinear conversion process on the visible image obtained by imaging based on the visible light, and generates a superimposed image to be output to an output unit.
- The present disclosure also provides an image processing method in a camera device including a camera head and an image processing unit, the method including: a step in which the camera head performs imaging based on visible light incident on a medical optical instrument from a target site of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target site; and a step in which the image processing unit amplifies the intensity of the fluorescence image input from the camera head to make the black and white portions of the fluorescence image stand out, performs a nonlinear conversion process on the amplified fluorescence image, superimposes the fluorescence image after the nonlinear conversion process on the visible image obtained by imaging based on the visible light, and generates a superimposed image to be output to an output unit.
- The present disclosure further provides a camera system including a camera device and an output unit, in which the camera device performs imaging based on visible light incident on a medical optical instrument from a target site of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target site, amplifies the intensity of the fluorescence image obtained by the imaging based on the fluorescence to make the black and white portions of the fluorescence image stand out, performs a nonlinear conversion process on the amplified fluorescence image, superimposes the fluorescence image after the nonlinear conversion process on the visible image obtained by the imaging based on the visible light, and generates a superimposed image to be output to the output unit; and the output unit outputs the superimposed image generated by the camera device.
- According to the present disclosure, deterioration of the image quality of the fluorescence image is suppressed, the fluorescent part in the fluorescence image is easy to see when the fluorescence image and a normal visible image are superimposed, and the output of captured images that enable a user such as a doctor to clearly determine the state of the target site can be supported.
- Block diagram showing a detailed hardware configuration example of the visible video / IR video superimposition processing unit
- Explanatory diagram of threshold processing
- Explanatory diagram of a first example (binarization) of nonlinear conversion processing
- Explanatory diagram of a second example (N-value conversion) of nonlinear conversion processing
- Flowchart illustrating an example of an operation procedure of the camera device according to the first embodiment
- Block diagram illustrating a hardware configuration example of a camera device according to a second embodiment
- Diagram showing a display example in which a visible video and a superimposed video are displayed side by side for comparison
- Diagram showing a display example in which an IR video and a superimposed video are displayed side by side for comparison
- Diagram showing a display example in which a visible video, an IR video, and a superimposed video are displayed side by side for comparison
- Diagram showing a system configuration example in which a medical camera system including the camera device according to the first and second embodiments is applied to a surgical endoscope system
- a medical camera system used in medical surgery such as microscopic surgery or endoscopic surgery will be described as an example of a camera system including a camera device according to the present disclosure.
- the camera system is not limited to the example of the medical camera system.
- The camera device performs imaging based on visible light incident on a medical optical instrument from an observation target site (for example, an affected area that is the target of surgery) of a subject (for example, a patient) to which a fluorescent drug such as ICG (Indocyanine Green) has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the observation target site.
- the medical optical instrument is, for example, a surgical microscope or a surgical endoscope.
- The camera device performs image processing including at least nonlinear conversion processing on the fluorescence image obtained by imaging based on the fluorescence, superimposes the processed fluorescence image on the visible image obtained by imaging based on the visible light, and generates a superimposed image to be output to the output unit.
- FIG. 1 is a system configuration diagram showing a configuration example in which a medical camera system including the camera device 20 according to the first and second embodiments is applied to a surgical microscope system.
- the surgical microscope system includes a surgical microscope 10 as an example of a medical optical instrument, a camera device 20, and an output unit 30.
- the camera device 20 includes a camera head 21 that captures an observation image of an observation target region obtained by the surgical microscope 10 by condensing light incident on the imaging optical system 23 on the imaging unit 24.
- The camera device 20 includes a CCU (Camera Control Unit) 22 that performs image processing on each frame of the observation images constituting the observation video captured by the camera head 21.
- the camera head 21 and the CCU 22 are connected by a signal cable 25.
- the camera head 21 is attached to and connected to the camera attachment unit 15 of the surgical microscope 10.
- An output unit 30 (for example, a display device such as a monitor) for displaying an observation image as a result of image processing performed by the CCU 22 is connected to the output terminal of the CCU 22.
- The surgical microscope 10 is, for example, a binocular microscope, and includes an objective lens 11, an observation optical system 12 provided so as to correspond to the left and right eyes of an observer such as a doctor, an eyepiece unit 13, a camera imaging optical system 14, and a camera mounting portion 15.
- the observation optical system 12 includes zoom optical systems 101L and 101R, imaging lenses 102L and 102R, and eyepieces 103L and 103R so as to correspond to the left and right eyes of the observer.
- the zoom optical systems 101L and 101R, the imaging lenses 102L and 102R, and the eyepiece lenses 103L and 103R are symmetrically arranged with the optical axis of the objective lens 11 in between.
- Light from the subject 40 is incident on the objective lens 11 and then passes through the zoom optical systems 101L and 101R, the imaging lenses 102L and 102R, the eyepieces 103L and 103R, the optical system 104L, and the beam splitter 104R, and the left and right observation images having parallax are guided to the eyepiece unit 13.
- the observer can view the state of the observation target portion of the subject 40 in three dimensions by looking through the eyepiece unit 13 with both eyes.
- The above-described light from the subject 40 is either the reflected light (for example, normal RGB visible light) of the white light emitted from the light source device 31 (described later) onto the observation target portion of the subject 40, to which the above-described fluorescent agent such as ICG has been administered in advance by injection or the like, or the fluorescence generated when the IR excitation light emitted from the light source device 31 excites the fluorescent agent.
- a band cut filter (BCF: Band Cut Filter) for blocking transmission of IR excitation light is formed between the objective lens 11 and the respective zoom optical systems 101L and 101R.
- A doctor or the like administers, in advance, a fluorescent agent such as ICG (indocyanine green) that fluoresces in the body of the subject 40 in order to determine the status of the lymph nodes at the observation target site (that is, the affected part of the subject 40).
- the camera imaging optical system 14 includes an optical system 104L, a beam splitter 104R, and a mirror 105R.
- the camera imaging optical system 14 deflects and separates the light passing through the observation optical system 12 by the beam splitter 104R, reflects the light by the mirror 105R, and guides it to the camera mounting unit 15.
- FIG. 2 is a diagram showing an example of the appearance of a surgical microscope system.
- The surgical microscope 10 is provided with the eyepiece unit 13 at the upper part of the microscope main body, a housing for the camera imaging optical system 14 extends laterally from the base end of the eyepiece unit 13, and the camera mounting portion 15 is provided there.
- the camera mounting portion 15 opens upward and is formed so that the imaging optical system 23 of the camera head 21 can be mounted.
- the imaging optical system 23 can be attached to and detached from the main body of the camera head 21 and can be exchanged, and an imaging optical system having different optical characteristics can be used depending on the application.
- The camera head 21 includes, for example, a spectral prism that splits the subject image into light in each of the RGB (Red Green Blue) and IR (Infrared Radiation) wavelength bands, and a four-plate imaging unit having four image sensors that capture the subject images of the light in the RGB and IR wavelength bands. Note that a single-plate imaging unit having one image sensor in which RGB and IR pixels are arranged may be used as the imaging unit 24. Alternatively, a two-plate imaging unit having a prism that separates visible light and IR light (for example, fluorescence) and two image sensors, one that captures visible light and one that captures IR light (for example, fluorescence), may be used as the imaging unit 24.
- The surgical microscope system includes a light source device 31 that illuminates the target region, a recorder 32 that records the observation video captured by the camera device 20, an operation unit 33 for operating the surgical microscope system, and a foot switch 37 with which the observer performs operation input with his or her foot.
- the operation unit 33, CCU 22, light source device 31 and recorder 32 are housed in a control unit housing 35.
- an output unit 30 for example, a display such as a liquid crystal display device
- the surgical microscope 10 is attached to a support arm 34 that can be displaced, and is connected to a control unit housing 35 via the support arm 34.
- FIG. 3 is a block diagram illustrating a hardware configuration example of the camera device 20 according to the first embodiment.
- a camera device 20 shown in FIG. 3 includes camera heads 21 and 121 and CCUs 22 and 122.
- the camera heads 21 and 121 and the CCUs 22 and 122 are connected via a signal cable 25.
- the camera head 21 has imaging optical systems 23 and 123 and an imaging unit 24.
- the camera head 21 is attached to the camera mounting portion 15 of the surgical microscope 10 during, for example, a microscope operation.
- Light from the subject 40 passes through the imaging optical systems 23 and 123 and is focused onto the imaging surfaces of the image sensors held by the visible imaging unit 241 and the IR imaging unit 242 of the imaging unit 24, so that an RGB subject image and an IR subject image are captured, respectively.
- The imaging optical systems 23 and 123 have one or more lenses and a separation prism for splitting the light from the subject 40 into the RGB and IR wavelength bands.
- the imaging unit 24 includes a visible imaging unit 241 and an IR imaging unit 242.
- The visible imaging unit 241 is configured using, for example, three image sensors arranged to capture light in the respective RGB wavelength bands, or a single-plate image sensor in which RGB pixels are arranged, and generates a visible-light observation video (hereinafter also referred to as a "visible video" or "visible image") based on the light in the RGB wavelength bands that has passed through the imaging optical systems 23 and 123.
- The IR imaging unit 242 is configured using, for example, a single-plate image sensor having sensitivity in the IR wavelength band, or one that captures G (green) or R (red) light and also has sensitivity in the IR wavelength band, and generates a fluorescence observation video (hereinafter also referred to as an "IR video" or "IR image") based on the light in the IR wavelength band (that is, fluorescence) that has passed through the imaging optical systems 23 and 123.
- The image sensor is composed of a solid-state imaging device such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
- a signal of an observation image of the observation target region (that is, the affected part) of the subject 40 imaged by the camera head 21 is transmitted to the CCU 22 via the signal cable 25 and input.
- The CCUs 22 and 122, as an example of an image processing apparatus, each include a visible video / IR video separation unit 221, a visible video processing unit 222, an IR video processing unit 223, and a visible video / IR video superimposition processing unit 224.
- The CCUs 22 and 122, or their respective units, are configured using a CPU (Central Processing Unit), a DSP (Digital Signal Processor), or an FPGA (Field Programmable Gate Array), and their circuit configuration and operation can be set and changed by a program.
- The CCUs 22 and 122 receive the signals of the observation video (visible video, IR video) captured by the camera head 21, perform predetermined image processing for visible video on the visible video, and perform predetermined image processing for IR video on the IR video.
- The CCUs 22 and 122 perform various kinds of image processing for improving the image quality of the IR video on the IR video, superimpose the image-processed IR video on the visible video to generate a superimposed video, and output it to the output unit 30.
- The visible video / IR video separation unit 221 separates the observation video signal transmitted from the camera head 21 via the signal cable 25 into a visible video signal and an IR video signal, sends the visible video signal to the visible video processing unit 222, and sends the IR video signal to the IR video processing unit 223.
- When the visible video and the IR video are captured periodically in a time-division manner in the imaging unit 24 of the camera head 21, the captured visible video is input during a first fixed period and the captured IR video is input during the next fixed period. In that case, the visible video / IR video separation unit 221 sends only the visible video input during the first fixed period to the visible video processing unit 222, and sends only the IR video input during the next fixed period to the IR video processing unit 223.
- Since the visible imaging unit 241 and the IR imaging unit 242 are provided, it is also possible to capture the visible video and the IR video simultaneously.
- When the visible video and the IR video are alternately input to the visible video / IR video separation unit 221, the visible video and the IR video are identified and separated by, for example, referring to a header area.
- the visible video / IR video separation unit 221 may receive a visible video and an IR video at the same time.
- the signal cable 25 (see FIG. 3) is configured to include both a visible video signal line and an IR video signal line.
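- As a concrete illustration of the separation described above, the sketch below shows how an alternating frame sequence could be split into a visible stream and an IR stream. This is a minimal sketch, not the patent's implementation: the assumption that visible and IR frames simply alternate (visible first) and the function name `separate_visible_ir` are illustrative only; an actual device would rely on header metadata or dedicated signal lines as described above.

```python
import numpy as np

def separate_visible_ir(frames):
    """Split a time-division frame sequence into visible and IR streams,
    assuming the two streams alternate and the visible frame comes first."""
    visible_frames = frames[0::2]   # frames captured in the first period
    ir_frames = frames[1::2]        # frames captured in the following period
    return visible_frames, ir_frames

# Usage: eight dummy 16x16 frames standing in for the observation video signal
frames = [np.zeros((16, 16), dtype=np.uint8) for _ in range(8)]
visible_stream, ir_stream = separate_visible_ir(frames)
```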
- The visible video processing unit 222 performs normal image processing (for example, linear interpolation processing and high-resolution processing) on the input visible video, and sends the visible video signal after the image processing to the visible video / IR video superimposition processing unit 224.
- The IR video processing unit 223 performs normal image processing (for example, linear interpolation processing and high-resolution processing) on the input IR video, and sends the IR video signal after the image processing to the visible video / IR video superimposition processing unit 224.
- The visible video / IR video superimposition processing unit 224 performs various kinds of image processing on the IR video signal sent from the IR video processing unit 223, superimposes the image-processed IR video signal on the visible video sent from the visible video processing unit 222 to generate a superimposed video (superimposed image), and outputs it to the output unit 30. Details of the operation of the visible video / IR video superimposition processing unit 224 will be described later with reference to FIG. 4.
- The visible video / IR video superimposition processing unit 224 may receive a signal based on an operation of an operation unit (not shown) by a doctor or the like who visually checks the superimposed video output to the output unit 30 (for example, a liquid crystal display device) during microscopic surgery or endoscopic surgery, and may appropriately change the image processing parameters (described below) applied to the IR video signal.
- The output unit 30 is, for example, a video display device configured using a liquid crystal display (LCD: Liquid Crystal Display) or organic EL (Electroluminescence), or a recording device that records the data of the output video (that is, the superimposed video (superimposed image)).
- The recording device is, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- FIG. 4 is a block diagram illustrating a detailed hardware configuration example of the visible video / IR video superimposition processing unit 224.
- the visible video / IR video superimposition processing unit 224 includes a threshold processing unit 2241, a pre-conversion gain processing unit 2242, a non-linear conversion unit 2243, a post-conversion gain processing unit 2244, and a superimposition processing unit 2245.
- The threshold processing unit 2241 and the nonlinear conversion unit 2243 may be configured as the same circuit (for example, as the nonlinear conversion unit 2243). This is because the threshold processing described later can be regarded as an example of nonlinear conversion processing, and the characteristic used by the threshold processing unit 2241 (see FIG. 5A) can be realized with a lookup table so that the nonlinear conversion unit 2243 performs the threshold processing.
- The threshold processing unit 2241 performs an intensity correction process on the intensity of the input IR video signal (for example, the brightness of the IR image for each pixel constituting the IR video, or for each block of k * k pixels, where k is an even integer of 2 or more; the same applies hereinafter): the intensity is reduced if it is less than a first threshold th1, and increased if it is greater than or equal to a second threshold th2 (> first threshold th1) (see FIG. 5A). The threshold processing unit 2241 sends the IR video signal subjected to the intensity correction process to the pre-conversion gain processing unit 2242.
- FIG. 5A is an explanatory diagram of threshold processing.
- The threshold process is an intensity correction process that corrects the intensity of the input IR video signal using parameters (for example, two thresholds: the first threshold th1 and the second threshold th2) (see above).
- the horizontal axis (x-axis) indicates the intensity of the input IR video signal
- the vertical axis (y-axis) indicates the intensity of the IR video signal output after the threshold processing.
- a characteristic Cv1 indicates the characteristic of threshold processing in the threshold processing unit 2241.
- the IR video signal input to the threshold processing unit 2241 often includes a noise component.
- Because of this noise component, the contrast between the white portion of the IR video (in other words, the affected part that is the fluorescence observation target) and the black portion of the IR video (in other words, the background portion that is not the affected part) is reduced; the background becomes somewhat whitish, and as a result the image quality of the IR video deteriorates.
- If the intensity of the input IR video signal is less than the first threshold th1, the threshold processing unit 2241 reduces the intensity so that the output value becomes 0 (zero). That is, the threshold processing unit 2241 corrects the intensity of a pixel or block whose input intensity is less than the first threshold th1 to 0 (zero), making the black portion of the IR video stand out.
- the threshold processing unit 2241 when the intensity of the input IR video signal is equal to or higher than the second threshold th2, is the maximum output value that can represent the output value with a predetermined number of bits that can be expressed. Increase the strength so that That is, the threshold processing unit 2241 corrects the intensity of the pixel or block to which the intensity greater than or equal to the second threshold th2 is input to the maximum output value, and makes the white portion in the IR video stand out.
- Furthermore, the threshold processing unit 2241 corrects dark portions of the IR video toward black even when the intensity of the input IR video signal is equal to or higher than the first threshold th1 and lower than the second threshold th2.
- In this way, the gradation of the IR video can be brought closer to black and white, and the fluorescent part can be easily distinguished when superimposed on the visible video.
- At least one of the first threshold th1 and the second threshold th2 may be changed based on the input of a signal generated by an operation of an operation unit (not shown) by a doctor or the like who visually checks the superimposed video (superimposed image) displayed on the output unit 30, and the changed value may be set in the threshold processing unit 2241.
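- A minimal sketch of the intensity correction (threshold) processing described above is shown below, assuming 8-bit pixel values. The function name, the pass-through mapping of mid-range intensities, and the example threshold values are assumptions for illustration; the actual characteristic Cv1 in FIG. 5A may map the mid-range differently.

```python
import numpy as np

def threshold_process(ir, th1, th2, max_out=255):
    """Intensity correction with two thresholds (th1 < th2): intensities below
    th1 are forced to 0 (emphasizing the black background) and intensities at
    or above th2 are forced to the maximum output value (emphasizing the white
    fluorescent portion). Mid-range values are passed through unchanged here."""
    out = ir.astype(np.float32)
    out[ir < th1] = 0
    out[ir >= th2] = max_out
    return out.astype(np.uint8)

# Usage with illustrative thresholds
ir_frame = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
corrected = threshold_process(ir_frame, th1=32, th2=192)
```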
- the pre-conversion gain processing unit 2242 holds a preset first gain value.
- The first gain value may be set based on a signal generated by an operation of the operation unit by a doctor or the like who visually checks the superimposed video displayed on the output unit 30.
- The pre-conversion gain processing unit 2242 uses the first gain value to amplify the IR video signal that has been subjected to the intensity correction process by the threshold processing unit 2241.
- The pre-conversion gain processing unit 2242 sends the amplified IR video signal to the nonlinear conversion unit 2243. Accordingly, the camera device 20 can perform an amplification process that facilitates the nonlinear conversion processing in the subsequent nonlinear conversion unit 2243 and that can also contribute to other image processing, making the visible video / IR video superimposition processing unit 224 versatile.
- the non-linear conversion unit 2243 is configured using a non-linear processing circuit that holds a look-up table (LUT: Look Up Table) 2243t and performs non-linear processing using the look-up table 2243t.
- the nonlinear conversion unit 2243 performs nonlinear conversion processing on the IR video signal sent from the pre-conversion gain processing unit 2242 based on the value group written in the lookup table 2243t.
- the non-linear conversion processing is, for example, binarization or N-leveling of the intensity of the IR video signal, and represents processing for expressing the IR video signal in two gradations or N gradations.
- N is an integer of 3 or more.
- The type of nonlinear conversion process performed by the nonlinear conversion unit 2243 (for example, binarization or N-value conversion) may be set in advance, or may be set based on a signal generated by an operation of the operation unit by a doctor or the like who visually checks the superimposed video displayed on the output unit 30.
- The nonlinear conversion unit 2243 is not limited to performing the nonlinear conversion process using the lookup table 2243t described above; the nonlinear conversion process may instead be performed by a broken-line approximation circuit that connects points with line segments using a ROM (Read Only Memory) in which nonlinear function data (for example, the data of each point of the broken line used for approximation) is written.
- the look-up table normally has an output value corresponding to each of the values that the input signal can take (for example, 0 to 255 for 8 bits), and consists of 256 pieces of data.
- Therefore, as the number of bits of the input signal increases, the amount of data held in the lookup table tends to increase.
- With the broken-line approximation circuit, on the other hand, the amount of data to be held can be reduced because only the points of the broken line need to be stored.
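- The sketch below contrasts the two implementations mentioned above: a full lookup table that stores one output per possible 8-bit input value, and a broken-line (piecewise linear) approximation that stores only the breakpoints. The breakpoint values and the table contents are illustrative assumptions.

```python
import numpy as np

def lut_convert(ir, lut):
    # Full lookup table: one stored output value per possible input value (256 for 8 bits)
    return lut[ir]

def broken_line_convert(ir, xs, ys):
    # Broken-line approximation: only the breakpoints are stored, and the
    # circuit (here, np.interp) connects them with straight line segments
    return np.interp(ir, xs, ys).astype(np.uint8)

full_lut = np.clip((np.arange(256) - 64) * 2, 0, 255).astype(np.uint8)  # 256 entries
xs = np.array([0, 64, 192, 255])   # breakpoints (illustrative)
ys = np.array([0, 0, 255, 255])
ir_frame = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
out_lut = lut_convert(ir_frame, full_lut)
out_bl = broken_line_convert(ir_frame, xs, ys)
```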
- An IR video (that is, a video obtained by imaging the fluorescence generated when a fluorescent agent such as ICG is excited by IR excitation light) has a lower light intensity than normal visible light and is expressed with, for example, 10 bits. To make the IR portion (that is, the fluorescent portion) easier to distinguish, the nonlinear conversion unit 2243 performs a nonlinear conversion process (for example, binarization or N-value conversion) on the input IR video signal.
- FIG. 5B is an explanatory diagram of a first example (binarization) of nonlinear conversion processing.
- FIG. 5C is an explanatory diagram of a second example (N-value conversion) of nonlinear conversion processing.
- the horizontal axis (x-axis) indicates the intensity of the IR video signal
- the vertical axis (y-axis) indicates the intensity of the IR video signal output after the nonlinear conversion process.
- The nonlinear conversion unit 2243 converts the output to 0 (zero) when the intensity of a pixel or block (see above) of the input IR video signal is less than the value M1 held in the lookup table 2243t, and to the maximum output value otherwise.
- the characteristic Cv2 is a characteristic example showing an example of the nonlinear conversion process in the nonlinear conversion unit 2243.
- In this way, the camera device 20 can simply express the IR video signal as a black-and-white signal according to whether or not the input value (that is, the intensity of each pixel or block of the input IR video signal) is less than M1, so it is possible to generate a superimposed video in which the fluorescent portion is easily identified when superimposed on the visible video.
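- A minimal sketch of the binarization shown in FIG. 5B, assuming 8-bit values: pixels (or blocks) whose intensity is below M1 become 0 and all others become the maximum output value. The value of M1 and the function name are illustrative assumptions.

```python
import numpy as np

def binarize_ir(ir, m1, max_out=255):
    """Binarization: express the IR video in two gradations (black or white)
    according to whether each pixel or block is below the threshold M1."""
    return np.where(ir < m1, 0, max_out).astype(np.uint8)

# Usage with an illustrative threshold
ir_frame = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
binary = binarize_ir(ir_frame, m1=128)
```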
- the horizontal axis (x-axis) indicates the intensity of the IR video signal
- the vertical axis (y-axis) indicates the intensity of the IR video signal output after the nonlinear conversion process.
- The nonlinear conversion unit 2243 converts the output value in a stepwise manner by comparing the intensity of each pixel or block (see above) of the input IR video signal with the values N1, N2, N3, N4, N5, N6, and N7 held in the lookup table 2243t.
- the characteristic Cv3 is a characteristic example showing an example of nonlinear conversion processing in the nonlinear conversion unit 2243.
- For example, the input values at which the output reaches the maximum value and the corresponding group of output values are written in the lookup table 2243t, and if the input value is larger than N3 and smaller than N4, a value of 50% of the maximum value is assigned. That is, the output according to the characteristic Cv3 shown in FIG. 5C is assigned as the result of the nonlinear conversion process according to the comparison between the intensity of each pixel or block of the input IR video signal and the seven values N1 to N7.
- In this way, the camera device 20 can express the IR video signal finely in black and white with eight gradations according to the magnitude relationship between the input value (that is, the intensity of each pixel or block of the input IR video signal) and the seven values N1 to N7, so it is possible to generate a superimposed video in which the fluorescent portion is easily identified when superimposed on the visible video.
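- A minimal sketch of the N-value conversion shown in FIG. 5C, assuming 8-bit values: each pixel (or block) is compared against the seven stored thresholds N1 to N7 and assigned one of eight stepwise output levels. The equally spaced output levels and the example threshold values are assumptions; in the actual device the step heights come from the lookup table 2243t.

```python
import numpy as np

def n_value_convert(ir, thresholds, max_out=255):
    """N-value conversion: np.digitize returns, for each pixel, how many of the
    stored thresholds it reaches (0..7 for seven thresholds), and that index is
    mapped to a stepwise output level between 0 and max_out."""
    thresholds = np.asarray(thresholds)
    levels = len(thresholds)                 # 7 thresholds -> 8 output steps
    step = np.digitize(ir, thresholds)       # values 0 .. levels
    return (step * (max_out / levels)).astype(np.uint8)

# Usage with illustrative thresholds N1..N7
n1_to_n7 = [32, 64, 96, 128, 160, 192, 224]
ir_frame = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
stepped = n_value_convert(ir_frame, n1_to_n7)
```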
- The post-conversion gain processing unit 2244 holds a preset second gain value. Note that the second gain value may be set based on a signal generated by an operation of the operation unit by a doctor or the like who visually checks the superimposed video displayed on the output unit 30.
- The post-conversion gain processing unit 2244 uses the second gain value to amplify the IR video signal that has been subjected to the nonlinear conversion process by the nonlinear conversion unit 2243.
- The post-conversion gain processing unit 2244 sends the amplified IR video signal to the superimposition processing unit 2245. By amplifying the IR video signal, which may have become darker through the nonlinear conversion process, the camera device 20 makes the fluorescent portion of the IR video easier to identify when it is superimposed on the visible video in the subsequent superimposition processing unit 2245; this also contributes to other image processing, making the visible video / IR video superimposition processing unit 224 versatile.
- The superimposition processing unit 2245 receives the visible video signal sent from the visible video processing unit 222 and the IR video signal sent from the post-conversion gain processing unit 2244, and generates a superimposed video (superimposed image) in which the IR video signal is superimposed on the visible video signal (see FIG. 8). The superimposition processing unit 2245 outputs the superimposed video signal to the output unit 30 (for example, a monitor) as the output video.
- For example, by adding the values of the corresponding block of the IR video as G (green) information (pixel values) to the RGB information (pixel values) of each k * k block (see above) of the visible images constituting the visible video, the superimposition processing unit 2245 can generate a superimposed video in which the IR video portion superimposed on the visible video is colored green. Note that the superimposition processing in the superimposition processing unit 2245 is not limited to the above-described example.
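- A minimal sketch of this superimposition step, assuming 8-bit RGB visible frames and a single-channel processed IR frame of the same resolution: the IR intensity is added to the G (green) channel so that fluorescent areas appear green in the superimposed video. Adding per pixel rather than per k * k block, and the `weight` parameter, are simplifying assumptions.

```python
import numpy as np

def superimpose_green(visible_rgb, ir, weight=1.0):
    """Add the processed IR intensity to the G channel of the visible frame so
    that the fluorescent (white) portion of the IR video shows up in green."""
    out = visible_rgb.astype(np.float32)
    out[..., 1] += weight * ir.astype(np.float32)   # G channel only
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage with dummy frames
visible = np.random.randint(0, 256, (16, 16, 3), dtype=np.uint8)
ir_processed = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
superimposed = superimpose_green(visible, ir_processed, weight=0.5)
```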
- FIG. 8 is a diagram illustrating an example of a superimposed video G2as in which an IR video is superimposed on a visible video.
- In the superimposed video G2as, the white portion of the IR video (that is, the portion that is not the background but the affected part tg where a fluorescent agent such as ICG emits fluorescence) is, for example, colored green so that it can be easily discriminated when the IR video is superimposed on the visible video.
- By visually checking the superimposed video G2as output (displayed) on the output unit 30 (for example, a monitor), the doctor can determine the light emission distribution of the fluorescent agent administered into the body of the subject 40 and the state of the affected part tg.
- FIG. 6 is a flowchart illustrating an example of an operation procedure of the camera device 20 according to the first embodiment. The processing within the dotted line shown in FIG. 6 is executed for each frame of the visible images constituting the visible video or each frame of the IR images constituting the IR video.
- The camera device 20 causes the camera head 21 to condense the light from the subject 40 (that is, the visible light reflected by the subject 40 and the fluorescence) onto the imaging unit 24, and a visible image and an IR image are captured (St1).
- It is preferable that the IR excitation light, which excites the fluorescent agent previously administered into the body of the subject 40, is blocked by the band cut filter provided, for example, between the objective lens 11 and the zoom optical systems 101L and 101R. This suppresses the reflected IR excitation light from entering the camera head 21 of the camera device 20 and improves the image quality of the fluorescence observation video signal.
- The camera device 20 separates the observation video signal transmitted from the camera head 21 via the signal cable 25 into a visible video signal and an IR video signal, sends the visible video signal to the visible video processing unit 222, and sends the IR video signal to the IR video processing unit 223 (St2).
- the camera device 20 performs normal image processing (for example, linear interpolation processing, high resolution processing, etc.) for each frame of the visible image constituting the visible video (St3A).
- The camera device 20 performs normal image processing (for example, linear interpolation processing and high-resolution processing) on each frame of the IR images constituting the IR video (St3B1). For each IR image frame that has been image-processed in step St3B1, the camera device 20 performs an intensity correction process on the intensity of the frame (for example, the brightness of the IR image for each pixel constituting the IR video, or for each block of k * k pixels): the intensity is reduced when it is less than the first threshold th1 and increased when it is greater than or equal to the second threshold th2 (> first threshold th1) (St3B2).
- the camera device 20 amplifies the signal of the IR image subjected to the intensity correction process using the first gain value for each frame on which the intensity correction process (threshold process) is performed in step St3B2 (St3B3).
- the camera device 20 performs nonlinear conversion processing for each frame of the IR image constituting the IR video amplified in step St3B3 based on the value group written in the lookup table 2243t (St3B4).
- For each frame of the IR video that has been subjected to the nonlinear conversion process in step St3B4, the camera device 20 amplifies the IR video signal using the second gain value (St3B5).
- The camera device 20 receives the visible video signal image-processed in step St3A and the IR video signal amplified in step St3B5, and generates a superimposed video (superimposed image) in which the IR video signal is superimposed on the visible video signal (St4).
- the camera device 20 outputs the superimposed video signal generated in step St4 to the output unit 30 (for example, a monitor) as an output video (St5).
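- The sketch below strings together the IR-side steps St3B2 to St3B5 described above for a single 8-bit IR frame: threshold (intensity correction) processing, amplification with the first gain value, nonlinear conversion via a lookup table, and amplification with the second gain value. The parameter values, the 8-bit assumption, and the example lookup table are illustrative, not the patent's actual settings.

```python
import numpy as np

def process_ir_frame(ir, th1, th2, gain1, lut, gain2, max_out=255):
    """Apply the IR-side processing chain of FIG. 6 to one frame."""
    x = ir.astype(np.float32)
    x[ir < th1] = 0                        # St3B2: suppress dark (background) pixels
    x[ir >= th2] = max_out                 # St3B2: saturate bright (fluorescent) pixels
    x = np.clip(x * gain1, 0, max_out)     # St3B3: first gain value
    x = lut[x.astype(np.uint8)]            # St3B4: nonlinear conversion (lookup table)
    x = np.clip(x.astype(np.float32) * gain2, 0, max_out)  # St3B5: second gain value
    return x.astype(np.uint8)

# Usage: an illustrative binarization table and a dummy frame
lut = np.where(np.arange(256) < 128, 0, 255).astype(np.uint8)
ir_frame = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
processed = process_ir_frame(ir_frame, th1=16, th2=200, gain1=1.2, lut=lut, gain2=1.5)
```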
- FIG. 7 is a schematic diagram showing an example of changes in IR video before and after threshold processing and nonlinear conversion processing.
- an IR video G2 is an IR video before threshold processing and nonlinear conversion processing are performed.
- In the IR video G2, a wide noise component appears strongly in the portion BKG1 other than the affected part tg (that is, the area around the affected part tg and the entire area excluding the affected part tg), and its color ranges from black to gray.
- the IR image G2a is an IR image after the threshold processing and the nonlinear conversion processing are performed.
- In the IR video G2a, the noise component that appeared in the portion BKG1 other than the affected part tg of the IR video G2 has been eliminated, resulting in clear image quality, and the portion BKG2 other than the affected part tg has a hue close to black.
- the image quality of the IR video G2a is improved compared to the image quality of the IR video G2.
- As described above, the camera device 20 performs imaging based on visible light incident on the medical optical instrument from an observation target region (for example, an affected part that is the target of surgery) of a subject (for example, a patient) to which a fluorescent agent such as ICG (Indocyanine Green) has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the observation target region.
- the medical optical instrument is, for example, a surgical microscope or a surgical endoscope.
- The camera device performs image processing including at least nonlinear conversion processing on the fluorescence image obtained by imaging based on the fluorescence, superimposes the processed fluorescence image on the visible image obtained by imaging based on the visible light, and generates a superimposed image to be output to the output unit.
- By performing the nonlinear conversion processing on the fluorescence image based on the fluorescence emission of the fluorescent agent such as ICG previously administered to the affected part in the body of the subject 40, the camera device 20 can define the hue of the fluorescence image with a bit count lower than the bit count defining the colors of the visible image, so deterioration of the image quality of the fluorescence image can be suppressed. In addition, since the camera device 20 performs the nonlinear conversion processing so that the fluorescence image becomes a clear black-and-white image, the fluorescent portion in the fluorescence image is easier to see when the fluorescence image and the normal visible image are superimposed, which supports the output of captured images that allow a user such as a doctor to clearly determine the state of the target portion of the subject.
- The camera device 20 amplifies the intensity of the fluorescence image input from the camera head 21 and then performs the nonlinear conversion process on the amplified fluorescence image. This amplification facilitates the nonlinear conversion processing in the subsequent nonlinear conversion unit 2243 and also contributes to other image processing, making the visible video / IR video superimposition processing unit 224 versatile.
- The camera device 20 also amplifies the intensity of the fluorescence image after the nonlinear conversion process. By amplifying the IR video signal, which may have become darker through the nonlinear conversion process, the camera device 20 makes the fluorescent portion of the IR video easier to identify when it is superimposed on the visible video in the subsequent superimposition processing unit 2245; this also contributes to other image processing, making the visible video / IR video superimposition processing unit 224 versatile.
- The amplification factor used in the pre-conversion gain processing unit 2242 or the post-conversion gain processing unit 2244 can be varied by an operation of a user such as a doctor. This allows the doctor or the like to appropriately adjust the visibility of the superimposed video output to the output unit 30 (in other words, of the affected part tg of the subject 40), so that an appropriate judgment can be made during microscopic surgery or endoscopic surgery.
- The camera device 20 executes an intensity correction process that decreases the intensity of the fluorescence image when it is less than the first threshold th1 and increases it when it is greater than or equal to the second threshold th2, which is greater than the first threshold th1, and then executes the nonlinear conversion process on the fluorescence image after the intensity correction process.
- Through the intensity correction process, the camera device 20 can effectively eliminate the influence of the noise component, bring the gradation of the fluorescence image close to black-and-white values, and make the fluorescent part easy to distinguish when superimposed on the visible image.
- The first threshold th1 and the second threshold th2 can be varied by an operation of a user such as a doctor. Accordingly, when the boundary of the affected part tg of the subject 40 is buried in the surrounding visible video in the superimposed video output to the output unit 30 and is difficult to see, the doctor or the like can adjust the first threshold th1 and the second threshold th2 as appropriate, so that an appropriate determination can be made during microscopic surgery or endoscopic surgery.
- the camera device 20 has a lookup table 2243t having a value group that can be changed by the operation of a user such as a doctor, and performs nonlinear conversion processing based on the lookup table 2243t.
- With binarization, the camera device 20 can simply express the IR video signal as a black-and-white signal according to whether or not the input value (that is, the intensity of each pixel or block of the input IR video signal) is less than M1, so it is possible to generate a superimposed video in which the fluorescent portion is easily identified when superimposed on the visible video.
- With N-value conversion, the camera device 20 can express the IR video signal finely in black and white with eight gradations according to the magnitude relationship between the input value (that is, the intensity of each pixel or block of the input IR video signal) and the seven values N1 to N7, so it is possible to generate a superimposed video in which the fluorescent part is easily identified when superimposed on the visible video. As described above, by selecting the lookup table 2243t as appropriate, the doctor or the like can select the nonlinear conversion process performed in the nonlinear conversion unit 2243 for checking the superimposed video output to the output unit 30 (in other words, the affected part tg of the subject 40), so the camera device 20 can execute an appropriate nonlinear conversion process and a superimposed video with good image quality can be visually checked.
- The camera device further comprises at least one selection unit that selects at least one of the visible image, the fluorescence image that has undergone image processing, and the superimposed image, and outputs the selection to at least one corresponding output unit.
- FIG. 9 is a block diagram illustrating a hardware configuration example of the camera device 20 according to the second embodiment.
- In FIG. 9, the same components as those in FIG. 3 are denoted by the same reference numerals and their description is simplified or omitted; only the differences are described.
- The camera device 20 shown in FIG. 9 includes the camera heads 21 and 121 and the CCUs 22 and 122.
- The CCUs 22 and 122 include a visible video / IR video separation unit 221, a visible video processing unit 222, an IR video processing unit 223, a visible video / IR video superimposition processing unit 224, and selection units 2251, 2252, and 2253.
- To the selection unit 2251, the visible video signal after image processing by the visible video processing unit 222, the IR video signal after image processing by the IR video processing unit 223, and the superimposed video signal generated by the visible video / IR video superimposition processing unit 224 are input.
- In response to the input of a signal based on an operation by a user such as a doctor, the selection unit 2251 selects at least one of the visible image, the IR image, and the superimposed image and outputs the selected image to the output unit 301.
- To the selection unit 2252, the visible video signal after image processing by the visible video processing unit 222, the IR video signal after image processing by the IR video processing unit 223, and the superimposed video signal generated by the visible video / IR video superimposition processing unit 224 are input.
- In response to the input of a signal based on an operation by a user such as a doctor, the selection unit 2252 selects at least one of the visible image, the IR image, and the superimposed image and outputs the selected image to the output unit 302.
- To the selection unit 2253, the visible video signal after image processing by the visible video processing unit 222, the IR video signal after image processing by the IR video processing unit 223, and the superimposed video signal generated by the visible video / IR video superimposition processing unit 224 are input.
- In response to the input of a signal based on an operation by a user such as a doctor, the selection unit 2253 selects at least one of the visible image, the IR image, and the superimposed image and outputs the selected image to the output unit 303.
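The routing role of the selection units 2251, 2252, and 2253 can be sketched as follows; the string-keyed API is purely illustrative, since the actual units respond to signals generated by user operations rather than to string arguments:

```python
from typing import Dict, List
import numpy as np

class SelectionUnit:
    """Selects one or more of the visible / IR / superimposed video signals for one output unit."""

    def __init__(self, sources: Dict[str, np.ndarray]):
        # sources: e.g. {"visible": G1, "ir": G2a, "superimposed": G2as}
        self.sources = sources

    def select(self, names: List[str]) -> List[np.ndarray]:
        """Return the frames requested by the user operation, in the requested order."""
        return [self.sources[name] for name in names]

# Each output unit (301, 302, 303) would get its own selection unit fed with the same signals.
g1 = np.zeros((480, 640, 3), dtype=np.uint8)        # visible video frame (placeholder)
g2a = np.zeros((480, 640), dtype=np.uint8)          # IR video frame (placeholder)
g2as = np.zeros((480, 640, 3), dtype=np.uint8)      # superimposed video frame (placeholder)

unit_2251 = SelectionUnit({"visible": g1, "ir": g2a, "superimposed": g2as})
for frame in unit_2251.select(["visible", "superimposed"]):   # FIG. 10-style pairing
    print(frame.shape)
```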
- The output units 301, 302, and 303 are, for example, video display devices configured using a liquid crystal display (LCD) or organic EL (Electroluminescence), or recording devices that record the data of the output video (that is, the superimposed video or superimposed image).
- the recording device is, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- FIG. 10 is a diagram showing a display example in which the visible video G1 and the superimposed video G2as are displayed side by side for comparison.
- FIG. 11 is a diagram showing a display example in which the IR video G2a and the superimposed video G2as are displayed side by side for comparison.
- FIG. 12 is a diagram showing a display example in which the visible video G1, the IR video G2a, and the superimposed video G2as are displayed side by side for comparison.
- The camera device 20 according to Embodiment 2 can, for example, select the visible video G1 and the superimposed video G2as with the selection unit 2251 and output them to the output unit 301. That is, compared with the camera device 20 according to Embodiment 1, which outputs the superimposed video G2as to the output unit 30, the camera device 20 according to Embodiment 2 can select a plurality of types of video and display them on the same screen for comparison.
- Accordingly, the doctor or the like can compare the superimposed video G2as and the visible video G1 on the same output unit 301 and easily confirm whether the degree of superimposition of the fluorescent emission portion (that is, the video of the affected part tg) in the superimposed video G2as is appropriate.
- The degree of superimposition can be adjusted by changing the various parameters as necessary.
- The camera device 20 according to Embodiment 2 can, for example, select the IR video G2a and the superimposed video G2as with the selection unit 2252 and output them to the output unit 302. That is, compared with the camera device 20 according to Embodiment 1, which outputs the superimposed video G2as to the output unit 30, the camera device 20 according to Embodiment 2 can select a plurality of types of video and display them on the same screen for comparison. Accordingly, the doctor or the like can compare the superimposed video G2as and the IR video G2a on the same output unit 302 and determine whether the noise component has been appropriately suppressed in the IR video G2a.
- The camera device 20 according to Embodiment 2 can, for example, select the visible video G1, the IR video G2a, and the superimposed video G2as with the selection unit 2253 and output them to the output unit 303. That is, compared with the camera device 20 according to Embodiment 1, which outputs the superimposed video G2as to the output unit 30, the camera device 20 according to Embodiment 2 can select a plurality of types of video and display them on the same screen for comparison.
- In the example shown in FIG. 12, the display is segmented into four screens, among which the display areas of the visible video G1, the IR video G2a, and the superimposed video G2as are allocated.
- Accordingly, the doctor or the like can compare the visible video G1, the IR video G2a, and the superimposed video G2as on the same output unit 303 and easily confirm whether the degree of superimposition of the fluorescent emission portion (that is, the video of the affected part tg) in the superimposed video G2as is appropriate.
- The doctor or the like can also easily confirm whether the noise component has been appropriately suppressed in the IR video G2a and whether the intensity of the fluorescent emission portion (that is, the video of the affected part tg) has been appropriately corrected so that it can be distinguished in the superimposed video G2as. The doctor or the like can therefore correct the image quality of the IR video G2a by changing the various parameters described in Embodiment 1 as necessary.
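The kind of side-by-side comparison layout of FIGS. 10 to 12 can be sketched as a simple tiling of equally sized frames; the 2x2 split, the blank fourth tile, and the helper name below are assumptions made only for illustration:

```python
import numpy as np

def tile_2x2(top_left, top_right, bottom_left, bottom_right):
    """Compose four equally sized H x W x 3 frames into one 2H x 2W x 3 comparison frame."""
    top = np.hstack([top_left, top_right])
    bottom = np.hstack([bottom_left, bottom_right])
    return np.vstack([top, bottom])

h, w = 480, 640
g1 = np.zeros((h, w, 3), dtype=np.uint8)       # visible video G1
g2a = np.zeros((h, w, 3), dtype=np.uint8)      # IR video G2a (shown here as 3-channel gray)
g2as = np.zeros((h, w, 3), dtype=np.uint8)     # superimposed video G2as
blank = np.zeros((h, w, 3), dtype=np.uint8)    # unused fourth region

comparison = tile_2x2(g1, g2a, g2as, blank)    # FIG. 12-style four-way split
print(comparison.shape)                        # (960, 1280, 3)
```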
- FIGS. 10, 11, and 12 show examples in which a plurality of videos are selected by the selection units 2251, 2252, and 2253 and displayed on the output units 301, 302, and 303, respectively. However, the selection units 2251, 2252, and 2253 may each select only one of the visible video G1, the IR video G2a, and the superimposed video G2as and output the selected video to the corresponding output unit 301, 302, or 303.
- In this way, the camera device 20 can output, to each of the output units 301, 302, and 303, the video that the user such as a doctor wishes to check.
- FIG. 13 is a system configuration diagram showing a configuration example in which the medical camera system including the camera device 20 according to the first and second embodiments is applied to a surgical endoscope system.
- the surgical endoscope system includes a surgical endoscope 110, a camera device 120, an output unit 130 (for example, a display device such as a monitor), and a light source device 131.
- the camera device 120 is the same as the camera device 20 shown in FIGS. 1 to 4 and includes a camera head 121 and a CCU 122.
- The surgical endoscope 110 includes an elongated insertion portion 111 and an observation optical system composed of an objective lens 201L, a relay lens 202L, and an imaging lens 203L.
- The surgical endoscope 110 also includes a camera mounting unit 115 and a light source mounting unit 117 provided on the proximal side of the observation optical system, and a light guide 204 that guides illumination light from the light source mounting unit 117 to the distal end portion of the insertion unit 111. By mounting the imaging optical system 123 of the camera head 121 on the camera mounting unit 115 and capturing an image, an observation image can be acquired by the camera device 120.
- a light guide cable 116 is connected to the light source mounting portion 117, and a light source device 131 is connected via the light guide cable 116.
- the camera head 121 and the CCU 122 are connected by a signal cable 125, and the video signal of the subject 40 imaged by the camera head 121 is transmitted to the CCU 122 via the signal cable 125.
- An output unit 130 (for example, a display device such as a monitor) is connected to the output terminal of the CCU 122. Two left and right video outputs 1 and 2 for 3D display may be output, or a 2D observation video (observation image) may be output.
- In other words, the output unit 130 may display a 3D video as the observation video of the target region, or may display a 2D observation video (observation image).
- FIG. 14 is a diagram showing an example of the external appearance of the surgical endoscope system.
- a camera mounting unit 115 is provided on the proximal side of the insertion unit 111, and the imaging optical system 123 of the camera head 121 is mounted.
- a light source mounting portion 117 is provided on the proximal side of the insertion portion 111, and a light guide cable 116 is connected thereto.
- The camera head 121 is provided with operation switches so that operations on the captured observation image (freeze, release, image scan, and the like) can be performed at the user's hand.
- the surgical endoscope system includes a recorder 132 that records an observation video imaged by the camera device 120, an operation unit 133 for operating the surgical endoscope system, and a foot switch 137 that allows an observer to input an operation with a foot.
- the operation unit 133, the CCU 122, the light source device 131, and the recorder 132 are housed in the control unit housing 135.
- An output unit 130 is disposed on the upper portion of the control unit housing 135.
- With this configuration as well, similarly to the medical camera system described above, it is possible to output a superimposed image with which the state of the observation target portion acquired by the surgical endoscope 110 can be clearly confirmed.
- The present disclosure is useful as a camera device, an image processing method, and a camera system that suppress deterioration of the image quality of the fluorescence image, make the fluorescent portion of the fluorescence image easy to see when the fluorescence image and an ordinary visible image are superimposed, and support the output of captured images from which a user such as a doctor can clearly determine the state of the target portion of the subject.
Landscapes
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Physics & Mathematics (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Astronomy & Astrophysics (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Endoscopes (AREA)
- Microscopes, Condenser (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
Description
In Embodiment 1, the camera device performs imaging based on visible light incident on a medical optical instrument from an observation target portion (for example, an affected part that is the target of surgery) of a subject (for example, a patient) to which a fluorescent agent such as ICG (indocyanine green) has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the observation target portion. The medical optical instrument is, for example, a surgical microscope or a surgical endoscope. The camera device performs image processing including at least nonlinear conversion processing on the fluorescence image obtained by the imaging based on fluorescence, superimposes the fluorescence image after the image processing on the visible image obtained by the imaging based on visible light, and generates a superimposed image to be output to the output unit.
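As a rough end-to-end illustration of the pipeline described in this paragraph, the following Python sketch blends a processed fluorescence frame onto a visible frame; the alpha-blend style of superimposition, the green pseudo-color, and all parameter values are assumptions, since the embodiment only specifies that the fluorescence image after image processing is superimposed on the visible image:

```python
import numpy as np

def superimpose(visible_rgb, ir_processed, color=(0, 255, 0), weight=0.6):
    """Blend a processed (single-channel, [0,1]) fluorescence image onto an RGB visible frame.

    The fluorescence intensity is rendered in a pseudo-color and mixed into the visible image
    so that the fluorescing region (e.g. the affected part tg) remains visible in context.
    """
    vis = visible_rgb.astype(np.float32)
    overlay = ir_processed[..., None] * np.array(color, dtype=np.float32)
    alpha = ir_processed[..., None] * weight           # stronger fluorescence -> more overlay
    out = vis * (1.0 - alpha) + overlay * alpha
    return np.clip(out, 0, 255).astype(np.uint8)

visible = np.full((4, 4, 3), 128, dtype=np.uint8)      # flat gray visible frame
ir = np.zeros((4, 4), dtype=np.float32)
ir[1:3, 1:3] = 1.0                                     # bright fluorescent spot
print(superimpose(visible, ir)[1, 1])                  # pixel inside the fluorescent region
```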
In Embodiment 2, in addition to the configuration of the camera device according to Embodiment 1 described above, the camera device further includes at least one selection unit that selects at least one of the visible video, the fluorescence video subjected to image processing, and the superimposed image, and outputs the selection to at least one corresponding output unit.
20, 120 Camera device
21, 121 Camera head
22, 122 CCU (camera control unit)
23, 123 Imaging optical system
24 Imaging unit
25, 125 Signal cable
30, 130, 301, 302, 303 Output unit
40 Subject
110 Surgical endoscope
221 Visible video / IR video separation unit
222 Visible video processing unit
223 IR video processing unit
224 Visible video / IR video superimposition processing unit
2241 Threshold processing unit
2242 Pre-conversion gain processing unit
2243 Nonlinear conversion unit
2244 Post-conversion gain processing unit
2245 Superimposition processing unit
2251, 2252, 2253 Selection unit
Claims (9)
- A camera device comprising: a camera head capable of both imaging based on visible light incident on a medical optical instrument from a target portion of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target portion; and an image processing unit that amplifies the intensity of the fluorescence image input from the camera head to make the black portions and white portions of the fluorescence image stand out, then executes nonlinear conversion processing on the amplified fluorescence image, superimposes the fluorescence image after the nonlinear conversion processing on a visible image obtained by the imaging based on the visible light, and generates a superimposed image to be output to an output unit.
- The camera device according to claim 1, wherein the image processing unit decreases the intensity of the fluorescence image input from the camera head when the intensity is less than a first threshold, and amplifies the intensity when the intensity of the fluorescence image is equal to or greater than a second threshold that is larger than the first threshold.
- The camera device according to claim 1 or 2, wherein the image processing unit amplifies the intensity of the fluorescence image after the nonlinear conversion processing.
- The camera device according to any one of claims 1 to 3, wherein an amplification factor related to the amplification is variable by a user operation.
- The camera device according to any one of claims 2 to 4, wherein the first threshold and the second threshold are variable by a user operation.
- The camera device according to any one of claims 1 to 5, wherein the image processing unit has a lookup table holding a group of values changeable by a user operation, and executes the nonlinear conversion processing based on the lookup table.
- The camera device according to any one of claims 1 to 6, further comprising at least one selection unit that selects at least one of the visible image, the fluorescence image subjected to the image processing, and the superimposed image, and outputs the selection to at least one corresponding output unit.
- An image processing method in a camera device including a camera head and an image processing unit, the method comprising: performing, by the camera head, imaging based on visible light incident on a medical optical instrument from a target portion of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target portion; and amplifying, by the image processing unit, the intensity of the fluorescence image input from the camera head to make the black portions and white portions of the fluorescence image stand out, then executing nonlinear conversion processing on the amplified fluorescence image, superimposing the fluorescence image after the nonlinear conversion processing on a visible image obtained by the imaging based on the visible light, and generating a superimposed image to be output to an output unit.
- A camera system including a camera device and an output unit, wherein the camera device performs imaging based on visible light incident on a medical optical instrument from a target portion of a subject to which a fluorescent agent has been administered in advance, and imaging based on fluorescence incident on the medical optical instrument from the target portion, amplifies the intensity of the fluorescence image obtained by the imaging based on the fluorescence to make the black portions and white portions of the fluorescence image stand out, then executes nonlinear conversion processing on the amplified fluorescence image, superimposes the fluorescence image after the nonlinear conversion processing on a visible image obtained by the imaging based on the visible light, and generates a superimposed image to be output to the output unit, and the output unit outputs the superimposed image generated by the camera device.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201980028616.0A CN112040831A (zh) | 2018-05-31 | 2019-02-28 | 相机设备、图像处理方法和相机系统 |
| US16/979,090 US20200405152A1 (en) | 2018-05-31 | 2019-02-28 | Camera device, image processing method, and camera system |
| EP19811542.0A EP3804604A4 (en) | 2018-05-31 | 2019-02-28 | CAMERA DEVICE, IMAGE PROCESSING METHODS AND CAMERA SYSTEM |
| JP2020521710A JP7272670B2 (ja) | 2018-05-31 | 2019-02-28 | カメラ装置、画像処理方法およびカメラシステム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-105397 | 2018-05-31 | ||
| JP2018105397 | 2018-05-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019230095A1 true WO2019230095A1 (ja) | 2019-12-05 |
Family
ID=68698071
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/007801 Ceased WO2019230095A1 (ja) | 2018-05-31 | 2019-02-28 | カメラ装置、画像処理方法およびカメラシステム |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20200405152A1 (ja) |
| EP (1) | EP3804604A4 (ja) |
| JP (1) | JP7272670B2 (ja) |
| CN (1) | CN112040831A (ja) |
| WO (1) | WO2019230095A1 (ja) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020146405A (ja) * | 2019-03-15 | 2020-09-17 | ソニー・オリンパスメディカルソリューションズ株式会社 | 画像処理装置、画像処理方法およびプログラム |
| WO2022059293A1 (ja) * | 2020-09-15 | 2022-03-24 | Hoya株式会社 | 内視鏡用プロセッサ及び内視鏡システム |
| WO2023112916A1 (ja) * | 2021-12-16 | 2023-06-22 | i-PRO株式会社 | 映像信号処理装置、映像信号処理方法および映像信号処理システム |
| JP2023094165A (ja) * | 2021-12-23 | 2023-07-05 | 株式会社東芝 | 管内検査装置、管内検査方法、およびプログラム |
| JPWO2023127053A1 (ja) * | 2021-12-27 | 2023-07-06 | ||
| KR102692579B1 (ko) * | 2023-04-24 | 2024-08-06 | 주식회사 큐리오시스 | 명시야 이미지 보정기능이 있는 광학계 |
| WO2024225754A1 (ko) * | 2023-04-24 | 2024-10-31 | 주식회사 큐리오시스 | 필터블록 조립체 및 이를 이용하는 광학계 |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019215907A1 (ja) * | 2018-05-11 | 2019-11-14 | オリンパス株式会社 | 演算処理装置 |
| DE102022126824A1 (de) * | 2022-10-13 | 2024-04-18 | Karl Storz Se & Co. Kg | Verfahren zum Überlagern von Überlagerungsaufnahmeinformationen mit einem Livebild und eine entsprechende Vorrichtung |
| CN116158718A (zh) * | 2023-03-13 | 2023-05-26 | 武汉迈瑞医疗技术研究院有限公司 | 用于内窥镜系统的成像及显示方法和内窥镜系统 |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0728446A (ja) * | 1993-07-08 | 1995-01-31 | Canon Inc | 画像の2値化処理装置 |
| US6462770B1 (en) * | 1998-04-20 | 2002-10-08 | Xillix Technologies Corp. | Imaging system with automatic gain control for reflectance and fluorescence endoscopy |
| EP1301118B1 (en) * | 2000-07-14 | 2006-09-06 | Xillix Technologies Corp. | Compact fluorescence endoscopy video system |
| JP2002051969A (ja) * | 2000-08-08 | 2002-02-19 | Asahi Optical Co Ltd | 電子内視鏡装置 |
| JP2005319116A (ja) * | 2004-05-10 | 2005-11-17 | Pentax Corp | 蛍光観察内視鏡装置 |
| JP4599398B2 (ja) * | 2005-03-22 | 2010-12-15 | オリンパス株式会社 | 画像処理装置及び内視鏡装置 |
| US8355595B2 (en) * | 2007-05-15 | 2013-01-15 | Xerox Corporation | Contrast enhancement methods and apparatuses |
| EP2074933B1 (de) * | 2007-12-19 | 2012-05-02 | Kantonsspital Aarau AG | Verfahren zur Analyse und Bearbeitung von Fluoreszenzbildern |
| DE102008027905A1 (de) * | 2008-06-12 | 2009-12-17 | Olympus Winter & Ibe Gmbh | Verfahren und Endoskop zur Verbesserung von Endoskopbildern |
| US20110110589A1 (en) * | 2009-11-06 | 2011-05-12 | Kabushiki Kaisha Toshiba | Image Contrast Enhancement |
| JP5385350B2 (ja) * | 2011-08-16 | 2014-01-08 | 富士フイルム株式会社 | 画像表示方法および装置 |
| WO2013115389A1 (ja) * | 2012-02-01 | 2013-08-08 | 株式会社東芝 | 医用画像診断装置 |
| JP5993184B2 (ja) * | 2012-04-04 | 2016-09-14 | オリンパス株式会社 | 蛍光観察装置および蛍光観察装置の作動方法 |
| US9863767B2 (en) * | 2013-06-27 | 2018-01-09 | Panasonic Intellectual Property Corporation Of America | Motion sensor device having plurality of light sources |
| JP6533358B2 (ja) * | 2013-08-06 | 2019-06-19 | 三菱電機エンジニアリング株式会社 | 撮像装置 |
| US10368795B2 (en) * | 2014-06-30 | 2019-08-06 | Canfield Scientific, Incorporated | Acne imaging methods and apparatus |
| EP3205254B1 (en) * | 2016-02-15 | 2020-11-18 | Leica Instruments (Singapore) Pte. Ltd. | Medical inspection apparatus, such as a microscope or endoscope, using pseudocolors |
- 2019
- 2019-02-28 WO PCT/JP2019/007801 patent/WO2019230095A1/ja not_active Ceased
- 2019-02-28 EP EP19811542.0A patent/EP3804604A4/en not_active Withdrawn
- 2019-02-28 CN CN201980028616.0A patent/CN112040831A/zh active Pending
- 2019-02-28 US US16/979,090 patent/US20200405152A1/en not_active Abandoned
- 2019-02-28 JP JP2020521710A patent/JP7272670B2/ja active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0998939A (ja) * | 1995-08-03 | 1997-04-15 | Asahi Optical Co Ltd | 蛍光診断用内視鏡装置 |
| JP2001258820A (ja) * | 2000-01-13 | 2001-09-25 | Fuji Photo Film Co Ltd | 蛍光画像表示方法および装置 |
| JP2013039275A (ja) * | 2011-08-18 | 2013-02-28 | Olympus Corp | 蛍光観察装置および蛍光観察システム並びに蛍光画像処理方法 |
| JP2015054038A (ja) | 2013-09-11 | 2015-03-23 | オリンパスメディカルシステムズ株式会社 | 内視鏡装置 |
| JP2015173917A (ja) * | 2014-03-17 | 2015-10-05 | 黎明 李 | 腹腔鏡を用いた検査方法及び検査装置 |
| JP2018105397A (ja) | 2016-12-26 | 2018-07-05 | 日本精工株式会社 | ボールねじ |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3804604A4 |
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7459342B2 (ja) | 2019-03-15 | 2024-04-01 | ソニー・オリンパスメディカルソリューションズ株式会社 | 画像処理装置、内視鏡システム、画像処理方法およびプログラム |
| JP7257829B2 (ja) | 2019-03-15 | 2023-04-14 | ソニー・オリンパスメディカルソリューションズ株式会社 | 画像処理装置、画像処理方法およびプログラム |
| JP2023076608A (ja) * | 2019-03-15 | 2023-06-01 | ソニー・オリンパスメディカルソリューションズ株式会社 | 画像処理装置、内視鏡システム、画像処理方法およびプログラム |
| JP2020146405A (ja) * | 2019-03-15 | 2020-09-17 | ソニー・オリンパスメディカルソリューションズ株式会社 | 画像処理装置、画像処理方法およびプログラム |
| WO2022059293A1 (ja) * | 2020-09-15 | 2022-03-24 | Hoya株式会社 | 内視鏡用プロセッサ及び内視鏡システム |
| JP2022048866A (ja) * | 2020-09-15 | 2022-03-28 | Hoya株式会社 | 内視鏡用プロセッサ及び内視鏡システム |
| CN114901120A (zh) * | 2020-09-15 | 2022-08-12 | 豪雅株式会社 | 内窥镜用处理器以及内窥镜系统 |
| JP7229210B2 (ja) | 2020-09-15 | 2023-02-27 | Hoya株式会社 | 内視鏡用プロセッサ及び内視鏡システム |
| US12150622B2 (en) | 2020-09-15 | 2024-11-26 | Hoya Corporation | Processor for endoscope and endoscopic system |
| WO2023112916A1 (ja) * | 2021-12-16 | 2023-06-22 | i-PRO株式会社 | 映像信号処理装置、映像信号処理方法および映像信号処理システム |
| JP2023094165A (ja) * | 2021-12-23 | 2023-07-05 | 株式会社東芝 | 管内検査装置、管内検査方法、およびプログラム |
| WO2023127053A1 (ja) * | 2021-12-27 | 2023-07-06 | オリンパス株式会社 | 画像処理装置、光免疫治療システム、画像処理方法及び画像処理プログラム |
| JPWO2023127053A1 (ja) * | 2021-12-27 | 2023-07-06 | ||
| JP7621512B2 (ja) | 2021-12-27 | 2025-01-24 | オリンパス株式会社 | 画像処理装置、光免疫治療システム、画像処理方法及び画像処理プログラム |
| KR102692579B1 (ko) * | 2023-04-24 | 2024-08-06 | 주식회사 큐리오시스 | 명시야 이미지 보정기능이 있는 광학계 |
| WO2024225754A1 (ko) * | 2023-04-24 | 2024-10-31 | 주식회사 큐리오시스 | 필터블록 조립체 및 이를 이용하는 광학계 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3804604A4 (en) | 2022-04-13 |
| US20200405152A1 (en) | 2020-12-31 |
| EP3804604A1 (en) | 2021-04-14 |
| JPWO2019230095A1 (ja) | 2021-06-24 |
| CN112040831A (zh) | 2020-12-04 |
| JP7272670B2 (ja) | 2023-05-12 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7272670B2 (ja) | カメラ装置、画像処理方法およびカメラシステム | |
| CN107137053B (zh) | 使用伪彩色的诸如显微镜或内窥镜的医疗检查装置 | |
| US20200337540A1 (en) | Endoscope system | |
| US8451328B2 (en) | Image processing device, imaging device, computer-readable device, and image processing method for processing a fluorescence image | |
| US9513221B2 (en) | Fluorescence observation apparatus | |
| JP6461797B2 (ja) | 蛍光観察装置 | |
| US20170035280A1 (en) | Stereoscopic endoscope system with concurrent imaging at visible and infrared wavelengths | |
| CN107072520A (zh) | 以可见光波长和红外波长并行成像的内窥镜系统 | |
| EP3610779A1 (en) | Image acquisition system, control device, and image acquisition method | |
| CN106132276A (zh) | 荧光观察内窥镜系统 | |
| JP2015029841A (ja) | 撮像装置および撮像方法 | |
| JP6072374B2 (ja) | 観察装置 | |
| US12102296B2 (en) | Medical imaging system, medical imaging device, and operation method | |
| WO2017042980A1 (ja) | 蛍光観察装置および蛍光観察内視鏡装置 | |
| JP2022027195A (ja) | 3板式カメラ | |
| JP6535701B2 (ja) | 撮像装置 | |
| JP6896053B2 (ja) | 特に顕微鏡および内視鏡のための、蛍光発光性蛍光体のhdrモノクローム画像を作成するためのシステムおよび方法 | |
| WO2017212946A1 (ja) | 画像処理装置 | |
| US20200037865A1 (en) | Image processing device, image processing system, and image processing method | |
| JP6801990B2 (ja) | 画像処理システムおよび画像処理装置 | |
| WO2023090044A1 (ja) | 電子内視鏡用プロセッサ及び電子内視鏡システム | |
| WO2025243998A1 (ja) | 画像処理装置、画像処理方法およびプログラム | |
| JP2023076608A (ja) | 画像処理装置、内視鏡システム、画像処理方法およびプログラム | |
| JP2016043135A (ja) | 内視鏡装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19811542 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2020521710 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2019811542 Country of ref document: EP Effective date: 20210111 |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2019811542 Country of ref document: EP |