WO2016110984A1 - Image processing apparatus, operation method of image processing apparatus, operation program of image processing apparatus, and endoscope apparatus - Google Patents
- Publication number
- WO2016110984A1 (PCT/JP2015/050400)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal
- color
- unit
- component
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00011—Operational features of endoscopes characterised by signal transmission
- A61B1/00013—Operational features of endoscopes characterised by signal transmission using optical means
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/273—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the upper alimentary canal, e.g. oesophagoscopes, gastroscopes
- A61B1/2733—Oesophagoscopes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
Definitions
- The present invention relates to an image processing apparatus that performs signal processing on an imaging signal generated by an imaging element to generate an image signal, an operation method of the image processing apparatus, an operation program of the image processing apparatus, and an endoscope apparatus provided with the image processing apparatus.
- Endoscope apparatuses are widely used for various examinations in the medical and industrial fields.
- A medical endoscope apparatus acquires an in-vivo image by inserting into a subject, such as a patient, a long, narrow, flexible insertion portion having an imaging element with a plurality of pixels at its tip. Because the in-vivo image can be acquired without making an incision, the burden on the subject is small, and such apparatuses are becoming widespread.
- A white illumination light observation method using white illumination light, and a narrow band light observation method using narrow band illumination light (light in a wavelength band narrower than the white wavelength band), are widely known.
- With the narrow band light observation method, for example, an image can be obtained in which capillaries, fine mucosal patterns, and the like on the mucosal surface (living body surface) are highlighted.
- With the narrow band light observation method, a lesion on the mucosal surface of a living body can therefore be detected more accurately.
- An endoscope system has been proposed that can switch between a white illumination light observation mode, in which illumination light of the three primary colors R, G, and B is sequentially irradiated onto the tissue in the subject and a white illumination light observation image is generated from the reflected light, and a narrow band light observation mode, in which illumination light consisting of two narrow band lights respectively included in the wavelength bands of blue light and green light is sequentially irradiated and a narrow band light observation image is generated from the reflected light (see, for example, Patent Document 1).
- The two narrow band lights respectively included in the wavelength bands of blue light and green light differ, depending on the wavelength, in the absorption characteristics of hemoglobin in blood vessels and in the attenuation in the depth direction of the living body.
- The narrow band light included in the wavelength band of blue light captures the capillary and mucosal structures of the surface layer, while the narrow band light included in the wavelength band of green light can capture thicker blood vessels in deeper layers.
- To obtain a captured image with a single-plate imaging element, a color filter generally referred to as a Bayer array is formed on the light receiving surface of the imaging element, in which four filters that respectively transmit light in the red (R), green (G), green (G), and blue (B) wavelength bands are arranged as one filter unit, one filter per pixel.
- Each pixel receives light of the wavelength band transmitted through its filter and generates an electrical signal of the color component corresponding to that light. In the process of generating a color image, interpolation processing is therefore performed to interpolate, at each pixel, the signal values of the missing color components that did not pass through the filter.
- Such interpolation processing is called demosaicing processing.
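As a concrete illustration, the first step of a simple demosaicing pass can be sketched as follows. The helper below is hypothetical (not from the patent) and assumes an RGGB Bayer unit; it fills in only the missing G value at each R/B site, using the mean of the neighbouring G samples:

```python
def interpolate_green(mosaic):
    """Fill missing G samples of an RGGB Bayer mosaic (bilinear sketch).

    `mosaic` is a 2-D list of raw sensor samples. Only the G-plane step
    of demosaicing is shown; in a Bayer array every 4-neighbour of an
    R/B site is a G site, so averaging the in-bounds 4-neighbours gives
    the interpolated G value.
    """
    h, w = len(mosaic), len(mosaic[0])

    def is_g(i, j):
        return (i + j) % 2 == 1  # G sites of an RGGB unit

    out = [row[:] for row in mosaic]
    for i in range(h):
        for j in range(w):
            if is_g(i, j):
                continue  # G sample already present
            neigh = [mosaic[i + di][j + dj]
                     for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= i + di < h and 0 <= j + dj < w]
            out[i][j] = sum(neigh) / len(neigh)
    return out
```

A production demosaicer would go on to interpolate the R and B planes and typically uses edge-directed weighting rather than a plain average; this sketch only shows the structure of the interpolation.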
- The signal acquired by a G pixel (a pixel in which a G filter is arranged; the same definition applies to R pixels and B pixels) is referred to as a G signal (an R signal in the case of an R pixel, and a B signal in the case of a B pixel).
- A technique is disclosed in which the G signal is interpolated at the R pixel and B pixel positions, where it is missing, using the correlation of the surrounding G pixels; color difference signals (R-G signal and B-G signal) are then calculated using the interpolated G signal together with the R signal at the R pixel position and the B signal at the B pixel position, and these color difference signals are interpolated to the pixel positions where they are missing, again using the correlation of the surrounding G pixels (see, for example, Patent Document 2).
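The colour-difference step can be sketched as below. The helper names are hypothetical, and for brevity the missing differences are filled with the mean of all known differences; the patented technique instead weights by the correlation of the surrounding G pixels:

```python
def interpolate_red_via_color_difference(g_full, r_sparse):
    """Colour-difference interpolation sketch (hypothetical helper).

    `g_full` is the already-interpolated G plane; `r_sparse` holds R
    samples at R sites and None elsewhere. The R-G difference is formed
    at the R sites, spread to the missing sites (crudely, as the mean of
    all known differences), and G is added back to recover R.
    """
    h, w = len(g_full), len(g_full[0])
    diff = [[None if r_sparse[i][j] is None
             else r_sparse[i][j] - g_full[i][j]
             for j in range(w)] for i in range(h)]
    known = [d for row in diff for d in row if d is not None]
    mean_diff = sum(known) / len(known)  # stand-in for correlation weighting
    return [[g_full[i][j] + (diff[i][j] if diff[i][j] is not None
                             else mean_diff)
             for j in range(w)] for i in range(h)]
```

Interpolating the colour difference rather than R directly exploits the fact that R-G varies much more slowly across an image than R itself, which is why this scheme preserves resolution better than interpolating each plane independently.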
- An endoscope apparatus is also disclosed that emits white illumination light and narrow band illumination light included in the blue wavelength band (see, for example, Patent Document 3).
- There, the G pixel has a slight sensitivity to the blue narrow band light; the B signal component corresponding to the blue narrow band light captured by the G pixel is extracted by a correlation calculation based on the G signal, B signal, and R signal obtained under the illumination light described above, and the extracted B signal component is combined with the B signal generated by the B pixel from the blue narrow band light, producing an image in which surface capillaries are emphasized.
- The present invention has been made in view of the above, and provides an image processing apparatus, an operation method of the image processing apparatus, an operation program of the image processing apparatus, and an endoscope apparatus capable of obtaining a high-resolution image with either the white illumination light observation method or the narrow band light observation method.
- To solve the problem described above, an image processing apparatus according to the invention processes an imaging signal from an imaging element in which first pixels that generate first color signals of a first luminance component, which is the luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate second color signals of a second luminance component, which is the luminance component of narrow band illumination light whose band is narrower than that of the white illumination light, and third pixels that generate third color signals of color components different from the first and second luminance components are arranged in a matrix, the density of the first pixels being higher than the density of each of the second and third pixels. The apparatus includes a specific frequency component extraction unit that extracts a specific frequency component signal having a predetermined spatial frequency component from the color signal of the first luminance component among the color signals of the respective color components generated by interpolation, and a specific frequency component addition unit that adds the specific frequency component signal extracted by the specific frequency component extraction unit to color signals of color components different from the first luminance component.
- An operation method of an image processing apparatus according to the invention likewise operates on an imaging signal from an imaging element in which first pixels that generate first color signals of a first luminance component, which is the luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate second color signals of a second luminance component, and third pixels that generate third color signals of color components different from the first and second luminance components are arranged in a matrix, the density of the first pixels being higher than the density of each of the second and third pixels.
- An operation program of an image processing apparatus according to the invention likewise assumes an imaging element in which first pixels that generate first color signals of a first luminance component, which is the luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate second color signals of a second luminance component, and third pixels that generate third color signals of color components different from the first and second luminance components are arranged in a matrix, the density of the first pixels being higher than the density of each of the second and third pixels. The program causes an addition unit of the image processing apparatus to perform a specific frequency component addition procedure of adding the specific frequency component signal extracted by the specific frequency component extraction unit to a color signal of a color component different from the first luminance component.
- An endoscope apparatus according to the invention includes: a light source unit that emits either white illumination light including light in the red, green, and blue wavelength bands or narrow band illumination light consisting of light in a band narrower than the wavelength band of the white illumination light; an imaging element in which first pixels that generate first color signals of a first luminance component, which is the luminance component of the white illumination light, second pixels that generate second color signals of a second luminance component, which is the luminance component of the narrow band illumination light, and third pixels that generate third color signals of color components different from the first and second luminance components are arranged in a matrix, the density of the first pixels being higher than the densities of the second and third pixels; and the image processing apparatus according to the above invention.
- According to the present invention, an image of high resolution can be obtained with either the white illumination light observation method or the narrow band light observation method.
- FIG. 1 is a view showing a schematic configuration of an endoscope apparatus according to an embodiment of the present invention.
- FIG. 2 is a schematic view showing a schematic configuration of an endoscope apparatus according to an embodiment of the present invention.
- FIG. 3 is a schematic view showing a configuration of a pixel of the imaging device according to the embodiment of the present invention.
- FIG. 4 is a schematic view showing an example of the configuration of a color filter according to an embodiment of the present invention.
- FIG. 5 is a graph showing the relationship between the wavelength of the illumination light emitted by the illumination unit of the endoscope apparatus according to the embodiment of the present invention and the amount of light.
- FIG. 6 is a graph showing the relationship between the wavelength of the illumination light and the transmittance by the switching filter of the illumination unit of the endoscope apparatus according to the embodiment of the present invention.
- FIG. 7 is a block diagram for explaining the configuration of the pre-processing unit of the processor unit according to the embodiment of the present invention.
- FIG. 8 is a schematic view illustrating frequency characteristics of interpolation filter processing performed by the interpolation processing unit according to the embodiment of the present invention.
- FIG. 9 is a block diagram for explaining the configuration of the main part of the interpolation processing unit of the processor unit according to the embodiment of the present invention.
- FIG. 10 is a schematic diagram for explaining the demosaicing process performed by the processor unit according to the embodiment of the present invention.
- FIG. 11 is a schematic diagram for explaining the demosaicing process performed by the processor unit according to the embodiment of the present invention.
- FIG. 12 is a block diagram for explaining the configuration of the main part of the interpolation processing unit of the processor unit according to the embodiment of the present invention.
- FIG. 13 is a schematic diagram for explaining the frequency characteristics of the filtering process performed by the filter processing unit according to the embodiment of the present invention.
- FIG. 14 is a schematic view illustrating the frequency characteristic of the filtering process performed by the filter processing unit according to the embodiment of the present invention.
- FIG. 15 is a schematic view illustrating a signal after filtering processing performed by the filter processing unit according to the embodiment of the present invention.
- FIG. 16 is a schematic diagram for explaining signals after clip processing by the clip processing unit according to the embodiment of the present invention.
- FIG. 17 is a schematic view illustrating gain processing performed by the gain processing unit according to the embodiment of the present invention.
- FIG. 18 is a block diagram for explaining the configuration of the main part of the interpolation processing unit of the processor unit according to the embodiment of the present invention.
- FIG. 19 is a flowchart illustrating signal processing performed by the processor unit according to the embodiment of the present invention.
- FIG. 20 is a schematic view showing a schematic configuration of an endoscope apparatus according to a modification of the embodiment of the present invention.
- FIG. 21 is a schematic view showing a configuration of a rotary filter of a light source unit according to a modification of the embodiment of the present invention.
- FIG. 22 is a graph showing the relationship between the wavelength of the illumination light by the filter of the illumination unit of the endoscope apparatus according to the modification of the embodiment of the present invention and the transmittance.
- In the following, a medical endoscope apparatus that includes an image processing apparatus according to the present invention and that captures and displays an image of the interior of a subject such as a patient will be described. The present invention is not limited by this embodiment. In the description of the drawings, the same parts are denoted by the same reference numerals.
- Here, a narrow band is any wavelength range narrower than the wavelength band of white light; it may include wavelength bands outside the range of white (visible) light, for example infrared or ultraviolet.
- FIG. 1 is a view showing a schematic configuration of an endoscope apparatus 1 according to an embodiment of the present invention.
- FIG. 2 is a schematic view showing a schematic configuration of the endoscope apparatus 1 according to the embodiment of the present invention.
- The endoscope apparatus 1 shown in FIGS. 1 and 2 includes an endoscope 2 that generates an electric signal by inserting its insertion portion 21 into a subject and picking up an in-vivo image of the observation site, together with the light source unit 3 and processor unit 4 connected to the endoscope 2.
- the endoscope apparatus 1 inserts the insertion unit 21 into a subject such as a patient to acquire an in-vivo image of the inside of the subject.
- An operator such as a doctor examines the acquired in-vivo image to examine the presence or absence of a bleeding site or a tumor site as a detection target site.
- The endoscope 2 has an elongated, flexible insertion portion 21; an operation portion 22 that is connected to the proximal end side of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the extending direction of the insertion portion 21 and incorporates various cables connected to the light source unit 3 and the processor unit 4.
- The insertion unit 21 has a distal end portion 24 that incorporates an imaging element 202, in which pixels (photodiodes) that receive light are arranged in a grid (matrix) and which generates an image signal by photoelectric conversion of the light received by the pixels; a bendable bending portion 25 constituted by a plurality of bending pieces; and an elongated flexible tube portion 26 connected to the base end side of the bending portion 25.
- The operation unit 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and horizontal directions; a treatment tool insertion portion 222 through which a treatment tool such as biopsy forceps, an electric knife, or an inspection probe is inserted into the subject; and a plurality of switches 223 for inputting an instruction signal for switching the illumination light of the light source unit 3, an operation instruction signal for the treatment tool or an external device connected to the processor unit 4, a water supply instruction signal, a suction instruction signal, and the like.
- The treatment tool inserted from the treatment tool insertion portion 222 emerges from an opening (not shown) at the distal end portion 24 via a treatment tool channel (not shown).
- the universal cord 23 incorporates at least the light guide 203 and a collective cable in which one or more signal lines are put together.
- The collective cable contains signal lines for transmitting and receiving signals between the endoscope 2 and the light source unit 3 and processor unit 4, including signal lines for setting data, for image signals, and for the drive timing signal that drives the imaging element 202.
- the endoscope 2 also includes an imaging optical system 201, an imaging element 202, a light guide 203, an illumination lens 204, an A / D conversion unit 205, and an imaging information storage unit 206.
- the imaging optical system 201 is provided at the distal end portion 24 and condenses at least light from the observation site.
- the imaging optical system 201 is configured using one or more lenses.
- the imaging optical system 201 may be provided with an optical zoom mechanism for changing the angle of view and a focusing mechanism for changing the focus.
- the imaging element 202 is provided perpendicular to the optical axis of the imaging optical system 201, and photoelectrically converts an image of light formed by the imaging optical system 201 to generate an electrical signal (image signal).
- the imaging device 202 is realized by using a charge coupled device (CCD) image sensor, a complementary metal oxide semiconductor (CMOS) image sensor, or the like.
- FIG. 3 is a schematic view showing the configuration of the pixel of the imaging device 202.
- In the imaging element 202, a plurality of pixels that receive light from the imaging optical system 201 are arranged in a lattice (matrix), and the imaging element 202 generates an electrical signal (also referred to as an image signal) by photoelectric conversion of the light received by each pixel.
- the electrical signal includes the pixel value (brightness value) of each pixel, positional information of the pixel, and the like.
- the pixel arranged in the i-th row and the j-th column is described as a pixel P ij (i and j are natural numbers).
- A color filter 202a having a plurality of filters, each transmitting light of an individually set wavelength band, is provided between the imaging optical system 201 and the imaging element 202.
- the color filter 202 a is provided on the light receiving surface of the imaging element 202.
- FIG. 4 is a schematic view showing an example of the configuration of the color filter 202a.
- The color filter 202a consists of filter units U1, each composed of four filters arranged in a matrix of 2 rows and 2 columns, arranged in a matrix according to the arrangement of the pixels P ij; that is, the color filter 202a is formed by repeatedly arranging the filter arrangement of the filter unit U1 as a basic pattern.
- One filter that transmits light of a predetermined wavelength band is disposed on the light receiving surface of each pixel. Therefore, the pixel P ij provided with the filter receives the light in the wavelength band transmitted by the filter.
- the pixel P ij provided with a filter that transmits light in the green wavelength band receives light in the green wavelength band.
- the pixel P ij that receives light in the green wavelength band is referred to as a G pixel.
- a pixel that receives light in the blue wavelength band is called a B pixel
- a pixel that receives light in the red wavelength band is called an R pixel.
- The filter unit U1 transmits light in the blue (B) wavelength band H_B, the green (G) wavelength band H_G, and the red (R) wavelength band H_R. It uses a blue filter (B filter) that transmits light in the wavelength band H_B, green filters (G filters) that transmit light in the wavelength band H_G, and a red filter (R filter) that transmits light in the wavelength band H_R; the two G filters are arranged on one diagonal and the B filter and R filter on the other, forming a so-called Bayer arrangement.
- the density of the G filter is higher than the density of the B filter and the R filter.
- the density of G pixels is higher than the density of B pixels and R pixels.
- For example, the blue, green, and red wavelength bands are H_B = 380 nm to 500 nm, H_G = 480 nm to 600 nm, and H_R = 580 nm to 650 nm.
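The layout and pass bands above can be mirrored in a small sketch. The helper is hypothetical, and which corner of the 2x2 unit carries R versus B is an assumed convention (only the diagonals are fixed by the text):

```python
# Pass bands quoted in the text: H_B 380-500 nm, H_G 480-600 nm,
# H_R 580-650 nm (note the deliberate overlaps between adjacent bands).
BANDS_NM = {"B": (380, 500), "G": (480, 600), "R": (580, 650)}

def filter_color(i, j):
    """Color of the filter over pixel P_ij (i, j 1-indexed).

    Filter unit U1: two G filters on one diagonal, B and R on the
    other (Bayer arrangement).
    """
    if (i + j) % 2 == 1:
        return "G"                        # G on the diagonal
    return "R" if i % 2 == 1 else "B"     # R and B on the anti-diagonal
```

Counting colors over any even-sized window confirms the density relation stated above: G sites are twice as dense as R or B sites.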
- the light guide 203 is made of glass fiber or the like, and forms a light guide path of the light emitted from the light source unit 3.
- the illumination lens 204 is provided at the tip of the light guide 203, diffuses the light guided by the light guide 203, and emits the light to the outside of the tip 24.
- the A / D conversion unit 205 performs A / D conversion on the electrical signal generated by the imaging element 202 and outputs the converted electrical signal to the processor unit 4.
- the A / D conversion unit 205 converts the electrical signal generated by the imaging element 202 into, for example, 12-bit digital data (image signal).
- the imaging information storage unit 206 stores data including various programs for operating the endoscope 2, various parameters necessary for the operation of the endoscope 2, identification information of the endoscope 2, and the like.
- the imaging information storage unit 206 further includes an identification information storage unit 261 that stores identification information.
- The identification information includes unique information (ID) of the endoscope 2, the model year, specification information, the transmission method, and arrangement information of the filters applied to the color filter 202a.
- the imaging information storage unit 206 is realized using a flash memory or the like.
- the light source unit 3 includes an illumination unit 31 and an illumination control unit 32.
- the illumination unit 31 switches and emits a plurality of illumination lights having different wavelength bands under the control of the illumination control unit 32.
- the illumination unit 31 includes a light source 31a, a light source driver 31b, a switching filter 31c, a drive unit 31d, a drive driver 31e, and a condensing lens 31f.
- The light source 31a emits, under the control of the illumination control unit 32, white illumination light including light in the red, green, and blue wavelength bands H_R, H_G, and H_B.
- the white illumination light generated by the light source 31 a is emitted from the distal end portion 24 to the outside via the switching filter 31 c, the condenser lens 31 f and the light guide 203.
- the light source 31 a is realized using a light source that emits white light, such as a white LED or a xenon lamp.
- the light source driver 31 b causes the light source 31 a to emit white illumination light by supplying current to the light source 31 a under the control of the illumination control unit 32.
- the switching filter 31c transmits only the narrow band light of blue and the narrow band light of green among the white illumination light emitted from the light source 31a.
- the switching filter 31 c is detachably disposed on the optical path of the white illumination light emitted by the light source 31 a under the control of the illumination control unit 32.
- the switching filter 31 c transmits only two narrow band lights by being disposed on the light path of the white illumination light.
- The switching filter 31c transmits narrow band illumination light consisting of light in a narrow band T_B (for example, 400 nm to 445 nm) included in the wavelength band H_B and a narrow band T_G (for example, 530 nm to 550 nm) included in the wavelength band H_G. The narrow bands T_B and T_G are wavelength bands of blue light and green light that are easily absorbed by hemoglobin in blood. The narrow band T_B should include at least 405 nm to 425 nm.
- Light limited to and emitted in these bands is called narrow band illumination light, and observation of an image under narrow band illumination light is called the narrow band light observation (NBI) method.
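As an idealised model of the switching filter described above (a sketch, not the patent's characterization; a real filter has sloped band edges rather than sharp cutoffs), the two pass bands can be expressed as:

```python
# Narrow bands quoted in the text: T_B 400-445 nm, T_G 530-550 nm.
NARROW_BANDS_NM = ((400, 445), (530, 550))

def transmits(wavelength_nm):
    """True if the idealised switching filter 31c passes this wavelength."""
    return any(lo <= wavelength_nm <= hi for lo, hi in NARROW_BANDS_NM)
```

Everything else in the white spectrum, including the 480-530 nm region between the two bands, is blocked, which is what restricts the illumination to the hemoglobin-absorbing wavelengths.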
- The driving unit 31d is configured using a stepping motor, a DC motor, or the like, and moves the switching filter 31c into and out of the light path of the light source 31a.
- the drive driver 31 e supplies a predetermined current to the drive unit 31 d under the control of the illumination control unit 32.
- the condensing lens 31 f condenses the white illumination light emitted from the light source 31 a or the narrowband illumination light transmitted through the switching filter 31 c, and emits the light to the outside (light guide 203) of the light source unit 3.
- The illumination control unit 32 controls the light source driver 31b to turn the light source 31a on and off, and controls the drive driver 31e to insert the switching filter 31c into and remove it from the light path of the light source 31a, thereby controlling the type (band) of the emitted illumination light.
- That is, the illumination control unit 32 moves the switching filter 31c into or out of the light path of the light source 31a to switch the illumination light emitted from the illumination unit 31 between white illumination light and narrow band illumination light. In other words, the illumination control unit 32 switches between the white illumination light observation (WLI) method using white illumination light including light in the wavelength bands H_B, H_G, and H_R, and the narrow band light observation (NBI) method using narrow band illumination light consisting of light in the narrow bands T_B and T_G.
- In the white illumination light observation (WLI) method, the green component (wavelength band H_G) is the luminance component (first luminance component), while in the narrow band light observation (NBI) method the blue component (narrow band T_B) is the luminance component (second luminance component). Therefore, in the imaging element 202 according to the present embodiment, the G pixel corresponds to the first pixel, the B pixel corresponds to the second pixel, and the R pixel corresponds to the third pixel.
- the luminance component in the present invention means, for example, a color component which is a main component of the luminance signal Y of the XYZ color system described later.
- In white illumination light observation, the green component, to which the human eye has the highest relative luminous sensitivity and in which blood vessels and duct structures of a living body are clearly depicted, is the luminance component.
- In narrow band light observation, the luminance component selected differs depending on the subject: the green component may be selected as in white illumination light observation, or a different component may be used. Representative examples are the blue component and the red component; in the NBI observation described above, the blue component, in which blood vessels and duct structures on the surface of a living body are clearly depicted, is the luminance component.
- the green component is used as a luminance component in white illumination light observation
- the blue component is used as a luminance component in narrowband light observation.
- FIG. 5 is a graph showing the relationship between the wavelength of the illumination light emitted by the illumination unit 31 and the amount of light.
- FIG. 6 is a graph showing the relationship between the wavelength of the illumination light by the switching filter 31c of the illumination unit 31 and the transmittance.
- the illumination unit 31 emits white illumination light including light in the wavelength bands H B , H G and H R , as shown by the white light spectrum in FIG. 5.
- under the control of the illumination control unit 32, when the switching filter 31c is inserted in the optical path of the light source 31a, the illumination unit 31 emits narrowband illumination light consisting of light in the narrow bands T B and T G (see FIG. 6).
- the processor unit 4 includes an image processing unit 41, an input unit 42, a storage unit 43, and a control unit 44.
- the image processing unit 41 executes predetermined image processing based on the electric signal from the endoscope 2 (A / D conversion unit 205) to generate image information to be displayed by the display unit 5.
- the image processing unit 41 includes a pre-processing unit 411, an interpolation processing unit 412, and a display image generation processing unit 413.
- the pre-processing unit 411 performs optical black (OB) clamp processing, noise reduction (NR) processing and white balance (WB) processing on the electric signal from the A / D conversion unit 205, and outputs the image signal after the signal processing to the interpolation processing unit 412.
- FIG. 7 is a block diagram for explaining the configuration of the pre-processing unit 411.
- the pre-processing unit 411 includes an OB processing unit 4111, an NR processing unit 4112, and a WB processing unit 4113.
- the OB processing unit 4111 performs OB clamping processing on the R signal, the G signal, and the B signal of the image signal input from the A / D conversion unit 205.
- OB clamping process an average value of a predetermined area corresponding to the optical black area is calculated based on the electric signal input from the endoscope 2 (A / D conversion unit 205), and the average value is calculated from the electric signal. By subtracting, the black level is corrected to a zero value.
- the NR processing unit 4112 acquires from the control unit 44 observation method information indicating whether the current observation method is the WLI method or the NBI method, changes the noise reduction amount according to the observation method information, and performs noise reduction processing on the image signal subjected to the OB clamp processing.
- the WB processing unit 4113 performs white balance processing based on the observation method information on the image signal subjected to the noise reduction processing, and outputs the image signal after the white balance processing to the interpolation processing unit 412.
- when the signals of the channels (color components) obtained by the narrowband light observation (NBI) method are two (the G signal and the B signal), the WB processing unit 4113 performs balance correction processing between these two channels and multiplies the remaining channel (the R signal in the first embodiment) by zero.
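The two-channel balance just described can be sketched as below. Balancing by equalizing the channel means is an assumption made for illustration; the patent does not specify how the balance coefficients are computed, only that the unused R channel is multiplied by zero.

```python
# Sketch of NBI white balance: balance the two available channels
# (G and B) against each other and zero out the unused R channel.

def wb_nbi(r, g, b):
    # assumed balancing rule: scale B so its mean matches the G mean
    gain_b = (sum(g) / len(g)) / (sum(b) / len(b))
    return ([0.0 for _ in r],            # R channel multiplied by zero
            list(g),
            [v * gain_b for v in b])
```

In the WLI case all three channels would instead receive ordinary white balance gains.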
- the interpolation processing unit 412 determines the interpolation direction from the correlation of color information (pixel values) of a plurality of pixels based on the image signal input from the pre-processing unit 411, and generates a color image signal having signals of at least two color components by performing interpolation based on the color information of the pixels aligned in the determined interpolation direction.
- the interpolation processing unit 412 includes a G interpolation color difference calculating unit 412a, a color difference interpolating unit 412b, a color image signal generating unit 412c, a G signal specific frequency component extracting unit 421d, and a specific frequency component adding unit 412e.
- the G interpolation color difference calculation unit 412a interpolates, based on the peripheral pixels, a G signal at each pixel where the G signal is missing (R pixel or B pixel) in the image signal input from the pre-processing unit 411 (hereinafter, the interpolated G signal), and outputs a G signal image in which every pixel position has either the G signal or the interpolated G signal. That is, through the interpolation processing of the G interpolation color difference calculation unit 412a, an image signal constituting one image in which each pixel has a G-component pixel value or an interpolated value is generated.
- the G interpolation color difference calculation unit 412a generates an RG signal or a BG signal which is a color difference signal obtained by taking the color difference between the signal of each color component and the interpolation G signal according to the position of the R pixel or B pixel. , Output as a color difference image signal.
- the G interpolation color difference calculating unit 412a outputs the generated G signal image to the color image signal generating unit 412c, and outputs the color difference image signal to the color difference interpolating unit 412b.
- the color difference interpolation unit 412b interpolates the color difference signal missing at each pixel position with respect to the color difference image signal input from the G interpolation color difference calculation unit 412a, and the color difference image signal in which all pixel positions have color difference signals It is output to the image signal generation unit 412c. That is, by the interpolation processing of the color difference interpolation unit 412b, an image signal that constitutes one image in which each pixel has a value of the color difference signal RG or BG is generated.
- the color image signal generation unit 412c applies an interpolation filter to the G signal, the RG signal, and the BG signal generated by the G interpolation color difference calculation unit 412a and the color difference interpolation unit 412b.
- FIG. 8 is a schematic diagram for explaining the frequency characteristics of interpolation filter processing performed by the interpolation processing unit 412 according to the embodiment of the present invention, and is a graph showing the response of the signal to the frequency.
- the color image signal generation unit 412c applies an interpolation filter having low-pass filter characteristics as shown in FIG. 8 at half pixel positions in the horizontal direction and the vertical direction.
- the frequency characteristics are periodically different between the signal at the pixel position originally provided to the image pickup element 202 and the signal generated by the interpolation processing using the signal values of the peripheral pixels (in particular, the G signal has the Nyquist frequency) Make the period of NF) spatially uniform.
- the color image signal generation unit 412 c adds the G signal (including the interpolation G signal) at each pixel position and the color difference signal (the BG signal or the RG signal) after the above-described interpolation filter processing.
- the RGB signal or the GB signal is generated by the above-described addition, and the generated signal is output to the display image generation processing unit 413 as a color image signal.
- when the observation method is the WLI method, the color image signal generation unit 412c acquires the color difference image signal having the BG signal and the color difference image signal having the RG signal from the color difference interpolation unit 412b, and generates signals of the R component, G component and B component (RGB signals).
- when the observation method is the NBI method, since there is no light of the R component, the color image signal generation unit 412c acquires only the color difference image signal having the BG signal from the BG interpolation unit 4004, and generates signals of the G component and B component (GB signals).
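The recovery of the color components from the G signal and the color difference signals follows directly from the definitions above: B = (B-G) + G and R = (R-G) + G. A minimal sketch, with plain per-pixel lists standing in for image signals:

```python
# Sketch: rebuild color components by adding the interpolated color
# difference signals back onto the G signal. In the NBI case no R-G
# signal exists, so only G and B components are produced.

def generate_color_signals(g, bg, rg=None):
    b = [d + v for d, v in zip(bg, g)]
    if rg is None:                  # NBI: no R component light
        return None, list(g), b
    r = [d + v for d, v in zip(rg, g)]
    return r, list(g), b
```

This mirrors why only a GB signal can be generated under the NBI method.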
- the G signal specific frequency component extraction unit 421d extracts, from the G signal, which is the luminance component signal of the WLI method among the red, green and blue component signals generated by interpolation, a specific frequency component signal having a predetermined spatial frequency component as a component.
- specifically, the G signal specific frequency component extraction unit 421d receives the G signal among the RGB signals generated by the color image signal generation unit 412c and extracts from it a specific frequency component signal having frequency components corresponding to structures such as blood vessel regions and dark areas (dark portions) of recesses.
- the spatial frequency component is, for example, amplitude information of pixel values for each spatial frequency band obtained by converting a color space constituting a predetermined color system such as R, G, B, etc. into a frequency space.
- the specific frequency component adder 412e receives the R signal and / or B signal generated by the color image signal generator 412c and the specific frequency component signal extracted by the G signal specific frequency component extractor 421d, and adds a specific frequency component signal corresponding to each color component to the R signal or the B signal.
- the specific frequency component adder 412e acquires observation mode information (information indicating whether the illumination method is the white illumination light observation (WLI) method or the narrowband light observation (NBI) method); if the illumination method is the WLI method, it adds the specific frequency component signal corresponding to each color component to at least one of the R signal and the B signal, and if it is the NBI method, it adds the specific frequency component signal corresponding to the B component to the B signal.
- the display image generation processing unit 413 performs, on the color image signal generated by the interpolation processing unit 412, color conversion processing into the color space of the display unit 5 (for example, sRGB), gradation processing based on predetermined gradation conversion characteristics, enlargement processing, and structure enhancement processing of structures such as capillary blood vessels of the mucosal surface layer and mucosal fine patterns. After performing the predetermined processing, the display image generation processing unit 413 outputs the processed signal to the display unit 5 as a display image signal.
- the input unit 42 is an interface for inputs from the operator or the like to the processor unit 4, and includes a power switch for turning the power on and off, a mode switching button for switching the imaging mode and various other modes, and an illumination light switching button for switching the illumination light of the light source unit 3.
- the storage unit 43 records various programs for operating the endoscope apparatus 1, various parameters necessary for the operation of the endoscope apparatus 1, data necessary for image processing such as white balance coefficients according to the observation method, a program for executing the image processing according to the present invention, and the like. In addition, the storage unit 43 may store information related to the endoscope 2, for example, a relation table between the unique information (ID) of the endoscope 2 and information related to the filter arrangement of the color filter 202a.
- the storage unit 43 is realized using a semiconductor memory such as a flash memory or a dynamic random access memory (DRAM).
- the control unit 44 is configured using a CPU or the like, and performs drive control of each component including the endoscope 2 and the light source unit 3 and input / output control of information with respect to each component.
- the control unit 44 transmits, to the endoscope 2 via a predetermined signal line, setting data for imaging control (for example, pixels to be read) recorded in the storage unit 43, a timing signal for imaging timing, and the like.
- the control unit 44 outputs the color filter information (identification information) acquired through the imaging information storage unit 206 to the image processing unit 41, and outputs information on the insertion / removal operation (arrangement) of the switching filter 31c to the light source unit 3.
- the display unit 5 receives the display image signal generated by the processor unit 4 via the video cable, and displays the in-vivo image corresponding to the display image signal.
- the display unit 5 is configured using a liquid crystal or organic EL (Electro Luminescence) display.
- FIG. 9 is a block diagram for explaining the configuration of the main part of the interpolation processing unit 412, and shows the configuration of the G interpolation color difference calculation unit 412a, the color difference interpolation unit 412b, and the color image signal generation unit 412c.
- the color difference interpolation unit 412b includes a color difference separation unit 4001, a BG correlation determination unit 4002, a BG interpolation direction determination unit 4003, a BG interpolation unit 4004, an RG correlation determination unit 4005, an RG interpolation direction determination unit 4006 and an RG interpolation unit 4007.
- the color difference separation unit 4001 separates the color difference image signal output from the G interpolation color difference calculation unit 412a into the BG signal and the RG signal, outputs the BG signal to the BG correlation determination unit 4002 and the BG interpolation direction determination unit 4003, and outputs the RG signal to the RG correlation determination unit 4005 and the RG interpolation direction determination unit 4006.
- FIG. 10 is a schematic diagram for explaining the demosaicing process performed by the processor unit (interpolation processing unit 412).
- the separated BG signal and RG signal are arranged according to the B pixel position and the R pixel position as shown in the schematic view of FIG.
- the BG correlation determination unit 4002 sets an R pixel having an RG signal as the target pixel and calculates the correlation of the BG signals of the pixels adjacent to the target pixel. FIG. 11 is a schematic diagram for explaining the demosaicing process performed by the processor unit (interpolation processing unit 412). Specifically, as shown in FIG. 11, the BG correlation determination unit 4002 sets the coordinates of the pixel of interest (pixel P ij ) to (k, l) and considers the BG signals at the four adjacent pixel positions.
- the color difference signal values are f B-G (k-1, l-1), f B-G (k+1, l-1), f B-G (k-1, l+1) and f B-G (k+1, l+1)
- the correlation value Ss in the obliquely upward direction is calculated based on the following expression (1).
- the obliquely upward direction is a direction from the lower left to the upper right in the arrangement of the pixels shown in FIG. 3
- the obliquely downward direction is a direction from the upper left to the lower right in the arrangement of the pixels shown in FIG. 3.
- at the edge of the image, the signal value of the pixel at the folded-back (mirrored) position is used.
- the BG correlation determination unit 4002 calculates the correlation value Sb in the diagonally lower direction based on the following expression (2).
- although expressions (1) and (2) are calculated using the signal values of two pixels located in the oblique direction, the calculation is not limited to this; the reliability of the calculated correlation value can be improved by also using the BG signals of pixels farther away in the same direction, centered on the target pixel.
- the BG correlation determination unit 4002 selects the smaller of the correlation values Ss and Sb, and determines that the correlation is higher in the direction corresponding to that value.
- when the difference between the correlation values is small (the absolute difference falls below the threshold), the BG correlation determination unit 4002 determines that there is no correlation in a specific direction.
- the BG correlation discrimination unit 4002 outputs, to the BG interpolation direction discrimination unit 4003, determination information indicating one of “obliquely upward direction”, “obliquely downward direction” and “no correlation in specific direction”.
- the threshold is set as a value in consideration of noise included in the signal.
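The direction discrimination above can be sketched as follows. Expressions (1) and (2) are not reproduced in this excerpt, so the absolute difference of the two B-G values along each diagonal is used here as a plausible stand-in, and the "no correlation" test compares |Ss - Sb| with the threshold; both are assumptions for illustration.

```python
# Sketch of the discrimination in the B-G correlation determination
# unit 4002: compute a correlation value per diagonal and pick the
# direction of higher correlation (smaller value), or report none.

def discriminate_direction(fbg, k, l, threshold):
    """fbg maps (x, y) pixel coordinates to B-G color difference values."""
    # obliquely upward: lower-left vs upper-right neighbor
    ss = abs(fbg[(k - 1, l + 1)] - fbg[(k + 1, l - 1)])
    # obliquely downward: upper-left vs lower-right neighbor
    sb = abs(fbg[(k - 1, l - 1)] - fbg[(k + 1, l + 1)])
    if abs(ss - sb) <= threshold:
        return "no correlation in specific direction"
    # the smaller difference means the higher correlation in that direction
    return "obliquely upward" if ss < sb else "obliquely downward"
```

The threshold plays the noise-tolerance role described above: a larger threshold more readily falls back to the direction-free interpolation of expression (5).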
- the BG interpolation direction determination unit 4003 uses the determination information from the BG correlation determination unit 4002 and the color difference signal value of the BG signal to calculate the interpolated color difference signal value f B-G (k, l) of the BG signal at the pixel of interest (k, l) by any one of the following expressions (3) to (5). When the determination information is "obliquely upward", the BG interpolation direction determination unit 4003 calculates the interpolated color difference signal value f B-G (k, l) of the BG signal at the pixel of interest (k, l) based on expression (3).
- when the determination information is "obliquely downward", the BG interpolation direction determination unit 4003 calculates the interpolated color difference signal value f B-G (k, l) of the BG signal at the pixel of interest (k, l) based on expression (4).
- when the determination information is "no correlation in specific direction", the BG interpolation direction determination unit 4003 calculates the interpolated color difference signal value f B-G (k, l) of the BG signal at the pixel of interest (k, l) based on the following expression (5).
- although the above expression (5) uses the average value of the BG signals of the four surrounding pixels, interpolation using the BG signals of sixteen or more surrounding pixels, which can preserve higher spatial frequencies, may be performed.
- by calculating the interpolated color difference signal value f B-G (k, l) for the pixel of interest (k, l), the BG interpolation direction determination unit 4003 outputs to the BG interpolation unit 4004 a BG signal in which the color difference signals, including the interpolated ones, are arranged in a checkered pattern.
- the BG interpolation unit 4004 calculates an interpolated color difference signal value of the BG signal at each pixel position missing from the color difference signal (BG signal) received from the BG interpolation direction determination unit 4003. For example, the BG interpolation unit 4004 calculates the interpolated value f B-G (k, l-1) of the missing pixel position (k, l-1) in the pixel arrangement shown in FIG. 11 based on the following expression (6). Although expression (6) uses the average value of the BG signals of the four surrounding pixels, interpolation using the BG signals of sixteen or more surrounding pixels, which can preserve higher spatial frequencies, may be performed.
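The four-neighbor averaging referred to in expressions (5) and (6) can be sketched as below: the value at a missing position is the mean of the four nearest pixels that carry a B-G value, which in the checkered arrangement are the left, right, upper and lower neighbors of that position.

```python
# Sketch of the four-neighbor average used to fill a missing B-G
# position in the checkered color difference signal.

def interpolate_missing_bg(fbg, k, l):
    """fbg maps (x, y) coordinates to existing B-G values; (k, l) is a
    position where the B-G signal is missing."""
    neighbors = [fbg[(k - 1, l)], fbg[(k + 1, l)],
                 fbg[(k, l - 1)], fbg[(k, l + 1)]]
    return sum(neighbors) / len(neighbors)
```

A larger neighborhood (sixteen or more pixels, as the text notes) would trade simplicity for better preservation of high spatial frequencies.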
- the BG interpolation unit 4004 outputs a color difference image signal having BG signals at all pixel positions to the color image signal generation unit 412c by calculating the interpolation color difference signal value with respect to the missing pixel position of the BG signal. That is, by the interpolation processing of the BG interpolation unit 4004, color difference signals are generated which constitute one image in which each pixel has a color difference signal value for color difference BG or an interpolated color difference signal value.
- in the same manner as the BG correlation determination unit 4002, the RG correlation determination unit 4005 sets a B pixel having a BG signal as the target pixel for the RG signal output from the color difference separation unit 4001, and calculates the correlation of the RG signals of the pixels adjacent to the target pixel.
- the RG correlation discrimination unit 4005 replaces B with R in Equations (1) and (2) to calculate correlation values Ss and Sb.
- the RG correlation determination unit 4005 determines one of "obliquely upward", "obliquely downward" and "no correlation in specific direction" based on the correlation value Ss, the correlation value Sb, the absolute difference between them and the threshold, and outputs the determination information to the RG interpolation direction determination unit 4006.
- in the same manner as the BG interpolation direction determination unit 4003, the RG interpolation direction determination unit 4006 uses the determination information from the RG correlation determination unit 4005 and the color difference signal value of the RG signal to calculate the interpolated color difference signal value f R-G (k, l) of the RG signal at the pixel of interest (k, l) according to any one of expressions (3) to (5).
- the RG interpolation direction determination unit 4006 reads B as R in the equations (3) to (5) to calculate an interpolated color difference signal value f R -G (k, l).
- by calculating the interpolated color difference signal value f R-G (k, l) for the pixel of interest (k, l), the RG interpolation direction determination unit 4006 outputs to the RG interpolation unit 4007 an RG signal in which the color difference signals, including the interpolated ones, are arranged in a checkered pattern.
- similarly to the BG interpolation unit 4004, the RG interpolation unit 4007 calculates interpolated color difference signal values of the RG signal at pixel positions missing from the color difference signal (RG signal) received from the RG interpolation direction determination unit 4006.
- the RG interpolation unit 4007 outputs a chrominance image signal having an RG signal at every pixel position to the color image signal generation unit 412c by calculating an interpolated color difference signal value with respect to a missing pixel position of the RG signal. That is, by the interpolation processing of the RG interpolation unit 4007, color difference signals are generated which constitute one image in which each pixel has a color difference signal value for color difference RG or an interpolation color difference signal value.
- the color difference interpolation unit 412 b outputs the color difference signal to the color image signal generation unit 412 c by the above-described interpolation processing.
- when the observation method is the WLI method, the color difference image signal having the BG signal and the color difference image signal having the RG signal are output from the BG interpolation unit 4004 and the RG interpolation unit 4007, respectively.
- when the observation method is the NBI method, no R component light is present, so only the color difference image signal having the BG signal from the BG interpolation unit 4004 is input to the color image signal generation unit 412c.
- FIG. 12 is a block diagram for explaining the configuration of the main part of the interpolation processing unit 412, showing the configuration of the color image signal generation unit 412c, the G signal specific frequency component extraction unit 421d and the specific frequency component addition unit 412e.
- the G signal specific frequency component extraction unit 421 d includes a filter processing unit 4101, a clip processing unit 4102, and a gain processing unit 4103.
- the filter processing unit 4101 receives the G signal among the RGB signals generated by the color image signal generation unit 412c, and performs filter processing using a high pass filter.
- FIGS. 13 and 14 are schematic diagrams for explaining the frequency characteristics of the filtering process performed by the filter processing unit 4101 according to the embodiment of the present invention.
- FIG. 15 is a schematic view illustrating the signal after the filtering processing performed by the filter processing unit 4101 according to the embodiment of the present invention. Note that FIG. 15 shows signal levels at pixel positions in one line (for example, in the horizontal direction) of the pixel arrangement shown in FIG. 3.
- the G signal is a signal limited to the frequency characteristic as shown in FIG. 8 by the interpolation filter processing in the color image signal generation unit 412c.
- the filter processing unit 4101 performs filter processing on the G signal generated by the color image signal generation unit 412 c using a high pass filter having response characteristics (frequency characteristics) as shown in FIG. 13.
- the G signal subjected to the high-pass filter processing by the filter processing unit 4101 has the composite of the frequency characteristic shown in FIG. 8 and the frequency characteristic shown in FIG. 13, and consequently becomes a signal (response signal) having the band-pass characteristic limited to the frequency characteristic shown in FIG. 14.
- the band pass characteristic signal output from the filter processing unit 4101 becomes a specific frequency component signal having positive and negative values based on the zero level at which the low frequency component is cut as shown in FIG.
- a positive value region R s corresponds to a bright small region such as a small bright spot, a bright region of a convex portion of the living body, or a mucosal region where the hemoglobin content is locally smaller than that of the peripheral region, and the negative value regions R v1 and R v2 correspond to the dark regions of blood vessels and depressions.
- blood vessels in the mucous membrane absorb light of the narrow band T G through the hemoglobin contained in the blood, so the small diameter blood vessels of the mucosal surface layer, the medium diameter blood vessels of the mucosal middle layer and the large diameter blood vessels of the deep mucosal layer reflect less light and appear as dark regions.
- the amount of absorption of narrow band T G light increases as the amount of hemoglobin increases (that is, as blood vessels become thicker). Since regions other than blood vessels have a small hemoglobin content, the amount of reflected light there is larger than in the blood vessel regions, and such regions are brighter (have higher luminance values) on average.
- the zero level shown in FIG. 15 generally corresponds to the signal level in the area other than the blood vessel.
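The extraction illustrated in FIG. 15 can be sketched in one dimension: subtracting a local mean from the (already low-pass limited) G signal acts as a high pass and leaves a band-pass signal with positive and negative values around zero, where flat mucosa maps near the zero level and vessel dips become negative. The moving-average window below is an illustrative stand-in for the patent's high-pass filter, whose coefficients are not given in this excerpt.

```python
# Sketch: crude high pass as "signal minus local mean", producing the
# zero-centered specific frequency component signal of FIG. 15.

def high_pass_1d(g, radius=2):
    out = []
    for i in range(len(g)):
        lo, hi = max(0, i - radius), min(len(g), i + radius + 1)
        out.append(g[i] - sum(g[lo:hi]) / (hi - lo))  # remove low frequencies
    return out
```

On a flat (non-vessel) line the output sits at the zero level; a dark vessel pixel produces a negative excursion like the regions R v1 and R v2.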
- large diameter blood vessels, which are deep blood vessels, are largely cut by the high-pass filter characteristic applied by the filter processing unit 4101; the characteristic of the high-pass filter is designed so that the middle layer and surface layer blood vessels are extracted.
- the frequency characteristic of the blood vessel in the G signal changes depending on the observation distance and the number of pixels of the endoscope 2 (insertion portion 21).
- in near view, diagnosis has mainly been made by determining the running state of the superficial blood vessels, while in distant view it has mainly been made by observing the color of wider areas (areas exhibiting a brownish color in which superficial blood vessels are dense); it is therefore possible to assume a predetermined observation distance.
- the clip processing unit 4102 receives the specific frequency component signal shown in FIG. 15 output from the filter processing unit 4101, and generates a specific frequency component signal limited by preset clip thresholds that differ between positive and negative values. FIG. 16 is a schematic view for explaining the signal after the clip processing by the clip processing unit 4102 according to the embodiment of the present invention.
- the clip processing unit 4102 sets the clip upper limit value to the zero level and performs extraction based on a range (clip range) whose lower limit is set to a predetermined negative value.
- the absolute value of the clip lower limit value in the present embodiment is larger than the absolute values of the above-described regions R v1 and R v2.
- the clip processing unit 4102 thereby cuts regions R s such as bright spots and extracts only the regions R v1 and R v2, which are the dark regions of blood vessel regions and recesses.
- the specific frequency component signal clipped by the clip processing unit 4102 is output to the gain processing unit 4103.
- although the clip processing unit 4102 has been described as extracting the blood vessel region with two threshold values, the present invention is not limited to this; for example, by combining non-linear low-pass processing such as morphology, it is also possible to extract only recesses having a width equal to or less than a predetermined width.
- the clip ranges described above may be the same range regardless of the observation method, or may be different ranges depending on the observation method.
- the clip processing unit 4102 obtains observation mode information from the control unit 44, determines the observation method, and performs signal extraction in the clip range corresponding to the determined observation method.
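The asymmetric clipping described above can be sketched as follows: the upper limit is the zero level, so positive regions such as bright spots are removed, while the lower limit is a preset negative value that may differ per observation method. The limit values in the example are illustrative.

```python
# Sketch of the clip processing in unit 4102: asymmetric limits, upper
# at zero (cuts bright-spot regions R s), lower at a negative value.

def clip_signal(signal, lower=-40.0, upper=0.0):
    return [min(max(v, lower), upper) for v in signal]
```

With these limits, only the negative dips corresponding to vessels and recesses survive, matching FIG. 16.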
- the gain processing unit 4103 calculates the amount of gain with respect to the signal value of the G signal after the clip processing by the clip processing unit 4102, and multiplies the specific frequency component signal by the calculated amount of gain.
- FIG. 17 is a schematic diagram for explaining gain processing performed by the gain processing unit 4103 according to the embodiment of the present invention. Specifically, the gain processing unit 4103 receives the clipped specific frequency component signal output from the clip processing unit 4102 and the G signal output from the color image signal generation unit 412c. The gain processing unit 4103 calculates a gain amount according to the signal level of the G signal as shown in FIG. 17, multiplies the calculated gain amount by the specific frequency component signal, and obtains the final specific frequency component signal obtained. Are output to the specific frequency component adder 412e.
- the gain amount having the characteristics shown in FIG. 17 suppresses the increase of the noise amount in dark portions, while the amplitude of the blood vessel region extracted in regions of at least a predetermined brightness is amplified by α.
- α is determined in advance as an amount that compensates for the attenuation, caused by the interpolation processing in the preceding stage, of the blood vessels depicted in the B signal or R signal.
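The gain characteristic of FIG. 17 as described can be sketched as a brightness-dependent multiplier: the gain ramps up with the G signal level so that noise in dark portions is not amplified, and is a constant α at or above a brightness threshold. The linear ramp and the numeric values are assumptions for illustration.

```python
# Sketch of the gain processing in unit 4103: gain depends on the
# brightness of the G signal; dark areas get reduced gain.

def gain_amount(g_value, threshold=64.0, alpha=2.0):
    if g_value >= threshold:
        return alpha                      # full amplification by alpha
    return alpha * g_value / threshold    # ramp down toward dark areas

def apply_gain(specific, g):
    """Multiply the clipped specific frequency component signal by the
    per-pixel gain derived from the G signal."""
    return [s * gain_amount(v) for s, v in zip(specific, g)]
```

The amplified signal is then passed to the specific frequency component adder 412e.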
- FIG. 18 is a block diagram for explaining the configuration of the main part of the interpolation processing unit 412, showing the configuration of the color image signal generation unit 412c, the G signal specific frequency component extraction unit 421d, the specific frequency component addition unit 412e and the display image generation processing unit 413.
- the specific frequency component adder 412 e includes a first switching unit 4201, a first adder 4202, a second switching unit 4203, and a second adder 4204.
- when the first switching unit 4201 receives the B signal from the color image signal generation unit 412c and acquires observation mode information indicating that the illumination method is the WLI method, it outputs the input B signal to the display image generation processing unit 413. On the other hand, when it acquires observation mode information indicating that the illumination method is the NBI method, the first switching unit 4201 outputs the input B signal to the first addition unit 4202. In this manner, the first switching unit 4201 switches the output destination of the input B signal according to the observation mode information (illumination method).
- the first addition unit 4202 receives the B signal output from the first switching unit 4201 and the specific frequency component signal output from the G signal specific frequency component extraction unit 421d, and adds the specific frequency component signal to the B signal.
- the first addition unit 4202 outputs the addition B signal obtained by the addition to the display image generation processing unit 413.
- When the second switching unit 4203 receives the R signal from the color image signal generation unit 412c and acquires observation mode information indicating that the illumination method is the NBI method, it outputs the input R signal to the display image generation processing unit 413. On the other hand, when acquiring observation mode information indicating that the illumination method is the WLI method, the second switching unit 4203 outputs the input R signal to the second addition unit 4204. In this manner, the second switching unit 4203 switches the output destination of the input R signal according to the observation mode information (illumination method).
- The second addition unit 4204 receives the R signal output from the second switching unit 4203 and the specific frequency component signal output from the G signal specific frequency component extraction unit 412d, and adds the specific frequency component signal to the R signal.
- The second addition unit 4204 outputs the addition R signal obtained by this addition to the display image generation processing unit 413.
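The routing performed by the switching units 4201/4203 and the addition units 4202/4204 can be sketched as a single function. This is a minimal model assuming per-sample lists and a mode string; the actual units operate on image signals.

```python
def specific_frequency_component_adder(b, r, sfc, mode):
    """Sketch of the specific frequency component addition unit 412e.

    mode is the observation mode information, assumed 'WLI' or 'NBI'.
    WLI: the B signal passes through; the R signal gets the component added.
    NBI: the R signal passes through; the B signal gets the component added.
    """
    if mode == 'NBI':
        b_out = [bv + sv for bv, sv in zip(b, sfc)]   # first addition unit 4202
        r_out = list(r)                               # second switch: pass R through
    else:  # WLI
        b_out = list(b)                               # first switch: pass B through
        r_out = [rv + sv for rv, sv in zip(r, sfc)]   # second addition unit 4204
    return b_out, r_out
```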
- Here, the meaning of adding the specific frequency component signal to the B signal when the illumination method is the NBI method will be described.
- Since a superficial blood vessel contains a small amount of hemoglobin, the amount of light absorbed in the narrow band TG is small, and the difference in signal value from the surrounding mucosa is small.
- Since a medium-diameter blood vessel contains more hemoglobin than a superficial blood vessel, the amount of light absorbed in the narrow band TG is larger, and a difference in signal value from the surrounding mucosa is obtained.
- Although the B signal is subjected to interpolation processing that preserves the frequency band as much as possible by the color difference interpolation unit 412b and the color image signal generation unit 412c as described above, the interpolation processing itself has low-pass filter characteristics, so high-frequency components are attenuated and the contrast (signal amplitude) of small-diameter and medium-diameter blood vessel images is reduced.
- A signal corresponding to the large-diameter blood vessel image is not extracted; signals corresponding to the medium-diameter and small-diameter blood vessel images are extracted as the specific frequency component signal. Note that regions other than blood vessels are removed by the clip processing, so mixing of unnecessary noise into the specific frequency component signal is also suppressed.
- A specific frequency component image of the small-diameter blood vessel or of the medium-diameter blood vessel is thus added to the corresponding small-diameter or medium-diameter blood vessel image of the B signal.
- As a result, the contrast of the blood vessel image in the B signal can be improved compared with the B signal alone.
- When the illumination method is the NBI method, capillaries on the mucosal surface may absorb only light in the narrow band TB, with almost no absorption of light in the narrow band TG.
- In that case, a capillary image cannot be depicted by the G signal, and a specific frequency component of the G signal cannot be added to compensate it.
- Even so, the reduction in resolution of the B signal can be minimized because the color difference interpolation unit 412b performs direction-discriminating interpolation on the B-G signal.
- The display image generation processing unit 413 performs the processing described later on the G signal output from the color image signal generation unit 412c and on the B signal (or addition B signal) and R signal (or addition R signal) output from the specific frequency component addition unit 412e, and then outputs the processed signal to the display unit 5 as a display image signal.
- FIG. 19 is a flowchart illustrating signal processing performed by the processor unit 4.
- When the processor unit 4 acquires an electrical signal from the endoscope 2 (tip portion 24), it outputs the electrical signal to the pre-processing unit 411 (step S101).
- The electrical signal from the endoscope 2 is a signal including RAW image data generated by the imaging element 202 and converted into a digital signal by the A/D conversion unit 205.
- When the electrical signal is input to the pre-processing unit 411, the pre-processing unit 411 performs the above-described OB clamp processing, noise reduction processing, and white balance processing, and outputs the processed image signal to the interpolation processing unit 412 (step S102).
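A minimal sketch of the pre-processing chain of step S102 follows. The optical-black level, the 3-tap moving average standing in for the noise reduction, and the white balance gain are all illustrative assumptions; the patent does not specify the concrete filters.

```python
def ob_clamp(samples, ob_level):
    """OB clamp: subtract the optical-black level, flooring at zero."""
    return [max(s - ob_level, 0) for s in samples]

def noise_reduce(samples):
    """3-tap moving average as a stand-in for the actual noise reduction."""
    out = []
    for i in range(len(samples)):
        win = samples[max(0, i - 1):i + 2]
        out.append(sum(win) / len(win))
    return out

def white_balance(samples, gain):
    """Per-channel white balance gain."""
    return [s * gain for s in samples]

def preprocess(samples, ob_level=64, wb_gain=1.5):
    """Pre-processing unit 411 pipeline: OB clamp -> NR -> white balance."""
    return white_balance(noise_reduce(ob_clamp(samples, ob_level)), wb_gain)
```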
- The G interpolation color difference calculation unit 412a generates an interpolated G signal for each pixel (R pixel or B pixel) at which the G signal is missing, and outputs to the color image signal generation unit 412c a G signal image in which every pixel position has either a G signal (pixel value) or an interpolated G signal (interpolated value) (step S103).
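The G interpolation of step S103 can be sketched as follows for a Bayer-like pattern. Simple four-neighbour averaging is used here purely for illustration; the embodiment may instead use direction-discriminated linear interpolation or non-linear interpolation, as noted later in the description.

```python
def interpolate_g(bayer, pattern):
    """Sketch of the G interpolation: at pixel positions whose filter is R
    or B, the missing G value is estimated from the neighbouring G pixels.
    bayer is a 2-D list of raw values and pattern a same-shaped list of
    'R'/'G'/'B' labels; plain averaging is an illustrative choice."""
    h, w = len(bayer), len(bayer[0])
    g = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if pattern[y][x] == 'G':
                g[y][x] = bayer[y][x]  # G pixel value used directly
            else:
                neigh = [bayer[ny][nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                         if 0 <= ny < h and 0 <= nx < w and pattern[ny][nx] == 'G']
                g[y][x] = sum(neigh) / len(neigh)  # interpolated G value
    return g
```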
- Next, the G interpolation color difference calculation unit 412a determines by which of the white illumination light observation method and the narrow band light observation method the input electrical signal was generated (step S104). Specifically, the G interpolation color difference calculation unit 412a makes this determination based on a control signal from the control unit 44 (for example, information on the illumination light or information indicating the observation method).
- When the G interpolation color difference calculation unit 412a determines that the input electrical signal was generated by the white illumination light observation method (step S104; WLI), it generates R-G and B-G signals, which are color difference signals obtained by taking the difference between the color signal of each color component and the interpolated G signal according to the positions of the R pixels and B pixels, and outputs them to the color difference interpolation unit 412b as color difference image signals (step S105).
- On the other hand, when the G interpolation color difference calculation unit 412a determines that the input electrical signal was generated by the narrow band light observation method (step S104; NBI), it generates the B-G signal, a color difference signal obtained by taking the difference between the B signal at each B pixel position and the interpolated G signal, and outputs it to the color difference interpolation unit 412b as a color difference image signal (step S106).
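Steps S105/S106 can be sketched as follows, assuming 2-D lists for the raw color values, the pixel pattern, and the fully interpolated G plane. Representing positions without a color difference sample as zeros is an illustrative simplification.

```python
def make_color_difference(values, pattern, g_full, mode):
    """Sketch of steps S105/S106: at each R or B pixel, subtract the
    (interpolated) G value to form an R-G or B-G color difference sample.
    In NBI mode only B-G is produced; mode is assumed 'WLI' or 'NBI'."""
    h, w = len(values), len(values[0])
    bg = [[0.0] * w for _ in range(h)]
    rg = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if pattern[y][x] == 'B':
                bg[y][x] = values[y][x] - g_full[y][x]
            elif pattern[y][x] == 'R' and mode == 'WLI':
                rg[y][x] = values[y][x] - g_full[y][x]
    return (bg, rg) if mode == 'WLI' else (bg, None)
```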
- Next, the color difference interpolation unit 412b performs color difference interpolation processing on the color difference image signal acquired from the G interpolation color difference calculation unit 412a (step S107). Specifically, the color difference interpolation unit 412b interpolates the color difference signal missing at each pixel position, and outputs to the color image signal generation unit 412c a color difference image signal in which every pixel position has a color difference signal.
- That is, by the interpolation processing of the color difference interpolation unit 412b, in the case of the white illumination light observation method an image signal constituting one image in which each pixel has the values of the color difference signals R-G and B-G is generated, while in the case of the narrow band light observation method an image signal constituting one image in which each pixel has the value of the color difference signal B-G is generated.
- The color image signal generation unit 412c generates a color image signal constituting a color image, using the pixel values and interpolated values of the G component generated by the G interpolation color difference calculation unit 412a and the signal values of the color difference image signal generated by the color difference interpolation unit 412b (step S108). Specifically, the color image signal generation unit 412c generates an RGB signal or GB signal by adding the G signal or interpolated G signal at each pixel position to the corresponding color difference signal (B-G signal or R-G signal), outputs the G signal to the G signal specific frequency component extraction unit 412d and the display image generation processing unit 413, and outputs the R and B signals to the specific frequency component addition unit 412e.
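The reconstruction of step S108, adding the G plane back to the interpolated color difference planes so that B = (B-G) + G and, in the WLI case, R = (R-G) + G, can be sketched as:

```python
def reconstruct_color(g_full, bg_full, rg_full=None):
    """Sketch of the color image signal generation unit 412c (step S108):
    add the (interpolated) G plane to each color difference plane.
    rg_full is None in the NBI case, where no R-G plane exists."""
    b = [[bg + g for bg, g in zip(brow, grow)]
         for brow, grow in zip(bg_full, g_full)]
    r = None
    if rg_full is not None:
        r = [[rg + g for rg, g in zip(rrow, grow)]
             for rrow, grow in zip(rg_full, g_full)]
    return b, r
```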
- The G signal specific frequency component extraction unit 412d extracts a specific frequency component signal from the G signal among the RGB signals generated by the color image signal generation unit 412c. Specifically, the filter processing unit 4101 first receives the G signal among the RGB signals generated by the color image signal generation unit 412c and performs filter processing using a high pass filter (step S109).
- Next, the clip processing unit 4102 performs clip processing on the specific frequency component signal (see FIG. 15) output from the filter processing unit 4101, generating a specific frequency component signal limited by the preset clip lower limit value and clip upper limit value (step S110: specific frequency component extraction step).
- Thereafter, the gain processing unit 4103 calculates a gain amount according to the signal value of the G signal, and multiplies the specific frequency component signal after the clip processing by the clip processing unit 4102 by the calculated gain amount (step S111).
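Steps S109 to S111 can be sketched end-to-end for one line of G samples. The 3-tap high-pass kernel, the clip limits, and the flat gain of 1 + α are illustrative stand-ins for the filter of FIG. 14, the limits of FIG. 15, and the level-dependent curve of FIG. 17.

```python
def extract_specific_frequency_component(g_line, lo=-8.0, hi=8.0, alpha=0.5):
    """End-to-end sketch of the G signal specific frequency component
    extraction for a 1-D line of G samples: high-pass filtering (S109),
    clipping to [lo, hi] (S110), and gain multiplication (S111)."""
    n = len(g_line)
    # step S109: simple high-pass response, kernel [-0.5, 1, -0.5], edge clamp
    hp = []
    for i in range(n):
        left = g_line[max(i - 1, 0)]
        right = g_line[min(i + 1, n - 1)]
        hp.append(g_line[i] - 0.5 * (left + right))
    # step S110: clip to the preset lower/upper limits
    clipped = [min(max(v, lo), hi) for v in hp]
    # step S111: multiply by the gain amount (flat 1 + alpha here, whereas
    # the embodiment derives the gain from the G signal level)
    return [v * (1.0 + alpha) for v in clipped]
```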
- The specific frequency component addition unit 412e receives the R signal and the B signal generated by the color image signal generation unit 412c and the specific frequency component signal extracted by the G signal specific frequency component extraction unit 412d.
- The specific frequency component signal is then added to the color signal selected according to the observation mode information (step S112: specific frequency component addition step).
- Specifically, the specific frequency component addition unit 412e refers to the above-mentioned observation mode information: if the illumination method is the WLI method, it adds the specific frequency component signal to the R signal, and if the illumination method is the NBI method, it adds the specific frequency component signal to the B signal.
- The display image generation processing unit 413 receives the G signal output from the color image signal generation unit 412c and the B signal (or addition B signal) and R signal (or addition R signal) output from the specific frequency component addition unit 412e.
- It performs tone conversion, enlargement processing, and structure emphasis processing of structures such as capillaries and fine mucosal patterns of the mucosal surface layer, thereby generating a display image signal for display (step S113).
- After performing the predetermined processing, the display image generation processing unit 413 outputs the processed signal to the display unit 5 as a display image signal.
- As described above, in the embodiment of the present invention, the G signal specific frequency component extraction unit 412d extracts the specific frequency component signal from the G signal among the RGB signals generated by the color image signal generation unit 412c, and the specific frequency component addition unit 412e adds the specific frequency component signal to the B signal.
- In the case of the WLI method, high resolution is ensured by performing interpolation processing using the signal values of the G pixels; in the case of the NBI method, the specific frequency component signal compensates the blue component signal in regions corresponding to dark regions such as blood vessels and recesses. The resolution can therefore be increased both in white light observation, in which the green component is the luminance component, and in narrow band observation, in which the blue component is the luminance component.
- For the B-G signal and the R-G signal, the direction of higher correlation is discriminated using only the B-G signals or R-G signals in the vicinity of the target pixel position, and interpolation is performed using the B-G or R-G signals along the discriminated direction of high correlation. The decrease in resolution of structures such as capillaries and deep blood vessels, which have little correlation with the G signal, can therefore be minimized.
- FIG. 20 is a schematic view showing a schematic configuration of an endoscope apparatus 1a according to a modification of the embodiment of the present invention.
- In the embodiment described above, the light source unit 3 has the switching filter 31c, and the observation method is switched, by inserting and removing the switching filter 31c, between the white illumination light observation method and the narrow band light observation method using narrow band illumination light composed of light in the narrow bands TB and TG.
- In this modification, a light source unit 3a is provided in place of the light source unit 3, and the observation method is switched by a rotation filter 31g.
- the endoscope apparatus 1a includes the endoscope 2 described above, the processor unit 4, the display unit 5, and a light source unit 3a that generates illumination light emitted from the tip of the endoscope 2.
- The light source unit 3a includes an illumination unit 31 and an illumination control unit 32.
- the illumination unit 31 switches and emits a plurality of illumination lights having different wavelength bands under the control of the illumination control unit 32.
- the illumination unit 31 includes the light source 31a, the light source driver 31b, the drive unit 31d, the drive driver 31e and the condenser lens 31f described above, and the rotation filter 31g.
- FIG. 21 is a schematic view showing the configuration of the rotary filter 31g of the light source unit 3a according to the modification of the embodiment of the present invention.
- The rotation filter 31g has a rotation shaft 310 and a disk-shaped rotating portion 311 supported by the rotation shaft 310.
- the rotating portion 311 has three filters (filters 312 to 314) arranged in the area obtained by dividing the main surface into three.
- The filter 312 transmits white illumination light including light in the red, green and blue wavelength bands HR, HG and HB.
- The filter 313 transmits narrow band illumination light (in this modification, the first narrow band illumination light) composed of light in a narrow band TB (for example, 400 nm to 445 nm) included in the wavelength band HB and light in a narrow band TG (for example, 530 nm to 550 nm) included in the wavelength band HG.
- the light transmitted by the filter 313 corresponds to the narrow band illumination light of the narrow band light observation (NBI) method described above.
- FIG. 22 is a graph showing the relationship between the wavelength of the illumination light by the filter of the illumination unit 31 of the endoscope apparatus 1a according to the modification of the embodiment of the present invention and the transmittance.
- The filter 314 transmits narrow band illumination light (in this modification, the second narrow band illumination light) composed of light in a narrow band TR included in the wavelength band HR and light in a narrow band TG included in the wavelength band HG.
- the light of the narrow band TG transmitted by the filter 314 may be the same as the light of the narrow band TG of the narrow band light observation (NBI) method described above, or may be of a different band.
- As in the narrow band light observation (NBI) method, the color component with the larger change, of the red component (narrow band TR) and the green component (narrow band TG), is selected as the luminance component.
- The illumination control unit 32 controls the light source driver 31b to turn the light source 31a on and off, and controls the drive driver 31e to rotate the rotation filter 31g (rotation shaft 310) so as to place one of the filters 312 to 314 on the light path of the light source 31a, thereby controlling the type (band) of illumination light emitted by the illumination unit 31.
- The image processing unit 41 performs the above-described signal processing to generate a display image signal.
- In observation using the second narrow band illumination light, no B component light is present, so only the color difference image signal having the R-G signal is input to the color image signal generation unit 412c from the R-G interpolation unit 4007, and the color image signal generation unit 412c generates the R component and G component signals (RG signals).
- The filter processing unit 4101 receives the G signal among the RGB signals generated by the color image signal generation unit 412c and performs filter processing using a predetermined high pass filter.
- Note that the filter processing unit 4101 may extract signals having different band pass characteristics depending on the observation method. Specifically, when the observation method is the NBI method, the filter processing unit 4101 extracts a signal having a band pass characteristic limited to the frequency characteristic shown in FIG. 14; when the second narrow band illumination light observation method is used, a signal having frequency characteristics in a predetermined range may be extracted by predetermined band pass filtering.
- By using a band pass filter that cuts the high-frequency side instead of the high pass filter, blood vessels in the mucosal surface layer, which are unnecessary for observation with the second narrow band illumination light, can be removed.
- The clip processing unit 4102 switches the threshold (range of signal amplitude) used in the clip processing according to the observation mode information (observation method).
- The threshold for each observation method may be one stored in advance in the storage unit 43, or the clip processing may be performed using a threshold input by the operator via the input unit 42.
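The mode-dependent clip threshold switching can be sketched as follows. The per-mode amplitude ranges below are invented placeholders; in practice they would come from the storage unit 43 or from operator input via the input unit 42.

```python
def clip_by_mode(signal, mode, thresholds=None):
    """Sketch of the clip processing unit 4102 switching its amplitude
    range per observation method ('WLI' or 'NBI'); the numeric ranges are
    illustrative assumptions, not values from the patent."""
    thresholds = thresholds or {'WLI': (-10.0, 10.0), 'NBI': (-6.0, 6.0)}
    lo, hi = thresholds[mode]
    return [min(max(v, lo), hi) for v in signal]
```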
- When the first switching unit 4201 receives the B signal from the color image signal generation unit 412c and acquires observation mode information indicating that the illumination method is the WLI method or the method using the second narrow band illumination light, it outputs the input B signal to the display image generation processing unit 413.
- On the other hand, when acquiring observation mode information indicating that the illumination method is the NBI method, the first switching unit 4201 outputs the input B signal to the first addition unit 4202.
- When the second switching unit 4203 receives the R signal from the color image signal generation unit 412c and acquires observation mode information indicating that the illumination method is the NBI method, it outputs the input R signal to the display image generation processing unit 413.
- the second switching unit 4203 outputs the input R signal to the second addition unit 4204 when acquiring the observation mode information indicating that the illumination method is the WLI method or the method using the second narrowband illumination light.
- In the embodiments described above, the color filter 202a having a plurality of filters, each transmitting light in a predetermined wavelength band, is described as being provided on the light receiving surface of the imaging element 202, but each filter may be provided individually for each pixel of the imaging element 202.
- In the embodiments described above, the endoscope apparatuses 1 and 1a are described as switching the light emitted from the illumination unit 31 between white illumination light and narrow band illumination light by inserting and removing the switching filter 31c, or by rotating the rotation filter 31g, with respect to the white light emitted from one light source 31a. However, two light sources that respectively emit white illumination light and narrow band illumination light may be switched so as to emit either the white illumination light or the narrow band illumination light.
- In that case, the present invention can also be applied to, for example, a capsule endoscope that includes a light source unit, a color filter, and an imaging element and is introduced into a subject.
- In the endoscope apparatuses 1 and 1a described above, the A/D conversion unit 205 is described as being provided at the tip portion 24, but it may be provided in the processor unit 4. Further, the configuration relating to image processing may be provided in the endoscope 2, in a connector connecting the endoscope 2 and the processor unit 4, in the operation unit 22, or the like. In the endoscope apparatuses 1 and 1a described above, the endoscope 2 connected to the processor unit 4 is described as being identified using identification information or the like stored in the imaging information storage unit 206, but an identification means may instead be provided at the connection portion (connector) between the processor unit 4 and the endoscope 2. For example, a pin for identification (identification means) may be provided on the endoscope 2 side to identify the endoscope 2 connected to the processor unit 4.
- In the embodiments described above, the G interpolation color difference calculation unit 412a is described as generating the interpolated G signal based on peripheral pixels for each pixel (R pixel or B pixel) at which the G signal is missing, but it may perform linear interpolation in which the interpolation direction is discriminated, or may perform interpolation processing by cubic interpolation or other non-linear interpolation.
- In the embodiments described above, the second switching unit 4203 outputs the input R signal to the second addition unit 4204 when acquiring observation mode information indicating that the illumination method is the WLI method, but it may instead output the input R signal to the display image generation processing unit 413. That is, in the WLI method, the addition processing of the specific frequency component signal need not be performed. By controlling so as not to add the specific frequency component in the WLI method, a change in the color tone of blood vessels can be suppressed.
- In the embodiments described above, the response signal extracted by the filter processing unit 4101 may be output to the gain processing unit 4103 as the specific frequency component signal without performing the clip processing by the clip processing unit 4102.
- In the embodiments described above, the color filter 202a is described as having filter units U1 of a Bayer arrangement arranged in a matrix, but the present invention is not limited to the Bayer arrangement.
- Any filter unit having a filter arrangement in which the density of the G component is higher than that of the B component may be used; the invention is applicable as long as the density of pixels that generate the electrical signal of the luminance component of the white illumination light is higher than the density of pixels that generate the electrical signal of the luminance component of the narrow band illumination light.
- In the embodiments described above, an endoscope apparatus including the image processing apparatus has been described as an example, but the present invention can also be applied to other imaging apparatuses that perform image processing, such as a microscope apparatus.
- As described above, the image processing apparatus, the method of operating the image processing apparatus, the operation program of the image processing apparatus, and the endoscope apparatus according to the present invention are useful for obtaining high resolution images in both the white illumination light observation method and the narrow band light observation method.
Description
FIG. 1 is a diagram showing a schematic configuration of an endoscope apparatus 1 according to an embodiment of the present invention. FIG. 2 is a schematic diagram showing the schematic configuration of the endoscope apparatus 1 according to the embodiment of the present invention. The endoscope apparatus 1 shown in FIGS. 1 and 2 includes: an endoscope 2 that captures an in-vivo image of an observed region and generates an electrical signal by inserting an insertion portion 21 into a subject; a light source unit 3 that generates illumination light emitted from the tip of the endoscope 2; a processor unit 4 that performs predetermined image processing on the electrical signal acquired by the endoscope 2 and comprehensively controls the operation of the entire endoscope apparatus 1; and a display unit 5 that displays the in-vivo image subjected to the image processing by the processor unit 4. The endoscope apparatus 1 acquires an in-vivo image of a subject such as a patient by inserting the insertion portion 21 into the subject. An operator such as a doctor examines the presence or absence of a bleeding site or tumor site, which are detection target sites, by observing the acquired in-vivo image.
[Case where the determination information is "diagonally upward direction"]
When the determination information is "diagonally upward direction", the B-G interpolation direction discrimination unit 4003 calculates the interpolated color difference signal value fB-G(k, l) of the B-G signal at the pixel of interest (k, l) based on equation (3) below.
When the determination information is "diagonally downward direction", the B-G interpolation direction discrimination unit 4003 calculates the interpolated color difference signal value fB-G(k, l) of the B-G signal at the pixel of interest (k, l) based on equation (4) below.
When the determination information is "no correlation in a specific direction", the B-G interpolation direction discrimination unit 4003 calculates the interpolated color difference signal value fB-G(k, l) of the B-G signal at the pixel of interest (k, l) based on equation (5) below.
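The direction-discriminated interpolation of the B-G signal can be sketched as follows. The neighbour offsets chosen per direction are illustrative, since equations (3) to (5) themselves are not reproduced here; only the overall pattern (average along the judged direction, or over all diagonal neighbours when no direction correlates) is modelled.

```python
def interpolate_bg(bg, y, x, direction):
    """Sketch of the B-G interpolation direction discrimination unit 4003 /
    B-G interpolation unit 4004: the missing color difference at the pixel
    of interest (y, x) is averaged along the direction judged most
    correlated.  Offsets per direction are illustrative assumptions."""
    if direction == 'diag_up':      # neighbours along the up-right diagonal
        pts = [(y - 1, x + 1), (y + 1, x - 1)]
    elif direction == 'diag_down':  # neighbours along the down-right diagonal
        pts = [(y - 1, x - 1), (y + 1, x + 1)]
    else:                           # no correlation in a specific direction
        pts = [(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)]
    vals = [bg[ny][nx] for ny, nx in pts]
    return sum(vals) / len(vals)
```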
FIG. 20 is a schematic diagram showing the schematic configuration of an endoscope apparatus 1a according to a modification of the embodiment of the present invention. In the embodiment described above, the light source unit 3 has the switching filter 31c, and the observation method is switched, by inserting and removing the switching filter 31c, between the white illumination light observation method and the narrow band light observation method using narrow band illumination light composed of light in the narrow bands TB and TG. In this modification, a light source unit 3a is provided in place of the light source unit 3, and the observation method is switched by a rotation filter 31g.
2 Endoscope
3, 3a Light source unit
4 Processor unit
5 Display unit
21 Insertion portion
22 Operation unit
23 Universal cord
24 Tip portion
31 Illumination unit
31a Light source
31b Light source driver
31c Switching filter
31d Drive unit
31e Drive driver
31f Condenser lens
31g Rotation filter
32 Illumination control unit
41 Image processing unit
42 Input unit
43 Storage unit
44 Control unit
201 Imaging optical system
202 Imaging element
202a Color filter
203 Light guide
204 Illumination lens
205 A/D conversion unit
206 Imaging information storage unit
261 Identification information storage unit
411 Pre-processing unit
412 Interpolation processing unit
412a G interpolation color difference calculation unit
412b Color difference interpolation unit
412c Color image signal generation unit
412d G signal specific frequency component extraction unit
412e Specific frequency component addition unit
413 Display image generation processing unit
4001 Color difference separation unit
4002 B-G correlation discrimination unit
4003 B-G interpolation direction discrimination unit
4004 B-G interpolation unit
4005 R-G correlation discrimination unit
4006 R-G interpolation direction discrimination unit
4007 R-G interpolation unit
4101 Filter processing unit
4102 Clip processing unit
4103 Gain processing unit
4201 First switching unit
4202 First addition unit
4203 Second switching unit
4204 Second addition unit
U1 Filter unit
Claims (10)
- An image processing apparatus in which first pixels that generate a first color signal of a first luminance component, which is a luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate a second color signal of a second luminance component, which is a luminance component of narrow band illumination light composed of light in a band narrower than the wavelength band of the white illumination light, and third pixels that generate a third color signal of a color component different from the first and second luminance components are arranged in a matrix, in which the density of the first pixels is higher than each of the densities of the second and third pixels, and which generates an image signal having a color signal of each color component by interpolating the signal of a missing color component based on the color signals generated by the respective pixels, the image processing apparatus comprising:
a specific frequency component extraction unit that extracts a specific frequency component signal having a predetermined spatial frequency component from the color signal of the first luminance component among the color signals of the respective color components generated by the interpolation; and
a specific frequency component addition unit that adds the specific frequency component signal extracted by the specific frequency component extraction unit to a color signal of a color component different from the first luminance component.
- The image processing apparatus according to claim 1, wherein the specific frequency component extraction unit has a filter processing unit that extracts, with a high pass filter or a band pass filter having predetermined frequency characteristics, a response signal according to the frequency characteristics of the filter.
- The image processing apparatus according to claim 2, wherein the color signals of the respective color components generated by the interpolation are subjected to interpolation filter processing, and the specific frequency component signal is a signal having a combined characteristic of the frequency characteristic of the filter of the filter processing unit and the frequency characteristic after the interpolation filter processing.
- The image processing apparatus according to claim 1, wherein the specific frequency component extraction unit has a clip processing unit that extracts, from the response signal extracted by the filter processing unit, a signal having a specific amplitude according to a structure as the specific frequency component signal.
- The image processing apparatus according to claim 4, wherein the structure is a blood vessel, and the clip processing unit extracts the specific frequency component signal from the extracted response signal based on a threshold according to the blood vessel.
- The image processing apparatus according to claim 1, further comprising: an interpolation color difference calculation unit that interpolates the color signal of the first luminance component at the position of the second or third pixel based on the first color signals around that pixel position, and generates a color difference signal by taking the difference between the color signal of the first luminance component generated by the interpolation and the second or third color signal generated by the second or third pixel; a color difference interpolation unit that discriminates an interpolation direction based on the color difference signal generated by the interpolation color difference calculation unit and interpolates the color difference signal missing at each pixel position; and an image signal generation unit that generates a color signal of a color component different from the first luminance component based on the interpolated color difference signal generated by the color difference interpolation unit and the color signal of the first luminance component generated by the interpolation color difference calculation unit.
- The image processing apparatus according to claim 4, wherein the clip processing unit switches the clip processing according to whether the observation method uses the white illumination light or the narrow band illumination light.
- A method of operating an image processing apparatus in which first pixels that generate a first color signal of a first luminance component, which is a luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate a second color signal of a second luminance component, which is a luminance component of narrow band illumination light composed of light in a band narrower than the wavelength band of the white illumination light, and third pixels that generate a third color signal of a color component different from the first and second luminance components are arranged in a matrix, in which the density of the first pixels is higher than each of the densities of the second and third pixels, and which generates an image signal having a color signal of each color component by interpolating the signal of a missing color component based on the color signals generated by the respective pixels, the method comprising:
a specific frequency component extraction step in which a specific frequency component extraction unit extracts a specific frequency component signal having a predetermined spatial frequency component from the color signal of the first luminance component among the color signals of the respective color components generated by the interpolation; and
a specific frequency component addition step in which a specific frequency component addition unit adds the specific frequency component signal extracted by the specific frequency component extraction unit to a color signal of a color component different from the first luminance component.
- An operation program of an image processing apparatus in which first pixels that generate a first color signal of a first luminance component, which is a luminance component of white illumination light including light in the red, green, and blue wavelength bands, second pixels that generate a second color signal of a second luminance component, which is a luminance component of narrow band illumination light composed of light in a band narrower than the wavelength band of the white illumination light, and third pixels that generate a third color signal of a color component different from the first and second luminance components are arranged in a matrix, in which the density of the first pixels is higher than each of the densities of the second and third pixels, and which generates an image signal having a color signal of each color component by interpolating the signal of a missing color component based on the color signals generated by the respective pixels, the program causing the image processing apparatus to execute:
a specific frequency component extraction procedure in which a specific frequency component extraction unit extracts a specific frequency component signal having a predetermined spatial frequency component from the color signal of the first luminance component among the color signals of the respective color components generated by the interpolation; and
a specific frequency component addition procedure in which a specific frequency component addition unit adds the specific frequency component signal extracted by the specific frequency component extraction unit to a color signal of a color component different from the first luminance component.
- An endoscope apparatus comprising: a light source unit that emits either white illumination light including light in the red, green, and blue wavelength bands or narrow band illumination light composed of light in a band narrower than the wavelength band of the white illumination light; an imaging element in which first pixels that generate a first color signal of a first luminance component, which is a luminance component of the white illumination light, second pixels that generate a second color signal of a second luminance component, which is a luminance component of the narrow band illumination light, and third pixels that generate a third color signal of a color component different from the first and second luminance components are arranged in a matrix, the density of the first pixels being higher than each of the densities of the second and third pixels; and the image processing apparatus according to claim 1.
Priority Applications (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/050400 WO2016110984A1 (ja) | 2015-01-08 | 2015-01-08 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
| DE112015005531.2T DE112015005531T5 (de) | 2015-01-08 | 2015-01-08 | Bildverarbeitungsvorrichtung, Verfahren zum Betreiben einer Bildverarbeitungsvorrichtung, Programm zum Betreiben einer Bildverarbeitungsvorrichtung und Endoskopeinrichtung |
| JP2016568228A JP6454359B2 (ja) | 2015-01-08 | 2015-01-08 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
| CN201580072238.8A CN107105987B (zh) | 2015-01-08 | 2015-01-08 | 图像处理装置及其工作方法、记录介质和内窥镜装置 |
| US15/637,422 US10070771B2 (en) | 2015-01-08 | 2017-06-29 | Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/050400 WO2016110984A1 (ja) | 2015-01-08 | 2015-01-08 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/637,422 Continuation US10070771B2 (en) | 2015-01-08 | 2017-06-29 | Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016110984A1 true WO2016110984A1 (ja) | 2016-07-14 |
Family
ID=56355705
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/050400 Ceased WO2016110984A1 (ja) | 2015-01-08 | 2015-01-08 | 画像処理装置、画像処理装置の作動方法、画像処理装置の作動プログラムおよび内視鏡装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US10070771B2 (ja) |
| JP (1) | JP6454359B2 (ja) |
| CN (1) | CN107105987B (ja) |
| DE (1) | DE112015005531T5 (ja) |
| WO (1) | WO2016110984A1 (ja) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110769738A (zh) * | 2017-06-21 | 2020-02-07 | 奥林巴斯株式会社 | 图像处理装置、内窥镜装置、图像处理装置的工作方法及图像处理程序 |
| WO2021140601A1 (ja) * | 2020-01-09 | 2021-07-15 | オリンパス株式会社 | 画像処理システム、内視鏡システム及び画像処理方法 |
| US12417531B2 (en) | 2020-01-09 | 2025-09-16 | Olympus Corporation | Image processing system, training method for training device, and storage medium |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6758859B2 (ja) * | 2016-03-01 | 2020-09-23 | キヤノン株式会社 | 撮像装置、撮像システム、および画像処理方法 |
| US20180228361A1 (en) * | 2017-02-15 | 2018-08-16 | Dynacolor, Inc. | Arthroscopic system with disposable arthroscope |
| US11109747B2 (en) * | 2017-02-15 | 2021-09-07 | Dynacolor, Inc. | Arthroscopic system with disposable arthroscope having image rotation function and method thereof |
| JP7056131B2 (ja) * | 2017-12-15 | 2022-04-19 | Omron Corporation | Image processing system, image processing program, and image processing method |
| JPWO2019198576A1 (ja) * | 2018-04-11 | 2021-04-08 | Fujifilm Corporation | Medical image processing apparatus |
| CN111970954B (zh) * | 2018-04-23 | 2024-07-16 | Fujifilm Corporation | Medical image processing system |
| JP7077260B2 (ja) * | 2019-03-22 | 2022-05-30 | Hitachi High-Tech Corporation | Electron beam device and image processing method |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008043604A (ja) * | 2006-08-18 | 2008-02-28 | Olympus Medical Systems Corp | Endoscope device |
| JP2011143100A (ja) * | 2010-01-15 | 2011-07-28 | Olympus Corp | Image processing device, endoscope system, program, and image processing method |
| JP2014039105A (ja) * | 2012-08-13 | 2014-02-27 | Canon Inc | Image processing apparatus, image processing method, and computer program |
Family Cites Families (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5049989A (en) * | 1990-01-04 | 1991-09-17 | Olympus Optical Co., Ltd. | Method and circuit for reducing the influence of a bright image area in an endoscope image signal |
| US6480300B1 (en) * | 1998-04-08 | 2002-11-12 | Fuji Photo Film Co., Ltd. | Image processing apparatus, image processing method and recording medium on which software for executing the image processing is recorded |
| JP4162111B2 (ja) * | 1999-07-27 | 2008-10-08 | Fujifilm Corporation | Image processing method and apparatus, and recording medium |
| US20030081177A1 (en) * | 2001-10-29 | 2003-05-01 | Daniel Rosen | Method and system for producing multiple imagery products from a one time scan of motion picture film |
| US7167280B2 (en) * | 2001-10-29 | 2007-01-23 | Eastman Kodak Company | Full content film scanning on a film to data transfer device |
| US7412092B2 (en) * | 2003-10-31 | 2008-08-12 | Sony Corporation | Image processing apparatus, image processing method, and program |
| JP4441809B2 (ja) | 2004-05-20 | 2010-03-31 | Tadashi Sugiki | Color solid-state imaging device |
| JP4025764B2 (ja) | 2004-08-31 | 2007-12-26 | Olympus Corporation | Endoscope device |
| JP5523802B2 (ja) * | 2009-11-25 | 2014-06-18 | Toshiba Corporation | Image processing apparatus |
| RU2496253C1 (ru) * | 2009-07-21 | 2013-10-20 | Кэнон Кабусики Кайся | Устройство обработки изображения и способ обработки изображения для корректировки хроматической аберрации |
| JP5631769B2 (ja) * | 2011-02-17 | 2014-11-26 | Toshiba Corporation | Image processing apparatus |
| CN103283240B (zh) * | 2011-02-21 | 2016-03-30 | Fujifilm Corporation | Color imaging device |
| JP2012170640A (ja) | 2011-02-22 | 2012-09-10 | Fujifilm Corporation | Endoscope system and method for displaying enhanced images of capillaries in mucosal surface layer |
| JP5855358B2 (ja) * | 2011-05-27 | 2016-02-09 | Olympus Corporation | Endoscope device and operating method of endoscope device |
| JP5816511B2 (ja) * | 2011-10-04 | 2015-11-18 | Olympus Corporation | Image processing device, endoscope device, and operating method of image processing device |
| JP5931418B2 (ja) * | 2011-11-25 | 2016-06-08 | Olympus Corporation | Image processing device, operating method of image processing device, and image processing program |
| EP2800377A4 (en) * | 2011-12-28 | 2015-07-15 | Fujifilm Corp | IMAGING DEVICE |
| US8989515B2 (en) * | 2012-01-12 | 2015-03-24 | Kofax, Inc. | Systems and methods for mobile image capture and processing |
| JP2015035782A (ja) * | 2013-08-09 | 2015-02-19 | Olympus Corporation | Image processing device, imaging device, microscope system, image processing method, and image processing program |
| US9418398B2 (en) * | 2014-06-12 | 2016-08-16 | Samsung Electronics Co., Ltd. | Low power subpixel rendering on RGBW display |
| US10594996B2 (en) * | 2014-09-24 | 2020-03-17 | Sony Semiconductor Solutions Corporation | Image processing apparatus, image pickup device, image pickup apparatus, and image processing method |
2015
- 2015-01-08 DE DE112015005531.2T patent/DE112015005531T5/de not_active Withdrawn
- 2015-01-08 JP JP2016568228A patent/JP6454359B2/ja active Active
- 2015-01-08 WO PCT/JP2015/050400 patent/WO2016110984A1/ja not_active Ceased
- 2015-01-08 CN CN201580072238.8A patent/CN107105987B/zh active Active

2017
- 2017-06-29 US US15/637,422 patent/US10070771B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| US10070771B2 (en) | 2018-09-11 |
| JP6454359B2 (ja) | 2019-01-16 |
| CN107105987A (zh) | 2017-08-29 |
| CN107105987B (zh) | 2019-11-15 |
| DE112015005531T5 (de) | 2017-09-21 |
| US20170296034A1 (en) | 2017-10-19 |
| JPWO2016110984A1 (ja) | 2017-07-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6454359B2 (ja) | Image processing device, operating method of image processing device, operating program of image processing device, and endoscope device | |
| JP6285383B2 (ja) | Image processing device, endoscope system, operating method of image processing device, and operating method of endoscope system | |
| US9675238B2 (en) | Endoscopic device | |
| JP5968944B2 (ja) | Endoscope system, processor device, light source device, operating method of endoscope system, operating method of processor device, and operating method of light source device | |
| JP6435275B2 (ja) | Endoscope device | |
| JP6196900B2 (ja) | Endoscope device | |
| JP6471173B2 (ja) | Image processing device, operating method of endoscope device, image processing program, and endoscope device | |
| EP2979618B1 (en) | Image processing device for operating endoscope system | |
| EP2979617B1 (en) | Image processing device, and method for operating endoscope system | |
| JP6823072B2 (ja) | Processor device and endoscope system | |
| CN105310633B (zh) | Medical image processing device, operating method thereof, and endoscope system | |
| JP6401800B2 (ja) | Image processing device, operating method of image processing device, operating program of image processing device, and endoscope device | |
| CN106163367B (zh) | Medical image processing device, operating method thereof, and endoscope system | |
| JP2013000176A (ja) | Endoscope system, light source device of endoscope system, and light amount control method | |
| JP6346501B2 (ja) | Endoscope device | |
| WO2016084257A1 (ja) | Endoscope device | |
| JP5972312B2 (ja) | Medical image processing device and operating method thereof | |
| JPWO2018142658A1 (ja) | Endoscope system | |
| JP5970054B2 (ja) | Endoscope system, light source device of endoscope system, and operating method of endoscope system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15876861; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2016568228; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 112015005531; Country of ref document: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15876861; Country of ref document: EP; Kind code of ref document: A1 |