WO2017017735A1 - Image processing device, display control method and program
- Publication number
- WO2017017735A1 (PCT/JP2015/071164)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- narrow band
- image processing
- wide band
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
Definitions
- the present invention relates to an image processing apparatus that performs image processing on image data generated by an imaging apparatus that captures in-vivo images of a subject, and to a display control method and a program.
- Patent Document 2 describes a technique for displaying either a normal image or a narrow band image by providing a normal image selection button for instructing display of the normal image and a narrow band image selection button for instructing display of the narrow band image, the user selecting one of the two buttons according to the observation.
- JP 2007-20880 A; JP 2008-86759 A; JP 2011-194111 A
- in Patent Document 1, since the normal image and the narrow band image are displayed simultaneously, the user must operate the endoscope while making a diagnosis by comparing the two images, so the user's workload during examination is heavy.
- in Patent Document 2, when the user finds a suspicious lesion while observing the normal image, the user must switch from the normal image to the narrow band image by selecting the narrow band image selection button each time. In this case as well, the user's workload during examination is heavy.
- the present invention has been made in view of the above, and aims to provide an image processing apparatus, a display control method, and a program capable of preventing the user from missing a lesion while reducing the user's workload during examination.
- to solve the above problems and achieve the object, an image processing apparatus according to the present invention performs predetermined image processing on image data generated by an imaging element in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined array pattern, each filter of the array pattern being disposed at a position corresponding to one of a plurality of pixels arranged in a two-dimensional grid, and displays an image corresponding to the processed image data on a display device. The apparatus includes: an image generation unit that generates, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter; a calculation unit that calculates a feature amount included in the narrow band image generated by the image generation unit; a determination unit that determines whether the feature amount calculated by the calculation unit exceeds a threshold; and a display control unit that, based on the determination result of the determination unit, emphasizes one of the wide band image and the narrow band image more than the other and displays them on the display device.
- further, in the above invention, the calculation unit calculates the feature amount based on an edge included in the narrow band image.
- further, in the above invention, the calculation unit calculates the feature amount based on a first edge included in the wide band image and a second edge included in the narrow band image.
- further, in the above invention, the calculation unit calculates, as the feature amount, the number of pixels in the narrow band image whose pixel value exceeds a predetermined value.
- further, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit makes the display area of the narrow band image larger than the display area of the wide band image.
- further, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit reduces the wide band image, superimposes the reduced wide band image on the narrow band image, and displays the result on the display device.
- further, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit blinks the narrow band image displayed on the display device.
- further, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit causes the display device to display either a character or a figure relating to the narrow band image.
- the image processing apparatus is characterized in that, in the above-mentioned invention, the peak wavelength of light transmitted by the narrow band filter is between 395 nm and 435 nm.
- the image processing apparatus is characterized in that, in the above-mentioned invention, the peak wavelength of light transmitted by the narrow band filter is between 790 nm and 820 nm.
- a display control method according to the present invention is executed by an image processing apparatus that performs predetermined image processing on image data generated by an imaging element in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined array pattern, each filter of the array pattern being disposed at a position corresponding to one of a plurality of pixels arranged in a grid, and that displays an image corresponding to the processed image data on a display device. The method includes: an image generation step of generating, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter; a calculation step of calculating a feature amount included in the narrow band image generated in the image generation step; a determination step of determining whether the feature amount calculated in the calculation step exceeds a threshold; and a display control step of, based on the determination result of the determination step, emphasizing one of the wide band image and the narrow band image more than the other and displaying them on the display device.
- a program according to the present invention causes an image processing apparatus, which performs predetermined image processing on image data generated by an imaging element in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined arrangement pattern, each filter of the arrangement pattern being disposed at a position corresponding to one of a plurality of pixels arranged in a grid pattern, and which displays an image corresponding to the processed image data on a display device, to generate, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter, to calculate a feature amount included in the narrow band image, to determine whether the feature amount exceeds a threshold, and, based on the determination result, to emphasize one of the wide band image and the narrow band image more than the other and display them on the display device.
- FIG. 1 is a view schematically showing an entire configuration of an endoscope system according to a first embodiment of the present invention.
- FIG. 2 is a block diagram showing the functions of the main parts of the endoscope system according to the first embodiment of the present invention.
- FIG. 3 is a view schematically showing a configuration of a color filter according to Embodiment 1 of the present invention.
- FIG. 4 is a view showing the relationship between the transmittance and the wavelength of each of the filters constituting the color filter according to Embodiment 1 of the present invention.
- FIG. 5 is a flowchart showing an outline of display processing performed by the processor according to the first embodiment of the present invention on a display device.
- FIG. 6 is a view schematically showing a filter used when the calculation unit of the processor according to the first embodiment of the present invention calculates the horizontal feature amount of the narrow band image.
- FIG. 7 is a view schematically showing a filter used when the calculation unit of the processor according to the first embodiment of the present invention calculates the feature amount in the vertical direction of the narrow band image.
- FIG. 8 is a diagram showing an example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is smaller than the threshold.
- FIG. 9 is a diagram showing an example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is not smaller than the threshold.
- FIG. 10 is a diagram showing another example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is smaller than the threshold.
- FIG. 11 is a diagram showing another example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is not smaller than the threshold.
- FIG. 12 is a diagram showing still another example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is not smaller than the threshold.
- FIG. 13 is a diagram showing still another example of an image displayed by the display device when the determination unit of the processor according to the first embodiment of the present invention determines that the feature amount is not smaller than the threshold.
- FIG. 14 is a flowchart showing an outline of display processing performed by the processor according to the second embodiment of the present invention on a display device.
- FIG. 15 is a view schematically showing a configuration of a color filter according to Embodiment 3 of the present invention.
- FIG. 16 is a diagram showing the relationship between the transmittance and the wavelength of each of the filters constituting the color filter according to Embodiment 3 of the present invention.
- FIG. 17 is a flowchart showing an outline of display processing performed by the processor according to the third embodiment of the present invention on a display device.
- FIG. 1 is a view schematically showing an entire configuration of an endoscope system according to a first embodiment of the present invention.
- the endoscope system 1 illustrated in FIG. 1 includes: an endoscope 2 that captures an in-vivo image of a subject by inserting its distal end into a body cavity of the subject; a light source device 3 that generates illumination light emitted from the distal end of the endoscope 2; a display device 4 that displays an image corresponding to the image data captured by the endoscope 2; and a processor 5 that performs predetermined image processing on the in-vivo image captured by the endoscope 2 and controls the operation of the entire endoscope system 1.
- the processor 5 functions as an image processing apparatus.
- the endoscope 2 has a flexible, elongated insertion portion 21; an operation portion 22 connected to the proximal end side of the insertion portion 21 and receiving input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the extending direction of the insertion portion 21 and incorporates various cables connected to the processor 5 and the light source device 3.
- the insertion portion 21 has a distal end portion 24 incorporating an imaging device (imaging portion) described later; a bendable bending portion 25 formed of a plurality of bending pieces and connected to the proximal end side of the distal end portion 24; and a flexible tube portion 26 connected to the proximal end side of the bending portion 25.
- the operation unit 22 includes a bending knob 221 that bends the bending portion 25 in the vertical and horizontal directions; a treatment instrument insertion portion 222 through which a treatment tool such as biological forceps, a laser knife, or an inspection probe is inserted into the body cavity; and a plurality of switches 223, which are operation input units for inputting operation instruction signals for the light source device 3 and peripheral devices such as air supply means, water supply means, and gas supply means.
- the treatment tool inserted through the treatment instrument insertion portion 222 emerges from an opening (not shown) at the distal end portion 24.
- the universal cord 23 incorporates at least a light guide (described later) and a collective cable.
- the universal cord 23 has a connector portion 27 (see FIG. 1) which is detachable from the light source device 3.
- a coil cable 27a extends from the connector portion 27, and an electrical connector portion 28 detachably attachable to the processor 5 is provided at the extended end of the coil cable 27a.
- the connector unit 27 is internally configured using an FPGA (Field Programmable Gate Array).
- the light source device 3 is configured using, for example, a halogen lamp or a white LED (Light Emitting Diode). Under the control of the processor 5, the light source device 3 emits illumination light toward the subject from the distal end side of the insertion portion 21 of the endoscope 2.
- the display device 4 displays an image corresponding to an image signal subjected to image processing by the processor 5 and various information related to the endoscope system 1.
- the display device 4 is configured using a display panel such as a liquid crystal or organic EL (Electro Luminescence) panel.
- the processor 5 performs predetermined image processing on the RAW image data input from the endoscope 2 and outputs the processed image data to the display device 4.
- the processor 5 is configured using a CPU or the like.
- FIG. 2 is a block diagram showing the function of the main part of the endoscope system 1. The details of the configuration of each part of the endoscope system 1 and the paths of electric signals in the endoscope system 1 will be described with reference to FIG.
- the endoscope 2 includes an optical system 201, an imaging unit 202, an A / D conversion unit 203, and a light guide path 204.
- the optical system 201 condenses the reflected light of the illumination light emitted by the light source device 3 onto the imaging surface of the imaging unit 202 to form a subject image.
- the optical system 201 is configured using one or more lenses, a prism, and the like.
- the imaging unit 202 receives an object image formed on the light receiving surface by the optical system 201 under the control of the processor 5 and performs photoelectric conversion to generate image data (RAW image data) of the object.
- the generated image data is output to the A / D converter 203.
- the imaging unit 202 captures an image of the subject at a reference frame rate, for example, a frame rate of 60 fps, and generates image data of the subject.
- the imaging unit 202 includes an imaging element 202a, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, that photoelectrically converts light received by a plurality of pixels arranged in a two-dimensional grid and generates an electrical signal, and a color filter 202b including a plurality of first band-pass filters (hereinafter referred to as "wide band filters") transmitting light in the primary color wavelength bands and a narrow band filter whose transmission has a maximum value outside the range of the wavelength bands of light passing through the first band-pass filters.
- FIG. 3 is a view schematically showing the configuration of the color filter 202b.
- the color filter 202b is configured using a filter unit forming a predetermined array pattern that includes two wide band filters R transmitting the red component, eight wide band filters G transmitting the green component, two wide band filters B transmitting the blue component, and four narrow band filters X1 transmitting narrow band light. The individual filters forming this array pattern are disposed at positions corresponding to the plurality of pixels of the imaging element 202a arranged in a two-dimensional grid.
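As a concrete illustration, the per-unit filter counts above (2 R, 8 G, 2 B, 4 X1) can be realized by tiling a 4 × 4 filter unit over the sensor. The exact arrangement shown in the patent's FIG. 3 is not reproduced in this text, so the layout below is an assumed example that merely satisfies the stated counts:

```python
import numpy as np

# One illustrative 4x4 filter unit consistent with the stated counts
# (2 R, 8 G, 2 B, 4 X1). The actual arrangement in FIG. 3 may differ;
# this layout is an assumption for demonstration only.
UNIT = np.array([
    ["G", "X1", "G", "X1"],
    ["R", "G",  "B", "G"],
    ["G", "X1", "G", "X1"],
    ["B", "G",  "R", "G"],
])

def tile_filter_pattern(rows, cols):
    """Tile the 4x4 unit over a sensor of the given pixel dimensions,
    assigning one filter to each pixel of the two-dimensional grid."""
    reps = ((rows + 3) // 4, (cols + 3) // 4)
    return np.tile(UNIT, reps)[:rows, :cols]

pattern = tile_filter_pattern(8, 8)  # e.g. an 8x8 corner of the sensor
```

Each pixel of the imaging element thus sees exactly one of R, G, B, or X1, which is why the demosaicing step described later is needed to recover full-resolution planes.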
- the peak wavelength of the wavelength band of the narrow band light in the first embodiment is between 395 nm and 435 nm.
- the image data generated by the imaging unit 202 using the color filter 202b configured as described above is subjected to predetermined image processing (for example, interpolation such as demosaicing) by the processor 5 described later, and is thereby converted into a color wide band image and a narrow band image.
- FIG. 4 is a diagram showing the relationship between the transmittance and the wavelength of each of the filters constituting the color filter 202b.
- the curve L B represents the relationship between the transmittance and the wavelength of the broadband filter B
- the curve L G represents the relationship between the transmittance and the wavelength of the broad band filter G
- the curve L R represents the relationship between the transmittance and the wavelength of the broadband filter R, and
- the curve L X1 shows the relationship between the transmittance and the wavelength of the narrowband filter X1.
- the peak wavelength of the narrow band filter X1 is described as being between 395 nm and 435 nm.
- the spectral characteristics of the narrow band filter X1 are such that the width of the wavelength transmission band of the narrow band filter X1 is narrower than each of the wide band filter R, the wide band filter B, and the wide band filter G.
- the A / D conversion unit 203 performs A / D conversion on the analog image data input from the imaging unit 202, and outputs the digital image data subjected to the A / D conversion to the processor 5.
- the light guide path 204 is configured using an illumination lens and a light guide, and propagates the illumination light emitted by the light source device 3 toward a predetermined area.
- the processor 5 includes an image processing unit 51, a recording unit 52, and a control unit 53.
- the image processing unit 51 subjects the digital image data input from the endoscope 2 to predetermined image processing and outputs the image data to the display device 4.
- the image processing unit 51 includes a separation unit 511, a demosaicing unit 512, an image generation unit 513, a calculation unit 514, a determination unit 515, and a display control unit 516.
- the separation unit 511 separates the RAW data into mosaic-like channels and outputs the separated signal values of each channel to the demosaicing unit 512.
- the demosaicing unit 512 performs demosaicing processing using the signal values of each channel separated by the separation unit 511 to generate an R image, a G image, a B image, and an X1 image, and outputs each of these images to the image generation unit 513.
- the image generation unit 513 generates a color wide band image using the R image, G image, and B image generated by the demosaicing unit 512, and generates a pseudo-color narrow band image using the G image and X1 image generated by the demosaicing unit 512.
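The channel combinations performed by the image generation unit 513 can be sketched as follows. The wide band image is a straightforward stack of the demosaiced R, G, and B planes; for the pseudo-color narrow band image, the patent says only that the G and X1 planes are combined, so the specific channel mapping below is an assumption:

```python
import numpy as np

def generate_wideband(r, g, b):
    """Stack the demosaiced R, G, and B planes into a color wide band image
    of shape (H, W, 3)."""
    return np.stack([r, g, b], axis=-1)

def generate_narrowband(g, x1):
    """Build a pseudo-color narrow band image from the G and X1 planes.
    Assigning X1 to two display channels and G to the third is one common
    narrow-band-imaging convention; the exact mapping used by the image
    generation unit 513 is an assumption here."""
    return np.stack([x1, x1, g], axis=-1)
```

Both functions expect same-shaped 2-D planes, i.e. the full-resolution outputs of the demosaicing unit 512.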
- the calculating unit 514 calculates the feature amount of the narrowband image generated by the image generating unit 513. Specifically, the calculation unit 514 calculates the feature amount based on the edge included in the narrowband image.
- Determination unit 515 determines whether the feature amount of the narrowband image calculated by calculation unit 514 is smaller than a threshold, and outputs the determination result to display control unit 516.
- the display control unit 516 controls the display mode of the display device 4. Based on the determination result of the determination unit 515, the display control unit 516 highlights one of the wide band image and the narrow band image generated by the image generation unit 513 and causes the display device 4 to display them. Specifically, the display control unit 516 enlarges the display area of one of the two images on the display device 4; for example, when the determination unit 515 determines that the feature amount exceeds the threshold, the display control unit 516 makes the display area of the narrow band image larger than the display area of the wide band image.
- the recording unit 52 records image data generated by the endoscope 2, a program executed by the endoscope system 1, and information being processed.
- the recording unit 52 is configured using a non-volatile memory, a volatile memory, or the like.
- the control unit 53 controls the respective units constituting the endoscope system 1 in an integrated manner.
- the control unit 53 is configured using a CPU or the like.
- the control unit 53 controls the emission timing of the illumination light of the light source device 3, the imaging timing of the imaging unit 202 of the endoscope 2, and the like.
- FIG. 5 is a flowchart showing an outline of the display processing performed by the processor 5 on the display device 4.
- the separation unit 511 separates the RAW data into mosaic-like channels (step S101). Specifically, the separation unit 511 separates the signal value of the RAW data for each channel corresponding to each of the wide band filter R, the wide band filter G, the wide band filter B, and the narrow band filter X1.
- the demosaicing unit 512 generates each of the R image, the G image, the B image, and the X1 image by performing the demosaicing process using the signal value of each channel separated by the separation unit 511 (step S102).
- the demosaicing may be performed by known linear interpolation, or may be performed on the signal values of the wide band filter R and the wide band filter B with reference to the signal value of the wide band filter G.
- the image generation unit 513 generates a color wide-band image using the R image, the G image, and the B image generated by the demosaicing unit 512 (step S103), and the G image and the X1 image generated by the demosaicing unit 512 To generate a pseudo-color narrowband image (step S104).
- the calculating unit 514 calculates the feature amount of the narrowband image generated by the image generating unit 513 (step S105). Specifically, the calculation unit 514 calculates the feature amount based on the edge included in the narrowband image. More specifically, the calculating unit 514 sets the pixel value of each pixel of the narrowband image as X (i, j), and calculates the edge of the narrowband image using a Sobel filter.
- the Sobel filter calculates an edge by multiplying each of the nine pixel values in the 3 × 3 neighborhood centered on the target pixel of the narrow band image by a coefficient and summing the results.
- specifically, the calculation unit 514 multiplies each of the nine pixel values around the target pixel of the narrow band image by the corresponding coefficient of the filter F1 shown in FIG. 6 and sums the results to calculate the horizontal total value gHS.
- similarly, the calculation unit 514 multiplies by the respective coefficients of the filter F2 shown in FIG. 7 and sums the results to calculate the vertical total value gVS.
- the calculator 514 then calculates the edge g(i, j) according to the following equation (1): g(i, j) = √(gHS² + gVS²) (1)
- the calculation unit 514 calculates the sum GHx of the edges of the narrowband image according to the following equation (2).
- GHx = Σ g(i, j) (2)
- where Σ denotes the sum over all g(i, j).
- in this way, the calculating unit 514 calculates the sum GHx of the edges included in the narrow band image as the feature amount.
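The feature-amount calculation of step S105 can be sketched in Python as follows. The standard 3 × 3 Sobel kernels are assumed for the coefficient filters F1 (FIG. 6) and F2 (FIG. 7), and the magnitude √(gHS² + gVS²) is assumed for equation (1); the patent's actual coefficients may differ:

```python
import numpy as np

# Assumed Sobel coefficient filters for F1 (horizontal) and F2 (vertical).
F1 = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
F2 = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def edge_feature(img):
    """Compute GHx: the sum over all interior pixels of
    g(i, j) = sqrt(gHS^2 + gVS^2), where gHS and gVS are the horizontal
    and vertical Sobel responses at the target pixel (equations (1)-(2))."""
    h, w = img.shape
    total = 0.0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            ghs = float(np.sum(win * F1))   # horizontal total value gHS
            gvs = float(np.sum(win * F2))   # vertical total value gVS
            total += np.hypot(ghs, gvs)     # g(i, j), equation (1)
    return total                            # GHx, equation (2)
```

A flat image yields GHx = 0, while any luminance step raises GHx, which is what makes it usable as a lesion-structure indicator for the threshold test in step S106.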
- the determination unit 515 determines whether the feature amount of the narrowband image calculated by the calculation unit 514 is smaller than a threshold (step S106). If the determination unit 515 determines that the feature amount of the narrowband image calculated by the calculation unit 514 is smaller than the threshold (step S106: Yes), the processor 5 proceeds to step S107 described later. On the other hand, when the determination unit 515 determines that the feature amount of the narrowband image calculated by the calculation unit 514 is not smaller than the threshold (step S106: No), the processor 5 proceeds to step S108 described later.
- in step S107, the display control unit 516 causes the display device 4 to display the wide band image generated by the image generation unit 513 larger than the narrow band image generated by the image generation unit 513. Specifically, as shown in FIG. 8, the display control unit 516 displays the wide band image P1 larger than the narrow band image P2 in the display area 40 of the display device 4.
- the processor 5 proceeds to step S109 described later.
- in step S108, the display control unit 516 causes the display device 4 to display the narrow band image generated by the image generation unit 513 larger than the wide band image generated by the image generation unit 513. Specifically, as shown in FIG. 9, the display control unit 516 displays the narrow band image P2 larger than the wide band image P1 in the display area 40 of the display device 4. As a result, when the feature amount of the narrow band image P2 exceeds the predetermined threshold, the narrow band image P2 is displayed larger and more emphasized than the wide band image P1, so the user's attention is drawn to the narrow band image P2 and a lesion of the subject can be prevented from being missed.
- the processor 5 proceeds to step S109 described later.
- in step S109, when the observation of the subject has ended (step S109: Yes), the processor 5 ends the present process. On the other hand, when the observation of the subject has not ended (step S109: No), the processor 5 returns to step S101 described above.
- as described above, according to the first embodiment of the present invention, when the feature amount of the narrow band image P2 exceeds the threshold, the display area of the narrow band image P2 is enlarged relative to the wide band image P1 and displayed on the display device 4. Since the narrow band image P2 is emphasized on the display device 4 only when necessary while the examination remains based on the wide band image P1, the user's attention is drawn to the narrow band image P2, and a lesion can be prevented from being missed while minimizing the user's workload during examination.
- in the first embodiment, the display control unit 516 displays the wide band image P1 larger than the narrow band image P2 on the display device 4 when the determination unit 515 determines that the feature amount of the narrow band image P2 is smaller than the threshold, and displays the narrow band image P2 larger than the wide band image P1 on the display device 4 when the determination unit 515 determines that the feature amount is equal to or greater than the threshold.
- however, the display method may be changed as appropriate in accordance with the determination result of the determination unit 515. Specifically, when the determination unit 515 determines that the feature amount of the narrow band image P2 is smaller than the threshold, the display control unit 516 may reduce the narrow band image P2 and superimpose the reduced narrow band image P2 on the wide band image P1; when the determination unit 515 determines that the feature amount is equal to or greater than the threshold, the display control unit 516 may reduce the wide band image P1 and superimpose the reduced wide band image P1 on the narrow band image P2 in the display area 40 of the display device 4 (see FIG. 11).
- the display control unit 516 may also blink the image displayed by the display device 4 according to the determination result of the determination unit 515. For example, as shown in FIG. 12, when the display device 4 displays the wide band image P1 and the narrow band image P2 side by side in the same display area 40 and the determination unit 515 determines that the feature amount of the narrow band image P2 is equal to or greater than the threshold, the display control unit 516 may blink the edge T1 of the narrow band image P2 on the display device 4 (FIG. 12 → FIG. 13). Of course, the display control unit 516 may instead blink each of the wide band image P1 and the narrow band image P2 on the display device 4.
- further, the display control unit 516 may issue a warning that the feature amount of the narrow band image P2 is equal to or greater than the threshold by displaying characters, figures, or the like in the display area of the display device 4.
- the endoscope system according to the second embodiment has the same configuration as the endoscope system 1 according to the first embodiment described above, but the processing it performs is different. In the following, the processing performed by the endoscope system according to the second embodiment will be described.
- the same components as those in the first embodiment described above are denoted by the same reference numerals and description thereof is omitted.
- FIG. 14 is a flowchart showing an outline of display processing performed by the processor 5 according to the second embodiment of the present invention on the display device 4.
- steps S201 to S204 correspond to the above-described steps S101 to S104, respectively.
- in step S205, the calculation unit 514 calculates the feature amount included in the narrow band image. Specifically, the calculation unit 514 calculates edges by the same method as in the first embodiment described above, and calculates the sum GHx of the edge luminances as the feature amount.
- the calculation unit 514 then calculates a feature amount based on the edges included in the wide band image (step S206). Specifically, the calculation unit 514 calculates the edges of the wide band image in the same manner as the edge calculation for the narrow band image in the first embodiment described above, and calculates the sum GHg of those edges as the feature amount.
- the calculation unit 514 then calculates the value (GHx / GHg) obtained by dividing the feature amount of the narrow band image calculated in step S205 by the feature amount of the wide band image calculated in step S206 described above (step S207).
- the determination unit 515 determines whether the value (GHx / GHg) calculated by the calculation unit 514 in step S207 described above is smaller than a threshold (step S208). If the determination unit 515 determines that the value calculated by the calculation unit 514 is smaller than the threshold (step S208: Yes), the processor 5 proceeds to step S209 described later. On the other hand, when the determination unit 515 determines that the value calculated by the calculation unit 514 is not smaller than the threshold (step S208: No), the processor 5 proceeds to step S210 described later.
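The ratio test of steps S207 and S208 can be sketched as follows. This is a minimal illustration, assuming the edge-feature sums GHx and GHg have already been computed; the function name and the threshold value used in the example are illustrative and not taken from the patent.

```python
def select_emphasis(ghx, ghg, threshold):
    """Decide which image to emphasize from the edge-feature ratio GHx / GHg.

    Returns "wideband" when the ratio is below the threshold (step S208: Yes,
    so the wide band image is shown larger in step S209) and "narrowband"
    otherwise (step S210). The threshold value is not specified in the text.
    """
    ratio = ghx / ghg           # step S207: GHx / GHg
    return "wideband" if ratio < threshold else "narrowband"


# Illustrative values: few narrow-band edges -> keep the wide band image large.
low_ratio_choice = select_emphasis(10.0, 100.0, 0.5)
# Many narrow-band edges relative to the wide band image -> emphasize it.
high_ratio_choice = select_emphasis(80.0, 100.0, 0.5)
```

Dividing by GHg normalizes the narrow band edge content against the overall scene structure, so the decision is less sensitive to globally busy or globally flat frames than the absolute sum used in the first embodiment.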
- Steps S209 to S211 correspond to the above-described steps S107 to S109, respectively.
- according to the second embodiment, when the determination unit 515 determines that the feature amount ratio is at or above the threshold, the narrow band image P2 is displayed on the display device 4 in a display area larger than that of the wide band image P1. The examination thus remains based on the wide band image P1, and the narrow band image P2 is emphasized on the display device 4 only when necessary, drawing the user's attention to it. This makes it possible to prevent a lesion from being missed while minimizing the user's workload during examination.
- the third embodiment differs from the endoscope system 1 according to the first embodiment described above in the configuration of the color filter and in the processing executed by the processor. In the following, the configuration of the color filter according to the third embodiment is described, followed by the processing performed by the processor.
- the same components as those in the first embodiment described above are denoted by the same reference numerals and description thereof is omitted.
- FIG. 15 is a view schematically showing a configuration of a color filter 202c according to Embodiment 3 of the present invention.
- the color filter 202c is configured using filter units in which the narrow band filter X1 described above is replaced by a set of four narrow band filters X2 that transmit narrow band light. In the third embodiment, the peak wavelength of the narrow band light is between 790 nm and 820 nm.
- the image data generated by the imaging element 202a through the color filter 202c configured in this way is subjected to predetermined image processing (for example, interpolation such as demosaicing) by the processor 5, and is thereby converted into a color wide band image and an infrared narrow band image.
- FIG. 16 is a diagram showing the relationship between transmittance and wavelength for each of the filters constituting the color filter 202c. The curve L_B shows the relationship between the transmittance of the wide band filter B and the wavelength, the curve L_G that of the wide band filter G, the curve L_R that of the wide band filter R, and the curve L_X2 that of the narrow band filter X2. In FIG. 16, the peak wavelength of the narrow band filter X2 is described as being between 790 nm and 820 nm.
- the spectral characteristics of the narrow band filter X2 are such that its wavelength transmission band is narrower than that of each of the wide band filter R, the wide band filter B, and the wide band filter G, and the maximum of its transmission spectrum lies on the long wavelength side. Furthermore, the narrow band filter X2 transmits light (infrared light) whose transmission spectrum maximum lies outside the wavelength band of the light (visible light) transmitted by the wide band filters.
- FIG. 17 is a flowchart showing an outline of the display processing performed by the processor 5 on the display device 4.
- steps S301 to S304 correspond to steps S101 to S104 in FIG. 5 described above.
- in step S305, the calculation unit 514 calculates, as the feature amount, the number of pixels whose pixel value exceeds a predetermined value in the narrow band image generated by the image generation unit 513.
- the determination unit 515 determines whether the number of pixels calculated by the calculation unit 514 is smaller than a threshold (step S306).
- for example, the threshold is set to 20% of the total number of pixels in the narrow band image, and can be changed as appropriate. If the determination unit 515 determines that the number of pixels calculated by the calculation unit 514 is smaller than the threshold (step S306: Yes), the processor 5 proceeds to step S307 described later. On the other hand, if the determination unit 515 determines that the number of pixels is not smaller than the threshold (step S306: No), the processor 5 proceeds to step S308 described later.
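The pixel-count feature of steps S305 and S306 can be sketched as follows, using the 20% example threshold mentioned above. This is a minimal sketch: the function names are illustrative, and the image is represented as a plain list of rows of pixel values.

```python
def pixel_count_feature(narrowband, value_threshold):
    """Step S305: count the pixels whose value exceeds value_threshold."""
    return sum(1 for row in narrowband for v in row if v > value_threshold)


def emphasize_narrowband(narrowband, value_threshold, fraction=0.20):
    """Step S306: compare the count against a fraction of all pixels.

    Returns False (wide band image emphasized, step S307) when the count is
    smaller than the threshold, True (narrow band image emphasized, step
    S308) otherwise. fraction=0.20 mirrors the 20% example in the text.
    """
    total = sum(len(row) for row in narrowband)
    count = pixel_count_feature(narrowband, value_threshold)
    return count >= fraction * total


# 6 of 25 pixels exceed 100 (24% >= 20%), so the narrow band image wins.
img_hot = [
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [200, 200, 0, 0, 0],
]
# Only 4 of 25 pixels exceed 100 (16% < 20%), so the wide band image wins.
img_cool = [
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [200, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
```

Counting bright pixels is cheaper than edge extraction, which suits the infrared narrow band image of this embodiment where lesion tissue is assumed to show up as high pixel values.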
- Steps S307 to S309 correspond to steps S107 to S109 in FIG. 5 described above.
- according to the third embodiment, the peak wavelength of the light transmitted by the narrow band filter X2 constituting the color filter 202c is between 790 nm and 820 nm, and the calculation unit 514 calculates, as the feature amount, the number of pixels whose pixel value exceeds a predetermined value. The examination thus remains based on the wide band image P1, and the infrared narrow band image P2 is emphasized on the display device 4 only when necessary, drawing the user's attention to it. This makes it possible to prevent the user from missing a lesion while minimizing the workload during examination.
- in the embodiments described above, the wide band color filters are primary color filters, but, for example, complementary color filters (Cy, Mg, Ye) that transmit light having complementary wavelength components may be used instead. A color filter (R, G, B, Or, Cy) combining primary color filters with filters (Or, Cy) that transmit light having orange and cyan wavelength components may also be used. Furthermore, a color filter (R, G, B, W) combining primary color filters with a filter (W) that transmits light having a white wavelength component may be used.
- in the embodiments described above, the color filter is provided with narrow band filters that transmit a single wavelength band, but a plurality of narrow band filter types may be provided in the color filter. For example, both the narrow band filter X1 of the first embodiment and the narrow band filter X2 of the third embodiment described above may be provided.
- in the embodiments described above, the image processing apparatus has been described as a processor used in an endoscope system, but the invention can also be applied, for example, to a capsule endoscope that can be inserted into a body cavity of a subject.
- the method of each process performed by the image processing apparatus in the embodiments described above can be stored as a program executable by a control unit such as a CPU, and can be distributed stored in the storage medium of an external storage device, such as a memory card (ROM card, RAM card, etc.), a magnetic disk (floppy disk (registered trademark), hard disk, etc.), an optical disk (CD-ROM, DVD, etc.), or a semiconductor memory.
- a control unit such as a CPU can read the program stored in the storage medium of the external storage device and execute the above-described processing by controlling its operation according to the read program.
- the present invention is not limited to the embodiments and modifications described above as they stand; at the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention.
- various inventions can be formed by appropriately combining a plurality of components disclosed in the above-described embodiment. For example, some components may be deleted from all the components described in the above-described embodiment and modifications. Furthermore, the components described in each embodiment and modification may be combined as appropriate.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Endoscopes (AREA)
Abstract
Description
The present invention relates to an image processing apparatus that performs image processing on image data generated by an imaging apparatus that captures in-vivo images of a subject, and to a display control method and a program.
In recent years, endoscopes have been known that, by providing the imaging element with a filter unit in which a plurality of wide band filters having wide band wavelength transmission characteristics in the visible light region and narrow band filters having narrow band wavelength transmission characteristics are two-dimensionally arrayed, simultaneously acquire a narrow band image capable of showing the capillaries and fine mucosal patterns of the mucosal surface and a normal color image (see Patent Document 1). In this technique, the normal image and the narrow band image are displayed simultaneously on one display device, which can improve the accuracy with which a user such as a doctor diagnoses a lesion.
There is also known a technique in which a display device that displays images corresponding to image data generated by a capsule endoscope equipped with the above-described imaging element is provided with a normal image selection button for instructing display of the normal image and a narrow band image selection button for instructing display of the narrow band image, and either the normal image or the narrow band image is displayed when the user selects one of the two buttons according to the observation (see Patent Document 2).
There is further known a technique of combining a normal image, containing a subject image with information in the white light wavelength band, with a narrow band image, containing a subject image with information in a specific narrow wavelength band, to generate a composite image, and displaying this composite image on a display device (see Patent Document 3).
However, in Patent Document 1 described above, the normal image and the narrow band image are displayed simultaneously, so the user must operate the endoscope while examining and comparing the two images, which imposes a heavy workload on the user during examination.
Further, in Patent Document 2 described above, whenever the user finds a suspicious lesion while observing the normal image, the user must select the narrow band image selection button to switch from the normal image to the narrow band image; in this case as well, the user's workload during examination is heavy.
Furthermore, in Patent Document 3 described above, although the normal image and the narrow band image are combined and displayed, each of the normal image and the narrow band image contains important diagnostic information, so a lesion may be missed when only the single composite image is viewed.
The present invention has been made in view of the above, and an object thereof is to provide an image processing apparatus, a display control method, and a program capable of preventing the user from missing a lesion while reducing the user's workload during examination.
To solve the problems described above and achieve the object, an image processing apparatus according to the present invention performs predetermined image processing on image data generated by an imaging element in which a predetermined array pattern is formed using a plurality of wide band filters that transmit light in the primary color wavelength bands and a narrow band filter that transmits at least one narrow band of light, the individual filters forming the array pattern being arranged at positions corresponding to a plurality of pixels arranged in a two-dimensional grid, and causes a display device to display an image corresponding to the processed image data. The apparatus includes: an image generation unit that generates, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter; a calculation unit that calculates a feature amount included in the narrow band image generated by the image generation unit; a determination unit that determines whether the feature amount calculated by the calculation unit exceeds a threshold; and a display control unit that, based on the determination result of the determination unit, causes the display device to display one of the wide band image and the narrow band image with greater emphasis than the other.
In the image processing apparatus according to the present invention, in the above invention, the calculation unit calculates the feature amount based on edges included in the narrow band image.
In the image processing apparatus according to the present invention, in the above invention, the calculation unit calculates the feature amount based on a first edge included in the wide band image and a second edge included in the narrow band image.
In the image processing apparatus according to the present invention, in the above invention, the calculation unit calculates, as the feature amount, the number of pixels in the narrow band image whose pixel value exceeds a predetermined value.
In the image processing apparatus according to the present invention, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit causes the display device to display the narrow band image in a display area larger than that of the wide band image.
In the image processing apparatus according to the present invention, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit reduces the wide band image and causes the display device to display the reduced wide band image superimposed on the narrow band image.
In the image processing apparatus according to the present invention, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit blinks the narrow band image on the display device.
In the image processing apparatus according to the present invention, in the above invention, when the determination unit determines that the feature amount exceeds the threshold, the display control unit causes the display device to display the narrow band image with characters or figures superimposed on it.
In the image processing apparatus according to the present invention, in the above invention, the peak wavelength of the light transmitted by the narrow band filter is between 395 nm and 435 nm.
In the image processing apparatus according to the present invention, in the above invention, the peak wavelength of the light transmitted by the narrow band filter is between 790 nm and 820 nm.
A display control method according to the present invention is executed by an image processing apparatus that performs predetermined image processing on image data generated by an imaging element in which a predetermined array pattern is formed using a plurality of wide band filters that transmit light in the primary color wavelength bands and a narrow band filter that transmits at least one narrow band of light, the individual filters forming the array pattern being arranged at positions corresponding to a plurality of pixels arranged in a grid, and that causes a display device to display an image corresponding to the processed image data. The method includes: an image generation step of generating, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter; a calculation step of calculating a feature amount included in the narrow band image generated in the image generation step; a determination step of determining whether the feature amount calculated in the calculation step exceeds a threshold; and a display control step of causing the display device, based on the determination result of the determination step, to display one of the wide band image and the narrow band image with greater emphasis than the other.
A program according to the present invention causes an image processing apparatus, which performs predetermined image processing on image data generated by an imaging element in which a predetermined array pattern is formed using a plurality of wide band filters that transmit light in the primary color wavelength bands and a narrow band filter that transmits at least one narrow band of light, the individual filters forming the array pattern being arranged at positions corresponding to a plurality of pixels arranged in a grid, and which causes a display device to display an image corresponding to the processed image data, to execute: an image generation step of generating, based on the image data generated by the imaging element, a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter; a calculation step of calculating a feature amount included in the narrow band image generated in the image generation step; a determination step of determining whether the feature amount calculated in the calculation step exceeds a threshold; and a display control step of causing the display device, based on the determination result of the determination step, to display one of the wide band image and the narrow band image with greater emphasis than the other.
According to the present invention, it is possible to prevent a lesion from being missed while reducing the workload during examination.
Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described. In the embodiments, a medical endoscope system that captures and displays images of the body cavity of a subject such as a patient is described as an example. The present invention is not limited by these embodiments. Furthermore, in the description of the drawings, the same parts are denoted by the same reference numerals.
(Embodiment 1)
[Configuration of Endoscope System]
FIG. 1 is a view schematically showing the entire configuration of an endoscope system according to Embodiment 1 of the present invention.
The endoscope system 1 shown in FIG. 1 includes: an endoscope 2 (endoscope scope) that captures in-vivo images of a subject by inserting its distal end into the subject's body cavity; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a display device 4 that displays images corresponding to the image data captured by the endoscope 2; and a processor 5 (control device) that performs predetermined image processing on the in-vivo images captured by the endoscope 2, causes the display device 4 to display them, and comprehensively controls the operation of the entire endoscope system 1. In the first embodiment, the processor 5 functions as the image processing apparatus.
The endoscope 2 includes: a flexible, elongated insertion portion 21; an operation portion 22 that is connected to the proximal end of the insertion portion 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation portion 22 in a direction different from the direction in which the insertion portion 21 extends and incorporates various cables connected to the processor 5 and the light source device 3.
The insertion portion 21 has: a distal end portion 24 incorporating an imaging device (imaging unit) described later; a bendable bending portion 25 composed of a plurality of bending pieces; and a long, flexible tube portion 26 connected to the proximal end of the bending portion 25.
The operation portion 22 has: a bending knob 221 for bending the bending portion 25 in the vertical and horizontal directions; a treatment tool insertion portion 222 through which treatment tools such as biological forceps, a laser scalpel, and an examination probe are inserted into the body cavity; and a plurality of switches 223 serving as an operation input unit for inputting operation instruction signals for peripheral devices such as air supply, water supply, and gas supply means in addition to the light source device 3 and the processor 5. A treatment tool inserted through the treatment tool insertion portion 222 emerges from an opening (not shown) via the distal end portion 24.
The universal cord 23 incorporates at least a light guide described later and an assembly cable. The universal cord 23 has a connector portion 27 (see FIG. 1) detachably attached to the light source device 3. A coiled coil cable 27a extends from the connector portion 27, and an electrical connector portion 28 detachably attached to the processor 5 is provided at the extended end of the coil cable 27a. The connector portion 27 is internally configured using an FPGA (Field Programmable Gate Array).
The light source device 3 is configured using, for example, a halogen lamp or a white LED (Light Emitting Diode). Under the control of the processor 5, the light source device 3 irradiates the subject with illumination light from the distal end side of the insertion portion of the endoscope 2.
Under the control of the processor 5, the display device 4 displays images corresponding to the image signals processed by the processor 5 and various information on the endoscope system 1. The display device 4 is configured using a display panel such as liquid crystal or organic EL (Electro Luminescence).
The processor 5 performs predetermined image processing on the RAW image data input from the endoscope 2 and outputs the result to the display device 4. The processor 5 is configured using a CPU or the like.
Next, the functions of the main parts of the endoscope system 1 will be described. FIG. 2 is a block diagram showing the functions of the main parts of the endoscope system 1. The detailed configuration of each part of the endoscope system 1 and the paths of electrical signals within the endoscope system 1 will be described with reference to FIG. 2.
[Configuration of Endoscope]
First, the main parts of the endoscope 2 will be described. As shown in FIG. 2, the endoscope 2 includes an optical system 201, an imaging unit 202, an A/D conversion unit 203, and a light guide path 204.
The optical system 201 receives the reflected light of the illumination light emitted by the light source device 3 on the imaging surface of the imaging unit 202 and forms a subject image. The optical system 201 is configured using one or more lenses, prisms, and the like.
Under the control of the processor 5, the imaging unit 202 receives the subject image formed on its light receiving surface by the optical system 201 and performs photoelectric conversion, thereby generating image data (RAW image data) of the subject, and outputs the generated image data to the A/D conversion unit 203. Specifically, under the control of the processor 5, the imaging unit 202 images the subject at a reference frame rate, for example 60 fps, to generate image data of the subject. The imaging unit 202 is configured using: an imaging element 202a, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor), that photoelectrically converts the light received by each of a plurality of pixels arranged in a two-dimensional grid and generates an electrical signal; and a color filter 202b in which filter units, each including a plurality of first band filters (hereinafter "wide band filters") that transmit light in the primary color wavelength bands and second band filters (hereinafter "narrow band filters") that transmit narrow band light whose maximum lies outside the wavelength bands of the light transmitted by the first band filters, are arranged so as to correspond to the plurality of pixels.
FIG. 3 is a view schematically showing the configuration of the color filter 202b. As shown in FIG. 3, the color filter 202b is configured using filter units forming a predetermined array pattern in which one set consists of two wide band filters R that transmit the red component, eight wide band filters G that transmit the green component, two wide band filters B that transmit the blue component, and four narrow band filters X1 that transmit narrow band light. The individual filters forming the array pattern are arranged at positions corresponding to the pixels of the imaging element 202a, which are arranged in a two-dimensional grid. In the first embodiment, the peak wavelength of the narrow band light is between 395 nm and 435 nm. The image data generated by the imaging unit 202 through the color filter 202b configured in this way is subjected to predetermined image processing (for example, interpolation such as demosaicing) by the processor 5, and is thereby converted into a color wide band image and a narrow band image.
FIG. 4 is a diagram showing the relationship between transmittance and wavelength for each of the filters constituting the color filter 202b. In FIG. 4, the curve L_B shows the relationship between the transmittance of the wide band filter B and the wavelength, the curve L_G that of the wide band filter G, the curve L_R that of the wide band filter R, and the curve L_X1 that of the narrow band filter X1. In FIG. 4, the peak wavelength of the narrow band filter X1 is described as being between 395 nm and 435 nm.
As shown in FIG. 4, the spectral characteristics of the narrow band filter X1 are such that its wavelength transmission band is narrower than that of each of the wide band filter R, the wide band filter B, and the wide band filter G.
Returning to FIG. 2, the description of the configuration of the endoscope 2 will be continued. The A/D conversion unit 203 performs A/D conversion on the analog image data input from the imaging unit 202 and outputs the resulting digital image data to the processor 5. The light guide path 204 is configured using an illumination lens and a light guide, and propagates the illumination light emitted by the light source device 3 toward a predetermined region.
[Configuration of Processor]
Next, the main parts of the processor 5 will be described. The processor 5 includes an image processing unit 51, a recording unit 52, and a control unit 53.
The image processing unit 51 performs predetermined image processing on the digital image data input from the endoscope 2 and outputs the result to the display device 4. The image processing unit 51 has a separation unit 511, a demosaicing unit 512, an image generation unit 513, a calculation unit 514, a determination unit 515, and a display control unit 516.
When digital RAW data is input from the endoscope 2, the separation unit 511 separates the RAW data into mosaic-like channels and outputs the signal values of the separated channels to the demosaicing unit 512.
The demosaicing unit 512 performs demosaicing processing using the signal values of the channels separated by the separation unit 511 to generate an R image, a G image, a B image, and an X1 image, and outputs each of them to the image generation unit 513.
The image generation unit 513 generates a color wide band image using the R image, G image, and B image generated by the demosaicing unit 512, and generates a pseudo-color narrow band image using the G image and X1 image generated by the demosaicing unit 512.
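The construction of the pseudo-color narrow band image from the demosaiced G and X1 planes can be sketched as follows. The patent states only that the G and X1 images are combined; the channel assignment below (X1 to the red channel, G to the green and blue channels) is an illustrative assumption, not the mapping used by the apparatus.

```python
import numpy as np

def pseudo_color_narrowband(g_plane, x1_plane):
    """Stack the demosaiced G and X1 planes into an (H, W, 3) pseudo-color image.

    Assumed mapping (hypothetical): the narrow band X1 signal drives the red
    channel so that narrow band structures stand out, while the G plane fills
    the green and blue channels as a luminance-like background.
    """
    g = np.asarray(g_plane, dtype=float)
    x1 = np.asarray(x1_plane, dtype=float)
    return np.dstack([x1, g, g])


# Tiny example: a uniform G plane and an empty X1 plane.
out = pseudo_color_narrowband(np.ones((2, 2)), np.zeros((2, 2)))
```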
The calculation unit 514 calculates the feature amount of the narrow band image generated by the image generation unit 513. Specifically, the calculation unit 514 calculates the feature amount based on edges included in the narrow band image.
The determination unit 515 determines whether the feature amount of the narrow band image calculated by the calculation unit 514 is smaller than a threshold, and outputs the determination result to the display control unit 516.
The display control unit 516 controls the display mode of the display device 4. Based on the determination result of the determination unit 515, the display control unit 516 causes the display device 4 to display one of the wide band image and the narrow band image generated by the image generation unit 513 with emphasis. Specifically, the display control unit 516 enlarges the display area of one of the wide band image and the narrow band image generated by the image generation unit 513 on the display device 4. For example, when the determination unit 515 determines that the threshold is exceeded, the display control unit 516 causes the display device 4 to display the narrow band image in a display area larger than that of the wide band image.
The recording unit 52 records the image data generated by the endoscope 2, the programs executed by the endoscope system 1, and information being processed. The recording unit 52 is configured using a nonvolatile memory, a volatile memory, or the like.
The control unit 53 comprehensively controls the units constituting the endoscope system 1. The control unit 53 is configured using a CPU or the like, and controls, for example, the emission timing of the illumination light of the light source device 3 and the imaging timing of the imaging unit 202 of the endoscope 2.
[Processing of Processor]
Next, the display processing performed by the processor 5 on the display device 4 will be described. FIG. 5 is a flowchart showing an outline of the display processing performed by the processor 5 on the display device 4.
As shown in FIG. 5, when digital RAW data is input from the endoscope 2, the separation unit 511 separates the RAW data into mosaic-like channels (step S101). Specifically, the separation unit 511 separates the signal values of the RAW data into channels corresponding to the wide band filter R, the wide band filter G, the wide band filter B, and the narrow band filter X1.
Subsequently, the demosaicing unit 512 performs demosaicing processing using the signal values of the channels separated by the separation unit 511 to generate an R image, a G image, a B image, and an X1 image (step S102). The demosaicing may be performed by well-known linear interpolation, or the signal values of the wide band filters R and B may each be demosaiced with reference to the signal values of the wide band filter G.
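The per-channel interpolation mentioned above can be sketched as follows. This is a minimal stand-in for the well-known linear interpolation the text refers to, not the patented method: missing samples of one channel are filled with the mean of the measured samples in their 3x3 neighborhood, and the function name and mask representation are illustrative.

```python
import numpy as np

def interpolate_channel(values, mask):
    """Fill the missing samples of one color channel by neighbor averaging.

    `values` holds the raw sensor samples; `mask` is True where this channel
    was actually measured under the color filter array. Each missing pixel
    takes the mean of the measured pixels in its 3x3 neighborhood.
    """
    vals = np.asarray(values, dtype=float)
    msk = np.asarray(mask, dtype=bool)
    out = vals.copy()
    h, w = vals.shape
    for i in range(h):
        for j in range(w):
            if not msk[i, j]:
                i0, i1 = max(i - 1, 0), min(i + 2, h)
                j0, j1 = max(j - 1, 0), min(j + 2, w)
                neigh = vals[i0:i1, j0:j1][msk[i0:i1, j0:j1]]
                out[i, j] = neigh.mean() if neigh.size else 0.0
    return out


# 2x2 example: the diagonal pixels were measured, the others are filled in.
vals = np.array([[4.0, 0.0], [0.0, 8.0]])
mask = np.array([[True, False], [False, True]])
filled = interpolate_channel(vals, mask)
```

Running this on each of the R, G, B, and X1 channel masks yields the four full-resolution planes that step S102 passes on to the image generation unit.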
Thereafter, the image generation unit 513 generates a color wide band image using the R image, G image, and B image generated by the demosaicing unit 512 (step S103), and generates a pseudo-color narrow band image using the G image and X1 image generated by the demosaicing unit 512 (step S104).
Subsequently, the calculation unit 514 calculates the feature amount of the narrow band image generated by the image generation unit 513 (step S105). Specifically, the calculation unit 514 calculates the feature amount based on edges included in the narrow band image. More specifically, with X(i,j) denoting the pixel value of each pixel of the narrow band image, the calculation unit 514 calculates the edges of the narrow band image using a Sobel filter. The Sobel filter calculates an edge by multiplying each of the nine pixel values surrounding and including the pixel of interest in the narrow band image by a coefficient and summing the results. For example, with gHS denoting the horizontal sum, the calculation unit 514 multiplies each of the nine pixel values centered on the pixel of interest by the coefficients of the filter F1 shown in FIG. 6 and sums the results to obtain gHS. Similarly, with gVS denoting the vertical sum, the calculation unit 514 multiplies by the coefficients of the filter F2 shown in FIG. 7 and sums the results to obtain gVS. With g(i,j) denoting the edge at the pixel of interest (center pixel), the calculation unit 514 then calculates the edge by the following equation (1):
g(i,j) = (gHS^2 + gVS^2)^(1/2) ... (1)
Subsequently, the calculation unit 514 calculates the sum GHx of the edges of the narrow band image by the following equation (2):
GHx = Σ g(i,j) ... (2)
where Σ denotes the sum over all g(i,j).
In this way, the calculation unit 514 calculates the sum GHx of the edges of the narrow band image as the feature amount, based on the edges included in the narrow band image.
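The edge feature of equations (1) and (2) can be sketched as follows. The exact coefficients of the filters F1 and F2 in FIGS. 6 and 7 are not reproduced in the text, so the standard 3x3 Sobel kernels are assumed here; border pixels are skipped because border handling is not specified.

```python
import numpy as np

# Assumed stand-ins for filters F1 (horizontal) and F2 (vertical):
# the standard 3x3 Sobel kernels.
F1 = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)
F2 = np.array([[-1, -2, -1],
               [ 0,  0,  0],
               [ 1,  2,  1]], dtype=float)

def edge_feature(image):
    """Return GHx, the sum over all pixels of g(i,j) = sqrt(gHS^2 + gVS^2)."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    ghx = 0.0
    for i in range(1, h - 1):          # interior pixels only
        for j in range(1, w - 1):
            window = img[i - 1:i + 2, j - 1:j + 2]
            ghs = float(np.sum(window * F1))   # horizontal sum gHS
            gvs = float(np.sum(window * F2))   # vertical sum gVS
            ghx += (ghs ** 2 + gvs ** 2) ** 0.5   # equation (1)
    return ghx                                    # equation (2)


# A flat image contains no edges, so its feature amount is zero.
flat = np.full((5, 5), 7.0)
# A vertical step edge produces a strictly positive feature amount.
step = np.hstack([np.zeros((5, 3)), np.ones((5, 2))])
```

A large GHx indicates that the narrow band image contains many high-contrast structures (such as visible capillaries), which is what the subsequent threshold comparison in step S106 reacts to.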
Subsequently, the determination unit 515 determines whether the feature amount of the narrow band image calculated by the calculation unit 514 is smaller than the threshold (step S106). If the determination unit 515 determines that the feature amount is smaller than the threshold (step S106: Yes), the processor 5 proceeds to step S107 described later. On the other hand, if the determination unit 515 determines that the feature amount is not smaller than the threshold (step S106: No), the processor 5 proceeds to step S108 described later.
In step S107, the display control unit 516 causes the display device 4 to display the wide band image generated by the image generation unit 513 larger than the narrow band image generated by the image generation unit 513. Specifically, as shown in FIG. 8, the display control unit 516 displays the wide band image P1 larger than the narrow band image P2 in the display area 40 of the display device 4. After step S107, the processor 5 proceeds to step S109 described later.
In step S108, the display control unit 516 causes the display device 4 to display the narrow band image generated by the image generation unit 513 larger than the wide band image generated by the image generation unit 513. Specifically, as shown in FIG. 9, the display control unit 516 displays the narrow band image P2 larger than the wide band image P1 in the display area 40 of the display device 4. Thus, when the feature amount of the narrow band image P2 exceeds the predetermined threshold, the narrow band image P2 is displayed larger and emphasized over the wide band image P1; by paying attention to the narrow band image P2, the user can avoid overlooking a lesion in the subject. After step S108, the processor 5 proceeds to step S109 described later.
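The per-frame branch of steps S105 to S108 can be sketched as the following loop. This is a minimal sketch: the feature function is passed in as a stub, the layout labels are illustrative, and the threshold value is an assumption since the patent does not fix one.

```python
def display_loop(frames, feature_fn, threshold):
    """Sketch of the decision loop of FIG. 5 for a sequence of frames.

    feature_fn stands in for the edge-feature calculation of step S105.
    For each frame, the wide band image P1 gets the larger display area
    when the feature amount is below the threshold (step S107); otherwise
    the narrow band image P2 does (step S108).
    """
    layouts = []
    for frame in frames:
        feature = feature_fn(frame)                    # step S105
        if feature < threshold:                        # step S106: Yes
            layouts.append(("P1-large", "P2-small"))   # step S107
        else:                                          # step S106: No
            layouts.append(("P2-large", "P1-small"))   # step S108
    return layouts


# Two frames whose (stubbed) feature amounts fall on either side of the threshold.
chosen = display_loop([1.0, 9.0], lambda f: f, 5.0)
```

Because the decision is re-evaluated every frame, the layout switches back automatically once the narrow band feature amount drops below the threshold again.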
In step S109, when the observation of the subject is to be ended (step S109: Yes), the processor 5 ends this processing. On the other hand, when the observation is not to be ended (step S109: No), the processor 5 returns to step S101 described above.
According to the first embodiment of the present invention described above, when the determination unit 515 determines that the feature amount included in the narrow band image exceeds the threshold, the display control unit 516 causes the display device 4 to display the narrow band image P2 in a display area larger than that of the wide band image P1. The examination thus remains based on the wide band image P1, and the narrow band image P2 is emphasized on the display device 4 only when necessary, drawing the user's attention to it; this makes it possible to prevent a lesion from being missed while minimizing the user's workload during examination.
In the first embodiment of the present invention, the display control unit 516 displays the wide band image P1 larger than the narrow band image P2 on the display device 4 when the determination unit 515 determines that the feature amount of the narrow band image P2 is smaller than the threshold, and displays the narrow band image P2 larger than the wide band image P1 when the feature amount is determined to be at or above the threshold; however, the display method may be changed as appropriate according to the determination result of the determination unit 515. Specifically, when the determination unit 515 determines that the feature amount of the narrow band image P2 is smaller than the threshold, the display control unit 516 may reduce the narrow band image P2 and display it superimposed on the wide band image P1 in the display area 40 of the display device 4 (see FIG. 10); when the feature amount is determined to be at or above the threshold, the display control unit 516 may reduce the wide band image P1 and display it superimposed on the narrow band image P2 in the display area 40 of the display device 4 (see FIG. 11).
In the first embodiment of the present invention, the display control unit 516 may also blink the image displayed on the display device 4 according to the determination result of the determination unit 515. For example, as shown in FIG. 12, when the display device 4 displays the wide band image P1 and the narrow band image P2 side by side in the same display area 40 and the determination unit 515 determines that the feature amount of the narrow band image P2 is at or above the threshold, the edge T1 of the narrow band image P2 may be blinked on the display device 4 (FIG. 12 → FIG. 13). Of course, the display control unit 516 may blink both the wide band image P1 and the narrow band image P2 on the display device 4. This lets the user intuitively grasp that the narrow band image P2 contains a lesion. The display control unit 516 may also warn that the feature amount of the narrow band image P2 is at or above the threshold by displaying characters, figures, or the like in the display area of the display device 4.
(Embodiment 2)
Next, a second embodiment of the present invention will be described. The endoscope system according to the second embodiment has the same configuration as the endoscope system 1 according to the first embodiment described above, but performs different processing, which is described below. The same components as in the first embodiment are denoted by the same reference numerals, and their description is omitted.
Processor Processing

FIG. 14 is a flowchart showing an outline of the display processing performed on the display device 4 by the processor 5 according to the second embodiment of the present invention. In FIG. 14, steps S201 to S204 correspond to steps S101 to S104 described above, respectively.
In step S205, the calculation unit 514 calculates a feature amount included in the narrow band image. Specifically, the calculation unit 514 calculates edges by the same method as in the first embodiment described above, and calculates the sum GHx of the luminance of the edges as the feature amount.
Subsequently, the calculation unit 514 calculates a feature amount based on the edges included in the wide band image (step S206). Specifically, the calculation unit 514 calculates the edges of the wide band image in the same manner as the narrow band image edge calculation of the first embodiment described above, and calculates the sum GHg of the edges as the feature amount.
After that, the calculation unit 514 calculates a value (GHx/GHg) obtained by dividing the feature amount of the narrow band image calculated in step S205 by the feature amount of the wide band image calculated in step S206 described above (step S207).
Subsequently, the determination unit 515 determines whether the value (GHx/GHg) calculated by the calculation unit 514 in step S207 described above is smaller than a threshold (step S208). When the determination unit 515 determines that the value calculated by the calculation unit 514 is smaller than the threshold (step S208: Yes), the processor 5 proceeds to step S209 described later. On the other hand, when the determination unit 515 determines that the value calculated by the calculation unit 514 is not smaller than the threshold (step S208: No), the processor 5 proceeds to step S210 described later.
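The ratio feature of steps S205 to S207 can be sketched as below. The patent does not specify the edge operator, so a simple finite-difference gradient magnitude is used as a stand-in; the function names and the `eps` guard are illustrative assumptions.

```python
import numpy as np

def edge_sum(gray):
    """Sum of gradient-magnitude 'edge luminance' over a grayscale image.

    A finite-difference stand-in for the edge extraction of steps
    S205/S206 (the patent leaves the exact operator open).
    """
    gy, gx = np.gradient(gray.astype(float))
    return float(np.hypot(gx, gy).sum())

def relative_edge_feature(narrow, wide, eps=1e-9):
    """GHx / GHg: narrow band edge sum normalized by the wide band
    edge sum (step S207). eps guards against a flat wide band image."""
    return edge_sum(narrow) / (edge_sum(wide) + eps)
```

Normalizing by the wide band edge sum makes the feature insensitive to overall scene texture: only structure that is strong in the narrow band image relative to the wide band image raises the value compared against the threshold in step S208.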
Steps S209 to S211 correspond to steps S107 to S109 described above, respectively.
According to the second embodiment of the present invention described above, when the determination unit 515 determines that the feature amount included in the narrow band image exceeds the threshold, the display control unit 516 causes the display device 4 to display the narrow band image P2 with a larger display area than the wide band image P1. As a result, while the examination is basically performed with the wide band image P1, the narrow band image P2 is emphasized on the display device 4 only when necessary, drawing the user's attention to the narrow band image P2; this prevents a lesion from being missed while minimizing the user's workload during the examination.
Third Embodiment

Next, a third embodiment of the present invention will be described. The third embodiment differs from the endoscope system 1 according to the first embodiment described above in the configuration of the color filter 202b and in the processing executed by the processor. In the following, the configuration of the color filter according to the third embodiment is described first, and then the processing executed by the processor according to the third embodiment is described. The same components as those in the first embodiment described above are denoted by the same reference numerals, and their description is omitted.
FIG. 15 is a diagram schematically showing the configuration of a color filter 202c according to the third embodiment of the present invention. As shown in FIG. 15, the color filter 202c is configured using filter units in which, in place of the narrow band filter X1 described above, four narrow band filters X2 that transmit narrow band light are arranged as one set. Here, the peak wavelength of the narrow band light in the third embodiment is between 790 nm and 820 nm. Image data generated by the image sensor 202a through the color filter 202c configured in this manner is converted into a color wide band image and an infrared narrow band image by the processor 5 performing predetermined image processing (for example, interpolation such as demosaicing).
FIG. 16 is a diagram showing the relationship between the transmittance and the wavelength of each of the filters constituting the color filter 202c. In FIG. 16, the curve LB shows the relationship between the transmittance and the wavelength of the wide band filter B, the curve LG shows that of the wide band filter G, the curve LR shows that of the wide band filter R, and the curve LX2 shows the relationship between the transmittance and the wavelength of the narrow band filter X2. In FIG. 16, the peak wavelength of the narrow band filter X2 is described as being between 790 nm and 820 nm.
As shown in FIG. 16, the spectral characteristics of the narrow band filter X2 are such that the width of its wavelength transmission band is narrower than that of each of the wide band filter R, the wide band filter B and the wide band filter G, and its transmission spectrum has its maximum on the long wavelength side. Furthermore, the narrow band filter X2 transmits light (infrared light) whose transmission spectrum has its maximum outside the wavelength band of the light (visible light) transmitted through the wide band filters.
Processor Processing

Next, the display processing performed by the processor 5 on the display device 4 will be described. FIG. 17 is a flowchart showing an outline of the display processing performed by the processor 5 on the display device 4.
In FIG. 17, steps S301 to S304 correspond to steps S101 to S104 in FIG. 5 described above, respectively.
In step S305, the calculation unit 514 calculates, as a feature amount, the number of pixels whose pixel values exceed a predetermined value among the pixels of the narrow band image generated by the image generation unit 513.
Subsequently, the determination unit 515 determines whether the number of pixels calculated by the calculation unit 514 is smaller than a threshold (step S306). Here, the threshold is 20% of the total number of pixels of the narrow band image; the threshold can be changed as appropriate. When the determination unit 515 determines that the number of pixels calculated by the calculation unit 514 is smaller than the threshold (step S306: Yes), the processor 5 proceeds to step S307 described later. On the other hand, when the determination unit 515 determines that the number of pixels calculated by the calculation unit 514 is not smaller than the threshold (step S306: No), the processor 5 proceeds to step S308 described later.
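The pixel-count feature of steps S305 and S306 can be sketched as follows. The function names and the use of a NumPy array for the narrow band image are illustrative assumptions; the 20% ratio is the default threshold stated above.

```python
import numpy as np

def bright_pixel_feature(narrow_img, pixel_value_cutoff):
    """Number of pixels whose value exceeds a predetermined value
    (step S305)."""
    return int((narrow_img > pixel_value_cutoff).sum())

def emphasize_narrow(narrow_img, pixel_value_cutoff, ratio=0.20):
    """True when the bright-pixel count is not smaller than 20% of all
    pixels (step S306: No branch), i.e. the narrow band image should be
    emphasized in step S308."""
    count = bright_pixel_feature(narrow_img, pixel_value_cutoff)
    return count >= ratio * narrow_img.size
```

A count of bright pixels suits the infrared narrow band of this embodiment, where a lesion is expected to appear as an extended bright region rather than as fine edge structure.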
Steps S307 to S309 correspond to steps S107 to S109 in FIG. 5 described above, respectively.
According to the third embodiment of the present invention described above, the peak wavelength of the light transmitted by the narrow band filter X2 constituting the color filter 202c is set between 790 nm and 820 nm, and the calculation unit 514 calculates, as the feature amount, the number of pixels whose pixel values exceed a predetermined value. As a result, while the examination is basically performed with the wide band image P1, the infrared narrow band image P2 is emphasized on the display device 4 only when necessary, drawing the user's attention to the narrow band image; this prevents a lesion from being missed while minimizing the user's workload during the examination.
(Other Embodiments)

In the present invention, the wide band color filters are configured as primary color filters; however, for example, complementary color filters (Cy, Mg, Ye) that transmit light having complementary color wavelength components may be used. Furthermore, a color filter (R, G, B, Or, Cy) configured by primary color filters and filters (Or, Cy) that transmit light having orange and cyan wavelength components may be used. Furthermore, a color filter (R, G, B, W) configured by primary color filters and a filter (W) that transmits light having a white wavelength component may be used.
In the present invention, the color filter is provided with a narrow band filter that transmits one type of wavelength band; however, a plurality of narrow band filters may be provided in the color filter. For example, the narrow band filter X1 of the first embodiment described above and the narrow band filter X2 of the third embodiment described above may both be provided.
In the present invention, the image processing apparatus has been described as a processor used in an endoscope system; however, the present invention can also be applied to, for example, a capsule endoscope that can be inserted into a body cavity of a subject.
In the present specification, the operations in each of the operation flowcharts described above are described using terms such as "first", "next", "subsequently" and "thereafter" for convenience; this does not mean that it is essential to perform the operations in this order.
The method of each process performed by the image processing apparatus in the embodiments described above, that is, the processing shown in each flowchart, can be stored as a program executable by a control unit such as a CPU. The program can also be stored in and distributed on a storage medium of an external storage device, such as a memory card (ROM card, RAM card, etc.), a magnetic disk (floppy disk (registered trademark), hard disk, etc.), an optical disc (CD-ROM, DVD, etc.) or a semiconductor memory. A control unit such as a CPU can then read the program stored in the storage medium of the external storage device and execute the above-described processing by having its operation controlled by the read program.
The present invention is not limited to the embodiments and modifications described above as they are; at the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention. In addition, various inventions can be formed by appropriately combining a plurality of the constituent elements disclosed in the embodiments described above. For example, some constituent elements may be deleted from all the constituent elements described in the embodiments and modifications described above. Furthermore, the constituent elements described in the respective embodiments and modifications may be combined as appropriate.
Further, a term that is described at least once in the specification or the drawings together with a broader or synonymous different term can be replaced with that different term anywhere in the specification or the drawings. In this way, various modifications and applications are possible without departing from the spirit of the invention.
Reference Signs List
1 endoscope system
2 endoscope
3 light source device
4 display device
5 processor
51 image processing unit
52 recording unit
53 control unit
201 optical system
202 imaging unit
202a image sensor
202b, 202c color filter
203 A/D conversion unit
204 light guide
511 separation unit
512 demosaicing unit
513 image generation unit
514 calculation unit
515 determination unit
516 display control unit
Claims (12)
An image processing apparatus that performs predetermined image processing on image data generated by an image sensor in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined array pattern, each filter forming the array pattern being arranged at a position corresponding to one of a plurality of pixels arranged in a two-dimensional grid, and that causes a display device to display an image corresponding to the image data subjected to the image processing, the image processing apparatus comprising:
an image generation unit configured to generate a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter based on the image data generated by the image sensor;
a calculation unit configured to calculate a feature amount included in the narrow band image generated by the image generation unit;
a determination unit configured to determine whether the feature amount calculated by the calculation unit exceeds a threshold; and
a display control unit configured to cause the display device to display one of the wide band image and the narrow band image with greater emphasis than the other based on a determination result of the determination unit.
A display control method executed by an image processing apparatus that performs predetermined image processing on image data generated by an image sensor in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined array pattern, each filter forming the array pattern being arranged at a position corresponding to one of a plurality of pixels arranged in a two-dimensional grid, and that causes a display device to display an image corresponding to the image data subjected to the image processing, the display control method comprising:
an image generation step of generating a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter based on the image data generated by the image sensor;
a calculation step of calculating a feature amount included in the narrow band image generated in the image generation step;
a determination step of determining whether the feature amount calculated in the calculation step exceeds a threshold; and
a display control step of causing the display device to display one of the wide band image and the narrow band image with greater emphasis than the other based on a determination result of the determination step.
A program that causes an image processing apparatus, which performs predetermined image processing on image data generated by an image sensor in which a plurality of wide band filters transmitting light in the primary color wavelength bands and a narrow band filter transmitting at least one narrow band light form a predetermined array pattern, each filter forming the array pattern being arranged at a position corresponding to one of a plurality of pixels arranged in a two-dimensional grid, and which causes a display device to display an image corresponding to the image data subjected to the image processing, to execute:
an image generation step of generating a wide band image corresponding to the wide band filters and a narrow band image corresponding to the narrow band filter based on the image data generated by the image sensor;
a calculation step of calculating a feature amount included in the narrow band image generated in the image generation step;
a determination step of determining whether the feature amount calculated in the calculation step exceeds a threshold; and
a display control step of causing the display device to display one of the wide band image and the narrow band image with greater emphasis than the other based on a determination result of the determination step.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/071164 WO2017017735A1 (en) | 2015-07-24 | 2015-07-24 | Image processing device, display control method and program |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/071164 WO2017017735A1 (en) | 2015-07-24 | 2015-07-24 | Image processing device, display control method and program |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017017735A1 true WO2017017735A1 (en) | 2017-02-02 |
Family
ID=57885113
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/071164 Ceased WO2017017735A1 (en) | 2015-07-24 | 2015-07-24 | Image processing device, display control method and program |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017017735A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010142547A (en) * | 2008-12-22 | 2010-07-01 | Fujifilm Corp | Endoscopic image processing apparatus and method, and program. |
| JP2010172673A (en) * | 2009-02-02 | 2010-08-12 | Fujifilm Corp | Endoscope system, processor for endoscope, and endoscopy aiding method |
| JP2010184057A (en) * | 2009-02-13 | 2010-08-26 | Fujifilm Corp | Image processing method and device |
| JP2011160848A (en) * | 2010-02-05 | 2011-08-25 | Olympus Corp | Image processing device, endoscope system, program, and image processing method |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10231658B2 (en) | Endoscope system, processor device for endoscope system, operation method for endoscope system, and operation method for processor device | |
| JP5789280B2 (en) | Processor device, endoscope system, and operation method of endoscope system | |
| JP6196900B2 (en) | Endoscope device | |
| EP2556790A1 (en) | Endoscopic device | |
| JP6471173B2 (en) | Image processing apparatus, operation method of endoscope apparatus, image processing program, and endoscope apparatus | |
| JP6329715B1 (en) | Endoscope system and endoscope | |
| JP6401800B2 (en) | Image processing apparatus, operation method of image processing apparatus, operation program for image processing apparatus, and endoscope apparatus | |
| US10070771B2 (en) | Image processing apparatus, method for operating image processing apparatus, computer-readable recording medium, and endoscope device | |
| JPWO2015093295A1 (en) | Endoscope device | |
| JP7387859B2 (en) | Medical image processing device, processor device, endoscope system, operating method and program for medical image processing device | |
| CN113164054B (en) | Medical imaging system and method | |
| WO2016084257A1 (en) | Endoscope apparatus | |
| JP2022136171A (en) | MEDICAL IMAGE PROCESSING APPARATUS, ENDOSCOPE SYSTEM, AND METHOD OF OPERATION OF MEDICAL IMAGE PROCESSING APPARATUS | |
| CN113453607A (en) | Medical image processing apparatus and method | |
| JPWO2017221335A1 (en) | Image processing apparatus, image processing method, and program | |
| JP2011194082A (en) | Endoscope image-correcting device and endoscope apparatus | |
| US10863149B2 (en) | Image processing apparatus, image processing method, and computer readable recording medium | |
| US11363943B2 (en) | Endoscope system and operating method thereof | |
| WO2017017735A1 (en) | Image processing device, display control method and program | |
| JP6535435B2 (en) | Processor and endoscope system | |
| JP6681971B2 (en) | Processor and endoscope system | |
| JP6801990B2 (en) | Image processing system and image processing equipment | |
| CN121196429A (en) | Image processing device, endoscope system and working method of image processing device | |
| WO2018235153A1 (en) | Endoscope system, display method and program | |
| CN112739250A (en) | Medical image processing device, processor device, medical image processing method and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15899569 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15899569 Country of ref document: EP Kind code of ref document: A1 |