
WO2017064746A1 - Imaging device, endoscopic device and imaging method - Google Patents


Info

Publication number
WO2017064746A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
wavelength band
imaging
mask
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/078871
Other languages
French (fr)
Japanese (ja)
Inventor
愼一 今出
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to PCT/JP2015/078871 priority Critical patent/WO2017064746A1/en
Publication of WO2017064746A1 publication Critical patent/WO2017064746A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • G03B35/12Stereoscopic photography by simultaneous recording involving recording of different viewpoint images in different colours on a colour film
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/18Stereoscopic photography by simultaneous viewing
    • G03B35/26Stereoscopic photography by simultaneous viewing using polarised or coloured light separating different viewpoint images
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B9/00Exposure-making shutters; Diaphragms
    • G03B9/08Shutters
    • G03B9/10Blade or disc rotating or pivoting about axis normal to its plane
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/555Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/75Circuitry for compensating brightness variation in the scene by influencing optical camera components

Definitions

  • the present invention relates to an imaging device, an endoscope device, an imaging method, and the like.
  • Various methods for optically 3D measuring the surface shape of an object have been proposed.
  • For example, a method has been proposed in which active pattern illumination is projected onto an object and three-dimensional distance measurement is performed by binocular stereoscopic (stereo) vision.
  • In stereo measurement, matching processing is performed using characteristic parts of the image (for example, edges), and the distance to the object is obtained.
  • By projecting pattern illumination, characteristic parts are intentionally added to the subject, so that distance measurement can be performed accurately even in a part that has no feature of its own (for example, a flat part).
  • Patent Documents 1 and 2 disclose a technique in which the left and right imaging optical paths are switched in time with a mechanical shutter, and the left and right images are acquired in a time-sharing manner.
  • Patent Document 2 discloses a technique of inserting an RG filter into the left half of a single imaging optical path and a GB filter into the right half, and separating the left and right images from the R image and the B image of the captured image.
  • For normal observation, the RG filter and the GB filter are retracted from the imaging optical path, and an observation image is acquired.
  • In an inspection, an observation image that is not a stereo image may be captured, and stereo measurement may be performed only on a portion to be examined in detail.
  • Switching between flat illumination and active pattern illumination is conceivable, but this complicates the illumination mechanism.
  • Convenience is improved if real-time stereo measurement can be performed while normal observation is in progress, and it is preferable that no illumination switching is required at that time.
  • According to some aspects of the present invention, it is possible to provide an imaging apparatus, an endoscope apparatus, an imaging method, and the like that can capture a normal observation image while illuminating the subject with active pattern illumination.
  • One embodiment of the present invention relates to an imaging device including: an imaging unit that captures a captured image including a first color image, a second color image on a longer wavelength side than the first color, and a third color image on a longer wavelength side than the second color, and that can capture the first color image and the third color image as a stereo image; and an illumination unit that irradiates a subject with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color, and in a second wavelength band that is included in the wavelength band of the third color and not included in the wavelength band of the second color.
  • According to this embodiment, the subject is irradiated with pattern illumination having a given light amount distribution in a first wavelength band included in the wavelength band of the first color and in a second wavelength band included in the wavelength band of the third color. Since the first wavelength band and the second wavelength band are not included in the wavelength band of the second color, the second color image is not affected by the given light amount distribution of the pattern illumination. This makes it possible to capture a normal observation image while performing illumination with active pattern illumination.
  • Another aspect of the present invention relates to an endoscope apparatus that includes the imaging apparatus described above.
  • Still another aspect of the present invention relates to an imaging method for capturing a captured image including a first color image, a second color image on a longer wavelength side than the first color, and a third color image on a longer wavelength side than the second color, in which the first color image and the third color image can be captured as a stereo image, while irradiating a subject with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color, and in a second wavelength band that is included in the wavelength band of the third color and not included in the wavelength band of the second color.
  • An example of pattern illumination. An example of the light quantity waveform of pattern illumination, the reflection coefficient distribution of a subject, and the captured image.
  • Explanatory drawings of a correction process.
  • A second configuration example of the illumination unit. Configuration examples of an imaging unit.
  • A first detailed configuration example of a fixed mask and a movable mask.
  • A second detailed configuration example of a fixed mask and a movable mask.
  • an industrial endoscope apparatus will be described below as an application example of the present invention.
  • The present invention is not limited to application to an industrial endoscope apparatus, and is applicable to any device that uses a stereo shooting method (a method of capturing two images with an imaging system having parallax, detecting the phase difference between the two images, and acquiring subject distance information), a three-dimensional measuring device that measures a three-dimensional shape, or an imaging device having a three-dimensional measuring function, for example, a medical endoscope device, a microscope, an industrial camera, a robot vision function, and the like.
  • Pattern illumination: In order to obtain a high-quality observation image, it is generally premised that the subject is illuminated uniformly. However, when performing stereo measurement, distance information to the subject cannot be obtained unless the image has features from which a phase difference is easily obtained. Therefore, a method of irradiating the subject with intentionally characterized pattern illumination is used.
  • an observation mode for capturing an observation image using white light and a measurement mode for performing stereo measurement by the color phase difference method are switched and used.
  • With conventional pattern illumination, it is necessary to have a function of switching the illumination itself so that uniform illumination is performed in the observation mode and pattern illumination is performed in the measurement mode.
  • In the present embodiment, a method is used in which pattern illumination is always performed, in both the observation mode and the measurement mode, and the pattern in the observation mode is erased by a spectral filter or by image processing. Thereby, the illumination switching function can be made unnecessary.
  • this method will be described.
  • FIG. 1 schematically shows an example of pattern illumination.
  • FIG. 1 shows a plan view of the pattern illumination PL (for example, a plan view when projected onto a plane perpendicular to the optical axis of the imaging system) and an example of the light quantity characteristic in the AA section.
  • x is a position (coordinate) in a direction perpendicular to the optical axis.
  • The illumination pattern consists of regularly arranged small circular patterns DT in which the brightness changes.
  • the outside of the small circle pattern DT is illumination with flat brightness, and the inside of the small circle pattern DT is illumination darker than that.
  • When this pattern illumination is projected onto the subject, a waveform obtained by multiplying the light quantity of the pattern illumination PL by the reflection coefficient distribution of the subject is obtained as the imaging waveform (sensor output), as shown in FIG. 2.
  • the sensor output in FIG. 2 represents a pixel value or a luminance value (or brightness of image formation on the sensor surface) at a pixel corresponding to the position x.
  • a solid line indicates a sensor output when pattern illumination is performed, and a dotted line indicates a sensor output when flat illumination is performed.
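The relationship described above (the sensor output as the product of the pattern-illumination light quantity and the subject's reflection coefficient) can be illustrated with a small numerical sketch. This is only an illustrative model under assumed waveforms, not part of the patent disclosure; the values and profiles are hypothetical.

```python
import numpy as np

# Hypothetical 1-D position axis x across the sensor (arbitrary units).
x = np.linspace(0.0, 10.0, 1000)

# Pattern illumination PL: flat brightness with regularly spaced darker dots DT
# (modeled here as Gaussian dips in the light quantity; an illustrative assumption).
illumination = np.ones_like(x)
for center in np.arange(0.5, 10.0, 1.0):          # regularly arranged dots
    illumination -= 0.4 * np.exp(-((x - center) ** 2) / (2 * 0.05 ** 2))

# Assumed reflection coefficient distribution of the subject (smooth, arbitrary).
reflectance = 0.6 + 0.3 * np.sin(0.8 * x)

# Imaging waveform (sensor output) is the product of the two, as in FIG. 2.
sensor_output_pattern = illumination * reflectance   # solid line (pattern illumination)
sensor_output_flat = 1.0 * reflectance               # dotted line (flat illumination)
```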
  • FIG. 3 shows a first spectral characteristic example of pattern illumination and a first spectral characteristic example of a pupil in the observation mode and the stereo measurement mode.
  • Dotted lines with B, G, R, and IR symbols represent spectral characteristics of the color filter of the image sensor.
  • B, G, and R are spectral characteristics of the blue, green, and red filters of the image sensor, respectively.
  • IR is an infrared sensitivity characteristic and passes through any of the blue, green, and red filters.
  • The wavelength band is divided into five bands {Pv1, Pb, Pv2, Pr, Pir} in association with the spectral characteristics of the color filters of the imaging sensor.
  • The illumination light in the bands {Pv1, Pv2, Pir} is non-pattern light (flat light) with a uniform light amount distribution.
  • The illumination light in the bands {Pb, Pr} is pattern light. Light having a spectral characteristic obtained by combining these becomes the illumination light to the subject.
  • The bands Pv1 and Pb are set to bands that pass through the blue filter of the image sensor but do not pass through the green filter, the band Pv2 is set to a band that passes through the green filter, and the bands Pr and Pir are set to bands that pass through the red filter but do not pass through the green filter.
  • the band Pir includes an infrared sensitivity characteristic IR. Note that the band Pir may not include the infrared sensitivity characteristic IR.
  • the bands Pb and Pr are set to wavelength bands that do not interfere with the spectral characteristic G of the green filter of the image sensor.
  • the bands Pb and Pr are set to narrow bands such as several nm to several tens of nm, for example.
  • For the band Pb, it is desirable to select a wavelength range in which light is received only by the blue pixel (spectral characteristic B) of the image sensor and the sensitivity is relatively good.
  • For the band Pr, it is desirable to select a wavelength range in which light is received only by the red pixel (spectral characteristic R) of the imaging sensor and the sensitivity is relatively good. That is, since the left pupil image and the right pupil image must be separated by different color pixels of the image sensor, {Pb, Pr} should be selected in wavelength regions that do not share light receiving sensitivity with each other.
  • Since the spectral components {Pv1, Pv2, Pir} of the non-pattern illumination light are the components of a normal observation image, it is desirable that they cover as much as possible of the wavelength ranges of the red, green, and blue pixels (spectral characteristics R, G, and B).
  • For example, one approach is to use standard laser light sources with wavelengths of 450 nm and 660 nm for {Pb, Pr} so as to obtain narrow-band light, while the spectral components {Pv1, Pv2, Pir} cover as many of the remaining wavelength components as possible.
  • the relationship between the spectral characteristics FL, FC, FR of the pupil and the spectral characteristics of the illumination light is set as in the following formula (1).
  • In the observation mode, the spectral characteristic FC corresponds to the wavelength bands {Pv1, Pv2, Pir}, and an observation image by flat illumination is obtained.
  • In the stereo measurement mode, shooting is performed with the left pupil of the spectral characteristic FL and the right pupil of the spectral characteristic FR.
  • the spectral characteristics FL and FR correspond to the wavelength bands Pb and Pr, and a stereo image by active pattern illumination is obtained.
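The correspondence just described between the pupil spectral characteristics and the illumination bands (the content of formula (1), whose exact notation is not reproduced in this text) can be restated informally as set relations. The sketch below is only an illustrative restatement of the band assignments of FIG. 3; the band labels are used as symbols rather than measured wavelengths, and the per-channel assignments are simplified assumptions.

```python
# Pupil spectral characteristics of the first example, as sets of band labels.
FC = {"Pv1", "Pv2", "Pir"}   # central pupil: non-pattern (flat) light only
FL = {"Pb"}                  # left pupil: pattern light in the blue-only band
FR = {"Pr"}                  # right pupil: pattern light in the red-only band

# Simplified color-filter coverage of the image sensor (which bands each channel sees).
blue_channel  = {"Pv1", "Pb", "Pir"}
green_channel = {"Pv2", "Pir"}
red_channel   = {"Pr", "Pir"}

pattern_bands = FL | FR

# Design conditions of the first spectral characteristic example:
assert pattern_bands & green_channel == set()    # Pb, Pr never reach the green image
assert FC & pattern_bands == set()               # the observation pupil excludes the pattern
assert FL <= blue_channel and FR <= red_channel  # stereo channels separate by color pixel
```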
  • The observation mode is a mode in which monocular imaging is performed through the central aperture 23 (spectral characteristic FC) of the fixed mask 20, as shown in FIGS. 11, 13, and 15. The stereo measurement mode is a mode in which stereo imaging is performed through the left pupil aperture 21 (spectral characteristic FL) and the right pupil aperture 22 (spectral characteristic FR) of the fixed mask 20, as shown in FIGS. 12, 14, and 16. Details of the observation mode and the stereo measurement mode will be described later.
  • FIG. 4 shows the relationship between the first spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
  • the dotted line virtually indicates the sensor output when flat illumination is performed.
  • the active pattern illumination of the present embodiment is applied to the imaging unit of FIGS. 11 to 16 as an example.
  • The configuration of the imaging unit is not limited to those of FIGS. 11 to 16; any imaging unit can be used as long as it can capture an observation image and a stereo image based on the spectral characteristics FL and FR.
  • the illumination light is the same regardless of the mode, and the active pattern illumination is projected onto the subject in both the observation mode and the stereo measurement mode.
  • The reflected light by the illumination light having the spectral components {Pv1, Pv2, Pir} of the non-pattern light passes through the pupil center optical path (spectral characteristic FC), and a captured image {Vr, Vg, Vb} is obtained.
  • Vr is a red image obtained by the pixels having the red filter of the image sensor, Vg is a green image obtained by the pixels having the green filter, and Vb is a blue image obtained by the pixels having the blue filter.
  • Since the bands of the active pattern are removed by the spectral characteristic FC, an observation image that is not affected by the active pattern is obtained.
  • The reflected light by the illumination light having the spectral components {Pb, Pr} of the pattern light passes through the left and right pupil optical paths, and a captured image {Mr, Mb} is obtained.
  • Mb is an image formed by the left pupil optical path (spectral characteristic FL), and Mr is an image formed by the right pupil optical path (spectral characteristic FR).
  • The captured images {Mr, Mb} can be easily matched and phase difference detection can be easily performed.
  • As described above, the imaging device of the present embodiment includes an imaging unit that captures a captured image (Vb, Vg, Vr) including the first color image, the second color image, and the third color image and that can capture the first color image and the third color image as a stereo image (Mb, Mr), and an illumination unit that irradiates the subject with pattern illumination having a given light amount distribution in the first wavelength band Pb and the second wavelength band Pr.
  • the second color (green) image is an image on the longer wavelength side than the first color (blue)
  • the third color (red) image is an image on the longer wavelength side than the second color.
  • the first wavelength band Pb is a band that is included in the first color wavelength band (spectral characteristic B band) and not included in the second color wavelength band (spectral characteristic G band).
  • the second wavelength band Pr is a band that is included in the third color wavelength band (spectral characteristic R band) and not included in the second color wavelength band (spectral characteristic G band).
  • Since the first wavelength band Pb and the second wavelength band Pr are not included in the wavelength band of the second color (green), at least the pattern of the given light quantity distribution is not reflected in the second color image. Thereby, it is possible to switch between stereo measurement and normal observation without switching between pattern illumination and flat illumination.
  • The stereo image is composed of a first color (blue) image and a third color (red) image; the first color image captures the pattern in the first wavelength band Pb, and the third color image captures the pattern in the second wavelength band Pr.
  • In this way, a stereo image to which characteristic features are intentionally added by the pattern is acquired, and high-precision stereo measurement can be performed by applying matching processing to it.
  • In the first spectral characteristic example, the observation image is captured through a spectral filter (spectral characteristic FC) that does not pass the first wavelength band Pb and the second wavelength band Pr, so an observation image in which the pattern due to the given light amount distribution is not captured is obtained. That is, although an observation image covering substantially the entire band of white light is photographed, the bands Pb and Pr in which the pattern exists are not captured in the first color (blue) and third color (red) images, and an observation image from which the pattern has been removed is obtained.
  • Since a pattern with the given light quantity distribution is not provided in the wavelength band of the second color, it is possible to obtain an observation image as in flat illumination while performing pattern illumination.
  • the first wavelength band is the wavelength band Pb in FIG. 3 and the second wavelength band is the wavelength band Pr in FIG. 3, but the present invention is not limited to this.
  • the first wavelength band may be the wavelength band Pb1 in FIG. 5
  • The second wavelength band may be the wavelength band Pr2 in FIG. 5.
  • a given light amount distribution (a light amount distribution of a given shape) is a distribution having a light / dark (light amount) boundary or a distribution having an edge portion where the light amount changes abruptly.
  • a plurality of parts are provided in the irradiation area of the pattern illumination.
  • this given light quantity distribution is given only to specific wavelength bands Pb and Pr.
  • small circle patterns DT darker than the surroundings are regularly arranged in the wavelength bands Pb and Pr, but the given light quantity distribution is not limited to this.
  • the pattern DT may not be a small circle, the arrangement of the pattern DT may not be regular, and the inside of the pattern DT may be brighter than the outside.
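As an illustration of the kind of light quantity distribution described above (flat brightness with regularly arranged darker small circles, applied only to the pattern bands), a mask of this type could be generated numerically as follows. This is a purely hypothetical sketch; it is not the mask pattern of the embodiment, and the pitch, radius, and darkness values are arbitrary.

```python
import numpy as np

def dot_pattern_mask(height, width, pitch=32, radius=6, dark_level=0.5):
    """Flat illumination (1.0) with regularly arranged darker circular dots DT."""
    yy, xx = np.mgrid[0:height, 0:width]
    mask = np.ones((height, width), dtype=np.float64)
    for cy in range(pitch // 2, height, pitch):
        for cx in range(pitch // 2, width, pitch):
            inside = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
            mask[inside] = dark_level      # inside of each small circle is darker
    return mask

# The pattern is applied only to the bands Pb and Pr; the other bands stay flat.
pattern_band_profile = dot_pattern_mask(240, 320)
flat_band_profile = np.ones((240, 320))
```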
  • the imaging unit switches between a stereo mode for capturing a stereo image and a non-stereo mode for capturing a captured image with a single eye.
  • In the imaging unit described later with reference to FIGS. 11 to 16, in the non-stereo mode (observation mode), monocular imaging is performed with the central aperture 23 (spectral characteristic FC) of the fixed mask 20, and in the stereo mode (stereo measurement mode), stereo imaging is performed with the left pupil aperture 21 (spectral characteristic FL) and the right pupil aperture 22 (spectral characteristic FR) of the fixed mask 20.
  • the imaging unit to which the pattern illumination of the present embodiment can be applied is not limited to this, and any imaging unit that captures the first color image and the third color image as a stereo image in the stereo mode may be used.
  • In the first spectral characteristic example, the monocular optical path in the non-stereo mode passes the wavelength bands {Pv1, Pv2, Pir}, that is, the wavelength bands of the first color, the second color, and the third color (the white light wavelength bands) excluding the first wavelength band Pb and the second wavelength band Pr.
  • In the wavelength bands {Pv1, Pv2, Pir} through which the monocular optical path passes, no pattern based on the given light quantity distribution is added.
  • Therefore, in the non-stereo mode, an observation image as in flat illumination can be captured even though pattern illumination is being performed.
  • Since the bands Pb and Pr are narrow, the remaining wavelength bands {Pv1, Pv2, Pir} substantially cover the wavelength band of white light. Therefore, it is possible to obtain an image that is not inferior to a captured image obtained by white light illumination.
  • the imaging apparatus may include a phase difference detection unit 330 and an image output unit (color image generation unit 320).
  • the phase difference detection unit 330 detects a phase difference between the first color image and the third color image captured in the stereo mode.
  • the image output unit outputs an image for observation based on the captured image (first color to third color image) captured in the non-stereo mode.
  • Since the pattern is captured in the stereo image, the phase difference detection unit 330 can detect the phase difference with high accuracy. Since the pattern is not captured in the image captured in the non-stereo mode even though pattern illumination is being performed, an image suitable for observation can be output by the image output unit. One possible matching approach is sketched below.
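The phase difference detection mentioned above could, for instance, be performed by block matching along the parallax direction between the two stereo color images. The sketch below is a generic ZNCC-based search written for illustration; it is an assumed approach with arbitrary window size and search range, not the actual implementation of the phase difference detection unit 330.

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def phase_difference(mb, mr, y, x, half=8, max_shift=32):
    """Search the horizontal shift of Mr that best matches Mb around pixel (y, x)."""
    ref = mb[y - half:y + half + 1, x - half:x + half + 1]
    best_shift, best_score = 0, -1.0
    for s in range(-max_shift, max_shift + 1):
        xc = x + s
        cand = mr[y - half:y + half + 1, xc - half:xc + half + 1]
        if cand.shape != ref.shape:
            continue                      # skip shifts that fall off the image
        score = zncc(ref, cand)
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift, best_score
```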
  • In the present embodiment, the illumination light has a flat light amount distribution in the first color wavelength band (spectral characteristic B) excluding the first wavelength band Pb, in the second color wavelength band (spectral characteristic G), and in the third color wavelength band (spectral characteristic R) excluding the second wavelength band Pr.
  • Since the illumination light has a flat light amount distribution outside the first wavelength band Pb and the second wavelength band Pr, at least the second color image is an image obtained by flat illumination.
  • the flat light quantity distribution means that the light quantity distribution is constant (substantially constant) in the photographing region (field of view) photographed by the imaging unit.
  • the light amount distribution on a surface having a constant distance from the imaging unit is constant.
  • the light amount distribution need not be completely constant, and there may be a gradual light amount change without an abrupt light amount change (edge portion) such as pattern illumination.
  • the first color is blue
  • the second color is green
  • the third color is red
  • The first color image, the second color image, and the third color image correspond to a blue image, a green image, and a red image, respectively, but these images are not limited to images captured by an image pickup element having primary color filters (for example, an image sensor with a primary color Bayer arrangement).
  • a complementary color image may be captured by an imaging element having a complementary color filter, and a blue image, a green image, and a red image may be acquired from the complementary color image by conversion processing.
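One simple, idealized version of such a conversion assumes cyan = G + B, magenta = R + B, and yellow = R + G, so the primary components can be recovered linearly. Real complementary-filter sensors require calibrated coefficients and demosaicing, so the sketch below is only a conceptual illustration, not the conversion processing actually used in the embodiment.

```python
import numpy as np

def complementary_to_rgb(cy, mg, ye):
    """Idealized linear conversion from complementary (Cy, Mg, Ye) planes to R, G, B."""
    r = (mg + ye - cy) / 2.0
    g = (ye + cy - mg) / 2.0
    b = (cy + mg - ye) / 2.0
    # Clip negatives that appear when the ideal additive model does not hold exactly.
    return np.clip(r, 0, None), np.clip(g, 0, None), np.clip(b, 0, None)
```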
  • Second spectral characteristic example: In the first spectral characteristic example, the case where the imaging bands of the observation mode and the stereo measurement mode are separated has been described. In the second spectral characteristic example, a case where the imaging bands of the observation mode and the measurement mode are not separated will be described. If illumination that combines non-pattern light and pattern light can always be used without separating the wavelength bands, regardless of the mode, the wavelength band can be used effectively; this is advantageous in that the illumination switching function becomes unnecessary, imaging sensitivity is ensured, the color information of the subject is covered, and the illumination mechanism is simplified. However, if the wavelength bands are not separated, the captured image in the observation mode is also affected by the pattern light. Therefore, in order to obtain a high-quality and faithful observation image, it is necessary to remove or reduce the influence of the pattern light.
  • FIG. 5 shows a second spectral characteristic example of pattern illumination and a second spectral characteristic example of the pupil in the observation mode and the stereo measurement mode.
  • The wavelength band is divided into five bands {Pb1, Pb2, Pr1, Pr2, Pir} in association with the spectral characteristics of the color filter of the image sensor.
  • The illumination light in the bands {Pb2, Pr1, Pir} is non-pattern light (flat light) with a uniform light quantity distribution, and the illumination light in the bands {Pb1, Pr2} is pattern light. Light having a spectral characteristic obtained by combining these becomes the illumination light to the subject.
  • Pb1 is set in a band that passes through the blue filter but does not pass through the green filter
  • Pb2 is set in a band that passes through both the blue filter and the green filter
  • Pr1 is set in a band that passes through both the red filter and the green filter
  • Pr2 is set in a band that passes through the red filter but does not pass through the green filter
  • Pir is a band corresponding to the infrared sensitivity characteristic IR, and is a band that passes through all the red, green, and blue filters.
  • the wavelength bands Pb1 and Pr2 are set to wavelength bands that do not interfere with the spectral characteristics G of the green filter of the image sensor.
  • In the observation mode, an image is taken with the pupil of the spectral characteristic FC, which covers all of the wavelength bands {Pb1, Pb2, Pr1, Pr2, Pir}.
  • As the spectral characteristic FC, no filter is provided in the central aperture 23 of the fixed mask 20, and the central pupil optical path allows light in the entire band to pass.
  • the relationship between the red image Vr, the green image Vg, and the blue image Vb constituting the color image captured in the observation mode and the wavelength band covered by them is expressed by the following equation (2).
  • the spectral characteristic FL corresponds to the wavelength band Pb1
  • the spectral characteristic FR corresponds to the wavelength band Pr2. That is, the relationship between the red image Mr and the blue image Mb captured in the stereo measurement mode and the wavelength band covered by them is expressed by the following equation (3).
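Although the exact notation of equations (2) and (3) is not reproduced in this text, the band coverage they express follows from the filter assignments listed above. The sketch below is an informal restatement inferred from those assignments, using band labels as symbols rather than the patent's notation.

```python
# Band coverage in the second spectral characteristic example (informal restatement).
observation = {            # corresponds to equation (2): images through the FC pupil
    "Vb": {"Pb1", "Pb2", "Pir"},
    "Vg": {"Pb2", "Pr1", "Pir"},
    "Vr": {"Pr1", "Pr2", "Pir"},
}
stereo = {                 # corresponds to equation (3): images through the FL / FR pupils
    "Mb": {"Pb1"},
    "Mr": {"Pr2"},
}

# Only Vg is free of the pattern bands Pb1 and Pr2, which is why it serves as the
# reference image for the correction process described next.
assert {"Pb1", "Pr2"} & observation["Vg"] == set()
```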
  • FIG. 6 shows the relationship between the second spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
  • the dotted line virtually indicates the sensor output when flat illumination is performed.
  • The reflected light by the illumination light having the spectral components {Pb1, Pb2, Pr1, Pr2, Pir} passes through the pupil center optical path (spectral characteristic FC), and a captured image {Vr, Vg, Vb} is obtained.
  • The reflected light by the illumination light having the spectral components {Pb1, Pr2} of the pattern light passes through the left and right pupil optical paths, and a captured image {Mr, Mb} is obtained.
  • In the second spectral characteristic example, the observation images Vb and Vr are affected by the pattern illumination, so these images contain the intentionally added brightness changes.
  • Since the observation image Vg is composed of the wavelength ranges {Pb2, Pr1, Pir} in which the illumination is uniform, no intentional brightness change occurs, and the image profile reflects only the reflection coefficient of the subject.
  • The profiles of the observation images Vr and Vb are affected by the pattern; however, considering the profiles that would be obtained under uniform illumination (dotted lines in FIG. 6), although the average brightness differs, their local similarity to the observation image Vg is high. This is because the spectral characteristics of the imaging sensor for the observation images {Vb, Vg} overlap in the band Pb2 and those for the observation images {Vr, Vg} overlap in the band Pr1, so there is a considerable correlation between them.
  • Therefore, by using the observation image Vg, which is not affected by the pattern, as a reference, the observation images Vr and Vb can be corrected, and the influence of the pattern can be removed or reduced. That is, it is possible to restore an observation image as if it had been taken with flat illumination. This correction process will be described below.
  • FIGS. 7 and 8 are explanatory diagrams of the correction process.
  • the correction of the blue image Vb will be described as an example, but the correction of the red image Vr can be similarly performed.
  • First, the correlation value between the waveforms of the blue image Vb and the green image Vg is calculated in a section of width d centered on an arbitrary position XL on the sensor surface of the image sensor. This is performed for all positions x on the sensor surface (that is, for all pixels of the captured image). For example, when ZNCC (Zero-mean Normalized Cross-Correlation) is used, the correlation value becomes 1 when the similarity is maximum, and approaches 0 as the similarity becomes lower.
  • The correlation value is compared with a threshold Th; when the correlation value is equal to or greater than Th, the flag value is set to "1", and when it is smaller than Th, the flag value is set to "0". That is, the flag value binarizes the presence or absence of similarity. In the correction process, a pixel with a flag value of "1" is treated as a valid pixel, and a pixel with a flag value of "0" is treated as an invalid pixel.
  • Note that the correlation value does not have to be obtained for all pixels of the captured image.
  • the correlation value may be obtained for pixels in a predetermined region, or may be obtained for pixels thinned at a predetermined interval.
  • the determination of the flag value is not limited to the above. For example, when using a correlation calculation in which the correlation value decreases as the similarity increases, the flag value is set to “1” when the correlation value is equal to or less than the threshold Th The flag value may be set to “0” when the correlation value is larger than the threshold Th.
  • In the fitting process, for example, when the effective pixel ranges of the processing section are e1 and e2, the level (offset) of the green image Vg is changed, the sum of the absolute values of the differences between the green image Vg and the blue image Vb is obtained at each level, and the green image Vg is superimposed at the level that minimizes this sum. Alternatively, a method is conceivable in which the gain of the green image Vg is changed, the sum of the absolute values of the differences between the green image Vg and the blue image Vb is obtained at each gain, and the gain that minimizes this sum is used.
  • Then, the pixel value Vg(XL) at the position XL of the green image Vg after the fitting process is set as the correction value Vb'(XL).
  • The series of correction processes described above is performed at all positions x on the sensor surface (that is, for all pixels of the captured image) to generate a corrected blue image Vb'.
  • Similarly, a corrected red image Vr' is generated.
  • a high-quality observation image can be generated even with illumination light including pattern light.
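Putting the steps above together (windowed ZNCC against Vg, thresholding into valid and invalid pixels, level fitting over the valid pixels, then substituting the fitted Vg value), a one-dimensional sketch of the correction might look like the following. It is a simplified illustration of the described procedure along a single scan line; vb and vg are assumed to be 1-D numpy arrays, and the window width, threshold, and level search range are hypothetical.

```python
import numpy as np

def zncc(a, b):
    a, b = a - a.mean(), b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def correct_line(vb, vg, d=15, th=0.8):
    """Correct one scan line of Vb using the pattern-free Vg as reference."""
    half = d // 2
    n = len(vb)
    # Step 1: per-pixel flag by windowed ZNCC between Vb and Vg (section of width d).
    flags = np.zeros(n, dtype=bool)
    for x in range(half, n - half):
        flags[x] = zncc(vb[x - half:x + half + 1], vg[x - half:x + half + 1]) >= th
    # Step 2: for each position XL, fit the level of Vg to Vb over the valid pixels
    # of the section, then use the fitted Vg value as the correction value Vb'(XL).
    vb_corr = vb.copy()
    levels = np.linspace(-0.3, 0.3, 61)          # hypothetical level search range
    for xl in range(half, n - half):
        sel = flags[xl - half:xl + half + 1]
        if not sel.any():
            continue                              # no valid pixels: leave Vb unchanged
        wb = vb[xl - half:xl + half + 1][sel]
        wg = vg[xl - half:xl + half + 1][sel]
        sad = [np.abs((wg + lv) - wb).sum() for lv in levels]
        vb_corr[xl] = vg[xl] + levels[int(np.argmin(sad))]
    return vb_corr
```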
  • In the second spectral characteristic example, the monocular optical path in the non-stereo mode passes a wavelength band that includes the wavelength bands of the first color (blue), the second color (green), and the third color (red).
  • In this way, the first color image and the third color image, in which the pattern of the pattern illumination is captured, and the second color image, in which the pattern is not captured, are obtained. Then, by using these three color images, the pattern due to the pattern illumination can be erased (reduced), and an observation image can be obtained.
  • the imaging apparatus may include a phase difference detection unit 330 and an image output unit (color image generation unit 320).
  • the phase difference detection unit 330 detects a phase difference between the first color image and the third color image captured in the stereo mode.
  • the image output unit outputs an image for observation based on the captured image (first color to third color image) captured in the non-stereo mode.
  • the image output unit corrects changes in the pixel values of the first color image and the third color image due to a given light amount distribution based on the second color image.
  • The pattern due to the pattern illumination is not reflected in the second color image. Accordingly, it is possible to correct the pixel values of the first color image and the third color image, in which the pattern is captured, using the second color image as a reference. In other words, the profiles of the first to third color images can be considered almost similar in a normal image (for example, an image of a subject usually captured by an industrial endoscope, or an image of the natural world). Therefore, the influence of pattern illumination can be corrected by correcting the profiles of the first color image and the third color image so as to be similar to the profile of the second color image, in which no pattern is captured.
  • Illumination unit: An illumination unit that performs active pattern illumination according to the present embodiment will be described below.
  • Note that the illumination unit is not necessarily limited to a device that guides light from the light source part to the output end with a light guide member.
  • FIG. 9 shows a first configuration example of the illumination unit. The illumination unit of FIG. 9 includes a white light source 401, a light guide member 402 (a light guide member for non-pattern light), an illumination lens 403, a red laser light source 404, a blue laser light source 405, dichroic prisms 406 and 407 (mirrors in a broad sense), a light guide member 408 (a light guide member for pattern light), a mask pattern 409, and a projection lens 410.
  • the illumination unit has two light guide members 402 and 408.
  • One light guide member 402 guides the white light from the white light source 401 to the illumination lens 403 provided at the distal end of the scope unit.
  • the guided white light is irradiated to the subject through the illumination lens 403.
  • the other light guide member 408 guides blue laser light and red laser light to the projection lens 410 provided at the distal end of the scope portion. That is, the blue laser light from the blue laser light source 405 and the red laser light from the red laser light source 404 are incident on the light guide member 408 by optical path synthesis by the dichroic prisms 406 and 407.
  • the laser light guided by the light guide member 408 passes through the mask pattern 409, and the added pattern is projected onto the subject 5 by the projection lens 410.
  • the non-pattern light and the pattern light are combined on the surface of the subject 5.
  • FIG. 10 shows a second configuration example of the illumination unit. The illumination unit of FIG. 10 includes a white light source 451, a polarizing element 452, a blue laser light source 453, a red laser light source 454, dichroic prisms 455 and 456, a mask pattern 457, a polarizing element 458, a prism 459 (combining prism), a light guide member 460, and a projection lens 461.
  • the illumination unit has one light guide member 460.
  • White light from the white light source 451 is converted into, for example, P-polarized light (polarized light perpendicular to S-polarized light) by the polarizing element 452, and the polarized white light enters the prism 459.
  • the optical paths of the blue laser light from the blue laser light source 453 and the red laser light from the red laser light source 454 are synthesized by the dichroic prisms 455 and 456.
  • The laser light whose optical paths have been combined passes through the mask pattern 457, whereby a pattern is added.
  • The laser light to which the pattern has been added is converted into, for example, S-polarized light (polarized light parallel to the reflecting surface of the prism 459) by the polarizing element 458, and the polarized laser light enters the prism 459.
  • White light that is non-patterned light and laser light that is patterned light are incident on the light guide member 460 by optical path synthesis by the prism 459.
  • the light guide member 460 guides incident light to the projection lens 461 provided at the distal end portion of the scope portion.
  • The guided non-pattern light and pattern light are projected onto the subject 5 by the projection lens 461.
  • In the second configuration example, the pattern light is generated in advance before entering the light guide member 460, is combined with the non-pattern light, and is then made incident on the light guide member 460.
  • If the light guide member 460 is an optical fiber bundle (image guide) that can transmit an image as it is, the generated pattern of the pattern light is conveniently transmitted without change.
  • If the light guide member 460 is a simple optical fiber bundle (light guide), the fiber arrangement differs between the incident end and the output end of the light guide member 460, so the generated pattern of the pattern light is not transmitted as it is.
  • However, if the pattern is a random pattern, it does not matter that the pattern changes between the incident end and the outgoing end.
  • Imaging unit: Hereinafter, a configuration example of an imaging unit capable of switching between the observation mode and the stereo measurement mode will be described.
  • In an inspection with an endoscope device, for example, the scope is inserted into the inspection object, a normal image is captured to check for abnormalities, and the three-dimensional shape is measured to examine whether further inspection is necessary. In this way, a normal observation image is captured with white light.
  • As a method of achieving both such shooting with white light and stereo measurement, for example, performing stereo shooting with white light can be considered.
  • In that case, however, the image sensor must be divided into left and right parts, and the left image and the right image must be formed in the respective regions.
  • As a method for forming the left image and the right image in the same area of the image sensor, there is the color phase difference method. However, since the captured image becomes a color-shifted image, it cannot be used as an observation image.
  • As a technique for performing stereo measurement in a non-time-division manner using a color phase difference, there is, for example, Patent Document 2 described above.
  • However, Patent Document 2 applies stereo measurement to autofocus, and high-speed switching with an observation image does not appear to be assumed.
  • Therefore, it is considered disadvantageous in terms of high-speed switching.
  • Patent Document 2 also has a problem in that, because a single optical path is divided into left and right halves partway along, it is difficult to increase the distance between the pupils, and therefore difficult to increase the distance measurement accuracy.
  • Since the diaphragm is small (the F-number is large), the small diaphragm diameter is divided into left and right, and the distance between the pupils tends to be small.
  • FIGS. 11 and 12 show a configuration example of the imaging unit of the present embodiment that can solve the above-described problems.
  • FIGS. 11 and 12 are cross-sectional views (in a plane including the optical axis) of the imaging unit, together with the relationship between the light amount of the image formed on the image pickup element (or the pixel value of the image picked up by the image pickup element) and the position x.
  • The position x is a position (coordinate) in a direction perpendicular to the optical axis of the imaging optical system, for example, a pixel position of the image sensor. Actually it is a two-dimensional coordinate system, but here it is described as a one-dimensional coordinate system in the parallax direction.
  • the imaging unit includes an imaging optical system 10, a movable mask 30 (first mask), a fixed mask 20 (second mask), an imaging element 40 (imaging sensor, image sensor), and an illumination unit 60 (illumination device).
  • the imaging optical system 10 is a monocular optical system, and includes, for example, one or a plurality of lenses.
  • The case where the image pickup device 40 has RGB color filters in a Bayer array will be described as an example.
  • the present invention is not limited to this, and may include, for example, a complementary color filter.
  • the reflected light from the subject 5 is imaged on the surface of the image sensor 40 by the imaging optical system 10.
  • the fixed mask 20 divides the pupil center and the left and right pupils, and the movable mask 30 switches between the image formation by the pupil center and the image formation by the left and right pupils. These are imaged in the same area of the image sensor 40.
  • d is the distance between the center line IC1 of the left pupil (the left eye aperture of the fixed mask 20) and the center line IC2 of the right pupil (the right eye aperture of the fixed mask 20), and becomes the baseline length in stereo measurement.
  • the straight line AXC is the optical axis of the imaging optical system 10.
  • the center lines IC1 and IC2 are provided at an equal distance from the optical axis AXC of the single-lens imaging optical system 10, for example.
  • the center lines IC1 and IC2 and the optical axis AXC are preferably in the same plane, but are not necessarily in the same plane.
  • The fixed mask 20 and the movable mask 30 are provided at the pupil position of the imaging optical system 10, for example. Alternatively, they may be provided on the imaging side of the imaging optical system 10.
  • the fixed mask 20 is fixed with respect to the imaging optical system 10, and the movable mask 30 is configured such that the position can be switched in a plane perpendicular to the optical axis AXC.
  • The movable mask 30 can be switched at high speed between the observation mode (first mode, non-stereo mode, monocular mode), which is the first state shown in FIG. 11, and the stereo measurement mode (second mode, stereo mode), which is the second state shown in FIG. 12.
  • The fixed mask 20 includes a plate-shaped light shielding portion (light shielding member) provided with three aperture holes (a left eye aperture, a right eye aperture, and a central aperture), a short wavelength (blue) spectral filter provided in the left eye aperture, and a long wavelength (red) spectral filter provided in the right eye aperture.
  • the portions other than the aperture hole are covered with a light shielding portion so that light does not pass through.
  • the central aperture may be, for example, a through hole, or some spectral filter (for example, a broadband spectral filter that transmits at least white light) may be provided.
  • the movable mask 30 includes a plate-shaped light shielding portion (light shielding member) provided with three aperture holes. In each mode, the movable mask 30 is configured in such a size that the light blocking portion covers the central aperture hole or the left and right eye aperture holes among the three aperture holes of the fixed mask 20.
  • The aperture holes are provided at positions such that they overlap the central aperture of the fixed mask 20 in the observation mode and overlap the left eye aperture and the right eye aperture in the stereo measurement mode.
  • The aperture holes of the movable mask 30 are also referred to as a left eye aperture, a right eye aperture, and a central aperture.
  • Although FIGS. 11 and 12 illustrate the case where the movable mask 30 is provided on the imaging side of the fixed mask 20, the movable mask 30 may be provided on the objective side of the fixed mask 20.
  • the illumination unit 60 is preferably provided so that the tip (illumination exit end) is positioned symmetrically with respect to the left pupil and the right pupil.
  • However, the tip of the illumination unit 60 does not necessarily have to be symmetrical with respect to the left pupil and the right pupil. In FIGS. 11 and 12 the tip of the illumination unit 60 is disposed in front of the imaging optical system 10, but the present invention is not limited to this; the illumination unit 60 and the imaging optical system 10 may be arranged side by side at the tip of the imaging unit.
  • each diaphragm hole of the movable mask 30 is not provided with a spectral filter (is an open hole), and allows the entire band to pass.
  • FIG. 11 shows the state of the observation mode, in which the optical path at the center of the pupil is opened through the central aperture of the fixed mask 20 and the central aperture of the movable mask 30, and the optical paths of the left and right pupils are blocked (light-shielded) by the movable mask 30.
  • the image formed on the image sensor 40 is a formed image IC formed only by the pupil center, and a normal (monocular white light) captured image is obtained.
  • FIG. 12 shows the state of the stereo measurement mode, in which the left eye aperture hole of the fixed mask 20 and the left eye aperture hole of the movable mask 30 overlap, and the right eye aperture hole of the fixed mask 20 and the right eye aperture hole of the movable mask 30 overlap.
  • the optical path at the center of the pupil is blocked (shielded) by the movable mask 30. That is, in the optical path on the left pupil side, the imaging light is filtered by the short wavelength (blue) spectral filter FL (first filter), and an image IL based on the short wavelength component is formed on the image sensor 40. In the optical path on the right pupil side, the imaging light is filtered by a long wavelength (red) spectral filter FR (second filter), and an image IR based on the long wavelength component is formed on the same image sensor 40.
  • the image IL obtained from the blue pixels of the image sensor 40 is a short wavelength image
  • the image IR obtained from the red pixels of the image sensor 40 is a long wavelength image
  • the images IL and IR from the two optical paths are obtained.
  • the imaging unit of the present embodiment can take the observation mode that is the first state and the stereo measurement mode that is the second state, and can switch between these states at high speed. Thereby, for example, 3D measurement can be performed in real time while performing non-stereo normal observation.
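Once the phase difference between the left pupil image and the right pupil image has been detected, the distance to the subject follows from ordinary stereo triangulation with the baseline length d. The relation below is the standard stereo formula, given only for orientation; the focal length and pixel pitch are assumed calibration values and are not taken from the patent.

```python
def subject_distance(phase_diff_px, baseline_d_mm, focal_length_mm, pixel_pitch_mm):
    """Standard triangulation: distance from the phase difference of a stereo pair."""
    disparity_mm = phase_diff_px * pixel_pitch_mm
    if disparity_mm == 0:
        return float("inf")              # zero phase difference: subject at infinity
    return baseline_d_mm * focal_length_mm / disparity_mm

# Example with purely hypothetical values: baseline d = 1.2 mm, f = 2.0 mm, 2 um pixels.
z_mm = subject_distance(phase_diff_px=12, baseline_d_mm=1.2,
                        focal_length_mm=2.0, pixel_pitch_mm=0.002)
```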
  • FIGS. 13 and 14 show a first detailed configuration example of the fixed mask 20 and the movable mask 30.
  • FIGS. 13 and 14 show cross-sectional views of the imaging optical system 10, the fixed mask 20, and the movable mask 30, and views of the fixed mask 20 and the movable mask 30 in the optical axis direction (rear views seen from the imaging side).
  • a diaphragm hole 21 having a short wavelength filter FL is opened in the optical path of the left pupil of the fixed mask 20, and a diaphragm hole 22 having a long wavelength spectral filter FR is formed in the optical path of the right pupil.
  • In the optical path of the pupil center, an open (through hole) aperture hole 23 is provided.
  • the aperture 23 may be provided with a spectral filter FC that allows the bands Pv1, Pv2, and Pir of FIG. 3 to pass therethrough.
  • The aperture holes 21 and 22 are opened in the light shielding portion 24 (light shielding member), and are, for example, holes having a size corresponding to the depth of field required by the imaging system (for example, circular holes whose size is given by the diameter).
  • the centers of the aperture holes 21, 22, and 23 coincide with (including substantially coincident with) the center lines IC1 and IC2 and the optical axis AXC, respectively.
  • The light shielding portion 24 is provided so as to close off the housing containing the imaging optical system 10 when viewed from the front (or the back), and is, for example, a plate-like member provided perpendicular to the optical axis AXC.
  • the movable mask 30 has aperture holes 31, 32, 33 in an open state (through hole) and a light shielding portion 34 (light shielding member) in which the aperture holes 31, 32, 33 are opened.
  • The aperture holes 31, 32, and 33 are, for example, holes that are slightly larger than the aperture holes 21, 22, and 23 of the fixed mask 20. Alternatively, they may be holes having a size corresponding to the depth of field required by the imaging system (for example, circular holes whose size is given by the diameter).
  • the center of the aperture 33 (for example, the center of a circle) coincides with (including substantially coincides with) the optical axis AXC in the observation mode.
  • the light shielding unit 34 is connected to a rotation shaft 35 perpendicular to the optical axis AXC, and is a plate-like member provided perpendicular to the optical axis AXC, for example.
  • The shape of the light shielding portion 34 is, for example, a fan shape (the base of the fan is connected to the shaft 35), but is not limited thereto, and may be any shape that can realize the states of FIGS. 13 and 14.
  • the movable mask 30 is configured to rotate about a rotation axis 35 by a predetermined angle in a direction perpendicular to the optical axis AXC.
  • rotational movement can be realized by a piezo element or a motor.
  • In the observation mode, the movable mask 30 is rotated and tilted toward the right eye side by a predetermined angle, the pupil center optical path (aperture hole 23) of the fixed mask 20 is opened, and the left and right pupil optical paths (aperture holes 21 and 22) are in a light-shielded state.
  • In the stereo measurement mode, the movable mask 30 rotates and tilts toward the left eye side by a predetermined angle, the pupil center optical path (aperture hole 23) of the fixed mask 20 enters a light-shielded state, and the left and right pupil optical paths (aperture holes 21 and 22) are in an open state.
  • By exposing the aperture 21 having the spectral filter FL, the left pupil passes only the short wavelength component, and by exposing the aperture 22 having the spectral filter FR, the right pupil passes only the long wavelength component.
  • the movable mask 30 may be moved by a sliding operation to create two states.
  • the rotation operation or the slide operation can be realized by, for example, a magnet mechanism or a piezoelectric mechanism, and an appropriate one may be selected in consideration of high speed and durability.
  • the imaging device includes the imaging element 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30.
  • the imaging optical system 10 forms an image of the subject 5 on the image sensor 40.
  • The fixed mask 20 includes first to third openings (aperture holes 21, 22, and 23) that divide the pupil of the imaging optical system 10, a first filter FL that passes the first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5), and a second filter FR that passes the second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5).
  • The movable mask 30 includes a light shielding portion 34 and fourth to sixth openings (aperture holes 31, 32, and 33) provided in the light shielding portion 34 corresponding to the first to third openings (aperture holes 21, 22, and 23), and is movable with respect to the imaging optical system 10.
  • The first filter FL is provided in the first opening (aperture hole 21).
  • The second filter FR is provided in the second opening (aperture hole 22).
  • The third opening (aperture hole 23) is provided on the optical axis AXC of the imaging optical system 10.
  • Since the movable mask 30 is the only movable part, it is possible to realize high-speed switching, simplification of the driving mechanism, and suppression of failures and errors in mode switching.
  • the movable mask 30 has a simple configuration in which openings (diaphragm holes 31, 32, 33) are provided in the light-shielding portion 34, and troubles such as filter removal due to switching vibration can be suppressed.
  • The fixed mask 20 is provided with three openings (aperture holes 21, 22, and 23), and one of them is provided on the optical axis AXC, so that the observation image becomes a pupil center image.
  • the vignetting of the light beam is reduced and an observation image having a wide viewing angle can be acquired.
  • high quality (for example, less distortion) imaging can be obtained.
  • The center (the s/2 position) of the phase difference (s in FIG. 12) in stereo measurement coincides with the light beam passing through the center of the pupil. That is, in the present embodiment, the same pixel in the observation image and in the distance map corresponds to the same position on the subject 5.
  • if, instead, the observation image were an image having a parallax (for example, a left pupil image rather than a pupil center image), the pixel corresponding to the same position on the subject 5 would differ between the observation image and the distance map.
  • the present embodiment is more advantageous when, for example, displaying an observation image and three-dimensional information in an overlapping manner.
  • the first aperture corresponds to the left pupil
  • the second aperture corresponds to the right pupil
  • the third aperture is at the pupil center.
  • the first opening may correspond to the right pupil
  • the second opening may correspond to the left pupil.
  • the aperture is referred to as a diaphragm aperture.
  • the aperture does not necessarily have a function as a diaphragm (a function of limiting the cross-sectional area of the light beam passing through the pupil).
  • the apertures 23 and 33 overlap in the observation mode; when the aperture 23 is smaller, the aperture 23 functions as the diaphragm, and when the aperture 33 is smaller, the aperture 33 functions as the diaphragm.
  • the pupil is for separating (or defining) the imaging optical path by the imaging optical system 10.
  • the optical path is the path along which light entering from the objective side of the optical system travels to the image sensor 40 and forms an image on the image sensor 40. That is, the optical paths that pass through the imaging optical system 10 and the apertures 21 and 22 of the fixed mask 20 (the apertures 31 and 32 of the movable mask 30 in the stereo measurement mode) are the first and second optical paths.
  • the optical path that passes through the imaging optical system 10 and the aperture 23 of the fixed mask 20 (or the aperture 33 of the movable mask 30 in the observation mode) is the third optical path.
  • the mask is a member or component that shields light incident on the mask and allows part of the light to pass.
  • the light shielding portions 24 and 34 shield light
  • the aperture holes 21, 22, 23, 31, 32, and 33 pass light (either the full band or a partial band).
  • the imaging apparatus includes a movable mask control unit 340 (FIG. 18) that controls the movable mask 30.
  • in the non-stereo mode (observation mode), the movable mask control unit 340 sets the movable mask 30 in a first state (first position) in which, as viewed in the optical axis AXC direction, the light shielding portion 34 overlaps the first and second openings (aperture holes 21 and 22) and the sixth opening (aperture hole 33) overlaps the third opening (aperture hole 23).
  • in the stereo mode (stereo measurement mode), the movable mask control unit 340 sets the movable mask 30 in a second state (second position) in which, as viewed in the optical axis direction, the fourth and fifth openings (aperture holes 31 and 32) overlap the first and second openings (aperture holes 21 and 22) and the light shielding portion 34 overlaps the third opening (aperture hole 23).
  • FIGS. 15 and 16 show a second detailed configuration example of the fixed mask 20 and the movable mask 30.
  • FIGS. 15 and 16 are cross-sectional views of the imaging optical system 10, the fixed mask 20, and the movable mask 30, together with views of the fixed mask 20 and the movable mask 30 in the optical axis direction (rear views seen from the imaging side).
  • the movable mask 30 includes a light shielding part 34 and aperture holes 31 and 32 provided in the light shielding part 34.
  • the aperture holes 31 and 32 are in an open state (through hole), and are arranged on the same circle around the rotation shaft 35.
  • the aperture hole 31 has a shape extending in the circumferential direction of that circle, and overlaps the aperture hole 23 of the fixed mask 20 in the observation mode and also overlaps the aperture hole 21 of the fixed mask 20 in the stereo measurement mode.
  • the fixed mask 20 includes a light shielding part 24 and three aperture holes 21, 22, 23 provided in the light shielding part 24.
  • the aperture holes 21 and 22 are provided with spectral filters FL and FR.
  • the aperture hole 23 may be in an open state (through hole), or a spectral filter FC that allows the bands Pv1, Pv2, and Pir of FIG. 3 to pass therethrough may be provided.
  • the aperture holes 21, 22, and 23 are arranged on the same circle around the rotation shaft 35.
  • the aperture hole 23 at the center of the pupil of the fixed mask 20 is opened by the aperture hole 31 of the movable mask 30, and the aperture holes 21 and 22 of the left and right pupils of the fixed mask 20 are shielded by the light shielding portion 34 of the movable mask 30.
  • a white light image is captured by a single eye.
  • the left and right pupil aperture holes 21 and 22 of the fixed mask 20 are opened by the aperture holes 31 and 32 of the movable mask 30, and the aperture hole 23 at the center of the pupil of the fixed mask 20 is shielded by the light shielding portion 34 of the movable mask 30.
  • a parallax image (red image, blue image) by the color phase difference method is captured.
  • the imaging device includes the imaging element 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30.
  • the imaging optical system 10 forms an image of the subject 5 on the image sensor 40.
  • the fixed mask 20 includes first to third openings (aperture holes 21, 22, and 23) that divide the pupil of the imaging optical system 10, a first filter FL that passes a first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5), and a second filter FR that passes a second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5).
  • the movable mask 30 includes a light shielding portion 34, a fourth opening (aperture hole 31) provided in the light shielding portion 34 so as to correspond to the first and third openings (aperture holes 21 and 23), and a fifth opening (aperture hole 32) provided in the light shielding portion 34 so as to correspond to the second opening (aperture hole 22), and is movable with respect to the imaging optical system 10.
  • the first filter FL is provided in the first opening (aperture hole 21).
  • the second filter FR is provided in the second opening (aperture hole 22).
  • the third opening (aperture hole 23) is provided on the optical axis AXC of the imaging optical system 10.
  • the imaging apparatus includes a movable mask control unit 340 that controls the movable mask 30.
  • in the non-stereo mode (observation mode), the movable mask control unit 340 sets the movable mask 30 in a first state in which, as viewed in the optical axis AXC direction, the light shielding portion 34 overlaps the first and second openings (aperture holes 21 and 22) and the fourth opening (aperture hole 31) overlaps the third opening (aperture hole 23).
  • in the stereo mode (stereo measurement mode), the movable mask control unit 340 sets the movable mask 30 in a second state in which, as viewed in the optical axis direction, the fourth and fifth openings (aperture holes 31 and 32) overlap the first and second openings (aperture holes 21 and 22) and the light shielding portion 34 overlaps the third opening (aperture hole 23).
  • a coordinate system X, Y, and Z of the three-dimensional space is defined as follows. That is, the X axis and the Y axis orthogonal to the X axis are set along the imaging sensor surface, and the Z axis is set in a direction toward the subject in a direction orthogonal to the imaging sensor surface and parallel to the optical axis AXC. The Z axis intersects the X axis and Y axis at the zero point. Note that the Y-axis is omitted here for convenience.
  • the distance between the imaging lens 10 and the imaging sensor surface is b, and the distance from the imaging lens 10 to the arbitrary point Q (x, z) of the subject 5 is z.
  • the distance between the pupil center lines IC1 and IC2 and the Z axis is the same, and each is d / 2. That is, the baseline length in stereo measurement is d.
  • the X coordinate of the point at which the arbitrary point Q (x, z) of the subject 5 is imaged on the imaging sensor surface by the imaging lens 10 through the left pupil optical path is defined as XL, and the X coordinate of the point at which the arbitrary point Q (x, z) is imaged on the imaging sensor surface through the right pupil optical path is defined as XR.
  • equation (4) can be obtained by using the similarity of the partial right triangles formed within the triangle defined by the arbitrary point Q (x, z) and the coordinates XL and XR.
  • d and b are known set values, and the unknowns XL and XR are obtained as follows: the position XL on the imaging sensor surface is taken as a reference (the pixel position of the left image is regarded as XL), and the position XR corresponding to the position XL is detected by matching processing (correlation calculation). By calculating the distance z for each position XL, the shape of the subject can be measured. If the matching is poor, the distance z may not be obtained at that position; in that case it may be obtained, for example, by interpolation from the distance z of surrounding pixels.
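
    The triangulation step described above can be pictured with a short sketch. Equation (4) itself is not reproduced in this text, so the depth formula used below is the standard pinhole-stereo relation z = d * b / (XL - XR), adopted here only as an illustrative assumption; the function and variable names are likewise assumptions and not taken from the patent.

    def depth_from_match(x_left, x_right, d, b):
        """Distance z for one matched point.

        x_left, x_right: X coordinates of the corresponding point in the left and
                         right pupil images (same length unit as b).
        d: baseline length between the left and right pupils.
        b: distance between the imaging lens 10 and the imaging sensor surface.
        """
        s = x_left - x_right              # phase difference (disparity) s
        if s == 0:
            return float("inf")           # no parallax: point is effectively at infinity
        return d * b / s                  # assumed stand-in for equation (4)

    # Example with arbitrary values: d = 2.0 mm, b = 5.0 mm, s = 0.05 mm -> z = 200 mm
    print(depth_from_match(0.10, 0.05, d=2.0, b=5.0))
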
  • FIG. 18 shows a configuration example of the endoscope device (imaging device in a broad sense) of the present embodiment.
  • the endoscope apparatus includes a scope unit 100 (imaging unit) and a main body unit 200 (control device).
  • the scope unit 100 includes an imaging optical system 10, a fixed mask 20, a movable mask 30, an image sensor 40, a drive unit 50, and an illumination unit 60.
  • the main body unit 200 includes a processing unit 210, a monitor display unit 220, and an imaging processing unit 230.
  • the processing unit 210 includes a light source drive control unit 305, an image selection unit 310 (image frame selection unit), a color image generation unit 320 (image output unit), a phase difference detection unit 330, and a movable mask control unit 340 (movable mask drive control unit). ), A movable mask position detector 350, a distance information calculator 360, and a three-dimensional information generator 370.
  • the main body unit 200 may include an operation unit that operates the main body unit 200, an interface unit that is connected to an external device, and the like as components (not illustrated).
  • the scope unit 100 may include, for example, an operation unit that operates the scope unit 100, a treatment instrument, and the like as components not shown.
  • the present embodiment can be applied to an industrial or medical so-called video scope (an endoscope apparatus incorporating an image sensor).
  • the present invention can be applied to both a flexible endoscope in which the scope unit 100 is configured to be bendable and a rigid endoscope in which the scope unit 100 is configured in a rigid, stick-like shape.
  • the main body 200 and the imaging unit 110 are configured as portable devices that can be carried, and are used for manufacturing inspection and maintenance inspection of industrial products, maintenance inspection of buildings and piping, and the like.
  • the driving unit 50 drives the movable mask 30 based on a control signal from the movable mask control unit 340, and switches between the first state (observation mode) and the second state (stereo measurement mode).
  • the drive unit 50 is configured by an actuator using a piezoelectric element or a magnet mechanism.
  • the imaging processing unit 230 performs imaging processing on the signal from the imaging element 40 and outputs a captured image (for example, a Bayer image). For example, correlated double sampling processing, gain control processing, A / D conversion processing, gamma correction, color correction, noise reduction, and the like are performed.
  • the imaging processing unit 230 may be configured by, for example, a discrete IC such as an ASIC, or may be incorporated in the imaging device 40 (sensor chip) or the processing unit 210.
  • the monitor display unit 220 displays an image captured by the scope unit 100, 3D shape information of the subject 5, and the like.
  • the monitor display unit 220 includes a liquid crystal display, an EL (Electro-Luminescence) display, or the like.
  • the illumination unit 60 irradiates the subject 5 with the combined light of the non-pattern light and the pattern light described above.
  • the light source drive control unit 305 optimally controls each light amount of the non-pattern light and the pattern light based on a signal from the imaging processing unit 230 (so-called dimming control). For example, the brightness of the captured image is obtained, and the amount of light is controlled so that the brightness is within a predetermined range.
  • the movable mask control unit 340 controls the driving unit 50 to switch the position of the movable mask 30.
  • when the movable mask control unit 340 sets the movable mask 30 to the observation mode, the reflected light from the subject 5 forms an image on the image sensor 40 via the pupil center optical path.
  • the imaging processing unit 230 reads the pixel value of the image formed on the imaging element 40, performs A / D conversion or the like, and outputs the image data to the image selection unit 310.
  • the image selection unit 310 detects that the movable mask 30 is in the observation mode based on the control signal from the movable mask control unit 340, and selects {Vr, Vg, Vb} from the captured image for color image generation.
  • the color image generation unit 320 performs demosaicing processing (processing for generating an RGB image from a Bayer image) and various types of image processing, and outputs a three-plate RGB primary color image to the monitor display unit 220.
  • the monitor display unit 220 displays the color image.
  • when the movable mask control unit 340 sets the movable mask 30 to the stereo measurement mode, the reflected light from the subject 5 simultaneously forms images on the image sensor 40 via the left pupil optical path and the right pupil optical path.
  • the imaging processing unit 230 reads the pixel value of the image formed on the imaging element 40, performs A / D conversion or the like, and outputs the image data to the image selection unit 310.
  • the image selection unit 310 detects that the movable mask 30 is in the stereo measurement mode based on the control signal from the movable mask control unit 340, selects {Mr, Mb} from the captured image, and outputs them to the phase difference detection unit 330.
  • the phase difference detection unit 330 performs matching processing on the two separated images Mr and Mb, and detects a phase difference (phase shift) for each pixel. Further, the phase difference detection unit 330 determines whether or not the phase difference detection is reliable. If it is determined that the phase difference detection is not reliable, an error flag is output for each pixel.
  • as a matching evaluation method for obtaining the shift amount (phase difference) between two similar waveforms, a normalized cross-correlation calculation method typified by ZNCC, a method based on the sum of absolute values of mutual differences typified by SAD, or the like can be used.
  • ZNCC: Zero-mean Normalized Cross-Correlation
  • SAD: Sum of Absolute Differences
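
    As a concrete illustration of the two evaluation measures listed above, the following Python sketch computes SAD and ZNCC between two equally sized image blocks and scans one block along the X direction to find the best-matching shift (the phase difference). NumPy, the function names, and the search strategy are assumptions for illustration only, not part of the patent.

    import numpy as np

    def sad(block_a, block_b):
        # Sum of absolute differences: smaller means more similar.
        return float(np.abs(block_a - block_b).sum())

    def zncc(block_a, block_b, eps=1e-9):
        # Zero-mean normalized cross-correlation: closer to 1.0 means more similar.
        a = block_a - block_a.mean()
        b = block_b - block_b.mean()
        return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + eps))

    def best_shift(ref_block, search_strip, max_shift):
        # Slide ref_block along a horizontal strip of the other image and return
        # the shift (in pixels) that maximizes the ZNCC score.
        h, w = ref_block.shape
        scores = [zncc(ref_block, search_strip[:, s:s + w]) for s in range(max_shift + 1)]
        return int(np.argmax(scores))
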
  • a phase shift can also be detected by using Vr and Mr, which form a parallax pair, although, because they are captured in a time-division manner, they are affected by subject blur and imaging system blur.
  • the phase difference detection unit 330 outputs the detected phase difference information and error flag to the distance information calculation unit 360.
  • the distance information calculation unit 360 calculates the distance information of the subject 5 (for example, the distance z in FIG. 17) for each pixel, and outputs the distance information to the three-dimensional information generation unit 370.
  • the pixel on which the error flag is set may be regarded as a flat portion (region having a small edge component) of the subject 5 and may be interpolated from distance information of surrounding pixels, for example.
  • the three-dimensional information generation unit 370 generates three-dimensional information from the distance information (or the distance information and the RGB image from the color image generation unit 320).
  • the three-dimensional information generation unit 370 generates a three-dimensional image, three-dimensional data, or, as necessary, a display image in which these are superimposed on the observation image, and outputs the result to the monitor display unit 220.
  • the monitor display unit 220 displays the three-dimensional information.
  • the movable mask position detector 350 detects whether the movable mask 30 is in the observation mode position or the stereo measurement mode position using the images {Mr, Mb} obtained in the stereo measurement mode. If it is determined that the state of the movable mask 30 does not match the mode, a position error flag is output to the movable mask control unit 340.
  • the movable mask control unit 340 receives the position error flag and corrects the movable mask 30 to the correct state (the state corresponding to the image selection). For example, when it is determined that there is no color shift in the images {Mr, Mb} even though the movable mask control unit 340 has output a control signal for the stereo measurement mode, the movable mask 30 is actually positioned for the observation mode. In this case, a correction is performed so that the position of the movable mask 30 matches the control signal. If the correct state is not obtained even after the correction operation, it is determined that some failure has occurred, and the entire function is stopped.
  • since the movable mask 30 is composed of a mechanical mechanism, a malfunction may occur in the switching operation. According to the present embodiment, since it is possible to detect whether the switching position corresponds to the observation mode or the measurement mode, such a problem in the switching operation can be dealt with.
  • whether the movable mask 30 is in the observation mode position or the stereo measurement mode position is detected or judged, for example, as follows: after matching the levels (for example, the average levels) within the judgment areas of the images Mr and Mb, a position error is determined by a judgment based on the sum of absolute differences between the images Mr and Mb (first method), a judgment based on the correlation coefficient between the images Mr and Mb (second method), or the like.
  • the absolute value of the difference value of the pixel value is obtained for each pixel, and it is integrated in all pixels or a partial pixel group. If the result exceeds a predetermined threshold, it is determined as an image in the stereo measurement mode, and if the result is less than the predetermined threshold, it is determined as an image in the observation mode.
  • this uses the fact that, in the stereo measurement mode, the images Mr and Mb are basically color-shifted images, so a certain amount of difference value is obtained between them.
  • in the second method, the correlation coefficient between the image Mr and the image Mb is calculated within a predetermined range; when the result is equal to or smaller than a predetermined threshold, the images are judged to be stereo measurement mode images, and when the result exceeds the predetermined threshold, the images are judged to be observation mode images.
  • this uses the fact that, in the stereo measurement mode, the images Mr and Mb are basically color-shifted images and the correlation coefficient is therefore small, whereas in the observation mode, the images Mr and Mb are almost the same image and the correlation coefficient is therefore large.
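
    The first and second judgment methods described above can be summarized in a short sketch. The level matching on the mean, the thresholds, and the returned labels are illustrative assumptions; the patent does not specify concrete values.

    import numpy as np

    def detect_mask_position(img_mr, img_mb, sad_threshold, corr_threshold):
        # Match the average levels of the two judgment areas before comparing.
        mr = img_mr.astype(np.float64)
        mb = img_mb.astype(np.float64)
        mb = mb * (mr.mean() / (mb.mean() + 1e-9))

        sad_value = np.abs(mr - mb).sum()                 # first method
        corr = np.corrcoef(mr.ravel(), mb.ravel())[0, 1]  # second method

        if sad_value > sad_threshold or corr <= corr_threshold:
            return "stereo measurement mode"   # color shift present between Mr and Mb
        return "observation mode"              # Mr and Mb are almost the same image
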
  • the endoscope apparatus, the imaging apparatus, and the like of the present embodiment may include a processor and a memory.
  • the processor here may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to a CPU, and various processors such as a GPU (Graphics Processing Unit) or a DSP (Digital Signal Processor) can be used.
  • the processor may be an ASIC hardware circuit.
  • the memory stores computer-readable instructions; when the instructions are executed by the processor, each unit of the endoscope apparatus, the imaging apparatus, and the like of the present embodiment (for example, each unit of the processing unit 210) is realized.
  • the memory here may be a semiconductor memory such as SRAM or DRAM, or a register or a hard disk.
  • the instruction here may be an instruction of an instruction set constituting the program, or an instruction for instructing an operation to the hardware circuit of the processor.
  • FIG. 19 shows a sequence (operation timing chart) for switching between the observation mode and the stereo measurement mode in moving image shooting.
  • switching of the state of the movable mask 30, imaging timing, and selection of a captured image are interlocked.
  • the mask state in the observation mode and the mask state in the stereo measurement mode are alternately repeated.
  • imaging is performed once in each mask state.
  • an image that is exposed and imaged by the image sensor 40 when in the mask state of the observation mode is selected as an observation image.
  • an image that is exposed and imaged by the image sensor 40 when in the mask state of the stereo measurement mode is selected as a measurement image.
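
    The interlocked sequence of FIG. 19 can be pictured as a simple alternating loop, sketched below. The driver, sensor, and selection interfaces are assumed placeholders, not components defined in the patent.

    def capture_sequence(movable_mask, sensor, select_observation, select_measurement, n_frames):
        # Alternate the mask state every frame and route each exposure to the
        # matching selection path, mirroring the timing chart of FIG. 19.
        for i in range(n_frames):
            if i % 2 == 0:
                movable_mask.set_state("observation")   # pupil-center path open
                frame = sensor.expose()
                select_observation(frame)               # {Vr, Vg, Vb} -> observation image
            else:
                movable_mask.set_state("stereo")        # left/right pupil paths open
                frame = sensor.expose()
                select_measurement(frame)               # {Mr, Mb} -> phase difference / distance
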

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Optics & Photonics (AREA)
  • Endoscopes (AREA)

Abstract

This imaging device includes an imaging unit and an illumination unit. The imaging unit can capture images (Vb, Vg, Vr) including a first color image, a second color image having a longer wavelength than the first color, and a third color image having a longer wavelength than the second color, and can also capture the first color image and the third color image as a stereo image (Mb, Mr). The illumination unit irradiates a subject with patterned light having prescribed light quantity distribution in a first wavelength band Pb included in the wavelength band of the first color while not included in a second-color wavelength band, and in a second wavelength band Pr included in the wavelength band of the third color while not included in the second-color wavelength band.

Description

 Imaging apparatus, endoscope apparatus, and imaging method

 The present invention relates to an imaging device, an endoscope device, an imaging method, and the like.

 Various methods for optically measuring the three-dimensional surface shape of an object have been proposed. One such method projects active pattern illumination onto the object and performs stereo three-dimensional distance measurement by binocular stereoscopic vision. In stereo measurement, matching processing is performed using characteristic portions (for example, edges) of the images to obtain the distance to the object. By projecting an active pattern, characteristic portions are intentionally given to the subject, so that distance measurement can be performed accurately even in portions without such features (for example, flat portions).

 As conventional techniques for stereo measurement, there are, for example, the techniques disclosed in Patent Documents 1 and 2. Patent Document 1 discloses a technique of temporally switching the left and right imaging optical paths with a mechanical shutter and acquiring the left and right images in a time-division manner. Patent Document 2 discloses a technique of inserting an RG filter into the left half of a single imaging optical path and a GB filter into the right half, and separating the left and right images by means of the R image and the B image of the captured image. In Patent Document 2, in the case of normal observation, the RG filter and the GB filter are retracted from the imaging optical path, and an observation image is acquired.

Patent Document 1: JP 2010-128354 A
Patent Document 2: JP 2013-3159 A

 When active pattern illumination as described above is used, the active pattern appears in the captured image, so there is a problem that a normal image as obtained with flat illumination cannot be acquired. For example, in non-destructive inspection using an endoscope apparatus, inspection may be performed by capturing observation images that are not stereo images, and stereo measurement may be performed on a portion to be examined in detail. In such a case, switching between flat illumination and active pattern illumination is conceivable, but the illumination mechanism then becomes complicated. Alternatively, if real-time stereo measurement could be performed while normal observation continues by alternately switching between stereo imaging and normal observation, convenience would be improved, and it is desirable that no illumination switching be required at that time.

 According to some aspects of the present invention, it is possible to provide an imaging device, an endoscope device, an imaging method, and the like capable of capturing a normal observation image while performing illumination with active pattern illumination.

 One aspect of the present invention relates to an imaging device including: an imaging unit that captures a captured image including an image of a first color, an image of a second color on the longer-wavelength side of the first color, and an image of a third color on the longer-wavelength side of the second color, and that can capture the image of the first color and the image of the third color as a stereo image; and an illumination unit that irradiates a subject with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color, and in a second wavelength band that is included in the wavelength band of the third color and not included in the wavelength band of the second color.

 According to this aspect of the present invention, the subject is irradiated with pattern illumination having a given light amount distribution in the first wavelength band included in the wavelength band of the first color and in the second wavelength band included in the wavelength band of the third color. Since the first wavelength band and the second wavelength band are not included in the wavelength band of the second color, the image of the second color is not affected by the given light amount distribution of the pattern illumination. This makes it possible to capture a normal observation image while performing illumination with active pattern illumination.

 Another aspect of the present invention relates to an endoscope apparatus including the imaging device described above.

 Still another aspect of the present invention relates to an imaging method in which, when, of an image of a first color, an image of a second color on the longer-wavelength side of the first color, and an image of a third color on the longer-wavelength side of the second color, the image of the first color and the image of the third color can be captured as a stereo image, a subject is irradiated with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color and in a second wavelength band that is included in the wavelength band of the third color and not included in the wavelength band of the second color, and a captured image including the image of the first color, the image of the second color, and the image of the third color is captured.

FIG. 1: An example of pattern illumination.
FIG. 2: Waveform examples of the light quantity of the pattern illumination, the reflection coefficient distribution of the subject, and the captured image.
FIG. 3: A first spectral characteristic example of the pattern illumination and a first spectral characteristic example of the pupils in the observation mode and the stereo measurement mode.
FIG. 4: The relationship between the first spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
FIG. 5: A second spectral characteristic example of the pattern illumination and a second spectral characteristic example of the pupils in the observation mode and the stereo measurement mode.
FIG. 6: The relationship between the second spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
FIG. 7: An explanatory diagram of the correction processing.
FIG. 8: An explanatory diagram of the correction processing.
FIG. 9: A first configuration example of the illumination unit.
FIG. 10: A second configuration example of the illumination unit.
FIG. 11: A configuration example of the imaging unit.
FIG. 12: A configuration example of the imaging unit.
FIG. 13: A first detailed configuration example of the fixed mask and the movable mask.
FIG. 14: A first detailed configuration example of the fixed mask and the movable mask.
FIG. 15: A second detailed configuration example of the fixed mask and the movable mask.
FIG. 16: A second detailed configuration example of the fixed mask and the movable mask.
FIG. 17: An explanatory diagram of the principle of stereo measurement.
FIG. 18: A configuration example of the endoscope apparatus (imaging apparatus).
FIG. 19: A sequence for switching between the observation mode and the stereo measurement mode.

 The present embodiment will now be described. Note that the embodiment described below does not unduly limit the contents of the present invention recited in the claims, and not all of the configurations described in the present embodiment are necessarily essential constituent elements of the present invention.

 In the following, an industrial endoscope apparatus is described as an application example of the present invention. However, the present invention is not limited to application to industrial endoscope apparatuses, and can be applied to any three-dimensional measuring device that measures a three-dimensional shape by a stereo imaging method (a method of acquiring distance information of a subject by detecting the phase difference between two images obtained with an imaging system having a parallax) or to any imaging device having a three-dimensional measuring function (for example, a medical endoscope apparatus, a microscope, an industrial camera, or the vision function of a robot).

 1. Pattern illumination

 In order to obtain a high-quality observation image, illumination of the subject is generally premised on uniform illumination without unevenness. However, when performing stereo measurement, distance information to the subject cannot be obtained unless features from which a phase difference is easily obtained are present. Therefore, a method of irradiating the subject with pattern illumination that intentionally adds such features is used.

 本実施形態では、白色光による観察画像を撮影する観察モードとカラー位相差法でステレオ計測を行う計測モードとを切り替えて使用する。従来のパターン照明を用いた場合、観察モード時は一様照明を行い、計測モード時はパターン照明を行うような照明自体の切り替え機能が必要となる。 In the present embodiment, an observation mode for capturing an observation image using white light and a measurement mode for performing stereo measurement by the color phase difference method are switched and used. When conventional pattern illumination is used, it is necessary to have a function of switching the illumination itself so that uniform illumination is performed in the observation mode and pattern illumination is performed in the measurement mode.

 そこで本実施形態では、観察モード時であっても計測モードであっても常時パターン照明を行ない、分光フィルタ又は画像処理により観察モード時のパターンを消去する方法を用いる。これにより、照明の切り替え機能を不要にできる。以下、この手法について説明する。 Therefore, in the present embodiment, a method is used in which pattern illumination is always performed in the observation mode or in the measurement mode, and the pattern in the observation mode is erased by a spectral filter or image processing. Thereby, the illumination switching function can be made unnecessary. Hereinafter, this method will be described.

 図1は、パターン照明の例を模式的に示したものである。図1には、パターン照明PLの平面視図(例えば、撮像系の光軸に垂直な平面に投影したときの平面視図)と、そのAA断面での光量特性の例とを示す。xは光軸に垂直な方向での位置(座標)である。 FIG. 1 schematically shows an example of pattern illumination. FIG. 1 shows a plan view of the pattern illumination PL (for example, a plan view when projected onto a plane perpendicular to the optical axis of the imaging system) and an example of the light quantity characteristic in the AA section. x is a position (coordinate) in a direction perpendicular to the optical axis.

 As shown in FIG. 1, the illumination pattern is one in which small circle patterns DT, whose brightness differs from that of the surroundings, are regularly arranged. The outside of the small circle patterns DT is illuminated with flat brightness, and the inside of the small circle patterns DT is illuminated more darkly than the outside.

 このようなパターン照明PLにより被写体画像を捉えた場合、図2に示すようにパターン照明PLの光量と被写体の反射係数分布が乗算された結果の波形が撮像波形(センサ出力)として得られる。図2のセンサ出力は、位置xに対応する画素での画素値又は輝度値(又はセンサ面での結像の明るさ)を表す。実線はパターン照明をした場合のセンサ出力を示し、点線はフラット照明をした場合のセンサ出力を示す。 When a subject image is captured by such pattern illumination PL, a waveform obtained by multiplying the amount of light of pattern illumination PL and the reflection coefficient distribution of the subject is obtained as an imaging waveform (sensor output) as shown in FIG. The sensor output in FIG. 2 represents a pixel value or a luminance value (or brightness of image formation on the sensor surface) at a pixel corresponding to the position x. A solid line indicates a sensor output when pattern illumination is performed, and a dotted line indicates a sensor output when flat illumination is performed.
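
 The multiplicative relationship described above (the sensor output obtained as the product of the pattern illumination light quantity and the subject reflection coefficient) can be illustrated with a short one-dimensional sketch in Python. The dot spacing, dot depth, and reflectance profile below are arbitrary illustrative values, not values from the embodiment.

    import numpy as np

    x = np.arange(200)                                   # position x along the AA section
    illumination = np.ones_like(x, dtype=float)          # flat brightness outside the dots
    illumination[(x % 40) < 8] = 0.6                     # darker small circle patterns DT
    reflectance = 0.5 + 0.3 * np.sin(x / 30.0)           # arbitrary subject reflection coefficient

    sensor_output_pattern = illumination * reflectance   # solid line in FIG. 2
    sensor_output_flat = 1.0 * reflectance               # dotted line in FIG. 2 (flat illumination)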

 2. First spectral characteristic example

 FIG. 3 shows a first spectral characteristic example of the pattern illumination and a first spectral characteristic example of the pupils in the observation mode and the stereo measurement mode. The dotted lines labeled B, G, R, and IR represent the spectral characteristics of the color filters of the imaging sensor. B, G, and R are the spectral characteristics of the blue, green, and red filters of the imaging sensor, respectively. IR is an infrared sensitivity characteristic, and infrared light passes through all of the blue, green, and red filters.

 As shown in the upper part of FIG. 3, in the first spectral characteristic example of the pattern illumination, the wavelength range is divided into five bands {Pv1, Pb, Pv2, Pr, Pir} in association with the spectral characteristics of the color filters of the imaging sensor. The illumination light in the bands {Pv1, Pv2, Pir} is non-pattern light (flat light) having a uniform light amount distribution, and the illumination light in the bands {Pb, Pr} is pattern light. Light having the spectral characteristic obtained by combining these becomes the illumination light for the subject.

 The bands Pv1 and Pb are set to bands that pass through the blue filter of the imaging sensor but do not pass through the green filter, the band Pv2 is set to a band that passes through the green filter, and the bands Pr and Pir are set to bands that pass through the red filter but do not pass through the green filter. The band Pir includes the infrared sensitivity characteristic IR; however, the band Pir does not have to include the infrared sensitivity characteristic IR. In this way, the bands Pb and Pr are set to wavelength bands that do not interfere with the spectral characteristic G of the green filter of the imaging sensor. The bands Pb and Pr are set to narrow bands of, for example, several nm to several tens of nm.

 より具体的には、帯域Pbは撮像センサの青色画素(分光特性B)のみが取得可能であって比較的感度が良好な波長域を選択することが望ましい。また帯域Prは撮像センサの赤色画素(分光特性R)のみが取得可能であって比較的感度が良好な波長域を選択するのが望ましい。つまり、左瞳画像と右瞳画像が撮像センサの異なる色画素により分離されなければならないので、相互に受光感度特性がない波長域にて{Pb,Pr}を選べばよい。非パターン照明光の分光成分{Pv1,Pv2,Pir}は通常の観察画像の成分となるので、感度およびカラー品質を確保するために撮像センサの赤色画素、緑色画素、青色画素(分光特性R、G、B)がなるべく多くの波長域をカバーできるようにすることが望ましい。例えば、{Pb,Pr}は狭帯域光源となるように波長450nm,660nmの標準的なレーザー光源を用い、分光成分{Pv1,Pv2,Pir}が多くの波長成分をカバーするようにすることも一つの方法である。 More specifically, for the band Pb, it is desirable to select a wavelength range in which only the blue pixel (spectral characteristic B) of the image sensor can be acquired and the sensitivity is relatively good. For the band Pr, it is desirable to select a wavelength range in which only the red pixel (spectral characteristic R) of the imaging sensor can be acquired and the sensitivity is relatively good. That is, since the left pupil image and the right pupil image must be separated by different color pixels of the image sensor, {Pb, Pr} may be selected in a wavelength region that does not have a mutual light receiving sensitivity characteristic. Since the spectral components {Pv1, Pv2, Pir} of the non-pattern illumination light are components of a normal observation image, the red pixel, the green pixel, and the blue pixel (spectral characteristics R, It is desirable that G and B) can cover as many wavelength ranges as possible. For example, a standard laser light source with wavelengths of 450 nm and 660 nm is used for {Pb, Pr} so as to be a narrow-band light source, and the spectral components {Pv1, Pv2, Pir} may cover many wavelength components. This is one way.

 As shown in the middle and lower parts of FIG. 3, the relationship between the spectral characteristics FL, FC, and FR of the pupils and the spectral characteristics of the illumination light is set as in equation (1) below.

 [Equation (1): shown as an image in the original publication.]

 観察モードでは、分光特性FCの瞳で撮影する。分光特性FCは波長帯域{Pv1,Pv2,Pir}に対応しており、フラット照明による観察画像が得られる。一方、ステレオ計測モードでは、分光特性FLの左瞳と分光特性FRの右瞳で撮影する。分光特性FL、FRは、波長帯域Pb、Prに対応しており、アクティブパターン照明によるステレオ画像が得られる。 In the observation mode, images are taken with the pupil of the spectral characteristic FC. The spectral characteristic FC corresponds to the wavelength band {Pv1, Pv2, Pir}, and an observation image by flat illumination is obtained. On the other hand, in the stereo measurement mode, shooting is performed with the left pupil of the spectral characteristic FL and the right pupil of the spectral characteristic FR. The spectral characteristics FL and FR correspond to the wavelength bands Pb and Pr, and a stereo image by active pattern illumination is obtained.

 The observation mode is a mode in which monocular imaging is performed through the central aperture hole 23 (spectral characteristic FC) of the fixed mask 20, as shown in FIGS. 11, 13, and 15, and the stereo measurement mode is a mode in which stereo imaging is performed through the left pupil aperture hole 21 (spectral characteristic FL) and the right pupil aperture hole 22 (spectral characteristic FR) of the fixed mask 20, as shown in FIGS. 12, 14, and 16. Details of the observation mode and the stereo measurement mode will be described later.

 図4に、第1分光特性例と、観察モード及びステレオ計測モードにおける撮像画像との関係を示す。図4の右図の波形において、点線はフラット照明をした場合のセンサ出力を仮想的に示している。 FIG. 4 shows the relationship between the first spectral characteristic example and the captured images in the observation mode and the stereo measurement mode. In the waveform in the right diagram of FIG. 4, the dotted line virtually indicates the sensor output when flat illumination is performed.

 In the following, the case where the active pattern illumination of the present embodiment is applied to the imaging units of FIGS. 11 to 16 will be described as an example; however, the configuration of the imaging unit is not limited to FIGS. 11 to 16, and any configuration may be used as long as an observation image with the spectral characteristic FC and a stereo image with the spectral characteristics FL and FR can be captured.

 照明光はモードに関わらず同一であり、観察モードでもステレオ計測モードでも被写体にはアクティブパターン照明が投影された状態である。 The illumination light is the same regardless of the mode, and the active pattern illumination is projected onto the subject in both the observation mode and the stereo measurement mode.

 In the observation mode, the reflected light of the illumination light having the non-pattern-light spectral components {Pv1, Pv2, Pir} passes through the pupil center optical path (spectral characteristic FC), and captured images {Vr, Vg, Vb} are obtained. Vr is a red image obtained by the pixels having the red filter of the imaging sensor, Vg is a green image obtained by the pixels having the green filter, and Vb is a blue image obtained by the pixels having the blue filter. In the observation mode, since the bands of the active pattern are excluded, an observation image that is not affected by the active pattern is obtained.

 ステレオ計測モードでは、パターン光の分光成分{Pb,Pr}をもった照明光による反射光が左右瞳光路を通過し撮像画像{Mr,Mb}が得られる。Mrは左瞳光路(分光特性FL)による画像であり、Mbは右瞳光路(分光特性FR)による画像である。ステレオ計測モードでは、平滑な被写体面においても意図的にパターンが形成されるので、撮像画像{Mr,Mb}のマッチングが取りやすく位相差検出が容易になる。 In the stereo measurement mode, the reflected light by the illumination light having the spectral component {Pb, Pr} of the pattern light passes through the left and right pupil optical paths, and a captured image {Mr, Mb} is obtained. Mr is an image by the left pupil optical path (spectral characteristic FL), and Mb is an image by the right pupil optical path (spectral characteristic FR). In the stereo measurement mode, since a pattern is intentionally formed even on a smooth subject surface, the captured image {Mr, Mb} can be easily matched and phase difference detection can be easily performed.

 以上の実施形態によれば、撮像装置(内視鏡装置)は、第1色の画像と第2色の画像と第3色の画像を含む撮像画像(Vb、Vg、Vr)を撮影すると共に第1色の画像と第3色の画像をステレオ画像(Mb、Mr)として撮像可能な撮像部と、第1の波長帯域Pb及び第2の波長帯域Prにおいて所与の光量分布を有するパターン照明を被写体に照射する照明部と、を含む。第2色(緑色)の画像は、第1色(青色)よりも長波長側の画像であり、第3色(赤色)の画像は、第2色よりも長波長側の画像である。第1の波長帯域Pbは、第1色の波長帯域(分光特性Bの帯域)に含まれると共に第2色の波長帯域(分光特性Gの帯域)に含まれない帯域である。第2の波長帯域Prは、第3色の波長帯域(分光特性Rの帯域)に含まれると共に第2色の波長帯域(分光特性Gの帯域)に含まれない帯域である。 According to the above embodiment, the imaging device (endoscope device) captures the captured images (Vb, Vg, Vr) including the first color image, the second color image, and the third color image. An imaging unit capable of capturing a first color image and a third color image as a stereo image (Mb, Mr), and pattern illumination having a given light amount distribution in the first wavelength band Pb and the second wavelength band Pr And an illumination unit that irradiates the subject. The second color (green) image is an image on the longer wavelength side than the first color (blue), and the third color (red) image is an image on the longer wavelength side than the second color. The first wavelength band Pb is a band that is included in the first color wavelength band (spectral characteristic B band) and not included in the second color wavelength band (spectral characteristic G band). The second wavelength band Pr is a band that is included in the third color wavelength band (spectral characteristic R band) and not included in the second color wavelength band (spectral characteristic G band).

 In this way, since the first wavelength band Pb and the second wavelength band Pr are not included in the wavelength band of the second color (green), the pattern of the given light amount distribution does not appear in at least the image of the second color. This makes it possible to switch between stereo measurement and normal observation without switching between pattern illumination and flat illumination.

 That is, the stereo image is composed of the image of the first color (blue) and the image of the third color (red); the pattern in the first wavelength band Pb appears in the image of the first color, and the pattern in the second wavelength band Pr appears in the image of the third color. A stereo image intentionally given features by the pattern is thus acquired, and high-accuracy stereo measurement becomes possible by performing matching processing on the pair. In addition, by capturing the observation image through a spectral filter (spectral characteristic FC) that does not pass the first wavelength band Pb and the second wavelength band Pr as described above, an observation image in which the pattern of the given light amount distribution does not appear is obtained. Alternatively, as will be described later with reference to FIGS. 5 to 8, an observation image over the entire band of white light may be captured, and the images of the first color (blue) and the third color (red), in which the pattern appears, may be corrected using the image of the second color (green), in which the pattern does not appear, to obtain an observation image from which the pattern has been removed. In this way, because the pattern of the given light amount distribution is not applied in the wavelength band of the second color, an observation image like that under flat illumination can be obtained while pattern illumination continues to be performed.

 なお、ここでは第1の波長帯域が図3の波長帯域Pbであり、第2の波長帯域が図3の波長帯域Prであるとしたが、これに限定されない。例えば第1の波長帯域が図5の波長帯域Pb1であり、第2の波長帯域が図5の波長帯域Pr2であってもよい。 Note that, here, the first wavelength band is the wavelength band Pb in FIG. 3 and the second wavelength band is the wavelength band Pr in FIG. 3, but the present invention is not limited to this. For example, the first wavelength band may be the wavelength band Pb1 in FIG. 5, and the second wavelength band may be the wavelength band Pr2 in FIG.

 Here, the given light amount distribution (a light amount distribution of a given shape) is a distribution having boundaries between light and dark (large and small light amounts), or a distribution having edge portions where the light amount changes abruptly, and a plurality of such boundaries or edge portions are provided within the irradiation area of the pattern illumination. By imaging a subject illuminated with pattern illumination having this given light amount distribution, the shapes and arrangement of the boundaries and edge portions of the given light amount distribution appear in the captured image, and the features necessary for stereo measurement can thereby be given to the image. In the present embodiment, this given light amount distribution is applied only to the specific wavelength bands Pb and Pr. Although in FIG. 1 small circle patterns DT darker than their surroundings are regularly arranged in the wavelength bands Pb and Pr, the given light amount distribution is not limited to this. For example, the patterns DT do not have to be small circles, the arrangement of the patterns DT does not have to be regular, and the inside of the patterns DT may be illuminated more brightly than the outside.

 また本実施形態では、撮像部は、ステレオ画像を撮像するステレオモードと、単眼により撮像画像を撮像する非ステレオモードとを切り替える。 In this embodiment, the imaging unit switches between a stereo mode for capturing a stereo image and a non-stereo mode for capturing a captured image with a single eye.

 For example, in the imaging units described later with reference to FIGS. 11 to 16, in the non-stereo mode (observation mode), monocular imaging is performed through the central aperture hole 23 (spectral characteristic FC) of the fixed mask 20, and in the stereo mode (stereo measurement mode), stereo imaging is performed through the left pupil aperture hole 21 (spectral characteristic FL) and the right pupil aperture hole 22 (spectral characteristic FR) of the fixed mask 20. Note that the imaging unit to which the pattern illumination of the present embodiment can be applied is not limited to this, and any imaging unit that captures the image of the first color and the image of the third color as a stereo image in the stereo mode may be used.

 In this way, by switching between the stereo mode, in which the image of the first color and the image of the third color are captured as a stereo image, and the non-stereo mode, in which the images of the first to third colors are captured monocularly as a captured image, it is possible to switch between stereo measurement and observation image capturing. In this case, by using the pattern illumination described above, it becomes possible to switch between stereo measurement and observation image capturing while pattern illumination is performed at all times.

 In the present embodiment, the monocular in the non-stereo mode passes the wavelength bands {Pv1, Pv2, Pir}, that is, the wavelength bands of the first color, the second color, and the third color (the wavelength band of white light) excluding the first wavelength band Pb and the second wavelength band Pr.

 Since the pattern of the given light amount distribution is applied to the first wavelength band Pb and the second wavelength band Pr, the wavelength bands {Pv1, Pv2, Pir} passed by the monocular carry no pattern of the given light amount distribution. As a result, an observation image like that obtained with flat illumination can be captured in the non-stereo mode even though pattern illumination is being performed. In particular, when the first wavelength band Pb and the second wavelength band Pr are narrow bands as described above, the remaining wavelength bands {Pv1, Pv2, Pir} cover substantially the whole wavelength band of white light, so an image comparable to a captured image under white-light illumination can be obtained.

 また本実施形態では、図18等で後述するように、撮像装置が位相差検出部330と画像出力部(カラー画像生成部320)を含んでもよい。位相差検出部330は、ステレオモードにおいて撮像された第1色の画像と第3色の画像との間の位相差を検出する。画像出力部は、非ステレオモードにおいて撮像された撮像画像(第1色~第3色の画像)に基づいて観察用の画像を出力する。 In this embodiment, as will be described later with reference to FIG. 18 and the like, the imaging apparatus may include a phase difference detection unit 330 and an image output unit (color image generation unit 320). The phase difference detection unit 330 detects a phase difference between the first color image and the third color image captured in the stereo mode. The image output unit outputs an image for observation based on the captured image (first color to third color image) captured in the non-stereo mode.

 上述のように第1色の画像と第3色の画像には、パターン照明によるパターンが写っているため、位相差検出部330によって高精度な位相差検出を行うことが可能である。また、非ステレオモードにおいて撮像された撮像画像には、パターン照明を行っているにも関わらずパターンが写っていないため、画像出力部によって観察用の画像を出力できる。 As described above, since the pattern of pattern illumination is reflected in the first color image and the third color image, the phase difference detection unit 330 can detect the phase difference with high accuracy. In addition, since the pattern is not captured in the captured image captured in the non-stereo mode even though pattern illumination is performed, an image for observation can be output by the image output unit.

 In the present embodiment, the pattern illumination has a flat light amount distribution in the wavelength band of the first color (spectral characteristic B) excluding the first wavelength band Pb, in the wavelength band of the second color (spectral characteristic G), and in the wavelength band of the third color (spectral characteristic R) excluding the second wavelength band Pr.

 このようにすれば、第1の波長帯域Pbと第2の波長帯域Pr以外ではフラットな光量分布の照明光となるので、少なくとも第2色の画像はフラット照明による画像となる。これにより、上述したようにパターン照明を常時行いながらステレオ計測と観察画像の撮影を切り替えることが可能となる。 In this way, since the illumination light has a flat light amount distribution other than the first wavelength band Pb and the second wavelength band Pr, at least the second color image is an image by flat illumination. As a result, as described above, it is possible to switch between stereo measurement and observation image capturing while always performing pattern illumination.

 ここで、フラットな光量分布(フラット照明)とは、撮像部により撮影される撮影領域(視野)において光量分布が一定(略一定)ということである。具体的には、撮像部からの距離が一定の面(光軸に垂直な面)での光量分布が一定ということである。なお、完全に光量分布が一定である必要はなく、例えばパターン照明のような急激な光量変化(エッジ部)がない緩やかな光量変化があってもよい。或いは、撮影領域の周辺部において中央部よりも光量が緩やかに下降するような、通常の照明で考えられるような光量分布があってもよい。 Here, the flat light quantity distribution (flat illumination) means that the light quantity distribution is constant (substantially constant) in the photographing region (field of view) photographed by the imaging unit. Specifically, the light amount distribution on a surface having a constant distance from the imaging unit (a surface perpendicular to the optical axis) is constant. Note that the light amount distribution need not be completely constant, and there may be a gradual light amount change without an abrupt light amount change (edge portion) such as pattern illumination. Alternatively, there may be a light amount distribution that can be considered in normal illumination, in which the light amount falls more slowly in the peripheral part of the imaging region than in the central part.

 In this embodiment, the first color is blue, the second color is green, and the third color is red.

 The first-color image, the second-color image, and the third-color image correspond to a blue image, a green image, and a red image, respectively, but these images are not limited to images captured by an image sensor having primary-color filters (for example, an image sensor with a primary-color Bayer arrangement). For example, a complementary-color image may be captured by an image sensor having complementary-color filters, and the blue, green, and red images may be obtained from the complementary-color image by a conversion process.

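 For illustration only, one common rough approximation of such a conversion treats the complementary channels as Cy = G + B, Mg = R + B, and Ye = R + G and inverts them linearly; the coefficients below are an assumption for this sketch, not the conversion used in this embodiment, which would use a color matrix calibrated to the actual filter characteristics.

import numpy as np

def complementary_to_rgb(cy, mg, ye):
    # Rough inversion of Cy = G + B, Mg = R + B, Ye = R + G (illustrative only).
    r = 0.5 * (ye + mg - cy)
    g = 0.5 * (ye + cy - mg)
    b = 0.5 * (mg + cy - ye)
    return np.clip(r, 0, None), np.clip(g, 0, None), np.clip(b, 0, None)
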
 3. Second Spectral Characteristic Example
 In the first spectral characteristic example, the imaging bands of the observation mode and the stereo measurement mode are separated. The second spectral characteristic example deals with the case where the imaging bands of the observation mode and the measurement mode are not separated. If illumination in which non-pattern light and pattern light are combined without separating the wavelength bands can be used at all times regardless of the mode, the wavelength bands are used effectively and no illumination switching mechanism is required, which is advantageous for securing imaging sensitivity, covering the color information of the subject, and simplifying the illumination mechanism. When the wavelength bands are not separated, however, the image captured in the observation mode is also affected by the pattern light, so the influence of the pattern light must be removed or reduced in order to obtain a high-quality, faithful observation image.

 FIG. 5 shows a second spectral characteristic example of the pattern illumination and of the pupils in the observation mode and the stereo measurement mode.

 In the spectral characteristics of the pattern illumination, the wavelength range is divided into five bands {Pb1, Pb2, Pr1, Pr2, Pir} in association with the spectral characteristics of the color filters of the image sensor. The illumination light in the bands {Pb2, Pr1, Pir} is non-pattern light (flat light) with a uniform light quantity distribution, and the illumination light in the bands {Pb1, Pr2} is pattern light. Light having the spectral characteristic obtained by combining these components illuminates the subject.

 Pb1 is set to a band that passes through the blue filter but not the green filter, and Pb2 is set to a band that passes through both the blue filter and the green filter. Pr1 is set to a band that passes through both the red filter and the green filter, and Pr2 is set to a band that passes through the red filter but not the green filter. Pir is a band corresponding to the infrared sensitivity characteristic IR and passes through all of the red, green, and blue filters. The wavelength bands Pb1 and Pr2 are thus set to bands that do not interfere with the spectral characteristic G of the green filter of the image sensor.

 In the observation mode, images are captured through the pupil having the spectral characteristic FC, which includes the wavelength bands {Pb1, Pb2, Pr1, Pr2, Pir}. For example, no filter is provided in the central aperture 23 of the fixed mask 20, so the central pupil optical path passes light of the entire band. The relationship between the red image Vr, the green image Vg, and the blue image Vb constituting the color image captured in the observation mode and the wavelength bands that they cover is given by the following equation (2):

 Vr = {Pr1, Pr2, Pir},  Vg = {Pb2, Pr1, Pir},  Vb = {Pb1, Pb2, Pir}   ... (2)

 In the stereo measurement mode, images are captured through the left pupil having the spectral characteristic FL and the right pupil having the spectral characteristic FR. The spectral characteristic FL corresponds to the wavelength band Pb1, and the spectral characteristic FR corresponds to the wavelength band Pr2. That is, the relationship between the red image Mr and the blue image Mb captured in the stereo measurement mode and the wavelength bands that they cover is given by the following equation (3):

 Mb = {Pb1},  Mr = {Pr2}   ... (3)

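 The band coverage in equations (2) and (3) follows directly from the filter definitions above. The following minimal sketch (illustrative only; the band names are those used in the text) reproduces that bookkeeping in code.

# Which sensor color filters pass each illumination band (from the definitions above).
PASSES = {
    "Pb1": {"B"},            # blue filter only
    "Pb2": {"B", "G"},       # blue and green filters
    "Pr1": {"R", "G"},       # red and green filters
    "Pr2": {"R"},            # red filter only
    "Pir": {"R", "G", "B"},  # infrared band seen by all filters
}

def bands_covered(color, illumination_bands):
    return {band for band in illumination_bands if color in PASSES[band]}

# Observation mode: the central pupil (FC) passes all five bands -> equation (2).
obs = ["Pb1", "Pb2", "Pr1", "Pr2", "Pir"]
Vr, Vg, Vb = (bands_covered(c, obs) for c in ("R", "G", "B"))

# Stereo measurement mode: FL passes only Pb1, FR passes only Pr2 -> equation (3).
Mb = bands_covered("B", ["Pb1"])   # left pupil  -> blue image
Mr = bands_covered("R", ["Pr2"])   # right pupil -> red image
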
 FIG. 6 shows the relationship between the second spectral characteristic example and the images captured in the observation mode and the stereo measurement mode. In the waveforms in the right part of FIG. 6, the dotted lines indicate, hypothetically, the sensor output that would be obtained under flat illumination.

 In the observation mode, light reflected from the subject under illumination having the spectral components {Pb1, Pb2, Pr1, Pr2, Pir} passes through the central pupil optical path (spectral characteristic FC), and the captured images {Vr, Vg, Vb} are obtained. In the stereo measurement mode, the reflected light of the pattern-light spectral components {Pb1, Pr2} passes through the left and right pupil optical paths, and the captured images {Mr, Mb} are obtained.

 In the observation mode, the observation images Vb and Vr are affected by the pattern illumination and therefore contain deliberately introduced brightness variations. In contrast, the observation image Vg is composed of the uniformly illuminated wavelength bands {Pb2, Pr1, Pir}, so no such brightness variation occurs and its profile reflects only the reflectance of the subject.

 4. Correction Processing
 When the second spectral characteristic example is used, the observation images Vr and Vb have profiles affected by the pattern. However, compared with the profiles that would be obtained under uniform illumination (the dotted lines in FIG. 6), they are locally highly similar to the observation image Vg, even though the average brightness differs. This is because the spectral characteristics of the image sensor overlap between Vb and Vg in the band Pb2 and between Vr and Vg in the band Pr1, so these images are mutually correlated to a considerable degree. Therefore, the observation images Vr and Vb can be corrected by using the observation image Vg, which is not affected by the pattern, and the influence of the pattern can be removed or reduced. In other words, an observation image can be restored as if it had been captured under flat illumination. This correction processing is described below.

 FIGS. 7 and 8 are explanatory diagrams of the correction processing. The correction of the blue image Vb is described below as an example, but the red image Vr can be corrected in the same way.

 As shown in FIG. 7, a correlation value between the waveforms of the blue image Vb and the green image Vg is calculated over a section of width d centered on an arbitrary position XL on the sensor surface of the image sensor. This is performed for every position x on the sensor surface (that is, for every pixel of the captured image). For example, when ZNCC (zero-mean normalized cross-correlation) is used, the correlation value is 1 when the similarity is highest and approaches 0 as the similarity decreases.

 The correlation value is compared with a threshold Th; the flag value is set to "1" when the correlation value is greater than or equal to the threshold Th, and to "0" when the correlation value is smaller than the threshold Th. That is, the flag value binarizes the similarity into present/absent. In the correction processing, pixels with the flag value "1" are treated as valid pixels, and pixels with the flag value "0" are treated as invalid pixels.

 The correlation value need not be obtained for every pixel of the captured image; for example, it may be obtained for pixels in a predetermined region, or for pixels thinned out at a predetermined interval. The determination of the flag value is also not limited to the above; for example, when a correlation measure whose value decreases as the similarity increases is used, the flag value may be set to "1" when the correlation value is less than or equal to the threshold Th and to "0" when the correlation value is greater than the threshold Th.

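 The following is a minimal sketch of the per-position correlation and flagging described above, assuming ZNCC over a one-dimensional window; the window half-width and threshold values are illustrative assumptions.

import numpy as np

def zncc(a, b):
    # zero-mean normalized cross-correlation of two equal-length 1-D windows
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def validity_flags(vb_row, vg_row, half_d=8, th=0.9):
    # flags[x] = 1 when the local waveforms of Vb and Vg around x are similar, else 0
    n = len(vb_row)
    flags = np.zeros(n, dtype=np.uint8)
    for x in range(half_d, n - half_d):
        window_vb = vb_row[x - half_d : x + half_d + 1]
        window_vg = vg_row[x - half_d : x + half_d + 1]
        flags[x] = 1 if zncc(window_vb, window_vg) >= th else 0
    return flags
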
 Next, as shown in FIG. 8, when the flag value at a position XL of the blue image Vb is "1", nothing is done and the pixel value Vb(XL) of the blue image Vb at the position XL is used as it is as the corrected value Vb'(XL). When the flag value at the position XL of the blue image Vb is "0", a section of width w centered on the position XL is considered in the blue image Vb and the green image Vg, and a section fitting process between the blue image Vb and the green image Vg is performed using only the pixels whose flag value is "1" in that section. In the example of FIG. 8, the valid pixel ranges of the target section of width w are denoted e1 and e2. The invalid pixel ranges, whose flag value is "0", are not used for the section fitting process.

 As the fitting process, for example, the level of the green image Vg may be varied, the sum of the absolute values of the differences between the green image Vg and the blue image Vb over the valid pixel ranges e1 and e2 of the processing section may be obtained for each level, and the two images may be superposed with the level that minimizes this sum. Alternatively, the gain of the green image Vg may be varied, the sum of the absolute values of the differences between the green image Vg and the blue image Vb may be obtained for each gain, and the two images may be superposed with the gain that minimizes this sum.

 The pixel value Vg(XL) of the green image Vg at the position XL after the fitting process is used as the corrected value Vb'(XL). The series of correction processes described above is performed at every position x on the sensor surface (that is, for every pixel of the captured image) to generate a corrected blue image Vb'. A corrected red image Vr' is generated in the same way. These corrected images are combined with the already captured green image Vg to reconstruct an RGB image as an observation image for display.

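 A minimal sketch of the section fitting and replacement step follows, assuming the level-shift variant; it uses the fact that the offset minimizing the sum of absolute differences over the valid pixels is their median difference. The section half-width is an illustrative assumption.

import numpy as np

def correct_pixel(vb_row, vg_row, flags, x, half_w=16):
    # Keep Vb(x) where Vb and Vg are locally similar; otherwise fit Vg to Vb over
    # the valid pixels of the section around x and use the fitted Vg value at x.
    if flags[x] == 1:
        return vb_row[x]
    lo, hi = max(0, x - half_w), min(len(vb_row), x + half_w + 1)
    valid = flags[lo:hi] == 1
    if not valid.any():
        return vb_row[x]  # no valid pixels to fit against; keep the original value
    offset = np.median(vb_row[lo:hi][valid] - vg_row[lo:hi][valid])  # level-shift fit
    return vg_row[x] + offset

def correct_row(vb_row, vg_row, flags, half_w=16):
    # Produces one row of the corrected image Vb'
    return np.array([correct_pixel(vb_row, vg_row, flags, x, half_w)
                     for x in range(len(vb_row))])
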
 By performing the correction processing described above, a high-quality observation image can be generated even with illumination light that includes pattern light.

 According to the embodiment described above, the monocular (central) pupil in the non-stereo mode passes a wavelength band that includes the wavelength bands of the first color (blue), the second color (green), and the third color (red).

 With this configuration, the first-color image and the third-color image, in which the pattern of the pattern illumination appears, and the second-color image, in which the pattern does not appear, are captured in the non-stereo mode. By using these three color images, the pattern produced by the pattern illumination can be removed (reduced), and an observation image can be obtained.

 In this embodiment, as will be described later with reference to FIG. 18 and other figures, the imaging apparatus may include the phase difference detection unit 330 and the image output unit (color image generation unit 320). The phase difference detection unit 330 detects the phase difference between the first-color image and the third-color image captured in the stereo mode. The image output unit outputs an image for observation based on the captured images (the first-color to third-color images) acquired in the non-stereo mode. The image output unit also corrects, based on the second-color image, the changes in the pixel values of the first-color image and the third-color image caused by the given light quantity distribution.

 When the pattern illumination of this embodiment is used, the pattern of the pattern illumination does not appear in the second-color image. The pixel values of the first-color image and the third-color image, in which the pattern appears, can therefore be corrected with the second-color image as a reference. That is, in ordinary images (for example, images of subjects normally captured with an industrial endoscope, or images of natural scenes), the profiles of the first-color to third-color images can be regarded as nearly similar to one another. The influence of the pattern illumination can thus be corrected by correcting the profiles of the first-color image and the third-color image so that they resemble the profile of the second-color image, in which no pattern appears.

 5. Illumination Unit
 An illumination unit that performs the active pattern illumination of this embodiment is described below. The description takes as an example an apparatus, such as an endoscope scope, in which the exit of the illumination light is located away from the light source unit, but the configuration is not necessarily limited to an apparatus that uses a light guide member from the light source unit to the exit end.

 FIG. 9 shows a first configuration example of the illumination unit. The illumination unit of FIG. 9 includes a white light source 401, a light guide member 402 (light guide member for non-pattern light), an illumination lens 403, a red laser light source 404, a blue laser light source 405, dichroic prisms 406 and 407 (mirrors in a broad sense), a light guide member 408 (light guide member for pattern light), a mask pattern 409, and a projection lens 410.

 In this first configuration example, the illumination unit has two light guide members 402 and 408. One light guide member 402 guides the white light from the white light source 401 to the illumination lens 403 provided at the distal end of the scope, and the guided white light illuminates the subject through the illumination lens 403. The other light guide member 408 guides the blue laser light and the red laser light to the projection lens 410 provided at the distal end of the scope. That is, the blue laser light from the blue laser light source 405 and the red laser light from the red laser light source 404 are combined into one optical path by the dichroic prisms 406 and 407 and enter the light guide member 408. The laser light guided by the light guide member 408 passes through the mask pattern 409, and the pattern added thereby is projected onto the subject 5 by the projection lens 410.

 In this first configuration example, the non-pattern light and the pattern light are combined into the illumination on the surface of the subject 5.

 FIG. 10 shows a second configuration example of the illumination unit. The illumination unit of FIG. 10 includes a white light source 451, a polarizing element 452, a blue laser light source 453, a red laser light source 454, dichroic prisms 455 and 456, a mask pattern 457, a polarizing element 458, a prism 459 (combining prism), a light guide member 460, and a projection lens 461.

 In this second configuration example, the illumination unit has a single light guide member 460. The white light from the white light source 451 is converted by the polarizing element 452 into, for example, P-polarized light (polarized perpendicular to S-polarized light), and the polarized white light enters the prism 459. The blue laser light from the blue laser light source 453 and the red laser light from the red laser light source 454 are combined into one optical path by the dichroic prisms 455 and 456. The combined laser light passes through the mask pattern 457, whereby the pattern is added. The laser light to which the pattern has been added is converted by the polarizing element 458 into, for example, S-polarized light (polarized parallel to the reflecting surface of the prism 459), and the polarized laser light enters the prism 459. The white light, which is non-pattern light, and the laser light, which is pattern light, are combined by the prism 459 and enter the light guide member 460. The light guide member 460 guides the incident light to the projection lens 461 provided at the distal end of the scope, and the guided non-pattern light and pattern light are projected onto the subject 5 by the projection lens 461.

 In this second configuration example, the pattern illumination is generated before entering the light guide member 460, combined with the non-pattern light, and then made incident on the light guide member 460. If the light guide member 460 is an optical fiber bundle that can transmit the pattern as it is (an image guide), the pattern of the generated pattern light is conveniently transmitted. If the light guide member 460 is a simple optical fiber bundle (light guide), the fiber arrangement is likely to differ between its entrance end and its exit end, so the pattern of the generated pattern light cannot be transmitted unchanged. If a random pattern is sufficient, however, it does not matter if the pattern changes to some extent between the entrance end and the exit end.

 6. Imaging Unit
 A configuration example of an imaging unit capable of switching between the observation mode and the stereo measurement mode is described below.

 In an inspection with an endoscope apparatus, for example, the scope is inserted into the inspection target and normal images are captured while checking for abnormalities; when a portion to be examined in detail, such as a flaw, is found, its three-dimensional shape is measured to decide whether further inspection is necessary. The normal observation image is thus captured with white light. As a way of achieving both such white-light imaging and stereo measurement, performing stereo imaging with white light is conceivable. When white light is used for stereo imaging, however, the image sensor must be divided into left and right regions on which the left image and the right image are formed, so the images have low resolution. The color phase difference method can form the left image and the right image on the same region of the image sensor, but the captured image is then a color-shifted image and cannot be used as an observation image.

 For the above reasons, time-division switching (for example, Patent Document 1) is required in order to capture the left image and the right image on the same region of the image sensor with white light. However, when the imaging system and the subject move relative to each other, there is motion blur between the left image and the right image, so the triangulation becomes inaccurate. Motion blur is particularly likely to occur when the camera cannot be fixed with respect to the subject, as in an endoscope.

 As a technique for performing stereo measurement by color phase difference without time division, there is, for example, Patent Document 2 mentioned above. However, Patent Document 2 applies stereo measurement to autofocus and is considered not to assume fast switching with an observation image. As described above, it has two filters as movable parts, which is considered disadvantageous in terms of fast switching.

 In addition, the configuration of Patent Document 2 merely divides a single optical path into left and right halves at its center, so it is difficult to separate the pupils widely, and it is difficult to obtain good distance measurement accuracy. Since an endoscope apparatus requires pan focus, the aperture is small (the F-number is large); dividing that small aperture diameter into left and right tends to make the distance between the pupils short.

 Furthermore, in time-division switching, including left-right time-division switching in stereo, a shutter or a spectral filter must be moved (switched) mechanically. Since mechanical movement can cause mistakes and failures, it is necessary to detect which switching state (position) the shutter or spectral filter is in and to recover from any error. When such a detection function is implemented, detection and recovery are easier when there are fewer types of error. In the configuration of Patent Document 2, for example, multiple types of error can occur, such as neither of the two spectral filters being inserted into the pupil or only one of them being inserted, which makes reliable detection and recovery difficult.

 FIGS. 11 and 12 show a configuration example of the imaging unit of this embodiment that can solve the above problems. Each of FIGS. 11 and 12 shows a cross-sectional view of the imaging unit seen from the side (in a plane including the optical axis) and the relationship between the position x and the light quantity of the image formed on the image sensor (or the pixel values of the image captured by the image sensor). The position x is a position (coordinate) in the direction perpendicular to the optical axis of the imaging optical system, for example a pixel position of the image sensor. The coordinate system is actually two-dimensional, but it is described here as the one-dimensional coordinate system along the parallax direction.

 The imaging unit includes an imaging optical system 10, a movable mask 30 (first mask), a fixed mask 20 (second mask), an image sensor 40 (imaging sensor), and an illumination unit 60 (illumination device). The imaging optical system 10 is a monocular optical system composed of, for example, one or more lenses. The case where the image sensor 40 has RGB Bayer-array color filters is described here as an example, but the configuration is not limited to this and may use, for example, complementary-color filters.

 As shown in FIGS. 11 and 12, the light reflected from the subject 5 is imaged on the surface of the image sensor 40 by the imaging optical system 10. The fixed mask 20 divides the pupil into the pupil center and the left and right pupils, and the movable mask 30 switches between imaging through the pupil center and imaging through the left and right pupils; both are imaged on the same region of the image sensor 40. d is the distance between the center line IC1 of the left pupil (the left-eye aperture of the fixed mask 20) and the center line IC2 of the right pupil (the right-eye aperture of the fixed mask 20), and serves as the baseline length in stereo measurement. The straight line AXC is the optical axis of the imaging optical system 10. The center lines IC1 and IC2 are provided, for example, at equal distances from the optical axis AXC of the single-lens imaging optical system 10. The center lines IC1 and IC2 and the optical axis AXC are preferably in the same plane, but do not necessarily have to be.

 The fixed mask 20 and the movable mask 30 are provided, for example, at the pupil position of the imaging optical system 10; alternatively, they may be provided on the image side of the imaging optical system 10. The fixed mask 20 is fixed with respect to the imaging optical system 10, while the movable mask 30 is configured so that its position can be switched within a plane perpendicular to the optical axis AXC. The movable mask 30 can take two states: the observation mode (first mode, non-stereo mode, monocular mode), which is the first state shown in FIG. 11, and the stereo measurement mode (second mode, stereo mode), which is the second state shown in FIG. 12; these can be switched at high speed.

 The fixed mask 20 includes a plate-shaped light shielding portion (light shielding member) provided with three apertures (a left-eye aperture, a right-eye aperture, and a central aperture), a short-wavelength (blue) spectral filter provided in the left-eye aperture, and a long-wavelength (red) spectral filter provided in the right-eye aperture. The portions other than the apertures are covered by the light shielding portion so that no light passes. The central aperture may be, for example, a through hole, or may be provided with some spectral filter (for example, a broadband spectral filter that transmits at least white light).

 The movable mask 30 includes a plate-shaped light shielding portion (light shielding member) provided with three apertures. The movable mask 30 is sized so that, in each mode, its light shielding portion can cover either the central aperture or the left-eye and right-eye apertures of the fixed mask 20. Its apertures are provided at positions that overlap the central aperture of the fixed mask 20 in the observation mode and overlap the left-eye aperture and the right-eye aperture in the stereo measurement mode. Hereinafter, for convenience, the apertures of the movable mask 30 are also referred to as the left-eye aperture, the right-eye aperture, and the central aperture. FIGS. 11 and 12 illustrate the case where the movable mask 30 is provided on the image side of the fixed mask 20, but the movable mask 30 may be provided on the object side of the fixed mask 20.

 The illumination unit 60 is preferably provided so that its distal end (the illumination exit end) is positioned symmetrically with respect to the left pupil and the right pupil, but the distal end of the illumination unit 60 does not necessarily have to be symmetric with respect to the left and right pupils. In FIGS. 11 and 12, the distal end of the illumination unit 60 is arranged in front of the imaging optical system 10, but the arrangement is not limited to this; for example, the illumination unit 60 and the imaging optical system 10 may be arranged side by side at the distal end of the imaging unit.

 Hereinafter, the spectral characteristics of the left-eye aperture, the right-eye aperture, and the central aperture of the fixed mask 20 are denoted FL, FR, and FC. For ease of understanding, the spectral filters provided in the respective apertures are also denoted by the same symbols FL, FR, and FC. The apertures of the movable mask 30 are not provided with spectral filters (they are open holes) and pass the entire band.

 FIG. 11 shows the state of the observation mode: the optical path through the pupil center is open via the central aperture of the fixed mask 20 and the central aperture of the movable mask 30, and the optical paths of the left and right pupils are blocked (shielded) by the movable mask 30. In this case, the image formed on the image sensor 40 is the image IC formed only through the pupil center, and a normal captured image (a monocular white-light image) is obtained.

 FIG. 12, on the other hand, shows the state of the stereo measurement mode: the left-eye aperture of the fixed mask 20 and the left-eye aperture of the movable mask 30 overlap, the right-eye aperture of the fixed mask 20 and the right-eye aperture of the movable mask 30 overlap, and the optical path through the pupil center is blocked (shielded) by the movable mask 30. In the optical path on the left pupil side, the imaging light is filtered by the short-wavelength (blue) spectral filter FL (first filter), and the image IL formed by the short-wavelength component is formed on the image sensor 40. In the optical path on the right pupil side, the imaging light is filtered by the long-wavelength (red) spectral filter FR (second filter), and the image IR formed by the long-wavelength component is formed on the same image sensor 40.

 In the stereo measurement mode, therefore, the image IL obtained from the blue pixels of the image sensor 40 is a short-wavelength image, the image IR obtained from the red pixels is a long-wavelength image, and the images IL and IR from the two optical paths can be acquired separately. In other words, in the stereo measurement mode, the left-eye image IL and the right-eye image IR, which have a phase difference, can be obtained simultaneously and independently, and stereo measurement based on the phase difference images becomes possible. Moreover, the imaging unit of this embodiment can take the observation mode as the first state and the stereo measurement mode as the second state and can switch between these states at high speed. This makes it possible, for example, to perform 3D measurement in real time while performing non-stereo normal observation.

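 As an illustrative sketch only, separating the two viewpoint images from a single capture in the stereo measurement mode amounts to taking the blue and red channels of the (demosaicked) RGB frame; the array layout assumed below is an illustration, not a specification of the device.

import numpy as np

def split_stereo_pair(rgb):
    # rgb: H x W x 3 array captured in the stereo measurement mode (R, G, B channel order assumed)
    IL = rgb[:, :, 2].astype(np.float32)  # blue channel -> left-pupil (short-wavelength) image IL
    IR = rgb[:, :, 0].astype(np.float32)  # red channel  -> right-pupil (long-wavelength) image IR
    return IL, IR
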
 7. First Detailed Configuration Example of the Fixed Mask and the Movable Mask
 FIGS. 13 and 14 show a first detailed configuration example of the fixed mask 20 and the movable mask 30. Each figure shows a cross-sectional view of the imaging optical system 10, the fixed mask 20, and the movable mask 30, and a view of the fixed mask 20 and the movable mask 30 seen in the optical axis direction (a rear view seen from the image side).

 In the fixed mask 20, an aperture 21 having the short-wavelength filter FL is opened in the optical path of the left pupil, an aperture 22 having the long-wavelength spectral filter FR is formed in the optical path of the right pupil, and an open (through-hole) aperture 23 is provided in the optical path of the pupil center. The aperture 23 may be provided with a spectral filter FC that passes the bands Pv1, Pv2, and Pir of FIG. 3. The apertures 21 and 22 are opened in a light shielding portion 24 (light shielding member) and are, for example, holes of a size corresponding to the depth of field required of the imaging system (for example, circular holes whose size is the diameter). The centers of the apertures 21, 22, and 23 (for example, the centers of the circles) coincide (or substantially coincide) with the center line IC1, the center line IC2, and the optical axis AXC, respectively. The light shielding portion 24 is provided so as to close off the housing containing the imaging optical system 10 when the housing is viewed from the front (or the rear), and is, for example, a plate-shaped member perpendicular to the optical axis AXC.

 The movable mask 30 has open (through-hole) apertures 31, 32, and 33 and a light shielding portion 34 (light shielding member) in which the apertures 31, 32, and 33 are opened. The apertures 31, 32, and 33 are, for example, holes slightly larger than the apertures 21, 22, and 23 of the fixed mask 20; alternatively, they may be holes of a size corresponding to the depth of field required of the imaging system (for example, circular holes whose size is the diameter). The center of the aperture 33 (for example, the center of the circle) coincides (or substantially coincides) with the optical axis AXC in the observation mode. The light shielding portion 34 is connected to a rotation shaft 35 perpendicular to the optical axis AXC and is, for example, a plate-shaped member provided perpendicular to the optical axis AXC. The shape of the light shielding portion 34 is, for example, a fan shape (the root of the fan is connected to the shaft 35), but it is not limited to this and may be any shape that can realize the states of FIGS. 13 and 14.

 The movable mask 30 is configured to rotate by a predetermined angle about the rotation shaft 35 in a direction perpendicular to the optical axis AXC; the rotational movement can be realized by, for example, a piezo element or a motor. In the observation mode of FIG. 13, the movable mask 30 is rotated and tilted toward the right-eye side by a predetermined angle, the pupil-center optical path (aperture 23) of the fixed mask 20 is open, and the left and right pupil optical paths (apertures 21, 22) are shielded. In the stereo measurement mode of FIG. 14, the movable mask 30 is rotated and tilted toward the left-eye side by a predetermined angle, the pupil-center optical path (aperture 23) of the fixed mask 20 is shielded, and the left and right pupil optical paths (apertures 21, 22) are open. By exposing the aperture 21 having the spectral filter FL, the left pupil passes only the short-wavelength component, and by exposing the aperture 22 having the spectral filter FR, the right pupil passes only the long-wavelength component.

 Although the above describes the case where the two states are created by rotating the movable mask 30 about its shaft by a predetermined angle, the configuration is not limited to this; for example, the two states may be created by moving the movable mask 30 with a sliding operation. The rotating or sliding operation can be realized by, for example, a magnet mechanism or a piezoelectric mechanism, and an appropriate mechanism may be selected in consideration of speed and durability.

 According to the embodiment described above, the imaging apparatus (endoscope apparatus) includes the image sensor 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30. The imaging optical system 10 forms an image of the subject 5 on the image sensor 40. The fixed mask 20 has first to third openings (apertures 21, 22, 23) that divide the pupil of the imaging optical system 10, a first filter FL that passes the first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5), and a second filter FR that passes the second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5). The movable mask 30 has a light shielding portion 34 and fourth to sixth openings (apertures 31, 32, 33) provided in the light shielding portion 34 in correspondence with the first to third openings (apertures 21, 22, 23), and is movable with respect to the imaging optical system 10. The first filter FL is provided in the first opening (aperture 21), the second filter FR is provided in the second opening (aperture 22), and the third opening (aperture 23) is provided on the optical axis AXC of the imaging optical system 10.

 With this configuration, switching between the observation mode and the stereo measurement mode as described with reference to FIGS. 11 to 14 becomes possible. Since the parallax images of the color phase difference method are acquired simultaneously (not in time division), accurate stereo measurement is possible. Since there is only one movable part, the movable mask 30, fast switching, a simple drive mechanism, and suppression of failures and errors in mode switching can be realized. The movable mask 30 has a simple structure in which only openings (apertures 31, 32, 33) are provided in the light shielding portion 34, so troubles such as a filter coming loose due to switching vibration can be suppressed. Furthermore, since the left and right pupils are clearly separated by the openings (apertures 21, 22) of the fixed mask 20, the baseline length in stereo measurement (d in FIGS. 11 and 17) can easily be made large, enabling accurate distance measurement.

 Consider, for example, a comparative example in which a monocular observation image is captured using one of the left and right pupils; the image would then be captured through a pupil off the optical axis. In this embodiment, by contrast, the fixed mask 20 is provided with three openings (apertures 21, 22, 23), one of which lies on the optical axis AXC, so the observation image is a pupil-center image. This reduces vignetting of the light rays, allows an observation image with a wide viewing angle to be acquired, and yields high-quality (for example, low-distortion) imaging.

 Considering the correspondence between positions on the subject 5 and pixel positions, the center of the phase difference in stereo measurement (the position s/2, with s as in FIG. 12) coincides with the ray passing through the pupil center. That is, in this embodiment, the same pixel in the observation image and in the distance map corresponds to the same position on the subject 5. In the comparative example above, on the other hand, the observation image has a parallax to the left and is not a pupil-center view, so different pixels of the observation image and the distance map correspond to the same position on the subject 5. This embodiment is therefore more advantageous when, for example, the observation image and the three-dimensional information are displayed superimposed.

 In this embodiment, the first opening (aperture 21) corresponds to the left pupil, the second opening (aperture 22) corresponds to the right pupil, and the third opening (aperture 23) corresponds to the pupil center. The first opening may instead correspond to the right pupil and the second opening to the left pupil. Although the pupils in stereo measurement are described as separated into left and right for convenience, the direction of pupil separation is not limited to left and right. The openings are called aperture holes in this embodiment, but an opening does not necessarily have to function as a stop (the function of limiting the cross-sectional area of the light flux passing through the pupil). For example, the apertures 23 and 33 overlap in the observation mode; if the aperture 23 is the smaller of the two, the aperture 23 functions as the stop, and if the aperture 33 is the smaller, the aperture 33 functions as the stop.

 Here, the pupils separate (or define) the imaging optical paths of the imaging optical system 10. An optical path is the path along which light that forms an image on the image sensor 40 travels from the object side of the optical system until it reaches the image sensor 40. That is, the optical paths that pass through the imaging optical system 10 and the apertures 21 and 22 of the fixed mask 20 (and, in the stereo measurement mode, the apertures 31 and 32 of the movable mask 30) are the first and second optical paths, and the optical path that passes through the imaging optical system 10 and the aperture 23 of the fixed mask 20 (and, in the observation mode, the aperture 33 of the movable mask 30) is the third optical path.

 Here, a mask is a member or component that shields light incident on it while allowing part of the light to pass. In the fixed mask 20 and the movable mask 30 of this embodiment, the light shielding portions 24 and 34 shield light, while the apertures 21, 22, 23, 31, 32, and 33 pass light (the entire band or a partial band).

 In this embodiment, the imaging apparatus includes a movable mask control unit 340 (FIG. 18) that controls the movable mask 30. In the non-stereo mode (observation mode), the movable mask control unit 340 sets the movable mask 30 to a first state (first position) in which, viewed in the direction of the optical axis AXC, the light shielding portion 34 overlaps the first and second openings (apertures 21, 22) and the sixth opening (aperture 33) overlaps the third opening (aperture 23). In the stereo mode (stereo measurement mode), it sets the movable mask 30 to a second state (second position) in which, viewed in the direction of the optical axis AXC, the fourth and fifth openings (apertures 31, 32) overlap the first and second openings (apertures 21, 22) and the light shielding portion 34 overlaps the third opening (aperture 23).

 By controlling the driving of the movable mask 30 in this way, switching control between the observation mode of FIGS. 11 and 13 and the stereo measurement mode of FIGS. 12 and 14 can be realized. That is, when the movable mask 30 is set to the first state, the first and second openings are shielded by the light shielding portion 34, so imaging is performed only through the third opening; since no spectral filter is inserted in the third opening, an image for normal observation (a white-light image) can be captured. When the movable mask 30 is set to the second state, the first filter FL is fixed in the first opening and the second filter FR is fixed in the second opening, so the parallax images of the color phase difference method can be captured.

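 The two-state control and the error check discussed earlier can be sketched as follows; the actuator and position-sensor interfaces here are hypothetical and only illustrate the logic of commanding one of the two mask positions and verifying the result.

from enum import Enum

class MaskState(Enum):
    FIRST = "observation"           # central aperture open, left/right apertures shielded
    SECOND = "stereo_measurement"   # left/right apertures open, central aperture shielded

def switch_mode(actuator, position_sensor, target: MaskState, retries: int = 1) -> bool:
    # 'actuator' moves the movable mask 30; 'position_sensor' reports which of the
    # two positions it is actually in (both are hypothetical interfaces).
    for _ in range(retries + 1):
        actuator.move_to(target)
        if position_sensor.current_state() == target:
            return True   # mask verified in the commanded state
    return False          # error: mask did not reach the commanded state
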
 8. Second Detailed Configuration Example of the Fixed Mask and the Movable Mask
 FIGS. 15 and 16 show a second detailed configuration example of the fixed mask 20 and the movable mask 30. Each figure shows a cross-sectional view of the imaging optical system 10, the fixed mask 20, and the movable mask 30, and a view of the fixed mask 20 and the movable mask 30 seen in the optical axis direction (a rear view seen from the image side).

 The movable mask 30 includes a light shielding portion 34 and apertures 31 and 32 provided in the light shielding portion 34. The apertures 31 and 32 are open (through holes) and are arranged on the same circle centered on the rotation shaft 35. The aperture 31 has a shape elongated in the circumferential direction of that circle so that it overlaps the aperture 23 of the fixed mask 20 in the observation mode and overlaps the aperture 21 of the fixed mask 20 in the stereo measurement mode.

 The fixed mask 20 includes a light shielding portion 24 and three apertures 21, 22, and 23 provided in the light shielding portion 24. The apertures 21 and 22 are provided with the spectral filters FL and FR. The aperture 23 may be open (a through hole) or may be provided with a spectral filter FC that passes the bands Pv1, Pv2, and Pir of FIG. 3. The apertures 21, 22, and 23 are arranged on the same circle centered on the rotation shaft 35.

 In the observation mode, the pupil-center aperture 23 of the fixed mask 20 is opened by the aperture 31 of the movable mask 30, the left and right pupil apertures 21 and 22 of the fixed mask 20 are shielded by the light shielding portion 34 of the movable mask 30, and a monocular white-light image is captured. In the stereo measurement mode, the left and right pupil apertures 21 and 22 of the fixed mask 20 are opened by the apertures 31 and 32 of the movable mask 30, the pupil-center aperture 23 of the fixed mask 20 is shielded by the light shielding portion 34 of the movable mask 30, and the parallax images (a red image and a blue image) of the color phase difference method are captured.

 以上の実施形態によれば、撮像装置(内視鏡装置)は、撮像素子40と結像光学系10と固定マスク20と可動マスク30とを含む。結像光学系10は、撮像素子40に被写体5を結像させる。固定マスク20は、結像光学系10の瞳を分割する第1~第3の開口(絞り孔21、22、23)と、第1の波長帯域(図3のPb又は図5のPb1)を通過させる第1のフィルタFLと、第2の波長帯域(図3のPr又は図5のPr2)を通過させる第2のフィルタFRとを有する。可動マスク30は、遮光部34と、第1、第3の開口(絞り孔21、23)に対応して遮光部34に設けられた第4の開口(絞り孔31)と、第2の開口(絞り孔22)に対応して遮光部34に設けられた第5の開口(絞り孔32)とを有し、結像光学系10に対して可動である。そして、第1のフィルタFLは、第1の開口(絞り孔21)に設けられる。第2のフィルタFRは、第2の開口(絞り孔22)に設けられる。第3の開口(絞り孔23)は、結像光学系10の光軸AXC上に設けられる。 According to the above embodiment, the imaging device (endoscope device) includes the imaging element 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30. The imaging optical system 10 forms an image of the subject 5 on the image sensor 40. The fixed mask 20 includes first to third apertures (diaphragm holes 21, 22, and 23) that divide the pupil of the imaging optical system 10, and a first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5). A first filter FL that passes therethrough and a second filter FR that passes the second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5) are included. The movable mask 30 includes a light shielding part 34, a fourth opening (aperture hole 31) provided in the light shielding part 34 corresponding to the first and third openings (throttle holes 21 and 23), and a second opening. A fifth aperture (diaphragm hole 32) provided in the light shielding portion 34 corresponding to the (diaphragm hole 22) is provided, and is movable with respect to the imaging optical system 10. The first filter FL is provided in the first opening (throttle hole 21). The second filter FR is provided in the second opening (throttle hole 22). The third opening (aperture hole 23) is provided on the optical axis AXC of the imaging optical system 10.

Specifically, the imaging device includes a movable mask control unit 340 that controls the movable mask 30. In the non-stereo mode (observation mode), the movable mask control unit 340 sets the movable mask 30 to a first state in which, viewed along the optical axis AXC, the light-shielding portion 34 overlaps the first and second apertures (aperture holes 21 and 22) and the fourth aperture (aperture hole 31) overlaps the third aperture (aperture hole 23). In the stereo mode (stereo measurement mode), the movable mask control unit 340 sets the movable mask 30 to a second state in which, viewed along the optical axis AXC, the fourth and fifth apertures (aperture holes 31 and 32) overlap the first and second apertures (aperture holes 21 and 22) and the light-shielding portion 34 overlaps the third aperture (aperture hole 23).

This configuration also realizes switching between the observation mode and the stereo measurement mode, simultaneous acquisition of parallax images in the stereo measurement mode, fast mode switching, a simplified drive mechanism for the movable mask 30, suppression of failures and errors during mode switching, and a sufficient baseline length for stereo measurement.

9. Principle of Stereo 3D Measurement
The principle of stereo measurement in the stereo measurement mode will be described. As shown in FIG. 17, the optical paths for the left eye and the right eye are configured independently, and the reflected image of the subject 5 is formed on the imaging sensor surface (light-receiving surface) through these optical paths. A coordinate system X, Y, Z of the three-dimensional space is defined as follows: the X axis and a Y axis orthogonal to it are set along the imaging sensor surface, and the Z axis is set orthogonal to the imaging sensor surface, parallel to the optical axis AXC, and pointing toward the subject. The Z axis intersects the X axis and the Y axis at the origin. For convenience, the Y axis is omitted in the following.

Let b be the distance between the imaging lens 10 and the imaging sensor surface, and let z be the distance from the imaging lens 10 to an arbitrary point Q(x, z) on the subject 5. The pupil center lines IC1 and IC2 are at the same distance from the Z axis, namely d/2 each, so the baseline length for stereo measurement is d. Let XL be the X coordinate of the point at which the arbitrary point Q(x, z) on the subject 5 is imaged on the imaging sensor surface by the imaging lens 10 through the left pupil, and let XR be the X coordinate of the corresponding point imaged through the right pupil. Using the similarity of the partial right triangles formed inside the triangle bounded by the arbitrary point Q(x, z) and the coordinates XL and XR, the following expression (4) is obtained.

[Equation (4) — reproduced as an image in the original publication]

Here, the following expressions (5) and (6) hold.

[Equation (5) — image in the original publication]
[Equation (6) — image in the original publication]

These allow the absolute value in expression (4) to be removed, giving expression (7).

[Equation (7) — image in the original publication]

Solving expression (7) for x yields expression (8).

[Equation (8) — image in the original publication]

Substituting x from expression (8) into expression (7) gives expression (9), from which z can be obtained.

[Equation (9) — image in the original publication]

Here, d and b are known design values, and the unknowns XL and XR are determined as follows: taking a position XL on the imaging sensor surface as the reference (i.e., regarding a pixel position of the left image as XL), the corresponding position XR is detected by matching processing (correlation calculation). By calculating the distance z for each position XL, the shape of the subject can be measured. If the matching is poor, the distance z may not be obtainable at that position; in that case it may be estimated, for example, by interpolation from the distances z of the surrounding pixels.
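As a rough illustration of how this derivation is used in practice, the sketch below converts one matched coordinate pair (XL, XR) into a depth value. It is only a minimal sketch, not the claimed implementation: the function and parameter names are invented for illustration, and the standard triangulation form z = d·b/(XL − XR) — the form this kind of similar-triangle derivation yields — is assumed here rather than taken verbatim from expression (9).

```python
from typing import Optional

# Minimal sketch of depth-from-disparity for the geometry of FIG. 17,
# assuming the standard triangulation result z = d * b / (XL - XR).
# Names (baseline_d, lens_to_sensor_b, ...) are illustrative, not from the patent.

def depth_from_match(x_left: float, x_right: float,
                     baseline_d: float, lens_to_sensor_b: float) -> Optional[float]:
    """Return the distance z for one matched pair (XL, XR), or None if invalid."""
    disparity = x_left - x_right
    if disparity <= 0.0:
        # No usable parallax (e.g., a failed match); the caller may interpolate later.
        return None
    return baseline_d * lens_to_sensor_b / disparity


# Example: with baseline d = 2.0 mm, lens-to-sensor distance b = 5.0 mm, and
# matched coordinates XL = 0.30 mm, XR = 0.25 mm, the depth is z = 200.0 mm.
z = depth_from_match(0.30, 0.25, baseline_d=2.0, lens_to_sensor_b=5.0)
```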

10. Endoscopic Device
FIG. 18 shows a configuration example of the endoscopic device (imaging device in a broad sense) of the present embodiment. The endoscopic device includes a scope unit 100 (imaging unit) and a main body unit 200 (control device). The scope unit 100 includes the imaging optical system 10, the fixed mask 20, the movable mask 30, the imaging element 40, a drive unit 50, and an illumination unit 60. The main body unit 200 includes a processing unit 210, a monitor display unit 220, and an imaging processing unit 230. The processing unit 210 includes a light source drive control unit 305, an image selection unit 310 (image frame selection unit), a color image generation unit 320 (image output unit), a phase difference detection unit 330, a movable mask control unit 340 (movable mask drive control unit), a movable mask position detection unit 350, a distance information calculation unit 360, and a three-dimensional information generation unit 370.

Note that the main body unit 200 may further include components not shown, such as an operation unit for operating the main body unit 200 and an interface unit for connecting to external devices. The scope unit 100 may likewise include components not shown, such as an operation unit for operating the scope unit 100 and a treatment tool.

The endoscopic device is assumed to be a so-called videoscope (an endoscope with a built-in imaging element) for industrial or medical use. The present invention is applicable both to a flexible scope in which the scope unit 100 is bendable and to a rigid scope in which the scope unit 100 is stick-shaped. In the case of an industrial flexible scope, for example, the main body unit 200 and the imaging unit 110 are configured as portable equipment and are used for manufacturing and maintenance inspection of industrial products, maintenance inspection of buildings and piping, and the like.

The drive unit 50 drives the movable mask 30 based on a control signal from the movable mask control unit 340 and switches between the first state (observation mode) and the second state (stereo measurement mode). For example, the drive unit 50 is configured as an actuator using a piezoelectric element or a magnet mechanism.

The imaging processing unit 230 performs imaging processing on the signal from the imaging element 40 and outputs a captured image (for example, a Bayer image). For example, it performs correlated double sampling, gain control, A/D conversion, gamma correction, color correction, noise reduction, and the like. The imaging processing unit 230 may be configured as a discrete IC such as an ASIC, or may be built into the imaging element 40 (sensor chip) or the processing unit 210.

The monitor display unit 220 displays images captured by the scope unit 100, three-dimensional shape information of the subject 5, and the like. For example, the monitor display unit 220 is configured as a liquid crystal display, an EL (electro-luminescence) display, or the like.

The operation of the endoscopic device will now be described. The illumination unit 60 irradiates the subject 5 with the combined light of the non-pattern light and the pattern light described above. The light source drive control unit 305 optimally controls the respective light amounts of the non-pattern light and the pattern light based on the signal from the imaging processing unit 230 (so-called dimming control). For example, the brightness of the captured image is measured, and the light amounts are controlled so that the brightness falls within a predetermined range.
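One way to picture this dimming control is as a feedback loop that nudges the two light amounts until the frame brightness sits inside a target window. The following is only a hedged sketch: the target window, gain, and proportional update rule are assumptions made for illustration, not the control law actually used by the light source drive control unit 305.

```python
import numpy as np

TARGET_LOW, TARGET_HIGH = 100.0, 140.0   # acceptable mean brightness on an 8-bit scale (assumed)
GAIN = 0.2                               # damping factor for the proportional update (assumed)


def update_light_amounts(frame: np.ndarray,
                         nonpattern_level: float,
                         pattern_level: float):
    """Return updated (non-pattern, pattern) light amounts for the next frame."""
    mean_brightness = float(frame.mean())
    if TARGET_LOW <= mean_brightness <= TARGET_HIGH:
        return nonpattern_level, pattern_level          # already inside the window
    target = 0.5 * (TARGET_LOW + TARGET_HIGH)
    scale = 1.0 + GAIN * (target - mean_brightness) / target
    # Both light sources are scaled together so that their ratio is preserved.
    return nonpattern_level * scale, pattern_level * scale
```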

The movable mask control unit 340 controls the drive unit 50 to switch the position of the movable mask 30. When the movable mask control unit 340 sets the movable mask 30 to the observation mode, the light reflected from the subject 5 is imaged on the imaging element 40 through the pupil-center optical path. The imaging processing unit 230 reads out the pixel values of the image formed on the imaging element 40, performs A/D conversion and the like, and outputs the image data to the image selection unit 310.

The image selection unit 310 detects, based on the control signal from the movable mask control unit 340, that the movable mask 30 is in the observation mode, selects {Vr, Vg, Vb} from the captured image, and outputs them to the color image generation unit 320. The color image generation unit 320 performs demosaicing (processing for generating an RGB image from the Bayer image) and various other image processing, and outputs a three-channel RGB primary-color image to the monitor display unit 220. The monitor display unit 220 displays the color image.

When the movable mask control unit 340 sets the movable mask 30 to the stereo measurement mode, the light reflected from the subject 5 is simultaneously imaged on the imaging element 40 through the left pupil optical path and the right pupil optical path. The imaging processing unit 230 reads out the pixel values of the image formed on the imaging element 40, performs A/D conversion and the like, and outputs the image data to the image selection unit 310.

The image selection unit 310 detects, based on the control signal from the movable mask control unit 340, that the movable mask 30 is in the stereo measurement mode, selects {Mr, Mb} from the captured image, and outputs them to the phase difference detection unit 330. The phase difference detection unit 330 performs matching processing on the two separated images Mr and Mb and detects a phase difference (phase shift) for each pixel. The phase difference detection unit 330 also judges whether the phase difference detection is reliable, and outputs an error flag for each pixel judged to be unreliable. Various matching evaluation methods for obtaining the shift amount (phase difference) between two similar waveforms have long been proposed, such as normalized cross-correlation typified by ZNCC (zero-mean normalized cross-correlation) and SAD (sum of absolute differences), which accumulates the absolute differences between the two signals, and any of these can be used as appropriate.
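To make the matching step concrete, here is a small SAD-based sketch that, for one pixel of the Mr image, searches the horizontally shifted window of the Mb image that minimizes the sum of absolute differences and flags the result as unreliable when the reference window lacks texture. The window size, search range, and reliability test are illustrative assumptions; ZNCC or another correlation measure could be substituted, and border pixels are not treated specially in this sketch.

```python
import numpy as np


def phase_difference(mr: np.ndarray, mb: np.ndarray, y: int, x: int,
                     half_win: int = 3, max_shift: int = 16):
    """Return (shift, error_flag) for pixel (y, x) of the Mr image."""
    ref = mr[y - half_win:y + half_win + 1,
             x - half_win:x + half_win + 1].astype(np.float32)
    best_shift, best_sad = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        cand = mb[y - half_win:y + half_win + 1,
                  x + s - half_win:x + s + half_win + 1].astype(np.float32)
        if cand.shape != ref.shape:          # the shifted window left the image
            continue
        sad = float(np.abs(ref - cand).sum())
        if sad < best_sad:
            best_sad, best_shift = sad, s
    # Flag flat windows: with too little texture the SAD minimum is not meaningful.
    error_flag = float(ref.std()) < 2.0      # contrast threshold is an assumption
    return best_shift, error_flag
```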

Note that the phase shift (phase difference) can also be detected using Vr and Mr, which likewise form a parallax pair, although this pair is captured in time division and is therefore affected by subject motion and camera shake. When the reflection from the subject 5 contains little blue component and much red component, a subject 5 that is difficult to detect with Mr and Mb can still be measured with Vr and Mr, both of which carry the red component.

The phase difference detection unit 330 outputs the detected phase difference information and the error flags to the distance information calculation unit 360. The distance information calculation unit 360 calculates the distance information of the subject 5 (for example, the distance z in FIG. 17) for each pixel and outputs the distance information to the three-dimensional information generation unit 370. A pixel with an error flag may, for example, be regarded as belonging to a flat part of the subject 5 (a region with few edge components) and be interpolated from the distance information of the surrounding pixels. The three-dimensional information generation unit 370 generates three-dimensional information from the distance information (or from the distance information and the RGB image from the color image generation unit 320). Various kinds of three-dimensional information can be assumed, such as a Z-value map (distance map), polygons, or a pseudo three-dimensional display image (for example, with shape enhancement by shading). The three-dimensional information generation unit 370 generates, as needed, the three-dimensional image, the three-dimensional data, or a display image in which these are superimposed on the observation image, and outputs the result to the monitor display unit 220. The monitor display unit 220 displays the three-dimensional information.
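The interpolation of error-flagged pixels mentioned above can be pictured as filling each invalid depth value from valid neighbours. The sketch below fills flagged pixels with the mean of valid depths in a small neighbourhood; the neighbourhood radius and the iterative fill strategy are assumptions made for illustration, not the method fixed by the embodiment.

```python
import numpy as np


def fill_flagged_depth(depth: np.ndarray, error_flag: np.ndarray,
                       radius: int = 2, max_iter: int = 10) -> np.ndarray:
    """Fill depth values at flagged pixels from the mean of nearby valid pixels."""
    z = depth.astype(np.float32)
    valid = ~error_flag.astype(bool)
    for _ in range(max_iter):
        missing = np.argwhere(~valid)
        if missing.size == 0:
            break
        for y, x in missing:
            y0, y1 = max(0, y - radius), y + radius + 1
            x0, x1 = max(0, x - radius), x + radius + 1
            patch_valid = valid[y0:y1, x0:x1]
            if patch_valid.any():
                z[y, x] = float(z[y0:y1, x0:x1][patch_valid].mean())
                valid[y, x] = True
    return z
```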

The movable mask position detection unit 350 uses the images {Mr, Mb} obtained in the stereo measurement mode to detect whether the movable mask 30 is in the observation-mode position or the stereo-measurement-mode position. If it judges that the state of the movable mask 30 does not match the mode, it outputs a position error flag to the movable mask control unit 340. On receiving the position error flag, the movable mask control unit 340 corrects the movable mask 30 to the correct state (the state corresponding to the image selection). For example, if the images {Mr, Mb} are judged to contain no color shift even though the movable mask control unit 340 is outputting the control signal for the stereo measurement mode, the movable mask 30 is actually in the observation-mode position; in this case, a correction is made so that the position of the movable mask 30 agrees with the control signal. If the correct state is not reached even after the correction operation, it is judged that some failure has occurred, and the entire function is stopped.

Since the movable mask 30 is a mechanical mechanism, a malfunction may occur in the switching operation. According to the present embodiment, whether the switched position corresponds to the observation mode or the measurement mode can be detected, so such switching malfunctions can be handled.

Whether the movable mask 30 is in the observation-mode position or the stereo-measurement-mode position is detected and judged, for example, as follows. After the levels (for example, the average levels) of the images Mr and Mb in a judgment area are matched, the position error is judged from the sum of absolute differences between the images Mr and Mb (first method), from the correlation coefficient between the images Mr and Mb (second method), or the like.

In the first method, the absolute value of the difference between the pixel values is obtained for each pixel and summed over all pixels or over a partial pixel group. If the result exceeds a predetermined threshold, the image is judged to be a stereo-measurement-mode image; if the result is equal to or less than the threshold, it is judged to be an observation-mode image. This exploits the fact that in the stereo measurement mode the images Mr and Mb are essentially color-shifted with respect to each other, so a certain amount of difference is obtained.

In the second method, the correlation coefficient between the images Mr and Mb is calculated over a predetermined range. If the result is equal to or less than a predetermined threshold, the image is judged to be a stereo-measurement-mode image; if the result exceeds the threshold, it is judged to be an observation-mode image. This exploits the fact that in the stereo measurement mode the images Mr and Mb are essentially color-shifted and the correlation coefficient is therefore small, whereas in the observation mode the images Mr and Mb almost coincide and the correlation coefficient is therefore large.
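Both judgment methods come down to a few lines of arithmetic on the judgment area once the mean levels of Mr and Mb have been matched. The sketch below evaluates both criteria side by side; the threshold values are assumptions and would have to be tuned for the actual sensor and judgment area, and the function name is invented for illustration.

```python
import numpy as np


def mask_position_checks(mr_area: np.ndarray, mb_area: np.ndarray,
                         sad_threshold: float = 5.0e4,
                         corr_threshold: float = 0.9):
    """Return (first_method_says_stereo, second_method_says_stereo)."""
    a = mr_area.astype(np.float32)
    b = mb_area.astype(np.float32)
    b = b * (a.mean() / max(float(b.mean()), 1e-6))   # match the average levels

    # First method: a large accumulated absolute difference indicates a
    # color-shifted (stereo-measurement-mode) pair.
    sad = float(np.abs(a - b).sum())

    # Second method: a low correlation coefficient indicates a color-shifted pair.
    corr = float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

    return sad > sad_threshold, corr <= corr_threshold
```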

The endoscopic device, imaging device, and the like of the present embodiment may include a processor and a memory. The processor here may be, for example, a CPU (central processing unit). However, the processor is not limited to a CPU, and various processors such as a GPU (graphics processing unit) or a DSP (digital signal processor) can be used. The processor may also be a hardware circuit such as an ASIC. The memory stores instructions readable by a computer, and the units of the endoscopic device, imaging device, and the like according to the present embodiment (for example, the units of the processing unit 210) are realized by the processor executing those instructions. The memory here may be a semiconductor memory such as an SRAM or a DRAM, or may be a register, a hard disk, or the like. The instructions may be the instructions of an instruction set constituting a program, or instructions that direct the operation of the hardware circuit of the processor.

11. Mode Switching Sequence
FIG. 19 shows a sequence (operation timing chart) for switching between the observation mode and the stereo measurement mode during moving image shooting.

In the stereo measurement mode described above, highly accurate simultaneous stereo measurement can be realized even for a moving subject, but the resulting image is color-shifted and therefore cannot be used as a high-quality observation image. This problem is solved by switching between the observation mode and the stereo measurement mode at high speed, so that stereo measurement can be performed while the observation image is displayed in a nearly real-time state.

As shown in FIG. 19, the switching of the state of the movable mask 30, the imaging timing, and the selection of the captured image are interlocked. As indicated by A1 and A2, the observation-mode mask state and the stereo-measurement-mode mask state are repeated alternately. As indicated by A3 and A4, imaging is performed once in each mask state. As indicated by A5, an image exposed and captured by the imaging element 40 while the mask is in the observation-mode state is selected as an observation image. As indicated by A6, an image exposed and captured by the imaging element 40 while the mask is in the stereo-measurement-mode state is selected as a measurement image.

By alternately repeating the observation mode and the stereo measurement mode in this way, observation images and measurement images can be obtained continuously in a nearly real-time state, so both observation and measurement can be realized even when the subject 5 is moving. If the observation-mode image is displayed and the measured information is superimposed on it as needed, visual inspection and quantitative inspection can be provided to the user at the same time, which makes it possible to provide useful information.
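Seen from the processing side, the timing chart of FIG. 19 amounts to routing every other frame to the observation path and the remaining frames to the measurement path. The short sketch below shows that routing; the camera and pipeline objects, their method names, and the strict frame-by-frame alternation are placeholders assumed for illustration, not interfaces defined by the embodiment.

```python
def run_alternating_capture(camera, observation_pipeline, measurement_pipeline,
                            num_frames: int) -> None:
    """Alternate mask states frame by frame and route each frame accordingly."""
    for frame_index in range(num_frames):
        stereo = (frame_index % 2 == 1)            # every other frame is a measurement frame
        camera.set_movable_mask(stereo_mode=stereo)
        frame = camera.capture()                   # exposure occurs in the selected mask state
        if stereo:
            measurement_pipeline.process(frame)    # {Mr, Mb} -> phase difference -> distance z
        else:
            observation_pipeline.process(frame)    # {Vr, Vg, Vb} -> RGB observation image
```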

Although the embodiments to which the present invention is applied and their modifications have been described above, the present invention is not limited to these embodiments and modifications as they are; the constituent elements can be modified and embodied at the implementation stage without departing from the gist of the invention. Various inventions can also be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments and modifications. For example, some constituent elements may be deleted from all the constituent elements described in each embodiment or modification, and constituent elements described in different embodiments and modifications may be combined as appropriate. In this way, various modifications and applications are possible without departing from the spirit of the invention. In addition, a term that appears at least once in the specification or drawings together with a different term having a broader or equivalent meaning can be replaced by that different term anywhere in the specification or drawings.

5 subject, 10 imaging optical system, 20 fixed mask,
21, 22, 23 apertures (aperture holes), 24 light-shielding portion, 30 movable mask,
31, 32, 33 apertures (aperture holes), 34 light-shielding portion, 35 rotation shaft,
40 imaging element, 50 drive unit, 60 illumination unit, 100 scope unit,
110 imaging unit, 200 main body unit, 210 processing unit, 220 monitor display unit,
230 imaging processing unit, 305 light source drive control unit, 310 image selection unit,
320 color image generation unit, 330 phase difference detection unit,
340 movable mask control unit, 350 movable mask position detection unit,
360 distance information calculation unit, 370 three-dimensional information generation unit, 401 white light source,
402 light guide member, 403 illumination lens, 404 red laser light source,
405 blue laser light source, 406, 407 dichroic prisms,
408 light guide member, 409 mask pattern, 410 projection lens,
451 white light source, 452 polarizing element, 453 blue laser light source,
454 red laser light source, 455, 456 dichroic prisms,
457 mask pattern, 458 polarizing element, 459 prism,
460 light guide member, 461 projection lens,
FL first filter, FR second filter, Mb, Mr images,
Pb, Pb1 first wavelength band, Pr, Pr2 second wavelength band,
s(XL) phase difference, Vb, Vg, Vr images

Claims (14)

1. An imaging device comprising:
an imaging unit that captures a captured image including an image of a first color, an image of a second color on a longer-wavelength side than the first color, and an image of a third color on a longer-wavelength side than the second color, and that is capable of capturing the image of the first color and the image of the third color as a stereo image; and
an illumination unit that irradiates a subject with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and is not included in the wavelength band of the second color and in a second wavelength band that is included in the wavelength band of the third color and is not included in the wavelength band of the second color.

2. The imaging device according to claim 1, wherein the imaging unit switches between a stereo mode in which the stereo image is captured and a non-stereo mode in which the captured image is captured with a single eye.

3. The imaging device according to claim 2, wherein the single eye in the non-stereo mode passes a wavelength band that excludes the first wavelength band and the second wavelength band from the wavelength bands of the first color, the second color, and the third color.

4. The imaging device according to claim 3, further comprising:
a phase difference detection unit that detects a phase difference between the image of the first color and the image of the third color captured in the stereo mode; and
an image output unit that outputs an image for observation based on the captured image captured in the non-stereo mode.
5. The imaging device according to claim 2, wherein the single eye in the non-stereo mode passes a wavelength band including the wavelength bands of the first color, the second color, and the third color.

6. The imaging device according to claim 5, further comprising:
a phase difference detection unit that detects a phase difference between the image of the first color and the image of the third color captured in the stereo mode; and
an image output unit that outputs an image for observation based on the captured image captured in the non-stereo mode,
wherein the image output unit corrects, based on the image of the second color, changes in the pixel values of the image of the first color and the image of the third color caused by the given light amount distribution.

7. The imaging device according to any one of claims 1 to 6, wherein the pattern illumination has a flat light amount distribution in the wavelength band of the first color excluding the first wavelength band, in the wavelength band of the second color, and in the wavelength band of the third color excluding the second wavelength band.

8. The imaging device according to any one of claims 1 to 7, wherein the first color is blue, the second color is green, and the third color is red.
9. The imaging device according to any one of claims 1 to 8, further comprising:
an imaging element;
an imaging optical system that forms an image of a subject on the imaging element;
a fixed mask having first to third apertures that divide a pupil of the imaging optical system, a first filter that passes the first wavelength band, and a second filter that passes the second wavelength band; and
a movable mask having a light-shielding portion and fourth to sixth apertures provided in the light-shielding portion in correspondence with the first to third apertures, the movable mask being movable with respect to the imaging optical system,
wherein the first filter is provided in the first aperture, the second filter is provided in the second aperture, and the third aperture is provided on an optical axis of the imaging optical system.

10. The imaging device according to claim 9, further comprising a movable mask control unit that controls the movable mask,
wherein, in a non-stereo mode, the movable mask control unit sets the movable mask to a first state in which, viewed along the optical axis, the light-shielding portion overlaps the first and second apertures and the sixth aperture overlaps the third aperture, and
in a stereo mode, the movable mask control unit sets the movable mask to a second state in which, viewed along the optical axis, the fourth and fifth apertures overlap the first and second apertures and the light-shielding portion overlaps the third aperture.
11. The imaging device according to any one of claims 1 to 8, further comprising:
an imaging element;
an imaging optical system that forms an image of a subject on the imaging element;
a fixed mask having first to third apertures that divide a pupil of the imaging optical system, a first filter that passes the first wavelength band, and a second filter that passes the second wavelength band; and
a movable mask having a light-shielding portion, a fourth aperture provided in the light-shielding portion in correspondence with the first and third apertures, and a fifth aperture provided in the light-shielding portion in correspondence with the second aperture, the movable mask being movable with respect to the imaging optical system,
wherein the first filter is provided in the first aperture, the second filter is provided in the second aperture, and the third aperture is provided on an optical axis of the imaging optical system.

12. The imaging device according to claim 11, further comprising a movable mask control unit that controls the movable mask,
wherein, in a non-stereo mode, the movable mask control unit sets the movable mask to a first state in which, viewed along the optical axis, the light-shielding portion overlaps the first and second apertures and the fourth aperture overlaps the third aperture, and
in a stereo mode, the movable mask control unit sets the movable mask to a second state in which, viewed along the optical axis, the fourth and fifth apertures overlap the first and second apertures and the light-shielding portion overlaps the third aperture.
13. An endoscopic device comprising the imaging device according to any one of claims 1 to 12.

14. An imaging method comprising:
in a case where, of an image of a first color, an image of a second color on a longer-wavelength side than the first color, and an image of a third color on a longer-wavelength side than the second color, the image of the first color and the image of the third color can be captured as a stereo image,
irradiating a subject with pattern illumination having a given light amount distribution in a first wavelength band that is included in the wavelength band of the first color and is not included in the wavelength band of the second color and in a second wavelength band that is included in the wavelength band of the third color and is not included in the wavelength band of the second color; and
capturing a captured image including the image of the first color, the image of the second color, and the image of the third color.
PCT/JP2015/078871 2015-10-13 2015-10-13 Imaging device, endoscopic device and imaging method Ceased WO2017064746A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/078871 WO2017064746A1 (en) 2015-10-13 2015-10-13 Imaging device, endoscopic device and imaging method


Publications (1)

Publication Number Publication Date
WO2017064746A1 (en)

Family

ID=58517404


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003047028A (en) * 2001-08-01 2003-02-14 Olympus Optical Co Ltd Imaging apparatus and stereogram-photographing method
JP2005223812A (en) * 2004-02-09 2005-08-18 Canon Inc Imaging device
JP2013124985A (en) * 2011-12-15 2013-06-24 Ricoh Co Ltd Compound-eye imaging apparatus and distance measuring device
JP2015502558A * 2011-10-03 2015-01-22 Eastman Kodak Co Stereo projector using spectrally adjacent color bands
JP2015513686A * 2012-01-17 2015-05-14 Eastman Kodak Co Stereoscopic glasses with tilt filter


