
WO2016203760A1 - Image capture device and image capture method - Google Patents

Image capture device and image capture method

Info

Publication number
WO2016203760A1
Authority
WO
WIPO (PCT)
Prior art keywords
infrared light
light
incident
image signal
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/002852
Other languages
English (en)
Japanese (ja)
Inventor
塚田 正人
ウォーター ブロム
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP2017524610A priority Critical patent/JP6720971B2/ja
Publication of WO2016203760A1 publication Critical patent/WO2016203760A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/20Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • H04N23/21Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from near infrared [NIR] radiation
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10FINORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F39/00Integrated devices, or assemblies of multiple devices, comprising at least one element covered by group H10F30/00, e.g. radiation detectors comprising photodiode arrays
    • H10F39/10Integrated devices
    • H10F39/12Image sensors
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10FINORGANIC SEMICONDUCTOR DEVICES SENSITIVE TO INFRARED RADIATION, LIGHT, ELECTROMAGNETIC RADIATION OF SHORTER WAVELENGTH OR CORPUSCULAR RADIATION
    • H10F99/00Subject matter not provided for in other groups of this subclass

Definitions

  • The present invention relates to a video shooting device and a video shooting method.
  • A red (R), green (G), and blue (B) three-color optical filter is usually incorporated in the image sensor.
  • Light incident on the camera is separated by the three-color optical filter and converted into a video signal by the image sensor, generating RGB video data.
  • Because the image sensor used in a video imaging apparatus is a silicon sensor, it is sensitive to light in the near-infrared region in addition to the visible region. For this reason, when near-infrared (Near InfraRed, NIR) light is incident on the image sensor, the output due to NIR is added to the output due to RGB light, and color reproducibility decreases. Therefore, in general digital cameras and digital video cameras, near-infrared light is removed by a near-infrared cut filter to ensure highly accurate color reproducibility. By the common definition, near-infrared light is light with a wavelength of about 0.7 to 2 μm.
  • The simplest method is to provide a mechanism for mechanically moving an IR cut filter that removes infrared (InfraRed, IR) light.
  • The IR cut filter is inserted into the optical system for visible light shooting and removed from the optical system for near-infrared shooting, for example outdoors or in the dark at night. In this way, a visible light image with good color reproducibility and a near-infrared light image can both be acquired with a single image capturing device.
  • Non-Patent Document 1 discloses a method using a four-color optical filter in which an IR transmission filter that removes visible light and transmits IR is added to an RGB three-color optical filter.
  • Dedicated IR pixels are provided to generate IR signals.
  • The IR contribution is subtracted from the R, G, and B sensor outputs to calculate correct R, G, and B signals. This ensures high color reproducibility.
  • The R, G, and B image sensors are also used as near-infrared sensors, and monochrome near-infrared light images are generated.
  • Patent Document 1 discloses a method of measuring the light intensity incident on a semiconductor photosensor in two stages: measurement of visible light intensity and measurement of near-infrared light intensity.
  • The visible light intensity is measured using a three-color optical filter that transmits the R, G, and B colors. These optical filters also transmit NIR.
  • The NIR intensity is measured by an NIR sensor provided in the deep part (the part far from the light receiving surface) of each of the R, G, and B photosensors. This exploits the phenomenon that light of longer wavelength penetrates deeper into the semiconductor before being absorbed. That is, the intensity of NIR light transmitted through the visible light sensing part is measured by a dedicated NIR sensor. With this configuration, both a visible light image and a near-infrared light image can be captured with a single image capturing device.
  • The method of Non-Patent Document 1 provides pixels dedicated to IR. When applied to an image sensor of the same area, the number or area of the RGB pixels is therefore reduced compared with a method measuring only the three RGB colors, and resolution or sensitivity is lowered.
  • The present invention has been made in view of the above problems, and its object is to provide an image photographing apparatus capable of photographing both a visible light image and a near-infrared light image with a simple configuration.
  • The video imaging apparatus of the present invention comprises: incident light video signal generating means for generating an incident light video signal according to the intensities of the visible light component and the near-infrared light component of incident light incident on a light receiving surface; near-infrared light diffracting means, arranged on the incident side of the light receiving surface, for diffracting the near-infrared light component of the incident light; diffraction pattern extracting means for extracting a diffraction pattern generated by the near-infrared light on the light receiving surface; near-infrared light video signal generating means for generating a near-infrared light video signal corresponding to the intensity of the near-infrared light component calculated based on the diffraction pattern; and visible light video signal generating means for generating a visible light video signal corresponding to the intensity of the visible light component calculated based on the incident light video signal and the near-infrared light video signal.
  • The effect of the present invention is that it provides an image photographing apparatus capable of photographing images with both visible light and near-infrared light with a simple configuration.
  • FIG. 1 is a block diagram showing a video photographing apparatus 100 according to the first embodiment.
  • The image capturing apparatus 100 includes a near-infrared light diffraction unit 110, an incident light image signal generation unit 120, a diffraction pattern extraction unit 130, a near-infrared light image signal generation unit 140, and a visible light image signal generation unit 150.
  • Near-infrared light diffracting means 110 diffracts near-infrared light included in incident light 10 incident on video imaging device 100.
  • The near-infrared light diffracted by the near-infrared light diffracting means 110 forms a diffraction pattern on the light receiving surface 120a of the incident light image signal generating means 120.
  • The incident light image signal generating means 120 generates an incident light image signal corresponding to the intensities of visible light and near-infrared light.
  • The diffraction pattern extraction unit 130 extracts the diffraction pattern formed by the near-infrared light on the light receiving surface 120a based on the incident light image signal.
  • The near-infrared light image signal generating means 140 calculates the intensity of near-infrared light incident on the image capturing device 100 based on the extracted diffraction pattern and generates a near-infrared light image signal.
  • The visible light video signal generation means 150 generates a visible light video signal based on the incident light video signal and the near-infrared light video signal. With the above configuration, a near-infrared light image signal and a visible light image signal can both be generated.
  • Thus, a visible light image and a near-infrared light image can be captured using a single image capturing device of simple configuration.
  • FIG. 2 is a block diagram illustrating a video photographing apparatus 100 according to the second embodiment.
  • The image capturing apparatus 100 includes a near-infrared light diffraction unit 110, an incident light image signal generation unit 120, a diffraction pattern extraction unit 130, a near-infrared light image signal generation unit 140, and a visible light image signal generation unit 150.
  • The diffraction pattern extraction unit 130 includes a diffraction model generation unit 131, a luminance gradient analysis unit 132, and a matching model determination unit 133. Details of each element are described later.
  • The near-infrared light image signal generation unit 140 analyzes the diffraction pattern extracted by the diffraction pattern extraction unit 130, calculates the intensity of the incident near-infrared light, and generates a near-infrared light image signal.
  • The visible light video signal generation means 150 generates a visible light signal based on the incident light video signal and the near-infrared light video signal. Specifically, the visible light video signal can be obtained by subtracting the near-infrared light video signal from the incident light video signal at each pixel.
  • The near-infrared light diffracting means 110 has a near-infrared light cut portion that blocks near-infrared light and a near-infrared light transmitting portion that transmits near-infrared light, and thus acts as an aperture (slit) for near-infrared light.
  • The shape of the near-infrared light transmitting portion can be, for example, a circle or a shape close to a circle.
  • The near-infrared light component of the incident light 10 entering the image capturing device 100 is diffracted as it passes through the near-infrared light diffracting means 110, producing a light-and-dark diffraction pattern on the light receiving surface 120a of the incident light image signal generating means 120.
  • When the near-infrared light transmitting portion is circular, the diffraction pattern is a circular pattern composed of bright and dark concentric rings.
  • FIG. 3 is a graph showing an example of the luminance distribution of one circular pattern.
  • Here, (x, y) are orthogonal coordinates on the incident light video signal generation means 120, and the z axis represents the luminance at each coordinate.
  • This circular pattern can be approximated as Fraunhofer diffraction.
  • The bright part at the center of the circular pattern is called an Airy disk.
  • The incident light image signal generation means 120 generates an incident light image signal corresponding to the intensities of the visible light and near-infrared light incident on the light receiving surface 120a.
  • The incident light video signal generation means 120 is, for example, a photosensor array in which photosensors sensitive to visible light and near-infrared light are arranged two-dimensionally as pixels.
  • The diffraction pattern extraction means 130 analyzes the incident light image signal and extracts the diffraction pattern formed by near-infrared light. For this purpose, it has a diffraction model generation unit 131, a luminance gradient analysis unit 132, and a matching model determination unit 133.
  • The diffraction model generation means 131 generates a near-infrared light diffraction model by theoretical calculation, for example the Fraunhofer approximation.
  • The luminance gradient analyzing means 132 analyzes the luminance gradient at each coordinate of the incident light video signal. The analysis calculates the ratio between the luminance gradient of the incident light video signal at a given coordinate and the luminance gradient of the generated diffraction model at the corresponding position. Details are given later.
  • The matching model determination unit 133 determines a diffraction model that matches the circular pattern contained in the incident light image signal, based on the diffraction models and the luminance gradient analysis results. The diffraction pattern is thereby extracted.
  • The near-infrared light image signal generation means 140 estimates the intensity and wavelength of the near-infrared light incident at each position based on the matching diffraction model, and generates a near-infrared light image signal from the estimation results.
  • Next, the process by which the diffraction pattern extraction unit 130 extracts a diffraction pattern and the near-infrared light image signal generation unit 140 generates a near-infrared light image signal is described.
  • The intensity I(x) on the light receiving surface 120a of the incident light video signal generation unit 120 is expressed by the following equation, in which:
  • J1(x) is the Bessel function of the first kind of order 1,
  • C is a correction coefficient, and
  • x is as given in Equation (2), where
  • a is the radius of the portion of the near-infrared light diffracting means 110 that transmits near-infrared light,
  • q is the distance, on the light receiving surface 120a, from the foot of the perpendicular dropped from the center of that portion onto the incident light image signal generation unit 120 to the observation point, and
  • R is the distance from the center of the near-infrared light transmitting portion to the point on the light receiving surface 120a located the distance q away.
  • A diffraction pattern expressed by Equations (1) and (2) with the incident intensity I0 normalized is referred to as a diffraction model.
  • FIG. 5 shows an example of a diffraction model calculated for an assumed wavelength.
  • The diffraction model represents the relationship between the intensity I of the circular pattern generated on the light receiving surface 120a and the distance q from the center of the circular pattern.
  • For the correction coefficient C in Equation (1), two examples are shown: C1 (solid line) and C2 (broken line).
  • The correction coefficient C is adjusted so that the model matches the circular pattern appearing in the actual video signal; a sketch of such a model follows.
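  • As a concrete illustration, the following is a minimal Python sketch of such a diffraction model. Equations (1) and (2) are not reproduced in this text (they appear as images in the original publication), so the standard Fraunhofer circular-aperture (Airy) form consistent with the symbol definitions above is assumed here: I(x) = C·I0·(2·J1(x)/x)² with x = 2πaq/(λR). All function names and numeric values are illustrative assumptions, not taken from the patent.

```python
import numpy as np
from scipy.special import j1  # Bessel function of the first kind, order 1


def diffraction_model(q, wavelength, a, R, C=1.0, I0=1.0):
    """Assumed Fraunhofer (Airy) diffraction model: intensity at distance q
    from the pattern center for a circular opening of radius a observed at
    distance R, normalized to incident intensity I0 and scaled by the
    correction coefficient C (cf. FIG. 5)."""
    q = np.asarray(q, dtype=float)
    x = 2.0 * np.pi * a * q / (wavelength * R)
    # 2*J1(x)/x tends to 1 as x -> 0; guard the central point explicitly.
    with np.errstate(invalid="ignore", divide="ignore"):
        amp = np.where(x == 0.0, 1.0, 2.0 * j1(x) / x)
    return C * I0 * amp ** 2


# Example: model curves for two correction coefficients, as in FIG. 5.
q = np.linspace(0.0, 300e-6, 1000)                               # 0-300 um from center
model_c1 = diffraction_model(q, 850e-9, a=5e-6, R=1e-3, C=1.0)   # solid line
model_c2 = diffraction_model(q, 850e-9, a=5e-6, R=1e-3, C=0.6)   # broken line
```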
  • The incident light video signal generation means 120 is a photosensor array 121 in which photosensors sensitive to visible light and near-infrared light are arranged two-dimensionally as pixels.
  • The near-infrared light diffracting means 110 generates a plurality of circular patterns on the photosensor array 121. The circular patterns are arranged so as not to overlap one another.
  • FIG. 6 schematically shows this state.
  • FIG. 6 illustrates a process of converting an incident light image signal from an orthogonal coordinate system to a polar coordinate system.
  • The intensity I(x, y) can be restated as the luminance received by the incident light video signal generation unit 120.
  • Hereinafter, intensity and luminance are used interchangeably.
  • The luminance gradient toward the center of the circular pattern is calculated for each coordinate (r, θ) in the polar coordinate system.
  • The luminance gradient here is the gradient between adjacent photosensors (pixels), defined at each coordinate where a pixel is located.
  • The luminance gradient is also calculated for the diffraction model.
  • First, a diffraction model pattern expressed by Equations (1) and (2) is created.
  • Then the gradient of intensity (which may be regarded as luminance) is calculated as a function of the distance from the center.
  • In the model, the intensity is determined only by the distance from the center, so the angular direction need not be considered.
  • The incident light image signal includes both a visible light component and a near-infrared light component, so the absolute luminance depends heavily on the visible light component. However, the influence of the visible light component can be suppressed by taking the luminance gradient in the polar coordinate system. By calculating the luminance gradient at all coordinates within the circular pattern and applying statistical processing, the circular pattern generated by the near-infrared light component alone can be extracted, as sketched below.
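  • The following is a hedged Python sketch of this polar re-sampling and radial luminance gradient computation. The sampling resolution, nearest-neighbour interpolation, and function names are assumptions for illustration, not details from the patent.

```python
import numpy as np


def radial_gradient(image, cx, cy, n_r=64, n_theta=90):
    """Re-sample the region around the circular-pattern center (cx, cy)
    into polar coordinates (r, theta) and return the luminance gradient
    along the radial direction, i.e. toward/away from the center.

    image  : 2-D array of incident-light luminance I(x, y)
    cx, cy : center of one circular pattern, in pixel coordinates
    """
    r = np.arange(n_r, dtype=float)
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # Nearest-neighbour sampling keeps the sketch short; a smoother
    # interpolation could be substituted without changing the idea.
    xs = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, image.shape[1] - 1)
    ys = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, image.shape[0] - 1)
    polar = image[ys, xs]                 # luminance I(r, theta)
    return np.gradient(polar, axis=0)     # dI/dr at each (r, theta)
```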
  • The luminance gradient of the incident light image signal in the polar coordinate system is calculated in all directions at predetermined angular steps, and a diffraction model that matches the luminance gradients is determined. The intensity and wavelength of the near-infrared light are then estimated from that diffraction model. The method is described below.
  • First, the ratio b between the luminance gradient at each coordinate (r, θ) of the incident light video signal and the luminance gradient of the diffraction model at the same distance from the center is calculated.
  • The luminance gradient ratio is calculated in all directions at predetermined intervals, for all pixels in the region.
  • From these ratios, the intensity of the incident near-infrared light component can be estimated.
  • By using the luminance gradient, the influence of components other than the near-infrared component in the target region, that is, luminance discontinuities in the video signal due to visible light (such as edges), can be suppressed.
  • The above calculation is performed on a region set to contain one circular pattern. The same calculation is performed for all regions of the incident light video signal, and the intensity of the near-infrared component in each region is obtained.
  • Next, diffraction models are generated from Equations (1) and (2) for wavelengths covering the near-infrared region.
  • The wavelengths may be taken discretely, for example at 10 nm intervals starting from 700 nm.
  • A diffraction model is thus obtained for each predetermined wavelength.
  • For a region of the incident light image signal set to contain one circular pattern, the luminance gradient ratio between the incident light image signal and each diffraction model is calculated.
  • That is, the ratio between the luminance gradient at each coordinate (r, θ) of the incident light image signal and the luminance gradient of the diffraction model at the same distance from the center is calculated.
  • A histogram of the luminance gradient ratios is created, one histogram per wavelength. Repeating this while changing the wavelength yields, as shown in FIG. 9, as many histograms as there are selected wavelengths.
  • The created histograms are analyzed, and one or more combinations of wavelength and luminance gradient ratio are selected from the histograms with the largest peaks. The selected wavelength is substituted into Equation (5) to calculate the incident intensity of the near-infrared light in the target region. In this way, the wavelength and incident light intensity of the near-infrared light component in one region can be calculated. When several wavelength and luminance gradient ratio pairs are selected, the incident light intensity of the near-infrared light component may be calculated for each pair and summed.
  • The same calculation is performed in every region of the incident light video signal set to contain one circular pattern, and the wavelength and incident intensity of the near-infrared light in each region are calculated.
  • Thus the wavelength and intensity of the near-infrared light incident on every near-infrared light transmitting portion of the near-infrared light diffracting means can be calculated; that is, a near-infrared light video signal can be generated.
  • FIG. 10 is a flowchart summarizing the near-infrared light image signal generation operation in one area described above. This flowchart shows the operation after the region division and polar coordinate conversion. Hereinafter, this flowchart will be described.
  • First, a diffraction model is created assuming a wavelength in the near-infrared region (S1).
  • The luminance gradient ratio between the incident light image signal and the diffraction model is calculated at each coordinate (S2).
  • A histogram of the obtained luminance gradient ratios is created (S3).
  • The incident intensity of the near-infrared light is estimated from the luminance gradient ratio giving the mode (S4).
  • Next, diffraction models are created for a plurality of predetermined wavelengths covering the near-infrared region (S5).
  • The luminance gradient ratio between the diffraction model of each wavelength and the incident light image signal is calculated at each coordinate (S6).
  • A histogram of the luminance gradient ratios is created for each wavelength (S7).
  • The wavelength of the incident near-infrared light is estimated from the luminance gradient ratio at the histogram peak (S8). As described above, the wavelength and intensity of the near-infrared light are estimated; a sketch of these steps follows.
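  • The following Python sketch ties steps S1-S8 together, building on the diffraction_model() and radial_gradient() sketches above. Equation (5), which maps the selected wavelength and gradient ratio to an incident intensity, is not reproduced in this text; since diffraction_model() is normalized to unit incident intensity, the modal gradient ratio itself is used as the intensity estimate here, which is an assumption. The pixel pitch and histogram binning are likewise illustrative.

```python
import numpy as np

PIXEL_PITCH = 3e-6  # assumed sensor pixel pitch in meters (illustrative)


def estimate_nir(image, cx, cy, a, R, wavelengths):
    """Estimate the wavelength and incident intensity of the NIR component
    forming the circular pattern centered at (cx, cy), per steps S1-S8."""
    obs_grad = radial_gradient(image, cx, cy)         # dI/dr, shape (n_r, n_theta)
    n_r = obs_grad.shape[0]
    q = np.arange(n_r) * PIXEL_PITCH                  # radial distance of each ring
    best_peak, best_wavelength, best_ratio = 0, None, 0.0
    for lam in wavelengths:                           # S1 / S5: one model per wavelength
        model = diffraction_model(q, lam, a, R)
        model_grad = np.gradient(model)               # model dI/dr, same ring spacing
        valid = np.abs(model_grad) > 1e-12            # avoid division by ~zero
        ratios = (obs_grad[valid, :] / model_grad[valid, None]).ravel()  # S2 / S6
        hist, edges = np.histogram(ratios, bins=200)  # S3 / S7
        i = hist.argmax()
        if hist[i] > best_peak:                       # S4 / S8: mode across wavelengths
            best_peak = hist[i]
            best_wavelength = lam
            best_ratio = 0.5 * (edges[i] + edges[i + 1])
    return best_wavelength, best_ratio                # (wavelength, intensity estimate)


# Usage sketch: scan candidate wavelengths at 10 nm steps from 700 nm.
# wavelengths = np.arange(700e-9, 1000e-9, 10e-9)
# lam, intensity = estimate_nir(image, cx, cy, a=5e-6, R=1e-3, wavelengths=wavelengths)
```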
  • As described above, both a visible light image and a near-infrared light image can be captured by a single image capturing device using a photosensor array of ordinary, simple configuration.
  • FIG. 11 is a block diagram showing a video photographing apparatus 100A according to the third embodiment of the present invention.
  • The image capturing apparatus 100A according to the present embodiment generates both a visible color image signal and a near-infrared light image signal.
  • The image capturing device 100A includes a code-type IR cut filter 110a as the near-infrared light diffracting means.
  • The incident light video signal generation unit 120 includes a photosensor array 121 in which a plurality of photosensors are arranged in a plane, and a color filter array 122 provided on the light incident side of the photosensor array 121.
  • R, G, B, and NIR represent red, green, blue, and near-infrared signals, respectively.
  • Arrows in the figure indicate that R+NIR, G+NIR, and B+NIR signals are generated by the photosensor array 121 and that the NIR video signal is generated by the near-infrared light video signal generation unit 140.
  • Arrows likewise indicate that the R, G, and B signals are generated by the visible light video signal generation unit 150. Details of the operation are described later.
  • FIG. 12 is a plan view showing the code-type IR cut filter 110a.
  • The code-type IR cut filter 110a includes a near-infrared light cut portion 111 and a near-infrared light transmitting portion 112; that is, "code type" refers to the binary pattern of transmission and cut.
  • The code-type IR cut filter 110a is provided on the front side of the color filter array 122 in the light traveling direction. As a result, near-infrared light diffraction occurs when incident light passes through the near-infrared light transmitting portion 112.
  • The near-infrared light cut portion 111 does not transmit near-infrared light but transmits visible light.
  • The near-infrared light transmitting portion 112 transmits near-infrared light.
  • The near-infrared light transmitting portion 112 has a circular shape or a shape close to it. The plurality of circular patterns formed on the photosensor array surface by near-infrared light are kept from overlapping one another; the approximate spread of the diffracted light can be predicted from Equations (1) and (2).
  • The near-infrared light transmitting portions 112 are arranged, and the distance between the code-type IR cut filter 110a and the incident light video signal generating unit 120 is set, so as to satisfy this condition. As long as the condition is satisfied, the arrangement of the near-infrared light transmitting portions is arbitrary.
  • In this way, a plurality of circular patterns of predetermined size can be formed at predetermined positions on the light receiving surface of the incident light image signal generating means 120; a rough spacing criterion is sketched below.
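  • As a hedged illustration of the non-overlap condition, the first dark ring of the Airy pattern assumed in the earlier sketch lies at x ≈ 3.8317 (the first zero of J1), which gives a pattern radius from which a minimum opening pitch can be derived. The margin factor and numbers below are assumptions, not values from the patent.

```python
import numpy as np


def airy_first_dark_ring(wavelength, a, R):
    """Radius on the sensor of the first dark ring of the circular pattern,
    from the first zero of J1 (x ~= 3.8317) in the assumed Airy model."""
    return 3.8317 * wavelength * R / (2.0 * np.pi * a)


# Space the transmitting openings by more than twice this radius (with some
# margin, evaluated at the longest NIR wavelength of interest) so that
# neighbouring circular patterns never overlap.
pitch = 2.5 * airy_first_dark_ring(1000e-9, a=5e-6, R=1e-3)
```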
  • The size of the near-infrared light transmitting portion 112 need not match the size of one photosensor pixel. Although it is desirable that the near-infrared light transmitting portion 112 also transmit visible light, the present embodiment works even when it does not.
  • The photosensor array 121 is a two-dimensional arrangement of photosensors, as in the second embodiment, and each photosensor (pixel) is sensitive to visible light and near-infrared light.
  • The color filter array 122 is an array in which color filters transmitting, for example, R (red), G (green), and B (blue) are provided at positions corresponding to the photosensors. Each color filter also transmits near-infrared light.
  • The color filters may instead use complementary colors such as C (cyan), M (magenta), and Y (yellow); the description here uses the R, G, B example.
  • FIG. 13 is a plan view showing an example of the configuration of the color filter array 122.
  • Each color filter transmits near-infrared light in addition to visible light of its own color.
  • In FIG. 13, each pixel is therefore labeled in the form "color + NIR".
  • R + NIR denotes a color filter that transmits red and NIR,
  • G + NIR denotes a color filter that transmits green and NIR, and
  • B + NIR denotes a color filter that transmits blue and NIR.
  • A photosensor is provided at the position corresponding to each color filter. Note that one color filter may correspond to one photosensor or to several.
  • The color filter arrangement shown in FIG. 13 is called a Bayer array: one unit is formed by one R pixel, two G pixels, and one B pixel, and these units are arranged periodically.
  • FIG. 13 illustrates six units in a 2 × 3 arrangement.
  • The light transmitted through the code-type IR cut filter 110a and the color filter array 122 is converted by the photosensors into three color signals: R+NIR, G+NIR, and B+NIR.
  • The diffraction pattern extraction unit 130 extracts the diffraction patterns, and the near-infrared light image signal generation unit 140 generates a near-infrared light image signal. Details of the operation are described later.
  • The visible light video signal generation means 150 generates the R, G, and B color signals based on the incident light video signal and the near-infrared light video signal.
  • FIG. 14 is a flowchart showing an outline of the operation. Hereinafter, this flowchart will be described.
  • First, the incident light image signal generating means generates an incident light image signal (S101).
  • The incident light video signal consists of R, G, and B three-color video signals, each of which includes a near-infrared light component.
  • Next, missing colors at each pixel are interpolated (demosaicing) to generate progressive video signals of R+NIR, G+NIR, and B+NIR (S102).
  • Here, a progressive video signal is defined as the video signal of each color, obtained by the demosaicing process, in which every pixel carries a value.
  • The progressive video signals of each color are analyzed in the same manner as in the second embodiment, and the near-infrared light video signals contained in them are extracted and generated (S103).
  • Finally, a pure color video signal is generated for each of R, G, and B (S104).
  • In this way, the three-color visible light image signal and the near-infrared light image signal can both be generated.
  • Next, the demosaicing process that interpolates missing colors is described.
  • The demosaicing process described below is merely an example; other methods may be used.
  • FIG. 15 is a schematic plan view for explaining the demosaicing process performed by one unit of the Bayer array.
  • One unit is composed of four pixels: one R pixel, two G pixels, and one B pixel.
  • A pixel here means the pair of a color filter and a photosensor.
  • At each pixel, R+NIR, G+NIR, or B+NIR light is detected.
  • Colors other than the pixel's own color are not detected, so the missing color signals are interpolated from the luminance of the surrounding pixels.
  • The R, G, and B color signals at this point still include NIR components, but to simplify the description they are written simply as R, G, and B.
  • The pixel at coordinates (1, 1) corresponds to R and directly yields an R signal:
  • R(1,1) = R(1,1) (6)
  • The G signal and B signal, which do not exist at the pixel at coordinates (1, 1), are calculated by interpolation from the color signals of the surrounding pixels, for example as follows:
  • G(1,1) = (G(2,1) + G(1,2)) / 2 (7)
  • B(1,1) = B(2,2) (8)
  • Next, the video signal (R, G, and B color signals) of the pixel at coordinates (1, 2) is generated.
  • The pixel at coordinates (1, 2) corresponds to G and directly yields a G signal:
  • G(1,2) = G(1,2) (9)
  • The R signal and B signal, which do not exist at the pixel at coordinates (1, 2), are likewise calculated by interpolation from the color information of the surrounding pixels:
  • R(1,2) = R(1,1) (10)
  • B(1,2) = B(2,2) (11)
  • The same processing is repeated to generate video data (R, G, and B color signals) for all pixels.
  • The demosaicing process is not limited to the above method; various methods can be used. As described above, a progressive video signal in which R, G, and B color information is set for every pixel is obtained; a sketch of the interpolation for one unit follows.
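  • The following Python sketch applies Equations (6)-(11) to one 2 × 2 Bayer unit. It is one simple nearest/average interpolation consistent with the equations above; as the text notes, many other demosaicing methods would serve.

```python
import numpy as np


def demosaic_unit(raw):
    """Demosaic one 2x2 Bayer unit laid out as [[R, G], [G, B]].

    raw : 2x2 array of raw pixel values (each still containing NIR).
    Returns full 2x2 R, G, and B planes per Equations (6)-(11)."""
    R_, G12, G21, B_ = raw[0, 0], raw[0, 1], raw[1, 0], raw[1, 1]
    r = np.full((2, 2), R_)             # Eqs (6), (10): nearest R everywhere
    b = np.full((2, 2), B_)             # Eqs (8), (11): nearest B everywhere
    g = np.empty((2, 2))
    g[0, 1], g[1, 0] = G12, G21         # Eq (9): direct G samples
    g[0, 0] = (G21 + G12) / 2.0         # Eq (7): average of the two G neighbours
    g[1, 1] = (G21 + G12) / 2.0         # same averaging at the B site (assumed)
    return r, g, b
```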
  • The visible light components R, G, and B are not affected (not diffracted) by the near-infrared light transmitting portion 112 of the code-type IR cut filter 110a, so the information of the photographed scene reaches the color filter array 122 and the photosensor array 121 unaltered.
  • An incident light image signal composed of three color signals of R + NIR, G + NIR, and B + NIR is generated from the irradiated light.
  • The progressive signals generated at this point, in which the three color signals R, G, and B are set for all pixels, contain near-infrared light components in circular patterns. These R+NIR, G+NIR, and B+NIR signal components are denoted I_{R+NIR}, I_{G+NIR}, and I_{B+NIR}.
  • FIG. 16 is a schematic plan view showing the progressive signals obtained for each of R, G, and B.
  • The progressive R video signal 1R, the progressive G video signal 1G, and the progressive B video signal 1B each contain circular patterns of near-infrared light, as shown schematically in the figure.
  • Thus the progressive signals of each of the colors R, G, and B contain circular patterns of the near-infrared light component.
  • Each near-infrared light component is estimated in the same manner as in the second embodiment. That is, processing similar to the flowchart of FIG. 10 is performed using R, G, and B progressive signals. As a result, three near-infrared light image signals obtained from the three progressive signals are extracted and generated.
  • The NIR signal obtained from the R progressive signal 1R is referred to as the NIR_R signal,
  • the NIR signal obtained from the G progressive signal 1G as the NIR_G signal, and
  • the NIR signal obtained from the B progressive signal 1B as the NIR_B signal.
  • Their signal components are denoted I_{NIR_R}, I_{NIR_G}, and I_{NIR_B}.
  • Pure R, G, and B color components can be obtained by subtracting the near-infrared light component from the progressive signal component of each color. Denoting these color components I_R, I_G, and I_B, they are calculated by the following equations.
  • I_R = I_{R+NIR} − I_{NIR_R} (12)
  • I_G = I_{G+NIR} − I_{NIR_G} (13)
  • I_B = I_{B+NIR} − I_{NIR_B} (14)
  • The monochrome near-infrared light component I_NIR can be expressed by the following equation:
  • I_NIR = I_{NIR_R} + I_{NIR_G} + I_{NIR_B} (15) Note that each term of Equation (15) may be weighted to reflect the demosaicing scheme.
  • In this way, the four video signal components R, G, B, and NIR can be extracted for each pixel.
  • Four-channel video data can then be generated from the signal components of all pixels, as sketched below.
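  • A minimal Python sketch of Equations (12)-(15), assuming equal weights in Equation (15) (the text permits weighting to reflect the demosaicing scheme):

```python
def separate_channels(I_r_nir, I_g_nir, I_b_nir, I_nir_r, I_nir_g, I_nir_b):
    """Subtract the per-channel NIR estimates from the progressive signals
    (Eqs (12)-(14)) and sum them into a monochrome NIR image (Eq (15)).
    Inputs may be scalars or NumPy arrays of matching shape."""
    I_r = I_r_nir - I_nir_r                 # Eq (12)
    I_g = I_g_nir - I_nir_g                 # Eq (13)
    I_b = I_b_nir - I_nir_b                 # Eq (14)
    I_nir = I_nir_r + I_nir_g + I_nir_b     # Eq (15), unweighted
    return I_r, I_g, I_b, I_nir
```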
  • As described above, an image capturing device of simple configuration, one that neither mechanically moves an IR filter nor uses a special photosensor, can generate a visible light color video signal and a near-infrared video signal simultaneously.
  • In the above, the near-infrared light component I_NIR is calculated using the three progressive signals I_{R+NIR}, I_{G+NIR}, and I_{B+NIR}, but it can also be calculated from the signal of a single color.
  • When the pixels form an RGB Bayer array, the G channel carries twice as much information as the other color channels, so a method using only the G channel after demosaicing is effective.
  • The positions where the circular patterns form can be controlled. If the near-infrared light transmitting portions are arranged so that the circular patterns are centered on G pixels, the NIR components contained in I_{R+NIR} and I_{B+NIR} become small, and the effect of ignoring them becomes negligible. Further, if the shape and arrangement of the near-infrared light transmitting portions are set to roughly the size of one color filter and photosensor pixel, the influence can be reduced even more.
  • FIG. 17 is a block diagram of a video photographing apparatus 100B according to the fourth embodiment.
  • The video imaging device 100B has a configuration in which an encoded information memory 134 is added to the video imaging device 100A of the third embodiment.
  • The encoded information memory 134 is connected to the diffraction pattern extraction unit 130. The code-type IR cut filter 110a, the color filter array 122, and the photosensor array 121 are the same as in the video photographing apparatus 100A, so only the encoded information memory 134 and the diffraction pattern extraction unit 130 are described here.
  • The encoded information memory 134 records information on the circular patterns formed by near-infrared light: specifically, the center coordinates and size of each circular pattern and the distance between the code-type IR cut filter 110a and the photosensor array 121.
  • Each circular pattern is determined by the wavelength of the near-infrared light, the size of the near-infrared light transmitting portion 112, and the distance between the code-type IR cut filter 110a and the photosensor array 121. Once these parameters are fixed, the center coordinates and size of the circular patterns can be determined in advance by calculation or calibration.
  • Using this information, a visible light image signal and a near-infrared light image signal are generated.
  • The generation method is the same as in the third embodiment.
  • However, the processing can be simplified by using the encoded information recorded in the encoded information memory 134, since the positions of the diffraction patterns need not be searched for.
  • FIG. 18 is a block diagram of the three-plate type video photographing apparatus 101.
  • The video imaging apparatus 101 includes a code-type IR cut filter 110a, a prism 160, and three incident light video signal generation means 120R, 120G, and 120B. It further has diffraction pattern extraction means 130, near-infrared light image signal generation means 140, and visible light image signal generation means 150.
  • The prism 160 is the color separation means; in the example of FIG. 18, the G light travels straight, the B light is deflected downward in the drawing, and the R+NIR light upward. Note that the shape of the prism 160 depicted in FIG. 18 is conceptual and does not represent an actual shape.
  • The prism 160 may include an optical system for collimating light, and a camera lens 170 may be provided on the incident side of the prism 160.
  • The prism 160 and the camera lens 170 may be those generally used in a three-plate image photographing apparatus.
  • For each separated color, incident light image signal generation means dedicated to that color is provided.
  • The incident light video signal generation unit 120G generates a G video signal,
  • the incident light video signal generation unit 120B generates a B video signal, and
  • the incident light video signal generation unit 120R generates an R+NIR video signal.
  • The code-type IR cut filter 110a is provided on the light incident side of 120R, so the video signal generated by 120R includes circular patterns of near-infrared light.
  • The photosensors used in the incident light image signal generating means may be those generally used in a three-plate image photographing apparatus.
  • As the code-type IR cut filter 110a, for example, the one used in any of the second to fourth embodiments is applied.
  • The code-type IR cut filter 110a is provided on the front side, in the light traveling direction, of at least one of the three incident light image signal generating means; in the example of FIG. 18, it is provided for the R incident light video signal generating means 120R. As a result, diffraction of near-infrared light occurs.
  • For the remaining two incident light image signal generating means, which have no code-type IR cut filter, an ordinary near-infrared cut filter may be installed to remove near-infrared light that may leak through the prism 160. Color reproducibility is thereby ensured.
  • The light that enters the image capturing apparatus 101 through the camera lens 170 is separated by the prism 160 into R, G, and B light of different wavelength bands.
  • The light corresponding to R enters the incident light video signal generation unit 120R,
  • the light corresponding to G enters the incident light video signal generation unit 120G, and
  • the light corresponding to B enters the incident light video signal generation unit 120B.
  • The near-infrared light component of the light corresponding to R is diffracted by the near-infrared light transmitting portion of the code-type IR cut filter 110a, and the incident light video signal generation means 120R generates an R+NIR video signal that includes NIR. That is, with this configuration, an R+NIR video signal carrying a plurality of circular patterns of near-infrared light is generated.
  • The diffraction pattern extraction unit 130 and the near-infrared light image signal generation unit 140 receive the R+NIR image signal and generate an NIR image signal; the NIR video signal and the R+NIR video signal are then output.
  • The visible light video generation unit 150 generates a pure R video signal by subtracting the NIR video signal from the R+NIR video signal. As described above, four-channel video data (R, G, B, NIR) can be generated.
  • The video photographing apparatus 101 of the present embodiment has the configuration of a general three-plate imaging apparatus and requires no special devices.
  • The code-type IR cut filter 110a is obtained by making a simple modification to a general cut filter and so has a simple configuration. That is, both a visible light video signal and a near-infrared light video signal can be generated with a simple configuration, promising lower production cost and a lower failure rate than methods relying on complicated configurations and special devices. Further, even in situations where strong near-infrared light would otherwise saturate the sensor, spreading the near-infrared light by diffraction suppresses saturation and effectively widens the apparent dynamic range.
  • FIG. 19 is a block diagram of video imaging apparatus 102 in the present embodiment.
  • The image capturing apparatus 102 includes a code-type IR cut filter 110a and a stacked incident light image signal generating unit 120S in which incident light image signal generating units 120R, 120G, and 120B dedicated to R, G, and B are stacked. It further has diffraction pattern extraction means 130, near-infrared light image signal generation means 140, and visible light image signal generation means 150.
  • A stacked sensor as used in a general stacked-sensor video imaging apparatus can serve as the stacked incident light video signal generation unit 120S. In the example of FIG. 19, the photosensors are stacked in the order 120B, 120G, 120R along the light traveling direction.
  • The code-type IR cut filter 110a may be the one used in any of the second and third embodiments. It is provided on the front side of 120S in the light traveling direction.
  • The light incident on the image capturing device 102 includes R, G, B, and NIR light of different wavelength bands.
  • Light corresponding to B is converted to a B+NIR video signal by 120B,
  • light corresponding to G is converted to a G+NIR video signal by 120G, and
  • light corresponding to R and NIR is converted to an R+NIR video signal by 120R.
  • The R+NIR, G+NIR, and B+NIR video signals contain circular patterns formed by the NIR diffracted by the code-type IR cut filter 110a.
  • The circular patterns are extracted by the diffraction pattern extraction unit 130, and a near-infrared light image signal is generated by the near-infrared light image signal generation unit 140. That is, the NIR video signal is generated based on the R+NIR, G+NIR, and B+NIR video signals input from the stacked incident light video signal generating unit 120S.
  • The method is the same as in the third and fourth embodiments.
  • Thus the four video signals R, G, B, and NIR can be generated for all pixels.
  • Each unit may be implemented in hardware or realized by a computer program.
  • In the latter case, functions and operations similar to those described above are realized by a processor operating according to a program stored in a program memory. Only some of the functions may be realized by the computer program.
  • The scope of the present invention also includes a program for causing a computer to execute the processes of the first to sixth embodiments, and a recording medium storing the program.
  • As the recording medium, for example, a magnetic disk, a magnetic tape, an optical disk, a magneto-optical disk, or a semiconductor memory can be used.
  • (Appendix 1) A video photographing device comprising: incident light image signal generating means for generating an incident light image signal corresponding to the intensities of the visible light component and the near-infrared light component of incident light incident on a light receiving surface; near-infrared light diffracting means, disposed on the incident side of the light receiving surface, for diffracting the near-infrared light component of the incident light; diffraction pattern extracting means for extracting a diffraction pattern generated by the near-infrared light on the light receiving surface; near-infrared light image signal generating means for generating a near-infrared light image signal corresponding to the intensity of the near-infrared light component calculated based on the diffraction pattern; and visible light video signal generating means for generating a visible light video signal corresponding to the intensity of the visible light component calculated based on the incident light image signal and the near-infrared light image signal.
  • (Appendix 2) The video imaging device according to appendix 1, wherein the near-infrared light diffracting means has a near-infrared light cut portion that cuts near-infrared light and transmits visible light, and a near-infrared light transmitting portion that transmits near-infrared light.
  • (Appendix 3) The video imaging apparatus according to appendix 2, wherein the near-infrared light diffracting means includes a plurality of the near-infrared light transmitting portions.
  • (Appendix 5) The video imaging apparatus according to any one of appendices 1 to 4, characterized by having an information memory that stores information on the near-infrared light diffracting means and information on the positions where the diffraction patterns are generated on the light receiving surface.
  • (Appendix 6) The video imaging device according to any one of appendices 1 to 5, wherein the near-infrared light image signal generating means includes diffraction model generating means for generating a near-infrared light diffraction model, luminance gradient analyzing means for analyzing the luminance gradient of the incident light image signal, matching diffraction model determining means for determining a diffraction model that matches the analysis result of the luminance gradient, and incident near-infrared light intensity estimating means for estimating the intensity of incident near-infrared light based on the matching diffraction model.
  • (Appendix 7) The video imaging device according to appendix 6, wherein the near-infrared light image signal generating means further includes incident near-infrared light wavelength estimating means for estimating the wavelength of incident near-infrared light based on the matching diffraction model.
  • (Appendix 8) The video imaging apparatus according to appendix 6 or appendix 7, wherein the luminance gradient analyzing means comprises region dividing means for dividing the incident light image signal into regions each containing one diffraction pattern, and coordinate conversion means for converting the coordinates of each divided region into polar coordinates with the center of the diffraction pattern as the origin.
  • The luminance gradient analyzing means calculates a luminance gradient ratio, that is, the ratio between the luminance gradient at a given coordinate of the incident light image signal and the luminance gradient of the diffraction model at the position corresponding to that coordinate.
  • The matching diffraction model determining means performs the determination based on the statistical processing result of the luminance gradient ratios.
  • (Appendix 12) The video photographing apparatus according to any one of appendices 1 to 11, wherein the incident light video signal generating means separates incident light into a plurality of visible colors and generates a signal corresponding to each intensity.
  • (Appendix 13) The video photographing apparatus according to appendix 12, wherein the near-infrared light video signal generating means generates a near-infrared light video signal for each of the plurality of colors.
  • (Appendix 14) The video photographing apparatus according to any one of appendices 1 to 13, wherein the incident light video signal generating means includes a silicon photosensor.
  • (Appendix 15) A video imaging method comprising: diffracting the near-infrared light component of incident light with near-infrared light diffracting means arranged on the incident side of a light receiving surface; generating an incident light image signal corresponding to the intensities of the visible light component and the near-infrared light component of the incident light incident on the light receiving surface; extracting a diffraction pattern generated by the near-infrared light on the light receiving surface; generating a near-infrared light video signal corresponding to the intensity of the near-infrared light component calculated based on the diffraction pattern; and generating a visible light video signal corresponding to the intensity of the visible light component based on the incident light video signal and the near-infrared light video signal.
  • (Appendix 16) The video imaging method according to appendix 15, wherein a code-type IR cut filter having a near-infrared light cut portion that cuts near-infrared light and transmits visible light, and a near-infrared light transmitting portion that transmits near-infrared light, is used to diffract the near-infrared light component of the incident light.
  • (Appendix 17) The video imaging method according to appendix 16, wherein the code-type IR cut filter has a plurality of the near-infrared light transmitting portions.
  • (Appendix 18) The video imaging method according to any one of appendices 15 to 17, wherein information on the near-infrared light diffracting means and information on the positions where the diffraction patterns are generated on the incident light image signal generating means are stored in a memory.
  • (Appendix 20) The video imaging method according to any one of appendices 15 to 19, wherein a near-infrared light diffraction model is generated, the luminance gradient of the incident light image signal is analyzed, a diffraction model matching the analysis result of the luminance gradient is determined, and the intensity of the incident near-infrared light is estimated based on the matching diffraction model.
  • (Appendix 24) The video imaging method according to appendix 23, wherein the luminance gradient ratios generated at a plurality of coordinates in the divided region are statistically processed.
  • (Appendix 25) The video imaging method according to appendix 24, wherein the matching diffraction model is determined based on the result of the statistical processing of the luminance gradient ratios.
  • (Appendix 26) The video imaging method according to any one of appendices 15 to 25, wherein incident light is separated into a plurality of visible colors and a signal corresponding to each intensity is generated.
  • (Appendix 27) The video imaging method according to appendix 26, wherein a near-infrared light video signal is generated for each of the plurality of colors.
  • (Appendix 28) The video imaging method according to any one of the above appendices, wherein the incident light image signal is generated by a silicon photosensor.
  • (Appendix 29) A video shooting program causing a computer to execute: a step of diffracting the near-infrared light component of incident light by near-infrared light diffracting means disposed on the incident side of a light receiving surface; a step of generating an incident light image signal according to the intensities of the visible light component and the near-infrared light component of the incident light incident on the light receiving surface; a step of extracting a diffraction pattern generated by the near-infrared light on the light receiving surface; a step of generating a near-infrared light image signal according to the intensity of the near-infrared light component calculated based on the diffraction pattern; and a step of generating a visible light image signal according to the intensity of the visible light component based on the incident light image signal and the near-infrared light image signal.
  • (Appendix 30) The video shooting program according to appendix 29, further causing the computer to execute a step of generating a near-infrared light diffraction model and a step of analyzing the luminance gradient of the incident light image signal.
  • (Appendix 33) The video shooting program further causing the computer to execute a step of statistically processing the luminance gradient ratios generated at a plurality of coordinates in the divided region, and a step of determining the matching diffraction model based on the result of the statistical processing.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)

Abstract

The problem addressed by the invention is to provide an image capture device capable of capturing images with both visible light and infrared light using a simple configuration. The solution according to the present invention is an image capture device comprising: incident light image signal generating means that generates an incident light image signal corresponding to the intensities of a visible light component and a near-infrared light component of incident light incident on a light receiving surface; near-infrared light diffracting means that is disposed on the incident side of the light receiving surface and diffracts the near-infrared light component of the incident light; diffraction pattern extracting means that extracts a diffraction pattern generated on the light receiving surface by the near-infrared light; near-infrared light image signal generating means that generates a near-infrared light image signal corresponding to the intensity of the near-infrared light component, the intensity being calculated based on the diffraction pattern; and visible light image signal generating means that generates a visible light image signal corresponding to the intensity of the visible light component based on the incident light image signal and the near-infrared light image signal.
PCT/JP2016/002852 2015-06-17 2016-06-13 Image capture device and image capture method Ceased WO2016203760A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017524610A 2015-06-17 2016-06-13 Video imaging device, video imaging method, and video imaging program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-121588 2015-06-17
JP2015121588 2015-06-17

Publications (1)

Publication Number Publication Date
WO2016203760A1 true WO2016203760A1 (fr) 2016-12-22

Family

ID=57545881

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/002852 Ceased WO2016203760A1 (fr) Image capture device and image capture method

Country Status (2)

Country Link
JP (1) JP6720971B2 (fr)
WO (1) WO2016203760A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11962911B2 (en) 2020-12-03 2024-04-16 Samsung Electronics Co., Ltd. Electronic device for performing image processing and operation method thereof to reduce artifacts in an image captured by a camera through a display


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006064634A * 2004-08-30 2006-03-09 Sony Corp Physical information acquisition method, physical information acquisition device, semiconductor device for sensing a physical quantity distribution with a plurality of arrayed unit components, and method of manufacturing the semiconductor device
JP2009529160A * 2006-02-06 2009-08-13 QinetiQ Limited Imaging system
WO2015059897A1 * 2013-10-23 2015-04-30 NEC Corporation Image capture device, image capture method, code-type infrared cut filter, and code-type particular-color cut filter

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MASATO TSUKADA ET AL.: "High Sensitivity Sensing Method by Using a Coded IR Cut Filter", FIT2014: Proceedings of the 13th Forum on Information Technology, vol. 3, 2014, pages 85-86 *

Also Published As

Publication number Publication date
JP6720971B2 (ja) 2020-07-08
JPWO2016203760A1 (ja) 2018-04-05

Similar Documents

Publication Publication Date Title
US7839437B2 (en) Image pickup apparatus, image processing method, and computer program capable of obtaining high-quality image data by controlling imbalance among sensitivities of light-receiving devices
KR101344824B1 (ko) Image processing device, imaging device, image processing method, imaging method, and image processing program
US8768053B2 (en) Image processing apparatus and method of providing high sensitive color images
US9179113B2 (en) Image processing device, and image processing method, and program
US9413984B2 (en) Luminance source selection in a multi-lens camera
CN102685511B (zh) Image processing apparatus and image processing method
CN111812838A (zh) System and method for light field imaging
US11460666B2 (en) Imaging apparatus and method, and image processing apparatus and method
KR20100039120A (ko) Image processing apparatus and method for reducing noise in images
JP5186517B2 (ja) Imaging device
JP2000134634A (ja) Image conversion method
JP2020024103A (ja) Information processing device, information processing method, and program
US9118878B2 (en) Image processing apparatus that corrects for chromatic aberration for taken image, image pickup apparatus, method of correcting for chromatic aberration of magnification therefor, and storage medium
JPWO2015133130A1 (ja) Video imaging device, signal separation device, and video imaging method
JPWO2012144162A1 (ja) Three-dimensional imaging device, light transmitting section, image processing device, and program
KR101243285B1 (ko) Apparatus and method for generating multispectral color images using a Bayer color filter array camera
JPWO2018116972A1 (ja) Image processing method, image processing device, and recording medium
JP6720971B2 (ja) Video imaging device, video imaging method, and video imaging program
JP6794989B2 (ja) Video processing device, imaging device, video processing method, and program
JP2014158165A (ja) Image processing device, image processing method, and program
CN118483856A (zh) Color filter array, image acquisition method, image sensor, and acquisition device
US8866972B1 (en) Method for transmitting spectrum information
JP6552248B2 (ja) Image processing device and method, imaging device, and program
WO2013111824A1 (fr) Image processing device, image capturing device, and image processing method
JP2019049616A (ja) Image processing device, imaging device, and control method of image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16811235

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017524610

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16811235

Country of ref document: EP

Kind code of ref document: A1