CN112217962B - Camera and image generation method - Google Patents
- Publication number
- CN112217962B (application CN201910618561.0A)
- Authority
- CN
- China
- Prior art keywords
- image
- pixel
- visible light
- brightness
- black
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/77—Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase
Abstract
The embodiment of the application provides a camera and an image generation method, wherein the method comprises the following steps: acquiring a color visible light image and a black-and-white full-spectrum image, wherein an imaging light source of the black-and-white full-spectrum image comprises visible light and infrared light; registering the color visible light image and the black-and-white full-spectrum image; and carrying out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image. In the image generation method of the embodiment of the application, because the signal-to-noise ratio of the black-and-white full-spectrum image is greater than that of the color visible light image, the color fused image obtained by fusing the two also has a signal-to-noise ratio greater than that of the color visible light image; by outputting the fused image instead of the color visible light image, the image quality is improved.
Description
Technical Field
The present application relates to the field of image acquisition technologies, and in particular, to a camera and an image generation method.
Background
With the improvement of people's security awareness, the coverage of monitoring equipment has gradually increased, and its application scenarios have become more and more extensive. Unlike a still camera, a video camera generally does not continuously illuminate the environment with a visible light fill-in lamp, out of consideration for light pollution. For this reason, in a low-illumination scene, the visible light component in the image collected by the video camera is small, so the color image has high noise and poor image quality.
Disclosure of Invention
An object of the embodiments of the present application is to provide a camera and an image generation method, so as to improve image quality. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a camera, where the camera includes:
the system comprises a visible light lens, a full spectrum lens and a processor;
the visible light lens is used for collecting a color visible light image;
the full-spectrum lens is used for collecting a black-and-white full-spectrum image, wherein an imaging light source of the black-and-white full-spectrum image comprises visible light and infrared light;
the processor is used for registering the color visible light image and the black-and-white full spectrum image; and carrying out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image.
In a second aspect, an embodiment of the present application provides an image generation method, where the method includes:
acquiring a color visible light image and a black-and-white full spectrum image of a shooting scene, wherein an imaging light source of the black-and-white full spectrum image comprises visible light and infrared light;
registering the color visible light image and the black-and-white full spectrum image;
and carrying out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image.
Optionally, before the registering the color visible light image and the black-and-white full spectrum image, the method further includes:
determining the current illumination of a visible light lens for collecting the color visible light image;
when the current illumination of the visible light lens is greater than a preset illumination threshold value, outputting the color visible light image;
the registering the color visible light image and the black-and-white full spectrum image comprises:
and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, registering the color visible light image and the black-and-white full-spectrum image.
Optionally, the determining the current illumination of the visible light lens for acquiring the color visible light image includes:
acquiring current shutter parameters, current exposure gain and current image brightness values of a visible light lens for acquiring the color visible light image;
and calculating the current illumination of the visible light lens according to the current shutter parameter, the current exposure gain and the current image brightness value of the visible light lens.
Optionally, the method further includes:
and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, starting an infrared light supplement lamp.
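The optional threshold logic above can be sketched as follows. This is a minimal illustration only: the function name, the return structure, and the numeric threshold are assumptions for the sketch, since the patent does not fix a concrete threshold value.

```python
ILLUMINATION_THRESHOLD = 5.0  # hypothetical value; tuned per deployment in practice


def select_output(lux: float) -> dict:
    """Decide the camera's output path and IR fill-light state from current illumination."""
    if lux > ILLUMINATION_THRESHOLD:
        # Bright scene: the color visible light image is output directly.
        return {"output": "color", "ir_fill_light": False}
    # Low-light scene: turn on the infrared fill light and output the fused image.
    return {"output": "fused", "ir_fill_light": True}
```

For example, a daytime reading well above the threshold selects the color path, while a night-time reading enables the infrared fill light and the fusion path.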
Optionally, the registered color visible light image is in an RGB format, and the image fusion is performed on the registered black-and-white full spectrum image and the registered color visible light image to obtain a fused image, including:
converting the registered black-and-white full-spectrum image and the registered color visible light image into YUV format;
determining a brightness factor and a color scale invariance factor of each pixel according to a Y channel value of each pixel in a black-and-white full-spectrum image in a YUV format and a Y channel value of each pixel in a color visible light image in a YUV format;
obtaining the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction according to the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the brightness factor of each pixel;
performing brightness fusion according to the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction to obtain the Y-channel numerical value of each pixel in a fused image;
obtaining a color-adjusted color visible light image according to the R, G, B numerical value of each pixel in the registered color visible light image and the color scale invariance factor of each pixel;
and converting the color-adjusted color visible light image into a YUV format to obtain a target color visible light image, taking the U channel numerical value of each pixel in the target color visible light image as the U channel numerical value of each pixel in the fusion image, and taking the V channel numerical value of each pixel in the target color visible light image as the V channel numerical value of each pixel in the fusion image.
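The fusion steps above can be sketched for a single pixel as follows. This is a simplified, hedged illustration: `factor_blurred` stands in for the locally blurred brightness factor described later in the text, `weight` stands in for the luminance weight table lookup, and the clamping choices are assumptions, not the patent's implementation.

```python
def fuse_pixel(y_mono, y_color, u_color, v_color, weight, factor_blurred):
    """Sketch of per-pixel fusion: luminance is corrected and blended,
    chrominance is taken from the color visible light image."""
    # Brightness correction: dividing the mono luminance by the (blurred)
    # brightness factor Ym/Yc pulls it toward the color image's brightness.
    y_mono_corr = min(255, int(y_mono / max(factor_blurred, 1e-6)))
    # Weighted luminance fusion of the color Y and the corrected mono Y.
    y_fused = int(weight * y_mono_corr + (1 - weight) * y_color)
    # U and V come from the (color-adjusted) visible light image.
    return y_fused, u_color, v_color
```

With a mono luminance of 200, a color luminance of 100, and a blurred factor of 2.0, the corrected mono luminance matches the color luminance, and the fused pixel keeps the color image's chrominance.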
Optionally, the determining the brightness factor and the color scale invariance factor of each pixel according to the Y channel value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel value of each pixel in the color visible light image in the YUV format includes:
respectively calculating the ratio of the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format to the Y-channel numerical value of the pixel at the corresponding position in the color visible light image in the YUV format to obtain the brightness factor of each pixel;
and inquiring a preset color scale factor table according to the Y channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel numerical value of each pixel in the color visible light image in the YUV format to obtain the color scale invariance factor of each pixel.
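Steps A and B can be sketched as below. The `+ 1` guard in the denominator is an assumption mirroring the `(Yc + 1)` term in the color scale factor formula; the table is modeled here as a simple nested mapping for illustration.

```python
def brightness_factor_map(y_mono, y_color):
    """Step A: per-pixel ratio of mono luminance to color luminance (Ym/Yc),
    producing a factor map with the same resolution as the images."""
    return [[ym / (yc + 1) for ym, yc in zip(row_m, row_c)]
            for row_m, row_c in zip(y_mono, y_color)]


def lookup_color_scale(table, y_mono_px, y_color_px):
    """Step B: look up the precomputed color scale invariance factor
    for a (Ym, Yc) pair in the preset table."""
    return table[y_mono_px][y_color_px]
```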
Optionally, the performing luminance fusion according to the Y channel value of each pixel in the color visible light image and the Y channel value of each pixel in the black-and-white full-spectrum image after luminance correction to obtain the Y channel value of each pixel in the fused image includes:
inquiring a preset brightness weight table according to the Y channel numerical value of each pixel in the color visible light image and the Y channel numerical value of each pixel in the black-and-white full spectrum image after brightness correction, and respectively determining the brightness weight of each pixel in the fused image, wherein the preset brightness weight table records the corresponding relation between the brightness weight and the brightness parameter, and the brightness parameter comprises the brightness of the pixel in the color visible light image and the brightness of the pixel in the black-and-white full spectrum image after brightness correction;
and respectively carrying out brightness fusion on the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after the brightness correction according to the brightness weight of each pixel in the fusion image to obtain the Y-channel numerical value of each pixel in the fusion image.
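A hedged sketch of the table-driven luminance fusion described above: the patent does not specify the resolution of the preset brightness weight table, so the 16-bin quantization and the index order used here are assumptions for illustration.

```python
def fused_luminance(y_color, y_mono_corr, weight_table, bins=16):
    """Blend the color luminance with the brightness-corrected mono luminance,
    using a weight looked up from a preset table indexed by quantized brightness."""
    step = 256 // bins
    # w is taken as the weight of the corrected mono branch (an assumption).
    w = weight_table[y_color // step][y_mono_corr // step]
    return int(w * y_mono_corr + (1 - w) * y_color)
```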
Optionally, when the camera that collects the color visible light image does not start the high dynamic mode, the preset brightness weight table is a first brightness weight table, where each brightness weight in the first brightness weight table is determined according to the brightness of a pixel in the color visible light image and the brightness of a pixel in the black-and-white full spectrum image after brightness correction; after the high dynamic mode of the camera is started, the preset brightness weight table is a second brightness weight table, and each brightness weight in the second brightness weight table is determined according to the average brightness, the brightness of the pixel in the color visible light image and the brightness of the pixel in the black-and-white full spectrum image after the brightness correction, wherein the average brightness is the average value of the brightness of each pixel in the color visible light image and the black-and-white full spectrum image after the brightness correction.
Optionally, the process of pre-establishing the preset color scale factor table includes:
by the formula: Ycf(Ym, Yc) = clip(Y_pix_factor2 × fac_m(Ym)/64, 4, 32), calculating the color scale invariance factors corresponding to different combinations of Ym and Yc values, wherein Ycf(Ym, Yc) is the color scale invariance factor, Y_pix_factor2 = clip(Y_pix_factor1 × fac_c(Yc)/64, 4, 64), Y_pix_factor1 = clip((Ym × 16)/(Yc + 1), 10, 64), Ym is the luminance of the pixel in the black-and-white full-spectrum image, Yc is the luminance of the pixel in the color visible light image, fac_c(Yc) represents the value of the preset color weight curve when the luminance of the pixel in the color visible light image is Yc, and fac_m(Ym) represents the value of the preset black-and-white weight curve when the luminance of the pixel in the black-and-white full-spectrum image is Ym;
and establishing the preset color scale factor table according to the corresponding relation between the (Ym, Yc) combinations and the color scale invariance factors.
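The table construction can be sketched directly from one plausible reading of the fixed-point formula in the text. Integer division is assumed for the /64 and /(Yc + 1) terms, as is common in fixed-point pipelines, and the weight curves `fac_c` and `fac_m` are supplied by the caller since their shapes are only shown as curves (fig. 5).

```python
def clip(x, lo, hi):
    """Clamp x into [lo, hi], as used throughout the fixed-point formula."""
    return max(lo, min(hi, x))


def build_color_scale_table(fac_c, fac_m):
    """Precompute Ycf(Ym, Yc) for all 8-bit (Ym, Yc) pairs.

    fac_c / fac_m: 256-entry preset color / black-and-white weight curves."""
    table = [[0] * 256 for _ in range(256)]
    for ym in range(256):
        for yc in range(256):
            f1 = clip((ym * 16) // (yc + 1), 10, 64)       # raw luminance ratio term
            f2 = clip(f1 * fac_c[yc] // 64, 4, 64)         # weighted by the color curve
            table[ym][yc] = clip(f2 * fac_m[ym] // 64, 4, 32)  # weighted by the mono curve
    return table
```

With flat (all-64, i.e. unity-gain) weight curves, the factor reduces to the clipped ratio term alone, which makes the clipping bounds easy to verify.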
The camera and the image generation method provided by the embodiment of the application acquire a color visible light image and a black-and-white full-spectrum image, wherein an imaging light source of the black-and-white full-spectrum image comprises visible light and infrared light; register the color visible light image and the black-and-white full-spectrum image; and carry out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image. Because the signal-to-noise ratio of the black-and-white full-spectrum image is greater than that of the color visible light image, the color fused image obtained by fusing the two also has a signal-to-noise ratio greater than that of the color visible light image; by outputting the fused image instead of the color visible light image, the image quality is improved. Of course, not all of the advantages described above need to be achieved at the same time in practicing any one product or method of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic view of a camera according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a camera for image fusion according to an embodiment of the present application;
FIG. 3a is a schematic diagram of a black-and-white full spectrum image according to an embodiment of the present application;
FIG. 3b is a schematic view of a color visible light image according to an embodiment of the present application;
FIG. 4 is a diagram illustrating luminance factor filtering according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a preset black-and-white weight curve and a preset color weight curve according to an embodiment of the present application;
FIG. 6 is a diagram illustrating a table of predetermined color scale factors according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an image generation method according to an embodiment of the present application;
FIG. 8 is a first diagram of a brightness weight table according to an embodiment of the present application;
fig. 9 is a second schematic diagram of a brightness weight table according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
First, terms of art in the embodiments of the present application are explained:
Low-illumination image enhancement: improving the information captured in an image by means of an image acquisition device and an image fusion method.
Visible light: here, the wavelength band visible to the human eye, approximately 400 nm (nanometers) to 650 nm.
RGB: the RGB color scheme is an industry color standard in which various colors are obtained by varying the three color channels Red, Green, and Blue and superimposing them on each other; RGB represents the colors of the red, green, and blue channels.
YUV: a color coding method, commonly used in video processing components, where "Y" denotes luminance and "U" and "V" denote the chrominance components (hue and saturation).
In the related art, a light-splitting-and-fusion image acquisition device splits incident light into a visible light part and an infrared light part, collects a visible light image and an infrared light image through two sensors respectively, fuses the luminance information of the two images into one, and then fuses in the color information to obtain the image to be output. With this method, however, the luminance information of the visible light image and of the infrared light image must be fused during the fusion process, so the fusion computation load is large.
In view of this, the present application provides a camera, referring to fig. 1, including:
visible light lens 101, full spectrum lens 102 and processor 103.
The visible light lens 101 is configured to collect a color visible light image.
The full spectrum lens 102 is configured to collect a black-and-white full spectrum image, wherein an imaging light source of the black-and-white full spectrum image includes visible light and infrared light.
The processor 103 is configured to register the color visible light image and the black-and-white full spectrum image; and carrying out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image.
The embodiment of the application adopts a dual-lens, dual-sensor arrangement: the sensor of the visible light lens 101 collects only visible light images, while the sensor of the full spectrum lens 102 collects full-spectrum (visible plus infrared) images. In one possible embodiment, referring to fig. 2, the visible light lens 101 captures images in the visible light band (wavelengths of about 400 nm to 650 nm); to better capture colors in low light, a large-aperture lens and a sensor with better low-light sensitivity can be used. The full spectrum lens 102 collects a full-spectrum image, i.e., one including visible light and infrared light. Optionally, to ensure image sharpness, the full spectrum lens 102 employs a lens confocal for infrared and visible light. In a possible implementation, the camera further comprises an infrared light supplement lamp for infrared fill lighting, which can be turned on at night to improve image quality.
The processor 103 in this embodiment may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In general, before the registration, the Processor 103 may also perform ISP (Image Signal Processor) processing such as interpolation, white balance, CCM (Color Correction Matrix) Correction, gamma Correction, noise reduction, and sharpening on the Color visible light Image and the black-and-white full-spectrum Image. Of course, the ISP processing may be performed by another computing module other than the processor 103.
Since the visible light lens 101 and the full spectrum lens 102 have parallax, the processor 103 needs to register the color visible light image and the black-and-white full spectrum image before image fusion. Parallax is related to the distance of the scene from the image acquisition device. In the embodiment of the application, the color visible light image and the black and white full spectrum image can be registered by any relevant registration method. In a possible implementation mode, pictures of two cameras are collected in a scene with good illuminance and rich information in advance, targets in the pictures are identified, and coordinate pairing is carried out on the same targets. And performing affine transformation on the paired coordinates to calculate an affine matrix. And in the subsequent actual use, registering the color visible light image and the black-and-white full spectrum image by using the calculated affine matrix and affine transformation.
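A minimal illustration of computing an affine matrix from paired target coordinates, as in the offline calibration described above. This sketch solves from exactly three point pairs via Cramer's rule; the patent's calibration would use many pairs and a least-squares fit, so everything here is a simplified assumption.

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))


def affine_from_pairs(src, dst):
    """Estimate the 2x3 affine matrix mapping three src points onto dst points."""
    A = [[x, y, 1] for x, y in src]
    d = det3(A)
    rows = []
    for coord in range(2):  # solve for (a, b, c) then (d, e, f) via Cramer's rule
        b = [p[coord] for p in dst]
        row = []
        for col in range(3):
            M = [r[:] for r in A]
            for i in range(3):
                M[i][col] = b[i]
            row.append(det3(M) / d)
        rows.append(row)
    return rows  # [[a, b, c], [d, e, f]]


def warp_point(M, x, y):
    """Apply the 2x3 affine matrix to a single coordinate."""
    return (M[0][0] * x + M[0][1] * y + M[0][2],
            M[1][0] * x + M[1][1] * y + M[1][2])
```

For a pure translation between the paired coordinates, the recovered matrix reduces to an identity rotation plus the offset, and warping any point just shifts it by that offset.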
The processor 103 performs image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image. For example, the brightness of the black-and-white full-spectrum image may be selected as the brightness of the fused image, and the color information of the color visible light image may be selected as the color of the fused image. Specifically, the registered black-and-white full spectrum image and the registered color visible light image may be converted from an RGB format to a YUV format, and the registered Y channel of the black-and-white full spectrum image and the registered U, V channel of the color visible light image are combined to obtain a fused image. And then converting the fused image in the YUV format into an RGB format.
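The simple channel-combination fusion described above can be sketched as follows. The BT.601 full-range conversion matrix is an assumed choice; the patent does not specify which RGB-to-YUV matrix is used.

```python
def rgb_to_yuv(r, g, b):
    """BT.601 full-range RGB -> YUV conversion (an assumed, standard matrix)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    return y, u, v


def merge_channels(y_mono, u_color, v_color):
    """Luminance from the black-and-white full-spectrum image,
    chrominance from the color visible light image."""
    return y_mono, u_color, v_color
```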
In the embodiment of the application, the signal-to-noise ratio of the black-and-white full-spectrum image is greater than that of the color visible light image, the color fusion image is obtained by fusing the black-and-white full-spectrum image and the color visible light image, the signal-to-noise ratio of the fusion image is greater than that of the color visible light image, and the fusion image is adopted to replace the color visible light image, so that the image quality is improved.
In a possible implementation, the processor is specifically configured to:
determining the current illumination of the visible light lens; when the current illumination of the visible light lens is larger than a preset illumination threshold value, outputting the color visible light image; and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, registering the color visible light image and the black-and-white full spectrum image, carrying out image fusion on the registered black-and-white full spectrum image and the registered color visible light image to obtain a fused image, and outputting the fused image.
The preset illumination threshold value can be set according to actual conditions. In the embodiment of the present application, in a scene with a good illuminance such as daytime, the camera may use a color visible light image collected by the visible light lens 101 as an output. In a scene with low ambient illumination such as at night, the camera outputs the fused image. Thereby adapting to different application scenarios.
Optionally, the determining the current illumination of the visible light lens includes:
acquiring current shutter parameters, current exposure gain and current image brightness values of the visible light lens;
and step two, calculating the current illumination of the visible light lens according to the current shutter parameter, the current exposure gain and the current image brightness value of the visible light lens.
Specifically, the current illuminance of the visible light lens can be calculated by lux = log((Y << 5)/(gain × shutter)), where shutter is the current shutter parameter of the visible light lens, gain is the current exposure gain of the visible light lens, and Y is the brightness value of the image acquired by the visible light lens, that is, the current image brightness value. The above parameters can be obtained from the auto-exposure module of the color path, and << is the left-shift operator.
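A direct transcription of the illuminance formula. The base of the logarithm is not specified in the text, so the natural logarithm is assumed here.

```python
import math


def current_illumination(y_brightness: int, gain: float, shutter: float) -> float:
    """lux = log((Y << 5) / (gain * shutter)); Y, gain, and shutter come from
    the color path's auto-exposure module."""
    return math.log((y_brightness << 5) / (gain * shutter))
```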
Optionally, the camera further includes an infrared light supplement lamp; the processor is further configured to: and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, the infrared light supplement lamp is turned on. Specifically, the infrared fill light may be an infrared light source of 750nm (nanometers).
The dual cameras use infrared supplementary lighting, and since the infrared fill light has a flashlight effect (bright in the middle, dark at the edges), the black-and-white full-spectrum image and the color visible light image differ greatly in brightness, as shown in fig. 3a and fig. 3b, where fig. 3a is the black-and-white full-spectrum image and fig. 3b is the color visible light image. This brightness difference causes large distortion when the colors of the color visible light image are fused into the black-and-white full-spectrum image, and the fused colors appear washed out; this is the phenomenon of mismatch between brightness and color. In addition, because human eyes cannot see infrared light and observe only the color scene, the fused picture would differ greatly in brightness from what the human eye observes and thus would not reflect the actual scene truthfully. Therefore, local brightness blurring adjustment is required to bring the brightness of the black-and-white full-spectrum image closer to that of the color visible light image.
Since the local brightness blurring adjustment adjusts brightness over large areas, it cannot completely resolve local color distortion. Large brightness differences in small, local areas still cause large distortion after the colors of the color visible light image are fused into the black-and-white full-spectrum image: where the color visible light image is dark and its colors are light, the colors no longer match the brightness after fusion, and the fused image produces strong color noise in extremely dark areas of the black-and-white full-spectrum image. Therefore, color-preserving processing is required. Its principle can be explained as follows: a change in the luminance Y in the YUV domain is equivalent to multiplying the R, G, B values by a factor in the RGB domain. This can also be understood as keeping the color ratios R/G and B/G unchanged, and the derivation process is as follows:
suppose there are RcGcBc, RmGmBm, YcUcVc and YmUmVm, wherein Rc, Gc and Bc are the R, G and B values of a pixel in the RGB-format color visible light image; Rm, Gm and Bm are the R, G and B values of that pixel after the brightness change; Yc, Uc and Vc are the Y, U and V values of the pixel in the YUV-format color visible light image; and Ym, Um and Vm are the Y, U and V values of the pixel at the corresponding position in the YUV-format black-and-white full-spectrum image.
After the brightness is changed:
Gm=Gc*Ym/Yc (1-1)
if the R/G, B/G of the color path is kept unchanged, the following steps are provided:
Rm/Gm=Rc/Gc
Bm/Gm=Bc/Gc
substituting formula (1-1) into the above, we obtain:
Rm=Rc*Ym/Yc (1-2)
Bm=Bc*Ym/Yc (1-3)
in combination with formulas (1-1), (1-2), and (1-3), there are:
Rm=Rc*Yfactor
Gm=Gc*Yfactor
Bm=Bc*Yfactor
wherein Yfactor = Ym/Yc is the color scale invariance factor of the pixel.
Therefore, in order to enhance the color of the fused image and reduce the noise of the fused image, in a possible embodiment, the registered color visible light image is in RGB format, and the image fusion of the registered black-and-white full spectrum image and the registered color visible light image to obtain the fused image includes:
step one, converting the registered black-and-white full-spectrum image and the registered color visible light image into YUV format.
In general, an image acquired by a camera is in an RGB format, so that a registered black-and-white full-spectrum image and a registered color visible light image need to be converted into a YUV format. It will be appreciated by those skilled in the art that this step may be omitted if the image captured by the camera is itself in YUV format.
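The patent does not specify the RGB-to-YUV conversion matrix; a minimal per-pixel sketch, assuming the common BT.601 full-range convention (an assumption, not taken from the patent), might look like this:

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel to YUV.

    The BT.601 full-range coefficients used here are an assumed
    convention; the patent does not state which matrix is used.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    return y, u, v
```

For a neutral gray pixel the chroma channels land at the 128 midpoint, which is why pure black-and-white content carries no color after conversion.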
And step two, determining the brightness factor and the color scale invariance factor of each pixel according to the Y channel value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel value of each pixel in the color visible light image in the YUV format.
Optionally, the determining the brightness factor and the color scale invariance factor of each pixel according to the Y channel value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel value of each pixel in the color visible light image in the YUV format includes:
Step A, for each pixel, calculating the ratio of the Y-channel value of the pixel in the black-and-white full-spectrum image in the YUV format to the Y-channel value of the pixel at the corresponding position in the color visible light image in the YUV format, so as to obtain the brightness factor of each pixel.
First, the determination process of the luminance factor is exemplified.
The present application provides a local brightness blurring adjustment method guided by the color visible light image, which adjusts local brightness without introducing noise from the guiding image (the color visible light image). The brightness of each pixel of the black-and-white full-spectrum image is divided by the brightness of the corresponding pixel in the color visible light image, yielding a brightness-factor map with the same resolution as the picture.
The luminance factor can be calculated using the following formula:
Yf=Ym/(Yc+1) (1)
where Yf is the luminance factor, Ym is the luminance of the pixel in the black-and-white full-spectrum image, Yc is the luminance of the pixel in the color visible light image, and the denominator Yc + 1 prevents division by zero when Yc is 0. The brightness factor is clamped to a range chosen with the noise-matching problem in mind: 0.6 to 10.
And respectively calculating the brightness factor of each pixel by the formula (1) for the pixels with the same positions in the black-white full-spectrum image and the color visible light image.
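A minimal per-pixel sketch of formula (1), including the clamp to the 0.6–10 range stated above:

```python
def luminance_factor(ym, yc, lo=0.6, hi=10.0):
    """Brightness factor per formula (1): Yf = Ym / (Yc + 1).

    ym: pixel brightness in the black-and-white full-spectrum image
    yc: pixel brightness in the color visible light image
    The +1 in the denominator guards against division by zero.
    """
    yf = ym / (yc + 1)
    return max(lo, min(hi, yf))  # clamp to the stated range 0.6..10
```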
To further reduce the error, in a possible implementation, after the ratio of the Y-channel value of each pixel in the black-and-white full-spectrum image in the YUV format to the Y-channel value of the pixel at the corresponding position in the color visible light image in the YUV format has been calculated and the luminance factor of each pixel obtained, the method further includes: filtering the luminance factor of each pixel. For example, as shown in fig. 4, the luminance factor map is filtered with a large window.
And B, inquiring a preset color scale factor table according to the Y channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel numerical value of each pixel in the color visible light image in the YUV format to obtain the color scale invariance factor of each pixel.
Next, the determination process of the color scale invariance factor is exemplified.
To reduce color noise and color overflow in highlights, the color scale invariance factor Yfactor varies with the pixel brightness Ym in the black-and-white full-spectrum image and the pixel brightness Yc in the color visible light image; it is a function Ycf with Yc and Ym as variables.
The color scale invariance factor can be calculated using the following formula:
Yfactor=Ycf(Yc,Ym) (2)
the calculation process of the above equation (2) is as follows:
According to the formula y_pix_factor = clip((Ym × 16)/(Yc + 1), 10, 64), a primary color scale factor is calculated, wherein y_pix_factor is the primary color scale factor, Ym is the brightness of the pixel in the black-and-white full-spectrum image, and Yc is the brightness of the pixel in the color visible light image.
By the formula Y_pix_factor = clip(y_pix_factor × fac_c(Yc)/64, 4, 64), a middle-level color scale factor is calculated, wherein Y_pix_factor is the middle-level color scale factor, and fac_c(Yc) represents the value of the preset color weight curve when the brightness of the pixel in the color visible light image is Yc.
By the formula: ycf (Ym, Yc) ═ clip (Y _ pix _ factor x fac _ m (Ym)/64,4,32), a color scale invariance factor is calculated, wherein fac _ m (Ym) represents the value of a preset black-and-white weight curve when the brightness of a pixel in the black-and-white full-spectrum image is Ym. In one possible embodiment, the preset black-and-white weight curve and the preset color weight curve can be as shown in fig. 5, wherein curve 1 is the preset color weight curve, curve 2 is the preset black-and-white weight curve, the horizontal axis represents the brightness, and the vertical axis represents the value of the curve. It can be seen from fig. 5 that as the brightness increases, the preset color weight curve gradually increases, and finally approaches to be flat; with the increase of the brightness, the numerical line of the preset black-white weight curve is increased and then reduced.
The processor 103 may calculate the color scale invariance factor of each pixel by the above equation (2), but in order to reduce the computational load on the processor 103, in one possible embodiment a preset color scale factor table of color scale invariance factors is established in advance. The processor 103 then queries this table according to the brightness Ym of each pixel in the black-and-white full-spectrum image in the YUV format and the brightness Yc of each pixel in the color visible light image in the YUV format, obtaining the color scale invariance factor of each pixel. The UV components are adjusted using the color scale invariance factor so that the brightness better matches the color. In one possible embodiment, the preset color scale factor table may be as shown in fig. 6.
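The three clip stages and the table precomputation can be sketched as follows. The integer fixed-point arithmetic, the flat weight curves, and the 16-step table sampling are illustrative assumptions: the patent defines fac_c and fac_m only graphically (fig. 5).

```python
def clip(x, lo, hi):
    return max(lo, min(hi, x))

def color_scale_factor(ym, yc, fac_c, fac_m):
    """Color scale invariance factor Ycf(Ym, Yc) from the three clip stages.

    fac_c / fac_m are the preset color and black-and-white weight curves,
    passed in as callables since the patent gives them only as figure 5.
    Integer (fixed-point) division is assumed.
    """
    y_pix_factor = clip((ym * 16) // (yc + 1), 10, 64)    # primary factor
    y_mid = clip(y_pix_factor * fac_c(yc) // 64, 4, 64)   # middle-level factor
    return clip(y_mid * fac_m(ym) // 64, 4, 32)           # invariance factor

# Building a preset color scale factor table is then a precomputation over
# sampled (Ym, Yc) brightness combinations (the curves here are hypothetical
# flat curves, for illustration only):
flat = lambda y: 64
table = {(ym, yc): color_scale_factor(ym, yc, flat, flat)
         for ym in range(0, 256, 16) for yc in range(0, 256, 16)}
```

At run time the processor would look up `table` instead of re-evaluating the clip chain per pixel, which is the motivation stated above.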
In order to further reduce the error, in a possible implementation manner, after querying a preset color scale factor table according to the Y channel value of each pixel in the black-and-white full spectrum image in the YUV format and the Y channel value of each pixel in the color visible light image in the YUV format to obtain a color scale invariance factor of each pixel, the method further includes: and filtering a factor graph formed by the color scale invariance factors of each pixel.
And step three, obtaining the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction according to the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the brightness factor of each pixel.
And dividing the brightness of each pixel in the black-and-white full-spectrum image in the YUV format by the brightness factor corresponding to each pixel to obtain the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction.
For example, according to the formula Yma = Ym/Yf, the Y-channel value of each pixel in the black-and-white full-spectrum image is corrected respectively, wherein Yma is the brightness of the pixel in the black-and-white full-spectrum image after brightness correction, Ym is the brightness of the pixel in the black-and-white full-spectrum image in the YUV format, and Yf is the brightness factor corresponding to the pixel.
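A one-line sketch of this correction step, dividing each black-and-white pixel's brightness by its brightness factor:

```python
def correct_luminance(ym, yf):
    """Brightness correction: Yma = Ym / Yf.

    Since Yf = Ym / (Yc + 1), the correction pulls the black-and-white
    brightness toward the color image's brightness (roughly Yc + 1
    before any clamping or filtering of the factor).
    """
    return ym / yf
```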
And fourthly, performing brightness fusion according to the Y channel numerical value of each pixel in the color visible light image and the Y channel numerical value of each pixel in the black-and-white full-spectrum image after the brightness correction to obtain the Y channel numerical value of each pixel in the fused image.
The camera fuses the brightness (Y-channel values) of pixels at the same position in the color visible light image and the brightness-corrected black-and-white full-spectrum image, and uses the result as the Y-channel value of each pixel in the fused image. For example, the Y-channel values of the pixels at the same position can be added and averaged, but this method brings a large loss of signal-to-noise ratio. To ensure a higher signal-to-noise ratio or better color realism, a weighted fusion method may also be used.
Optionally, the performing luminance fusion according to the Y channel value of each pixel in the color visible light image and the Y channel value of each pixel in the black-and-white full-spectrum image after luminance correction to obtain the Y channel value of each pixel in the fused image includes:
step A, inquiring a preset brightness weight table according to the Y channel value of each pixel in the color visible light image and the Y channel value of each pixel in the black-and-white full spectrum image after brightness correction, and respectively determining the brightness weight of each pixel in the fusion image, wherein the preset brightness weight table records the corresponding relation between the brightness weight and the brightness parameter, and the brightness parameter comprises the brightness of the pixel in the color visible light image and the brightness of the pixel in the black-and-white full spectrum image after brightness correction.
The preset brightness weight table records the brightness weight corresponding to the combination of the pixel brightness in the visible light images of different colors and the pixel brightness in the corrected black-and-white full-spectrum image.
In order to keep the color of the fused image true, more color needs to be fused where the black-and-white and color brightness differ strongly; that is, the larger the difference between the black-and-white and color brightness, the smaller the infrared weight. In a possible embodiment, the brightness weight may be calculated as ymff_def(Yc, Yma) = 256 − |Yc − Yma|, where Yma is the brightness of the pixel in the black-and-white full-spectrum image after brightness correction, Yc is the brightness of the pixel in the color visible light image, and ymff_def(Yc, Yma) is the brightness weight corresponding to Yma and Yc. The first initial brightness weight table is obtained after filtering and sampling. For example, one possible first initial luminance weight table may be as shown in fig. 8.
In one possible implementation, to retain image information for fast-moving objects and to increase the image dynamic range, the weight of the image whose brightness is closer to the average brightness of the image may be increased, that is:
dfyc=|Yc-avg_Y|+1;
dfym=|Yma-avg_Y|+1;
ymff_def(Yc,Yma)=256*dfym/(dfym+dfyc);
wherein avg_Y is the average brightness over each pixel of the brightness-corrected black-and-white full-spectrum image and the color visible light image. For example, when the average luminance is 128, a second initial luminance weight table obtained after the luminance weights are filtered and sampled may be as shown in fig. 9.
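Both initial weight definitions given above can be sketched per pixel as (integer arithmetic assumed for the second):

```python
def ymff_def_v1(yc, yma):
    """First initial weight: 256 - |Yc - Yma|, so a larger brightness
    difference between the two paths gives a smaller infrared weight."""
    return 256 - abs(yc - yma)

def ymff_def_v2(yc, yma, avg_y):
    """Second initial weight (high dynamic mode), built from each path's
    brightness distance to the average brightness avg_y.

    The +1 terms avoid a zero denominator when both paths sit exactly
    at the average.
    """
    dfyc = abs(yc - avg_y) + 1
    dfym = abs(yma - avg_y) + 1
    return 256 * dfym // (dfym + dfyc)
```

Filtering and sampling these per-pixel weights, as described above, yields the first and second initial brightness weight tables respectively.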
After an initial brightness weight table is obtained (either the first or the second initial brightness weight table), the preset brightness weight table may be obtained by weighting it with a preset maximum value table (in which each brightness weight is 256) and a preset minimum value table (in which each brightness weight is 0). In one possible embodiment, the weighting is as follows:
if α ≥ 128, let α = α − 128, then:
ymff = (256·α + (128 − α)·ymff_def(Yc, Yma))/128;
if α < 128, then ymff = (ymff_def(Yc, Yma)·α)/128;
α can be set according to actual requirements; it controls the signal-to-noise ratio of the fused image under low illumination. The larger α is, the higher the signal-to-noise ratio and the larger the color distortion; conversely, the smaller α is, the lower the signal-to-noise ratio and the truer the color. ymff is the luminance weight in the preset luminance weight table, and ymff_def(Yc, Yma) is the luminance weight in the initial luminance weight table.
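The two weighting branches can be sketched as follows (integer fixed-point arithmetic is an assumption):

```python
def preset_weight(ymff_def_val, alpha):
    """Blend an initial-table weight toward the maximum table (all 256)
    when alpha >= 128, or toward the minimum table (all 0) when
    alpha < 128, per the two branches described above."""
    if alpha >= 128:
        a = alpha - 128
        return (256 * a + (128 - a) * ymff_def_val) // 128
    return (ymff_def_val * alpha) // 128
```

At alpha = 128 the initial weight passes through unchanged; alpha = 256 forces the maximum table and alpha = 0 forces the minimum table.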
And B, respectively carrying out brightness fusion on the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after the brightness correction according to the brightness weight of each pixel in the fused image to obtain the Y-channel numerical value of each pixel in the fused image.
Specifically, the Y-channel value of each pixel in the fused image may be calculated according to the formula Y = (Yc × (256 − ymff) + ymff × Yma)/256, where Yc is the brightness of the pixel in the color visible light image, Yma is the brightness of the pixel in the black-and-white full-spectrum image after brightness correction, and ymff is the brightness weight, whose value range is 0 to 256. The weight ymff is determined according to the values of Yc and Yma and can be obtained by looking up the preset brightness weight table.
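A sketch of this fusion formula (integer division assumed):

```python
def fuse_luminance(yc, yma, ymff):
    """Fused Y value: Y = (Yc * (256 - ymff) + ymff * Yma) / 256.

    ymff in 0..256 weights the brightness-corrected black-and-white
    path; 256 - ymff weights the color visible light path.
    """
    return (yc * (256 - ymff) + ymff * yma) // 256
```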
And step five, obtaining the color-adjusted color visible light image according to the R, G, B numerical value of each pixel in the color visible light image after the registration and the color scale invariance factor of each pixel.
The R, G, B values of each pixel in the registered color visible light image are multiplied by the color scale invariance factor corresponding to that pixel, obtaining the color-adjusted color visible light image. For example, if the color scale invariance factor corresponding to the nth pixel in the registered color visible light image has value M, the R, G, B values of the nth pixel are each multiplied by M to obtain the R, G, B values of the nth pixel in the color-adjusted color visible light image.
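A sketch of this color adjustment, which also demonstrates the ratio-preservation property derived earlier (R/G and B/G are unchanged by a common factor):

```python
def adjust_color(rgb, yfactor):
    """Multiply each of R, G, B by the pixel's color scale invariance
    factor; the ratios R/G and B/G are left unchanged."""
    r, g, b = rgb
    return (r * yfactor, g * yfactor, b * yfactor)
```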
And step six, converting the color-adjusted color visible light image into a YUV format to obtain a target color visible light image, taking the U channel numerical value of each pixel in the target color visible light image as the U channel numerical value of each pixel in the fusion image, and taking the V channel numerical value of each pixel in the target color visible light image as the V channel numerical value of each pixel in the fusion image.
In conclusion, the Y-channel numerical value, the U-channel numerical value and the V-channel numerical value of each pixel of the fused image can be obtained, so that a complete fused image is obtained.
In a possible embodiment, when the camera is in the non-activated high dynamic mode, the preset brightness weight table is a first brightness weight table, and each brightness weight in the first brightness weight table is determined according to the brightness of a pixel in the color visible light image and the brightness of a pixel in the black-and-white full spectrum image after brightness correction; after the camera starts the high dynamic mode, the preset brightness weight table is a second brightness weight table, and each brightness weight in the second brightness weight table is determined according to the average brightness, the brightness of the pixel in the color visible light image, and the brightness of the pixel in the black-and-white full spectrum image after brightness correction, wherein the average brightness is the average value of the brightness of each pixel in the color visible light image and the black-and-white full spectrum image after brightness correction.
Specifically, each luminance weight in the first luminance weight table may be obtained by weighting the first initial luminance weight table with a preset maximum value table and a preset minimum value table; each luminance weight in the second luminance weight table may be obtained by weighting the second initial luminance weight table with a preset maximum value table and a preset minimum value table.
Optionally, the processor is further configured to start the high dynamic mode of the camera when a high-dynamic-mode start instruction is received, and to close the high dynamic mode of the camera when a high-dynamic-mode close instruction is received. After the camera starts the high dynamic mode, the processor may also turn on the infrared light supplement lamp, thereby increasing the collected image information.
An embodiment of the present application further provides an image generating method, with reference to fig. 7, where the method includes:
s701, acquiring a color visible light image and a black-and-white full spectrum image of a shooting scene, wherein an imaging light source of the black-and-white full spectrum image comprises visible light and infrared light.
The image generation method of the embodiment of the application can be realized by electronic equipment, and the electronic equipment can be a camera, a hard disk video recorder and the like. The camera can acquire a color visible light image and a black-and-white full spectrum image of a shooting scene through the visible light lens and the full spectrum lens. The hard disk video recorder can acquire a color visible light image and a black-and-white full spectrum image of a shooting scene through the front-end camera.
And S702, registering the color visible light image and the black-and-white full spectrum image.
And S703, carrying out image fusion on the registered black-and-white full-spectrum image and the registered color visible light image to obtain a fused image.
In the embodiment of the present application, the signal-to-noise ratio of the black-and-white full-spectrum image is greater than that of the color visible light image. Fusing the black-and-white full-spectrum image and the color visible light image yields a color fused image whose signal-to-noise ratio exceeds that of the color visible light image, so using the fused image in place of the color visible light image improves the image quality.
Optionally, before the registering the color visible light image and the black-and-white full spectrum image, the method further includes:
determining the current illumination of the visible light lens for collecting the color visible light image;
when the current illumination of the visible light lens is larger than a preset illumination threshold value, outputting the color visible light image;
the registering the color visible light image and the black-and-white full spectrum image includes:
and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, registering the color visible light image and the black-and-white full-spectrum image.
Optionally, the determining the current illumination of the visible light lens for collecting the color visible light image includes:
acquiring current shutter parameters, current exposure gain and current image brightness values of a visible light lens for acquiring the color visible light image;
and calculating the current illumination of the visible light lens according to the current shutter parameter, the current exposure gain and the current image brightness value of the visible light lens.
Optionally, the method further includes:
and when the current illumination of the visible light lens is not greater than a preset illumination threshold value, starting an infrared light supplement lamp.
Optionally, the color visible light image after the registration is in an RGB format, and the image fusion of the black-and-white full spectrum image after the registration and the color visible light image after the registration to obtain a fused image includes:
converting the registered black-white full-spectrum image and the registered color visible light image into YUV format;
determining a brightness factor and a color scale invariance factor of each pixel according to a Y channel value of each pixel in a black-and-white full-spectrum image in a YUV format and a Y channel value of each pixel in a color visible light image in a YUV format;
obtaining the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction according to the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the brightness factor of each pixel;
performing brightness fusion according to the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after the brightness correction to obtain the Y-channel numerical value of each pixel in a fused image;
obtaining a color-adjusted color visible light image according to the R, G, B numerical value of each pixel in the color visible light image after the registration and the color scale invariance factor of each pixel;
and converting the color-adjusted color visible light image into a YUV format to obtain a target color visible light image, taking the U channel numerical value of each pixel in the target color visible light image as the U channel numerical value of each pixel in the fusion image, and taking the V channel numerical value of each pixel in the target color visible light image as the V channel numerical value of each pixel in the fusion image.
Optionally, the determining the brightness factor and the color scale invariance factor of each pixel according to the Y channel value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel value of each pixel in the color visible light image in the YUV format includes:
respectively calculating the ratio of the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format to the Y-channel numerical value of the pixel at the corresponding position in the color visible light image in the YUV format to obtain the brightness factor of each pixel;
and inquiring a preset color scale factor table according to the Y channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel numerical value of each pixel in the color visible light image in the YUV format to obtain the color scale invariance factor of each pixel.
Optionally, the process of pre-establishing the preset color scale factor table includes:
by the formula: ycf (Ym, Yc) ═ clip (Y _ pix _ factor _ fac _ m (Ym)/64,4,32), and color scale invariance factors corresponding to different combinations of ymyc values are calculated, wherein Ycf (Ym, Yc) is the color scale invariance factor, Y _ pix _ factor is clip (Y _ pix _ factor _ fac _ c (Yc)/64,4,64), Y _ pix _ factor is clip ((Ym × 16)/(Yc +1),10,64), Ym is the luminance of the pixels in the full-spectrum image, Yc is the luminance of the pixels in the visible-light image, fac _ c (Yc) represents the value of the preset color weight curve when the luminance of the pixels in the full-spectrum image is Yc, and fac _ m (Ym) represents the value of the black-and white weight curve when the luminance of the pixels in the black-and-white-spectrum image is Ym;
and establishing the preset color scale factor table according to the correspondence between the (Ym, Yc) combinations and the color scale invariance factors.
Optionally, the performing luminance fusion according to the Y channel value of each pixel in the color visible light image and the Y channel value of each pixel in the black-and-white full-spectrum image after luminance correction to obtain the Y channel value of each pixel in the fused image includes:
inquiring a preset brightness weight table according to the Y channel numerical value of each pixel in the color visible light image and the Y channel numerical value of each pixel in the black-and-white full spectrum image after the brightness correction, and respectively determining the brightness weight of each pixel in the fused image, wherein the preset brightness weight table records the corresponding relation between the brightness weight and the brightness parameter, and the brightness parameter comprises the brightness of the pixel in the color visible light image and the brightness of the pixel in the black-and-white full spectrum image after the brightness correction;
and respectively carrying out brightness fusion on the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after the brightness correction according to the brightness weight of each pixel in the fused image to obtain the Y-channel numerical value of each pixel in the fused image.
Optionally, when the camera that collects the color visible light image does not start the high dynamic mode, the preset brightness weight table is a first brightness weight table, where each brightness weight in the first brightness weight table is determined according to the brightness of a pixel in the color visible light image and the brightness of a pixel in the black-and-white full spectrum image after brightness correction; after the camera starts the high dynamic mode, the preset brightness weight table is a second brightness weight table, and each brightness weight in the second brightness weight table is determined according to an average brightness, a brightness of a pixel in the color visible light image, and a brightness of a pixel in the black-and-white full spectrum image after the brightness correction, wherein the average brightness is an average value of the brightness of each pixel in the color visible light image and the black-and-white full spectrum image after the brightness correction.
It should be noted that, in this document, the technical features in the various alternatives can be combined to form the scheme as long as the technical features are not contradictory, and the scheme is within the scope of the disclosure of the present application. Relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, embodiments of the method, the electronic device and the storage medium are substantially similar to the embodiment of the camera, so that the description is simple, and relevant points can be found in the partial description of the embodiment of the camera.
The above description is only for the preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application are included in the protection scope of the present application.
Claims (8)
1. A camera, characterized in that the camera comprises:
the system comprises a visible light lens, a full spectrum lens and a processor;
the visible light lens is used for collecting a color visible light image;
the full-spectrum lens is used for collecting a black-and-white full-spectrum image, wherein an imaging light source of the black-and-white full-spectrum image comprises visible light and infrared light;
the processor is used for registering the color visible light image and the black-and-white full spectrum image; under the condition that the registered color visible light image is in an RGB format, converting the registered black-and-white full-spectrum image and the registered color visible light image into a YUV format;
respectively calculating the ratio of the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format to the Y-channel numerical value of the pixel at the corresponding position in the color visible light image in the YUV format to obtain the brightness factor of each pixel;
inquiring a preset color scale factor table according to the Y channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the Y channel numerical value of each pixel in the color visible light image in the YUV format to obtain a color scale invariance factor of each pixel;
obtaining the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction according to the Y-channel numerical value of each pixel in the black-and-white full-spectrum image in the YUV format and the brightness factor of each pixel;
performing brightness fusion according to the Y-channel numerical value of each pixel in the color visible light image and the Y-channel numerical value of each pixel in the black-and-white full-spectrum image after brightness correction to obtain the Y-channel numerical value of each pixel in a fused image;
obtaining a color-adjusted color visible light image according to the R, G, B numerical value of each pixel in the registered color visible light image and the color scale invariance factor of each pixel;
and converting the color-adjusted color visible light image into a YUV format to obtain a target color visible light image, taking the U channel numerical value of each pixel in the target color visible light image as the U channel numerical value of each pixel in the fusion image, and taking the V channel numerical value of each pixel in the target color visible light image as the V channel numerical value of each pixel in the fusion image.
2. The camera of claim 1, wherein the full-spectrum lens is a confocal lens for infrared light and visible light.
3. The camera of claim 1, wherein the processor is specifically configured to: determining the current illumination of the visible light lens; when the current illumination of the visible light lens is larger than a preset illumination threshold value, outputting the color visible light image; when the current illumination of the visible light lens is not greater than a preset illumination threshold value, registering the color visible light image and the black-and-white full spectrum image, carrying out image fusion on the registered black-and-white full spectrum image and the registered color visible light image to obtain a fused image, and outputting the fused image.
4. The camera of claim 3, wherein said determining a current illumination of said visible light lens comprises:
acquiring a current shutter parameter, a current exposure gain and a current image brightness value of the visible light lens;
and calculating the current illumination of the visible light lens from the current shutter parameter, the current exposure gain and the current image brightness value.
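Claim 4 names the inputs (shutter, gain, image brightness) but not the formula. The sketch below uses a common inverse-exposure model — the dimmer the scene, the longer the shutter and the higher the gain needed to reach a given brightness, so illumination scales with brightness / (shutter × gain). Both the formula and the `calibration_k` constant are assumptions, not the patented computation.

```python
def estimate_illumination(shutter_s, exposure_gain, image_brightness,
                          calibration_k=1.0):
    """Estimate scene illumination at the visible light lens from the
    current exposure state (assumed inverse-exposure model).

    shutter_s:        shutter time in seconds
    exposure_gain:    linear sensor gain
    image_brightness: mean image brightness (e.g. mean luma)
    calibration_k:    hypothetical sensor-specific scale constant
    """
    return calibration_k * image_brightness / (shutter_s * exposure_gain)
```

The camera would compare this estimate against the preset threshold of claim 3 to decide between outputting the color image directly and entering the fusion path.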
5. The camera of claim 3, further comprising an infrared fill light; the processor is further configured to turn on the infrared fill light when the current illumination of the visible light lens is not greater than the preset illumination threshold.
6. The camera according to claim 1, wherein performing brightness fusion on the Y-channel value of each pixel in the color visible light image and the brightness-corrected Y-channel value of each pixel in the black-and-white full-spectrum image to obtain the Y-channel value of each pixel in the fused image comprises:
querying a preset brightness weight table according to the Y-channel value of each pixel in the color visible light image and the brightness-corrected Y-channel value of each pixel in the black-and-white full-spectrum image, and determining the brightness weight of each pixel in the fused image, wherein the preset brightness weight table records the correspondence between brightness weights and brightness parameters, the brightness parameters comprising the brightness of the pixel in the color visible light image and the brightness of the pixel in the brightness-corrected black-and-white full-spectrum image;
and fusing, according to the brightness weight of each pixel in the fused image, the Y-channel value of each pixel in the color visible light image with the brightness-corrected Y-channel value of each pixel in the black-and-white full-spectrum image to obtain the Y-channel value of each pixel in the fused image.
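The table lookup and blend of claim 6 can be sketched as follows. The patent leaves the 256×256 table contents as a tuning choice (claim 7 describes two variants), so the example uses a constant table; the weighted-sum form of the blend is also an assumption consistent with, but not dictated by, the claim.

```python
import numpy as np

def fuse_luminance(y_color, y_mono_corrected, weight_table):
    """Per-pixel brightness fusion driven by a preset weight table.

    weight_table: 256x256 float array; entry [c, m] is the weight w
    given to the corrected mono luma when the color luma is c and the
    corrected mono luma is m. The fused luma is w*m + (1-w)*c.
    """
    w = weight_table[y_color, y_mono_corrected]           # per-pixel lookup
    fused = w * y_mono_corrected + (1.0 - w) * y_color    # weighted blend
    return np.clip(fused + 0.5, 0, 255).astype(np.uint8)  # round and clamp
```

Indexing the table with both luma images at once (numpy advanced indexing) gives each pixel its own weight in a single vectorized step, which is why a lookup table is attractive here compared with evaluating a weighting function per pixel.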
7. The camera according to claim 6, wherein when the camera's high dynamic mode is not activated, the preset brightness weight table is a first brightness weight table, each brightness weight in which is determined according to the brightness of a pixel in the color visible light image and the brightness of a pixel in the brightness-corrected black-and-white full-spectrum image; and when the camera's high dynamic mode is activated, the preset brightness weight table is a second brightness weight table, each brightness weight in which is determined according to an average brightness, the brightness of the pixel in the color visible light image and the brightness of the pixel in the brightness-corrected black-and-white full-spectrum image, wherein the average brightness is the mean of the brightnesses of the pixels in the color visible light image and the brightness-corrected black-and-white full-spectrum image.
8. An image generation method, characterized in that the method comprises:
acquiring a color visible light image and a black-and-white full-spectrum image of a shooting scene, wherein the imaging light source of the black-and-white full-spectrum image comprises visible light and infrared light;
registering the color visible light image and the black-and-white full-spectrum image;
when the registered color visible light image is in RGB format, converting the registered black-and-white full-spectrum image and the registered color visible light image into YUV format;
calculating, for each pixel, the ratio of the Y-channel value of that pixel in the YUV-format black-and-white full-spectrum image to the Y-channel value of the pixel at the corresponding position in the YUV-format color visible light image, to obtain the brightness factor of each pixel;
querying a preset color scale factor table according to the Y-channel value of each pixel in the YUV-format black-and-white full-spectrum image and the Y-channel value of each pixel in the YUV-format color visible light image to obtain the color scale invariance factor of each pixel;
obtaining the brightness-corrected Y-channel value of each pixel in the black-and-white full-spectrum image according to the Y-channel value of each pixel in the YUV-format black-and-white full-spectrum image and the brightness factor of each pixel;
performing brightness fusion on the Y-channel value of each pixel in the color visible light image and the brightness-corrected Y-channel value of each pixel in the black-and-white full-spectrum image to obtain the Y-channel value of each pixel in a fused image;
obtaining a color-adjusted color visible light image according to the R, G and B values of each pixel in the registered color visible light image and the color scale invariance factor of each pixel;
and converting the color-adjusted color visible light image into the YUV format to obtain a target color visible light image, taking the U-channel value of each pixel in the target color visible light image as the U-channel value of the corresponding pixel in the fused image, and taking the V-channel value of each pixel in the target color visible light image as the V-channel value of the corresponding pixel in the fused image.
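The brightness factor and correction steps of claim 8 can be sketched as below. The claim defines the factor precisely (per-pixel ratio of mono Y to color Y) but not how it is applied; the correction shown — dividing the mono Y by the mean factor so that the mono image's overall brightness matches the color image while its local full-spectrum detail is preserved — is one plausible reading, not the patented operator. The `eps` guard against division by zero is also an addition.

```python
import numpy as np

def brightness_factor(y_mono, y_color, eps=1e-6):
    """Per-pixel brightness factor: ratio of the black-and-white
    full-spectrum Y to the color-image Y at the same position."""
    return ((y_mono.astype(np.float64) + eps)
            / (y_color.astype(np.float64) + eps))

def correct_mono_brightness(y_mono, factor):
    """Brightness-corrected mono Y (assumed operator): dividing by the
    mean factor pulls the mono image's global brightness toward the
    color image while keeping its local detail intact."""
    corrected = y_mono.astype(np.float64) / factor.mean()
    return np.clip(corrected, 0.0, 255.0)
```

Under this reading, a scene lit mainly by infrared fill light (mono Y uniformly brighter than color Y) yields a factor well above 1, and the correction rescales the mono luma back to the color image's level before the weighted fusion of claim 6.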
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910618561.0A CN112217962B (en) | 2019-07-10 | 2019-07-10 | Camera and image generation method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112217962A CN112217962A (en) | 2021-01-12 |
| CN112217962B true CN112217962B (en) | 2022-04-05 |
Family
ID=74047067
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113077533B (en) * | 2021-03-19 | 2023-05-12 | 浙江大华技术股份有限公司 | Image fusion method and device and computer storage medium |
| CN115460343B (en) * | 2022-07-31 | 2023-06-13 | 荣耀终端有限公司 | Image processing method, device and storage medium |
| CN115861143A (en) * | 2022-12-26 | 2023-03-28 | 北京百度网讯科技有限公司 | Image fusion method and device, intelligent chip and storage medium |
| CN118261806B (en) * | 2024-03-07 | 2024-10-25 | 北京晶品特装科技股份有限公司 | Method for fusing images based on low-light-level and color visible light |
| CN118509670B (en) * | 2024-07-18 | 2025-01-17 | 深圳市积加创新技术有限公司 | A method and system for realizing high-resolution black light night vision based on dual cameras |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101067710A (en) * | 2006-01-20 | 2007-11-07 | 红外线解决方案公司 | Camera with hybrid visible and infrared images |
| CN102298769A (en) * | 2011-06-11 | 2011-12-28 | 浙江理工大学 | Colored fusion method of night vision low-light image and infrared image based on color transmission |
| CN104661008A (en) * | 2013-11-18 | 2015-05-27 | 深圳中兴力维技术有限公司 | Processing method and device for improving colorful image quality under condition of low-light level |
| CN107103596A (en) * | 2017-04-27 | 2017-08-29 | 湖南源信光电科技股份有限公司 | A kind of color night vision image interfusion method based on yuv space |
| CN108419062A (en) * | 2017-02-10 | 2018-08-17 | 杭州海康威视数字技术股份有限公司 | Image fusion device and image fusion method |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8878950B2 (en) * | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
| CA2767023C (en) * | 2011-02-09 | 2014-09-09 | Research In Motion Limited | Increased low light sensitivity for image sensors by combining quantum dot sensitivity to visible and infrared light |
Non-Patent Citations (1)
| Title |
|---|
| "Image Fusion Algorithm Using WA-WBA and Improved INSCT"; Zhao Chunhui et al.; Journal of Electronics & Information Technology; Feb. 2014; pp. 304-310 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN112217962B (en) | Camera and image generation method | |
| JP6947412B2 (en) | Combined HDR / LDR video streaming | |
| CN110022469B (en) | Image processing method, device, storage medium and electronic device | |
| US7656437B2 (en) | Image processing apparatus, image processing method, and computer program | |
| US7030913B2 (en) | White balance control apparatus and method, and image pickup apparatus | |
| US9445067B2 (en) | Imaging device and image signal processor with color noise correction | |
| US9407889B2 (en) | Image processing apparatus and image processing method for white balance control | |
| US20120127334A1 (en) | Adaptive spatial sampling using an imaging assembly having a tunable spectral response | |
| US20090310885A1 (en) | Image processing apparatus, imaging apparatus, image processing method and recording medium | |
| US8614751B2 (en) | Image processing apparatus and image processing method | |
| CN106063249A (en) | Imaging device, control method thereof, and computer-readable recording medium | |
| KR102102740B1 (en) | Image processing apparatus and image processing method | |
| JP2002027491A (en) | Image input apparatus, white balance adjustment method, and computer-readable recording medium storing program for executing the method | |
| US10395347B2 (en) | Image processing device, imaging device, image processing method, and image processing program | |
| CN107533756B (en) | Image processing device, imaging device, image processing method, and storage medium storing image processing program for image processing device | |
| CN113711584B (en) | A camera device | |
| JP4250506B2 (en) | Image processing method, image processing apparatus, image processing program, and imaging system | |
| WO2020146118A1 (en) | Lens rolloff assisted auto white balance | |
| CN103379289A (en) | Imaging apparatus | |
| Vuong et al. | A new auto exposure and auto white-balance algorithm to detect high dynamic range conditions using CMOS technology | |
| JP2003264850A (en) | Digital camera | |
| CN114143443B (en) | Dual sensor camera system and camera method thereof | |
| US20200228769A1 (en) | Lens rolloff assisted auto white balance | |
| JP2005142953A (en) | Digital imaging device | |
| JP7263018B2 (en) | Image processing device, image processing method and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||