
WO2021208593A1 - High dynamic range image processing system and method, electronic device, and storage medium - Google Patents


Info

Publication number
WO2021208593A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
processing
image
original image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2021/077093
Other languages
French (fr)
Chinese (zh)
Inventor
杨鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of WO2021208593A1 publication Critical patent/WO2021208593A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843Demosaicing, e.g. interpolating colour pixel values
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/133Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements

Definitions

  • This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system and method, electronic equipment, and computer-readable storage media.
  • a camera with a high dynamic range (High-Dynamic Range, HDR) function can capture scenes with a large light ratio, and it performs better than an ordinary camera in both the highlights and the dark areas.
  • the embodiments of the present application provide a high dynamic range image processing system and method, electronic equipment, and computer-readable storage medium.
  • the high dynamic range image processing system includes an image sensor, a high dynamic fusion unit, and an image processor.
  • the image sensor includes a pixel array.
  • the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels.
  • the color photosensitive pixel has a narrower spectral response than the full-color photosensitive pixel.
  • the pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array is exposed for a first exposure time to obtain a first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed at the first exposure time.
  • the pixel array is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixel exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixel exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image processor and the high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image.
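As a non-authoritative illustration of the flow just described, the following Python/NumPy sketch strings the named stages together for two exposures of a toy scene. The stage bodies (simple normalization, averaging, and clipping) and all function names are assumptions for illustration only, not the patent's algorithms.

```python
import numpy as np

def image_preprocess(raw):
    # placeholder for pixel completion / demosaicing
    return raw.astype(np.float64)

def high_dynamic_range_process(frames, exposure_times):
    # toy fusion: normalise each frame by its exposure time, then average
    aligned = [f / t for f, t in zip(frames, exposure_times)]
    return np.mean(aligned, axis=0)

def image_process(img):
    # placeholder for black level / shading correction, tone mapping, etc.
    return np.clip(img, 0.0, 1.0)

def fuse(color_img, panchromatic_img):
    # placeholder fusion of the colour and panchromatic results
    return 0.5 * color_img + 0.5 * panchromatic_img

rng = np.random.default_rng(0)
scene_color, scene_w = rng.uniform(0, 1, (8, 8)), rng.uniform(0, 1, (8, 8))
t1, t2 = 1.0, 0.25                      # first and second exposure times (unequal)

first_color, second_color = np.clip(scene_color * t1, 0, 1), np.clip(scene_color * t2, 0, 1)
first_w, second_w = np.clip(scene_w * t1, 0, 1), np.clip(scene_w * t2, 0, 1)

color_hdr = high_dynamic_range_process(
    [image_preprocess(first_color), image_preprocess(second_color)], [t1, t2])
w_hdr = high_dynamic_range_process(
    [image_preprocess(first_w), image_preprocess(second_w)], [t1, t2])
target = fuse(image_process(color_hdr), image_process(w_hdr))
print(target.shape)   # (8, 8)
```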
  • the high dynamic range image processing method provided by the embodiment of the present application is used in a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor.
  • the image sensor includes a pixel array, and the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels.
  • the color photosensitive pixel has a narrower spectral response than the full-color photosensitive pixel.
  • the pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the high dynamic range image processing method includes: controlling the pixel array to perform at least two exposures, wherein the pixel array is exposed at a first exposure time to obtain a first original image, and the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed at the first exposure time; and the pixel array is exposed at a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time;
  • the first exposure time is not equal to the second exposure time; and image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing are performed on the first original image and the second original image to obtain the target image.
  • the electronic device provided by the embodiment of the present application includes a lens, a housing, and a high dynamic range image processing system.
  • the lens and the high dynamic range image processing system are combined with the housing.
  • the lens cooperates with the image sensor of the high dynamic range image processing system for imaging.
  • the high dynamic range image processing system includes an image sensor, a high dynamic fusion unit and an image processor.
  • the image sensor includes an array of pixels.
  • the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units.
  • Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image processor and the high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.
  • the embodiment of the present application further provides a non-volatile computer-readable storage medium containing a computer program; when the computer program is executed by a processor, the processor executes the high dynamic range image processing method.
  • the high dynamic range image processing method is used in a high dynamic range image processing system.
  • the high dynamic range image processing system includes an image sensor, a color high dynamic fusion unit, a panchromatic high dynamic fusion unit and an image processor.
  • the image sensor includes a pixel array, and the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • the pixel array is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image processor, the color high dynamic fusion unit, and the panchromatic high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.
  • Fig. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application
  • Fig. 2 is a schematic diagram of a pixel array according to an embodiment of the present application.
  • FIG. 3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application.
  • FIG. 4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application.
  • FIGS. 5 to 10 are schematic diagrams of the arrangement of the smallest repeating unit in the pixel array of the embodiment of the present application.
  • FIGS. 11 to 13 are schematic diagrams of original images output by image sensors in some embodiments of the present application.
  • FIG. 14 is a schematic diagram of a pixel array according to an embodiment of the present application.
  • FIG. 15 to FIG. 17 are schematic diagrams of pixel completion processing in an embodiment of the present application.
  • FIGS. 18 to 20 are schematic diagrams of a high dynamic range image processing system according to an embodiment of the present application.
  • FIG. 21 is a schematic diagram of black level correction processing according to an embodiment of the present application.
  • FIG. 22 is a schematic diagram of lens shading correction processing according to an embodiment of the present application.
  • FIG. 23 and FIG. 24 are schematic diagrams of dead pixel compensation processing in an embodiment of the present application.
  • FIG. 25 to FIG. 28 are schematic diagrams of demosaicing processing in an embodiment of the present application.
  • FIG. 29 is a schematic diagram of the mapping relationship between Vout and Vin in the tone mapping process of the embodiment of the present application.
  • FIG. 30 is a schematic diagram of brightness alignment processing according to an embodiment of the present application.
  • FIG. 31 is a schematic diagram of pixel addition processing in an embodiment of the present application.
  • FIG. 32 is a schematic diagram of pixel averaging processing according to an embodiment of the present application.
  • FIG. 33 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 34 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 35 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor in some embodiments of the present application.
  • the high dynamic range image processing system 100 includes an image sensor 10, a high dynamic fusion unit 50 and an image processor 20.
  • the image sensor 10 includes a pixel array 11, and the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array 11 is exposed for a first exposure time to obtain a first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the image processor 20 and the high dynamic fusion unit 50 are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image.
  • the high dynamic range image processing system 100 of the embodiment of the present application controls the pixel array 11 to perform at least two exposures at the first exposure time and the second exposure time respectively, and generates multiple images according to the different exposure times and the different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can subsequently be performed on the multiple images to obtain a target image with a high dynamic range.
  • the high dynamic range image processing system 100 of the embodiment of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that both the bright and the dark areas of the target image perform better, which is conducive to improving imaging performance while helping to reduce costs.
  • the pixel array 11 is exposed at a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time.
  • the image processor 20 and the high dynamic fusion unit 50 are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image.
  • the image processor 20 includes a color preprocessing module 2023, a full color preprocessing module 2024, a color processing module 2021, a full color processing module 2022, and a fusion module 204,
  • the image preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing.
  • the color preprocessing module 2023 is used to perform pixel completion processing on the color original image data to obtain a color original image.
  • the full color preprocessing module 2024 is used to perform demosaicing processing on the full-color original image data to obtain a panchromatic original image.
  • the color processing module 2021 is used to perform the first image processing on the color original image to obtain a color intermediate image.
  • the full color processing module 2022 is used to perform the second image processing on the panchromatic original image to obtain a panchromatic intermediate image.
  • the fusion module 204 is configured to perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image, as sketched below.
  • the high dynamic fusion unit 50 is configured to fuse the target images corresponding to at least two exposures to obtain the highly dynamic target image.
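The patent does not spell out here how the fusion module 204 combines the color intermediate image with the panchromatic intermediate image, so the sketch below shows only one plausible scheme: blend the panchromatic (luminance-like) image into the luma of the color image and rescale the color channels accordingly. The Rec. 601 weights and the alpha blend weight are assumptions, not values from the patent.

```python
import numpy as np

def fuse_color_and_panchromatic(color_rgb, panchromatic, alpha=0.5):
    """Blend the panchromatic (luminance-like) image into the colour image's luma.
    alpha is an assumed blend weight, not a value from the patent."""
    # Rec. 601 luma of the colour intermediate image
    luma = 0.299 * color_rgb[..., 0] + 0.587 * color_rgb[..., 1] + 0.114 * color_rgb[..., 2]
    new_luma = (1.0 - alpha) * luma + alpha * panchromatic
    # rescale the RGB channels so their luma matches the blended luma
    gain = np.divide(new_luma, luma, out=np.ones_like(luma), where=luma > 1e-6)
    return np.clip(color_rgb * gain[..., None], 0.0, 1.0)

rng = np.random.default_rng(1)
color_intermediate = rng.uniform(0.0, 1.0, (4, 4, 3))       # e.g. RGB after the first image processing
panchromatic_intermediate = rng.uniform(0.0, 1.0, (4, 4))   # W image after the second image processing
target_image = fuse_color_and_panchromatic(color_intermediate, panchromatic_intermediate)
print(target_image.shape)   # (4, 4, 3)
```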
  • the high-dynamic fusion unit 50 includes a color high-dynamic fusion unit 30 and a panchromatic high-dynamic fusion unit 40.
  • before the color preprocessing module 2023 performs pixel completion processing on the color original image data, the color high-dynamic fusion unit 30 is configured to fuse the color original image data corresponding to at least two exposures to obtain the high dynamic color original image data.
  • before the panchromatic preprocessing module 2024 performs demosaicing processing on the panchromatic original image data to obtain the panchromatic original image, the panchromatic high dynamic fusion unit 40 is configured to fuse the panchromatic original image data corresponding to at least two exposures to obtain the high dynamic panchromatic original image data.
  • the first image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
  • the second image processing includes: one or more of the black level correction processing, the lens shading correction processing, the dead pixel compensation processing, and the global tone mapping processing.
  • the first image processing includes a first image sub-processing and a second image sub-processing
  • the color processing module 2021 is used to perform the first image sub-processing on the color original image first, and then perform the second image sub-processing.
  • the first image sub-processing includes: one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing.
  • the second image sub-processing includes: one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
  • the high dynamic fusion unit 50 is configured to perform brightness alignment processing on the target images corresponding to at least two exposures to obtain the brightness-aligned target image, and then fuse the brightness-aligned target image with one or more of the target images to obtain the highly dynamic target image.
  • the color high dynamic fusion unit 30 is configured to perform brightness alignment processing on the color original image data corresponding to at least two exposures to obtain the brightness-aligned color original image data, and then fuse the brightness-aligned color original image data with one or more frames of the color original image data to obtain the high dynamic color original image data.
  • the panchromatic high dynamic fusion unit 40 is configured to perform brightness alignment processing on the panchromatic original image data corresponding to at least two exposures to obtain the brightness-aligned panchromatic original image data, and then fuse the brightness-aligned panchromatic original image data with one or more frames of the panchromatic original image data to obtain the high dynamic panchromatic original image data.
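A minimal sketch of brightness alignment followed by fusion of a long and a short exposure. It assumes alignment is a linear scaling by the exposure-time ratio and that fusion simply prefers unsaturated long-exposure values; both are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def brightness_align(short_raw, t_short, t_long, full_scale=1023):
    """Scale the short-exposure raw data to the long-exposure brightness level.
    The linear scaling by exposure-time ratio is an assumption for illustration."""
    return np.clip(short_raw * (t_long / t_short), 0, full_scale)

def fuse_high_dynamic(long_raw, aligned_short, full_scale=1023, sat=0.98):
    """Keep long-exposure values where they are not saturated; use the aligned
    short-exposure values in blown-out regions."""
    saturated = long_raw >= sat * full_scale
    return np.where(saturated, aligned_short, long_raw)

rng = np.random.default_rng(2)
long_raw = rng.integers(0, 1024, (6, 6)).astype(np.float64)   # e.g. first color original image data
short_raw = (long_raw / 4).round()                             # second exposure, 1/4 the time
aligned = brightness_align(short_raw, t_short=1.0, t_long=4.0)
hdr_raw = fuse_high_dynamic(long_raw, aligned)
print(hdr_raw.shape)   # (6, 6)
```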
  • the image processor 20 further includes a receiving unit 201 and a memory unit 203.
  • the receiving unit 201 is used to receive the color original image data and the full-color original image data;
  • the memory unit 203 is used to temporarily store one or more of the color original image data, the full-color original image data, the color original image, the full-color original image, the color intermediate image, the full-color intermediate image, and the target image.
  • the image processor 20 includes a color preprocessing module 2023 and a panchromatic preprocessing module 2024.
  • the image preprocessing includes pixel addition processing and demosaicing processing.
  • the color preprocessing module 2023 is used to perform pixel addition processing on the color original image data to obtain a color original image, and the panchromatic preprocessing module 2024 is configured to perform demosaicing processing on the panchromatic original image data to obtain a panchromatic original image; or, the image preprocessing includes pixel averaging processing and demosaicing processing, the color preprocessing module 2023 is used to perform pixel averaging processing on the color original image data to obtain a color original image, and the panchromatic preprocessing module 2024 is used to perform demosaicing processing on the panchromatic original image data to obtain a panchromatic original image.
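The sketch below illustrates the difference between pixel addition and pixel averaging of the color samples inside each 2x2 subunit. It assumes a diagonal color/panchromatic layout within the subunit; the layout, shapes, and function names are assumptions for illustration only.

```python
import numpy as np

def combine_color_in_subunits(raw, color_mask, mode="add"):
    """Combine the colour samples inside each 2x2 subunit into one value.
    Assumes 2x2 subunits with two colour and two panchromatic pixels arranged
    diagonally, which is only one of the arrangements described."""
    h, w = raw.shape
    out = np.zeros((h // 2, w // 2))
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            block = raw[i:i + 2, j:j + 2]
            mask = color_mask[i:i + 2, j:j + 2]
            vals = block[mask]
            out[i // 2, j // 2] = vals.sum() if mode == "add" else vals.mean()
    return out

rng = np.random.default_rng(3)
raw = rng.integers(0, 1024, (4, 4)).astype(float)
# hypothetical mask: colour pixels on the anti-diagonal of every 2x2 subunit
color_mask = np.tile(np.array([[False, True], [True, False]]), (2, 2))
print(combine_color_in_subunits(raw, color_mask, "add"))       # pixel addition
print(combine_color_in_subunits(raw, color_mask, "average"))   # pixel averaging
```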
  • the high dynamic fusion unit 50 is integrated in the image sensor 10.
  • the high dynamic fusion unit 50 is integrated in the image processor 20.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
  • High dynamic range image processing methods include:
  • the pixel array 11 is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
  • the pixel array 11 may also be exposed for a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time.
  • the third exposure time is not equal to the first exposure time
  • the third exposure time is not equal to the second exposure time.
  • Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image may include: performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image.
  • the image preprocessing includes pixel completion processing and demosaicing processing
  • the image processing includes first image processing and second image processing
  • performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image may also include: performing pixel completion processing on the color original image data to obtain the color original image; performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image; performing the first image processing on the color original image to obtain a color intermediate image; performing the second image processing on the panchromatic original image to obtain the panchromatic intermediate image; and performing fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image.
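For the demosaicing of the panchromatic original image data mentioned above, the following sketch fills the missing W values at color positions by averaging the available W neighbours. This simple bilinear stand-in, and the assumed W-position mask, are illustrative only and are not the interpolation the patent actually uses.

```python
import numpy as np

def demosaic_panchromatic(raw, w_mask):
    """Fill panchromatic (W) values at colour positions by averaging the
    available W neighbours; a simple bilinear stand-in for the demosaicing step."""
    h, w = raw.shape
    out = raw.astype(float).copy()
    for i in range(h):
        for j in range(w):
            if not w_mask[i, j]:
                neigh = [raw[y, x]
                         for y, x in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                         if 0 <= y < h and 0 <= x < w and w_mask[y, x]]
                out[i, j] = float(np.mean(neigh)) if neigh else 0.0
    return out

rng = np.random.default_rng(4)
raw = rng.integers(0, 1024, (4, 4)).astype(float)
w_mask = np.tile(np.array([[True, False], [False, True]]), (2, 2))  # assumed W positions
print(demosaic_panchromatic(raw, w_mask))
```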
  • after performing fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image also includes: fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image.
  • before performing pixel completion processing on the color original image data to obtain the color original image, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image also includes: fusing the color original image data corresponding to at least two exposures to obtain highly dynamic color original image data.
  • before performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image also includes: fusing the panchromatic original image data corresponding to at least two exposures to obtain highly dynamic panchromatic original image data.
  • the first image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
  • the second image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.
  • the first image processing includes a first image sub-processing and a second image sub-processing
  • the color processing module 2021 is configured to perform the first image sub-processing on the color original image first, and then perform the second image sub-processing
  • the first image sub-processing includes one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing
  • the second image sub-processing includes: one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
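As an illustration of two of the listed operations, the sketch below applies a black level correction followed by a simple global tone mapping (a Vout as a function of Vin curve, in the spirit of FIG. 29). The 10-bit black level of 64 and the curve shape are assumed values, not taken from the patent.

```python
import numpy as np

def black_level_correction(img, black_level=64, full_scale=1023):
    """Subtract the sensor's black offset and rescale to full range.
    The 64/1023 values are typical 10-bit assumptions, not from the patent."""
    return np.clip((img - black_level) * full_scale / (full_scale - black_level), 0, full_scale)

def global_tone_mapping(img, full_scale=1023):
    """Simple Vout = f(Vin) curve that compresses highlights (assumed shape)."""
    vin = img / full_scale
    vout = vin / (vin + 0.25) * 1.25
    return vout * full_scale

rng = np.random.default_rng(5)
color_original = rng.integers(0, 1024, (4, 4)).astype(float)
color_intermediate = global_tone_mapping(black_level_correction(color_original))
print(color_intermediate.round(1))
```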
  • fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image includes: performing brightness alignment processing on the target images corresponding to the at least two exposures to obtain a brightness-aligned target image, and then fusing the brightness-aligned target image with one or more target images to obtain a highly dynamic target image.
  • fusing the color original image data corresponding to at least two exposures to obtain highly dynamic color original image data includes: performing brightness alignment processing on the color original image data corresponding to the at least two exposures to obtain brightness-aligned color original image data, and then fusing the brightness-aligned color original image data with one or more frames of the color original image data to obtain highly dynamic color original image data.
  • fusing the panchromatic original image data corresponding to at least two exposures to obtain highly dynamic panchromatic original image data includes: performing brightness alignment processing on the panchromatic original image data corresponding to the at least two exposures to obtain brightness-aligned panchromatic original image data, and then fusing the brightness-aligned panchromatic original image data with one or more frames of the panchromatic original image data to obtain highly dynamic panchromatic original image data.
  • the high dynamic range image processing method further includes: receiving the color original image data and the full-color original image data; and temporarily storing one or more of the color original image data, the full-color original image data, the color original image, the full-color original image, the color intermediate image, the full-color intermediate image, and the target image.
  • the image preprocessing includes pixel addition processing and demosaicing processing, and performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel addition processing on the color original image data to obtain the color original image, and performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image; or, the image preprocessing includes pixel averaging processing and demosaicing processing, and performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel averaging processing on the color original image data to obtain the color original image, and performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments.
  • the lens 300 and the high dynamic range image processing system 100 are combined with the housing 200.
  • the lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
  • the present application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
  • when the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the foregoing embodiments.
  • FIG. 2 is a schematic diagram of the image sensor 10 in the embodiment of the present application.
  • the image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14 and a horizontal driving unit 15.
  • the image sensor 10 may adopt a complementary metal oxide semiconductor (CMOS, Complementary Metal Oxide Semiconductor) photosensitive element or a charge-coupled device (CCD, Charge-coupled Device) photosensitive element.
  • the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (ie, arranged in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4) .
  • Each photosensitive pixel 110 converts light into electric charge according to the intensity of light incident thereon.
  • the vertical driving unit 12 includes a shift register and an address decoder.
  • the vertical drive unit 12 includes readout scanning and reset scanning functions.
  • the readout scan refers to sequentially scanning the unit photosensitive pixels 110 line by line, and reading signals from these unit photosensitive pixels 110 line by line.
  • the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14.
  • the reset scan is used to reset the charge, and the photocharge of the photoelectric conversion element is discarded, so that the accumulation of new photocharge can be started.
  • the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing.
  • the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated.
  • the signals of the photosensitive pixels 110 in a row are obtained.
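A tiny numeric illustration of correlated double sampling for one row: the column circuit differences the reset level and the signal level of each pixel, which removes fixed offsets. The numbers below are made up for illustration.

```python
import numpy as np

# Correlated double sampling for one selected row: subtract the signal level
# from the reset level to recover the photo-signal and cancel fixed offsets.
reset_level = np.array([512.0, 515.0, 509.0, 511.0])    # levels read right after reset
signal_level = np.array([300.0, 420.0, 505.0, 180.0])   # levels read after charge transfer
pixel_signal = reset_level - signal_level                # CDS output for the row
print(pixel_signal)
```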
  • the column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
  • the horizontal driving unit 15 includes a shift register and an address decoder.
  • the horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14, and is sequentially output.
  • the control unit 13 configures timing signals according to the operation mode, and uses a variety of timing signals to control the vertical driving unit 12, the column processing unit 14, and the horizontal driving unit 15 to work together.
  • FIG. 3 is a schematic diagram of a photosensitive pixel 110 in an embodiment of the present application.
  • the photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a micro lens 113. Along the light-receiving direction of the photosensitive pixel 110, the microlens 113, the filter 112, and the pixel circuit 111 are arranged in sequence.
  • the microlens 113 is used for condensing light
  • the filter 112 is used for passing light of a certain waveband and filtering out the light of other wavebands.
  • the pixel circuit 111 is used to convert the received light into electrical signals, and provide the generated electrical signals to the column processing unit 14 shown in FIG. 2.
  • FIG. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 in an embodiment of the present application.
  • the pixel circuit 111 in FIG. 4 can be applied to each photosensitive pixel 110 (shown in FIG. 3) in the pixel array 11 shown in FIG. 2.
  • the working principle of the pixel circuit 111 will be described below with reference to FIGS. 2 to 4.
  • the pixel circuit 111 includes a photoelectric conversion element 1111 (for example, a photodiode), an exposure control circuit (for example, a transfer transistor 1112), a reset circuit (for example, a reset transistor 1113), an amplification circuit (for example, an amplification transistor 1114), and a selection circuit (for example, a selection transistor 1115).
  • the transfer transistor 1112, the reset transistor 1113, the amplifying transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.
  • the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected to the ground, for example.
  • the photodiode converts the received light into electric charge.
  • the cathode of the photodiode is connected to the floating diffusion unit FD via an exposure control circuit (for example, a transfer transistor 1112).
  • the floating diffusion unit FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
  • the exposure control circuit is a transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112.
  • when a pulse of an active level (for example, a VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on.
  • the transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.
  • the drain of the reset transistor 1113 is connected to the pixel power supply VPIX.
  • the source of the reset transistor 1113 is connected to the floating diffusion unit FD.
  • when a pulse of an effective reset level is transmitted to the gate of the reset transistor 1113 via the reset line, the reset transistor 1113 is turned on.
  • the reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
  • the gate of the amplifying transistor 1114 is connected to the floating diffusion unit FD.
  • the drain of the amplifying transistor 1114 is connected to the pixel power supply VPIX.
  • after the floating diffusion unit FD is reset by the reset transistor 1113, the amplifying transistor 1114 outputs the reset level through the output terminal OUT via the selection transistor 1115; after the charge of the photodiode is transferred by the transfer transistor 1112, the amplifying transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
  • the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114.
  • the source of the selection transistor 1115 is connected to the column processing unit 14 in FIG. 2 through the output terminal OUT.
  • when a pulse of an effective level is transmitted to the gate of the selection transistor 1115 through the selection line, the selection transistor 1115 is turned on.
  • the signal output by the amplifying transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.
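The following toy model walks through the 4T readout sequence described above: reset the floating diffusion, read the reset level, transfer the photo-charge, read the signal level. The supply voltage and conversion gain are illustrative assumptions, not device values from the patent.

```python
# Toy model of one pixel's readout sequence; the column CDS shown earlier
# differences the two levels returned here. All numbers are assumed.
VPIX = 2.8                 # pixel supply voltage (assumed)
CONVERSION_GAIN = 50e-6    # volts per electron (assumed)

def read_pixel(photo_electrons):
    fd = VPIX                                   # reset transistor resets FD to VPIX
    reset_level = fd                            # amplifier + selection output the reset level
    fd -= photo_electrons * CONVERSION_GAIN     # transfer transistor moves charge onto FD
    signal_level = fd                           # amplifier + selection output the signal level
    return reset_level, signal_level

r, s = read_pixel(photo_electrons=8000)
print(round(r - s, 3))   # the photo-signal recovered by correlated double sampling
```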
  • the pixel structure of the pixel circuit 111 in the embodiment of the present application is not limited to the structure shown in FIG. 4.
  • the pixel circuit 111 may also have a three-transistor pixel structure, in which the functions of the amplifying transistor 1114 and the selecting transistor 1115 are performed by one transistor.
  • the exposure control circuit is not limited to the way of a single transfer transistor 1112, and other electronic devices or structures with the function of controlling the conduction of the control terminal can be used as the exposure control circuit in the embodiment of the present application.
  • however, the implementation using the transfer transistor 1112 is simple, low in cost, and easy to control.
  • FIGS. 5 to 10 are schematic diagrams of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the pixel array 11 (shown in FIG. 2) according to some embodiments of the present application.
  • the photosensitive pixels 110 include two types, one is a full-color photosensitive pixel W, and the other is a color photosensitive pixel.
  • FIGS. 5 to 10 only show the arrangement of a plurality of photosensitive pixels 110 in a minimum repeating unit. The smallest repeating unit shown in FIGS. 5 to 10 is copied multiple times in rows and columns to form the pixel array 11. Each minimum repeating unit is composed of multiple full-color photosensitive pixels W and multiple color photosensitive pixels. Each minimum repeating unit includes multiple subunits.
  • Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels W.
  • the full-color photosensitive pixel W and the color photosensitive pixel in each sub-unit are alternately arranged.
  • multiple photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category; or, multiple photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • FIG. 5 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit of an embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110
  • the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • a first type subunit UA and a third type subunit UC are arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner in FIG. 5), and two second type subunits UB are arranged In the second diagonal direction D2 (for example, the direction where the upper right corner and the lower left corner are connected in FIG. 5).
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • first diagonal direction D1 may also be a direction connecting the upper right corner and the lower left corner
  • second diagonal direction D2 may also be a direction connecting the upper left corner and the lower right corner
  • the "direction" here is not a single direction, but can be understood as the concept of a "straight line" indicating the arrangement, and there may be two-way directions at both ends of the straight line.
  • the explanation of the first diagonal direction D1 and the second diagonal direction D2 in FIGS. 6 to 10 is the same as here.
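Since the arrangement matrices themselves are not reproduced in this text, the sketch below builds one 4x4 minimal repeating unit consistent with the FIG. 5 description (UA and UC on the first diagonal, two UB on the second diagonal, W alternating with the single-color pixel inside each 2x2 subunit) and tiles it to form a pixel array. The exact row order inside each subunit is an assumption for illustration.

```python
import numpy as np

# One 4x4 minimal repeating unit consistent with the FIG. 5 description;
# the interior ordering of W and the colour pixel is an assumption.
UA = np.array([["W", "A"], ["A", "W"]])   # first type subunit (W + first colour A)
UB = np.array([["W", "B"], ["B", "W"]])   # second type subunit (W + second colour B)
UC = np.array([["W", "C"], ["C", "W"]])   # third type subunit (W + third colour C)

minimal_unit = np.block([[UA, UB],
                         [UB, UC]])       # UA/UC on the first diagonal, UB on the second
pixel_array = np.tile(minimal_unit, (2, 2))   # copy in rows and columns to form the array
print(minimal_unit)
print(pixel_array.shape)   # (8, 8)
```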
  • FIG. 6 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in a minimum repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 36 photosensitive pixels 110 in 6 rows and 6 columns, and the sub-units are 9 photosensitive pixels 110 in 3 rows and 3 columns.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 7 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the minimum repeating unit is 8 rows and 8 columns and 64 photosensitive pixels 110
  • the sub-units are 4 rows and 4 columns and 16 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • full-color photosensitive pixels W and single-color photosensitive pixels are alternately arranged.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repetition unit includes four subunits, which are a first type subunit UA, two second type subunits UB, and a third type subunit UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 8 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 8 is roughly the same as the arrangement of the photosensitive pixels 110 in the smallest repeating unit shown in FIG. 5.
  • the difference is that the alternating sequence of full-color photosensitive pixels W and single-color photosensitive pixels in the second type subunit UB in the lower left corner of FIG. 8 is inconsistent with the alternating sequence of full-color photosensitive pixels W and single-color photosensitive pixels in the second type subunit UB in the lower left corner of FIG. 5, and the alternating sequence of full-color photosensitive pixels W and single-color photosensitive pixels in the third type subunit UC in the lower right corner of FIG. 8 is also inconsistent with that in the third type subunit UC in the lower right corner of FIG. 5.
  • specifically, in the second type subunit UB in the lower left corner of FIG. 5, the alternating sequence of the photosensitive pixels 110 in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, second-color photosensitive pixel B), and the alternating sequence of the photosensitive pixels 110 in the second row is single-color photosensitive pixel (ie, second-color photosensitive pixel B), full-color photosensitive pixel W; while in the second type subunit UB in the lower left corner of FIG. 8, the alternating sequence of the photosensitive pixels 110 in the first row is single-color photosensitive pixel (ie, second-color photosensitive pixel B), full-color photosensitive pixel W, and the alternating sequence of the photosensitive pixels 110 in the second row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, second-color photosensitive pixel B).
  • likewise, in the third type subunit UC in the lower right corner of FIG. 5, the alternating sequence of the photosensitive pixels 110 in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, third-color photosensitive pixel C), and the alternating sequence of the photosensitive pixels 110 in the second row is single-color photosensitive pixel (ie, third-color photosensitive pixel C), full-color photosensitive pixel W; while in the third type subunit UC in the lower right corner of FIG. 8, the alternating sequence of the photosensitive pixels 110 in the first row is single-color photosensitive pixel (ie, third-color photosensitive pixel C), full-color photosensitive pixel W, and the alternating sequence of the photosensitive pixels 110 in the second row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, third-color photosensitive pixel C).
  • in addition, within FIG. 8 itself, the alternating sequence of full-color photosensitive pixels W and single-color photosensitive pixels in different subunits is not consistent. For example, in the first type subunit UA shown in FIG. 8, the alternating sequence of the photosensitive pixels 110 in the first row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, first-color photosensitive pixel A), and the alternating sequence of the photosensitive pixels 110 in the second row is single-color photosensitive pixel (ie, first-color photosensitive pixel A), full-color photosensitive pixel W; while in the third type subunit UC shown in FIG. 8, the alternating sequence of the photosensitive pixels 110 in the first row is single-color photosensitive pixel (ie, third-color photosensitive pixel C), full-color photosensitive pixel W, and the alternating sequence of the photosensitive pixels 110 in the second row is full-color photosensitive pixel W, single-color photosensitive pixel (ie, third-color photosensitive pixel C).
  • that is to say, in the same minimum repeating unit, the alternating sequence of full-color photosensitive pixels W and color photosensitive pixels in different subunits can be consistent (as shown in FIG. 5) or inconsistent (as shown in FIG. 8).
  • FIG. 9 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • a plurality of photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all third-color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repetition unit includes four subunits, which are a first type subunit UA, two second type subunits UB, and a third type subunit UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • FIG. 10 is a schematic diagram of the arrangement of photosensitive pixels 110 (shown in FIG. 3) in the smallest repeating unit according to another embodiment of the application.
  • the smallest repeating unit is 4 rows, 4 columns and 16 photosensitive pixels 110, and the sub-units are 2 rows, 2 columns and 4 photosensitive pixels 110.
  • the arrangement method is:
  • W represents the full-color photosensitive pixel
  • A represents the first color photosensitive pixel among the multiple color photosensitive pixels
  • B represents the second color photosensitive pixel among the multiple color photosensitive pixels
  • C represents the third color photosensitive pixel among the multiple color photosensitive pixels.
  • a plurality of photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
  • the photosensitive pixels 110 of the same category include: (1) all full-color photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; (4) all third-color photosensitive pixels C.
  • the categories of subunits include three categories.
  • the first type subunit UA includes a plurality of full-color photosensitive pixels W and a plurality of first color photosensitive pixels A
  • the second type of subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B
  • the third type of subunit UC includes a plurality of full-color photosensitive pixels W and a plurality of third-color photosensitive pixels C.
  • Each minimum repeating unit includes four subunits, which are one subunit of the first type UA, two subunits of the second type UB, and one subunit of the third type UC.
  • one first type subunit UA and one third type subunit UC are arranged in the first diagonal direction D1
  • two second type subunits UB are arranged in the second diagonal direction D2.
  • the first diagonal direction D1 is different from the second diagonal direction D2.
  • the first diagonal line and the second diagonal line are perpendicular.
  • multiple photosensitive pixels 110 in the same row in some subunits may be photosensitive pixels 110 of the same category, while multiple photosensitive pixels 110 in the same column in the remaining subunits are photosensitive pixels 110 of the same category.
  • the first color photosensitive pixel A may be a red photosensitive pixel R; the second color photosensitive pixel B may be a green photosensitive pixel G; and the third color photosensitive pixel C may be Blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a red photosensitive pixel R; the second color photosensitive pixel B may be a yellow photosensitive pixel Y; and the third color photosensitive pixel C may be Blue photosensitive pixel Bu.
  • the first color photosensitive pixel A may be a magenta photosensitive pixel M; the second color photosensitive pixel B may be a cyan photosensitive pixel Cy; and the third color photosensitive pixel C may be a yellow photosensitive pixel Y.
  • the response band of the full-color photosensitive pixel W may be the visible light band (for example, 400 nm-760 nm).
  • the full-color photosensitive pixel W is provided with an infrared filter to filter out infrared light.
  • the response wavelength bands of the full-color photosensitive pixel W may be the visible light and near-infrared wavelength bands (for example, 400 nm-1000 nm), matching the response band of the photoelectric conversion element 1111 (shown in FIG. 4) in the image sensor 10 (shown in FIG. 1).
  • in this case, the full-color photosensitive pixel W may not be provided with a filter, or may be provided with a filter that passes light of all wavelength bands.
  • the response band of the full-color photosensitive pixel W is then determined by the response band of the photoelectric conversion element 1111, that is, the two match.
  • the embodiments of the present application include, but are not limited to, the above-mentioned waveband range.
  • the control unit 13 controls the pixel array 11 to expose.
  • the pixel array 11 is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time; wherein the first exposure time is not equal to the second exposure time.
  • the image processor 20 can control the pixel array 11 to perform two exposures.
  • in the first exposure, the pixel array 11 is exposed for the first exposure time L to obtain the first original image.
  • the first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time L and first full-color original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time L.
  • the pixel array 11 is exposed for the second exposure time S to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time S and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time S.
  • the pixel array 11 may also be exposed for a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time.
  • the third exposure time is not equal to the first exposure time
  • the third exposure time is not equal to the second exposure time.
  • the image processor 20 and the high dynamic fusion unit 50 (which may include a color high dynamic fusion unit 30 and a panchromatic high dynamic fusion unit 40) are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image.
  • the image processor 20 can control the pixel array 11 to perform three exposures.
  • the first original image, the second original image, and the third original image are obtained respectively.
  • the first original image includes first color original image data generated by a single-color photosensitive pixel exposed at the first exposure time L and first full-color original image data generated by a panchromatic photosensitive pixel exposed at the first exposure time L.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time M and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time M.
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time S and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time S.
  • the image processor 20 may also control the pixel array 11 to perform more exposures such as four, five, six, ten, or twenty times, so as to obtain more original images.
  • the image processor 20, the color high dynamic fusion unit 30, and the panchromatic high dynamic fusion unit 40 then perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on all original images to obtain the target image.
  • the exposure process of the pixel array 11 may be: (1) the pixel array 11 is exposed sequentially with at least two exposure times (for example, the first exposure time L and the second exposure time S, or the first exposure time L, the second exposure time M, and the third exposure time S; the order of exposure for different exposure times is not limited), and the exposure periods of the at least two exposures do not overlap on the time axis; (2) the pixel array 11 is exposed with at least two exposure times (for example, the first exposure time L and the second exposure time S, or the first exposure time L, the second exposure time M, and the third exposure time S; the order of exposure for different exposure times is not limited), and the exposure periods of the at least two exposures partially overlap on the time axis; (3) the exposure periods of all the shorter exposure times fall within the exposure period of the longest exposure time; for example, the exposure period of the second exposure time S is within the exposure period of the first exposure time L; for another example, the exposure periods of the second exposure time M and the third exposure time S are both within the exposure period of the first exposure time L.
  • the high dynamic range processing system 100 of the embodiment of the present application can adopt the (3) or (4) exposure mode. Using such an exposure mode can shorten the overall exposure time required by the pixel array 11 in one shot, which is beneficial to improving the image frame rate, and is also beneficial to minimizing the interval between the exposure times of the at least two exposures, so that the exposure moments of the multiple frames of images are closer, thereby improving the image quality of a high dynamic range image fused from multiple images with different exposure times.
  • for exposure modes in which the exposure periods of the at least two exposures overlap (for example, the above-mentioned (2), (3), and (4) exposure modes), as shown in FIG. 14, a buffer processor 16 may be provided in the image sensor 10, and the buffer processor 16 cooperates with the control unit and the pixel array 11 to work.
  • the image sensor 10 controls the pixel array 11 to perform three exposures, which are respectively a first exposure time of 1s, a second exposure time of 1/8s, and a third exposure time of 1/64s.
  • the control unit of the image sensor 10 controls the pixel array 11 to output exposure image data with an exposure duration of 1/512 s every 1/512 s, and the exposure image data is stored in the buffer processor 16.
  • after receiving the exposure image data, the buffer processor 16 stores it in the buffer memory area inside the buffer processor 16. After a shot starts, once 8 pieces of exposure image data have been received, the accumulated 8 pieces of exposure image data are added together and transmitted to the image sensor 10 as the third original image; once 64 pieces of exposure image data have been received in total, the accumulated 64 pieces of exposure image data are added together and transmitted to the image sensor 10 as the second original image; once 512 pieces of exposure image data have been received in total, the accumulated 512 pieces of exposure image data are added together and transmitted to the image sensor 10 as the first original image; and after 512 pieces of exposure image data have been received in total, the image sensor 10 controls the exposure of this shot to end.
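  • As a concrete illustration of this slice-accumulation logic, the following Python sketch (not part of the patent; the function name, array shapes, and variable names are assumptions chosen for illustration) accumulates 1/512 s exposure slices and snapshots the running sum after 8, 64, and 512 slices, standing in for the third, second, and first original images respectively.

```python
import numpy as np

def accumulate_exposures(slices):
    """Illustrative sketch of the buffer processor 16 logic (an assumption, not
    the patent's implementation): `slices` yields 512 frames, each exposed for
    1/512 s; running sums after 8, 64 and 512 slices stand in for the third
    (1/64 s), second (1/8 s) and first (1 s) original images."""
    running_sum = None
    first = second = third = None
    for count, frame in enumerate(slices, start=1):
        frame = frame.astype(np.float64)
        running_sum = frame if running_sum is None else running_sum + frame
        if count == 8:        # 8 x 1/512 s = 1/64 s  -> third original image
            third = running_sum.copy()
        elif count == 64:     # 64 x 1/512 s = 1/8 s  -> second original image
            second = running_sum.copy()
        elif count == 512:    # 512 x 1/512 s = 1 s   -> first original image
            first = running_sum.copy()
    return first, second, third

# Example: random 4x4 readouts stand in for the per-slice exposure image data.
rng = np.random.default_rng(0)
first, second, third = accumulate_exposures(rng.integers(0, 4, size=(512, 4, 4)))
```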
  • the buffer processor 16 is arranged to cooperate with the control unit and the pixel array 11, so that the (2), (3), and (4) exposure modes involving at least two exposures in the embodiment of the present application can be completed with a simple device and simple working logic, which helps improve the reliability of the system, and at the same time helps shorten the exposure time required by the pixel array 11 in one shot and improve the image frame rate.
  • the image processor 20 may include a color preprocessing module 2023, a full color preprocessing module 2024, a color processing module 2021, a full color processing module 2022, and a fusion module 204.
  • Image preprocessing can include pixel completion processing and demosaicing processing.
  • Image processing includes first image processing and second image processing.
  • the color preprocessing module 2023 can be used to perform pixel complement processing on the color original image data to obtain the color original image.
  • the panchromatic preprocessing module 2024 can be used to demosaicate the panchromatic original image data to obtain the panchromatic original image.
  • the color processing module 2021 may be used to perform first image processing on the color original image to obtain a color intermediate image.
  • the panchromatic processing module 2022 may be used to perform second image processing on the panchromatic original image to obtain a panchromatic intermediate image.
  • the fusion module 204 may be used to perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image.
  • the image processor 20 further includes an image front-end processing unit 202.
  • the color preprocessing module 2023, the full color preprocessing module 2024, the color processing module 2021, and the full color processing module 2022 may be integrated in the image front-end processing unit 202.
  • the specific operation process of the panchromatic preprocessing module 2024 demosaicing the panchromatic original image data is similar to the specific operation process of the demosaicing processing performed on the first color original image and the second color original image in the following embodiments of the present application, and will be described in detail below.
  • the specific operation process of the color preprocessing module 2023 performing pixel complement processing on the color original image data may include the following steps: (1) decompose the color original image data into the first color original image data (the original image data generated by the first color photosensitive pixels A described above), the second color original image data (the original image data generated by the second color photosensitive pixels B described above), and the third color original image data (the original image data generated by the third color photosensitive pixels C described above).
  • the resulting first color original image, second color original image, and third color original image, each with one color channel, are then synthesized into a color original image that has three color channels and the same resolution as the color original image data.
  • the color preprocessing module 2023 can perform the pixel complement processing of the above steps on all color original image data corresponding to the at least two exposures, thereby completing the pixel complement processing of all color original image data and obtaining the color original images corresponding to the at least two exposures.
  • Specifically, referring to FIG. 15, FIG. 16, and FIG. , the following takes the color preprocessing module 2023 performing pixel complement processing on the first red original image data in the first color original image data as an example for description.
  • As shown in FIG. 15, the color preprocessing module 2023 first decomposes the color original image data (which can be the first color original image data, the second color original image data, the third color original image data, etc.) into red original image data, green original image data, and blue original image data. As shown in FIG. , the red intermediate image data is obtained.
  • the color preprocessing module 2023 uses a bilinear interpolation method to interpolate the red intermediate image data to obtain red interpolated image data.
  • the color preprocessing module 2023 fuses the red interpolated image data and the red original image data to obtain a red original image.
  • In the fusion process, the color preprocessing module 2023 first generates a null image with the same resolution and the same pixel color arrangement in the minimum repeating unit as the red original image data and the red interpolated image data, and then performs the fusion according to the following principles: (1) if the first red original image data has a pixel value at the same coordinate and the color channel is the same, the pixel value at that coordinate of the first red original image data is filled directly into the null image; (2) if the first red original image data has a pixel value at the same coordinate but the color channel is different, the pixel value at the corresponding coordinate of the first red interpolated image data is filled into the null image; (3) if the first red original image data has no pixel value at the same coordinate, the pixel value at the corresponding coordinate of the first red interpolated image data is filled into the null image. A minimal sketch of these fill rules is given below.
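  • The following sketch illustrates the three fill principles (an illustration only; the array-and-mask representation and the function name are assumptions, not the patent's data layout).

```python
import numpy as np

def complete_red_channel(red_raw, has_value, same_channel, red_interp):
    """Fill a null image following the three principles above.
    red_raw      : values recorded in the first red original image data
    has_value    : True where the grid position holds any recorded pixel value
    same_channel : True where that recorded value belongs to the red channel
    red_interp   : red values obtained beforehand by bilinear interpolation
    """
    out = np.zeros_like(red_interp, dtype=np.float64)   # the "null image"
    use_raw = has_value & same_channel                   # principle (1)
    out[use_raw] = red_raw[use_raw]
    out[~use_raw] = red_interp[~use_raw]                 # principles (2) and (3)
    return out
```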
  • In this way, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and the resulting red original image, green original image, and blue original image, each with one color channel, are synthesized into a color original image with three color channels.
  • the color preprocessing module 2023 can perform the pixel complement processing of the above steps on the first color original image data and the second color original image data (or the first color original image data, the second color original image data, and the third color original image data), thereby completing the pixel complement processing of the color original image data and obtaining the first color original image and the second color original image (or the first color original image, the second color original image, and the third color original image).
  • the high dynamic range image processing system 100 of the embodiment of the present application performs pixel complement processing on color original image data in which some pixel grids are missing color information and the pixel grids that do carry color information contain only single-color-channel information, so that complete-channel color information is obtained for every pixel grid while a high resolution is maintained; a color original image is thereby obtained, so that other image processing can subsequently be performed on the image to improve the imaging quality.
  • the high dynamic fusion unit 50 may fuse the target images corresponding to the at least two exposures (which may include the first target image and the second target image) to obtain a highly dynamic target image.
  • the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40.
  • the color high dynamic fusion unit 30 may fuse the color original image data corresponding to the at least two exposures to obtain high dynamic color original image data.
  • the panchromatic high dynamic fusion unit 40 is used to fuse the panchromatic original image data corresponding to the at least two exposures to obtain high dynamic panchromatic original image data.
  • the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40.
  • the color high dynamic fusion unit 30 may fuse the color original images corresponding to at least two exposures to obtain a high dynamic color original image.
  • the panchromatic high dynamic fusion unit 40 may fuse the panchromatic original images corresponding to the at least two exposures to obtain a highly dynamic panchromatic original image.
  • the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40.
  • the color high dynamic fusion unit 30 may fuse the color intermediate images corresponding to at least two exposures to obtain a high dynamic color intermediate image
  • the panchromatic high dynamic fusion unit 40 can merge panchromatic intermediate images corresponding to at least two exposures to obtain a highly dynamic panchromatic intermediate image.
  • the first image processing may include one or more of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing.
  • the second image processing may include one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.
  • the first image processing may include a first image sub-processing and a second image sub-processing.
  • the color processing module 2021 may first perform the first image sub-processing on the color original image, and then perform the second image sub-processing.
  • the first image sub-processing may include one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing.
  • the second image sub-processing may include one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
  • the process of black level correction processing may be that the color processing module 2021 or the panchromatic processing module 2022 subtracts a fixed value from each pixel value on the basis of the original image data output by the image sensor 10.
  • the fixed value corresponding to each color channel (such as the red channel, the green channel, the blue channel, and the panchromatic channel) may be the same or different.
  • the red channel refers to the red information generated by the red photosensitive pixels in the image output by the image sensor 10;
  • the green channel refers to the green information generated by the green photosensitive pixels in the image output by the image sensor 10;
  • the blue channel refers to the blue information generated by the blue photosensitive pixels in the image output by the image sensor 10;
  • and the panchromatic channel refers to the panchromatic information generated by the panchromatic photosensitive pixels in the image output by the image sensor 10.
  • the following description takes, as an example, the image sensor 10 controlling the pixel array 11 to perform two exposures (the number of exposures may also be more than two).
  • the image sensor 10 can output the first color original image data, the second color original image data, the first full-color original image data, and the second full-color original image data.
  • the image processor 20 receives the first color original image data, the second color original image data, the first full-color original image data, and the second full-color original image data.
  • the color preprocessing module 2023 performs pixel complement processing on the first color original image data and the second color original image data to obtain the first color original image and the second color original image, and the color processing module 2021 then performs the black level correction processing in the first image processing on the first color original image and the second color original image; the panchromatic preprocessing module 2024 performs demosaicing processing on the first panchromatic original image data and the second panchromatic original image data to obtain the first panchromatic original image and the second panchromatic original image.
  • the full-color processing module 2022 performs black level correction processing in the second image processing on the first full-color original image and the second full-color original image.
  • the first color original image has a red channel, a green channel, and a blue channel. Please refer to FIG. 21.
  • when the color processing module 2021 performs black level correction processing on the first color original image, a fixed value of 5 is subtracted from all pixel values in the first color original image, thereby obtaining the first color original image that has undergone black level correction processing.
  • this is because the image sensor 10 adds a fixed offset of 5 (or another value) before the analog-to-digital (AD) conversion input, so that the output pixel value lies between 5 (or another value) and 255.
  • through the black level correction processing, the details of the dark parts of the image obtained by the image sensor 10 and the high dynamic range image processing system 100 of the embodiment of the present application are completely preserved while the pixel values of the image are not artificially increased or decreased, which is beneficial to improving the imaging quality.
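  • A minimal sketch of per-channel black level correction is shown below (the channel layout, offset values, and 8-bit clipping are illustrative assumptions).

```python
import numpy as np

def black_level_correction(planes, offsets):
    """Subtract a per-channel fixed value from the raw data, as described above."""
    corrected = {}
    for channel, data in planes.items():
        offset = offsets.get(channel, 0)
        corrected[channel] = np.clip(data.astype(np.int32) - offset, 0, 255)
    return corrected

# Example: the same fixed value of 5 for every channel, matching the text above.
planes = {"R": np.full((4, 4), 80), "G": np.full((4, 4), 90),
          "B": np.full((4, 4), 70), "W": np.full((4, 4), 100)}
corrected = black_level_correction(planes, {"R": 5, "G": 5, "B": 5, "W": 5})
```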
  • Lens shading is the phenomenon in which a shadow appears around the periphery of the image due to uneven optical refraction by the lens, that is, the intensity of the light received in the center of the image area is inconsistent with that received in the surrounding area.
  • the process of lens shading correction processing can be that, on the basis of the color original image and the panchromatic original image that have undergone black level correction processing, the color processing module 2021 or the panchromatic processing module 2022 divides the image to be processed into grids, and then performs lens shading correction on the image by a bilinear interpolation method using the compensation coefficients of each grid area itself and its adjacent grid areas.
  • the following takes the lens shading correction processing on the first color original image as an example for description. As shown in FIG. 22, the color processing module 2021 divides the first color original image (that is, the image to be processed) into sixteen equal grids, each of which has a preset compensation coefficient. Then, the color processing module 2021 performs shading correction on the image by a bilinear interpolation method according to the compensation coefficients of each grid area itself and its adjacent grid areas.
  • R2 is the pixel value in the dashed frame in the first color intermediate image that has undergone lens shading correction processing, and R1 is the pixel value in the dashed frame in the first color original image shown in the figure, where R2 = R1 × k1, and k1 is obtained by bilinear interpolation of the compensation coefficients 1.10, 1.04, 1.105, and 1.09 of the grids adjacent to the pixel R1.
  • the coordinates of the image are (x, y), where x is counted from the first pixel on the left toward the right, y is counted from the first pixel on the top downward, and both x and y are natural numbers, as indicated by the markings on the edges of the image.
  • the coordinates of R1 are (3,3)
  • the coordinates of R1 in each grid compensation coefficient map should be (0.75,0.75).
  • f(x, y) represents the compensation value of the coordinate (x, y) in each grid compensation coefficient graph.
  • f(0.75, 0.75) is the compensation coefficient value corresponding to R1 in each grid compensation coefficient graph.
  • the compensation coefficient of each grid has been preset before the color processing module 2021 or the panchromatic processing module 2022 performs lens shading correction processing.
  • the compensation coefficient of each grid can be determined by the following method: (1) place the lens 300 in a closed device with constant and uniform light intensity and color temperature, and make the lens 300 shoot a pure gray target object with uniform brightness distribution in the closed device to obtain a grayscale image; (2) divide the grayscale image into grids (for example, into 16 grids) to obtain a grayscale image divided into different grid areas; (3) calculate the compensation coefficients of the different grid areas of the grayscale image.
  • after the compensation coefficients are obtained in this way, the high dynamic range image processing system 100 of the present application presets the compensation coefficients in the color processing module 2021 or the panchromatic processing module 2022, and the color processing module 2021 or the panchromatic processing module 2022 then performs lens shading correction processing on the image by the bilinear interpolation method according to the compensation coefficient of each grid area.
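  • The following sketch illustrates gain-map lens shading correction by bilinear interpolation of per-grid compensation coefficients (the coordinate mapping, example gain values, and edge handling are assumptions; the patent's exact interpolation formula is not reproduced here).

```python
import numpy as np

def lens_shading_correction(image, grid_gain):
    """Multiply each pixel R1 by a gain k1 interpolated from the compensation
    coefficients of the surrounding grid areas, i.e. R2 = R1 * k1."""
    h, w = image.shape
    gh, gw = grid_gain.shape
    ys = np.arange(h) * gh / h          # pixel (3, 3) of a 16-pixel-wide image
    xs = np.arange(w) * gw / w          # with a 4x4 grid maps to (0.75, 0.75)
    y0 = np.minimum(np.floor(ys).astype(int), gh - 2)
    x0 = np.minimum(np.floor(xs).astype(int), gw - 2)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    g00 = grid_gain[y0][:, x0]          # top-left grid coefficients
    g01 = grid_gain[y0][:, x0 + 1]      # top-right
    g10 = grid_gain[y0 + 1][:, x0]      # bottom-left
    g11 = grid_gain[y0 + 1][:, x0 + 1]  # bottom-right
    k = (g00 * (1 - wy) * (1 - wx) + g01 * (1 - wy) * wx +
         g10 * wy * (1 - wx) + g11 * wy * wx)
    return image.astype(np.float64) * k

# Example: a 16x16 plane corrected with a 4x4 grid of compensation coefficients.
gains = np.array([[1.10, 1.05, 1.05, 1.10],
                  [1.04, 1.00, 1.00, 1.04],
                  [1.04, 1.00, 1.00, 1.04],
                  [1.09, 1.05, 1.05, 1.09]])
corrected = lens_shading_correction(np.full((16, 16), 100.0), gains)
```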
  • the photosensitive pixels on the pixel array of the image sensor may have process defects, or errors may occur in the process of converting optical signals into electrical signals, resulting in incorrect pixel information and inaccurate pixel values in the image; the defective pixels that appear in the output image are image dead pixels.
  • Image dead pixels may exist, so the image needs to be compensated for dead pixels.
  • the dead pixel compensation processing may include the following steps: (1) establish a 3×3 pixel matrix of photosensitive pixels of the same color with the pixel to be detected as the center pixel; (2) taking the surrounding pixels of the center pixel as reference points, judge whether the difference between the color value of the center pixel and that of the surrounding pixels is greater than the first threshold; if it is, the center pixel is a dead pixel, and if not, the center pixel is a normal pixel; (3) perform bilinear interpolation on the center pixel judged to be a dead pixel to obtain the corrected pixel value. A minimal sketch of these steps is given after this paragraph.
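  • The following sketch implements these three steps for a plane that contains only same-color pixels (the "differs from every surrounding pixel by more than the threshold" decision rule and the four-neighbour interpolation are assumptions about details the text leaves open).

```python
import numpy as np

def dead_pixel_compensation(plane, threshold):
    """Detect and correct dead pixels inside a same-colour plane."""
    src = plane.astype(np.float64)
    out = src.copy()
    h, w = src.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            centre = src[y, x]
            neighbours = src[y - 1:y + 2, x - 1:x + 2].flatten()
            neighbours = np.delete(neighbours, 4)        # drop the centre itself
            # Step (2): treat the centre as a dead pixel if it differs from
            # every surrounding pixel by more than the first threshold.
            if np.all(np.abs(centre - neighbours) > threshold):
                # Step (3): bilinear interpolation from the four nearest
                # same-colour neighbours gives the corrected pixel value.
                out[y, x] = (src[y - 1, x] + src[y + 1, x] +
                             src[y, x - 1] + src[y, x + 1]) / 4.0
    return out
```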
  • Referring to FIG. 23, the following describes the dead pixel compensation processing performed by the color processing module 2021 on the first color original image.
  • R1 is the pixel to be detected.
  • the color processing module 2021 takes R1 as the center pixel and establishes a 3×3 pixel matrix of photosensitive pixels of the same color as R1, obtaining the second image in FIG. 23.
  • taking the surrounding pixels of the center pixel R1 as reference points, it is judged whether the difference between the color value of the center pixel R1 and that of the surrounding pixels is greater than the first threshold Q (Q is preset in the color processing module 2021).
  • if it is, the center pixel R1 is a dead pixel, and if not, the center pixel R1 is a normal pixel. If R1 is a dead pixel, bilinear interpolation is performed on R1 to obtain the corrected pixel value R1' (the case where R1 is a dead pixel is shown in the figure), obtaining the third image in FIG. 23.
  • Referring to FIG. 24, the following describes the dead pixel compensation processing performed by the panchromatic processing module 2022 on the first panchromatic original image that has undergone lens shading correction processing.
  • W1 in the first picture in Fig. 24 is the pixel to be detected.
  • the panchromatic processing module 2022 takes W1 as the center pixel and establishes a 3×3 pixel matrix of photosensitive pixels of the same color as W1, obtaining the second image in FIG. 24. Taking the surrounding pixels of the center pixel W1 as reference points, it is judged whether the difference between the color value of the center pixel W1 and that of the surrounding pixels is greater than the first threshold K (K is preset in the panchromatic processing module 2022). If it is, the center pixel W1 is a dead pixel, and if not, the center pixel W1 is a normal pixel.
  • if W1 is a dead pixel, bilinear interpolation is performed on W1 to obtain the corrected pixel value W1' (the W1' shown in the figure corresponds to the case where W1 is a dead pixel).
  • the color processing module 2021 and the panchromatic processing module 2022 of the embodiment of the present application can perform dead pixel compensation processing on the image, which is beneficial for the high dynamic range image processing system 100 to eliminate, during its imaging process, image defects caused by process defects of the photosensitive pixels or by errors in the process of converting optical signals into electrical signals, thereby improving the accuracy of the pixel values of the target image formed by the high dynamic range image processing system 100 and enabling the embodiments of the present application to achieve a better imaging effect.
  • since each pixel grid of the color original image (such as the first color original image and the second color original image) of the embodiment of the present application is a single-color pixel and contains no optical information of other colors, it is necessary to perform demosaicing processing on the first color original image and the second color original image.
  • the full-color preprocessing module 2024 can also perform demosaic processing on the full-color original image data to obtain a full-color original image.
  • the following takes the color processing module 2021 performing demosaicing processing on the first color original image (which includes, for example, the red channel, the green channel, and the blue channel) as an example.
  • the demosaicing processing includes the following steps: (1) the first color original image is decomposed into a first red original image, a first green original image, and a first blue original image; as shown in FIG. 25, some pixel grids in the resulting first red original image, first green original image, and first blue original image have no pixel value. (2) The first red original image, the first green original image, and the first blue original image are respectively interpolated by using a bilinear interpolation method. As shown in FIG. 26, the color processing module 2021 uses a bilinear interpolation method to perform interpolation processing on the first blue original image; the pixel B1 to be interpolated in FIG.
  • the color processing module 2021 uses a bilinear interpolation method to perform interpolation processing on the first green original image.
  • the pixel to be interpolated G1 in FIG. 27 performs bilinear interpolation according to the four pixels G2, G3, G4, and G5 around G1 to obtain the interpolated pixel G1' of G1.
  • All the pixels to be interpolated in the blanks in the first image in FIG. 27 are traversed to use the bilinear interpolation method to complement the pixel values to obtain the interpolated first green original image.
  • the color processing module 2021 may use a bilinear interpolation method to perform interpolation processing on the first red original image to obtain the interpolated first red original image. (3) Re-synthesize the interpolated first red original image, the interpolated first green original image, and the interpolated first blue original image into an image with 3 color channels, as shown in FIG. 28.
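  • A simplified sketch of this per-channel fill and re-synthesis is shown below (the neighbour selection and edge handling are assumptions; the patent illustrates the interpolation only through the figures).

```python
import numpy as np

def fill_channel(values, mask):
    """Fill missing samples of one colour plane from their recorded neighbours
    (step (2) above). `values` holds the recorded samples, `mask` is True where
    a sample exists."""
    filled = values.astype(np.float64).copy()
    h, w = values.shape
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                continue
            acc, n = 0.0, 0
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w and mask[yy, xx]:
                    acc += values[yy, xx]
                    n += 1
            filled[y, x] = acc / n if n else 0.0
    return filled

def demosaic(red, green, blue, masks):
    """Steps (1)-(3): interpolate each decomposed plane and re-stack them into
    a three-channel image."""
    return np.stack([fill_channel(red, masks["R"]),
                     fill_channel(green, masks["G"]),
                     fill_channel(blue, masks["B"])], axis=-1)
```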
  • the color processing module 2021 performs demosaicing processing on the color image, which is beneficial for the embodiments of the present application to complete a color image whose pixels each carry the value of a single color channel into a color image with multiple color channels, so that the color of the image is presented completely on the basis of hardware that uses single-color photosensitive pixels.
  • the color correction processing can specifically be to use a color correction matrix to correct the color channel values of each pixel of the color original image (which can be the first color original image and the second color original image that have undergone demosaicing processing), thereby realizing the correction of the image color.
  • the color correction matrix (CCM) is preset in the color processing module.
  • the color correction matrix can be specifically:
  • the color processing module traverses all pixels in the image and performs color correction processing through the above color correction matrix to obtain an image that has undergone color correction processing.
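  • A minimal sketch of applying a color correction matrix per pixel is shown below; the CCM values here are placeholders (the patent's specific matrix is not reproduced above and would normally come from calibration).

```python
import numpy as np

# Placeholder colour correction matrix (rows sum to 1); an assumption only.
CCM = np.array([[ 1.50, -0.30, -0.20],
                [-0.25,  1.40, -0.15],
                [-0.10, -0.35,  1.45]])

def color_correction(rgb_image, ccm=CCM):
    """Correct every pixel's colour channel values: [R', G', B'] = CCM @ [R, G, B]."""
    h, w, _ = rgb_image.shape
    flat = rgb_image.reshape(-1, 3).astype(np.float64)
    corrected = flat @ ccm.T
    return np.clip(corrected, 0, 255).reshape(h, w, 3)
```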
  • the color correction processing in the embodiment of the present application is beneficial to eliminating the serious color deviation caused by colored light sources in the image or video frame and the color distortion of people or objects in the image, so that the high dynamic range image processing system 100 of the embodiment of the present application can restore the original color of the image and improve the visual effect of the image.
  • in a high dynamic range image, the number of binary bits of the gray values is often higher than 8 (the gray values of ordinary grayscale images generally have 8 binary bits), while the gray scale of many displays is only 8 bits; therefore, the gray scale of the high dynamic range image is remapped through global tone mapping processing, which is beneficial for the high dynamic range image to have higher compatibility and to be displayable on a conventional display.
  • the gray values of a high dynamic range image are generally very unevenly distributed: only a few pixels are brighter, and most of the pixels are distributed in the interval with lower gray values.
  • therefore, the tone mapping processing performed on the image by the high dynamic range image processing system 100 of the embodiment of the present application is non-linear, and the slope of the mapping relationship in the interval with lower gray values is greater than that in the interval with higher gray values.
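  • One simple global tone mapping curve with this property is sketched below (the square-root curve and the bit depths are assumptions; the patent does not fix a particular mapping function).

```python
import numpy as np

def global_tone_map(hdr, bits_in=12, bits_out=8):
    """Map high-bit-depth gray values to 8 bits with a non-linear curve whose
    slope is larger in the low-gray interval than in the high-gray interval."""
    x = hdr.astype(np.float64) / (2 ** bits_in - 1)   # normalise to [0, 1]
    y = np.sqrt(x)                                    # steep near 0, flat near 1
    return np.round(y * (2 ** bits_out - 1)).astype(np.uint16)
```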
  • the high dynamic range image processing system 100 of the embodiment of the present application may perform color conversion processing on the color original images (which may be the first color original image and the second color original image that have undergone tone mapping processing) to convert the image from one color space (for example, the RGB color space) to another color space (for example, the YUV color space), so as to obtain a wider range of application scenarios or a more efficient transmission format.
  • converting the image from the RGB color space to the YUV color space through color conversion processing is beneficial for the subsequent image processing of the high dynamic range image processing system 100 of the embodiment of the present application to compress the chrominance information of the image, which can reduce the amount of image data without affecting the viewing effect of the image, thereby improving the transmission efficiency of the image.
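  • A common RGB-to-YUV conversion is sketched below for reference (the BT.601 coefficients and the +128 chroma offset are assumptions; the text above does not specify which conversion matrix is used).

```python
import numpy as np

# BT.601 full-range RGB-to-YUV coefficients (an assumption, see above).
RGB_TO_YUV = np.array([[ 0.299,  0.587,  0.114],
                       [-0.169, -0.331,  0.500],
                       [ 0.500, -0.419, -0.081]])

def rgb_to_yuv(rgb):
    """Convert an HxWx3 RGB image to YUV, offsetting U and V by 128 for storage."""
    yuv = rgb.astype(np.float64) @ RGB_TO_YUV.T
    yuv[..., 1:] += 128.0
    return yuv
```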
  • the high dynamic fusion unit 50 may perform brightness alignment processing on the target images corresponding to the at least two exposures (which may include the first target image and the second target image) to obtain a brightness-aligned target image, and then fuse the brightness-aligned target image with one or more target images to obtain a highly dynamic target image.
  • the color high dynamic fusion unit 30 may perform brightness alignment processing on the color original image data corresponding to the at least two exposures (for example, the first color original image data and the second color original image data) to obtain brightness-aligned color original image data, and then fuse the brightness-aligned color original image data with one or more pieces of color original image data to obtain highly dynamic color original image data.
  • the panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic original image data corresponding to the at least two exposures (for example, the first panchromatic original image data and the second panchromatic original image data) to obtain brightness-aligned panchromatic original image data, and then fuse the brightness-aligned panchromatic original image data with one or more pieces of panchromatic original image data to obtain highly dynamic panchromatic original image data.
  • the color high dynamic fusion unit 30 may perform brightness alignment processing on the color original images corresponding to the at least two exposures (for example, the first color original image and the second color original image) to obtain a brightness-aligned color original image, and then fuse the brightness-aligned color original image with one or more color original images to obtain a highly dynamic color original image.
  • the panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic original images corresponding to the at least two exposures (for example, the first panchromatic original image and the second panchromatic original image) to obtain a brightness-aligned panchromatic original image, and then fuse the brightness-aligned panchromatic original image with one or more panchromatic original images to obtain a highly dynamic panchromatic original image.
  • the color high dynamic fusion unit 30 may perform brightness alignment processing on the color intermediate images corresponding to the at least two exposures (for example, the first color intermediate image and the second color intermediate image) to obtain a brightness-aligned color intermediate image, and then fuse the brightness-aligned color intermediate image with one or more color intermediate images to obtain a highly dynamic color intermediate image.
  • the panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic intermediate images corresponding to the at least two exposures (for example, the first panchromatic intermediate image and the second panchromatic intermediate image) to obtain a brightness-aligned panchromatic intermediate image, and then fuse the brightness-aligned panchromatic intermediate image with one or more panchromatic intermediate images to obtain a highly dynamic panchromatic intermediate image.
  • the high dynamic range processing performed on the image by the high dynamic fusion unit 50 may include brightness alignment processing.
  • the high dynamic fusion unit 50 (which may include the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40) performing brightness alignment processing on the images includes the following steps (the number of images can be equal to the number of exposures that the image sensor 10 controls the pixel array 11 to perform, and the images can be the first color original image data and the second color original image data; the first color original image data, the second color original image data, and the third color original image data; the first target image and the second target image; the first color original image and the second color original image; the first color intermediate image and the second color intermediate image; the first panchromatic original image and the second panchromatic original image; the first panchromatic intermediate image and the second panchromatic intermediate image; the first color original image, the second color original image, and the third color original image; the first panchromatic original image, the second panchromatic original image, and the third panchromatic original image; and so on).
  • the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 extends a predetermined area with the overexposed image pixel P12 as the center, for example, the 3*3 area shown in FIG. 30.
  • it may also be a 4*4 area, a 5*5 area, a 10*10 area, etc., which is not limited here.
  • the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 searches, within the 3*3 predetermined area, for an intermediate image pixel whose pixel value is less than the first preset threshold V0, such as the image pixel P21 in FIG. 30 (the image pixel marked with a dotted circle in the first color intermediate image in FIG. 30): if the pixel value V2 of this image pixel is less than the first preset threshold V0, the image pixel P21 is taken as the intermediate image pixel P21.
  • the color high dynamic fusion unit 30 then searches, in the second color intermediate image, for the image pixels corresponding to the overexposed image pixel P12 and to the intermediate image pixel P21 respectively, that is, the image pixel P1'2' and the image pixel P2'1' (in the second color intermediate image in FIG. 30); from these corresponding pixels a corrected value V1' is obtained, and the color high dynamic fusion unit 30 uses the value of V1' to replace the value of V1.
  • if V1' is greater than the first preset threshold V0, the color high dynamic fusion unit 30 further looks, in the third color intermediate image, for the image pixels corresponding to the overexposed image pixel P12 and the intermediate image pixel P21 respectively, that is, the image pixels P1"2" and P2"1"; the pixel value of the image pixel P1"2" is V5, and together with the pixel value of the image pixel P2"1", the actual pixel value of the overexposed image pixel P12 can be calculated.
  • the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 performs this brightness alignment processing on each overexposed image pixel in the first color intermediate image to obtain the brightness-aligned first color intermediate image. Since the pixel values of the overexposed image pixels in the brightness-aligned first color intermediate image have been corrected, the pixel value of each image pixel in the brightness-aligned first color intermediate image is more accurate.
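  • The sketch below shows one plausible reading of this per-pixel brightness alignment (the ratio-based formula V3 * V2 / V4 and the neighbourhood search are assumptions; the text above does not spell out the exact calculation).

```python
import numpy as np

def align_overexposed_pixel(long_img, medium_img, y, x, v0, radius=1):
    """Estimate the actual value of an overexposed pixel (y, x) in the
    long-exposure image from a non-overexposed neighbour and the corresponding
    pixels of the medium-exposure image."""
    h, w = long_img.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if (dy, dx) == (0, 0) or not (0 <= ny < h and 0 <= nx < w):
                continue
            v2 = long_img[ny, nx]
            if v2 < v0:                      # an intermediate (non-overexposed) pixel
                v3 = medium_img[y, x]        # pixel corresponding to (y, x)
                v4 = medium_img[ny, nx]      # pixel corresponding to the neighbour
                if v4 > 0:
                    return v3 * v2 / v4      # assumed ratio-based correction
    return float(long_img[y, x])             # no usable neighbour found
```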
  • after the brightness alignment processing, the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 can fuse the brightness-aligned image with images of the same kind to obtain a highly dynamic image. Specifically, the following takes, as an example, the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 fusing the brightness-aligned first color intermediate image (obtained by the exposure with the long exposure time L), the second color intermediate image (obtained correspondingly by the exposure with the medium exposure time M), and the third color intermediate image (obtained by the exposure with the short exposure time S) to obtain a highly dynamic color intermediate image.
  • the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 first performs motion detection on the first color intermediate image after brightness alignment to identify whether there is a motion blur area in the first color intermediate image after brightness alignment. If there is no motion blur area in the first color intermediate image after brightness alignment, the first color intermediate image, the second color intermediate image, and the third color intermediate image after brightness alignment are directly merged to obtain a highly dynamic color intermediate image. If there is a motion blur area in the first color intermediate image after brightness alignment, remove the motion blur area in the first color intermediate image, and only merge all areas of the second color intermediate image and the third color intermediate image, and after the brightness is aligned The first color intermediate image except for the motion blur area in the first color intermediate image to obtain a highly dynamic color intermediate image.
  • the resolution of the high dynamic color intermediate image may be equal to the resolution of the pixel array 11.
  • the fusion of the intermediate images follows the following principles: (1) in the brightness-aligned first color intermediate image, the pixel value of an image pixel in an overexposed area is directly replaced with the pixel value of the image pixel in the second color intermediate image corresponding to that overexposed area;
  • (2) in the brightness-aligned first color intermediate image, the pixel value of an image pixel in an underexposed area is the long-exposure pixel value divided by a coefficient K1, where the coefficient K1 is the average of K2 and K3, K2 is the ratio of the long-exposure pixel value to the medium-exposure pixel value, and K3 is the ratio of the long-exposure pixel value to the short-exposure pixel value; (3) in the brightness-aligned first color intermediate image, the pixel value of an image pixel in an area that is neither underexposed nor overexposed is the long-exposure pixel value divided by the coefficient K1.
  • when a motion blur area exists, the fusion of the three intermediate images not only follows the above three principles but also needs to follow principle (4): in the brightness-aligned first color intermediate image, the pixel value of an image pixel in the motion blur area is directly replaced with the average of the pixel value of the image pixel corresponding to the motion blur area in the second color intermediate image and the pixel value of the image pixel corresponding to the motion blur area in the third color intermediate image.
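  • The per-pixel fusion principles (1)-(4) can be sketched as follows (the overexposure threshold and the optional motion-blur mask are assumed inputs produced by the earlier steps).

```python
import numpy as np

def fuse_hdr(long_img, medium_img, short_img, over_thr, blur_mask=None):
    """Fuse the brightness-aligned long-, medium- and short-exposure intermediate
    images following principles (1)-(4) above."""
    long_f = long_img.astype(np.float64)
    med_f = medium_img.astype(np.float64)
    short_f = short_img.astype(np.float64)
    eps = 1e-6
    k2 = long_f / (med_f + eps)              # long / medium ratio
    k3 = long_f / (short_f + eps)            # long / short ratio
    k1 = (k2 + k3) / 2.0                     # K1 is the average of K2 and K3
    out = long_f / k1                        # principles (2) and (3)
    overexposed = long_f >= over_thr
    out[overexposed] = med_f[overexposed]    # principle (1)
    if blur_mask is not None:                # principle (4)
        out[blur_mask] = (med_f[blur_mask] + short_f[blur_mask]) / 2.0
    return out
```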
  • the high dynamic range image processing system 100 of the embodiment of the present application performs high dynamic range processing on the image through the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 by first performing brightness alignment processing on the image and then fusing the brightness-aligned image with other images to obtain a highly dynamic image, so that the target image formed by the high dynamic range image processing system 100 has a larger dynamic range and thus a better imaging effect.
  • the fusion module 204 can perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image.
  • the specific process of the fusion algorithm processing can be as follows.
  • the color intermediate image has the color information of the three color channels R (i.e., red), G (i.e., green), and B (i.e., blue).
  • the panchromatic information can be brightness information
  • the fusion module 204 of the embodiment of the present application performs fusion algorithm processing on the color image and the panchromatic image, so that the final target image draws on both color information and brightness information. Since the human eye is more sensitive to brightness than to chrominance, in terms of human visual characteristics the high dynamic range image processing system 100 of the embodiment of the present application has a better imaging effect, and the final target image obtained is closer to human vision.
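  • One plausible way to realise such a fusion is a luminance blend, sketched below (an assumption; the exact fusion algorithm of the fusion module 204 is not given in the text above).

```python
import numpy as np

def fuse_color_and_panchromatic(color_yuv, panchromatic, weight=0.5):
    """Blend the Y (brightness) channel of the colour intermediate image with the
    panchromatic intermediate image and keep the chrominance channels unchanged."""
    fused = color_yuv.astype(np.float64).copy()
    fused[..., 0] = (1.0 - weight) * fused[..., 0] + weight * panchromatic
    return fused
```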
  • the high dynamic fusion unit 50 is integrated in the image sensor 10, or the high dynamic fusion unit 50 is integrated in the image processor 20. Specifically, please refer to FIG. 18: in some embodiments, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 may be integrated in the image sensor 10; please refer to FIG. 1, FIG. 19, and FIG. 20: in other embodiments, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 may be integrated in the image processor 20. Integrating the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 in the image sensor 10 or the image processor 20 allows the high dynamic range image processing system 100 of the embodiment of the present application to realize high dynamic range processing without improving the hardware performance of the image sensor 10. At the same time, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 independently encapsulate the high dynamic range processing function, which is beneficial to reducing the design difficulty in the product design process and improving the convenience of design changes.
  • image preprocessing may include pixel addition processing and demosaicing processing.
  • the color preprocessing module 2023 can perform pixel addition processing on the color original image data to obtain the color original image,
  • and the panchromatic preprocessing module 2024 can perform demosaicing processing on the panchromatic original image data to obtain the panchromatic original image.
  • the specific implementation process of this demosaicing processing is the same as that of the demosaicing processing performed by the full-color preprocessing module 2024 on the full-color original image data described above, and will not be described further here.
  • the high dynamic range image processing system 100 of the embodiment of the present application performs pixel addition processing on color original image data in which some pixel grids are missing color information and the pixel grids that do carry color information contain only single-color-channel information; in a less computationally intensive way, complete-channel color information is obtained and a color original image is then obtained, so that other image processing can subsequently be performed on the image to improve the imaging quality.
  • the image preprocessing may include pixel averaging processing and demosaicing processing.
  • the color preprocessing module 2023 can perform pixel averaging processing on the color original image data to obtain the color original image, and the panchromatic preprocessing module 2024 can perform demosaicing processing on the panchromatic original image data to obtain the panchromatic original image.
  • the specific implementation process of this demosaicing processing is the same as that of the demosaicing processing performed by the full-color preprocessing module 2024 on the full-color original image data described above, and will not be described further here.
  • the high dynamic range image processing system 100 of the embodiment of the present application performs pixel averaging processing on color original image data in which some pixel grids are missing color information and the pixel grids that do carry color information contain only single-color-channel information; in a less computationally intensive way, complete-channel color information is obtained and a color original image is then obtained, so that other image processing can subsequently be performed on the image to improve the imaging quality.
  • the following takes the pixel addition processing of the color original image data as an example for description.
  • the specific steps of the pixel addition processing are as follows: (1) decompose the color original image data into the first color original image data (the original image data generated by the first color photosensitive pixels A described above), the second color original image data (the original image data generated by the second color photosensitive pixels B described above), and the third color original image data (the original image data generated by the third color photosensitive pixels C described above).
  • the color preprocessing module 2023 can perform the pixel addition processing of the above steps on all color original image data corresponding to the at least two exposures, thereby completing the pixel addition processing of all color original image data and obtaining at least two color original images. Specifically, referring to FIG. 31, the following takes the color preprocessing module 2023 performing pixel addition processing on the first red original image data in the first color original image data as an example for description.
  • the color preprocessing module 2023 first decomposes the color original image data (which can be the first color original image data, the second color original image data, or the third color original image data, etc.) into red original image data, green original image data, and blue original image data.
  • the red preprocessing module uses a bilinear interpolation method to interpolate the red intermediate image data to obtain a red original image with a resolution of a quarter of the red original image data.
  • In this way, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and combine the obtained red original image, green original image, and blue original image, each with one color channel, into a color original image with three color channels.
  • the color preprocessing module 2023 can perform the pixel addition processing of the above steps on the first color original image data and the second color original image data (or the first color original image data, the second color original image data, and the third color original image data), thereby completing the pixel addition processing of the color original image data and obtaining the first color original image and the second color original image (or the first color original image, the second color original image, and the third color original image).
  • the specific steps of the pixel averaging processing are as follows: (1) decompose the color original image data into the first color original image data (the original image data generated by the first color photosensitive pixels A described above), the second color original image data (the original image data generated by the second color photosensitive pixels B described above), and the third color original image data (the original image data generated by the third color photosensitive pixels C described above).
  • the color preprocessing module 2023 can perform the pixel averaging processing of the above steps on all color original image data corresponding to the at least two exposures, so as to complete the pixel averaging processing of all color original image data and obtain at least two color original images. Specifically, referring to FIG. , the following takes the color preprocessing module 2023 performing pixel averaging processing on the first red original image data in the first color original image data as an example for description.
  • the color preprocessing module 2023 first decomposes the color original image data (which can be the first color original image data, the second color original image data, or the third color original image data, etc.) into red original image data, green original image data, and blue original image data. As shown in FIG.
  • The red preprocessing module uses a bilinear interpolation method to interpolate the red intermediate image data to obtain a red original image whose resolution is a quarter of that of the red original image data.
  • In this way, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and combine the obtained single-channel red, green, and blue original images into a color original image with three color channels.
  • The color preprocessing module 2023 can perform the pixel averaging processing of the above steps on the first color original image data and the second color original image data (or on the first color original image data, the second color original image data, and the third color original image data), thereby completing the pixel averaging processing of the color original image data and obtaining the first color original image and the second color original image (or the first color original image, the second color original image, and the third color original image).
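  • The pixel averaging variant differs from the pixel addition sketch above only in dividing each subunit's block sum by the number of same-color samples it contains; the sample count of two per subunit is an assumption carried over from the previous example.

        import numpy as np

        def pixel_average(red_plane: np.ndarray, samples_per_subunit: int = 2) -> np.ndarray:
            # Average the red samples of every 2x2 subunit, assuming non-red
            # positions are zero and each subunit holds `samples_per_subunit`
            # red samples, so block_sum / samples_per_subunit is their mean.
            h, w = red_plane.shape
            blocks = red_plane[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
            return blocks.sum(axis=(1, 3)) / samples_per_subunit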
  • the image processor 20 may further include a receiving unit 201 and a memory unit 203.
  • the receiving unit 201 is used to receive color original image data and full-color original image data;
  • The memory unit 203 is used to temporarily store one or more of the color original image data, the full-color original image data, the color original image, the full-color original image, the color intermediate image, the full-color intermediate image, and the target image.
  • Providing the image processor 20 with a receiving unit 201 and a memory unit 203 separates the receiving, processing, and storage of images, which allows the modules of the high dynamic range image processing system 100 to be packaged more independently, so that the system has higher execution efficiency and a better anti-interference effect; in addition, it helps to reduce the design difficulty when the high dynamic range image processing system 100 needs to be redesigned, thereby reducing costs.
  • the present application also provides an electronic device 1000.
  • the electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments.
  • the lens 300 and the high dynamic range image processing system 100 are combined with the housing 200.
  • the lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.
  • The electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, smart glasses, or a smart helmet), a drone, a head-mounted display device, and so on, which are not limited here.
  • The electronic device 1000 of the embodiment of the present application controls the pixel array 11 to perform at least two exposures, at the first exposure time and the second exposure time respectively, and generates multiple images according to the different exposure times and different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can subsequently be performed on the multiple images to obtain a target image with a high dynamic range.
  • The electronic device 1000 of the embodiment of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that both the bright and dark areas of the target image have better performance, which improves imaging performance while helping to reduce costs.
  • the high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100.
  • the high dynamic range image processing system 100 may include the image sensor 10.
  • the image sensor 10 includes a pixel array 11.
  • the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels.
  • the pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels.
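  • As a concrete, purely illustrative picture of such a layout (the actual arrangements are those shown in FIGS. 5 to 10), the Python sketch below builds one hypothetical 4×4 minimal repeating unit in which each 2×2 subunit holds two panchromatic pixels W on one diagonal and two single-color pixels on the other; the diagonal orientation and the Bayer-like color order are assumptions made only for this example.

        import numpy as np

        def minimal_repeating_unit():
            # Hypothetical 4x4 minimal repeating unit made of four 2x2 subunits.
            # Each subunit: two panchromatic pixels "W" on one diagonal and two
            # identical single-color pixels (R, G or B) on the other diagonal.
            colors = [["R", "G"],
                      ["G", "B"]]          # assumed color per 2x2 subunit
            unit = np.empty((4, 4), dtype="<U1")
            for sy in range(2):
                for sx in range(2):
                    c = colors[sy][sx]
                    unit[2 * sy, 2 * sx] = "W"          # panchromatic
                    unit[2 * sy, 2 * sx + 1] = c        # single color
                    unit[2 * sy + 1, 2 * sx] = c        # single color
                    unit[2 * sy + 1, 2 * sx + 1] = "W"  # panchromatic
            return unit

        if __name__ == "__main__":
            print(minimal_repeating_unit())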
  • High dynamic range image processing methods include:
  • the pixel array 11 is exposed for the first exposure time to obtain the first original image.
  • the first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time.
  • the pixel array 11 is exposed for a second exposure time to obtain a second original image.
  • the second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time.
  • the first exposure time is not equal to the second exposure time.
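  • The effect of using two unequal exposure times can be illustrated with a small simulation; the linear sensor model, the 10-bit full-well value, and the particular exposure times below are assumptions, not parameters taken from the embodiments.

        import numpy as np

        def simulate_exposure(radiance: np.ndarray, exposure_time: float,
                              full_well: float = 1023.0) -> np.ndarray:
            # Idealized linear sensor: signal grows with exposure time, then clips.
            return np.clip(radiance * exposure_time, 0.0, full_well)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            radiance = rng.uniform(0.0, 4000.0, size=(4, 4))              # hypothetical scene
            first_raw = simulate_exposure(radiance, exposure_time=1.0)    # longer exposure
            second_raw = simulate_exposure(radiance, exposure_time=0.25)  # shorter exposure
            # Bright regions clip in the long exposure but survive in the short
            # one, which is what the later high dynamic range processing exploits.
            print(np.count_nonzero(first_raw == 1023.0),
                  np.count_nonzero(second_raw == 1023.0))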
  • The high dynamic range image processing method of the embodiment of the present application controls the pixel array 11 to perform at least two exposures, at the first exposure time and the second exposure time respectively, and generates multiple images according to the different exposure times and different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can subsequently be performed on the multiple images to obtain a target image with a high dynamic range.
  • The high dynamic range image processing method of the embodiment of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that both the bright and dark areas of the target image have better performance, which improves imaging performance while helping to reduce costs.
  • the pixel array 11 may also be exposed for a third exposure time to obtain a third original image.
  • the third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time.
  • The third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time.
  • The image preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing.
  • Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image can also include:
  • performing pixel completion processing on the color original image data to obtain a color original image; performing demosaicing processing on the panchromatic original image data to obtain a panchromatic original image; performing the first image processing on the color original image to obtain a color intermediate image; performing the second image processing on the panchromatic original image to obtain a panchromatic intermediate image; and performing the fusion algorithm on the color intermediate image and the panchromatic intermediate image to obtain the target image.
  • After the fusion algorithm is performed on the color intermediate image and the panchromatic intermediate image to obtain the target image, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image further includes: fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image.
  • the full-color original image data corresponding to at least two exposures are fused to obtain high-dynamic full-color original image data.
  • The first image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
  • The second image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.
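  • As an illustration of two of the corrections listed above, the sketch below applies a constant black level subtraction and a radial lens shading gain to a raw plane; the constant black level, the quadratic radial gain model, and the parameter values are simplified assumptions rather than the correction data used in the embodiments.

        import numpy as np

        def black_level_correction(raw: np.ndarray, black_level: float = 64.0) -> np.ndarray:
            # Subtract an assumed constant black level and clamp negatives to zero.
            return np.clip(raw.astype(np.float32) - black_level, 0.0, None)

        def lens_shading_correction(raw: np.ndarray, max_gain: float = 1.8) -> np.ndarray:
            # Apply a radial gain that grows quadratically toward the image corners,
            # compensating the brightness fall-off caused by the lens.
            h, w = raw.shape
            y, x = np.mgrid[0:h, 0:w]
            cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
            r = np.hypot((y - cy) / max(cy, 1.0), (x - cx) / max(cx, 1.0)) / np.sqrt(2.0)
            gain = 1.0 + (max_gain - 1.0) * r ** 2
            return raw.astype(np.float32) * gain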
  • the first image processing includes a first image sub-processing and a second image sub-processing
  • the color processing module 2021 is configured to perform the first image sub-processing on the color original image first, and then perform the second image sub-processing
  • The first image sub-processing includes one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing.
  • The second image sub-processing includes one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
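  • Global tone mapping, one of the steps named above, maps input values Vin to output values Vout through a fixed curve (FIG. 29 shows the mapping relationship); the Reinhard-style curve, the 12-bit input white level, and the 8-bit output range in the sketch below are illustrative stand-ins for the curve actually used.

        import numpy as np

        def global_tone_map(v_in: np.ndarray, white_level: float = 4095.0,
                            out_max: float = 255.0) -> np.ndarray:
            # Normalize Vin, apply the curve 2x / (1 + x), which maps 0 -> 0 and the
            # white level -> out_max while compressing highlights, then rescale.
            x = np.asarray(v_in, dtype=np.float32) / white_level
            v_out = 2.0 * x / (1.0 + x)
            return np.clip(v_out * out_max, 0.0, out_max)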
  • fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image includes:
  • The target images corresponding to the at least two exposures are subjected to brightness alignment processing to obtain a brightness-aligned target image, and then the brightness-aligned target image and one or more of the target images are merged to obtain a highly dynamic target image, as sketched below.
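  • A minimal sketch of this brightness alignment and fusion for two exposures: the shorter exposure is multiplied by the exposure-time ratio so that its brightness matches the longer one, and the two are then blended with a weight that favors the longer exposure except near saturation. The exposure-ratio scaling and the linear saturation weighting are simplifying assumptions; the embodiments do not prescribe this exact blend.

        import numpy as np

        def fuse_two_exposures(long_img: np.ndarray, short_img: np.ndarray,
                               exposure_ratio: float, sat_level: float = 1023.0) -> np.ndarray:
            # Brightness alignment: scale the short exposure by t_long / t_short.
            long_f = long_img.astype(np.float32)
            aligned_short = short_img.astype(np.float32) * exposure_ratio
            # Blend weight falls from 1 to 0 as the long exposure nears saturation,
            # so clipped highlights are taken from the aligned short exposure.
            w = np.clip((sat_level - long_f) / (0.2 * sat_level), 0.0, 1.0)
            return w * long_f + (1.0 - w) * aligned_short

        if __name__ == "__main__":
            rng = np.random.default_rng(2)
            scene = rng.uniform(0.0, 4000.0, size=(4, 4))
            long_img = np.clip(scene, 0.0, 1023.0)           # longer exposure, clipped
            short_img = np.clip(scene / 4.0, 0.0, 1023.0)    # shorter exposure (ratio 4)
            print(fuse_two_exposures(long_img, short_img, exposure_ratio=4.0))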
  • Fusing the color raw image data corresponding to at least two exposures to obtain high-dynamic color raw image data includes: performing brightness alignment processing on the color raw image data corresponding to the at least two exposures to obtain brightness-aligned color raw image data, and then fusing the brightness-aligned color raw image data with one or more pieces of color raw image data.
  • Fusing the panchromatic raw image data corresponding to at least two exposures to obtain high-dynamic panchromatic raw image data includes: performing brightness alignment processing on the panchromatic raw image data corresponding to the at least two exposures to obtain brightness-aligned panchromatic raw image data, and then fusing the brightness-aligned panchromatic raw image data with one or more pieces of panchromatic raw image data.
  • The high dynamic range image processing method further includes: receiving the color original image data and the full-color original image data; and temporarily storing one or more of the color original image data, the full-color original image data, the color original image, the full-color original image, the color intermediate image, the full-color intermediate image, and the target image.
  • The image preprocessing may include pixel addition processing and demosaicing processing; in this case, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel addition processing on the color original image data to obtain a color original image; and performing demosaicing processing on the full-color original image data to obtain a full-color original image.
  • Alternatively, the image preprocessing may include pixel averaging processing and demosaicing processing; in this case, performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel averaging processing on the color original image data to obtain a color original image; and performing demosaicing processing on the full-color original image data to obtain a full-color original image.
  • the specific implementation process of the high dynamic range image processing method of any one of the above embodiments is the same as the specific implementation process of the aforementioned high dynamic range image processing system 100 to obtain a target image, and will not be further described here.
  • This application also provides a non-volatile computer-readable storage medium 400 containing a computer program.
  • When the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the foregoing embodiments.
  • The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application control the pixel array 11 to perform at least two exposures, at the first exposure time and the second exposure time respectively, and generate multiple images according to the different exposure times and different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can be performed on these multiple images to obtain a target image with a high dynamic range.
  • The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that both the bright and dark parts of the target image have better performance, which improves imaging performance while helping to reduce costs.
  • A conventional image processor can only process images formed by a traditional pixel array composed solely of color photosensitive pixels, and is not applicable to images produced by a pixel array having both color photosensitive pixels and full-color photosensitive pixels.
  • the high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application are applicable to images generated by a pixel array 11 having color photosensitive pixels and full-color photosensitive pixels.
  • Full-color photosensitive pixels can receive more light than color photosensitive pixels, which can improve the brightness of the final image, and the human eye is more sensitive to brightness than to chromaticity, which gives the high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application better imaging effects.
  • Methods such as increasing the shutter speed or selecting photosensitive pixels whose photosensitive response curve is logarithmic place higher requirements on the hardware parameters of the image sensor of a high-dynamic camera.
  • The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiment of the present application do not need to raise the hardware parameter requirements of the image sensor 10; by providing the high dynamic fusion unit 50 and the fusion module 204 and cooperating with the corresponding exposure mode, they can realize the high dynamic range processing function, thereby obtaining an image with a better imaging effect.
  • The term "installation" should be understood in a broad sense; for example, it may be a fixed connection, a detachable connection, or an integral connection; it may be a mechanical connection, an electrical connection, or mutual communication; it may be a direct connection or an indirect connection through an intermediate medium, and it may be internal communication between two components or an interaction relationship between two components.
  • a "computer-readable medium” can be any device that can contain, store, communicate, propagate, or transmit a program for use by an instruction execution system, device, or device or in combination with these instruction execution systems, devices, or devices.
  • More specific examples of computer-readable media include the following: an electrical connection having one or more wires (electronic device), a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CDROM).
  • In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or processing it in another suitable manner if necessary, and then stored in a computer memory.
  • each part of the embodiments of the present application can be implemented by hardware, software, firmware, or a combination thereof.
  • For example, multiple steps or methods can be implemented by software or firmware that is stored in a memory and executed by a suitable instruction execution system.
  • If, as in another embodiment, they are implemented by hardware, they can be implemented by any one or a combination of the following technologies known in the art: discrete logic circuits, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried in the method of the foregoing embodiments can be implemented by a program instructing relevant hardware to complete.
  • The program can be stored in a computer-readable storage medium, and when executed, the program performs one of the steps of the method embodiments or a combination thereof.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it can also be stored in a computer readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

A high dynamic range image processing system (100) and method, an electronic device (1000), and a computer-readable storage medium (400). The high dynamic range image processing system (100) comprises an image sensor (10), a high dynamic fusion unit (50), and an image processor (20). A pixel array (11) is exposed at a first exposure time to obtain a first original image, and is exposed at a second exposure time to obtain a second original image. The image processor (20) and the high dynamic fusion unit (50) are used for performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image.

Description

高动态范围图像处理系统及方法、电子设备和存储介质High dynamic range image processing system and method, electronic equipment and storage medium

优先权信息Priority information

本申请请求2020年4月17日向中国国家知识产权局提交的、专利申请号为202010304152.6的专利申请的优先权和权益,并且通过参照将其全文并入此处。This application requests the priority and rights of the patent application with the patent application number 202010304152.6 filed with the State Intellectual Property Office of China on April 17, 2020, and the full text is incorporated herein by reference.

技术领域Technical field

本申请涉及图像处理技术领域,特别涉及一种高动态范围图像处理系统及方法、电子设备和计算机可读存储介质。This application relates to the field of image processing technology, and in particular to a high dynamic range image processing system and method, electronic equipment, and computer-readable storage media.

背景技术Background technique

普通相机因受到动态范围的限制,不能记录极端亮或者暗的细节。尤其在拍摄场景的光比较大的时候,很容易出现过曝或者欠曝的情况。具有高动态范围(High-Dynamic Range,HDR)功能的相机,在大光比情况下拍摄图像,无论是高光、暗位都能够具有比普通相机更佳的表现。Ordinary cameras cannot record extremely bright or dark details due to the limitation of dynamic range. Especially when the light in the shooting scene is relatively large, it is prone to overexposure or underexposure. A camera with a high dynamic range (High-Dynamic Range, HDR) function can capture images in a large light ratio, and it can perform better than ordinary cameras in both high light and dark positions.

发明内容Summary of the invention

本申请实施方式提供一种高动态范围图像处理系统及方法、电子设备和计算机可读存储介质。The embodiments of the present application provide a high dynamic range image processing system and method, electronic equipment, and computer-readable storage medium.

本申请实施方式提供的高动态范围图像处理系统包括图像传感器、高动态融合单元和图像处理器。所述图像传感器包括像素阵列。所述像素阵列包括多个全色感光像素和多个彩色感光像素。所述彩色感光像素具有比所述全色感光像素更窄的光谱响应。所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元。每个所述子单元包括多个单颜色感光像素及多个全色感光像素。所述像素阵列以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据。所述像素阵列以第二曝光时间曝光得到第二原始图像。第二原始图像包括以第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据。其中,第一曝光时间不等于第二曝光时间。所述图像处理器和所述高动态融合单元用于对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。The high dynamic range image processing system provided by the embodiments of the present application includes an image sensor, a high dynamic fusion unit, and an image processor. The image sensor includes a pixel array. The pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. The color photosensitive pixel has a narrower spectral response than the full-color photosensitive pixel. The pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array is exposed for a first exposure time to obtain a first original image. The first original image includes first color original image data generated by the single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by the full-color photosensitive pixels exposed at the first exposure time. The pixel array is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixel exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixel exposed at the second exposure time. Wherein, the first exposure time is not equal to the second exposure time. The image processor and the high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image.

本申请实施方式提供的高动态范围图像处理方法用于高动态范围图像处理系统。所述高动态范围图像处理系统包括图像传感器。所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素。所述彩色感光像素具有比所述全色感光像素更窄的光谱响应。所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元。每个所述子单元包括多个单颜色感光像素及多个全色感光像素。高动态范围图像处理方法包括:控制像素阵列进行至少两次曝光,其中,所述像素阵列以第一曝光时间曝光得到第一原始图像,第一原始图像包括以第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,第二原始图像包括以第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,第一曝光时间不等于第二曝光时间;和对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。The high dynamic range image processing method provided by the embodiment of the present application is used in a high dynamic range image processing system. The high dynamic range image processing system includes an image sensor. The image sensor includes a pixel array, and the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. The color photosensitive pixel has a narrower spectral response than the full-color photosensitive pixel. The pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The high dynamic range image processing method includes: controlling a pixel array to perform at least two exposures, wherein the pixel array is exposed at a first exposure time to obtain a first original image, and the first original image includes the single exposed at the first exposure time. The first color original image data generated by the color photosensitive pixels and the first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time; the pixel array is exposed at the second exposure time to obtain the second original image , The second original image includes second color original image data generated by the single-color photosensitive pixel exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixel exposed at the second exposure time ; Wherein, the first exposure time is not equal to the second exposure time; and image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing are performed on the first original image and the second original image to obtain the target image.

本申请实施方式提供的电子设备包括镜头、壳体及高动态范围图像处理系统。所述镜头、高动态范围图像处理系统与所述壳体结合。所述镜头与所述高动态范围图像处理系统的图像传感器配合成像。所述高动态范围图像处理系统包括图像传感器、高动态融合单元和图像处理器。图像传感器包括像素阵列。像素阵列包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列包括最小重复单元,每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。像素阵列以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据。像素阵列以第二曝光时间曝光得到第二原始图像。第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据。其中,第一曝光时间不等于第二曝光时间。图像处理器、高动态融合单元用于对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。The electronic device provided by the embodiment of the present application includes a lens, a housing, and a high dynamic range image processing system. The lens and the high dynamic range image processing system are combined with the housing. The lens cooperates with the image sensor of the high dynamic range image processing system for imaging. The high dynamic range image processing system includes an image sensor, a high dynamic fusion unit and an image processor. The image sensor includes an array of pixels. The pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels. The pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array is exposed for the first exposure time to obtain the first original image. The first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time. The pixel array is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time. Wherein, the first exposure time is not equal to the second exposure time. The image processor and the high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.

本申请实施方式提供的包含计算机程序的非易失性计算机可读存储介质中,所述计算机程序被处理器执行时,使得所述处理器执行高动态范围图像处理方法。所述高动态范围图像处理方法用于高动态范围图像处理系统。所述高动态范围图像处理系统包括图像传感器、彩色高动态融合单元、全色高动态融合单元和图像处理器。图像传感器包括像素阵列,像素阵列包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列包括最小重复单元,每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。像素阵列以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据。像素阵列以第二曝光时间曝光得到第二原始图像。第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据。其中,第一曝光时间不等于第二曝光时间。图像处理器、彩色高动态融合单元和全色高动态融合单元用于对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。In the non-volatile computer-readable storage medium containing a computer program provided by the embodiments of the present application, when the computer program is executed by a processor, the processor executes the high dynamic range image processing method. The high dynamic range image processing method is used in a high dynamic range image processing system. The high dynamic range image processing system includes an image sensor, a color high dynamic fusion unit, a panchromatic high dynamic fusion unit and an image processor. The image sensor includes a pixel array, and the pixel array includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels. The pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array is exposed for the first exposure time to obtain the first original image. The first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time. The pixel array is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time. Wherein, the first exposure time is not equal to the second exposure time. The image processor, the color high dynamic fusion unit, and the panchromatic high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.

附图说明Description of the drawings

本申请的上述和/或附加的方面和优点从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:The above and/or additional aspects and advantages of the present application will become obvious and easy to understand from the description of the embodiments in conjunction with the following drawings, in which:

图1是本申请实施方式的高动态范围图像处理系统的示意图;Fig. 1 is a schematic diagram of a high dynamic range image processing system according to an embodiment of the present application;

图2是本申请实施方式的像素阵列的示意图;Fig. 2 is a schematic diagram of a pixel array according to an embodiment of the present application;

图3是本申请实施方式的感光像素的截面示意图;3 is a schematic cross-sectional view of a photosensitive pixel according to an embodiment of the present application;

图4是本申请实施方式的感光像素的像素电路图;4 is a pixel circuit diagram of a photosensitive pixel according to an embodiment of the present application;

图5至图10是本申请实施方式的像素阵列中最小重复单元的排布示意图;5 to 10 are schematic diagrams of the arrangement of the smallest repeating unit in the pixel array of the embodiment of the present application;

图11至图13是本申请某些实施方式的图像传感器输出的原始图像的示意图;11 to 13 are schematic diagrams of original images output by image sensors in some embodiments of the present application;

图14是本申请实施方式的像素阵列的示意图;FIG. 14 is a schematic diagram of a pixel array according to an embodiment of the present application;

图15至图17是本申请实施方式的像素补全处理的示意图;FIG. 15 to FIG. 17 are schematic diagrams of pixel completion processing in an embodiment of the present application;

图18至图20是本申请实施方式的高动态范围图像处理系统的示意图;18 to 20 are schematic diagrams of a high dynamic range image processing system according to an embodiment of the present application;

图21是本申请实施方式的黑电平矫正处理的示意图;FIG. 21 is a schematic diagram of black level correction processing according to an embodiment of the present application;

图22是本申请实施方式的镜头阴影矫正处理的示意图;FIG. 22 is a schematic diagram of lens shading correction processing according to an embodiment of the present application;

图23和图24是本申请实施方式的坏点补偿处理处理的示意图;FIG. 23 and FIG. 24 are schematic diagrams of dead pixel compensation processing in an embodiment of the present application;

图25至图28是本申请实施方式的去马赛克处理的示意图;FIG. 25 to FIG. 28 are schematic diagrams of demosaicing processing in an embodiment of the present application;

图29是本申请实施方式的色调映射处理的Vout和Vin之间的映射关系示意图;FIG. 29 is a schematic diagram of the mapping relationship between Vout and Vin in the tone mapping process of the embodiment of the present application;

图30是本申请实施方式的亮度对齐处理的示意图;FIG. 30 is a schematic diagram of brightness alignment processing according to an embodiment of the present application;

图31是本申请实施方式的像素相加处理的示意图;FIG. 31 is a schematic diagram of pixel addition processing in an embodiment of the present application;

图32是本申请实施方式的像素求平均处理的示意图;FIG. 32 is a schematic diagram of pixel averaging processing according to an embodiment of the present application;

图33是本申请实施方式的电子设备的结构示意图;FIG. 33 is a schematic structural diagram of an electronic device according to an embodiment of the present application;

图34是本申请某些实施方式的图像获取方法的流程示意图;FIG. 34 is a schematic flowchart of an image acquisition method according to some embodiments of the present application;

图35是本申请某些实施方式的非易失性计算机可读存储介质与处理器的交互示意图。FIG. 35 is a schematic diagram of interaction between a non-volatile computer-readable storage medium and a processor in some embodiments of the present application.

具体实施方式Detailed ways

下面详细描述本申请的实施方式,实施方式的示例在附图中示出,相同或类似的标号自始至终表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施方式是示例性的,仅用于解释本申请,而不能理解为对本申请的限制。The embodiments of the present application are described in detail below. Examples of the embodiments are shown in the drawings, and the same or similar reference numerals indicate the same or similar elements or elements with the same or similar functions throughout. The following embodiments described with reference to the drawings are exemplary, and are only used to explain the present application, and cannot be understood as a limitation to the present application.

下文的公开提供了许多不同的实施方式或例子用来实现本申请的实施方式的不同结构。为了简化本申请的实施方式的公开,下文中对特定例子的部件和设置进行描述。当然,它们仅仅为示例,并且目的不在于限制本申请。The following disclosure provides many different embodiments or examples for realizing the different structures of the embodiments of the present application. In order to simplify the disclosure of the embodiments of the present application, the components and settings of specific examples are described below. Of course, they are only examples, and are not intended to limit the application.

请参阅图1和图2,本申请实施方式的高动态范围图像处理系统100包括图像传感器10、高动态融合单元50和图像处理器20。图像传感器10包括像素阵列11,像素阵列11包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列11包括最小重复单元,每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。像素阵列11以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据。像素阵列11以第二曝光时间曝光得到第二原始图像。第二 原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据。其中,第一曝光时间不等于第二曝光时间。图像处理器20和高动态融合单元50用于对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。1 and FIG. 2, the high dynamic range image processing system 100 according to the embodiment of the present application includes an image sensor 10, a high dynamic fusion unit 50 and an image processor 20. The image sensor 10 includes a pixel array 11, and the pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels. The pixel array 11 includes a minimum repeating unit, and each minimum repeating unit includes a plurality of sub-units. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. The pixel array 11 is exposed for a first exposure time to obtain a first original image. The first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time. Wherein, the first exposure time is not equal to the second exposure time. The image processor 20 and the high dynamic fusion unit 50 are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image.

本申请实施方式的高动态范围图像处理系统100通过控制像素阵列11至少分别以第一曝光时间和第二曝光时间进行两次曝光,并且根据不同的曝光时间和不同的感光像素生成多张图像,以便后续对此多张图像进行图像预处理、高动态范围处理、图像处理和融合算法处理,从而得到具有高动态范围的目标图像。本申请实施方式的高动态范围图像处理系统100在无需提高图像传感器10的感光像素硬件参数的情况下,就能实现高动态范围功能,使得目标图像的亮处、暗位都能够具有更佳的表现,有利于提高成像性能的同时,有助于降低成本。The high dynamic range image processing system 100 of the embodiment of the present application controls the pixel array 11 to perform at least two exposures at the first exposure time and the second exposure time respectively, and generates multiple images according to different exposure times and different photosensitive pixels, In order to subsequently perform image preprocessing, high dynamic range processing, image processing and fusion algorithm processing on the multiple images, so as to obtain a target image with high dynamic range. The high dynamic range image processing system 100 of the embodiment of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that the bright and dark positions of the target image can be better. Performance, which is conducive to improving imaging performance, while helping to reduce costs.

请参阅图1和图2,在某些实施方式中,所述像素阵列11以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间。所述图像处理器20、所述高动态融合单元50用于对所述第一原始图像、所述第二原始图像和所述第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。1 and 2, in some embodiments, the pixel array 11 is exposed at a third exposure time to obtain a third original image, and the third original image includes the third original image exposed at the third exposure time. The third color original image data generated by the single-color photosensitive pixel and the third panchromatic original image data generated by the panchromatic photosensitive pixel exposed at the third exposure time; wherein the third exposure time is not equal to the The first exposure time, the third exposure time is not equal to the second exposure time. The image processor 20 and the high dynamic fusion unit 50 are used to perform image preprocessing, high dynamic range processing, image processing, and processing on the first original image, the second original image, and the third original image. The fusion algorithm processes the target image.

请参阅图1和图2,在某些实施方式中,所述图像处理器20包括彩色预处理模块2023、全色预处理模块2024、彩色处理模块2021、全色处理模块2022和融合模块204,所述图像预处理包括像素补全处理和去马赛克处理,所述图像处理包括第一图像处理和第二图像处理。其中:所述彩色预处理模块2023用于对彩色原始图像数据进行像素补全处理,得到彩色原始图像;所述全色预处理模块2024用于对全色原始图像数据进行去马赛克处理,得到全色原始图像;所述彩色处理模块2021用于对所述彩色原始图像进行第一图像处理,得到彩色中间图像;所述全色处理模块2022用于对所述全色原始图像进行第二图像处理,得到全色中间图像;所述融合模块204用于对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像。1 and 2, in some embodiments, the image processor 20 includes a color preprocessing module 2023, a full color preprocessing module 2024, a color processing module 2021, a full color processing module 2022, and a fusion module 204, The image preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing. Wherein: the color preprocessing module 2023 is used to perform pixel complement processing on the color original image data to obtain a color original image; the full color preprocessing module 2024 is used to perform demosaicing processing on the full color original image data to obtain a complete Color original image; the color processing module 2021 is used to perform first image processing on the color original image to obtain a color intermediate image; the full color processing module 2022 is used to perform second image processing on the full color original image , To obtain a panchromatic intermediate image; the fusion module 204 is configured to perform a fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image.

在某些实施方式中,在所述融合模块204对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像之后:所述高动态融合单元50用于将至少两次曝光对应的所述目标图像融合得到高动态的所述目标图像。In some embodiments, after the fusion module 204 performs fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image: the high dynamic fusion unit 50 is configured to combine at least twice The target image corresponding to the exposure is fused to obtain the highly dynamic target image.

请参阅图18,在某些实施方式中,所述高动态融合单元50包括彩色高动态融合单元30和全色高动态融合单元40,在所述彩色预处理模块2023对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前:所述彩色高动态融合单元30用于将至少两次曝光对应的所述彩色原始图像数据融合得到高动态的所述彩色原始图像数据。在所述全色预处理模块2024对全色原始图像数据进行去马赛克处理,得到全色原始图像之前:所述全色高动态融合单元40用于将至少两次曝光对应的所述全色原始图像数据融合得到高动态的所述全色原始图像数据。Referring to FIG. 18, in some embodiments, the high-dynamic fusion unit 50 includes a color high-dynamic fusion unit 30 and a panchromatic high-dynamic fusion unit 40. The color preprocessing module 2023 performs pixel processing on the color original image data. Before the completion process to obtain the color original image: the color high-dynamic fusion unit 30 is configured to fuse the color original image data corresponding to at least two exposures to obtain the high-dynamic color original image data. Before the panchromatic preprocessing module 2024 performs demosaic processing on the panchromatic original image data to obtain a panchromatic original image: the panchromatic high dynamic fusion unit 40 is configured to combine the panchromatic original image data corresponding to at least two exposures. The image data is fused to obtain the high dynamic panchromatic original image data.

在某些实施方式中,所述第一图像处理包括:黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理、去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。所述第二图像处理包括:所述黑电平矫正处理、所述镜头阴影矫正处理、所述坏点补偿处理和所述全局色调映射处理中的一个或多个。In some embodiments, the first image processing includes one of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing. Or more. The second image processing includes: one or more of the black level correction processing, the lens shading correction processing, the dead pixel compensation processing, and the global tone mapping processing.

在某些实施方式中,所述第一图像处理包括第一图像子处理和第二图像子处理,所述彩色处理模2021用于对所述彩色原始图像先进行第一图像子处理,再进行第二图像子处理,所述第一图像子处理包括:黑电平矫正处理、镜头阴影矫正处理和坏点补偿处理中的一个或多个。所述第二图像子处理包括:去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。In some embodiments, the first image processing includes a first image sub-processing and a second image sub-processing, and the color processing module 2021 is used to perform the first image sub-processing on the color original image before performing the first image sub-processing. The second image sub-processing, the first image sub-processing includes: one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing. The second image sub-processing includes: one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.

在某些实施方式中,所述高动态融合单元50用于将至少两次曝光对应的所述目标图像进行亮度对齐处理,以得到亮度对齐的所述目标图像,再融合亮度对齐的所述目标图像及一张或多张所述目标图像以得到所述高动态的所述目标图像。In some embodiments, the high dynamic fusion unit 50 is configured to perform brightness alignment processing on the target image corresponding to at least two exposures to obtain the brightness-aligned target image, and then merge the brightness-aligned target image. An image and one or more of the target image to obtain the highly dynamic target image.

在某些实施方式中,所述彩色高动态融合单元30用于将至少两次曝光对应的所述彩色原始图像数据进行亮度对齐处理,以得到亮度对齐的所述彩色原始图像数据,再融合亮度对齐的所述彩色原始图像数据及一张或多张所述彩色原始图像数据以得到高动态的所述彩色原始图像数据。所 述全色高动态融合单元40用于将至少两次曝光对应的所述全色原始图像数据进行亮度对齐处理,以得到亮度对齐的所述全色原始图像数据,再融合亮度对齐的所述全色原始图像数据及一张或多张所述全色原始图像数据以得到高动态的所述全色原始图像数据。In some embodiments, the color high dynamic fusion unit 30 is configured to perform brightness alignment processing on the color original image data corresponding to at least two exposures to obtain the brightness-aligned color original image data, and then merge the brightness The color original image data and one or more sheets of the color original image data are aligned to obtain the high dynamic color original image data. The panchromatic high dynamic fusion unit 40 is configured to perform brightness alignment processing on the panchromatic original image data corresponding to at least two exposures, so as to obtain the brightness-aligned panchromatic original image data, and then merge the brightness-aligned panchromatic original image data. Full-color original image data and one or more sheets of the full-color original image data to obtain the high-dynamic full-color original image data.

请参阅图1,在某些实施方式中,所述图像处理器20还包括接收单元201及内存单元203。所述接收单元201用于接收所述彩色原始图像数据和所述全色原始图像数据;所述内存单元203用于暂存所述彩色原始图像数据、所述全色原始图像数据、所述彩色原始图像、所述全色原始图像、所述彩色中间图像、所述全色中间图像和所述目标图像中的一个或多个。Please refer to FIG. 1, in some embodiments, the image processor 20 further includes a receiving unit 201 and a memory unit 203. The receiving unit 201 is used to receive the color original image data and the full-color original image data; the memory unit 203 is used to temporarily store the color original image data, the full-color original image data, and the color original image data. One or more of the original image, the full-color original image, the color intermediate image, the full-color intermediate image, and the target image.

在某些实施方式中,图像处理器20包括彩色预处理模块2023和全色预处理模块2024,所述图像预处理包括像素相加处理和去马赛克处理,所述彩色预处理模块2023用于对彩色原始图像数据进行像素相加处理,得到彩色原始图像,所述全色预处理模块2024用于对全色原始图像数据进行去马赛克处理,得到全色原始图像;或者,所述图像预处理包括像素求平均处理和去马赛克处理,所述彩色预处理模,2023用于对彩色原始图像数据进行像素求平均处理,得到彩色原始图像,所述全色预处理模块2024用于对全色原始图像数据进行去马赛克处理,得到全色原始图像。In some embodiments, the image processor 20 includes a color preprocessing module 2023 and a panchromatic preprocessing module 2024. The image preprocessing includes pixel addition processing and demosaicing processing. The color preprocessing module 2023 is used for The color original image data is subjected to pixel addition processing to obtain a color original image, and the panchromatic preprocessing module 2024 is configured to perform demosaicing processing on the panchromatic original image data to obtain a panchromatic original image; or, the image preprocessing includes Pixel averaging processing and demosaicing processing, the color preprocessing module 2023 is used to perform pixel averaging processing on color original image data to obtain a color original image, and the panchromatic preprocessing module 2024 is used to perform pixel averaging processing on the color original image. The data is demosaiced to obtain a full-color original image.

在某些实施方式中,所述高动态融合单元50集成在所述图像传感器10中。In some embodiments, the high dynamic fusion unit 50 is integrated in the image sensor 10.

在某些实施方式中,所述高动态融合单元50集成在所述图像处理器20中。In some embodiments, the high dynamic fusion unit 50 is integrated in the image processor 20.

请参阅图34,本申请提供一种高动态范围图像处理方法。本申请实施方式的高动态范围图像处理方法用于高动态范围图像处理系统100。高动态范围图像处理系统100可以包括图像传感器10。图像传感器10包括像素阵列11。像素阵列11包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列11包括最小重复单元。每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。高动态范围图像处理方法包括:Please refer to FIG. 34. This application provides a high dynamic range image processing method. The high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100. The high dynamic range image processing system 100 may include the image sensor 10. The image sensor 10 includes a pixel array 11. The pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels. The pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. High dynamic range image processing methods include:

01:控制像素阵列11曝光。其中,像素阵列11以第一曝光时间曝光得到第一原始图像。第一原始图像包括以第一曝光时间曝光的单颜色感光像素生成的第一彩色原始图像数据和以第一曝光时间曝光的全色感光像素生成的第一全色原始图像数据。像素阵列11以第二曝光时间曝光得到第二原始图像。第二原始图像包括以第二曝光时间曝光的单颜色感光像素生成的第二彩色原始图像数据和以第二曝光时间曝光的全色感光像素生成的第二全色原始图像数据。其中,第一曝光时间不等于第二曝光时间。和01: Control the exposure of the pixel array 11. Wherein, the pixel array 11 is exposed for the first exposure time to obtain the first original image. The first original image includes first color original image data generated by single-color photosensitive pixels exposed at the first exposure time and first full-color original image data generated by panchromatic photosensitive pixels exposed at the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time. Wherein, the first exposure time is not equal to the second exposure time. with

02:对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。02: Perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.

在某些实施方式中,像素阵列11还可以以第三曝光时间曝光得到第三原始图像。第三原始图像包括以第三曝光时间曝光的单颜色感光像素生成的第三彩色原始图像数据和以第三曝光时间曝光的全色感光像素生成的第三全色原始图像数据。其中,第三曝光时间不等于第一曝光时间,第三曝光时间不等于第二曝光时间。对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像可以包括:对第一原始图像、第二原始图像和第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。In some embodiments, the pixel array 11 may also be exposed for a third exposure time to obtain a third original image. The third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time. Wherein, the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time. Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image may include: performing image preprocessing on the first original image, the second original image, and the third original image. Processing, high dynamic range processing, image processing and fusion algorithm processing to obtain the target image.

在某些实施方式中,图像预处理包括像素补全处理和去马赛克处理,图像处理包括第一图像处理和第二图像处理;对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还可以包括:对彩色原始图像数据进行像素补全处理,得到彩色原始图像;对全色原始图像数据进行去马赛克处理,得到全色原始图像;对彩色原始图像进行第一图像处理,得到彩色中间图像;对全色原始图像进行第二图像处理,得到全色中间图像;对彩色中间图像和全色中间图像进行融合算法处理得到目标图像。In some embodiments, the image preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing; image preprocessing and high dynamics are performed on the first original image and the second original image. The range processing, image processing, and fusion algorithm processing to obtain the target image may also include: performing pixel complement processing on the color original image data to obtain the color original image; performing demosaic processing on the panchromatic original image data to obtain the panchromatic original image; Perform the first image processing on the color original image to obtain a color intermediate image; perform the second image processing on the panchromatic original image to obtain the panchromatic intermediate image; perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image.

在某些实施方式中,在对彩色中间图像和全色中间图像进行融合算法处理得到目标图像之后,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:将至少两次曝光对应的目标图像融合得到高动态的目标图像。In some embodiments, after the color intermediate image and the panchromatic intermediate image are processed by the fusion algorithm to obtain the target image, image preprocessing, high dynamic range processing, image processing, and fusion are performed on the first original image and the second original image. The algorithm processing to obtain the target image also includes: fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image.

在某些实施方式中,在对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:将至少两次曝光对应的彩色原始图像数据融合得到高动态的彩色原始图像数据;在对全色原始图像数据进行去马赛克处理,得到全色原始图像之前,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:将至少两次曝光对应的全色原始图像数据融合得到高动态的全色原始图像数据。In some embodiments, before performing pixel complement processing on the color original image data to obtain the color original image, image preprocessing, high dynamic range processing, image processing, and fusion algorithms are performed on the first original image and the second original image. The processing to obtain the target image also includes: fusing the color original image data corresponding to at least two exposures to obtain highly dynamic color original image data; before demosaicing the panchromatic original image data to obtain the panchromatic original image, perform the first The original image and the second original image are processed by image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing to obtain the target image. It also includes: fusing the panchromatic primitive image data corresponding to at least two exposures to obtain a highly dynamic panchromatic primitive image data.

在某些实施方式中,第一图像处理包括:黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理、去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。第二图像处理包括:黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理和全局色调映射处理中的一个或多个。In some embodiments, the first image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing. indivual. The second image processing includes one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.

在某些实施方式中,第一图像处理包括第一图像子处理和第二图像子处理,彩色处理模块2021用于对彩色原始图像先进行第一图像子处理,再进行第二图像子处理,第一图像子处理包括:黑电平矫正处理、镜头阴影矫正处理和坏点补偿处理中的一个或多个。第二图像子处理包括:去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。In some embodiments, the first image processing includes a first image sub-processing and a second image sub-processing, and the color processing module 2021 is configured to perform the first image sub-processing on the color original image first, and then perform the second image sub-processing, The first image sub-processing includes one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing. The second image sub-processing includes: one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.

在某些实施方式中,将至少两次曝光对应的目标图像融合得到高动态的目标图像包括:将至少两次曝光对应的目标图像进行亮度对齐处理,以得到亮度对齐的目标图像,再融合亮度对齐的目标图像及一张或多张目标图像以得到高动态的目标图像。In some embodiments, fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image includes: subjecting the target images corresponding to the at least two exposures to brightness alignment processing to obtain a target image with aligned brightness, and then fusing the brightness Align the target image and one or more target images to obtain a highly dynamic target image.

在某些实施方式中,将至少两次曝光对应的彩色原始图像数据融合得到高动态的彩色原始图像数据包括:将至少两次曝光对应的彩色原始图像数据进行亮度对齐处理,以得到亮度对齐的彩色原始图像数据,再融合亮度对齐的彩色原始图像数据及一张或多张彩色原始图像数据以得到高动态的彩色原始图像数据。将至少两次曝光对应的全色原始图像数据融合得到高动态的全色原始图像数据包括:将至少两次曝光对应的全色原始图像数据进行亮度对齐处理,以得到亮度对齐的全色原始图像数据,再融合亮度对齐的全色原始图像数据及一张或多张全色原始图像数据以得到高动态的全色原始图像数据。In some embodiments, fusing the color original image data corresponding to at least two exposures to obtain highly dynamic color original image data includes: subjecting the color original image data corresponding to the at least two exposures to brightness alignment processing to obtain brightness alignment. The color original image data is merged with brightness-aligned color original image data and one or more color original image data to obtain highly dynamic color original image data. Fusion of panchromatic original image data corresponding to at least two exposures to obtain highly dynamic panchromatic original image data includes: performing brightness alignment processing on panchromatic original image data corresponding to at least two exposures to obtain a brightness-aligned panchromatic original image Data, and then merge the brightness-aligned panchromatic original image data and one or more panchromatic original image data to obtain highly dynamic panchromatic original image data.

In some embodiments, the high dynamic range image processing method further includes: receiving the color original image data and the panchromatic original image data; and temporarily storing one or more of the color original image data, the panchromatic original image data, the color original image, the panchromatic original image, the color intermediate image, the panchromatic intermediate image, and the target image.

In some embodiments, the image preprocessing includes pixel addition processing and demosaicing processing, and performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel addition processing on the color original image data to obtain the color original image; and performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image. Alternatively, the image preprocessing includes pixel averaging processing and demosaicing processing, and performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes: performing pixel averaging processing on the color original image data to obtain the color original image; and performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image.
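To make the difference between pixel addition and pixel averaging concrete, the sketch below bins each subunit of a single-channel raw plane either by summing or by averaging its pixel values. The 2 x 2 subunit size is only an assumption for illustration.

```python
import numpy as np

def bin_subunits(raw, mode="add", sub=2):
    # Group the raw plane into sub x sub blocks and collapse each block into
    # a single value, either by addition or by averaging.
    h, w = raw.shape
    blocks = raw.reshape(h // sub, sub, w // sub, sub).astype(np.float32)
    if mode == "add":
        return blocks.sum(axis=(1, 3))      # pixel addition processing
    return blocks.mean(axis=(1, 3))         # pixel averaging processing

if __name__ == "__main__":
    raw = np.arange(16, dtype=np.float32).reshape(4, 4)
    print(bin_subunits(raw, "add"))
    print(bin_subunits(raw, "average"))
```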

Referring to FIG. 33, the present application further provides an electronic device 1000. The electronic device 1000 of the embodiments of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments. The lens 300 and the high dynamic range image processing system 100 are combined with the housing 200. The lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.

Referring to FIG. 35, the present application further provides a non-volatile computer-readable storage medium 400 containing a computer program. When the computer program is executed by a processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the above embodiments.

FIG. 2 is a schematic diagram of the image sensor 10 in an embodiment of the present application. The image sensor 10 includes a pixel array 11, a vertical driving unit 12, a control unit 13, a column processing unit 14, and a horizontal driving unit 15.

For example, the image sensor 10 may adopt a complementary metal-oxide semiconductor (CMOS) photosensitive element or a charge-coupled device (CCD) photosensitive element.

For example, the pixel array 11 includes a plurality of photosensitive pixels 110 (shown in FIG. 3) arranged two-dimensionally in an array (that is, arranged in a two-dimensional matrix), and each photosensitive pixel 110 includes a photoelectric conversion element 1111 (shown in FIG. 4). Each photosensitive pixel 110 converts light into electric charge according to the intensity of the light incident on it.

For example, the vertical driving unit 12 includes a shift register and an address decoder. The vertical driving unit 12 provides readout scanning and reset scanning functions. Readout scanning refers to sequentially scanning the unit photosensitive pixels 110 row by row and reading signals from these unit photosensitive pixels 110 row by row. For example, the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14. Reset scanning is used to reset the charge: the photocharge of the photoelectric conversion element is discarded, so that accumulation of new photocharge can begin.

For example, the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing. In CDS processing, the reset level and the signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated. In this way, the signals of the photosensitive pixels 110 in one row are obtained. The column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into a digital format.
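A minimal numerical sketch of the CDS idea, assuming the reset levels and signal levels of one row are available as arrays: the per-pixel output is simply the level difference, which removes the per-pixel reset offset. The sample values below are hypothetical.

```python
import numpy as np

def correlated_double_sampling(reset_level, signal_level):
    # CDS output for one row: the level difference between the reset sample and
    # the signal sample of each photosensitive pixel.
    return reset_level.astype(np.float32) - signal_level.astype(np.float32)

if __name__ == "__main__":
    reset = np.array([512, 515, 510, 508], dtype=np.float32)
    signal = np.array([300, 420, 505, 100], dtype=np.float32)
    print(correlated_double_sampling(reset, signal))
```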

For example, the horizontal driving unit 15 includes a shift register and an address decoder. The horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14 and sequentially output.

For example, the control unit 13 configures timing signals according to the operation mode and uses a variety of timing signals to control the vertical driving unit 12, the column processing unit 14, and the horizontal driving unit 15 to work together.

FIG. 3 is a schematic diagram of a photosensitive pixel 110 in an embodiment of the present application. The photosensitive pixel 110 includes a pixel circuit 111, a filter 112, and a microlens 113. Along the light-receiving direction of the photosensitive pixel 110, the microlens 113, the filter 112, and the pixel circuit 111 are arranged in sequence. The microlens 113 is used to converge light, and the filter 112 is used to pass light of a certain waveband and filter out light of the other wavebands. The pixel circuit 111 is used to convert the received light into an electrical signal and provide the generated electrical signal to the column processing unit 14 shown in FIG. 2.

FIG. 4 is a schematic diagram of a pixel circuit 111 of a photosensitive pixel 110 in an embodiment of the present application. The pixel circuit 111 in FIG. 4 can be applied to each photosensitive pixel 110 (shown in FIG. 3) in the pixel array 11 shown in FIG. 2. The working principle of the pixel circuit 111 is described below with reference to FIGS. 2 to 4.

As shown in FIG. 4, the pixel circuit 111 includes a photoelectric conversion element 1111 (for example, a photodiode), an exposure control circuit (for example, a transfer transistor 1112), a reset circuit (for example, a reset transistor 1113), an amplifying circuit (for example, an amplifying transistor 1114), and a selection circuit (for example, a selection transistor 1115). In the embodiments of the present application, the transfer transistor 1112, the reset transistor 1113, the amplifying transistor 1114, and the selection transistor 1115 are, for example, MOS transistors, but are not limited thereto.

For example, the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected, for example, to ground. The photodiode converts the received light into electric charge. The cathode of the photodiode is connected to a floating diffusion unit FD via the exposure control circuit (for example, the transfer transistor 1112). The floating diffusion unit FD is connected to the gate of the amplifying transistor 1114 and the source of the reset transistor 1113.

For example, the exposure control circuit is the transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112. When a pulse of an active level (for example, the VPIX level) is transmitted to the gate of the transfer transistor 1112 through an exposure control line, the transfer transistor 1112 is turned on. The transfer transistor 1112 transfers the charge photoelectrically converted by the photodiode to the floating diffusion unit FD.

For example, the drain of the reset transistor 1113 is connected to the pixel power supply VPIX, and the source of the reset transistor 1113 is connected to the floating diffusion unit FD. Before the charge is transferred from the photodiode to the floating diffusion unit FD, a pulse of an active reset level is transmitted to the gate of the reset transistor 1113 via a reset line, and the reset transistor 1113 is turned on. The reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.

For example, the gate of the amplifying transistor 1114 is connected to the floating diffusion unit FD, and the drain of the amplifying transistor 1114 is connected to the pixel power supply VPIX. After the floating diffusion unit FD is reset by the reset transistor 1113, the amplifying transistor 1114 outputs the reset level through the output terminal OUT via the selection transistor 1115. After the charge of the photodiode is transferred by the transfer transistor 1112, the amplifying transistor 1114 outputs the signal level through the output terminal OUT via the selection transistor 1115.

For example, the drain of the selection transistor 1115 is connected to the source of the amplifying transistor 1114, and the source of the selection transistor 1115 is connected to the column processing unit 14 in FIG. 2 through the output terminal OUT. When a pulse of an active level is transmitted to the gate of the selection transistor 1115 through a selection line, the selection transistor 1115 is turned on, and the signal output by the amplifying transistor 1114 is transmitted to the column processing unit 14 through the selection transistor 1115.

It should be noted that the pixel structure of the pixel circuit 111 in the embodiments of the present application is not limited to the structure shown in FIG. 4. For example, the pixel circuit 111 may also have a three-transistor pixel structure, in which the functions of the amplifying transistor 1114 and the selection transistor 1115 are performed by one transistor. Likewise, the exposure control circuit is not limited to a single transfer transistor 1112; other electronic devices or structures whose control terminal controls conduction can also serve as the exposure control circuit in the embodiments of the present application. The single transfer transistor 1112 of the embodiments of the present application is simple to implement, low in cost, and easy to control.

FIGS. 5 to 10 are schematic diagrams of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the pixel array 11 (shown in FIG. 2) according to some embodiments of the present application. The photosensitive pixels 110 include two types: panchromatic photosensitive pixels W and color photosensitive pixels. FIGS. 5 to 10 show only the arrangement of the plurality of photosensitive pixels 110 in one minimum repeating unit; the pixel array 11 is formed by replicating the minimum repeating unit shown in FIGS. 5 to 10 multiple times in rows and columns. Each minimum repeating unit is composed of a plurality of panchromatic photosensitive pixels W and a plurality of color photosensitive pixels, and includes a plurality of subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of panchromatic photosensitive pixels W. In the minimum repeating units shown in FIGS. 5 to 8, the panchromatic photosensitive pixels W and the color photosensitive pixels in each subunit are arranged alternately. In the minimum repeating units shown in FIGS. 9 and 10, within each subunit, the photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category, or the photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category.
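As a purely illustrative aid, the sketch below tiles a minimum repeating unit across rows and columns to form a pixel-array layout, as described above. The specific 4 x 4 unit used here is a hypothetical placeholder (the actual arrangements are those shown in FIGS. 5 to 10), with "W" denoting a panchromatic photosensitive pixel and "A", "B", "C" denoting the first-, second-, and third-color photosensitive pixels.

```python
import numpy as np

# Hypothetical 4 x 4 minimum repeating unit with alternating panchromatic (W)
# and single-color (A/B/C) photosensitive pixels; for illustration only.
MIN_REPEAT_UNIT = np.array([
    ["W", "A", "W", "B"],
    ["A", "W", "B", "W"],
    ["W", "B", "W", "C"],
    ["B", "W", "C", "W"],
])

def build_pixel_array(unit, rows, cols):
    # Replicate the minimum repeating unit `rows` times vertically and
    # `cols` times horizontally to form the full pixel-array layout.
    return np.tile(unit, (rows, cols))

if __name__ == "__main__":
    layout = build_pixel_array(MIN_REPEAT_UNIT, rows=2, cols=2)
    print(layout.shape)   # (8, 8)
    print(layout)
```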

Specifically, for example, FIG. 5 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of an embodiment of the present application, where the minimum repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and each subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns. The arrangement is:

Figure PCTCN2021077093-appb-000001

Figure PCTCN2021077093-appb-000002

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

For example, as shown in FIG. 5, in each subunit the panchromatic photosensitive pixels W and the single-color photosensitive pixels are arranged alternately.

For example, as shown in FIG. 5, there are three categories of subunits. The first-category subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-category subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B; and the third-category subunit UC includes a plurality of panchromatic photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit includes four subunits: one first-category subunit UA, two second-category subunits UB, and one third-category subunit UC. One first-category subunit UA and one third-category subunit UC are arranged in a first diagonal direction D1 (for example, the direction connecting the upper-left corner and the lower-right corner in FIG. 5), and the two second-category subunits UB are arranged in a second diagonal direction D2 (for example, the direction connecting the upper-right corner and the lower-left corner in FIG. 5). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal is perpendicular to the second diagonal.

It should be noted that, in other embodiments, the first diagonal direction D1 may also be the direction connecting the upper-right corner and the lower-left corner, and the second diagonal direction D2 may also be the direction connecting the upper-left corner and the lower-right corner. In addition, the "direction" here is not a single pointing; it can be understood as the concept of a "straight line" indicating the arrangement, which may point both ways along the line. The explanations of the first diagonal direction D1 and the second diagonal direction D2 in FIGS. 6 to 10 below are the same as here.

For another example, FIG. 6 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of another embodiment of the present application, where the minimum repeating unit is 36 photosensitive pixels 110 in 6 rows and 6 columns, and each subunit is 9 photosensitive pixels 110 in 3 rows and 3 columns. The arrangement is:

Figure PCTCN2021077093-appb-000003

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

For example, as shown in FIG. 6, in each subunit the panchromatic photosensitive pixels W and the single-color photosensitive pixels are arranged alternately.

For example, as shown in FIG. 6, there are three categories of subunits. The first-category subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-category subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B; and the third-category subunit UC includes a plurality of panchromatic photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit includes four subunits: one first-category subunit UA, two second-category subunits UB, and one third-category subunit UC. One first-category subunit UA and one third-category subunit UC are arranged in the first diagonal direction D1, and the two second-category subunits UB are arranged in the second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal is perpendicular to the second diagonal.

For another example, FIG. 7 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of yet another embodiment of the present application, where the minimum repeating unit is 64 photosensitive pixels 110 in 8 rows and 8 columns, and each subunit is 16 photosensitive pixels 110 in 4 rows and 4 columns. The arrangement is:

Figure PCTCN2021077093-appb-000004

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

For example, as shown in FIG. 7, in each subunit the panchromatic photosensitive pixels W and the single-color photosensitive pixels are arranged alternately.

For example, as shown in FIG. 7, there are three categories of subunits. The first-category subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-category subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B; and the third-category subunit UC includes a plurality of panchromatic photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit includes four subunits: one first-category subunit UA, two second-category subunits UB, and one third-category subunit UC. One first-category subunit UA and one third-category subunit UC are arranged in the first diagonal direction D1, and the two second-category subunits UB are arranged in the second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal is perpendicular to the second diagonal.

Specifically, for example, FIG. 8 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of still another embodiment of the present application, where the minimum repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and each subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns. The arrangement is:

Figure PCTCN2021077093-appb-000005

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

The arrangement of the photosensitive pixels 110 in the minimum repeating unit shown in FIG. 8 is roughly the same as that shown in FIG. 5. The differences are that, in FIG. 8, the alternating order of the panchromatic photosensitive pixels W and the single-color photosensitive pixels in the second-category subunit UB located in the lower-left corner is inconsistent with that of the second-category subunit UB located in the lower-left corner of FIG. 5, and the alternating order of the panchromatic photosensitive pixels W and the single-color photosensitive pixels in the third-category subunit UC of FIG. 8 is inconsistent with that of the third-category subunit UC located in the lower-right corner of FIG. 5. Specifically, in the second-category subunit UB located in the lower-left corner of FIG. 5, the first row of photosensitive pixels 110 alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, second-color photosensitive pixel B), and the second row alternates as single-color photosensitive pixel (that is, second-color photosensitive pixel B), panchromatic photosensitive pixel W; whereas in the second-category subunit UB located in the lower-left corner of FIG. 8, the first row alternates as single-color photosensitive pixel (that is, second-color photosensitive pixel B), panchromatic photosensitive pixel W, and the second row alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, second-color photosensitive pixel B). In the third-category subunit UC located in the lower-right corner of FIG. 5, the first row of photosensitive pixels 110 alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, third-color photosensitive pixel C), and the second row alternates as single-color photosensitive pixel (that is, third-color photosensitive pixel C), panchromatic photosensitive pixel W; whereas in the third-category subunit UC located in the lower-right corner of FIG. 8, the first row alternates as single-color photosensitive pixel (that is, third-color photosensitive pixel C), panchromatic photosensitive pixel W, and the second row alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, third-color photosensitive pixel C).

As shown in FIG. 8, the alternating order of the panchromatic photosensitive pixels W and the single-color photosensitive pixels in the first-category subunit UA of FIG. 8 is inconsistent with that in the third-category subunit UC. Specifically, in the first-category subunit UA shown in FIG. 8, the first row of photosensitive pixels 110 alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, first-color photosensitive pixel A), and the second row alternates as single-color photosensitive pixel (that is, first-color photosensitive pixel A), panchromatic photosensitive pixel W; whereas in the third-category subunit UC shown in FIG. 8, the first row alternates as single-color photosensitive pixel (that is, third-color photosensitive pixel C), panchromatic photosensitive pixel W, and the second row alternates as panchromatic photosensitive pixel W, single-color photosensitive pixel (that is, third-color photosensitive pixel C). That is to say, within the same minimum repeating unit, the alternating order of the panchromatic photosensitive pixels W and the color photosensitive pixels in different subunits may be consistent (as shown in FIG. 5) or inconsistent (as shown in FIG. 8).

For another example, FIG. 9 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of still another embodiment of the present application, where the minimum repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and each subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns. The arrangement is:

Figure PCTCN2021077093-appb-000006

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

For example, as shown in FIG. 9, in each subunit the photosensitive pixels 110 in the same row are photosensitive pixels 110 of the same category. Photosensitive pixels 110 of the same category means that they are: (1) all panchromatic photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; or (4) all third-color photosensitive pixels C.

For example, as shown in FIG. 9, there are three categories of subunits. The first-category subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-category subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B; and the third-category subunit UC includes a plurality of panchromatic photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit includes four subunits: one first-category subunit UA, two second-category subunits UB, and one third-category subunit UC. One first-category subunit UA and one third-category subunit UC are arranged in the first diagonal direction D1, and the two second-category subunits UB are arranged in the second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal is perpendicular to the second diagonal.

For another example, FIG. 10 is a schematic diagram of the arrangement of the photosensitive pixels 110 (shown in FIG. 3) in the minimum repeating unit of still another embodiment of the present application, where the minimum repeating unit is 16 photosensitive pixels 110 in 4 rows and 4 columns, and each subunit is 4 photosensitive pixels 110 in 2 rows and 2 columns. The arrangement is:

Figure PCTCN2021077093-appb-000007

W denotes a panchromatic photosensitive pixel; A denotes a first-color photosensitive pixel among the plurality of color photosensitive pixels; B denotes a second-color photosensitive pixel among the plurality of color photosensitive pixels; and C denotes a third-color photosensitive pixel among the plurality of color photosensitive pixels.

For example, as shown in FIG. 10, in each subunit the photosensitive pixels 110 in the same column are photosensitive pixels 110 of the same category. Photosensitive pixels 110 of the same category means that they are: (1) all panchromatic photosensitive pixels W; (2) all first-color photosensitive pixels A; (3) all second-color photosensitive pixels B; or (4) all third-color photosensitive pixels C.

For example, as shown in FIG. 10, there are three categories of subunits. The first-category subunit UA includes a plurality of panchromatic photosensitive pixels W and a plurality of first-color photosensitive pixels A; the second-category subunit UB includes a plurality of panchromatic photosensitive pixels W and a plurality of second-color photosensitive pixels B; and the third-category subunit UC includes a plurality of panchromatic photosensitive pixels W and a plurality of third-color photosensitive pixels C. Each minimum repeating unit includes four subunits: one first-category subunit UA, two second-category subunits UB, and one third-category subunit UC. One first-category subunit UA and one third-category subunit UC are arranged in the first diagonal direction D1, and the two second-category subunits UB are arranged in the second diagonal direction D2. The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal is perpendicular to the second diagonal.

For example, in other embodiments, within the same minimum repeating unit, the photosensitive pixels 110 in the same row within some of the subunits may be photosensitive pixels 110 of the same category, while the photosensitive pixels 110 in the same column within the remaining subunits are photosensitive pixels 110 of the same category.

For example, in the minimum repeating units shown in FIGS. 5 to 10, the first-color photosensitive pixel A may be a red photosensitive pixel R, the second-color photosensitive pixel B may be a green photosensitive pixel G, and the third-color photosensitive pixel C may be a blue photosensitive pixel Bu.

For example, in the minimum repeating units shown in FIGS. 5 to 10, the first-color photosensitive pixel A may be a red photosensitive pixel R, the second-color photosensitive pixel B may be a yellow photosensitive pixel Y, and the third-color photosensitive pixel C may be a blue photosensitive pixel Bu.

For example, in the minimum repeating units shown in FIGS. 5 to 10, the first-color photosensitive pixel A may be a magenta photosensitive pixel M, the second-color photosensitive pixel B may be a cyan photosensitive pixel Cy, and the third-color photosensitive pixel C may be a yellow photosensitive pixel Y.

It should be noted that, in some embodiments, the response waveband of the panchromatic photosensitive pixel W may be the visible light waveband (for example, 400 nm to 760 nm). For example, the panchromatic photosensitive pixel W is provided with an infrared filter to filter out infrared light. In other embodiments, the response wavebands of the panchromatic photosensitive pixel W are the visible light waveband and the near-infrared waveband (for example, 400 nm to 1000 nm), matching the response waveband of the photoelectric conversion element 1111 (shown in FIG. 4) in the image sensor 10 (shown in FIG. 1). For example, the panchromatic photosensitive pixel W may be provided with no filter, or with a filter that passes light of all wavebands; in that case the response waveband of the panchromatic photosensitive pixel W is determined by, that is, matches, the response waveband of the photoelectric conversion element 1111. The embodiments of the present application include but are not limited to the above waveband ranges.

Referring to FIGS. 1 to 3, 5, 11, and 12, in some embodiments the control unit 13 controls the pixel array 11 to be exposed. The pixel array 11 is exposed for a first exposure time to obtain a first original image. The first original image includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time. The first exposure time is not equal to the second exposure time.

Specifically, the image processor 20 can control the pixel array 11 to perform two exposures. For example, as shown in FIG. 11, in the first exposure the pixel array 11 is exposed for the first exposure time L to obtain the first original image, which includes the first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time L and the first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time L. In the second exposure the pixel array 11 is exposed for the second exposure time S to obtain the second original image, which includes the second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time S and the second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time S.

In some embodiments, the pixel array 11 may also be exposed for a third exposure time to obtain a third original image. The third original image includes third color original image data generated by the single-color photosensitive pixels exposed for the third exposure time and third panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the third exposure time. The third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time. The image processor 20 and the high dynamic fusion unit 50 (which may include a color high dynamic fusion unit 30 and a panchromatic high dynamic fusion unit 40) are configured to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image.

Specifically, referring to FIG. 13, the image processor 20 can control the pixel array 11 to perform three exposures to obtain the first original image, the second original image, and the third original image, respectively. The first original image includes the first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time L and the first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time L. The second original image includes the second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time M and the second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time M. The third original image includes the third color original image data generated by the single-color photosensitive pixels exposed for the third exposure time S and the third panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the third exposure time S.

In other embodiments, the image processor 20 may also control the pixel array 11 to perform more exposures, for example four, five, six, ten, or twenty exposures, so as to obtain more original images. The image processor 20, the color high dynamic fusion unit 30, and the panchromatic high dynamic fusion unit 40 then perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on all the original images to obtain the target image.

It should be noted that, in some embodiments, the exposure process of the pixel array 11 may be as follows: (1) the pixel array 11 is exposed sequentially with at least two exposure times (for example, the first exposure time L and the second exposure time S, or the first exposure time L, the second exposure time M, and the third exposure time S), where the order of the exposures with different exposure times is not limited, and the exposure periods of the at least two exposures do not overlap on the time axis; (2) the pixel array 11 is exposed with at least two exposure times (for example, the first exposure time L and the second exposure time S, or the first exposure time L, the second exposure time M, and the third exposure time S), where the order of the exposures with different exposure times is not limited, and the exposure periods of the at least two exposures partially overlap on the time axis; (3) the exposure periods of all the shorter exposure times lie within the exposure period of the longest exposure time; for example, the exposure period of the second exposure time S lies within the exposure period of the first exposure time L, or, for another example, the exposure periods of the second exposure time M and the third exposure time S both lie within the exposure period of the first exposure time L; (4) all the exposures start at the same moment and end at different moments, or all the exposures end at the same moment and start at different moments. The high dynamic range image processing system 100 of the embodiments of the present application may adopt exposure mode (3) or (4). Such an exposure mode shortens the exposure time required by the pixel array 11 in one shot, which helps to increase the image frame rate and to minimize the interval between the exposure times of the at least two exposures, so that the exposure times of the multiple frames are closer to one another, thereby improving the image quality of the high-dynamic image fused from multiple images with different exposure times.

Specifically, the exposure modes in which the exposure periods of at least two exposures overlap (for example, exposure modes (2), (3), and (4) above) can be implemented, as shown in FIG. 14, by providing a buffer processor 16 in the image sensor 10, with the buffer processor 16 working in cooperation with the control unit and the pixel array 11. Taking exposure mode (4) as an example, referring to FIG. 12, the image sensor 10 controls the pixel array 11 to perform three exposures, with a first exposure time of 1 s, a second exposure time of 1/8 s, and a third exposure time of 1/64 s. The control unit of the image sensor 10 controls the pixel array 11 to output exposure image data with an exposure duration of 1/512 s every 1/512 s and to store it in the buffer processor 16. After receiving the exposure image data, the buffer processor 16 stores it in its internal buffer memory area. After a shot starts, once 8 frames of exposure image data have been accumulated, the accumulated 8 frames are added together and transmitted to the image sensor 10 as the third original image; once 64 frames have been accumulated, the accumulated 64 frames are added together and transmitted to the image sensor 10 as the second original image; and once 512 frames have been accumulated, the accumulated 512 frames are added together and transmitted to the image sensor 10 as the first original image. After 512 frames of exposure data have been received in total, the image sensor 10 ends the exposure of this shot. In the embodiments of the present application, by providing the buffer processor 16 to work in cooperation with the control unit and the pixel array 11, the exposure modes in which the exposure periods of at least two exposures overlap (for example, exposure modes (2), (3), and (4) above) are implemented with simple devices and working logic. This helps to improve the working reliability of the system while shortening the exposure time required by the pixel array 11 in one shot, increasing the image frame rate, and reducing the interval between the exposure times of the at least two exposures, so that the exposure times of the multiple frames are closer to one another, thereby improving the image quality of the high-dynamic image fused from multiple images with different exposure times.
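The accumulation behaviour described above can be sketched as follows: 1/512 s sub-frames are summed as they arrive, and the running sums after 8, 64, and 512 sub-frames are emitted as the third, second, and first original images, respectively. This is only a schematic software model of the described logic, with hypothetical function and key names, not an implementation of the buffer processor 16 hardware.

```python
import numpy as np

def accumulate_exposures(sub_frames):
    # `sub_frames` is an iterable of 1/512 s exposure frames (2-D arrays).
    # Returns the three original images built from 8, 64, and 512 sub-frames.
    running_sum = None
    outputs = {}
    for count, frame in enumerate(sub_frames, start=1):
        frame = frame.astype(np.float64)
        running_sum = frame if running_sum is None else running_sum + frame
        if count == 8:
            outputs["third_original"] = running_sum.copy()   # 1/64 s total
        elif count == 64:
            outputs["second_original"] = running_sum.copy()  # 1/8 s total
        elif count == 512:
            outputs["first_original"] = running_sum.copy()   # 1 s total
            break                                            # exposure ends
    return outputs

if __name__ == "__main__":
    frames = (np.random.poisson(2.0, size=(4, 4)) for _ in range(512))
    images = accumulate_exposures(frames)
    print(sorted(images.keys()))
```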

Referring to FIG. 1, the image processor 20 may include a color preprocessing module 2023, a panchromatic preprocessing module 2024, a color processing module 2021, a panchromatic processing module 2022, and a fusion module 204. The image preprocessing may include pixel completion processing and demosaicing processing. The image processing includes the first image processing and the second image processing. The color preprocessing module 2023 may be configured to perform pixel completion processing on the color original image data to obtain the color original image. The panchromatic preprocessing module 2024 may be configured to perform demosaicing processing on the panchromatic original image data to obtain the panchromatic original image. The color processing module 2021 may be configured to perform the first image processing on the color original image to obtain a color intermediate image. The panchromatic processing module 2022 may be configured to perform the second image processing on the panchromatic original image to obtain a panchromatic intermediate image. The fusion module 204 may be configured to perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image. In some embodiments, the image processor 20 further includes an image front-end processing unit 202, and the color preprocessing module 2023, the panchromatic preprocessing module 2024, the color processing module 2021, and the panchromatic processing module 2022 may be integrated in the image front-end processing unit 202.

The specific operation by which the panchromatic preprocessing module 2024 demosaics the panchromatic original image data is similar to the demosaicing of the first color original image and the second color original image in the embodiments described later in this application, and is described in detail there.

请参阅图15、图16和图17,彩色预处理模块2023对彩色原始图像数据进行像素补全处理的具体操作过程可以包括如下步骤:(1)将彩色原始图像数据分解成第一颜色原始图像数据(由上文所述的第一颜色感光像素A生成的原始图像数据)、第二颜色原始图像数据(由上文所述的第二颜色感光像素B生成的原始图像数据)和第三颜色原始图像数据(由上文所述的第三颜色感光像素C生成的原始图像数据)。(2)将第一颜色原始图像数据中子单元的多个第一颜色感光像素A生成的像素值进行求平均值运算,求得平均值后将子单元范围的像素格融合成一个像素格,并将该平均值填入该像素格中,得到第一颜色中间图像数据。(3)对第一颜色中间图像数据利用双线性插值方法进行插值,得到第一颜色插值图像数据。双线性插值的具体操作方式下文中会详细阐述。(4)将第一颜色插值图像数据和第一颜色原始图像数据融合得到第一颜色原始图像。(5)将第一颜色原始图像数据、第二颜色原始图像数据和第三颜色原始图像数据,均进行以上(2)、(3)和(4)步骤后,将所得的具有一个颜色通道的第一颜色原始图像、第二颜色原始图像和第三颜色原始图像合成为具有三个颜色通道的分辨率和彩色原始图像的分辨率相同的彩色原始图像。彩色预处理模块2023可以对至少两次曝光对应的所有彩色原始图像数据均进行以上步骤的像素补全处理,从而完成对所有彩色原始图像数据的像素补全处理,得到至少两次曝光对应的彩色原始图像。具体地,请结合图15、图16和图17,下面以彩色预处理模块2023对第一彩色原始图像数据中的第一红色原始图像数据进行像素补全处理为例进行说明。如图15所示,彩色预处理模块2023先将彩色原始图像(可以为第一彩色原始图像、第二彩色原始图像或第三彩色原始图像等)数据分解成红色原始图像数据、绿色原始图像数据和蓝色原始图像数据。如图16所示,彩色预处理模块2023再将红色原始图像数据中子单元的多个红颜色感光像素R生成的像素值(例如L1和L2)进行求平均值运算,求得平均值L1’=(L1+L2)/2后将子单元范围的像素格融合成一个像素格,并将该平均值填入该像素格中,得到红色中间图像数据。然后,彩色预处理模块2023对红色中间图像数据利用双线性插值方法进行插值,得到红色插值图像数据。接着,彩色预处理模块2023将红色插值图像数据和红色原始图像数据融合得到红色原始图像。融合过程中,首先,彩色预处理模块2023生成一张分辨率与红色原始图像数据相同,最小重复单元中的像素颜色排列方式与红色插值图像数据的空值图像,然后根据以下原则进行融合:(1)若在第一红色原始图像数据的相同坐标中具有像素值,并且颜色通道相同,则直接将第一红色原始图像数据的相同坐标中的像素值填入空值图像;(2)若在第一红色原始图像数据的相同坐标中具有像素值,但颜色通道不同,则将第一红色插值图像数据的相应坐标中的像素值填入空值图像;(3)若在第一红色原始图像数据的相同坐标中不具有像素值,则将第一红色插值图像数据的相应坐标中的像素值填入空值图像。根据以上融合原则,如图16所示,将得到红色原始图像。与此相似地,如图17所示,彩色预处理模块2023可以得到红色原始图像、绿色原始图像和蓝色原始图像,并将所得的具有一个颜色通道的红色原始图像、绿色原始图像和蓝色原始图像合成为具有3个颜色通道的彩色原始图像。彩色预处理模块2023可以对第一彩色原始图像数据和第二彩色原始图像数据(或者第一彩色原始图数据、第二彩色原始图像数据和第三彩色原始图像数据)均进行以上步骤的像素补全处理,从而完成对彩色原始图像数据的像素补全处理,得到第一彩色原始图像和第二彩色原始图像(或者第一彩色原始图、第二彩色原始图像和第三彩色原始图像)。本申请实施方式的高动态范围图像处理系统100对部分像素格中的彩色信息缺失并且具有彩色信息的像素格仅具有单颜色通道信息的彩色原始图像数据进行像素补全处理,能在不损失分辨率的情况下,得到具有完整的像素格的完整通道的彩色信息,进而得到彩色原始图像,以便后续对图像继续进行其他图像处理,提高成像质量。Referring to FIG. 15, FIG. 16, and FIG. 17, the specific operation process of the color preprocessing module 2023 performing pixel complement processing on the color original image data may include the following steps: (1) Decompose the color original image data into the first color original image Data (the original image data generated by the first color photosensitive pixel A described above), the second color original image data (the original image data generated by the second color photosensitive pixel B described above), and the third color Original image data (original image data generated by the third color photosensitive pixel C described above). (2) Perform an averaging operation on the pixel values generated by the multiple first-color photosensitive pixels A of the sub-unit in the original image data of the first color, and after the average value is obtained, the pixel grids in the sub-unit range are merged into a pixel grid. The average value is filled into the pixel grid to obtain the first color intermediate image data. (3) Interpolate the first color intermediate image data by using a bilinear interpolation method to obtain the first color interpolated image data. The specific operation method of bilinear interpolation will be explained in detail below. (4) Fusion of the first color interpolated image data and the first color original image data to obtain the first color original image. 
(5) After performing the above steps (2), (3) and (4) on the original image data of the first color, the original image data of the second color, and the original image data of the third color, the resulting image with one color channel The original image of the first color, the original image of the second color, and the original image of the third color are synthesized into a color original image with the same resolution of the three color channels and the same resolution of the color original image. The color preprocessing module 2023 can perform the pixel complement processing of the above steps on all color original image data corresponding to at least two exposures, thereby completing the pixel complement processing of all color original image data, and obtain the color corresponding to at least two exposures. The original image. Specifically, referring to FIG. 15, FIG. 16, and FIG. 17, the color preprocessing module 2023 performs pixel complement processing on the first red original image data in the first color original image data as an example for description. As shown in Figure 15, the color preprocessing module 2023 first decomposes the color original image (which can be the first color original image, the second color original image, the third color original image, etc.) data into red original image data and green original image data. And blue raw image data. As shown in FIG. 16, the color preprocessing module 2023 then performs an averaging operation on the pixel values (such as L1 and L2) generated by the multiple red photosensitive pixels R in the subunit of the red original image data to obtain the average value L1' =(L1+L2)/2 After fusing the pixel grid of the sub-unit range into a pixel grid, and filling the average value into the pixel grid, the red intermediate image data is obtained. Then, the color preprocessing module 2023 uses a bilinear interpolation method to interpolate the red intermediate image data to obtain red interpolated image data. Next, the color preprocessing module 2023 fuses the red interpolated image data and the red original image data to obtain a red original image. In the fusion process, first, the color preprocessing module 2023 generates a null image with the same resolution as the red original image data, the pixel color arrangement in the smallest repeating unit and the red interpolated image data, and then the fusion is performed according to the following principles: ( 1) If there are pixel values in the same coordinates of the first red original image data, and the color channels are the same, then directly fill the pixel values in the same coordinates of the first red original image data into the empty image; (2) If in The first red original image data has pixel values in the same coordinates but different color channels, then the pixel values in the corresponding coordinates of the first red interpolated image data are filled into the empty image; (3) If the first red original image is If there is no pixel value in the same coordinate of the data, then the pixel value in the corresponding coordinate of the first red interpolated image data is filled into the empty image. According to the above fusion principle, as shown in Figure 16, a red original image will be obtained. Similarly, as shown in FIG. 17, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and combine the resulting red original image, green original image, and blue original image with one color channel. 
The original image is synthesized into a color original image with 3 color channels. The color preprocessing module 2023 can perform the pixel compensation of the above steps on the first color original image data and the second color original image data (or the first color original image data, the second color original image data, and the third color original image data). Full processing, thereby completing the pixel complement processing of the color original image data to obtain the first color original image and the second color original image (or the first color original image, the second color original image, and the third color original image). The high dynamic range image processing system 100 of the embodiment of the present application performs pixel complement processing on the color information in some pixel grids where the color information is missing and the pixel grids with color information only have color original image data with single color channel information. In the case of high rate, the color information of the complete channel with the complete pixel grid is obtained, and then the color original image is obtained, so that other image processing can be continued on the image subsequently to improve the imaging quality.
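The pixel-completion flow described above (split the colour raw data into per-channel planes, average the same-colour samples inside each sub-unit, bilinearly interpolate the resulting intermediate data back to full resolution, then fuse so that real samples are kept and only the empty cells take interpolated values) can be illustrated with the following minimal Python/NumPy sketch. It is not the colour preprocessing module 2023 itself; the 2×2 sub-unit size, the validity-mask representation of empty pixel cells, and all function names are assumptions made only for illustration.

import numpy as np

def bilinear_resize(img, out_h, out_w):
    # plain bilinear resampling of a 2-D array to (out_h, out_w)
    h, w = img.shape
    ys = np.linspace(0.0, h - 1.0, out_h)
    xs = np.linspace(0.0, w - 1.0, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def complete_channel(plane, valid, sub=2):
    # plane: full-resolution single-channel raw data; valid: True where this channel was actually sampled
    H, W = plane.shape
    v = np.where(valid, plane.astype(float), 0.0).reshape(H // sub, sub, W // sub, sub)
    n = valid.reshape(H // sub, sub, W // sub, sub).sum(axis=(1, 3))
    intermediate = v.sum(axis=(1, 3)) / np.maximum(n, 1)      # step (2): sub-unit average, one cell per sub-unit
    interpolated = bilinear_resize(intermediate, H, W)        # step (3): bilinear interpolation back to full size
    return np.where(valid, plane, interpolated)               # step (4): keep real samples, fill only empty cells

# step (5): repeat per channel and stack into a 3-channel colour original image, e.g.
# color_original = np.stack([complete_channel(p, m) for p, m in zip(planes, masks)], axis=-1)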

请参阅图1，在某些实施方式中，在融合模块204对彩色中间图像和全色中间图像进行融合算法处理得到目标图像之后，高动态融合单元50可以将至少两次曝光对应的目标图像（可以包括第一目标图像和第二目标图像）融合得到高动态的目标图像。Referring to FIG. 1, in some embodiments, after the fusion module 204 performs fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target images, the high dynamic fusion unit 50 may fuse the target images corresponding to at least two exposures (which may include the first target image and the second target image) to obtain a high-dynamic target image.

请参阅图18,在另一些实施方式中,高动态融合单元50可以包括彩色高动态融合单元30和 全色高动态融合单元40。在彩色预处理模块2023对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前,彩色高动态融合单元30可以将至少两次曝光对应的彩色原始图像数据融合得到高动态的彩色原始图像数据。在全色预处理模块2024对全色原始图像数据进行去马赛克处理,得到全色原始图像之前,全色高动态融合单元40用于将至少两次曝光对应的全色原始图像数据融合得到高动态的全色原始图像数据。Referring to FIG. 18, in other embodiments, the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40. Before the color preprocessing module 2023 performs pixel complement processing on the color original image data to obtain the color original image, the color high dynamic fusion unit 30 may fuse the color original image data corresponding to at least two exposures to obtain high dynamic color original image data . Before the panchromatic preprocessing module 2024 performs demosaic processing on the panchromatic original image data to obtain the panchromatic original image, the panchromatic high dynamic fusion unit 40 is used to fuse the panchromatic original image data corresponding to at least two exposures to obtain a high dynamic The full-color original image data.

请参阅图19,在又一些实施方式中,高动态融合单元50可以包括彩色高动态融合单元30和全色高动态融合单元40。在彩色处理模块2021对彩色原始图像进行第一图像处理得到彩色中间图像之前,彩色高动态融合单元30可以将至少两次曝光对应的彩色原始图像融合得到高动态的彩色原始图像。在全色处理模块2022对全色原始图像进行第二图像处理得到全色中间图像之前,全色高动态融合单元40可以将至少两次曝光对应的全色原始图像融合得到高动态的全色原始图像。Referring to FIG. 19, in still other embodiments, the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40. Before the color processing module 2021 performs first image processing on the color original image to obtain a color intermediate image, the color high dynamic fusion unit 30 may fuse the color original images corresponding to at least two exposures to obtain a high dynamic color original image. Before the panchromatic processing module 2022 performs the second image processing on the panchromatic original image to obtain a panchromatic intermediate image, the panchromatic high dynamic fusion unit 40 may fuse the panchromatic original images corresponding to at least two exposures to obtain a highly dynamic panchromatic original image. image.

请参阅图20,在再一些实施方式中,高动态融合单元50可以包括彩色高动态融合单元30和全色高动态融合单元40。在融合模块204用于对彩色中间图像和全色中间图像进行融合算法处理得到目标图像之前,彩色高动态融合单元30可以将至少两次曝光对应的彩色中间图像融合得到高动态的彩色中间图像,全色高动态融合单元40可以将至少两次曝光对应的全色中间图像融合得到高动态的全色中间图像。Referring to FIG. 20, in still other embodiments, the high-dynamic fusion unit 50 may include a color high-dynamic fusion unit 30 and a full-color high-dynamic fusion unit 40. Before the fusion module 204 is used to perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image, the color high dynamic fusion unit 30 may fuse the color intermediate images corresponding to at least two exposures to obtain a high dynamic color intermediate image, The panchromatic high dynamic fusion unit 40 can merge panchromatic intermediate images corresponding to at least two exposures to obtain a highly dynamic panchromatic intermediate image.

彩色处理模块2021中，第一图像处理可以包括：黑电平矫正处理、镜头阴影矫正处理、去马赛克处理、坏点补偿处理、色彩矫正处理、全局色调映射处理、色彩转换处理中的一个或多个；全色处理模块2022中，第二图像处理可以包括：黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理、全局色调映射处理中的一个或多个。In the color processing module 2021, the first image processing may include one or more of black level correction processing, lens shading correction processing, demosaicing processing, dead pixel compensation processing, color correction processing, global tone mapping processing, and color conversion processing; in the panchromatic processing module 2022, the second image processing may include one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.

具体地,第一图像处理可以包括第一图像子处理和第二图像子处理。彩色处理模块2021可以对彩色原始图像先进行第一图像子处理,再进行第二图像子处理。其中,第一图像子处理可以包括黑电平矫正处理、镜头阴影矫正处理和坏点补偿处理中的一个或多个。第二图像子处理可以包括去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。Specifically, the first image processing may include a first image sub-processing and a second image sub-processing. The color processing module 2021 may first perform the first image sub-processing on the color original image, and then perform the second image sub-processing. The first image sub-processing may include one or more of black level correction processing, lens shading correction processing, and dead pixel compensation processing. The second image sub-processing may include one or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.

由于图像传感器采集的信息经过一系列转换生成原始图像。以8bit数据为例,单个像素的有效值是0~255,但是实际图像传感器中的模数转换芯片的精度可能无法将电压值很小的一部分转换出来,便容易造成生成图像的暗部细节的损失。黑电平矫正处理的过程可以是,彩色处理模块2021或全色处理模块2022在图像传感器10输出的原始图像数据的基础上,将每个像素值减去一个固定值。各颜色通道(例如红色通道、绿色通道、蓝色通道和全色通道,在某些实施例中,红色通道指的是图像传感器10输出的图像中由红色感光像素生成的红色信息,绿色通道指的是图像传感器10输出的图像中由绿色感光像素生成的绿色信息,红色通道指的是图像传感器10输出的图像中由蓝色感光像素生成的蓝色信息,全色通道指的是图像传感器10输出的图像中由全色感光像素生成的全色信息)对应的固定值可以是一样,也可以是不一样。具体地,请参阅图20,以图形传感器控制像素阵列11进行两次曝光(可以为两次或两次以上的次数)为例进行说明,图像传感器10可以输出第一彩色原始图像数据、第二彩色原始图像数据、第一全色原始图像数据和第二全色原始图像数据,图像处理器20接收第一彩色原始图像数据、第二彩色原始图像数据、第一全色原始图像数据和第二全色原始图像数据后,彩色预处理模块2023对第一彩色原始图像数据和第二彩色原始图像数据进行像素补全处理得到第一彩色原始图像和第二彩色原始图像,彩色处理模块2021对第一彩色原始图像和第二彩色原始图像进行第一图像处理中的黑电平矫正处理;全色预处理模块2024对第一全色原始图像数据和第二全色原始图像数据进行去马赛克处理得到第一全色原始图像和第二全色原始图像,全色处理模块2022对第一全色原始图像和第二全色原始图像进行第二图像处理中的黑电平矫正处理。以彩色处理模块2021对第一彩色原始图像进行黑电平矫正处理为例,第一彩色原始图像中具有红色通道、绿色通道和蓝色通道。请参阅图21,彩色处理模块2021对第一彩色原始图像进行黑电平矫正处理,第一彩色原始图像中所有的像素值均减去固定值5,从而得到经过黑电平矫正处理的第一彩色原始图像。同时图像传感器10在AD的输入之前加上一个固定的偏移量5(或者其他数值),使输出的像素值在5(或者其他数值)~255之间,配合黑电平矫正处理,能使得本申请实施方式的图像传感器10和高动态范围图像处理系统100得到的图像的暗部的细节完全保留的同时,不增大或减小图像的像素值,有利于提高成像质量。Because the information collected by the image sensor undergoes a series of conversions to generate the original image. Taking 8bit data as an example, the effective value of a single pixel is 0-255, but the accuracy of the analog-to-digital conversion chip in the actual image sensor may not be able to convert a small part of the voltage value, which will easily cause the loss of the dark details of the generated image . The process of black level correction processing may be that the color processing module 2021 or the panchromatic processing module 2022 subtracts a fixed value from each pixel value on the basis of the original image data output by the image sensor 10. Each color channel (such as a red channel, a green channel, a blue channel, and a panchromatic channel. In some embodiments, the red channel refers to the red information generated by the red photosensitive pixels in the image output by the image sensor 10, and the green channel refers to Is the green information generated by the green photosensitive pixels in the image output by the image sensor 10. The red channel refers to the blue information generated by the blue photosensitive pixels in the image output by the image sensor 10, and the panchromatic channel refers to the image sensor 10. The fixed value corresponding to the full-color information generated by the full-color photosensitive pixel in the output image may be the same or different. Specifically, referring to FIG. 20, the image sensor 10 controls the pixel array 11 to perform two exposures (which may be two or more times) as an example for description. The image sensor 10 can output the first color original image data and the second Color original image data, first full-color original image data, and second full-color original image data. The image processor 20 receives the first color original image data, the second color original image data, the first full-color original image data, and the second full-color original image data. After the full-color original image data, the color preprocessing module 2023 performs pixel complement processing on the first color original image data and the second color original image data to obtain the first color original image and the second color original image. 
A color original image and a second color original image are subjected to the black level correction processing in the first image processing; the panchromatic preprocessing module 2024 performs demosaicing processing on the first panchromatic original image data and the second panchromatic original image data. For the first full-color original image and the second full-color original image, the full-color processing module 2022 performs black level correction processing in the second image processing on the first full-color original image and the second full-color original image. Taking the color processing module 2021 performing black level correction processing on the first color original image as an example, the first color original image has a red channel, a green channel, and a blue channel. Please refer to FIG. 21. The color processing module 2021 performs black level correction processing on the first color original image. All pixel values in the first color original image are subtracted from a fixed value of 5, thereby obtaining the first black level correction processing. Color original image. At the same time, the image sensor 10 adds a fixed offset of 5 (or other values) before the input of AD, so that the output pixel value is between 5 (or other values) to 255. With the black level correction processing, it can make While the details of the dark parts of the image obtained by the image sensor 10 and the high dynamic range image processing system 100 of the embodiment of the present application are completely preserved, the pixel value of the image is not increased or decreased, which is beneficial to improving the imaging quality.
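As a concrete illustration of the black level correction described above, the following short Python/NumPy sketch subtracts a preset offset from every pixel value. The per-channel offset of 5 is taken from the example in the text; the clipping to the 8-bit range and the function name are assumptions of this sketch rather than something stated in the embodiment.

import numpy as np

def black_level_correct(img, offset=5):
    # img: raw image values (e.g. 8-bit); offset: preset black level, possibly different per colour channel
    corrected = img.astype(np.int32) - offset
    return np.clip(corrected, 0, 255).astype(np.uint8)

# example: the same call can be applied per channel with different preset offsets,
# e.g. corrected_first_color = black_level_correct(first_color_original, offset=5)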

镜头阴影是由于镜头对于光学折射不均匀导致的镜头周围出现阴影的情况,即影像区的中心和四周的接收到的光强程度不一致的现象。镜头阴影矫正处理的过程可以是,彩色处理模块2021或全色处理模块2022可以在经过黑电平矫正处理的彩色原始图像和全色原始图像的基础上,将被处理图像进行网格划分,再通过各网格区域邻近的或者自身及邻近周的补偿系数,采用双线性插值方法对图像进行镜头阴影矫正。下文以对第一彩色原始图像进行镜头阴影矫正处理为例进行说明, 如图22所示,彩色处理模块2021将第一彩色原始图像(即被处理图像)进行划分,均等地分为十六个网格,十六个网格中每个网格具有一预设好的补偿系数。然后,彩色处理模块2021根据各网格区域邻近的或者自身及其邻近的补偿系数通过双线性插值方法对图像进行阴影矫正。R2为图示的经过镜头阴影矫正处理的第一彩色中间图像中虚线框内的像素值,R1为图示的第一彩色原始图像中的虚线框内的像素值。R2=R1*k1,k1由R1像素邻近的网格的补偿系数1.10、1.04、1.105和1.09进行双线性插值获得。设图像的坐标记为(x,y),x从左第一个像素开始往右计数,y从上第一个像素开始往下计数,x和y均为自然数,如图像边上的标识所示。例如,R1的坐标为(3,3),则R1在各网格补偿系数图中的坐标应为(0.75,0.75)。f(x,y)表示各网格补偿系数图中坐标为(x,y)的补偿值。则f(0.75,j0.75)为R1在各网格补偿系数图中对应的补偿系数值。双线性插值的插值公式可以为f(i+u,j+v)=(1-u)(1-v)f(i,j)+(1-u)vf(i,j+1)+u(1-v)f(i+1,j)+uvf(i+1,j+1),其中,x=i+u,i为x的整数部分,u为x的小数部分,j为y的整数部分,v为y的小数部分。则有f(0.75,j0.75)=(0.25)*(0.25)*f(0,0)+0.25*0.75*f(0,1)+0.75*0.25*f(1,0)+0.75*0.75f(1,1)=0.0625*1.11+0.1875*1.10+0.1875*1.09+0.5625*1.03。各网格的补偿系数在彩色处理模块2021或全色处理模块2022进行镜头阴影矫正处理之前已经预先设置。各网格的补偿系数可由如下方法确定:(1)将镜头300置于光线强度和色温恒定且均一的密闭装置内,并使镜头300在该密闭装置内正对亮度分布均匀的纯灰色的目标对象拍摄得到灰度图像;(2)将灰度图像进行网格划分(例如划分为16个网格),得到划分为不同网格区域的灰度图像;(3)计算灰度图像的不同网格区域的补偿系数。确定了镜头300的补偿系数之后,本申请的高动态范围图像处理系统100将该补偿系数预先设置在彩色处理模块2021或全色处理模块2022中,当高动态范围图像处理系统100中的彩色处理模块2021或全色处理模块2022对图像进行镜头阴影矫正处理时,该补偿系数被获取,彩色处理模块2021或全色处理模块2022再根据各网格区域的补偿系数,采用双线性插值方法对图像进行镜头阴影矫正处理。Lens shadow is the phenomenon that the lens has a shadow around the lens caused by the uneven optical refraction of the lens, that is, the intensity of the received light in the center and the surrounding area of the image area is inconsistent. The process of lens shading correction processing can be that the color processing module 2021 or panchromatic processing module 2022 can mesh the processed image on the basis of the color original image and the panchromatic original image that have undergone black level correction processing, and then The lens shading correction is performed on the image by the bilinear interpolation method through the compensation coefficients of each grid area adjacent or itself and adjacent circumferences. The following takes the lens shading correction processing on the first color original image as an example for description. As shown in FIG. 22, the color processing module 2021 divides the first color original image (that is, the processed image) into sixteen equally Grid, each of the sixteen grids has a preset compensation coefficient. Then, the color processing module 2021 performs shading correction on the image by a bilinear interpolation method according to the compensation coefficients adjacent to each grid area or itself and its vicinity. R2 is the pixel value in the dashed frame in the first color intermediate image that has undergone lens shading correction processing, and R1 is the pixel value in the dashed frame in the first color original image shown in the figure. R2=R1*k1, k1 is obtained by bilinear interpolation of the compensation coefficients 1.10, 1.04, 1.105, and 1.09 of the grid adjacent to the R1 pixel. Suppose the coordinates of the image are (x, y), x is counted from the first pixel from the left to the right, y is counted from the first pixel on the top, and both x and y are natural numbers, as indicated by the logo on the edge of the image Show. For example, if the coordinates of R1 are (3,3), the coordinates of R1 in each grid compensation coefficient map should be (0.75,0.75). f(x, y) represents the compensation value of the coordinate (x, y) in each grid compensation coefficient graph. 
Then f(0.75, j0.75) is the compensation coefficient value corresponding to R1 in each grid compensation coefficient graph. The interpolation formula of bilinear interpolation can be f(i+u,j+v)=(1-u)(1-v)f(i,j)+(1-u)vf(i,j+1) +u(1-v)f(i+1,j)+uvf(i+1,j+1), where x=i+u, i is the integer part of x, u is the fractional part of x, j Is the integer part of y, and v is the decimal part of y. Then f(0.75,j0.75)=(0.25)*(0.25)*f(0,0)+0.25*0.75*f(0,1)+0.75*0.25*f(1,0)+0.75* 0.75f(1,1)=0.0625*1.11+0.1875*1.10+0.1875*1.09+0.5625*1.03. The compensation coefficient of each grid has been preset before the color processing module 2021 or the panchromatic processing module 2022 performs lens shading correction processing. The compensation coefficient of each grid can be determined by the following methods: (1) Place the lens 300 in a closed device with constant and uniform light intensity and color temperature, and make the lens 300 face a pure gray target with uniform brightness distribution in the closed device The object is shot to obtain a grayscale image; (2) The grayscale image is gridded (for example, divided into 16 grids) to obtain the grayscale image divided into different grid areas; (3) The different grids of the grayscale image are calculated The compensation coefficient of the grid area. After determining the compensation coefficient of the lens 300, the high dynamic range image processing system 100 of the present application presets the compensation coefficient in the color processing module 2021 or the panchromatic processing module 2022. When the color processing in the high dynamic range image processing system 100 When the module 2021 or the panchromatic processing module 2022 performs lens shading correction processing on the image, the compensation coefficient is obtained, and the color processing module 2021 or the panchromatic processing module 2022 then uses the bilinear interpolation method to perform the correction according to the compensation coefficient of each grid area The image undergoes lens shading correction processing.
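The grid-based lens shading correction above can be sketched as follows. The code mirrors the bilinear formula f(i+u, j+v) = (1-u)(1-v)f(i,j) + (1-u)v·f(i,j+1) + u(1-v)f(i+1,j) + uv·f(i+1,j+1); the way a pixel coordinate is mapped into the coefficient-grid coordinate system (a simple linear scaling, consistent with the example in which pixel (3,3) corresponds to grid coordinate (0.75, 0.75) for a 4×4 grid over a 16-pixel-wide image) and the 4×4 grid size are assumptions for illustration only.

import numpy as np

def grid_gain(gains, gy, gx):
    # bilinear interpolation of the preset compensation coefficients at grid coordinate (gy, gx)
    i, j = int(gy), int(gx)
    u, v = gy - i, gx - j
    i1, j1 = min(i + 1, gains.shape[0] - 1), min(j + 1, gains.shape[1] - 1)
    return ((1 - u) * (1 - v) * gains[i, j] + (1 - u) * v * gains[i, j1]
            + u * (1 - v) * gains[i1, j] + u * v * gains[i1, j1])

def lens_shading_correct(img, gains):
    # img: single-channel image; gains: preset compensation coefficients, e.g. a 4x4 grid of 16 values
    H, W = img.shape
    gh, gw = gains.shape
    out = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            gy = y * gh / H                                    # map the pixel position into the coefficient grid
            gx = x * gw / W
            out[y, x] = img[y, x] * grid_gain(gains, gy, gx)   # R2 = R1 * k1
    return out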

图像传感器的像素阵列上的感光像素可能存在工艺上的缺陷,或光信号进行转化为电信号的过程中出现错误,从而造成图像上像素信息错误,导致图像中的像素值不准确,这些有缺陷的像素表现在输出的图像上即为图像坏点。图像坏点可能存在,因此需要对图像进行坏点补偿处理。坏点补偿处理可以包括如下步骤:(1)以待检测像素点为中心像素点建立相同颜色的感光像素的像素点的3×3像素矩阵;(2)以所述中心像素点的周围像素点为参考点,判断所述中心像素点的色值与所述周围像素点的差值是否均大于第一阈值,如果是,则该中心像素点为坏点,如果否,则该中心像素点为正常点;(3)对判定为坏点的中心像素点进行双线性插值得到校正后的像素值。请参阅图23,下面以对第一全色原始图像进行坏点补偿处理进行说明,图23中的第一张图中的R1为待检测像素点,彩色处理模块2021以R1为中心像素点建立与R1的感光像素相同颜色的像素点的3×3像素矩阵,得到图23中的第二张图。并以中心像素点R1的所述周围像素点为参考点,判断中心像素点R1的色值与所述周围像素点的差值是否均大于第一阈值Q(Q在彩色处理模块2021中预设)。如果是,则该中心像素点R1为坏点,如果否,则该中心像素点R1为正常点。如果R1是坏点,则对R1进行双线性插值得到校正后的像素值R1’(图中展示的为R1是坏点的情况)得到图23中的第三张图。请参阅图24,以全色处理模块2022对经过镜头阴影矫正处理的第一全色原始图像进行坏点补偿处理进行说明。图24中的第一张图中的W1为待检测像素点,全色处理模块2022以W1为中心像素点建立与W1的感光像素相同颜色的像素点的3×3像素矩阵,得到图24中的第二张图。并以中心像素点W1的周围像素点为参考点,判断中心像素点W1的色值与所述周围像素点的差值是否均大于第一阈值K(K在全色处理模块2022中预设)。如果是,则该中心像素点W1为坏点,如果否,则该中心像素点W1为正常点。如果W1是坏点,则对W1进行双线性插值得到校正后的像素值W1’(图中展示的为W1是坏点的情况)得到图24中的第三张图。本申请实施方式的彩色处理模块2021和全色处理模块2022可以对图像进行坏点补偿处理,有利于高动态范围图像处理系统100消除高动态范围图像处理系统100的成像过程中,由于感光像素存在工艺上的缺陷,或光信号进行转化为电信号的过程中出现错误而产生的图像坏点,进而提高高动态范围图像处理系统100形成的目标图像的像素值的准确性,从而使得本申请实施方式具有更好的成像效果。The photosensitive pixels on the pixel array of the image sensor may have defects in process, or errors in the process of converting optical signals into electrical signals, resulting in incorrect pixel information on the image and inaccurate pixel values in the image. These are defects The pixels that appear on the output image are image dead pixels. Image dead pixels may exist, so the image needs to be compensated for dead pixels. The dead pixel compensation process may include the following steps: (1) create a 3×3 pixel matrix of the same color photosensitive pixels with the pixel to be detected as the center pixel; (2) use the surrounding pixels of the central pixel As a reference point, it is judged whether the difference between the color value of the central pixel and the surrounding pixels is greater than the first threshold. If yes, the central pixel is a dead pixel; if not, the central pixel is Normal point; (3) Perform bilinear interpolation on the central pixel point judged as a bad point to obtain the corrected pixel value. Please refer to FIG. 23. The following describes the dead pixel compensation processing on the first full-color original image. In the first image in FIG. 23, R1 is the pixel to be detected. A 3×3 pixel matrix of pixels of the same color of the photosensitive pixels, and the second image in FIG. 23 is obtained. And taking the surrounding pixels of the central pixel R1 as a reference point, it is determined whether the difference between the color value of the central pixel R1 and the surrounding pixel is greater than the first threshold Q (Q is preset in the color processing module 2021 ). If it is, the central pixel R1 is a bad pixel, and if not, the central pixel R1 is a normal pixel. If R1 is a dead pixel, then bilinear interpolation is performed on R1 to obtain the corrected pixel value R1' (the case where R1 is a dead pixel is shown in the figure) to obtain the third image in FIG. 23. Please refer to FIG. 24 to describe the dead pixel compensation processing performed by the panchromatic processing module 2022 on the first panchromatic original image that has undergone lens shading correction processing. W1 in the first picture in Fig. 24 is the pixel to be detected. The panchromatic processing module 2022 uses W1 as the center pixel to establish a 3×3 pixel matrix of pixels of the same color as the photosensitive pixel of W1 to obtain the first pixel in Fig. 24 Two pictures. 
And taking the surrounding pixels of the central pixel W1 as a reference point, it is determined whether the difference between the color value of the central pixel W1 and the surrounding pixels is greater than the first threshold K (K is preset in the panchromatic processing module 2022) . If it is, the central pixel W1 is a bad pixel, and if not, the central pixel W1 is a normal pixel. If W1 is a dead pixel, perform bilinear interpolation on W1 to obtain the corrected pixel value W1' (shown in the figure is the case where W1 is a dead pixel) to obtain the third image in FIG. 24. The color processing module 2021 and the full color processing module 2022 of the embodiment of the present application can perform dead pixel compensation processing on the image, which is beneficial to the high dynamic range image processing system 100 to eliminate the presence of photosensitive pixels in the imaging process of the high dynamic range image processing system 100 Process defects, or image defects caused by errors in the process of converting optical signals into electrical signals, thereby improving the accuracy of the pixel values of the target image formed by the high dynamic range image processing system 100, thereby enabling the implementation of this application The method has a better imaging effect.
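The dead-pixel test and correction described above can be sketched as follows in Python/NumPy. The sketch assumes the input plane already contains only samples of one colour (so the 3×3 window is the same-colour pixel matrix of step (1)), and it replaces a detected dead pixel by the mean of its surrounding samples, used here as a simple stand-in for the bilinear interpolation of step (3); the threshold corresponds to the preset value Q (or K for the panchromatic channel).

import numpy as np

def dead_pixel_compensate(plane, threshold):
    # plane: 2-D array of same-colour samples; threshold: preset first threshold Q / K
    H, W = plane.shape
    out = plane.astype(float).copy()
    for y in range(1, H - 1):
        for x in range(1, W - 1):
            window = plane[y - 1:y + 2, x - 1:x + 2].astype(float)   # step (1): 3x3 same-colour matrix
            center = window[1, 1]
            neighbours = np.delete(window.reshape(-1), 4)            # the 8 surrounding reference points
            if np.all(np.abs(center - neighbours) > threshold):      # step (2): dead-pixel test
                out[y, x] = neighbours.mean()                        # step (3): corrected value R1' / W1'
    return out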

由于本申请实施方式的彩色原始图像(例如第一彩色原始图像和第二彩色原始图像)的每个像素格中均为单颜色像素,没有其他颜色的光学信息,因此需要对第一彩色原始图像和第二彩色原始图像进行去马赛克处理。另外,全色预处理模块2024也可以对全色原始图像数据进行去马赛克处理得到全色原始图像。下面以彩色处理模块2021对第一彩色原始图像(例如包括红色通道、绿色通道和蓝色通道)进行去马赛克处理为例进行说明,去马赛克处理的步骤包括如下步骤:(1) 将第一彩色原始图像分解成第一红色原始图像、第一绿色原始图像和第一蓝色原始图像,如图25所示,所得的第一红色原始图像、第一绿色原始图像和第一蓝色原始图像中部分像素格没有像素值。(2)采用双线性插值方法分别对第一红色原始图像、第一绿色原始图像和第一蓝色原始图像进行插值处理。如图26所示,彩色处理模块2021采用双线性插值方法对第一蓝色原始图像进行插值处理。图26的待插值像素B1根据B1周围的四个像素B2、B3、B4和B5进行双线性插值,得到B1的插值像素B1’。图26的第一张图中的所有空白处的待插值像素均遍历地采用该双线性插值的方式补全像素值,得到插值后的第一蓝色原始图像。如图27所示,彩色处理模块2021采用双线性插值方法对第一绿色原始图像进行插值处理。图27的待插值像素G1根据G1周围的四个像素G2、G3、G4和G5进行双线性插值,得到G1的插值像素G1’。图27的第一张图中的所有空白处的待插值像素均遍历地采用该双线性插值的方式补全像素值,得到插值后的第一绿色原始图像。与之类似地,彩色处理模块2021可以采用双线性插值方法对第一红色原始图像进行插值处理,得到插值后的第一红色原始图像。(3)将插值后的第一红色原始图像、插值后的第一绿色原始图像和插值后的第一蓝色原始图像重新合成为具有3个颜色通道的一张图像,如图28所示。色彩处理模块2021对彩色图像进行去马赛克处理,有利于本申请实施方式将具有单颜色通道的像素值的彩色图像补全为具有多个颜色通道的彩色图像,从而在单颜色的感光像素的硬件基础上保持图像色彩的完整呈现。Since each pixel grid of the color original image (such as the first color original image and the second color original image) of the embodiment of the present application is a single-color pixel, there is no optical information of other colors, so it is necessary to compare the first color original image Perform demosaicing with the second color original image. In addition, the full-color preprocessing module 2024 can also perform demosaic processing on the full-color original image data to obtain a full-color original image. In the following, the color processing module 2021 performs demosaic processing on the first color original image (for example, including the red channel, the green channel, and the blue channel) as an example. The demosaic processing steps include the following steps: (1) The first color The original image is decomposed into a first red original image, a first green original image, and a first blue original image, as shown in FIG. 25, among the obtained first red original image, first green original image, and first blue original image Some pixel grids have no pixel value. (2) The first red original image, the first green original image, and the first blue original image are respectively interpolated by using a bilinear interpolation method. As shown in FIG. 26, the color processing module 2021 uses a bilinear interpolation method to perform interpolation processing on the first blue original image. The pixel B1 to be interpolated in FIG. 26 performs bilinear interpolation according to the four pixels B2, B3, B4, and B5 around B1 to obtain the interpolated pixel B1' of B1. The pixels to be interpolated in all blank spaces in the first image in FIG. 26 are traversed to use the bilinear interpolation method to complete the pixel values to obtain the interpolated first blue original image. As shown in FIG. 27, the color processing module 2021 uses a bilinear interpolation method to perform interpolation processing on the first green original image. The pixel to be interpolated G1 in FIG. 27 performs bilinear interpolation according to the four pixels G2, G3, G4, and G5 around G1 to obtain the interpolated pixel G1' of G1. All the pixels to be interpolated in the blanks in the first image in FIG. 27 are traversed to use the bilinear interpolation method to complement the pixel values to obtain the interpolated first green original image. Similarly, the color processing module 2021 may use a bilinear interpolation method to perform interpolation processing on the first red original image to obtain the interpolated first red original image. 
(3) Re-synthesize the interpolated first red original image, the interpolated first green original image, and the interpolated first blue original image into an image with 3 color channels, as shown in FIG. 28. The color processing module 2021 performs demosaic processing on the color image, which is beneficial for the implementation of the present application to complete the color image with the pixel value of a single color channel into a color image with multiple color channels, so that the hardware of the single-color photosensitive pixel On the basis of maintaining the complete presentation of the image color.
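The per-channel demosaicing described above (split the colour original image into R, G and B planes with empty cells, fill each empty cell by bilinear interpolation from the surrounding same-channel samples such as B2-B5 around B1, then restack the three planes) can be sketched as follows. The neighbour offsets used here (the four horizontal/vertical neighbours) are an assumption; the actual positions of the surrounding same-colour samples depend on the repeating-unit layout shown in FIG. 26 and FIG. 27.

import numpy as np

def fill_channel(plane, valid, offsets=((-1, 0), (1, 0), (0, -1), (0, 1))):
    # plane: one colour plane with empty cells; valid: True where the plane already has a sample
    H, W = plane.shape
    out = plane.astype(float).copy()
    for y in range(H):
        for x in range(W):
            if valid[y, x]:
                continue
            samples = [plane[y + dy, x + dx]
                       for dy, dx in offsets
                       if 0 <= y + dy < H and 0 <= x + dx < W and valid[y + dy, x + dx]]
            if samples:                          # e.g. B1' interpolated from B2, B3, B4, B5
                out[y, x] = float(np.mean(samples))
    return out

def demosaic(planes, masks):
    # planes/masks: the decomposed R, G, B planes and their sample masks; returns a 3-channel image
    return np.stack([fill_channel(p, m) for p, m in zip(planes, masks)], axis=-1)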

色彩矫正处理具体可以为利用一个色彩校正矩阵对彩色原始图像(可以为经过去马赛克处理的第一彩色原始图像和第二彩色原始图像)的各像素的各颜色通道值进行一次校正,从而实现了对图像色彩的矫正。如下所示:The color correction processing can specifically be to use a color correction matrix to correct the color channel values of each pixel of the color original image (which can be the first color original image and the second color original image that have undergone demosaicing processing), thereby realizing Correction of image color. As follows:

[R'  G'  B']^T = CCM × [R  G  B]^T　（Figure PCTCN2021077093-appb-000008）

其中,色彩矫正矩阵(Color Correction Matrix,CCM)在色彩处理模块中预设。例如,色彩矫正矩阵具体可以为:Among them, the color correction matrix (CCM) is preset in the color processing module. For example, the color correction matrix can be specifically:

Figure PCTCN2021077093-appb-000009

色彩处理模块通过对图像中的所有像素遍历地通过以上色彩矫正矩阵进行色彩矫正处理,可以得到经过色彩矫正处理的图像。本申请实施方式中色彩矫正处理有利于消除图像或视频帧中因为有色光源等造成的颜色严重偏差、图像中人或物体颜色失真的问题,使得本申请实施方式的高动态范围图像处理系统100能够恢复图像原始色彩,提高了图像的视觉效果。The color processing module traverses all pixels in the image and performs color correction processing through the above color correction matrix to obtain an image that has undergone color correction processing. The color correction processing in the embodiment of the present application is beneficial to eliminate the serious color deviation caused by colored light sources in the image or video frame, and the color distortion of people or objects in the image, so that the high dynamic range image processing system 100 of the embodiment of the present application can Restore the original color of the image and improve the visual effect of the image.
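The application of the colour correction matrix to every pixel, as described above, amounts to one 3×3 matrix multiply per RGB triple. The sketch below shows this; the matrix values are purely illustrative placeholders (the actual example matrix is given only in Figure PCTCN2021077093-appb-000009), and the clipping to the 8-bit range is an added assumption.

import numpy as np

def apply_ccm(rgb, ccm):
    # rgb: H x W x 3 image; ccm: 3 x 3 colour correction matrix preset in the colour processing module
    h, w, _ = rgb.shape
    corrected = rgb.reshape(-1, 3).astype(float) @ ccm.T     # [R' G' B'] = CCM * [R G B] for every pixel
    return np.clip(corrected, 0, 255).reshape(h, w, 3)

# illustrative matrix only -- not the values from the embodiment
ccm_example = np.array([[ 1.50, -0.30, -0.20],
                        [-0.20,  1.45, -0.25],
                        [-0.05, -0.40,  1.45]])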

色调映射处理可以包括如下步骤:(1)把彩色原始图像(可以为经过色彩矫正处理的第一彩色原始图像和第二彩色原始图像)的灰度值归一化到区间[0,1]内,记归一化后的灰度值为Vin;(2)设Vout=Y(Vin),Vout和Vin之间的映射关系可以为如图29所示;(3)把Vout乘上255(当设定输出图像的灰度值为256阶时,乘上255,在其他设定时,可以为其他数值)后再四舍五入取整数,得到了色调映射处理后的图像。对于高动态范围的图像而言,其灰度值的二进制位数往往高于8位(普通的灰度图像灰度值的二进制位数一般是8位),而许多显示器的灰度只有8位,因此对高动态范围的图像的颜色进行变换,有利于高动态范围的图像具有更高的兼容性,能在常规的显示器上显示。另外,由于高动态范围图像一般灰度值分布得很不均匀,只有少数的像素点较亮,大部分像素都分布在灰度值较低的区间,本申请实施方式的高动态范围图像处理系统100对图像的色调映射处理并非线性的映射,而是在灰度值较低的区间的映射关系的斜率大于在灰度值较高的区间的映射关系的斜率,如图29所示,有利灰度值较低的区间内不同灰度值的像素点的区分度,而大部分像素都分布在灰度值较低的区间,因而使得本申请实施方式的高动态范围图像处理系统100具有更好的成像效果。The tone mapping process may include the following steps: (1) Normalize the gray value of the color original image (which may be the first color original image and the second color original image that have undergone color correction processing) into the interval [0,1] , Record the normalized gray value as Vin; (2) Set Vout=Y(Vin), the mapping relationship between Vout and Vin can be as shown in Figure 29; (3) Multiply Vout by 255 (when When the grayscale value of the output image is set to 256 levels, multiply it by 255. In other settings, it can be other values) and then round to the nearest integer to obtain the image after the tone mapping process. For images with high dynamic range, the binary digits of the gray value are often higher than 8 bits (the binary digits of the gray value of ordinary gray-scale images are generally 8 bits), and the gray scale of many displays is only 8 bits Therefore, the color of the high dynamic range image is changed, which is beneficial for the high dynamic range image to have higher compatibility and can be displayed on a conventional monitor. In addition, because the high dynamic range image generally has a very uneven distribution of gray values, only a few pixels are brighter, and most of the pixels are distributed in the interval with a lower gray value. The high dynamic range image processing system of the embodiment of the present application The tone mapping processing of 100 pairs of images is non-linear, but the slope of the mapping relationship in the interval with lower gray value is greater than that in the interval with higher gray value. As shown in Figure 29, it is beneficial to gray The degree of discrimination of pixels with different gray values in the interval with a lower degree value, and most of the pixels are distributed in the interval with a lower gray value, so that the high dynamic range image processing system 100 of the embodiment of the present application has better The imaging effect.
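As an illustration of the three tone-mapping steps above, the sketch below normalises the grey values, applies a monotone curve whose slope is larger in the low-grey-value range, and rescales to 256 levels. The square-root curve is only a stand-in for the mapping of FIG. 29, which is not reproduced here, and normalising by the image maximum is an assumption of this sketch.

import numpy as np

def tone_map(img, curve=np.sqrt):
    # curve: Vout = Y(Vin) on [0, 1]; steeper in the dark range, like the mapping in FIG. 29
    vin = img.astype(float) / max(float(img.max()), 1.0)    # step (1): normalise grey values into [0, 1]
    vout = curve(vin)                                       # step (2): apply the mapping Vout = Y(Vin)
    return np.rint(vout * 255).astype(np.uint8)             # step (3): scale to 256 grey levels and round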

为了图像具有更广泛的应用场景或者具有更高效率的传输格式,本申请实施方式的高动态范围图像处理系统100可以对彩色原始图像(可以为经过色调映射处理的第一彩色原始图像和第二彩色原始图像)进行色彩转换处理,将图像由一个色彩空间(例如RGB色彩空间)转换成另一个色彩空间(例如YUV色彩空间)从而具有更广泛的应用场景或者具有更高效率的传输格式。在具体的实施例中,色彩转换处理的步骤可以为对图像中的所有像素值的R、G和B通道像素值进行如下公式转换得到Y、U和V通道像素值:(1)Y=0.30R+0.59G+0.11B;(2)U=0.493(B-Y); (3)V=0.877(R-Y);从而将该图像由RGB色彩空间转换为YUV色彩空间。由于YUV色彩空间中的亮度信号Y和色度信号U和V是分离的,并且人眼对亮度的敏感超过色度,色彩转换处理将图像由RGB色彩空间转换为YUV色彩空间有利于本申请实施方式的高动态范围图像处理系统100后续的其他图像处理对图像进行色度信息的压缩,在不影响图像观看效果的同时,能减小图像的信息量,从而提高图像的传输效率。In order for the image to have a wider range of application scenarios or a more efficient transmission format, the high dynamic range image processing system 100 of the embodiment of the present application can perform processing of color original images (which may be the first color original image and the second color original image that have undergone tone mapping processing). Color original images) perform color conversion processing to convert the image from one color space (for example, RGB color space) to another color space (for example, YUV color space) to have a wider range of application scenarios or a more efficient transmission format. In a specific embodiment, the step of color conversion processing may be to convert the R, G, and B channel pixel values of all the pixel values in the image to obtain the Y, U, and V channel pixel values: (1) Y=0.30 R+0.59G+0.11B; (2) U=0.493 (B-Y); (3) V=0.877 (R-Y); thereby converting the image from the RGB color space to the YUV color space. Since the luminance signal Y and the chrominance signal U and V in the YUV color space are separated, and the human eye is more sensitive to luminance than chrominance, the color conversion process to convert the image from the RGB color space to the YUV color space is beneficial to the implementation of this application The subsequent image processing of the high dynamic range image processing system 100 compresses the chrominance information of the image, which can reduce the amount of information of the image while not affecting the viewing effect of the image, thereby improving the transmission efficiency of the image.
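The colour conversion formulas given above translate directly into code; the following sketch applies Y = 0.30R + 0.59G + 0.11B, U = 0.493(B − Y) and V = 0.877(R − Y) to every pixel. The function name and array layout are illustrative assumptions.

import numpy as np

def rgb_to_yuv(rgb):
    # rgb: H x W x 3 image in the RGB colour space
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y = 0.30 * r + 0.59 * g + 0.11 * b      # (1) luminance
    u = 0.493 * (b - y)                     # (2) chrominance U
    v = 0.877 * (r - y)                     # (3) chrominance V
    return np.stack([y, u, v], axis=-1)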

在某些实施方式中，高动态融合单元50可以将至少两次曝光对应的目标图像（可以包括第一目标图像和第二目标图像）进行亮度对齐处理，以得到亮度对齐的目标图像，再融合亮度对齐的目标图像及一张或多张目标图像以得到高动态的目标图像。In some embodiments, the high dynamic fusion unit 50 may perform brightness alignment processing on the target images corresponding to at least two exposures (which may include the first target image and the second target image) to obtain a brightness-aligned target image, and then fuse the brightness-aligned target image with one or more target images to obtain a high-dynamic target image.

在另一些实施方式中，彩色高动态融合单元30可以将至少两次曝光对应的彩色原始图像数据（例如第一彩色原始图像数据和第二彩色原始图像数据）进行亮度对齐处理，以得到亮度对齐的彩色原始图像数据，再融合亮度对齐的彩色原始图像数据及一张或多张彩色原始图像数据以得到高动态的彩色原始图像数据。全色高动态融合单元40可以将至少两次曝光对应的全色原始图像数据（例如第一全色原始图像数据和第二全色原始图像数据）进行亮度对齐处理，以得到亮度对齐的全色原始图像数据，再融合亮度对齐的全色原始图像数据及一张或多张全色原始图像数据以得到高动态的全色原始图像数据。In other embodiments, the color high dynamic fusion unit 30 may perform brightness alignment processing on the color original image data corresponding to at least two exposures (for example, the first color original image data and the second color original image data) to obtain brightness-aligned color original image data, and then fuse the brightness-aligned color original image data with one or more frames of color original image data to obtain high-dynamic color original image data. The panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic original image data corresponding to at least two exposures (for example, the first panchromatic original image data and the second panchromatic original image data) to obtain brightness-aligned panchromatic original image data, and then fuse the brightness-aligned panchromatic original image data with one or more frames of panchromatic original image data to obtain high-dynamic panchromatic original image data.

在又一些实施方式中，彩色高动态融合单元30可以将至少两次曝光对应的彩色原始图像（例如第一彩色原始图像和第二彩色原始图像）进行亮度对齐处理，以得到亮度对齐的彩色原始图像，再融合亮度对齐的彩色原始图像及一张或多张彩色原始图像以得到高动态的彩色原始图像。全色高动态融合单元40可以将至少两次曝光对应的全色原始图像（例如第一全色原始图像和第二全色原始图像）进行亮度对齐处理，以得到亮度对齐的全色原始图像，再融合亮度对齐的全色原始图像及一张或多张全色原始图像以得到高动态的全色原始图像。In still other embodiments, the color high dynamic fusion unit 30 may perform brightness alignment processing on the color original images corresponding to at least two exposures (for example, the first color original image and the second color original image) to obtain a brightness-aligned color original image, and then fuse the brightness-aligned color original image with one or more color original images to obtain a high-dynamic color original image. The panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic original images corresponding to at least two exposures (for example, the first panchromatic original image and the second panchromatic original image) to obtain a brightness-aligned panchromatic original image, and then fuse the brightness-aligned panchromatic original image with one or more panchromatic original images to obtain a high-dynamic panchromatic original image.

在再一些实施方式中，彩色高动态融合单元30可以将至少两次曝光对应的彩色中间图像（例如第一彩色中间图像和第二彩色中间图像）进行亮度对齐处理，以得到亮度对齐的彩色中间图像，再融合亮度对齐的彩色中间图像及一张或多张彩色中间图像以得到高动态的彩色中间图像。全色高动态融合单元40可以将至少两次曝光对应的全色中间图像进行亮度对齐处理，以得到亮度对齐的全色中间图像（例如第一全色中间图像和第二全色中间图像），再融合亮度对齐的全色中间图像及一张或多张全色中间图像以得到高动态的全色中间图像。In still further embodiments, the color high dynamic fusion unit 30 may perform brightness alignment processing on the color intermediate images corresponding to at least two exposures (for example, the first color intermediate image and the second color intermediate image) to obtain a brightness-aligned color intermediate image, and then fuse the brightness-aligned color intermediate image with one or more color intermediate images to obtain a high-dynamic color intermediate image. The panchromatic high dynamic fusion unit 40 may perform brightness alignment processing on the panchromatic intermediate images corresponding to at least two exposures (for example, the first panchromatic intermediate image and the second panchromatic intermediate image) to obtain a brightness-aligned panchromatic intermediate image, and then fuse the brightness-aligned panchromatic intermediate image with one or more panchromatic intermediate images to obtain a high-dynamic panchromatic intermediate image.

具体地,高动态融合单元50对图像进行的高动态范围处理可以包括亮度对齐处理。高动态融合单元50(可以包括彩色高动态融合单元30或全色高动态融合单元40)对图像进行亮度对齐处理包括如下步骤(该图像的张数可以等于图像传感器10控制像素阵列11进行的曝光次数,该图像可以为第一彩色原始图像数据和第二彩色原始图像数据,第一彩色原始图像数据、第二彩色原始图像数据和第三彩色原始图像数据,第一目标图像和第二目标图像,第一彩色原始图像和第二彩色原始图像,第一彩色中间图像和第二彩色中间图像,第一全色原始图像和第二全色原始图像,第一全色中间图像和第二全色中间图像,第一彩色原始图像、第二彩色原始图像和第三彩色原始图像,第一全色原始图像、第二全色原始图像和第三全色原始图像,第一彩色中间图像、第二彩色中间图像和第三彩色中间图像,第一全色中间图像、第二全色中间图像和第三全色中间图像中的其中一组,下文中以彩色高动态融合单元30对第一彩色中间图像(由长时间L曝光对应得到)、第二彩色中间图像(由中时间M曝光对应得到)和第三彩色中间图像(由短时间S曝光对应得到)进行亮度对齐处理为例进行说明):(1)识别第一彩色中间图像中像素值大于第一预设阈值的过曝图像像素;(2)对于每一个过曝图像像素,以该过曝图像像素为中心扩展预定区域;(3)在预定区域内寻找像素值小于第一预设阈值的中间图像像素;(4)利用中间图像像素及第二彩色中间图像、第三彩色中间曝光图像对过曝图像像素的像素值进行修正;(5)利用过曝图像像素的修正后的像素值更新第一彩色中间图像以得到亮度对齐后的第一彩色中间图像。具体地,请结合图30,假设图像像素P12(图30中第一彩色中间图像内标记有虚线圆圈的图像像素)的像素值V1大于第一预设阈值V0,即图像像素P12为过曝图像像素P12,则彩色高动态融合单元30或全色高动态融合单元40以过曝图像像素P12为中心扩展一个预定区域,例如,图30所示的3*3区域。当然,在其他实施例中,也可以是4*4区域、5*5区域、10*10区域等,在此不作限制。随后,彩色高动态融合单元30或全色高动态融合单元40在3*3的预定区域内寻找像素值小于第一预设阈值V0的中间图像像素,例如图30中的图像像素P21(图30中第一彩色中间图像内标记有点画线圆圈的图像像素)的像素值V2小于第一预设阈值V0,则图像像素P21即为中间图像像素P21。随后,彩色高动态融合单元30在第二彩色中间图像中寻找与过曝图像像素P12及中间图像像素P21分别对应的图像像素,即图像像素P1'2'(图30中第二彩色中间图像内标记有虚线圆圈的图像像素) 和图像像素P2'1'(图30中第二彩色中间图像内标记有点画线圆圈的图像像素),其中,图像像素P1'2'与过曝图像像素P12对应,图像像素P2'1'与中间图像像素P21对应,图像像素P1'2'的像素值为V3,图像像素P2'1'的像素值为V4。随后,彩色高动态融合单元30根据V1'/V3=V2/V4计算出V1'。当V1'小于第一预设阈值V0,彩色高动态融合单元30利用V1'的值来替换掉V1的值,当V1'大于第一预设阈值V0时,彩色高动态融合单元30在第三彩色中间图像中寻找与过曝图像像素P12及中间图像像素P21分别对应的图像像素,即图像像素P1”2”和P2”1”,图像像素P1”'2”的像素值为V5,图像像素P2”1”的像素值为V6,同理最终根据V1”/V5=V2/V6计算出V1”,并利用V1”的值来替换掉V1的值。由此,即可计算出过曝图像像素P12的实际像素值。彩色高动态融合单元30或全色高动态融合单元40对第一彩色中间图像中的每一个过曝图像像素均执行这一亮度对齐的处理过程,即可得到亮度对齐后的第一彩色中间图像。由于亮度对齐后的第一彩色中间图像中的过曝图像像素的像素值经过了修正,亮度对齐后的第一彩色中间图像中的每个图像像素的像素值均较为准确。Specifically, the high dynamic range processing performed on the image by the high dynamic fusion unit 50 may include brightness alignment processing. The high dynamic fusion unit 50 (which may include the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40) performs brightness alignment processing on the image and includes the following steps (the number of images can be equal to the exposure of the image sensor 10 to control the pixel array 11 The number of times, the image can be the first color original image data and the second color original image data, the first color original image data, the second color original image data and the third color original image data, the first target image and the second target image , The first color original image and the second color original image, the first color intermediate image and the second color intermediate image, the first panchromatic original image and the second panchromatic original image, the first panchromatic intermediate image and the second panchromatic image Intermediate image, the first color original image, the second color original image and the third color original image, the first panchromatic original image, the second panchromatic original image and the third panchromatic original image, the first color intermediate image, the second The color intermediate image and the third color intermediate image, one of the first full color intermediate image, the second pan The image (corresponding to the long-term L exposure), the second color intermediate image (corresponding to the intermediate time M exposure), and the third color intermediate image (corresponding to the short-term S exposure) are subjected to brightness alignment processing as an example for description): (1) Identify the overexposed image pixels in the first color intermediate image whose pixel value is greater than the first preset 
threshold; (2) For each overexposed image pixel, expand a predetermined area with the overexposed image pixel as the center; (3) Find intermediate image pixels with pixel values less than the first preset threshold in a predetermined area; (4) Use intermediate image pixels, second color intermediate image, and third color intermediate exposure image to correct the pixel value of overexposed image pixels; ( 5) The first color intermediate image is updated by using the corrected pixel values of the pixels of the overexposed image to obtain the first color intermediate image with aligned brightness. Specifically, please refer to Figure 30, assuming that the pixel value V1 of the image pixel P12 (the image pixel marked with a dashed circle in the first color intermediate image in Figure 30) is greater than the first preset threshold V0, that is, the image pixel P12 is an overexposed image For the pixel P12, the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 extends a predetermined area with the overexposed image pixel P12 as the center, for example, the 3*3 area shown in FIG. 30. Of course, in other embodiments, it may also be a 4*4 area, a 5*5 area, a 10*10 area, etc., which is not limited here. Subsequently, the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 searches for intermediate image pixels whose pixel value is less than the first preset threshold V0 in a predetermined area of 3*3, such as the image pixel P21 in FIG. 30 (FIG. 30). If the pixel value V2 of the image pixel marked with a dotted circle in the first color intermediate image is less than the first preset threshold V0, the image pixel P21 is the intermediate image pixel P21. Subsequently, the color high dynamic fusion unit 30 searches for the image pixels corresponding to the overexposed image pixel P12 and the intermediate image pixel P21 respectively in the second color intermediate image, that is, the image pixel P1'2' (in the second color intermediate image in FIG. 30) The image pixels marked with a dashed circle) and the image pixel P2'1' (the image pixels marked with a dotted circle in the second color intermediate image in Figure 30), where the image pixel P1'2' corresponds to the overexposed image pixel P12 , The image pixel P2'1' corresponds to the intermediate image pixel P21, the pixel value of the image pixel P1'2' is V3, and the pixel value of the image pixel P2'1' is V4. Subsequently, the color high dynamic fusion unit 30 calculates V1' according to V1'/V3=V2/V4. When V1' is less than the first preset threshold V0, the color high dynamic fusion unit 30 uses the value of V1' to replace the value of V1. When V1' is greater than the first preset threshold V0, the color high dynamic fusion unit 30 is in the third In the color intermediate image, look for the image pixels corresponding to the overexposed image pixel P12 and the intermediate image pixel P21 respectively, that is, the image pixels P1"2" and P2"1", the pixel value of the image pixel P1"'2" is V5, and the image pixel The pixel value of P2"1" is V6, and in the same way, V1" is finally calculated according to V1"/V5=V2/V6, and the value of V1" is used to replace the value of V1. From this, the overexposed image can be calculated The actual pixel value of pixel P12. 
The color high dynamic fusion unit 30 or panchromatic high dynamic fusion unit 40 performs this brightness alignment process on each overexposed image pixel in the first color intermediate image to obtain the brightness alignment Since the pixel value of the overexposed image pixel in the first color intermediate image after brightness alignment has been corrected, the pixel value of each image pixel in the first color intermediate image after brightness alignment is equal More accurate.
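The brightness-alignment procedure above (find over-exposed pixels in the long-exposure image, pick a non-over-exposed reference pixel inside the predetermined region, rescale the over-exposed value through the ratio V1'/V3 = V2/V4, and fall back to the short-exposure image when V1' is still above the threshold) can be sketched as follows. Taking the first suitable reference pixel, the 3×3 region size, and the division guards are assumptions of this sketch.

import numpy as np

def brightness_align(long_img, mid_img, short_img, v0, radius=1):
    # long_img / mid_img / short_img: the L, M and S exposures; v0: first preset threshold; radius=1 -> 3x3 region
    H, W = long_img.shape
    out = long_img.astype(float).copy()
    for y in range(H):
        for x in range(W):
            v1 = float(long_img[y, x])
            if v1 <= v0:
                continue                                    # not an over-exposed image pixel
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            window = long_img[y0:y1, x0:x1]
            candidates = np.argwhere(window < v0)           # intermediate image pixels inside the region
            if candidates.size == 0:
                continue                                    # no usable reference pixel found
            ry, rx = candidates[0][0] + y0, candidates[0][1] + x0
            v2 = float(long_img[ry, rx])
            v3, v4 = float(mid_img[y, x]), float(mid_img[ry, rx])
            if v4 > 0:
                v1_new = v3 * v2 / v4                       # from V1'/V3 = V2/V4
                if v1_new < v0:
                    out[y, x] = v1_new
                    continue
            v5, v6 = float(short_img[y, x]), float(short_img[ry, rx])
            if v6 > 0:
                out[y, x] = v5 * v2 / v6                    # from V1''/V5 = V2/V6
    return out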

高动态范围处理过程中,在获取到亮度对齐后的第一彩色中间图像(或者经过了上述亮度对齐处理的其他图像)后,彩色高动态融合单元30或全色高动态融合单元40可以对亮度对齐后的图像和同类图像进行融合以得到高动态的图像。具体地,下面以彩色高动态融合单元30或全色高动态融合单元40对亮度对齐后的第一彩色中间图像(由长时间L曝光对应得到)及第二彩色中间图像(由中时间M曝光对应得到)和第三彩色中间图像(由短时间S曝光对应得到)进行融合得到高动态的彩色中间图像进行说明。彩色高动态融合单元30或全色高动态融合单元40首先对亮度对齐后的第一彩色中间图像进行运动检测,以识别亮度对齐后的第一彩色中间图像中是否存在运动模糊区域。若亮度对齐后的第一彩色中间图像中不存在运动模糊区域,则直接融合亮度对齐后的第一彩色中间图像及第二彩色中间图像和第三彩色中间图像以得到高动态的彩色中间图像。若亮度对齐后的第一彩色中间图像中存在运动模糊区域,则将第一彩色中间图像中的运动模糊区域剔除,只融合第二彩色中间图像和第三彩色中间图像的所有区域以及亮度对齐后的第一彩色中间图像中除运动模糊区域以外的区域以得到高动态的彩色中间图像。其中,该高动态的彩色中间图像的分辨率可以等于像素阵列11的分辨率。具体地,在融合亮度对齐后的第一彩色中间图像及第二彩色中间图像和第三彩色中间图像时,若亮度对齐后的第一彩色中间图像中不存在运动模糊区域,则此时两张中间图像的融合遵循以下原则:(1)亮度对齐后的第一彩色中间图像中,过曝区域的图像像素的像素值直接替换为第二彩色中间图像中对应于该过曝区域的图像像素的像素值;若第二彩色中间图像中对应于该过曝区域的图像像素的像素值也过曝,则将亮度对齐后的第一彩色中间图像中过曝区域的图像像素的像素值直接替换为第三彩色中间图像中对应于该过曝区域的图像像素的像素值;(2)亮度对齐后的第一彩色中间图像中,欠曝区域的图像像素的像素值为:长曝光像素值除以系数K1,系数K1为K2和K3的平均数;K2为长曝光像素值和中曝光像素值的比例,K3为长曝光像素值和短曝光像素值的比例;(3)亮度对齐后的第一彩色中间图像中,未欠曝也未过曝区域的图像像素的像素值为:长曝光像素值除以系数K1。若亮度对齐后的第一彩色中间图像中存在运动模糊区域,则此时三张中间图像的融合除了遵循上述三个原则外,还需要遵循第(4)个原则:亮度对齐后的第一彩色中间图像中,运动模糊区域的图像像素的像素值直接替换为第二彩色中间图像中对应于该运动模糊区域的图像像素的像素值和第三彩色中间图像中对应于该运动模糊区域的图像像素的像素值的平均值。本申请实施方式的高动态范围图像处理系统100通过彩色高动态融合单元30或全色高动态融合单元40对图像进行高动态范围处理,先对图像进行亮度对齐处理,再对亮度对齐后的图像与其他图像进行融合,得到高动态的图像,使得高动态范围图像处理系统100形成的目标图像具有更大的动态范围,进而具有更好的成像效果。In the high dynamic range processing process, after the first color intermediate image (or other images that have undergone the above brightness alignment processing) after the brightness alignment is obtained, the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 can adjust the brightness The aligned images are merged with similar images to obtain a highly dynamic image. Specifically, the following uses the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 to adjust the brightness of the first color intermediate image (obtained by the long-time L exposure) and the second color intermediate image (exposed by the middle time M). Correspondingly obtained) and the third color intermediate image (obtained by the short-time S exposure) are fused to obtain a highly dynamic color intermediate image for description. The color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40 first performs motion detection on the first color intermediate image after brightness alignment to identify whether there is a motion blur area in the first color intermediate image after brightness alignment. If there is no motion blur area in the first color intermediate image after brightness alignment, the first color intermediate image, the second color intermediate image, and the third color intermediate image after brightness alignment are directly merged to obtain a highly dynamic color intermediate image. If there is a motion blur area in the first color intermediate image after brightness alignment, remove the motion blur area in the first color intermediate image, and only merge all areas of the second color intermediate image and the third color intermediate image, and after the brightness is aligned The first color intermediate image except for the motion blur area in the first color intermediate image to obtain a highly dynamic color intermediate image. Wherein, the resolution of the high dynamic color intermediate image may be equal to the resolution of the pixel array 11. 
Specifically, when fusing the first color intermediate image, the second color intermediate image, and the third color intermediate image after brightness alignment, if there is no motion blur area in the first color intermediate image after brightness alignment, then two The fusion of the intermediate image follows the following principles: (1) In the first color intermediate image after brightness alignment, the pixel value of the image pixel in the overexposed area is directly replaced with the pixel value of the image pixel in the second color intermediate image corresponding to the overexposed area. Pixel value; if the pixel value of the image pixel corresponding to the overexposed area in the second color intermediate image is also overexposed, then the pixel value of the image pixel in the overexposed area in the first color intermediate image after brightness alignment is directly replaced with The pixel value of the image pixel in the third color intermediate image corresponding to the overexposed area; (2) In the first color intermediate image after brightness alignment, the pixel value of the image pixel in the underexposed area is: the long exposure pixel value divided by Coefficient K1, coefficient K1 is the average of K2 and K3; K2 is the ratio of long-exposure pixel value to medium-exposure pixel value, K3 is the ratio of long-exposure pixel value and short-exposure pixel value; (3) the first after brightness alignment In the color intermediate image, the pixel value of the image pixel in the area neither under-exposed nor over-exposed is: the long-exposure pixel value divided by the coefficient K1. If there is a motion blur area in the first color intermediate image after brightness alignment, the fusion of the three intermediate images at this time not only follows the above three principles, but also needs to follow the (4) principle: the first color after brightness alignment In the intermediate image, the pixel value of the image pixel in the motion blur area is directly replaced with the pixel value of the image pixel corresponding to the motion blur area in the second color intermediate image and the image pixel corresponding to the motion blur area in the third color intermediate image The average of the pixel values. The high dynamic range image processing system 100 of the embodiment of the present application performs high dynamic range processing on the image through the color high dynamic fusion unit 30 or the panchromatic high dynamic fusion unit 40, first performs brightness alignment processing on the image, and then performs brightness alignment on the image after the brightness alignment. It is fused with other images to obtain a highly dynamic image, so that the target image formed by the high dynamic range image processing system 100 has a larger dynamic range, and thus has a better imaging effect.
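The fusion rules above can be sketched as follows. The sketch interprets K2 and K3 as per-pixel ratios between the brightness-aligned long exposure and the medium and short exposures, takes the over-exposure threshold as given, and leaves motion detection outside its scope (a precomputed motion-blur mask is simply passed in); all of these are assumptions made only for illustration.

import numpy as np

def fuse_hdr(long_aligned, mid, short, over_thr, motion_mask=None):
    # long_aligned: brightness-aligned long exposure; over_thr: over-exposure threshold
    L = long_aligned.astype(float)
    M = mid.astype(float)
    S = short.astype(float)
    k2 = np.divide(L, M, out=np.ones_like(L), where=M > 0)      # K2: long / medium pixel ratio
    k3 = np.divide(L, S, out=np.ones_like(L), where=S > 0)      # K3: long / short pixel ratio
    k1 = (k2 + k3) / 2.0                                        # K1: average of K2 and K3
    out = L / np.maximum(k1, 1e-6)                              # rules (2)/(3): long value divided by K1
    over = L > over_thr
    out[over] = M[over]                                         # rule (1): over-exposed -> medium exposure
    still_over = over & (M > over_thr)
    out[still_over] = S[still_over]                             #           medium also over-exposed -> short
    if motion_mask is not None:
        out[motion_mask] = (M[motion_mask] + S[motion_mask]) / 2.0   # rule (4): motion-blur region
    return out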

融合模块204可以对彩色中间图像和全色中间图像进行融合算法处理。融合算法处理的具体过程可以为如下,以彩色中间图像具有R(即红色)、G(即绿色)和B(即蓝色)三个颜色通道的彩色信息、全色中间图像具有全色信息为例进行说明,全色信息可以为亮度信息,融合算法处理的具体过程可以包括:(1)根据彩色中间图像,计算各个像素对应的辅助值Y,其中,Y=(R*w1+B*w2+G*w3)/(w1+w2+w3),R为像素对应的R通道的值,G为像素对应的G通道的值,B为像素对应的B通道的值,w1、w2和w3为权重值;(2)计算彩色中间图像中各个通道值与辅助值Y的比例,得到各个像素对应的参考通道值K1、K2和K3,其中,K1=R/Y,K2=G/Y,K3=B/Y;(3)对所述参考通道值K1、K2和K3进行色彩降噪处理;(4)将对应像素上的全色信息Y’与色彩降噪后的参考通道值K1-K3进行融合,生成融合后的RGB三通道值R’、G’和B’,得到目标图像;其中,R’=K1*Y’;G’=K2*Y’;B’=K3*Y’。本申请实施方式的融合模块204对彩色图像和全色图像进行融合算法处理,使得最终形成的目标图像的来源同时具有彩色 信息和亮度信息,由于人眼对亮度的敏感超过色度,因而对于人眼视觉特性而言,本申请实施方式的高动态范围图像处理系统100具有更好的成像效果,最终得到的目标图像也更贴近人眼视觉。The fusion module 204 can perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image. The specific process of the fusion algorithm processing can be as follows. The color intermediate image has the color information of the three color channels of R (i.e. red), G (i.e. green) and B (i.e. blue), and the full-color intermediate image has full-color information as As an example, the panchromatic information can be brightness information, and the specific process of fusion algorithm processing can include: (1) Calculate the auxiliary value Y corresponding to each pixel according to the color intermediate image, where Y=(R*w1+B*w2 +G*w3)/(w1+w2+w3), R is the value of the R channel corresponding to the pixel, G is the value of the G channel corresponding to the pixel, B is the value of the B channel corresponding to the pixel, w1, w2 and w3 are Weight value; (2) Calculate the ratio of each channel value in the color intermediate image to the auxiliary value Y to obtain the reference channel values K1, K2, and K3 corresponding to each pixel, where K1=R/Y, K2=G/Y, K3 =B/Y; (3) Perform color noise reduction processing on the reference channel values K1, K2, and K3; (4) Combine the panchromatic information Y'on the corresponding pixel with the reference channel values K1-K3 after color noise reduction The fusion is performed to generate the fused RGB three-channel values R', G'and B'to obtain the target image; wherein, R'=K1*Y'; G'=K2*Y'; B'=K3*Y'. The fusion module 204 of the embodiment of the present application performs fusion algorithm processing on the color image and the panchromatic image, so that the source of the final target image has both color information and brightness information. Since the human eye is more sensitive to brightness than chroma, it is In terms of eye vision characteristics, the high dynamic range image processing system 100 of the embodiment of the present application has a better imaging effect, and the final target image obtained is closer to human vision.
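The fusion algorithm of the fusion module 204 described above maps directly onto the formulas Y = (R·w1 + B·w2 + G·w3)/(w1 + w2 + w3), K1 = R/Y, K2 = G/Y, K3 = B/Y and R' = K1·Y', G' = K2·Y', B' = K3·Y'. The sketch below follows those formulas; the chroma denoising of K1-K3 in step (3) is only marked by a comment, and the equal weights are an illustrative choice.

import numpy as np

def fuse_color_panchromatic(rgb, y_pan, w=(1.0, 1.0, 1.0)):
    # rgb: colour intermediate image (H x W x 3); y_pan: panchromatic (luminance) intermediate image (H x W)
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    w1, w2, w3 = w
    y = (r * w1 + b * w2 + g * w3) / (w1 + w2 + w3)      # step (1): auxiliary value Y per pixel
    y = np.maximum(y, 1e-6)
    k1, k2, k3 = r / y, g / y, b / y                     # step (2): reference channel values
    # step (3): a colour noise reduction of K1, K2, K3 (e.g. a small low-pass filter) would go here
    yp = y_pan.astype(float)
    return np.stack([k1 * yp, k2 * yp, k3 * yp], -1)     # step (4): R' = K1*Y', G' = K2*Y', B' = K3*Y'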

高动态融合单元50集成在图像传感器10中；或高动态融合单元50集成在图像处理器20中。具体地，请参阅图18，在某些实施方式中，彩色高动态融合单元30和全色高动态融合单元40可以集成在图像传感器10中；请参阅图1、图19和图20，在另外的实施方式中，彩色高动态融合单元30和全色高动态融合单元40可以集成在图像处理器20中。彩色高动态融合单元30和全色高动态融合单元40集成在图像传感器10或图像处理器20中使得本申请实施方式的高动态范围图像处理系统100在无需提高图像传感器10的硬件性能的情况下实现高动态范围处理，同时彩色高动态融合单元30和全色高动态融合单元40将高动态范围处理的功能独立地封装起来，有利于降低产品设计过程中的设计难度，提高设计更改的便捷性。The high dynamic fusion unit 50 is integrated in the image sensor 10; or the high dynamic fusion unit 50 is integrated in the image processor 20. Specifically, referring to FIG. 18, in some embodiments, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 may be integrated in the image sensor 10; referring to FIG. 1, FIG. 19, and FIG. 20, in other embodiments, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 may be integrated in the image processor 20. Integrating the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 in the image sensor 10 or the image processor 20 enables the high dynamic range image processing system 100 of the embodiments of the present application to realize high dynamic range processing without improving the hardware performance of the image sensor 10. At the same time, the color high dynamic fusion unit 30 and the panchromatic high dynamic fusion unit 40 encapsulate the high dynamic range processing function independently, which helps to reduce the design difficulty in the product design process and makes design changes more convenient.

在某些实施方式中,图像预处理可以包括像素相加处理和去马赛克处理。在某些实施方式中,请参阅图1,彩色预处理模块2023可以对彩色原始图像数据进行像素相加处理,得到彩色原始图像,全色预处理模块2024可以对全色原始图像数据进行去马赛克处理,得到全色原始图像。在另外的实施方式中,请参阅图1,彩色预处理模块2023可以对彩色原始图像数据进行像素相加处理,得到彩色原始图像;全色预处理模块2024可以对全色原始图像数据进行去马赛克处理,得到全色原始图像。其中,去马赛克处理与前述全色预处理模块2024对全色原始图像数据进行的去马赛克处理的具体实施过程相同,在此不再展开说明。本申请实施方式的高动态范围图像处理系统100对部分像素格中的彩色信息缺失并且具有彩色信息的像素格仅具有单颜色通道信息的彩色原始图像数据进行像素相加处理,能以一种简单的、计算量较小的方式,得到完整通道的彩色信息,进而得到彩色原始图像,以便后续对图像继续进行其他图像处理,提高成像质量。In some embodiments, image preprocessing may include pixel addition processing and demosaicing processing. In some embodiments, referring to FIG. 1, the color preprocessing module 2023 can perform pixel addition processing on the color original image data to obtain the color original image, and the panchromatic preprocessing module 2024 can demosaicing the panchromatic original image data. Process to get the full-color original image. In another embodiment, referring to FIG. 1, the color preprocessing module 2023 can perform pixel addition processing on the color original image data to obtain the color original image; the full color preprocessing module 2024 can demosaicing the full color original image data Process to get the full-color original image. Wherein, the demosaic processing is the same as the specific implementation process of the demosaic processing performed by the full-color pre-processing module 2024 on the full-color original image data, and will not be further described here. The high dynamic range image processing system 100 of the embodiment of the present application performs pixel addition processing on the color information in some pixel grids that are missing and the pixel grids with color information only have color original image data with single color channel information. In a less computationally intensive way, the color information of the complete channel is obtained, and then the original color image is obtained, so that other image processing can be continued on the image to improve the imaging quality.

在另一些实施方式中,图像预处理可以包括像素求平均处理和去马赛克处理。在某些实施方式中,请参阅图1,彩色预处理模块2023可以对彩色原始图像数据进行像素求平均处理,得到彩色原始图像,全色预处理模块2024可以对全色原始图像数据进行去马赛克处理,得到全色原始图像。在另外的实施方式中,请参阅图1,彩色预处理模块2023可以对彩色原始图像数据进行像素求平均处理,得到彩色原始图像;全色预处理模块2024可以对全色原始图像数据进行去马赛克处理,得到全色原始图像。其中,去马赛克处理与前述全色预处理模块2024对全色原始图像数据进行地去马赛克处理的具体实施过程相同,在此不再展开说明。本申请实施方式的高动态范围图像处理系统100对部分像素格中的彩色信息缺失并且具有彩色信息的像素格仅具有单颜色通道信息的彩色原始图像数据进行像素求平均处理,能以一种简单的、计算量较小的方式,得到完整通道的彩色信息,进而得到彩色原始图像,以便后续对图像继续进行其他图像处理,提高成像质量。In other embodiments, the image preprocessing may include pixel averaging processing and demosaicing processing. In some embodiments, referring to FIG. 1, the color preprocessing module 2023 can perform pixel averaging processing on the color original image data to obtain the color original image, and the panchromatic preprocessing module 2024 can demosaicing the panchromatic original image data. Process to get the full-color original image. In another embodiment, referring to FIG. 1, the color preprocessing module 2023 can perform pixel averaging processing on the color original image data to obtain the color original image; the full color preprocessing module 2024 can demosaicing the full color original image data Process to get the full-color original image. Among them, the demosaic processing is the same as the specific implementation process of the demosaic processing performed by the full-color preprocessing module 2024 on the full-color original image data, and will not be further described here. The high dynamic range image processing system 100 of the embodiment of the present application performs pixel averaging processing on the color original image data in which the color information in some pixel grids are missing and the pixel grids with color information only have single-color channel information. In a less computationally intensive way, the color information of the complete channel is obtained, and then the original color image is obtained, so that other image processing can be continued on the image to improve the imaging quality.

The pixel addition processing of the color original image data is described below as an example. The specific steps of the pixel addition processing are as follows: (1) Decompose the color original image data into first color original image data (the original image data generated by the first color photosensitive pixels A described above), second color original image data (the original image data generated by the second color photosensitive pixels B described above), and third color original image data (the original image data generated by the third color photosensitive pixels C described above). (2) Add up the pixel values generated by the multiple first color photosensitive pixels A in each subunit of the first color original image data, merge the pixel cells of each subunit into a single pixel cell, and fill the resulting sum into that pixel cell to obtain first color intermediate image data. (3) Interpolate the first color intermediate image data by bilinear interpolation to obtain a first color original image whose resolution is one quarter of that of the color original image data; the specific operation of bilinear interpolation is described in detail below. (4) After steps (2) and (3) have been performed on the first color original image data, the second color original image data, and the third color original image data, combine the resulting first, second, and third color original images, each having a single color channel, into a color original image having three color channels. The color preprocessing module 2023 may perform the pixel addition processing of the above steps on all of the color original image data corresponding to the at least two exposures, thereby completing the pixel addition processing of all of the color original image data and obtaining at least two color original images. Specifically, with reference to FIG. 31, the pixel addition processing performed by the color preprocessing module 2023 on the first red original image data in the first color original image data is described below as an example. As shown in FIG. 31, the color preprocessing module 2023 first decomposes the color original image data (which may be the first color original image data, the second color original image data, the third color original image data, or the like) into red original image data, green original image data, and blue original image data. As shown in FIG. 16, the color preprocessing module 2023 then adds the pixel values (for example L1 and L2) generated by the multiple red photosensitive pixels R in a subunit of the red original image data to obtain the sum L1' = L1 + L2, merges the pixel cells of the subunit into a single pixel cell, and fills the sum into that pixel cell to obtain red intermediate image data. The color preprocessing module 2023 then interpolates the red intermediate image data by bilinear interpolation to obtain a red original image whose resolution is one quarter of that of the red original image data. In the same way, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and combine the resulting single-channel red, green, and blue original images into a color original image having three color channels. The color preprocessing module 2023 may perform the pixel addition processing of the above steps on both the first color original image data and the second color original image data (or on the first, second, and third color original image data), thereby completing the pixel addition processing of the color original image data and obtaining a first color original image and a second color original image (or first, second, and third color original images).
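To make steps (2) to (4) concrete, the following Python/NumPy sketch shows how the pixel values of each subunit of a single-color plane can be summed into one pixel cell and how the three resulting single-channel planes can be stacked into a three-channel color original image. It is only a minimal sketch under assumptions not stated above: it presumes square 2×2 subunits, treats each color plane as an already separated dense array, and leaves out the bilinear interpolation of step (3); the names `bin_color_plane`, `merge_planes`, and all variables are illustrative and do not come from the application.

```python
import numpy as np

def bin_color_plane(plane: np.ndarray, block: int = 2, mode: str = "sum") -> np.ndarray:
    """Merge each block x block subunit of a single-color plane into one pixel cell.

    mode="sum" corresponds to the pixel addition processing described above;
    mode="mean" corresponds to the pixel averaging variant.
    """
    h, w = plane.shape
    assert h % block == 0 and w % block == 0, "plane must tile into whole subunits"
    tiles = plane.reshape(h // block, block, w // block, block)
    if mode == "sum":
        return tiles.sum(axis=(1, 3))
    return tiles.mean(axis=(1, 3))

def merge_planes(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Stack three single-channel original images into a 3-channel color original image."""
    return np.stack([red, green, blue], axis=-1)

# Toy example: a 4x4 red plane with 2x2 subunits; each subunit collapses to one cell.
red_data = np.arange(16, dtype=np.float32).reshape(4, 4)
red_binned = bin_color_plane(red_data, mode="sum")   # shape (2, 2), a quarter of the original area
print(red_binned)
```

The same helper covers the pixel averaging variant by switching `mode` to `"mean"`, since the two variants differ only in the reduction applied to each subunit.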

The pixel averaging processing of the color original image data is described below as an example. The specific steps of the pixel averaging processing are as follows: (1) Decompose the color original image data into first color original image data (the original image data generated by the first color photosensitive pixels A described above), second color original image data (the original image data generated by the second color photosensitive pixels B described above), and third color original image data (the original image data generated by the third color photosensitive pixels C described above). (2) Average the pixel values generated by the multiple first color photosensitive pixels A in each subunit of the first color original image data, merge the pixel cells of the subunit into a single pixel cell, and fill the resulting average into that pixel cell to obtain first color intermediate image data. (3) Interpolate the first color intermediate image data by bilinear interpolation to obtain a first color original image whose resolution is one quarter of that of the color original image data; the specific operation of bilinear interpolation is described in detail below. (4) After steps (2) and (3) have been performed on the first color original image data, the second color original image data, and the third color original image data, combine the resulting first, second, and third color original images, each having a single color channel, into a color original image having three color channels. The color preprocessing module 2023 may perform the pixel averaging processing of the above steps on all of the color original image data corresponding to the at least two exposures, thereby completing the pixel averaging processing of all of the color original image data and obtaining at least two color original images. Specifically, with reference to FIG. 32, the pixel averaging processing performed by the color preprocessing module 2023 on the first red original image data in the first color original image data is described below as an example. As shown in FIG. 32, the color preprocessing module 2023 first decomposes the color original image data (which may be the first color original image data, the second color original image data, the third color original image data, or the like) into red original image data, green original image data, and blue original image data. As shown in FIG. 16, the color preprocessing module 2023 then averages the pixel values (for example L1 and L2) generated by the multiple red photosensitive pixels R in a subunit of the red original image data to obtain the average L1' = (L1 + L2)/2, merges the pixel cells of the subunit into a single pixel cell, and fills the average into that pixel cell to obtain red intermediate image data. The color preprocessing module 2023 then interpolates the red intermediate image data by bilinear interpolation to obtain a red original image whose resolution is one quarter of that of the red original image data. In the same way, the color preprocessing module 2023 can obtain a red original image, a green original image, and a blue original image, and combine the resulting single-channel red, green, and blue original images into a color original image having three color channels. The color preprocessing module 2023 may perform the pixel averaging processing of the above steps on both the first color original image data and the second color original image data (or on the first, second, and third color original image data), thereby completing the pixel averaging processing of the color original image data and obtaining a first color original image and a second color original image (or first, second, and third color original images).
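For step (3), which the addition and averaging variants share, the following sketch gives a generic bilinear interpolation routine. Since the text does not specify exactly how the binned grid is resampled, the mapping used here (output size chosen freely, corner-aligned coordinates) and the name `bilinear_resize` are assumptions made for illustration only.

```python
import numpy as np

def bilinear_resize(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Resample a single-channel image to (out_h, out_w) by bilinear interpolation."""
    in_h, in_w = img.shape
    # Coordinates of each output pixel mapped back into the input grid (corner-aligned).
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]          # vertical interpolation weights
    wx = (xs - x0)[None, :]          # horizontal interpolation weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bottom = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bottom * wy

# Example: a 2x2 binned colour plane interpolated onto a smoother 4x4 grid.
coarse = np.array([[10.0, 14.0],
                   [26.0, 30.0]])
print(bilinear_resize(coarse, 4, 4))
```

In the processing described above, such a routine would be applied to the binned single-color intermediate data before the three channels are combined into the color original image.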

图像处理器20还可以包括接收单元201和内存单元203。接收单元201用于接收彩色原始图像数据和全色原始图像数据;内存单元203用于暂存彩色原始图像数据、全色原始图像数据、彩色原始图像、全色原始图像、彩色中间图像、全色中间图像和目标图像中的一个或多个。图像处理器20设置接收单元201和内存单元203将图像的接收、处理和存储分开,有利于高动态范围图像处理系统100的各模块具有更加独立的封装,从而使得高动态范围图像处理系统100具有更高的执行效率和更好的防干扰效果,此外,还有利于降低高动态范围图像处理系统100的重新设计过程的设计难度,从而降低成本。The image processor 20 may further include a receiving unit 201 and a memory unit 203. The receiving unit 201 is used to receive color original image data and full-color original image data; the memory unit 203 is used to temporarily store color original image data, full-color original image data, color original images, full-color original images, color intermediate images, and full-color original images. One or more of the intermediate image and the target image. The image processor 20 is provided with a receiving unit 201 and a memory unit 203 to separate the receiving, processing and storage of images, which is conducive to more independent packaging of the modules of the high dynamic range image processing system 100, so that the high dynamic range image processing system 100 has Higher execution efficiency and better anti-interference effect, in addition, are also beneficial to reduce the design difficulty of the redesign process of the high dynamic range image processing system 100, thereby reducing costs.

请参阅图33,本申请还提供一种电子设备1000。本申请实施方式的电子设备1000包括镜头300、壳体200及上述任意一项实施方式的高动态范围图像处理系统100。镜头300、高动态范围图像处理系统100与壳体200结合。镜头300与高动态范围图像处理系统100的图像传感器10配合成像。Please refer to FIG. 33. The present application also provides an electronic device 1000. The electronic device 1000 of the embodiment of the present application includes a lens 300, a housing 200, and the high dynamic range image processing system 100 of any one of the above embodiments. The lens 300 and the high dynamic range image processing system 100 are combined with the housing 200. The lens 300 cooperates with the image sensor 10 of the high dynamic range image processing system 100 for imaging.

电子设备1000可以是手机、平板电脑、笔记本电脑、智能穿戴设备(例如智能手表、智能手环、智能眼镜、智能头盔)、无人机、头显设备等,在此不作限制。The electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, a smart glasses, a smart helmet), a drone, a head-mounted display device, etc., which are not limited here.

The electronic device 1000 of the embodiments of the present application controls the pixel array 11 to perform at least two exposures, at the first exposure time and the second exposure time respectively, and generates multiple images according to the different exposure times and the different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can subsequently be performed on these images to obtain a target image with a high dynamic range. The electronic device 1000 of the embodiments of the present application can realize the high dynamic range function without raising the hardware requirements of the photosensitive pixels of the image sensor 10, so that both the bright and the dark regions of the target image are rendered better, which helps improve imaging performance while reducing cost.

请参阅图34,本申请提供一种高动态范围图像处理方法。本申请实施方式的高动态范围图像 处理方法用于高动态范围图像处理系统100。高动态范围图像处理系统100可以包括图像传感器10。图像传感器10包括像素阵列11。像素阵列11包括多个全色感光像素和多个彩色感光像素。彩色感光像素具有比全色感光像素更窄的光谱响应。像素阵列11包括最小重复单元。每个最小重复单元包含多个子单元。每个子单元包括多个单颜色感光像素及多个全色感光像素。高动态范围图像处理方法包括:Please refer to FIG. 34. This application provides a high dynamic range image processing method. The high dynamic range image processing method of the embodiment of the present application is used in the high dynamic range image processing system 100. The high dynamic range image processing system 100 may include the image sensor 10. The image sensor 10 includes a pixel array 11. The pixel array 11 includes a plurality of full-color photosensitive pixels and a plurality of color photosensitive pixels. Color photosensitive pixels have a narrower spectral response than full-color photosensitive pixels. The pixel array 11 includes the smallest repeating unit. Each minimum repeating unit contains multiple subunits. Each subunit includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels. High dynamic range image processing methods include:

01: Control the exposure of the pixel array 11. The pixel array 11 is exposed for a first exposure time to obtain a first original image. The first original image includes first color original image data generated by the single-color photosensitive pixels exposed for the first exposure time and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the first exposure time. The pixel array 11 is exposed for a second exposure time to obtain a second original image. The second original image includes second color original image data generated by the single-color photosensitive pixels exposed for the second exposure time and second panchromatic original image data generated by the panchromatic photosensitive pixels exposed for the second exposure time. The first exposure time is not equal to the second exposure time; and

02:对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。02: Perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image.
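The two method steps can be summarised in the following Python skeleton. It is a schematic sketch only: `preprocess_color`, `preprocess_pano`, `merge_exposures`, and `fuse` are trivial placeholders with illustrative names standing in for the pixel completion/addition/averaging, demosaicing, brightness alignment, and fusion operations described in this application; the sketch merges the exposures on the preprocessed images, whereas the embodiments below also allow the merge to be performed on the raw data before preprocessing or on the target images after fusion.

```python
import numpy as np

def preprocess_color(raw):   # placeholder for pixel completion / addition / averaging
    return raw.astype(np.float32)

def preprocess_pano(raw):    # placeholder for demosaicing of the panchromatic data
    return raw.astype(np.float32)

def merge_exposures(images, times):
    # Placeholder HDR merge: align brightness to the longest exposure, then average.
    ref = max(times)
    aligned = [img * (ref / t) for img, t in zip(images, times)]
    return np.mean(aligned, axis=0)

def fuse(color, pano):
    # Placeholder fusion: let the panchromatic luminance modulate the colour image.
    luma = color.mean(axis=-1, keepdims=True) + 1e-6
    return color * (pano[..., None] / luma)

def hdr_pipeline(capture, exposure_times=(1.0, 4.0)):
    """Step 01: expose once per exposure time; step 02: preprocess, HDR-merge and fuse."""
    frames = [capture(t) for t in exposure_times]        # (color_raw, pano_raw) per exposure
    colors = [preprocess_color(c) for c, _ in frames]
    panos = [preprocess_pano(p) for _, p in frames]
    hdr_color = merge_exposures(colors, exposure_times)
    hdr_pano = merge_exposures(panos, exposure_times)
    return fuse(hdr_color, hdr_pano)

# Toy capture whose brightness scales with the exposure time.
rng = np.random.default_rng(0)
scene_color = rng.uniform(0.1, 1.0, (8, 8, 3))
scene_pano = scene_color.mean(axis=-1)
target = hdr_pipeline(lambda t: (scene_color * t, scene_pano * t))
```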

The high dynamic range image processing method of the embodiments of the present application controls the pixel array 11 to perform at least two exposures, at the first exposure time and the second exposure time respectively, and generates multiple images according to the different exposure times and the different photosensitive pixels, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing can subsequently be performed on these images to obtain a target image with a high dynamic range. The high dynamic range image processing method of the embodiments of the present application can realize the high dynamic range function without raising the hardware requirements of the photosensitive pixels of the image sensor 10, so that both the bright and the dark regions of the target image are rendered better, which helps improve imaging performance while reducing cost.

在某些实施方式中,像素阵列11还可以以第三曝光时间曝光得到第三原始图像。第三原始图像包括以第三曝光时间曝光的单颜色感光像素生成的第三彩色原始图像数据和以第三曝光时间曝光的全色感光像素生成的第三全色原始图像数据。其中,第三曝光时间不等于第一曝光时间,第三曝光时间不等于第二曝光时间。对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像可以包括:In some embodiments, the pixel array 11 may also be exposed for a third exposure time to obtain a third original image. The third original image includes third color original image data generated by the single-color photosensitive pixels exposed at the third exposure time and third full-color original image data generated by the panchromatic photosensitive pixels exposed at the third exposure time. Wherein, the third exposure time is not equal to the first exposure time, and the third exposure time is not equal to the second exposure time. Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain the target image may include:

对第一原始图像、第二原始图像和第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。Perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image.

在某些实施方式中,图像预处理包括像素补全处理和去马赛克处理,图像处理包括第一图像处理和第二图像处理;对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还可以包括:In some embodiments, the image preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing; image preprocessing and high dynamics are performed on the first original image and the second original image. Range processing, image processing and fusion algorithm processing to obtain the target image can also include:

对彩色原始图像数据进行像素补全处理,得到彩色原始图像;Perform pixel complement processing on the color original image data to obtain the color original image;

对全色原始图像数据进行去马赛克处理,得到全色原始图像;De-mosaic processing the panchromatic original image data to obtain the panchromatic original image;

对彩色原始图像进行第一图像处理,得到彩色中间图像;Performing the first image processing on the color original image to obtain a color intermediate image;

对全色原始图像进行第二图像处理,得到全色中间图像;Performing second image processing on the full-color original image to obtain a full-color intermediate image;

对彩色中间图像和全色中间图像进行融合算法处理得到目标图像。The fusion algorithm is performed on the color intermediate image and the panchromatic intermediate image to obtain the target image.
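The fusion of the color intermediate image with the panchromatic intermediate image is not spelled out at this point; the sketch below shows one plausible way such a fusion could be realised, by letting the panchromatic luminance (which gathers more light) replace the luminance of the color image while its chrominance is kept. The function `fuse_color_and_pano` and the simple mean-based luminance estimate are assumptions made for illustration, not the application's fusion algorithm.

```python
import numpy as np

def fuse_color_and_pano(color: np.ndarray, pano: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Fuse a 3-channel colour intermediate image with a single-channel panchromatic
    intermediate image by rescaling each pixel so that its luminance matches the
    panchromatic value while its colour ratios are preserved."""
    color = color.astype(np.float32)
    pano = pano.astype(np.float32)
    luma = color.mean(axis=-1, keepdims=True)        # crude luminance of the colour image
    gain = pano[..., None] / (luma + eps)            # per-pixel luminance correction
    return np.clip(color * gain, 0.0, None)

# Example: 4x4 colour and panchromatic intermediates of matching resolution.
color_mid = np.random.default_rng(1).uniform(0.0, 1.0, (4, 4, 3))
pano_mid = color_mid.mean(axis=-1) * 1.2             # panchromatic channel gathers more light
target_image = fuse_color_and_pano(color_mid, pano_mid)
```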

在某些实施方式中,在对彩色中间图像和全色中间图像进行融合算法处理得到目标图像之后,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:In some embodiments, after the color intermediate image and the panchromatic intermediate image are processed by the fusion algorithm to obtain the target image, image preprocessing, high dynamic range processing, image processing, and fusion are performed on the first original image and the second original image. The algorithm processing to obtain the target image also includes:

将至少两次曝光对应的目标图像融合得到高动态的目标图像。Fusion of target images corresponding to at least two exposures to obtain a highly dynamic target image.

在某些实施方式中,在对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:In some embodiments, before performing pixel complement processing on the color original image data to obtain the color original image, image preprocessing, high dynamic range processing, image processing, and fusion algorithms are performed on the first original image and the second original image The target image obtained by processing also includes:

将至少两次曝光对应的彩色原始图像数据融合得到高动态的彩色原始图像数据;Fusion of color original image data corresponding to at least two exposures to obtain highly dynamic color original image data;

在对全色原始图像数据进行去马赛克处理,得到全色原始图像之前,对第一原始图像和第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:Before the panchromatic original image data is demosaiced to obtain the panchromatic original image, the first original image and the second original image are subjected to image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing to obtain the target image. :

将至少两次曝光对应的全色原始图像数据融合得到高动态的全色原始图像数据。The full-color original image data corresponding to at least two exposures are fused to obtain high-dynamic full-color original image data.

在某些实施方式中,第一图像处理包括:In some embodiments, the first image processing includes:

黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理、去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个;One or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing and color conversion processing;

第二图像处理包括:The second image processing includes:

黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理和全局色调映射处理中的一个或多个。One or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, and global tone mapping processing.

在某些实施方式中,第一图像处理包括第一图像子处理和第二图像子处理,彩色处理模块2021用于对彩色原始图像先进行第一图像子处理,再进行第二图像子处理,第一图像子处理包括:In some embodiments, the first image processing includes a first image sub-processing and a second image sub-processing, and the color processing module 2021 is configured to perform the first image sub-processing on the color original image first, and then perform the second image sub-processing, The first image sub-processing includes:

黑电平矫正处理、镜头阴影矫正处理和坏点补偿处理中的一个或多个;One or more of black level correction processing, lens shading correction processing and dead pixel compensation processing;

第二图像子处理包括:The second image sub-processing includes:

去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。One or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing.
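The listed operations are named but not detailed here. As an illustration of one of them, the sketch below implements a simple global tone mapping curve (a Reinhard-style x/(1+x) compression); this particular curve is an assumption chosen for the example and is not prescribed by the application.

```python
import numpy as np

def global_tone_map(image: np.ndarray, exposure: float = 1.0) -> np.ndarray:
    """Global tone mapping with a single x / (1 + x) curve that compresses
    high-dynamic-range values into the [0, 1) display range."""
    x = np.clip(image.astype(np.float32) * exposure, 0.0, None)
    return x / (1.0 + x)

hdr_values = np.array([0.05, 0.5, 2.0, 10.0, 100.0])
print(global_tone_map(hdr_values))   # dark values are largely kept, highlights are compressed
```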

在某些实施方式中,将至少两次曝光对应的目标图像融合得到高动态的目标图像包括:In some embodiments, fusing the target images corresponding to at least two exposures to obtain a highly dynamic target image includes:

将至少两次曝光对应的目标图像进行亮度对齐处理,以得到亮度对齐的目标图像,再融合亮度对齐的目标图像及一张或多张目标图像以得到高动态的目标图像。The target image corresponding to at least two exposures is subjected to brightness alignment processing to obtain a brightness-aligned target image, and then the brightness-aligned target image and one or more target images are merged to obtain a highly dynamic target image.
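The brightness alignment and subsequent fusion are described only at this level of detail. The sketch below shows one way they are commonly realised, assuming that alignment scales each frame by the ratio of exposure times and that the merge weights each pixel by how well exposed it was in its original frame; the functions `align_brightness` and `merge_aligned`, the Gaussian weighting, and the example exposure values are illustrative assumptions, not the application's exact procedure.

```python
import numpy as np

def align_brightness(image: np.ndarray, exposure_time: float, ref_time: float) -> np.ndarray:
    """Scale an image by the exposure-time ratio so that its brightness matches the
    reference exposure (one simple realisation of brightness alignment)."""
    return image.astype(np.float32) * (ref_time / exposure_time)

def merge_aligned(images, raw_images):
    """Weighted merge of brightness-aligned images: pixels that were well exposed in
    their original frame (far from under- or over-exposure) receive larger weights."""
    weights = [np.exp(-((raw.astype(np.float32) - 0.5) ** 2) / 0.08) + 1e-4 for raw in raw_images]
    return sum(w * img for w, img in zip(weights, images)) / sum(weights)

# Two exposures of the same scene, normalised to [0, 1]; the long exposure clips highlights.
short_raw = np.array([[0.1, 0.4], [0.7, 0.95]])
long_raw = np.clip(short_raw * 4.0, 0.0, 1.0)
aligned_long = align_brightness(long_raw, 4.0, 1.0)   # bring it to the short exposure's scale
hdr = merge_aligned([short_raw, aligned_long], [short_raw, long_raw])
```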

在某些实施方式中,将至少两次曝光对应的彩色原始图像数据融合得到高动态的彩色原始图像数据包括:In some embodiments, fusing color raw image data corresponding to at least two exposures to obtain high dynamic color raw image data includes:

将至少两次曝光对应的彩色原始图像数据进行亮度对齐处理,以得到亮度对齐的彩色原始图像数据,再融合亮度对齐的彩色原始图像数据及一张或多张彩色原始图像数据以得到高动态的彩色原始图像数据;Perform brightness alignment processing on the color original image data corresponding to at least two exposures to obtain brightness-aligned color original image data, and then merge the brightness-aligned color original image data and one or more color original image data to obtain a highly dynamic Color original image data;

将至少两次曝光对应的全色原始图像数据融合得到高动态的全色原始图像数据包括:Fusion of panchromatic raw image data corresponding to at least two exposures to obtain high dynamic panchromatic raw image data includes:

将至少两次曝光对应的全色原始图像数据进行亮度对齐处理,以得到亮度对齐的全色原始图像数据,再融合亮度对齐的全色原始图像数据及一张或多张全色原始图像数据以得到高动态的全色原始图像数据。Perform brightness alignment processing on the panchromatic original image data corresponding to at least two exposures to obtain brightness aligned panchromatic original image data, and then merge the brightness aligned panchromatic original image data and one or more panchromatic original image data to Obtain high dynamic panchromatic original image data.

在某些实施方式中,高动态范围像处理方法还包括:In some embodiments, the high dynamic range image processing method further includes:

接收彩色原始图像数据和全色原始图像数据;和Receiving color original image data and panchromatic original image data; and

暂存彩色原始图像数据、全色原始图像数据、彩色原始图像、全色原始图像、彩色中间图像、全色中间图像和目标图像中的一个或多个。Temporarily store one or more of color original image data, full color original image data, color original image, full color original image, color intermediate image, full color intermediate image, and target image.

在某些实施方式中,图像预处理包括像素相加处理和去马赛克处理,对第一原始图像、第二原始图像和第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像包括:In some embodiments, the image preprocessing includes pixel addition processing and demosaicing processing, and image preprocessing, high dynamic range processing, image processing, and fusion algorithms are performed on the first original image, the second original image, and the third original image. The processed target image includes:

对彩色原始图像数据进行像素相加处理,得到彩色原始图像;和Perform pixel addition processing on the color original image data to obtain a color original image; and

对全色原始图像数据进行去马赛克处理,得到全色原始图像;或者Demosaic processing the panchromatic original image data to obtain a panchromatic original image; or

图像预处理包括像素求平均处理和去马赛克处理,对第一原始图像、第二原始图像和第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像包括:Image preprocessing includes pixel averaging processing and demosaicing processing. Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image, the second original image, and the third original image to obtain the target image includes:

对彩色原始图像数据进行像素求平均处理,得到彩色原始图像;和Perform pixel averaging processing on the color original image data to obtain the color original image; and

对全色原始图像数据进行去马赛克处理,得到全色原始图像。Demosaic processing is performed on the panchromatic original image data to obtain a panchromatic original image.

上述任意一项实施方式的高动态范围图像处理方法的具体实施过程与前述高动态范围图像处理系统100获得目标图像的具体实施过程相同,在此不再展开说明。The specific implementation process of the high dynamic range image processing method of any one of the above embodiments is the same as the specific implementation process of the aforementioned high dynamic range image processing system 100 to obtain a target image, and will not be further described here.

Please refer to FIG. 35. The present application further provides a non-volatile computer-readable storage medium 400 containing a computer program. When the computer program is executed by the processor 60, the processor 60 is caused to execute the high dynamic range image processing method described in any one of the foregoing embodiments.

综上,本申请实施方式的高动态范围图像处理系统100及方法、电子设备1000和计算机可读存储介质400通过控制像素阵列11至少分别以第一曝光时间和第二曝光时间进行两次曝光,并且根据不同的曝光时间和不同的感光像素生成多张图像,以便后续对此多张图像进行图像预处理、高动态范围处理、图像处理和融合算法处理,从而得到具有高动态范围的目标图像。本申请实施方式的高动态范围图像处理系统100及方法、电子设备1000和计算机可读存储介质400在无需提高图像传感器10的感光像素硬件参数的情况下,就能实现高动态范围功能,使得目标图像的亮处、暗位都能够具有更佳的表现,有利于提高成像性能的同时,有助于降低成本。In summary, the high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application control the pixel array 11 to at least perform two exposures at the first exposure time and the second exposure time, respectively. And according to different exposure times and different photosensitive pixels, multiple images are generated, so that image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing are performed on these multiple images to obtain a target image with high dynamic range. The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application can realize the high dynamic range function without increasing the hardware parameters of the photosensitive pixels of the image sensor 10, so that the target The bright and dark parts of the image can have better performance, which is conducive to improving imaging performance and helping to reduce costs.

In addition, in the related art an image processor can only process images formed by a conventional pixel array composed of color photosensitive pixels, and is not suited to images produced by a pixel array having both color photosensitive pixels and panchromatic photosensitive pixels. The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application are applicable to images generated by the pixel array 11 having color photosensitive pixels and panchromatic photosensitive pixels. Under the same lighting environment and with the same auxiliary hardware, a panchromatic photosensitive pixel receives more light than a color photosensitive pixel, which increases the brightness of the final image, and since the human eye is more sensitive to luminance than to chrominance, the high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application achieve a better imaging effect.

In the related art, approaches such as increasing the shutter speed or selecting photosensitive pixels whose photosensitive response curve is logarithmic place high demands on the hardware parameters of the image sensor of a high dynamic range camera. The high dynamic range image processing system 100 and method, the electronic device 1000, and the computer-readable storage medium 400 of the embodiments of the present application can realize the high dynamic range processing function, and thus obtain images with a better imaging effect, by providing the high dynamic fusion unit 50 and the fusion module 204 in cooperation with the corresponding exposure scheme, without raising the hardware parameter requirements of the image sensor 10.

在本申请的实施方式的描述中,需要说明的是,除非另有明确的规定和限定,术语“安装”应做广义理解,例如,可以是固定连接,也可以是可拆卸连接,或一体地连接;可以是机械连接,也可以是电连接或可以相互通讯;可以是直接相连,也可以通过中间媒介间接相连,可以是两个元件内部的连通或两个元件的相互作用关系。对于本领域的普通技术人员而言,可以根据具体情况理解上述术语在本申请的实施方式中的具体含义。In the description of the embodiments of the present application, it should be noted that, unless otherwise clearly defined and limited, the term “installation” should be understood in a broad sense, for example, it can be a fixed connection, a detachable connection, or an integral Connection; it can be mechanical connection, it can be electrical connection or it can communicate with each other; it can be directly connected, it can also be indirectly connected through an intermediate medium, it can be the internal communication of two components or the interaction relationship between two components. For those of ordinary skill in the art, the specific meanings of the above-mentioned terms in the embodiments of the present application can be understood according to specific circumstances.

Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application belong.

The logic and/or steps represented in the flowcharts, or otherwise described herein, may for example be considered a sequenced list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processing module, or another system that can fetch instructions from an instruction execution system, apparatus, or device and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in connection with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of computer-readable media include: an electrical connection (control method) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program is printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.

应当理解,本申请的实施方式的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。It should be understood that each part of the embodiments of the present application can be implemented by hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods can be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if it is implemented by hardware, as in another embodiment, it can be implemented by any one or a combination of the following technologies known in the art: Discrete logic circuits, application specific integrated circuits with suitable combinational logic gates, programmable gate array (PGA), field programmable gate array (FPGA), etc.

本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。A person of ordinary skill in the art can understand that all or part of the steps carried in the method of the foregoing embodiments can be implemented by a program instructing relevant hardware to complete. The program can be stored in a computer-readable storage medium. When executed, it includes one of the steps of the method embodiment or a combination thereof.

此外,在本申请的各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。In addition, each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software function module and sold or used as an independent product, it can also be stored in a computer readable storage medium.

上述提到的存储介质可以是只读存储器,磁盘或光盘等。The aforementioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.

在本说明书的描述中,参考术语“某些实施方式”等的描述意指结合所述实施方式或示例描述的具体特征、结构或者特点包含于本申请的至少一个实施方式中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式。而且,描述的具体特征、结构或者特点可以在任何的一个或多个实施方式中以合适的方式结合。In the description of this specification, the description with reference to the term "certain embodiments" and the like means that a specific feature, structure, or characteristic described in conjunction with the embodiment or example is included in at least one embodiment of the present application. In this specification, the schematic representation of the above-mentioned terms does not necessarily refer to the same embodiment. Moreover, the described specific features, structures or characteristics can be combined in any one or more embodiments in an appropriate manner.

尽管上面已经示出和描述了本申请实施方式,可以理解,上述实施方式是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施方式进行变化、修改、替换和变型,本申请的范围由权利要求及其等同物限定。Although the embodiments of the present application have been shown and described above, it can be understood that the foregoing embodiments are exemplary and should not be construed as limiting the present application. Those of ordinary skill in the art can perform the above-mentioned embodiments within the scope of the present application. For changes, modifications, substitutions and variations, the scope of this application is defined by the claims and their equivalents.

Claims (25)

一种高动态范围图像处理系统,其特征在于,包括图像传感器、高动态融合单元和图像处理器;A high dynamic range image processing system, which is characterized by comprising an image sensor, a high dynamic fusion unit and an image processor; 所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素,所述彩色感光像素具有比所述全色感光像素更窄的光谱响应,所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元,每个所述子单元包括多个单颜色感光像素及多个全色感光像素;The image sensor includes a pixel array, the pixel array includes a plurality of panchromatic photosensitive pixels and a plurality of color photosensitive pixels, the color photosensitive pixels have a narrower spectral response than the panchromatic photosensitive pixels, and the pixel array includes A minimum repeating unit, each of the minimum repeating units includes a plurality of sub-units, and each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels; 所述像素阵列以第一曝光时间曝光得到第一原始图像,所述第一原始图像包括以所述第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以所述第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,所述第二原始图像包括以所述第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以所述第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,所述第一曝光时间不等于所述第二曝光时间;The pixel array is exposed at a first exposure time to obtain a first original image, and the first original image includes first color original image data generated by the single-color photosensitive pixel exposed at the first exposure time and The first full-color original image data generated by the full-color photosensitive pixels exposed for the first exposure time; the pixel array is exposed for the second exposure time to obtain a second original image, and the second original image includes the second original image. The second color original image data generated by the single-color photosensitive pixel exposed at the exposure time and the second full-color original image data generated by the panchromatic photosensitive pixel exposed at the second exposure time; wherein, the first The exposure time is not equal to the second exposure time; 所述图像处理器和所述高动态融合单元用于对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。The image processor and the high dynamic fusion unit are configured to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述像素阵列以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间;The high dynamic range image processing system according to claim 1, wherein the pixel array is exposed at a third exposure time to obtain a third original image, and the third original image includes a third original image exposed at the third exposure time. The third color original image data generated by the single-color photosensitive pixel and the third panchromatic original image data generated by the panchromatic photosensitive pixel exposed at the third exposure time; wherein the third exposure time is not equal to The first exposure time, the third exposure time is not equal to the second exposure time; 所述图像处理器、所述高动态融合单元用于对所述第一原始图像、所述第二原始图像和所述第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。The image processor and the high dynamic fusion unit are used to perform image preprocessing, high dynamic range processing, image processing, and fusion algorithms on the first original image, the second original image, and the third original image Process to get the target image. 
根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述图像处理器包括彩色预处理模块、全色预处理模块、彩色处理模块、全色处理模块和融合模块,所述图像预处理包括像素补全处理和去马赛克处理,所述图像处理包括第一图像处理和第二图像处理;其中:The high dynamic range image processing system according to claim 1, wherein the image processor includes a color preprocessing module, a full color preprocessing module, a color processing module, a full color processing module, and a fusion module, and the image The preprocessing includes pixel completion processing and demosaicing processing, and the image processing includes first image processing and second image processing; wherein: 所述彩色预处理模块用于对彩色原始图像数据进行像素补全处理,得到彩色原始图像;The color preprocessing module is used to perform pixel complement processing on color original image data to obtain a color original image; 所述全色预处理模块用于对全色原始图像数据进行去马赛克处理,得到全色原始图像;The full-color preprocessing module is used for demosaicing the full-color original image data to obtain the full-color original image; 所述彩色处理模块用于对所述彩色原始图像进行第一图像处理,得到彩色中间图像;The color processing module is configured to perform first image processing on the color original image to obtain a color intermediate image; 所述全色处理模块用于对所述全色原始图像进行第二图像处理,得到全色中间图像;The panchromatic processing module is configured to perform second image processing on the panchromatic original image to obtain a panchromatic intermediate image; 所述融合模块用于对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像。The fusion module is used to perform fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image. 根据权利要求3所述的高动态范围图像处理系统,其特征在于,在所述融合模块对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像之后:The high dynamic range image processing system according to claim 3, characterized in that, after the fusion module performs fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image: 所述高动态融合单元用于将至少两次曝光对应的所述目标图像融合得到高动态的所述目标图像。The high dynamic fusion unit is used for fusing the target images corresponding to at least two exposures to obtain the highly dynamic target image. 根据权利要求3所述的高动态范围图像处理系统,其特征在于,所述高动态融合单元包括彩色高动态融合单元和全色高动态融合单元,在所述彩色预处理模块对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前:The high dynamic range image processing system according to claim 3, wherein the high dynamic fusion unit includes a color high dynamic fusion unit and a panchromatic high dynamic fusion unit, and the color original image data is processed by the color preprocessing module Before performing pixel completion processing to get the color original image: 所述彩色高动态融合单元用于将至少两次曝光对应的所述彩色原始图像数据融合得到高动态的所述彩色原始图像数据;The color high-dynamic fusion unit is configured to fuse the color original image data corresponding to at least two exposures to obtain the high-dynamic color original image data; 在所述全色预处理模块对全色原始图像数据进行去马赛克处理,得到全色原始图像之前:Before the panchromatic preprocessing module performs demosaic processing on the panchromatic original image data to obtain the panchromatic original image: 所述全色高动态融合单元用于将至少两次曝光对应的所述全色原始图像数据融合得到高动态的所述全色原始图像数据。The panchromatic high dynamic fusion unit is used for fusing the panchromatic original image data corresponding to at least two exposures to obtain the high dynamic panchromatic original image data. 
根据权利要求3所述的高动态范围图像处理系统,其特征在于,所述第一图像处理包括:The high dynamic range image processing system according to claim 3, wherein the first image processing comprises: 黑电平矫正处理、镜头阴影矫正处理、坏点补偿处理、去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个;One or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing and color conversion processing; 所述第二图像处理包括:The second image processing includes: 所述黑电平矫正处理、所述镜头阴影矫正处理、所述坏点补偿处理和所述全局色调映射处理中的一个或多个。One or more of the black level correction processing, the lens shading correction processing, the dead pixel compensation processing, and the global tone mapping processing. 根据权利要求6所述的高动态范围图像处理系统,其特征在于,所述第一图像处理包括第一图像子处理和第二图像子处理,所述彩色处理模块用于对所述彩色原始图像先进行第一图像子处理,再进行第二图像子处理,所述第一图像子处理包括:The high dynamic range image processing system according to claim 6, wherein the first image processing includes a first image sub-processing and a second image sub-processing, and the color processing module is used for processing the color original image Perform the first image sub-processing first, and then perform the second image sub-processing, and the first image sub-processing includes: 黑电平矫正处理、镜头阴影矫正处理和坏点补偿处理中的一个或多个;One or more of black level correction processing, lens shading correction processing and dead pixel compensation processing; 所述第二图像子处理包括:The second image sub-processing includes: 去马赛克处理、色彩矫正处理、全局色调映射处理和色彩转换处理中的一个或多个。One or more of demosaicing processing, color correction processing, global tone mapping processing, and color conversion processing. 根据权利要求4所述的高动态范围图像处理系统,其特征在于,所述高动态融合单元用于:The high dynamic range image processing system according to claim 4, wherein the high dynamic range fusion unit is used for: 将至少两次曝光对应的所述目标图像进行亮度对齐处理,以得到亮度对齐的所述目标图像,再融合亮度对齐的所述目标图像及一张或多张所述目标图像以得到所述高动态的所述目标图像。Perform brightness alignment processing on the target image corresponding to at least two exposures to obtain the brightness-aligned target image, and then merge the brightness-aligned target image and one or more of the target images to obtain the high The dynamic target image. 根据权利要求5所述的高动态范围图像处理系统,其特征在于,所述彩色高动态融合单元用于:The high dynamic range image processing system according to claim 5, wherein the color high dynamic fusion unit is used for: 将至少两次曝光对应的所述彩色原始图像数据进行亮度对齐处理,以得到亮度对齐的所述彩色原始图像数据,再融合亮度对齐的所述彩色原始图像数据及一张或多张所述彩色原始图像数据以得到高动态的所述彩色原始图像数据;Perform brightness alignment processing on the color original image data corresponding to at least two exposures to obtain the brightness-aligned color original image data, and then merge the brightness-aligned color original image data and one or more sheets of the color Original image data to obtain the high-dynamic color original image data; 所述全色高动态融合单元用于:The panchromatic high dynamic fusion unit is used for: 将至少两次曝光对应的所述全色原始图像数据进行亮度对齐处理,以得到亮度对齐的所述全色原始图像数据,再融合亮度对齐的所述全色原始图像数据及一张或多张所述全色原始图像数据以得到高动态的所述全色原始图像数据。Perform brightness alignment processing on the panchromatic original image data corresponding to at least two exposures to obtain brightness aligned panchromatic original image data, and then merge the brightness aligned panchromatic original image data and one or more sheets The full-color original image data is used to obtain the high-dynamic full-color original image data. 
根据权利要求4或5所述的高动态范围图像处理系统,其特征在于,所述图像处理器还包括:The high dynamic range image processing system according to claim 4 or 5, wherein the image processor further comprises: 接收单元,所述接收单元用于接收所述彩色原始图像数据和所述全色原始图像数据;和A receiving unit configured to receive the color original image data and the full-color original image data; and 内存单元,所述内存单元用于暂存所述彩色原始图像数据、所述全色原始图像数据、所述彩色原始图像、所述全色原始图像、所述彩色中间图像、所述全色中间图像和所述目标图像中的一个或多个。A memory unit, the memory unit is used to temporarily store the color original image data, the full color original image data, the color original image, the full color original image, the color intermediate image, the full color intermediate One or more of the image and the target image. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述图像处理器包括彩色预处理模块和全色预处理模块,The high dynamic range image processing system according to claim 1, wherein the image processor includes a color preprocessing module and a full color preprocessing module, 所述图像预处理包括像素相加处理和去马赛克处理,所述彩色预处理模块用于对彩色原始图像数据进行像素相加处理,得到彩色原始图像,所述全色预处理模块用于对全色原始图像数据进行去马赛克处理,得到全色原始图像;或者The image preprocessing includes pixel addition processing and demosaicing processing, the color preprocessing module is used to perform pixel addition processing on color original image data to obtain a color original image, and the panchromatic preprocessing module is used to Demosaicing the original image data to obtain a full-color original image; or 所述图像预处理包括像素求平均处理和去马赛克处理,所述彩色预处理模块用于对彩色原始图像数据进行像素求平均处理,得到彩色原始图像,所述全色预处理模块用于对全色原始图像数据进行去马赛克处理,得到全色原始图像。The image preprocessing includes pixel averaging processing and demosaicing processing. The color preprocessing module is used to perform pixel averaging processing on color original image data to obtain a color original image. The panchromatic preprocessing module is used to The color original image data is demosaiced to obtain a full-color original image. 根据权利要求1所述的高动态范围图像处理系统,其特征在于,所述高动态融合单元集成在所述图像传感器中;或所述高动态融合单元集成在所述图像处理器中。The high dynamic range image processing system according to claim 1, wherein the high dynamic fusion unit is integrated in the image sensor; or the high dynamic fusion unit is integrated in the image processor. 一种高动态范围图像处理方法,用于高动态范围图像处理系统,其特征在于,所述高动态范围图像处理系统包括图像传感器,所述图像传感器包括像素阵列,所述像素阵列包括多个全色感光像素和多个彩色感光像素,所述彩色感光像素具有比所述全色感光像素更窄的光谱响应,所述像素阵列包括最小重复单元,每个所述最小重复单元包含多个子单元,每个所述子单元包括多个单颜色感光像素及多个全色感光像素;所述高动态范围图像处理方法包括:A high dynamic range image processing method for a high dynamic range image processing system, wherein the high dynamic range image processing system includes an image sensor, the image sensor includes a pixel array, and the pixel array includes a plurality of pixels. 
A color photosensitive pixel and a plurality of color photosensitive pixels, the color photosensitive pixel has a narrower spectral response than the full-color photosensitive pixel, the pixel array includes a minimum repeating unit, and each minimum repeating unit includes a plurality of subunits, Each of the sub-units includes a plurality of single-color photosensitive pixels and a plurality of full-color photosensitive pixels; the high dynamic range image processing method includes: 控制所述像素阵列曝光,其中,所述像素阵列以第一曝光时间曝光得到第一原始图像,所述第一原始图像包括以所述第一曝光时间曝光的所述单颜色感光像素生成的第一彩色原始图像数据和以所述第一曝光时间曝光的所述全色感光像素生成的第一全色原始图像数据;所述像素阵列以第二曝光时间曝光得到第二原始图像,所述第二原始图像包括以所述第二曝光时间曝光的所述单颜色感光像素生成的第二彩色原始图像数据和以所述第二曝光时间曝光的所述全色感光像素生成的第二全色原始图像数据;其中,所述第一曝光时间不等于所述第二曝光时间;和Control the exposure of the pixel array, wherein the pixel array is exposed for a first exposure time to obtain a first original image, and the first original image includes a first image generated by the single-color photosensitive pixel exposed for the first exposure time. One color original image data and first panchromatic original image data generated by the panchromatic photosensitive pixels exposed at the first exposure time; the pixel array is exposed at the second exposure time to obtain a second original image, the first The two original images include second color original image data generated by the single-color photosensitive pixels exposed at the second exposure time and second full-color original image data generated by the panchromatic photosensitive pixels exposed at the second exposure time Image data; wherein the first exposure time is not equal to the second exposure time; and 对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image. 根据权利要求13所述的高动态范围图像处理方法,其特征在于,所述像素阵列以第三曝光时间曝光得到第三原始图像,所述第三原始图像包括以所述第三曝光时间曝光的所述单颜色感光像素生成的第三彩色原始图像数据和以所述第三曝光时间曝光的所述全色感光像素生成的第三全色原始图像数据;其中,所述第三曝光时间不等于所述第一曝光时间,所述第三曝光时间不等于所述第二曝光时间;所述对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像包括:The high dynamic range image processing method according to claim 13, wherein the pixel array is exposed at a third exposure time to obtain a third original image, and the third original image includes a third original image exposed at the third exposure time. The third color original image data generated by the single-color photosensitive pixel and the third panchromatic original image data generated by the panchromatic photosensitive pixel exposed at the third exposure time; wherein the third exposure time is not equal to The first exposure time and the third exposure time are not equal to the second exposure time; the image preprocessing, high dynamic range processing, and image processing are performed on the first original image and the second original image The target image obtained by processing and fusion algorithm includes: 对所述第一原始图像、所述第二原始图像和所述第三原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像。Image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing are performed on the first original image, the second original image, and the third original image to obtain a target image. 
根据权利要求13所述的高动态范围图像处理方法,其特征在于,所述图像预处理包括像素补全处理和去马赛克处理,所述图像处理包括第一图像处理和第二图像处理;所述对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像包括:The high dynamic range image processing method according to claim 13, wherein the image preprocessing includes pixel complement processing and demosaicing processing, and the image processing includes first image processing and second image processing; Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image includes: 对彩色原始图像数据进行像素补全处理,得到彩色原始图像;Perform pixel complement processing on the color original image data to obtain the color original image; 对全色原始图像数据进行去马赛克处理,得到全色原始图像;De-mosaic processing the panchromatic original image data to obtain the panchromatic original image; 对所述彩色原始图像进行第一图像处理,得到彩色中间图像;Performing first image processing on the color original image to obtain a color intermediate image; 对所述全色原始图像进行第二图像处理,得到全色中间图像;Performing second image processing on the full-color original image to obtain a full-color intermediate image; 对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像。Performing fusion algorithm processing on the color intermediate image and the panchromatic intermediate image to obtain the target image. 根据权利要求15所述的高动态范围图像处理方法,其特征在于,在所述对所述彩色中间图像和所述全色中间图像进行融合算法处理得到所述目标图像之后,所述对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:The high dynamic range image processing method according to claim 15, wherein after the target image is obtained by performing fusion algorithm processing on the color intermediate image and the panchromatic intermediate image, the Performing image preprocessing, high dynamic range processing, image processing, and fusion algorithm processing on the first original image and the second original image to obtain a target image further includes: 将至少两次曝光对应的所述目标图像融合得到高动态的所述目标图像。Fusion of the target images corresponding to at least two exposures to obtain the target image with high dynamics. 根据权利要求15所述的高动态范围图像处理方法,其特征在于,在对彩色原始图像数据进行像素补全处理,得到彩色原始图像之前,所述对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:The high dynamic range image processing method according to claim 15, characterized in that, before performing pixel complementation processing on color original image data to obtain a color original image, said pairing said first original image and said second original image The original image is processed by image preprocessing, high dynamic range processing, image processing and fusion algorithm processing to obtain the target image also includes: 将至少两次曝光对应的所述彩色原始图像数据融合得到高动态的所述彩色原始图像数据;Fusing the color original image data corresponding to at least two exposures to obtain the high dynamic color original image data; 在对全色原始图像数据进行去马赛克处理,得到全色原始图像之前,所述对所述第一原始图像和所述第二原始图像进行图像预处理、高动态范围处理、图像处理和融合算法处理得到目标图像还包括:Before performing demosaic processing on the panchromatic original image data to obtain a panchromatic original image, the first original image and the second original image are subjected to image preprocessing, high dynamic range processing, image processing, and fusion algorithm The processed target image also includes: 将至少两次曝光对应的所述全色原始图像数据融合得到高动态的所述全色原始图像数据。The full-color original image data corresponding to at least two exposures are fused to obtain the high-dynamic full-color original image data. 
The high dynamic range image processing method according to claim 15, wherein the first image processing comprises:

one or more of black level correction processing, lens shading correction processing, dead pixel compensation processing, demosaicing processing, color correction processing, global tone mapping processing and color conversion processing; and

the second image processing comprises:

one or more of the black level correction processing, the lens shading correction processing, the dead pixel compensation processing and the global tone mapping processing.

The high dynamic range image processing method according to claim 18, wherein the first image processing comprises first image sub-processing and second image sub-processing, and the color processing module is configured to perform the first image sub-processing on the color original image before performing the second image sub-processing, the first image sub-processing comprising:

one or more of black level correction processing, lens shading correction processing and dead pixel compensation processing; and

the second image sub-processing comprising:

one or more of demosaicing processing, color correction processing, global tone mapping processing and color conversion processing.

The high dynamic range image processing method according to claim 16, wherein the fusing the target images corresponding to at least two exposures to obtain the high-dynamic target image comprises:

performing brightness alignment processing on the target images corresponding to at least two exposures to obtain a brightness-aligned target image, and then fusing the brightness-aligned target image with one or more of the target images to obtain the high-dynamic target image.
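The operations listed above are standard ISP stages. A hedged sketch of three of them (black level correction, lens shading correction, global tone mapping), applied in the claim-19 order of corrections first and tone mapping later, is shown below; the black level of 64, the radial gain model and the Reinhard-style curve are assumptions made for illustration, not parameters disclosed by the patent.

```python
import numpy as np

def black_level_correction(img: np.ndarray, black_level: float = 64.0) -> np.ndarray:
    """Subtract the sensor's dark offset so that true black maps to zero."""
    return np.clip(img - black_level, 0.0, None)

def lens_shading_correction(img: np.ndarray, strength: float = 0.4) -> np.ndarray:
    """Multiply by a radial gain map that undoes corner fall-off (vignetting)."""
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
    gain = 1.0 + strength * (r / max(float(r.max()), 1e-6)) ** 2
    return img * gain if img.ndim == 2 else img * gain[..., None]

def global_tone_mapping(img: np.ndarray, key: float = 0.18) -> np.ndarray:
    """Reinhard-style global operator: scale to a middle-grey 'key', then
    compress so that the scene maximum maps to about 1.0."""
    scaled = key * img / max(float(img.mean()), 1e-6)
    l_white = max(float(scaled.max()), 1e-6)
    return scaled * (1.0 + scaled / l_white ** 2) / (1.0 + scaled)

def first_image_sub_processing(color_img: np.ndarray) -> np.ndarray:
    """Corrections first, as in the claim-19 first image sub-processing."""
    return lens_shading_correction(black_level_correction(color_img))

if __name__ == "__main__":
    frame = np.full((6, 6), 512.0)                  # flat 10-bit test frame
    out = global_tone_mapping(first_image_sub_processing(frame))
    print(float(out.min()), float(out.max()))       # corners brightest, max ~1.0
```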
The high dynamic range image processing method according to claim 17, wherein the fusing the color original image data corresponding to at least two exposures to obtain the high-dynamic color original image data comprises:

performing brightness alignment processing on the color original image data corresponding to at least two exposures to obtain brightness-aligned color original image data, and then fusing the brightness-aligned color original image data with one or more frames of the color original image data to obtain the high-dynamic color original image data; and

the fusing the panchromatic original image data corresponding to at least two exposures to obtain the high-dynamic panchromatic original image data comprises:

performing brightness alignment processing on the panchromatic original image data corresponding to at least two exposures to obtain brightness-aligned panchromatic original image data, and then fusing the brightness-aligned panchromatic original image data with one or more frames of the panchromatic original image data to obtain the high-dynamic panchromatic original image data.

The high dynamic range image processing method according to claim 16 or 17, further comprising:

receiving the color original image data and the panchromatic original image data; and

temporarily storing one or more of the color original image data, the panchromatic original image data, the color original image, the panchromatic original image, the color intermediate image, the panchromatic intermediate image and the target image.

The high dynamic range image processing method according to claim 13, wherein the image preprocessing comprises pixel addition processing and demosaicing processing, and the performing image preprocessing, high dynamic range processing, image processing and fusion algorithm processing on the first original image, the second original image and the third original image to obtain a target image comprises:

performing pixel addition processing on color original image data to obtain a color original image; and

performing demosaicing processing on panchromatic original image data to obtain a panchromatic original image; or

the image preprocessing comprises pixel averaging processing and demosaicing processing, and the performing image preprocessing, high dynamic range processing, image processing and fusion algorithm processing on the first original image, the second original image and the third original image to obtain a target image comprises:

performing pixel averaging processing on the color original image data to obtain the color original image; and

performing demosaicing processing on the panchromatic original image data to obtain the panchromatic original image.
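The brightness-alignment-then-fusion order recited above can be sketched as follows, and the same sketch would apply equally to color raw image data and panchromatic raw image data. The exposure-ratio scaling, the fixed saturation threshold and the hard replacement rule are illustrative assumptions; the claims specify only that alignment precedes fusion, not this particular rule.

```python
import numpy as np

def brightness_alignment(short_raw: np.ndarray,
                         t_short: float, t_long: float) -> np.ndarray:
    """Scale the short exposure by the exposure-time ratio so that its
    brightness matches the long exposure."""
    return short_raw * (t_long / t_short)

def fuse_high_dynamic(long_raw: np.ndarray, short_raw: np.ndarray,
                      t_long: float, t_short: float,
                      saturation: float = 1000.0) -> np.ndarray:
    """Keep well-exposed long-exposure samples; use brightness-aligned
    short-exposure samples where the long exposure is saturated."""
    aligned_short = brightness_alignment(short_raw, t_short, t_long)
    clipped = long_raw >= saturation
    return np.where(clipped, aligned_short, long_raw)

if __name__ == "__main__":
    t_long, t_short = 1 / 30, 1 / 120
    long_raw = np.array([[200.0, 1023.0], [512.0, 1023.0]])   # highlights clipped
    short_raw = np.array([[50.0, 300.0], [128.0, 400.0]])
    hdr_raw = fuse_high_dynamic(long_raw, short_raw, t_long, t_short)
    print(hdr_raw)   # clipped entries replaced by 4x-scaled short-exposure data
```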
An electronic device, comprising:

a lens;

a housing; and

the high dynamic range image processing system according to any one of claims 1 to 12, wherein the lens and the high dynamic range image processing system are combined with the housing, and the lens cooperates with an image sensor of the high dynamic range image processing system for imaging.

A non-volatile computer-readable storage medium containing a computer program, wherein when the computer program is executed by a processor, the processor is caused to execute the high dynamic range image processing method according to any one of claims 13 to 23.
PCT/CN2021/077093 2020-04-17 2021-02-20 High dynamic range image processing system and method, electronic device, and storage medium Ceased WO2021208593A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010304152.6 2020-04-17
CN202010304152.6A CN111491110B (en) 2020-04-17 2020-04-17 High dynamic range image processing system and method, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021208593A1 true WO2021208593A1 (en) 2021-10-21

Family

ID=71813690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/077093 Ceased WO2021208593A1 (en) 2020-04-17 2021-02-20 High dynamic range image processing system and method, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN111491110B (en)
WO (1) WO2021208593A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111491110B (en) * 2020-04-17 2021-09-17 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and storage medium
CN111970459B (en) * 2020-08-12 2022-02-18 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device, and readable storage medium
CN111970461B (en) * 2020-08-17 2022-03-22 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN111970460B (en) * 2020-08-17 2022-05-20 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and readable storage medium
CN112270639B (en) * 2020-09-21 2024-04-19 浙江大华技术股份有限公司 Image processing method, image processing device and storage medium
CN112563312B (en) 2020-12-10 2022-04-26 合肥维信诺科技有限公司 Pixel arrangement structure, display panel and mask assembly
CN114697537B (en) * 2020-12-31 2024-05-10 浙江清华柔性电子技术研究院 Image acquisition method, image sensor, and computer-readable storage medium
CN112931482B (en) * 2021-01-15 2022-06-07 国网山西省电力公司晋城供电公司 Shaft tower drives bird ware
CN112911163B (en) * 2021-01-20 2023-04-07 维沃移动通信有限公司 Image exposure method and device and electronic equipment
CN113038025B (en) * 2021-02-26 2023-06-20 Oppo广东移动通信有限公司 An image processing method, terminal, and storage medium
CN113676635B (en) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 Method and device for generating high dynamic range image, electronic equipment and storage medium
CN113676636B (en) * 2021-08-16 2023-05-05 Oppo广东移动通信有限公司 Method and device for generating high dynamic range image, electronic equipment and storage medium
CN114007055B (en) * 2021-10-26 2023-05-23 四川创安微电子有限公司 Image sensor lens shading correction method and device
CN114723834B (en) * 2022-04-08 2024-12-06 Oppo广东移动通信有限公司 Image main color extraction method and device, terminal and computer readable storage medium
CN115471435B (en) * 2022-09-21 2025-07-29 Oppo广东移动通信有限公司 Image fusion method and device, computer readable medium and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102647565A (en) * 2012-04-18 2012-08-22 格科微电子(上海)有限公司 Arrangement method of pixel array, image sensor and image sensing method
CN104170376A (en) * 2012-03-27 2014-11-26 索尼公司 Image processing device, image-capturing element, image processing method, and program
CN105578065A (en) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 High dynamic range image generation method, photographing device and terminal
KR101633893B1 (en) * 2010-01-15 2016-06-28 삼성전자주식회사 Apparatus and Method for Image Fusion
CN107786814A (en) * 2016-08-24 2018-03-09 杭州海康威视数字技术股份有限公司 One kind is based on wide dynamic image processing method, device and exposure circuit
CN110649056A (en) * 2019-09-30 2020-01-03 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
US10560629B2 (en) * 2017-05-23 2020-02-11 Google Llc Systems and methods for automatic exposure in high dynamic range video capture systems
CN111491110A (en) * 2020-04-17 2020-08-04 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8405750B2 (en) * 2009-06-08 2013-03-26 Aptina Imaging Corporation Image sensors and image reconstruction methods for capturing high dynamic range images
TWI644568B (en) * 2013-07-23 2018-12-11 新力股份有限公司 Camera element, camera method and camera program
US9344639B2 (en) * 2014-08-12 2016-05-17 Google Technology Holdings LLC High dynamic range array camera
JP2017112457A (en) * 2015-12-15 2017-06-22 オリンパス株式会社 Imaging device, imaging program, imaging method
CN109413335B (en) * 2017-08-16 2020-12-18 瑞芯微电子股份有限公司 Method and device for synthesizing HDR image by double exposure
CN110049241A (en) * 2019-04-09 2019-07-23 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101633893B1 (en) * 2010-01-15 2016-06-28 삼성전자주식회사 Apparatus and Method for Image Fusion
CN104170376A (en) * 2012-03-27 2014-11-26 索尼公司 Image processing device, image-capturing element, image processing method, and program
CN102647565A (en) * 2012-04-18 2012-08-22 格科微电子(上海)有限公司 Arrangement method of pixel array, image sensor and image sensing method
CN105578065A (en) * 2015-12-18 2016-05-11 广东欧珀移动通信有限公司 High dynamic range image generation method, photographing device and terminal
CN107786814A (en) * 2016-08-24 2018-03-09 杭州海康威视数字技术股份有限公司 One kind is based on wide dynamic image processing method, device and exposure circuit
US10560629B2 (en) * 2017-05-23 2020-02-11 Google Llc Systems and methods for automatic exposure in high dynamic range video capture systems
CN110649056A (en) * 2019-09-30 2020-01-03 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN111491110A (en) * 2020-04-17 2020-08-04 Oppo广东移动通信有限公司 High dynamic range image processing system and method, electronic device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115063333A (en) * 2022-06-29 2022-09-16 西安欧珀通信科技有限公司 Image processing method, apparatus, electronic device, and computer-readable storage medium
CN118803436A (en) * 2024-09-12 2024-10-18 合肥埃科光电科技股份有限公司 High dynamic range image processing method, system and medium based on multi-line sensor

Also Published As

Publication number Publication date
CN111491110B (en) 2021-09-17
CN111491110A (en) 2020-08-04

Similar Documents

Publication Publication Date Title
CN111432099B (en) Image sensor, processing system and method, electronic device, and storage medium
WO2021208593A1 (en) High dynamic range image processing system and method, electronic device, and storage medium
CN111491111B (en) High dynamic range image processing system and method, electronic device and readable storage medium
CN112261391B (en) Image processing method, camera assembly and mobile terminal
US12289544B2 (en) Image acquisition method, electronic device, and non-transitory computer-readable storage medium for obtaining a target image with a same resolution as resolution of a pixel array
CN111586375B (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111479071B (en) High dynamic range image processing system and method, electronic device and readable storage medium
CN111314592A (en) Image processing method, camera assembly and mobile terminal
CN112738493B (en) Image processing method, image processing apparatus, electronic device, and readable storage medium
US12309502B2 (en) Image processing method, camera assembly and mobile terminal
US20220150450A1 (en) Image capturing method, camera assembly, and mobile terminal
CN111970460B (en) High dynamic range image processing system and method, electronic device and readable storage medium
EP4270931A1 (en) Image processing method, image processing system, electronic device, and readable storage medium
CN111970459A (en) High dynamic range image processing system and method, electronic device, and readable storage medium
CN111970461B (en) High dynamic range image processing system and method, electronic device and readable storage medium
CN112822475B (en) Image processing method, image processing apparatus, terminal, and readable storage medium
CN112738494B (en) Image processing method, image processing system, terminal device, and readable storage medium
US20220279108A1 (en) Image sensor and mobile terminal
CN112235485A (en) Image sensor, image processing method, imaging device, terminal, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21789110

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21789110

Country of ref document: EP

Kind code of ref document: A1