
CN111311500B - A method and device for color restoration of an image - Google Patents

A method and device for color restoration of an image

Info

Publication number
CN111311500B
Authority
CN
China
Prior art keywords
image
pixels
saturation
brightness
connected area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811517272.3A
Other languages
Chinese (zh)
Other versions
CN111311500A (en
Inventor
孙超伟
竺旭东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201811517272.3A priority Critical patent/CN111311500B/en
Priority to PCT/CN2019/121126 priority patent/WO2020119454A1/en
Publication of CN111311500A publication Critical patent/CN111311500A/en
Application granted granted Critical
Publication of CN111311500B publication Critical patent/CN111311500B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10024 Color image
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract


The embodiments of the present application provide a method and device for color restoration of an image, which relate to the field of image processing and can accurately restore the color of an overexposed object in an image (for example, a traffic light). The method is as follows: obtain a first image to be processed, the first image including an overexposed first target object; determine a first region of the first image according to the saturation and brightness of the pixels of the first image; binarize the first region to obtain a binary image corresponding to the first region; determine at least one connected region in the binary image whose area is greater than or equal to a first threshold, where the contour of the at least one connected region corresponds to the contour of the first target object; and restore the color of the at least one connected region. The embodiments of the present application apply to scenarios in which color restoration is performed on an overexposed image.

Description

Method and device for performing color restoration on an image
Technical Field
The present application relates to the field of image processing, and in particular, to a method and apparatus for performing color restoration on an image.
Background
Image overexposure is a relatively common phenomenon that can lead to a series of problems. For example, in a traffic video monitoring system, a monitoring camera operating in electronic police mode can simultaneously capture the red light signal and the offending vehicle as evidence of a violation. However, under low ambient light (for example, at dusk or at night), the camera often needs to increase the shutter time, gain, and aperture size when photographing the red signal light for evidence. If the gain, shutter time, or aperture is increased excessively, the captured traffic light may be overexposed (for example, a red signal light may appear yellow or white), so that the captured image cannot be used as evidence of a violation.
At present, methods for color restoration of signal lights can be divided into hardware methods and software methods. The hardware method uses a camera with an ultra-wide dynamic range to eliminate the color distortion of the signal light caused by strong light: such a camera covers a large dynamic range of brightness and can restore image details in high-contrast scenes. However, ultra-wide dynamic range cameras are expensive, and because the signal-light area occupies a very small proportion of the whole picture, their color restoration effect on overexposed signal lights under low illumination is limited. Software method 1 identifies the signal-light area based on the red, green, blue (RGB) color space of the image and then restores the color of the signal light. However, the RGB color space does not reflect the brightness information of the signal light well, which may cause a discontinuous gradient between the color-restored signal-light area and the surrounding image. Software method 2 identifies the signal-light region in the image with a deep learning method and then enhances its color. However, the contour recognition accuracy of deep learning methods for signal lights is not high, so the color-restored area is again prone to discontinuity with the surrounding image.
Therefore, there is a need for a more accurate method of color reproduction of overexposed objects in an image (e.g., overexposed traffic lights).
Disclosure of Invention
The embodiment of the application provides an image processing method that can accurately restore the colors of overexposed objects (such as traffic lights) in an image. Further, it may address a series of problems caused by overexposure of objects in an image (for example, the difficulty of evidence collection caused by overexposure of signal lights in current electronic police monitoring scenarios).
In a first aspect, an embodiment of the present application provides a method for performing color restoration on an image. The method includes: obtaining a first image to be processed, the first image including an overexposed first target object; determining a first region of the first image according to the saturation and brightness of the pixels of the first image, where the saturation of the pixels of the first region is lower than the average saturation of the pixels of the first image, the brightness of the pixels of the first region is higher than the average brightness of the pixels of the first image, and the first region corresponds to (the location of) the first target object; performing binarization processing on the first region to obtain a binary image corresponding to the first region; determining at least one connected region in the binary image whose area is greater than or equal to a first threshold, where the contour of the at least one connected region corresponds to the contour of the first target object; and restoring the color of the at least one connected region.
According to the method provided by the embodiment of the application, after the first image to be processed is obtained, a first region of the first image (which can be taken as the region where the overexposed first target object is located) can be determined according to the saturation and brightness of the pixels of the first image. Binarization processing is then performed on the first region to obtain a corresponding binary image, at least one connected region whose area is greater than or equal to a first threshold is determined in the binary image (the contour of the at least one connected region corresponds to the contour of the first target object), and the color of the at least one connected region (that is, the color of the overexposed first target object) is restored. The method can therefore accurately restore the color of an overexposed object (the first target object, such as a traffic signal light). Further, it may address a series of problems caused by overexposure of objects in an image (for example, the difficulty of evidence collection caused by overexposure of signal lights in current electronic police monitoring scenarios).
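The region-selection, binarization, and connected-component steps described above can be sketched in plain Python. This is a hypothetical, unvectorized illustration only: the mean-based thresholds follow the claim text (saturation below the image average, brightness above it), while the 4-connectivity and breadth-first labelling are implementation assumptions the patent does not prescribe.

```python
from collections import deque

def overexposed_regions(sat, val, min_area):
    """Find candidate overexposed regions on per-pixel saturation/brightness
    grids (row-major lists of lists of floats). Pixels whose saturation is
    below the image mean AND whose brightness is above the image mean form
    the binary image; connected regions with area >= min_area are kept."""
    h, w = len(sat), len(sat[0])
    n = h * w
    s_mean = sum(map(sum, sat)) / n
    v_mean = sum(map(sum, val)) / n
    # binary image: 1 marks a candidate overexposed pixel
    mask = [[1 if sat[y][x] < s_mean and val[y][x] > v_mean else 0
             for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # BFS flood fill collects one 4-connected region
                q, blob = deque([(y, x)]), []
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    blob.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(blob) >= min_area:       # area threshold filter
                    regions.append(blob)
    return regions
```

The area threshold discards small speckle responses (for example, isolated bright pixels) so that only blobs large enough to be a lamp survive.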
In one possible implementation, before determining the first region of the first image from the saturation and brightness of the pixels of the first image, the method further includes converting the first image from a first space to the hue saturation value (HSV) space, the first space being any one of the luma-chroma (YUV) space, the RGB space, or the hue saturation lightness (HSL) space. That is, if the first image to be processed is in a first space that is the YUV space, the RGB space, or the HSL space, the first image needs to be converted from the first space to the HSV space in order to obtain the saturation component and the brightness component of its pixels.
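For a single pixel, the RGB-to-HSV conversion this step requires can be illustrated with Python's standard-library `colorsys` module (a per-pixel sketch only; a real pipeline would convert whole frames with a vectorized image library):

```python
import colorsys

def rgb_pixel_to_hsv(r, g, b):
    """Convert one RGB pixel (floats in [0, 1]) to (hue in degrees,
    saturation, value). colorsys returns hue as a fraction of a full
    turn, so we scale it to the 0-360 degree range used in this text."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return h * 360.0, s, v
```

Note how a near-white overexposed pixel such as (1.0, 0.95, 0.9) maps to low saturation and maximal value, which is exactly the combination the first-region test looks for.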
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: obtaining color information of the traffic light; when the traffic light is red, adjusting the hue of the pixels of the at least one connected region to a red range, and respectively increasing the saturation and decreasing the brightness of those pixels; when the traffic light is yellow, adjusting the hue of the pixels of the at least one connected region to a yellow range, and respectively increasing the saturation and decreasing the brightness of those pixels; and when the traffic light is green, adjusting the hue of the pixels of the at least one connected region to a green range, and respectively increasing the saturation and decreasing the brightness of those pixels.
In the embodiment of the present application, the increase in saturation and the decrease in brightness of the pixels of the at least one connected region may each be linear or nonlinear.
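A minimal sketch of the linear variant of this HSV adjustment follows. The target hue, saturation gain, and brightness scale are illustrative parameters chosen for the example, not values taken from the patent:

```python
def restore_region_hsv(pixels, target_hue_deg, s_gain=1.8, v_scale=0.7):
    """Push each (hue, saturation, value) pixel of a connected region toward
    the known lamp color: force the hue to the target range, raise saturation
    linearly (clamped at 1.0), and lower brightness linearly."""
    out = []
    for h, s, v in pixels:
        out.append((target_hue_deg, min(1.0, s * s_gain), v * v_scale))
    return out
```

For a red lamp one would call `restore_region_hsv(region_pixels, 0.0)`; an overexposed pixel that had drifted toward yellow-white (high value, low saturation) is pulled back to a saturated, dimmer red.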
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: converting the at least one connected region to the RGB space; obtaining color information of the traffic light; when the traffic light is red, adjusting the red component of the pixels of the at least one connected region to a first preset range and reducing their blue and green components; when the traffic light is yellow, adjusting the red and green components of the pixels of the at least one connected region to a second preset range and reducing their blue component; and when the traffic light is green, adjusting the green component of the pixels of the at least one connected region to a third preset range and reducing their red and blue components.
The reduction of the blue and green components of the pixels of the at least one connected region may likewise be linear or nonlinear.
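The red-lamp case of this RGB-space variant can be sketched as follows. The preset range floor (`r_target`) and the linear reduction factor (`gb_scale`) are hypothetical values for illustration; the patent leaves the preset ranges unspecified:

```python
def restore_red_rgb(pixels, r_target=230, gb_scale=0.5):
    """For a red lamp: lift the red component of each (r, g, b) pixel into
    an assumed preset range [r_target, 255] and linearly reduce the green
    and blue components, so a washed-out near-white pixel becomes red."""
    return [(max(r, r_target), int(g * gb_scale), int(b * gb_scale))
            for r, g, b in pixels]
```

The yellow and green cases would follow the same pattern, adjusting (R, G) or G into their preset ranges and reducing the remaining components.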
In one possible implementation, before determining the first region of the first image from the saturation and brightness of the pixels of the first image, the method further includes filtering the saturation component and the brightness component of the pixels of the first image. In this way, the contour of the overexposed first target object can be identified more stably and accurately.
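The patent does not say which filter is applied to the S and V components; a 3x3 median filter, commonly used to suppress speckle noise before thresholding, is one plausible choice and is sketched here under that assumption:

```python
def median_filter_channel(channel, radius=1):
    """Apply a (2*radius+1)-square median filter to one component plane
    (e.g. the saturation plane), modelled as a list of rows of floats.
    Border pixels are handled by clamping coordinates to the image edge."""
    h, w = len(channel), len(channel[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = []
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny = min(max(y + dy, 0), h - 1)   # clamp at borders
                    nx = min(max(x + dx, 0), w - 1)
                    win.append(channel[ny][nx])
            win.sort()
            out[y][x] = win[len(win) // 2]
    return out
```

A single noisy outlier in an otherwise uniform plane is removed entirely, which keeps isolated bright pixels from seeding spurious candidate regions in the thresholding step.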
In one possible implementation, acquiring the first image to be processed includes taking a second region selected by the user on the second image as the first image to be processed.
In this way, compared with directly processing the second image, selecting the first image from the second image and processing only the first image can, on the one hand, reduce the amount of computation in subsequent steps and, on the other hand, improve the recognition accuracy of the region where the first target object (for example, the signal light) is located.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: an acquiring unit configured to acquire a first image to be processed, the first image including an overexposed first target object; a determining unit configured to determine a first region of the first image according to the saturation and brightness of the pixels of the first image, where the saturation of the pixels of the first region is lower than the average saturation of the pixels of the first image, the brightness of the pixels of the first region is higher than the average brightness of the pixels of the first image, and the first region corresponds to the area of the first target object; and a processing unit configured to perform binarization processing on the first region to obtain a binary image corresponding to the first region. The determining unit is further configured to determine at least one connected region in the binary image whose area is greater than or equal to a first threshold, the contour of the first target object corresponding to the contour of the at least one connected region, and the processing unit is further configured to restore the color of the at least one connected region.
In one possible implementation, the processing unit is further configured to convert the first image from a first space to an HSV space, the first space being any one of YUV space, RGB space, or HSL space.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to: acquire color information of the traffic light through the acquiring unit; when the traffic light is red, adjust the hue of the pixels of the at least one connected region to a red range, and respectively increase the saturation and decrease the brightness of those pixels; when the traffic light is yellow, adjust the hue of the pixels of the at least one connected region to a yellow range, and respectively increase the saturation and decrease the brightness of those pixels; and when the traffic light is green, adjust the hue of the pixels of the at least one connected region to a green range, and respectively increase the saturation and decrease the brightness of those pixels.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to convert the at least one connected region to an RGB space, obtain color information of the traffic light through the obtaining unit, adjust a red component of a pixel of the at least one connected region to a first preset range and reduce blue and green components of the pixel of the at least one connected region when the traffic light is red, adjust the red and green components of the pixel of the at least one connected region to a second preset range and reduce the blue component of the pixel of the at least one connected region when the traffic light is yellow, and adjust the green component of the pixel of the at least one connected region to a third preset range and reduce the red and blue components of the pixel of the at least one connected region when the traffic light is green.
In a possible implementation, the processing unit is further configured to filter the saturation component and the brightness component of the pixels of the first image.
In a possible implementation, the acquisition unit is configured to take a second area selected by the user on the second image as the first image to be processed.
For the technical effects of the second aspect and its various possible implementations, refer to the technical effects of the first aspect and its corresponding implementations; they are not repeated here.
In a third aspect, an embodiment of the present application provides an apparatus in the form of a chip. The apparatus includes a processor and a memory, the memory is configured to be coupled to the processor and stores the program instructions and data necessary for the apparatus, and the processor is configured to execute the program instructions stored in the memory, so that the apparatus performs the functions of the image processing device in the above method.
In a fourth aspect, an embodiment of the present application provides an image processing apparatus that can implement the functions performed by the image processing device in the above method. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software comprises one or more modules corresponding to the functions.
In one possible design, the image processing apparatus includes a processor and a communication interface in a structure, the processor being configured to support the image processing apparatus to perform the corresponding functions in the above-described method. The communication interface is used to support communication between the image processing device and other devices, such as a traffic light detector on a traffic light. The image processing device may further comprise a memory for coupling with the processor, which holds the program instructions and data necessary for the image processing device.
In a fifth aspect, an embodiment of the application provides a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform any of the methods provided in the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the methods provided in the first aspect.
Drawings
Fig. 1 is a schematic diagram of a system architecture for a method for color restoration of an image according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for color restoration of an image according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a user selecting a first image to be processed according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a first area according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a binary image corresponding to a first area according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a connected area according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a first area, a binary image corresponding to the first area, and a connected area determined according to the binary image according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of still another image processing apparatus according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a method and a device for performing color restoration on an image, applied to scenarios in which color restoration is performed on an overexposed image. It will be appreciated that embodiments of the present application may also be applied to scenarios in which a video comprising one or more overexposed images is color-restored. In particular, the method can be applied to color restoration of an overexposed object (or overexposed area) in an image, for example, color restoration of overexposed traffic lights in images or videos captured by an electronic police camera or monitor.
Taking a scene in which an electronic police camera captures a violation image as an example, fig. 1 shows a schematic diagram of a system architecture suitable for the method for performing color restoration on an image (the violation image captured by the electronic police camera) provided by the embodiment of the application. The system comprises an electronic police camera and an image processing device connected to it (the image processing device can also be integrated in the electronic police camera), where the image processing device can perform color restoration processing on the image or video captured by the electronic police camera. The image processing device may also be connected to a traffic light in order to obtain the (historical) color changes of the light from a light detector on the traffic light.
The image processing apparatus in fig. 1 according to the embodiment of the present application may be implemented by one device or may be a functional module in one device, which is not particularly limited in the embodiment of the present application. It will be appreciated that the above-described functionality may be a network element in a hardware device, a software function running on dedicated hardware, a virtualized function instantiated on a platform (for example, a cloud platform), or a system on a chip. In the embodiment of the application, the chip system may consist of a chip, or may comprise the chip and other discrete components.
For example, the apparatus for implementing the functions of the image processing apparatus provided by the embodiment of the present application may be implemented by the apparatus 200 in fig. 2. Fig. 2 is a schematic diagram of a hardware structure of an apparatus 200 according to an embodiment of the present application. The apparatus 200 includes at least one processor 201 configured to implement the functions of the image processing device provided in the embodiment of the present application. The apparatus 200 may further include a bus 202 and at least one communication interface 204. Memory 203 may also be included in apparatus 200.
In an embodiment of the application, the processor may be a central processing unit (CPU), a general purpose processor, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor may also be any other means for performing a processing function, such as a circuit, device, or software module.
Bus 202 may be used to transfer information between the components described above.
A communication interface 204 for communicating with other devices or communication networks, such as ethernet, radio access network (radio access network, RAN), wireless local area network (wireless local area networks, WLAN), etc. The communication interface 204 may be an interface, circuit, transceiver, or other device capable of communication, and the application is not limited. The communication interface 204 may be coupled to the processor 201. The coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be in electrical, mechanical, or other forms for information interaction between the devices, units, or modules.
In embodiments of the application, the memory may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor, for example via bus 202. The memory may also be integrated with the processor.
The memory 203 is configured to store program instructions and may be controlled by the processor 201 to perform a method for color restoration of an image according to the embodiment of the present application. The processor 201 is configured to invoke and execute instructions stored in the memory 203, thereby implementing a method for color restoration of an image according to the embodiments of the present application described below.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not particularly limited in the embodiments of the present application.
Optionally, a memory 203 may be included in the processor 201.
In a particular implementation, as one embodiment, processor 201 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 2.
In a particular implementation, the apparatus 200 may include a plurality of processors, such as the processor 201 and the processor 207 in fig. 2, as one embodiment. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, as an embodiment, the apparatus 200 may further comprise an output device 205 and an input device 206. The output device 205 is coupled to the processor 201 and can display information in a variety of ways. For example, the output device 205 may be a liquid crystal display (LCD), a light emitting diode (LED) display device, a cathode ray tube (CRT) display device, or a projector. The input device 206 is coupled to the processor 201 and may receive user input in a variety of ways. For example, the input device 206 may be a camera, mouse, keyboard, touch screen device, or sensing device.
The apparatus 200 may be a general purpose device or a special purpose device. In a specific implementation, the apparatus 200 may be a video camera, a monitor, a video display device, a desktop computer, a portable computer, a web server, a personal digital assistant (PDA), a mobile phone, a tablet computer, a wireless terminal device, an embedded device, or a device having a structure similar to that in fig. 2. Embodiments of the present application do not limit the type of the apparatus 200.
For clarity and conciseness in the description of the embodiments below, a brief introduction to related concepts or technologies is first given:
Color space: a color is generally described by three independent attributes, which act in combination to form spatial coordinates, i.e., a color space. Color spaces include the RGB space, the YUV space, the HSV space, the HSL space, etc., and different color spaces measure the color of the same object from different angles. Different processing procedures emphasize different aspects of color, so the various color spaces can be converted into one another to meet different processing requirements. Wherein:
RGB space: R represents the red component (red channel), G the green component (green channel), and B the blue component (blue channel). Colors are obtained by varying the three color channels and superimposing them on each other. In each component, a smaller value means lower luminance and a larger value means higher luminance. When components are mixed, the mixed brightness is equal to the sum of the brightnesses of the components.
YUV space: Y represents brightness, i.e., the gray-scale value. U and V represent chromaticity, which describes the color and saturation of the image and specifies the color of a pixel. If there is only the Y component and no U or V component, the image is a black-and-white grayscale image. The YUV space is mainly used to optimize the transmission of color video signals, so that they remain backward compatible with older black-and-white televisions.
HSV space: HSV represents the points of the RGB color space in an inverted cone. H represents the color information, i.e., the position on the spectrum, measured as an angle in the range 0 to 360 degrees: red is 0°, green is 120°, and blue is 240°. S is the ratio between the saturation of the selected color and the maximum saturation of that color, ranging from 0 to 1; when S = 0, only gray scale is present. V represents the brightness of the color, ranging from 0 to 1.
HSL space: HSL is similar to HSV, and the first two parameters of this model are the same as in HSV. L represents the lightness of a color and can be used to control its shade. L ranges from 0% to 100%: the smaller the value, the darker the color (closer to black); the larger the value, the brighter the color (closer to white).
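The hue-angle conventions above can be checked with Python's standard-library `colorsys` module (an illustration only; the patent itself does not reference any particular library, and the helper name is ours):

```python
import colorsys

def rgb_to_hsv_degrees(r, g, b):
    """Convert RGB components in [0, 1] to (hue in degrees, saturation, value)."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)  # colorsys returns hue in [0, 1)
    return h * 360.0, s, v

print(rgb_to_hsv_degrees(1.0, 0.0, 0.0))  # pure red -> (0.0, 1.0, 1.0)
print(rgb_to_hsv_degrees(0.0, 1.0, 0.0))  # pure green, hue 120 degrees
print(rgb_to_hsv_degrees(0.0, 0.0, 1.0))  # pure blue, hue 240 degrees
```

As the description states, a fully saturated primary has S = 1 and V = 1, and only the hue angle distinguishes the three.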
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the application, unless otherwise indicated, "at least one" means one or more, and "a plurality" means two or more. In addition, to describe the technical solutions of the embodiments clearly, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and effect. It will be appreciated by those skilled in the art that these words do not limit the number or the order of execution, and that items described as "first" and "second" are not necessarily different.
For ease of understanding, the method for color restoration of an image according to the embodiments of the present application is described in detail below with reference to the accompanying drawings.
As shown in fig. 3, an embodiment of the present application provides a method for performing color restoration on an image, including:
301. A first image to be processed is acquired, the first image comprising an overexposed first target object.
Where the first target object refers to one or more objects (or areas where objects are located) that are overexposed in the first image. The objects may be, for example, traffic lights (of various shapes and colors), vehicles, traffic signs, etc.
In one possible design, the second area framed by the user on a second image may be used as the first image to be processed. For example, as shown in fig. 4, when the user considers that the traffic light (the first target object) in a certain image (the second image) is overexposed, the user may perform a first operation on the second image through an input device (for example, a mouse or a touch screen) of the image processing apparatus, where the first operation selects the area (the second area) where the overexposed traffic light is located. After the image processing apparatus recognizes the first operation, it can crop the second area from the second image according to the coordinate information of the second area framed by the user, obtaining a third image and the first image to be processed. The first image to be processed includes the second area, and the third image includes the regions other than the second area. That is, the first image to be processed consists of the second area selected by the user on the second image.
In this way, compared with processing the second image directly, selecting the first image from the second image and processing only the first image reduces the computational load of the subsequent steps on the one hand, and improves the recognition accuracy of the area where the first target object (for example, a traffic light) is located on the other.
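The cropping in step 301 can be sketched as follows (the function name and rectangle coordinates are ours, not the patent's; in practice the coordinates come from the user's first operation):

```python
import numpy as np

def crop_second_region(second_image, top, left, height, width):
    """Cut the user-framed rectangle out of the second image.

    Returns the first image (the crop) and the third image (the rest,
    with the cropped area zeroed as a placeholder).
    """
    first_image = second_image[top:top + height, left:left + width].copy()
    third_image = second_image.copy()
    third_image[top:top + height, left:left + width] = 0
    return first_image, third_image

second = np.arange(16, dtype=np.uint8).reshape(4, 4)  # stand-in for the second image
first, third = crop_second_region(second, top=1, left=1, height=2, width=2)
print(first.shape)  # (2, 2)
```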
302. Optionally, the saturation component and the brightness component of the pixels of the first image are filtered.
It should be noted that, if the second image is an HSV image, that is, the second image is in HSV space, the first image to be processed is also in HSV space, and the saturation component and the brightness component of the pixels of the first image may be filtered directly, so that the outline of the overexposed first target object (for example, an overexposed traffic light) can be identified more stably and accurately in the subsequent steps.
If the second image is in a first space, where the first space may be the YUV space, the RGB space, or the HSL space, the first image to be processed is also in the first space. In this case, to obtain the saturation component and the brightness component of the pixels of the first image, the first image needs to be converted from the first space to the HSV space first. The saturation component and the brightness component of the pixels of the first image are then filtered, so that the outline of the overexposed first target object can be identified more stably and accurately in the subsequent steps.
303. A first region of the first image is determined based on the saturation and brightness of pixels of the first image.
In the embodiment of the present application, the first region corresponds to the region where the first target object is located. The region in the first image where the overexposed first target object (e.g., an overexposed traffic light) is located may be determined from the saturation and brightness of the pixels of the first image.
In one possible design, the filtered saturation component and brightness component of the pixels of the first image may be analyzed, and the region whose saturation is lower than the average saturation of the pixels of the first image and whose brightness is higher than the average brightness of the pixels of the first image is determined to be the region where the first target object is located. That is, the saturation of the pixels of the first region is lower than the average saturation of the pixels of the first image, and the brightness of the pixels of the first region is higher than the average brightness of the pixels of the first image. Fig. 5 is a schematic view of the first region.
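The selection rule of step 303 amounts to a boolean mask over per-pixel saturation and brightness arrays. A minimal sketch with made-up values (`s`, `v`, and the function name are ours):

```python
import numpy as np

def candidate_region(s, v):
    """Mask of pixels with saturation below the image mean and brightness above it."""
    return (s < s.mean()) & (v > v.mean())

s = np.array([[0.9, 0.1],
              [0.8, 0.2]])  # saturation per pixel
v = np.array([[0.2, 0.9],
              [0.1, 0.8]])  # brightness (V) per pixel
mask = candidate_region(s, v)
print(mask)  # True where a pixel is a candidate for the overexposed first region
```

Overexposed pixels are washed out (low saturation) and bright, so both conditions together single them out.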
304. Binarization processing is performed on the first region to obtain a binary image corresponding to the first region.
For example, as shown in fig. 6, after the binarization processing is performed on the first region shown in fig. 5, a binary image corresponding to the first region is obtained. The binary image may include three connected regions (a, b, and c). Connected region a is the region where the signal lamp is located, and connected regions b and c may be regions where the halo produced by the signal lamp is located.
305. At least one connected region in the binary image having an area greater than or equal to a first threshold is determined.
In one possible design, the contour of at least one connected region in the binary image whose area is greater than or equal to the first threshold corresponds to the contour of the first target object, and may be taken as the contour of the first target object. It should be noted that the contour of the at least one connected region may be larger or smaller than the contour of the actual (real) first target object. For example, when the first target object is a signal lamp, the lit lamp may produce a halo, so the at least one connected region may include not only the region of the lamp itself but also the region of the halo, making the contour larger than that of the actual lamp. Conversely, if part of the lamp is damaged and cannot emit light, the at least one connected region may include only the region where the lamp emits light, and the contour is then slightly smaller than that of the actual lamp.
The first threshold may be determined according to the area of the largest connected region in the binary image. For example, the first threshold may be equal to the area of the largest connected region, or may be N% (e.g., 30%) of that area, N being a positive number.
The binary image corresponding to the first region may contain a plurality of connected regions of different sizes. For example, as shown in fig. 6, the binary image may include three connected regions (a, b, and c). All connected regions in the binary image are found, and the found regions are sorted by area (for example, from largest to smallest). Among these regions, the smaller ones may be filtered out and the larger ones retained, and at least one of the larger connected regions may be taken as the region of the first target object (e.g., the signal lamp): for example, the single largest connected region may be taken as the region where the signal lamp is located, or several of the larger connected regions may be taken together as that region.
For example, the connected region with the largest area in the binary image may be found first; assuming its area is max, the first threshold may be set to max/(x+2) so as to filter out the connected regions whose area is smaller than max/(x+2). As shown in fig. 7, connected regions b and c are filtered out and connected region a is retained. The contour of connected region a corresponds to the contour of the first target object.
The parameter x may range from 0 to 9 and is used to adjust the magnitude of the first threshold. The larger the value of x, the fewer small connected regions are filtered out, i.e., the more small connected regions are retained, so that the contour of the first target object corresponds to the contours of more connected regions. For example, if the first target object is a signal lamp, the region of the signal lamp spreads outwards (i.e., it may include not only the lamp but also the halo it produces). The smaller the value of x, the more small connected regions are filtered out, i.e., the fewer small connected regions are retained, so that the contour of the first target object corresponds to the contours of fewer connected regions. For example, if the first target object is a signal lamp, the region of the signal lamp shrinks inwards (i.e., it includes only the lamp and not the halo).
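Steps 304-305, finding the connected regions and keeping only those whose area reaches max/(x+2), can be sketched in pure Python as follows (4-connectivity and the helper names are our assumptions; production code would typically use a library routine such as OpenCV's connected-components analysis):

```python
from collections import deque

def connected_regions(img):
    """Return the 4-connected regions of 1-pixels in a binary image (list of lists)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if img[y][x] and not seen[y][x]:
                seen[y][x] = True
                queue, region = deque([(y, x)]), []
                while queue:  # breadth-first flood fill
                    cy, cx = queue.popleft()
                    region.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                regions.append(region)
    return regions

def keep_large_regions(img, x=4):
    """Filter out regions smaller than max_area / (x + 2), as in the description."""
    regions = connected_regions(img)
    max_area = max(len(r) for r in regions)
    threshold = max_area / (x + 2)
    return [r for r in regions if len(r) >= threshold]

binary = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [0, 0, 0, 1, 0],
]
kept = keep_large_regions(binary, x=0)
print([len(r) for r in kept])  # only the 4-pixel region survives the max/2 threshold
```

With x = 0 the threshold is max/2, so the two single-pixel "halo" regions are discarded; a larger x would retain them, matching the outward-spreading behavior described above.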
For another example, if the first image to be processed includes an arrow-shaped indicator light, the image processing apparatus determines the first area of the first image as shown in (a) of fig. 8 and binarizes it. As shown in (b) of fig. 8, the binary image corresponding to the first area may include four connected regions (d, e, f, and g), and as shown in (c) of fig. 8, the connected regions whose area is greater than or equal to the first threshold may be d and e. That is, connected regions f and g are filtered out, and d and e are retained. The contours of connected regions d and e may be taken as the contour of the first target object.
306. The color of the at least one connected region is restored.
In the embodiment of the present application, restoring the color of the at least one connected region may also be regarded as correcting it: restoring the color of the at least one connected region restores, or corrects, the color of the overexposed first target object. For example, as shown in fig. 7, restoring the color of connected region a restores the color of the overexposed first target object.
For example, when the first target object is a traffic signal lamp, the color information of the lamp may be obtained from a signal detector externally connected to the lamp, or by recognizing the status of the light from the image.
In one possible design, the color of the at least one connected region may be restored in HSV space. Specifically, when the traffic signal lamp is red, the hue of the pixels of the at least one connected region is adjusted to the red range, and the saturation and brightness of those pixels are increased and decreased, respectively; when the lamp is yellow, the hue is adjusted to the yellow range, and the saturation and brightness are increased and decreased, respectively; when the lamp is green, the hue is adjusted to the green range, and the saturation and brightness are increased and decreased, respectively.
In the embodiment of the present application, the increase in saturation and the decrease in brightness of the pixels of the at least one connected region may each be linear or nonlinear. A linear increase in saturation raises the saturation of every pixel in the at least one connected region by the same amount (value), whereas a nonlinear increase raises the saturation of different pixels by different amounts. Likewise, a linear decrease in brightness lowers the brightness of every pixel by the same amount, whereas a nonlinear decrease lowers the brightness of different pixels by different amounts. After the saturation is increased and the brightness decreased, there is a strong correlation between the saturation and the brightness of the at least one connected region; before the adjustment, no such strong correlation exists.
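A minimal sketch of the HSV restoration for a red light, assuming a linear saturation increase and a linear brightness decrease (the gain and attenuation factors are illustrative, not values from the patent):

```python
def restore_red_hsv(pixels, s_gain=1.5, v_atten=0.7):
    """pixels: list of (h, s, v) tuples, h in degrees, s and v in [0, 1]."""
    restored = []
    for h, s, v in pixels:
        h = 0.0                    # push hue into the red range
        s = min(1.0, s * s_gain)   # linear saturation increase, clamped at 1
        v = max(0.0, v * v_atten)  # linear brightness decrease
        restored.append((h, s, v))
    return restored

# an overexposed pixel reads washed out (low saturation) and very bright
print(restore_red_hsv([(55.0, 0.2, 0.95)]))
```

A nonlinear variant would simply make `s_gain` and `v_atten` functions of the pixel's own values rather than constants.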
In another possible design, the color of the at least one connected region may be restored in RGB space. Specifically, when the traffic signal lamp is determined to be red, the red component of the pixels of the at least one connected region may be adjusted to a first preset range, and the blue and green components of those pixels reduced.
For example, assume color restoration is performed on an 8-bit RGB image. If the red component (R value) of a pixel is Rx, the R value of that pixel may be adjusted to R = (200 + 55·x/9 + Rx)/2 (the first preset range). The parameter x may range from 0 to 9 and is used to adjust the magnitude of the R value.
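The formula above, expressed directly in code (the function name is ours):

```python
def adjust_red(rx, x):
    """R = (200 + 55*x/9 + Rx) / 2, with x in [0, 9] and Rx the original 8-bit red value."""
    return (200 + 55 * x / 9 + rx) / 2

print(adjust_red(100, 9))  # (200 + 55 + 100) / 2 = 177.5
print(adjust_red(100, 0))  # (200 + 0 + 100) / 2 = 150.0
```

Averaging the original value with a target of 200 to 255 pulls the red channel toward saturation without discarding the pixel's own information; x tunes how strong the pull is.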
The reduction of the blue and green components of the pixels of the at least one connected region may be linear or nonlinear. A linear reduction lowers the blue and green components of every pixel in the at least one connected region by the same amount (value), whereas a nonlinear reduction lowers the blue and green components of different pixels by different amounts.
Similarly, when the traffic signal lamp is determined to be yellow, the red and green components of the pixels of the at least one connected region are adjusted to a second preset range and the blue component is reduced; when the lamp is determined to be green, the green component is adjusted to a third preset range and the red and blue components are reduced. For the specific process, refer to the processing described above for a red traffic signal lamp; details are not repeated here.
307. The first image and the third image are combined.
If the first image was converted from the first space to the HSV space in step 302, the color-restored first image may be converted back from HSV space to the first space and then combined with the third image (i.e., the portion of the second image that remained after the first image was cropped out) to obtain a complete image, which may be regarded as the color-restored second image. If no conversion was performed in step 302, the color-restored first image and the third image may be combined directly to obtain the complete image.
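Step 307 then amounts to pasting the restored crop back at its original position (a minimal sketch; the coordinate handling and names are ours):

```python
import numpy as np

def merge_images(third_image, first_image, top, left):
    """Paste the restored first image back into the third image at the crop position."""
    out = third_image.copy()
    h, w = first_image.shape[:2]
    out[top:top + h, left:left + w] = first_image
    return out

third = np.zeros((4, 4), dtype=np.uint8)          # rest of the second image
first = np.full((2, 2), 255, dtype=np.uint8)      # restored crop
print(merge_images(third, first, 1, 1))
```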
In one possible design, when color restoration is performed on an overexposed first target object in a video, the user may select the overexposed object before the video starts or while it is paused; during subsequent playback, the method of the embodiment of the present application is applied to each frame of the video, thereby restoring the color of the overexposed first target object throughout the video.
According to the method provided by the embodiment of the present application, after the first image to be processed is obtained, a first area of the first image (which may serve as the area of the overexposed first target object) is determined according to the saturation and brightness of the pixels of the first image; binarization is then performed on the first area to obtain a corresponding binary image; at least one connected region of the binary image whose area is greater than or equal to a first threshold is determined (the contour of this at least one connected region may serve as the contour of the first target object); and finally the color of the at least one connected region, i.e., the color of the overexposed first target object, is restored. The method can therefore accurately restore the color of an overexposed object (such as a traffic signal lamp) in an image, and can further address a series of problems caused by overexposure (e.g., the difficulty of collecting evidence caused by overexposed signal lights in current electric-police monitoring scenarios).
In the prior art, a camera with an ultra-wide dynamic range can be used to eliminate the color distortion of a signal lamp caused by strong light, but such cameras are expensive and their color restoration of overexposed signal lamps under low illumination is limited. The method provided by the embodiment of the present application requires no additional equipment, is inexpensive, adapts well to the environment, and achieves good color restoration of overexposed signal lamps under low illumination. The prior art can also restore the color of the signal lamp in the RGB space of the image, or perform region recognition and color enhancement on the lamp based on deep learning, but this may cause a gradient discontinuity between the color-restored signal-lamp region and the surrounding image. The method provided by the embodiment of the present application is based on the HSV space, which better reflects the brightness of the physical signal lamp, so the signal-lamp region is identified more stably and accurately, the color-restored region blends more continuously with the surrounding image, and the visual effect is more lifelike. In addition, the method is efficient and reliable, and can meet the color-restoration requirements for signal lamps in video.
The above description has been made mainly in terms of the image processing apparatus for the solution provided by the embodiment of the present application. It will be appreciated that the image processing apparatus, in order to achieve the above-described functions, comprises corresponding hardware structures and/or software modules that perform the respective functions. Those skilled in the art will readily appreciate that the algorithm steps described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and software. Whether a function is implemented as hardware or software-driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may divide the functional modules of the image processing apparatus according to the above-described method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of dividing the functional modules according to the respective functions, fig. 9 shows a possible structural schematic diagram of the image processing apparatus involved in the above embodiment, which includes an acquisition unit 901, a determination unit 902, and a processing unit 903. The acquisition unit 901 supports the image processing apparatus in executing process 301 in fig. 3; the determination unit 902 supports processes 303 and 305; and the processing unit 903 supports processes 302, 304, and 306. For all relevant details of the steps of the above method embodiment, reference may be made to the functional descriptions of the corresponding functional modules; they are not repeated here.
The steps of a method or algorithm described in connection with the present disclosure may be embodied in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a read-only optical disk, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). In addition, the ASIC may be located in a core network interface device. The processor and the storage medium may also reside as discrete components in a core network interface device.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on an image processing device readable medium. The image processing device readable medium includes an image processing device storage medium and a communication medium, wherein the communication medium includes any medium that facilitates transfer of an image processing device program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose image processing device.
The foregoing embodiments merely illustrate the technical solutions of the present application in further detail and are not intended to limit its scope; any modifications, equivalent replacements, improvements, etc. made on the basis of the technical solutions of the present application shall fall within its protection scope.

Claims (13)

1. A method of color restoration of an image, comprising:
Acquiring a first image to be processed, wherein the first image comprises a first target object which is overexposed;
Determining a first region of the first image according to the saturation and brightness of the pixels of the first image, wherein the saturation of the pixels of the first region is lower than the average value of the saturation of the pixels of the first image, and the brightness of the pixels of the first region is higher than the average value of the brightness of the pixels of the first image;
performing binarization processing on the first region to obtain a binary image corresponding to the first region;
determining at least one connected region in the binary image having an area greater than or equal to a first threshold, wherein the contour of the at least one connected region corresponds to the contour of the first target object;
and restoring the color of the at least one connected region.
2. The method of color restoration of an image according to claim 1, wherein before the determining the first region of the first image from the saturation and the brightness of the first image, the method further comprises:
the first image is converted from a first space to a hue-saturation-value (HSV) space, the first space being any one of a luminance-chrominance (YUV) space, a red-green-blue (RGB) space, or a hue-saturation-lightness (HSL) space.
3. The method of color restoration of an image according to claim 1 or 2, wherein when the first target object is a traffic signal lamp, the restoring the color of the at least one connected region comprises:
acquiring color information of the traffic signal lamp;
when the traffic signal lamp is red, adjusting the hue of the pixels of the at least one connected region to a red range, and respectively increasing and decreasing the saturation and brightness of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjusting the hue of the pixels of the at least one connected region to a yellow range, and respectively increasing and decreasing the saturation and brightness of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjusting the hue of the pixels of the at least one connected region to a green range, and respectively increasing and decreasing the saturation and brightness of the pixels of the at least one connected region.
4. The method of color restoration of an image according to claim 1 or 2, wherein when the first target object is a traffic signal lamp, the restoring the color of the at least one connected region comprises:
converting the at least one connected region to an RGB space;
acquiring color information of the traffic signal lamp;
when the traffic signal lamp is red, adjusting the red component of the pixels of the at least one connected region to a first preset range, and reducing the blue and green components of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjusting the red and green components of the pixels of the at least one connected region to a second preset range, and reducing the blue component of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjusting the green component of the pixels of the at least one connected region to a third preset range, and reducing the red and blue components of the pixels of the at least one connected region.
5. The method of color restoration of an image according to claim 1, wherein before the determining the first region of the first image from the saturation and the brightness of the first image, the method further comprises:
A saturation component and a brightness component of pixels of the first image are filtered.
6. The method of color restoration of an image according to claim 1, wherein the acquiring a first image to be processed comprises:
and taking the second area selected by the user on the second image as the first image to be processed.
7. An image processing apparatus, characterized by comprising:
an acquisition unit, configured to acquire a first image to be processed, wherein the first image comprises an overexposed first target object;
a determining unit configured to determine a first region of the first image according to a saturation and a brightness of a pixel of the first image, where the saturation of the pixel of the first region is lower than an average value of the saturation of the pixel of the first image, and the brightness of the pixel of the first region is higher than the average value of the brightness of the pixel of the first image;
The processing unit is used for carrying out binarization processing on the first area to obtain a binary image corresponding to the first area;
the determining unit is further configured to determine at least one connected region in the binary image having an area greater than or equal to a first threshold, wherein the contour of the at least one connected region corresponds to the contour of the first target object;
the processing unit is further configured to restore the color of the at least one connected region.
8. The image processing apparatus of claim 7, wherein the processing unit is further configured to:
the first image is converted from a first space to a hue-saturation-value (HSV) space, the first space being any one of a luminance-chrominance (YUV) space, a red-green-blue (RGB) space, or a hue-saturation-lightness (HSL) space.
9. The image processing apparatus according to claim 7 or 8, wherein when the first target object is a traffic signal lamp, the processing unit is configured to:
acquire color information of the traffic signal lamp through the acquisition unit;
when the traffic signal lamp is red, adjust the hue of the pixels of the at least one connected region to a red range, and respectively increase and decrease the saturation and brightness of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjust the hue of the pixels of the at least one connected region to a yellow range, and respectively increase and decrease the saturation and brightness of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjust the hue of the pixels of the at least one connected region to a green range, and respectively increase and decrease the saturation and brightness of the pixels of the at least one connected region.
10. The image processing apparatus according to claim 7 or 8, wherein, when the first target object is a traffic light, the processing unit is configured to:
convert the at least one connected region to an RGB space;
acquire color information of the traffic light via the acquisition unit;
when the traffic light is red, adjust the red component of the pixels of the at least one connected region into a first preset range, and reduce the blue and green components of those pixels;
when the traffic light is yellow, adjust the red and green components of the pixels of the at least one connected region into a second preset range, and reduce the blue component of those pixels;
when the traffic light is green, adjust the green component of the pixels of the at least one connected region into a third preset range, and reduce the red and blue components of those pixels.
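Claim 10's RGB-space variant for a red light can be sketched per pixel as below. The preset range `(200, 255)` and the attenuation factor `0.5` are assumptions, since the claim leaves the ranges open.

```python
def restore_red_rgb(pixel, red_range=(200, 255), cut=0.5):
    """Pull the red channel into a preset range and attenuate blue and
    green, as in claim 10 when the traffic light is red."""
    r, g, b = pixel
    r = min(max(r, red_range[0]), red_range[1])  # clamp R into the preset range
    g = int(g * cut)                             # reduce the green component
    b = int(b * cut)                             # reduce the blue component
    return (r, g, b)

# a washed-out red light: all three channels high and close together
print(restore_red_rgb((180, 170, 160)))  # (200, 85, 80)
```

The yellow branch would clamp both R and G into a preset range while cutting B; the green branch clamps G while cutting R and B.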
11. The image processing apparatus of claim 7, wherein the processing unit is further configured to:
filter a saturation component and a brightness component of the pixels of the first image.
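Claim 11's filtering of the saturation and brightness planes is most naturally read as a denoising step ahead of region extraction. A 3×3 median filter is one plausible choice; the claim does not name the filter, so this is an assumption.

```python
import numpy as np

def median3(chan):
    """3x3 median filter over one channel (e.g. the S or V plane of an
    HSV image), with replicate padding at the borders."""
    p = np.pad(chan, 1, mode='edge')
    h, w = chan.shape
    # stack the nine shifted views and take the per-pixel median
    stack = np.stack([p[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)])
    return np.median(stack, axis=0)

s_plane = np.full((5, 5), 0.5)
s_plane[2, 2] = 1.0                   # a single saturation speckle
print(float(median3(s_plane)[2, 2]))  # 0.5: the outlier is smoothed away
```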
12. The image processing apparatus of claim 7, wherein the acquisition unit is configured to:
use a second region selected by a user on a second image as the first image to be processed.
13. A computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method for color restoration of an image according to any one of claims 1 to 6.
CN201811517272.3A 2018-12-12 2018-12-12 A method and device for color restoration of an image Active CN111311500B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811517272.3A CN111311500B (en) 2018-12-12 2018-12-12 A method and device for color restoration of an image
PCT/CN2019/121126 WO2020119454A1 (en) 2018-12-12 2019-11-27 Method and apparatus for color reproduction of image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811517272.3A CN111311500B (en) 2018-12-12 2018-12-12 A method and device for color restoration of an image

Publications (2)

Publication Number Publication Date
CN111311500A (en) 2020-06-19
CN111311500B (en) 2024-12-03

Family

ID=71076231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811517272.3A Active CN111311500B (en) 2018-12-12 2018-12-12 A method and device for color restoration of an image

Country Status (2)

Country Link
CN (1) CN111311500B (en)
WO (1) WO2020119454A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200747B (en) * 2020-10-16 2022-06-21 展讯通信(上海)有限公司 Image processing method and device and computer readable storage medium
CN113436126B (en) * 2021-07-13 2022-06-10 上海艾为电子技术股份有限公司 Image saturation enhancement method and system and electronic equipment
CN114783192B (en) * 2022-03-24 2024-11-15 杭州海康威视数字技术股份有限公司 A method and device for processing the color of a signal light
CN115482399A (en) * 2022-08-29 2022-12-16 东莞市力博得电子科技有限公司 Tooth color identification method and device based on image processing and processing equipment
CN115760684A (en) * 2022-09-22 2023-03-07 超聚变数字技术有限公司 Method for adjusting test threshold interval of mainboard LED lamp, test method and device
CN116883542B (en) * 2022-11-22 2024-11-19 广州开得联软件技术有限公司 Image processing method, device, electronic device and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104301621A (en) * 2014-09-28 2015-01-21 北京凌云光技术有限责任公司 Image processing method, device and terminal
CN105430352A (en) * 2015-12-23 2016-03-23 浙江宇视科技有限公司 A method for processing video surveillance images

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP2003234912A (en) * 2002-02-06 2003-08-22 Sharp Corp Image processing method, image processing apparatus, and image forming apparatus
CN102202163B (en) * 2011-05-13 2013-01-23 成都西图科技有限公司 Adaptive enhancement method and device for monitored video
US8867830B2 (en) * 2011-12-06 2014-10-21 Michael Donvig Joensson Image processing method for recovering details in overexposed digital video footage or digital still images
JP2016001782A (en) * 2014-06-11 2016-01-07 キヤノン株式会社 Image processing apparatus, imaging apparatus, image processing method, program, and storage medium
CN105979121A (en) * 2015-11-18 2016-09-28 乐视致新电子科技(天津)有限公司 Image processing method and device
CN106507079B (en) * 2016-11-03 2019-08-27 浙江宇视科技有限公司 A color restoration method and device
CN106504217B (en) * 2016-11-29 2019-03-15 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, imaging apparatus, and electronic apparatus


Also Published As

Publication number Publication date
CN111311500A (en) 2020-06-19
WO2020119454A1 (en) 2020-06-18

Similar Documents

Publication Publication Date Title
CN111311500B (en) A method and device for color restoration of an image
CN110572637B (en) Image anomaly detection method, terminal device and storage medium
US9813635B2 (en) Method and apparatus for auto exposure value detection for high dynamic range imaging
CN109274985B (en) Video transcoding method and device, computer equipment and storage medium
CN104883504B (en) Open the method and device of high dynamic range HDR functions on intelligent terminal
KR102346522B1 (en) Image processing device and auto white balancing metohd thereof
US20070047803A1 (en) Image processing device with automatic white balance
CN103646392B (en) Backlighting detecting and equipment
WO2018149253A1 (en) Image processing method and device
CN112218065B (en) Image white balance method, system, terminal device and storage medium
CN115496668A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112907497B (en) Image fusion method and image fusion device
CN116466899A (en) Image processing method and electronic equipment
CN113727085A (en) White balance processing method and electronic equipment
US9832395B2 (en) Information processing method applied to an electronic device and electronic device having at least two image capturing units that have the same image capturing direction
TWI736599B (en) Method for detection of saturated pixels in an image
CN110807735A (en) Image processing method, image processing device, terminal equipment and computer readable storage medium
CN114945087A Image processing method, apparatus, device and storage medium based on facial features
CN112488933B (en) Video detail enhancement method and device, mobile terminal and storage medium
CN112215237B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN115914850A (en) Method, electronic device and storage medium for enhancing wide dynamic image transparency
CN107392860A (en) Image enchancing method and equipment, AR equipment
CN118071658A (en) Image processing method, apparatus, electronic device, and computer-readable storage medium
CN115660997A (en) Image data processing method and device and electronic equipment
US11995743B2 (en) Skin tone protection using a dual-core geometric skin tone model built in device-independent space

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant