Disclosure of Invention
The embodiments of the application provide an image processing method that can accurately restore the colors of overexposed objects (such as traffic lights) in an image. Further, a series of problems caused by overexposure of objects in an image (e.g., the difficulty of collecting evidence caused by overexposure of signal lights in current electronic police monitoring scenarios) may be addressed.
In a first aspect, an embodiment of the present application provides a method for performing color restoration on an image. The method includes: obtaining a first image to be processed, the first image including a first target object that is overexposed; determining a first area of the first image according to the saturation and brightness of the pixels of the first image, where the saturation of the pixels of the first area is lower than the average saturation of the pixels of the first image, the brightness of the pixels of the first area is higher than the average brightness of the pixels of the first image, and the first area corresponds to (the location of) the first target object; performing binarization processing on the first area to obtain a binary image corresponding to the first area; determining at least one connected area in the binary image with an area greater than or equal to a first threshold, where the contour of the at least one connected area corresponds to the contour of the first target object; and restoring the color of the at least one connected area.
According to the method provided by the embodiment of the application, after the first image to be processed is obtained, a first area of the first image (which can serve as the area where the overexposed first target object is located) can be determined according to the saturation and brightness of the pixels of the first image; binarization processing can then be carried out on the first area to obtain a binary image corresponding to the first area; at least one connected area in the binary image with an area greater than or equal to a first threshold is determined (the contour of the at least one connected area corresponds to the contour of the first target object); and the color of the at least one connected area (that is, the color of the overexposed first target object) is then restored. Therefore, the method provided by the embodiment of the application can accurately restore the color of an overexposed object (namely the first target object, such as a traffic signal light). Further, a series of problems caused by overexposure of objects in the image (e.g., the difficulty of collecting evidence caused by overexposure of signal lights in current electronic police monitoring scenarios) may be addressed.
In one possible implementation, before determining the first region of the first image from the saturation and brightness of the first image, the method further includes converting the first image from a first space to a hue-saturation-value (HSV) space, the first space being any one of a luma-chroma (YUV) space, an RGB space, or a hue-saturation-lightness (HSL) space. That is, if the first image to be processed is in the first space, the first space may be the YUV space, the RGB space, or the HSL space; in this case, in order to obtain the saturation component and the brightness component of the pixels of the first image, the first image needs to be converted from the first space to the HSV space.
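The conversion above can be sketched with the standard-library `colorsys` module, which converts a single RGB pixel (values in 0..1) to HSV; a real implementation would convert every pixel of the first image, for example with OpenCV's `cvtColor`. The pixel value below is an illustrative assumption, chosen to resemble an overexposed (nearly white) pixel.

```python
# Minimal sketch of the first-space-to-HSV conversion for one pixel,
# using only the Python standard library.
import colorsys

r, g, b = 250 / 255, 245 / 255, 235 / 255   # a nearly-white, overexposed pixel
h, s, v = colorsys.rgb_to_hsv(r, g, b)
# An overexposed pixel is characterized by low saturation and high brightness:
print(round(s, 3), round(v, 3))  # saturation well below 1, value near 1
```

This is exactly the property the first-region detection step relies on: in HSV, overexposure shows up directly in the S and V components.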
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: obtaining color information of the traffic light; when the traffic light is red, adjusting the hue of the pixels of the at least one connected region to a red range, and respectively increasing the saturation and decreasing the brightness of the pixels of the at least one connected region; when the traffic light is yellow, adjusting the hue of the pixels of the at least one connected region to a yellow range, and respectively increasing the saturation and decreasing the brightness of the pixels of the at least one connected region; and when the traffic light is green, adjusting the hue of the pixels of the at least one connected region to a green range, and respectively increasing the saturation and decreasing the brightness of the pixels of the at least one connected region.
In the embodiment of the present application, the increase in saturation and the decrease in brightness of the pixels of the at least one connected region may each be linear, or may each be nonlinear.
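A linear version of the HSV adjustment described above can be sketched for a single pixel of a red light. The red hue value (0.0), the gain factor (1.8), and the attenuation factor (0.6) are illustrative assumptions, not values taken from the embodiment; hue, saturation, and value are taken as fractions in 0..1.

```python
# Hedged sketch of restoring a red light in HSV: the hue is forced into
# the red range, saturation is increased linearly, brightness decreased
# linearly. Factors are illustrative assumptions.
def restore_red_pixel_hsv(h, s, v):
    h = 0.0                      # red position on the hue circle
    s = min(1.0, s * 1.8)        # linear increase of saturation, clamped
    v = max(0.0, v * 0.6)        # linear decrease of brightness
    return h, s, v

print(restore_red_pixel_hsv(0.13, 0.06, 0.98))
```

A nonlinear variant would simply replace the multiplications with, e.g., a gamma-style mapping; the embodiment covers both cases.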
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: converting the at least one connected region to the RGB space; obtaining color information of the traffic light; when the traffic light is red, adjusting the red component of the pixels of the at least one connected region to a first preset range and reducing the blue and green components of those pixels; when the traffic light is yellow, adjusting the red and green components of the pixels of the at least one connected region to a second preset range and reducing the blue component of those pixels; and when the traffic light is green, adjusting the green component of the pixels of the at least one connected region to a third preset range and reducing the red and blue components of those pixels.
Here, the reduction of the blue and green components of the pixels of the at least one connected region may be a linear decrease or a nonlinear decrease.
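The RGB variant for a red light can be sketched per pixel as follows. The first preset range (200..255) and the linear attenuation factor (0.3) are illustrative assumptions; 8-bit component values are assumed.

```python
# Sketch of the RGB-space restoration for a red light: the red component
# is clamped into a preset range while blue and green are reduced
# linearly. The range and factor are assumptions, not embodiment values.
def restore_red_pixel_rgb(r, g, b, low=200, high=255, atten=0.3):
    r = min(max(r, low), high)   # push red into the first preset range
    g = int(g * atten)           # linear decrease of the green component
    b = int(b * atten)           # linear decrease of the blue component
    return r, g, b

print(restore_red_pixel_rgb(250, 245, 235))  # (250, 73, 70)
```

The yellow and green cases follow the same pattern with the second and third preset ranges applied to the appropriate components.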
In one possible implementation, before determining the first region of the first image from the saturation and brightness of the first image, the method further includes filtering the saturation component and the brightness component of the pixels of the first image. In this way, the contour of the overexposed first target object can be identified more stably and accurately.
In one possible implementation, acquiring the first image to be processed includes taking a second region selected by the user on a second image as the first image to be processed.
In this way, compared with processing the second image directly, selecting the first image from the second image and processing only the first image can, on the one hand, reduce the amount of computation in subsequent steps and, on the other hand, improve the accuracy of identifying the area where the first target object (for example, a signal light) is located.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: an acquiring unit configured to acquire a first image to be processed, the first image including a first target object that is overexposed; a determining unit configured to determine a first area of the first image according to the saturation and brightness of the pixels of the first image, the saturation of the pixels of the first area being lower than the average saturation of the pixels of the first image, the brightness of the pixels of the first area being higher than the average brightness of the pixels of the first image, and the first area corresponding to the first target object; and a processing unit configured to perform binarization processing on the first area to obtain a binary image corresponding to the first area. The determining unit is further configured to determine at least one connected area in the binary image having an area greater than or equal to a first threshold, the contour of the at least one connected area corresponding to the contour of the first target object, and the processing unit is further configured to restore the color of the at least one connected area.
In one possible implementation, the processing unit is further configured to convert the first image from a first space to an HSV space, the first space being any one of YUV space, RGB space, or HSL space.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to: acquire color information of the traffic light through the acquiring unit; when the traffic light is red, adjust the hue of the pixels of the at least one connected area to a red range, and respectively raise the saturation and lower the brightness of the pixels of the at least one connected area; when the traffic light is yellow, adjust the hue of the pixels of the at least one connected area to a yellow range, and respectively raise the saturation and lower the brightness of the pixels of the at least one connected area; and when the traffic light is green, adjust the hue of the pixels of the at least one connected area to a green range, and respectively raise the saturation and lower the brightness of the pixels of the at least one connected area.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to: convert the at least one connected region to the RGB space; obtain color information of the traffic light through the acquiring unit; when the traffic light is red, adjust the red component of the pixels of the at least one connected region to a first preset range and reduce the blue and green components of those pixels; when the traffic light is yellow, adjust the red and green components of the pixels of the at least one connected region to a second preset range and reduce the blue component of those pixels; and when the traffic light is green, adjust the green component of the pixels of the at least one connected region to a third preset range and reduce the red and blue components of those pixels.
In a possible implementation, the processing unit is further configured to filter the saturation component and the brightness component of the pixels of the first image.
In a possible implementation, the acquisition unit is configured to take a second area selected by the user on the second image as the first image to be processed.
For the technical effects of the second aspect and its various possible implementations, reference may be made to the technical effects of the first aspect and its various possible implementations, which are not repeated here.
In a third aspect, an embodiment of the present application provides an apparatus in the form of a chip, where the apparatus includes a processor and a memory, where the memory is configured to be coupled to the processor, and store program instructions and data necessary for the apparatus, and where the processor is configured to execute the program instructions stored in the memory, so that the apparatus performs the functions of the image processing device in the above method.
In a fourth aspect, an embodiment of the present application provides an image processing apparatus that may implement the functions performed by the image processing device in the above method. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software comprises one or more modules corresponding to the functions.
In one possible design, the structure of the image processing apparatus includes a processor and a communication interface, the processor being configured to support the image processing apparatus in performing the corresponding functions in the above-described method. The communication interface is used to support communication between the image processing apparatus and other devices, such as a light detector on a traffic light. The image processing apparatus may further comprise a memory for coupling with the processor, which holds the program instructions and data necessary for the image processing apparatus.
In a fifth aspect, an embodiment of the application provides a computer readable storage medium comprising instructions which, when run on a computer, cause the computer to perform any of the methods provided in the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform any of the methods provided in the first aspect.
Detailed Description
The embodiments of the application provide a method and an apparatus for performing color restoration on an image, applied to scenes in which an overexposed image is color-restored. It will be appreciated that the embodiments of the present application may also be applied to scenes in which a video comprising one or more overexposed images is color-restored. In particular, the method can be applied to the color restoration of an overexposed object (or overexposed area) in an image, for example, color restoration of overexposed traffic lights in images or videos taken by an electronic police camera (camera or monitor).
Taking a scene in which an electronic police camera shoots a violation image as an example, fig. 1 shows a schematic diagram of a system architecture suitable for the method of performing color restoration on an image (the violation image shot by the electronic police camera) according to the embodiment of the application. The system comprises the electronic police camera and an image processing device connected with the electronic police camera (the image processing device can also be integrated in the electronic police camera), where the image processing device can perform color restoration processing on the image or video shot by the electronic police camera. The image processing device may also be connected to a traffic light in order to obtain the (historical) color changes of the light from a light detector on the traffic light.
The image processing apparatus in fig. 1 according to the embodiment of the present application may be implemented by one device or may be a functional module in one device, which is not particularly limited in the embodiment of the present application. It will be appreciated that the above-described functionality may be a network element in a hardware device, a software function running on dedicated hardware, a virtualized function instantiated on a platform (e.g., a cloud platform), or a system on a chip. In the embodiment of the application, the system on a chip may consist of a chip, or may include a chip and other discrete devices.
For example, the apparatus for implementing the functions of the image processing apparatus provided by the embodiment of the present application may be implemented by the apparatus 200 in fig. 2. Fig. 2 is a schematic diagram of a hardware structure of an apparatus 200 according to an embodiment of the present application. The apparatus 200 includes at least one processor 201 configured to implement the functions of the image processing device provided in the embodiment of the present application. The apparatus 200 may further include a bus 202 and at least one communication interface 204. Memory 203 may also be included in apparatus 200.
In an embodiment of the application, the processor may be a central processing unit (CPU), a general purpose processor, a network processor (NP), a digital signal processor (DSP), a microprocessor, a microcontroller, a programmable logic device (PLD), or any combination thereof. The processor may also be any other means for performing a processing function, such as a circuit, device, or software module.
Bus 202 may be used to transfer information between the components described above.
A communication interface 204 is used for communicating with other devices or communication networks, such as an Ethernet, a radio access network (RAN), a wireless local area network (WLAN), etc. The communication interface 204 may be an interface, circuit, transceiver, or other device capable of communication, which is not limited in this application. The communication interface 204 may be coupled to the processor 201. The coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units, or modules, which may be in electrical, mechanical, or other forms, for information interaction between the devices, units, or modules.
In embodiments of the application, the memory may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor, for example via bus 202. The memory may also be integrated with the processor.
The memory 203 is configured to store program instructions and may be controlled by the processor 201 to perform a method for color restoration of an image according to the embodiment of the present application. The processor 201 is configured to invoke and execute instructions stored in the memory 203, thereby implementing a method for color restoration of an image according to the embodiments of the present application described below.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not particularly limited in the embodiments of the present application.
Optionally, a memory 203 may be included in the processor 201.
In a particular implementation, as one embodiment, processor 201 may include one or more CPUs, such as CPU0 and CPU1 of FIG. 2.
In a particular implementation, the apparatus 200 may include a plurality of processors, such as the processor 201 and the processor 207 in fig. 2, as one embodiment. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, as one embodiment, the apparatus 200 may further comprise an output device 205 and an input device 206. The output device 205 is coupled to the processor 201 and can display information in a variety of ways. For example, the output device 205 may be a liquid crystal display (LCD), a light emitting diode (LED) display device, a cathode ray tube (CRT) display device, or a projector. The input device 206 is coupled to the processor 201 and may receive user input in a variety of ways. For example, the input device 206 may be a camera, mouse, keyboard, touch screen device, or sensing device.
The apparatus 200 may be a general purpose device or a special purpose device. In a specific implementation, the apparatus 200 may be a video camera, a monitor, a video display device, a desktop computer, a portable computer, a web server, a personal digital assistant (PDA), a mobile phone, a tablet computer, a wireless terminal device, an embedded device, or a device having a structure similar to that in fig. 2. The embodiments of the present application do not limit the type of the apparatus 200.
For clarity and conciseness in the description of the embodiments below, a brief introduction to related concepts or technologies is first given:
Color space: a color is generally described by three independent attributes, which together form a spatial coordinate system, i.e., a color space. Color spaces include the RGB space, the YUV space, the HSV space, the HSL space, etc.; different color spaces measure the color of the same object from different angles. Different processing procedures emphasize different aspects of color, so the various color spaces can be converted into one another to meet different processing requirements. Wherein:
RGB space: R represents the red component (red channel), G represents the green component (green channel), and B represents the blue component (blue channel). Colors are obtained by varying the three color channels of red, green, and blue and superimposing them on one another. In each component, the smaller the value, the lower the brightness; the larger the value, the higher the brightness. When components are mixed, the brightness of the mixture equals the sum of the brightnesses of the components.
YUV space: Y represents brightness, i.e., the gray-scale value. U and V represent chromaticity, which describes the color and saturation of the image and specifies the color of a pixel. If there is only the Y component and no U and V components, the image is a black-and-white grayscale image. The YUV space is mainly used to optimize the transmission of color video signals and to keep them backward compatible with old black-and-white televisions.
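The Y (luma) component described above is a weighted sum of the RGB components; the weights below are the standard BT.601 coefficients, which is one common convention for the YUV space (other standards, such as BT.709, use different weights).

```python
# The luma component of YUV for one 8-bit RGB pixel, using the BT.601
# weighting; pure red, green, and blue contribute unequally to brightness.
def luma_bt601(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

print(round(luma_bt601(255, 255, 255), 1))  # 255.0 — white has maximum luma
```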
HSV space: HSV represents the points of the RGB color space in an inverted cone. H represents the color information, i.e., the position on the spectrum, measured as an angle in the range 0° to 360°; red is 0°, green is 120°, and blue is 240°. S is the ratio between the saturation of the selected color and the maximum saturation of that color, ranging from 0 to 1; when S = 0, only gray levels are present. V represents the brightness of the color, ranging from 0 to 1.
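The hue positions just stated can be checked with the standard-library `colorsys` module, which returns hue as a fraction of the full 360° circle:

```python
# Illustrative check of the hue angles of pure red, green, and blue.
import colorsys

red_hue = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)[0] * 360
green_hue = colorsys.rgb_to_hsv(0.0, 1.0, 0.0)[0] * 360
blue_hue = colorsys.rgb_to_hsv(0.0, 0.0, 1.0)[0] * 360
print(round(red_hue, 1), round(green_hue, 1), round(blue_hue, 1))  # 0.0 120.0 240.0
```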
HSL space: HSL is similar to HSV, and the first two parameters of this model are the same as in HSV. L represents the lightness of a color and can be used to control its shade. L ranges from 0% to 100%; the smaller the value, the darker the color (closer to black), and the larger the value, the brighter the color (closer to white).
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings. In the description of the application, unless otherwise indicated, "at least one" means one or more, and "a plurality" means two or more. In addition, to facilitate a clear description of the technical solutions, the words "first", "second", etc. are used to distinguish between identical or similar items having substantially the same function and effect. It will be appreciated by those skilled in the art that these words do not limit the quantity or order of execution, and do not indicate that the items referred to are necessarily different.
For ease of understanding, the method for performing color restoration on an image according to the embodiment of the present application is described in detail below with reference to the accompanying drawings.
As shown in fig. 3, an embodiment of the present application provides a method for performing color restoration on an image, including:
301. A first image to be processed is acquired, the first image comprising the overexposed first target object.
Where the first target object refers to one or more objects (or areas where objects are located) that are overexposed in the first image. The objects may be, for example, traffic lights (of various shapes and colors), vehicles, traffic signs, etc.
In one possible design, a second area framed by the user on a second image may be used as the first image to be processed. For example, as shown in fig. 4, when the user considers that a traffic light (the first target object) in a certain image (the second image) is overexposed, the user may perform a first operation on the second image through an input device (for example, a mouse or a touch screen) of the image processing apparatus, where the first operation is used to frame the area (the second area) where the overexposed traffic light is located. After the image processing apparatus identifies the first operation performed by the user on the second image, it can crop the second area from the second image according to the coordinate information of the second area framed by the user, obtaining a third image and the first image to be processed. The first image to be processed includes the second region, and the third image includes the regions other than the second region. That is, the first image to be processed includes the second region selected by the user on the second image.
In this way, compared with processing the second image directly, selecting the first image from the second image and processing only the first image can, on the one hand, reduce the amount of computation in subsequent steps and, on the other hand, improve the accuracy of identifying the area where the first target object (for example, a traffic light) is located.
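The crop in step 301 amounts to array slicing; the sketch below assumes the second image is a NumPy array and uses hypothetical rectangle coordinates for the user-framed second area.

```python
# Illustrative crop of the user-framed second region from the second
# image via NumPy slicing; coordinates are hypothetical.
import numpy as np

second_image = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder frame
x0, y0, x1, y1 = 900, 200, 1020, 360           # user-framed rectangle
first_image = second_image[y0:y1, x0:x1]       # first image to be processed
print(first_image.shape)  # (160, 120, 3)
```

Only this much smaller array is passed to the subsequent steps, which is where the reduction in computation comes from.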
302. Optionally, the saturation component and the brightness component of the pixels of the first image are filtered.
It should be noted that, if the second image is an HSV image, that is, the second image is in the HSV space, the first image to be processed is also in the HSV space, and the saturation component and the brightness component of the pixels of the first image may be filtered directly, so that the contour of the overexposed first target object (for example, an overexposed traffic light) can be identified more stably and accurately in subsequent steps.
If the second image is in the first space, the first space may be the YUV space, the RGB space, or the HSL space, and the first image to be processed is also in the first space. In this case, in order to obtain the saturation component and the brightness component of the pixels of the first image, the first image needs to be converted from the first space to the HSV space. The saturation component and the brightness component of the pixels of the first image are then filtered, so that the contour of the overexposed first target object can be identified more stably and accurately in subsequent steps.
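The optional filtering of step 302 can be sketched as a simple 3×3 mean (box) filter applied to a channel; the embodiment does not specify the filter type, so the kernel choice here is an assumption, and a median or Gaussian filter would serve the same noise-suppression purpose.

```python
# Minimal sketch of step 302: a 3x3 box filter over one channel (e.g.,
# the saturation component), implemented with NumPy and edge padding.
import numpy as np

def box_filter(channel: np.ndarray) -> np.ndarray:
    padded = np.pad(channel.astype(float), 1, mode="edge")
    out = np.zeros_like(channel, dtype=float)
    for dy in (-1, 0, 1):                      # accumulate the 3x3 neighborhood
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + channel.shape[0],
                          1 + dx : 1 + dx + channel.shape[1]]
    return out / 9.0

s_filtered = box_filter(np.full((4, 4), 0.5))
print(float(s_filtered[0, 0]))  # 0.5 — a constant channel is unchanged
```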
303. A first region of the first image is determined based on the saturation and brightness of pixels of the first image.
In the embodiment of the present application, the first region corresponds to a region of the first target object (where the first target object is located). The region in the first image where the overexposed first target object (e.g., an overexposed traffic light) is located may be determined from the saturation and brightness of the pixels of the first image.
In one possible design, the saturation component and the brightness component of the pixels of the first image after filtering may be analyzed, and the region where the saturation component is lower than the average saturation of the pixels of the first image and the brightness component is higher than the average brightness of the pixels of the first image is determined to be the region where the first target object is located. That is, the saturation of the pixels of the first area is lower than the average saturation of the pixels of the first image, and the brightness of the pixels of the first area is higher than the average brightness of the pixels of the first image. Fig. 5 is a schematic view of the first area.
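The selection rule of step 303 reduces to two element-wise comparisons against the image-wide means; a minimal sketch with a tiny hypothetical 2×2 image, in which exactly one pixel is overexposed (low saturation, high brightness):

```python
# Sketch of step 303: the first region is the set of pixels whose
# saturation is below the image mean and whose brightness is above it.
import numpy as np

def first_region_mask(s: np.ndarray, v: np.ndarray) -> np.ndarray:
    return (s < s.mean()) & (v > v.mean())

s = np.array([[0.9, 0.9], [0.05, 0.9]])   # one low-saturation pixel
v = np.array([[0.2, 0.2], [0.98, 0.2]])   # the same pixel is bright
print(first_region_mask(s, v))
```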
304. And carrying out binarization processing on the first region to obtain a binary image corresponding to the first region.
For example, as shown in fig. 6, after the binarization processing is performed on the first area shown in fig. 5, a binary image corresponding to the first area may be obtained. The binary image corresponding to the first region may include three connected regions (a, b, and c, respectively). Connected region a is the area where the signal light is located, and connected regions b and c may be areas where the halo generated by the signal light is located.
305. At least one connected region in the binary image having an area greater than or equal to a first threshold is determined.
In one possible design, the contour of at least one connected region in the binary image with an area greater than or equal to the first threshold corresponds to the contour of the first target object, and the contour of that connected region may be taken as the contour of the first target object. It should be noted that the contour of the at least one connected region having an area greater than or equal to the first threshold may be larger or smaller than the contour of the actual (real) first target object. For example, when the first target object is a signal light, since the (lit) signal light may generate a halo, the at least one connected region may include not only the area where the signal light itself is located but also the area where the halo generated by the signal light is located; the contour of the at least one connected region having an area greater than or equal to the first threshold may therefore be larger than the contour of the actual signal light. For another example, if a portion of the signal light is damaged and cannot emit light, the at least one connected region may include only the region where the signal light emits light, and in this case the contour of the at least one connected region is slightly smaller than the contour of the actual signal light.
The first threshold may be determined according to an area of the connected region having the largest area in the binary image. For example, the first threshold may be equal to the area of the connected region having the largest area in the binary image, or the first threshold may be N% (e.g., 30%) of the area of the connected region having the largest area in the binary image, N being a positive number.
A plurality of connected regions with different areas may exist in the binary image corresponding to the first area. For example, as shown in fig. 6, three connected regions (a, b, and c, respectively) may be included in the binary image corresponding to the first region. All connected regions in the binary image are found, and are then sorted by area (for example, from largest to smallest). Among these connected regions of different sizes, the connected regions of smaller area may be filtered out and the connected regions of larger area retained, and at least one connected region of larger area may be taken as the region of the first target object (e.g., the signal light); for example, the single connected region of largest area may be taken as the region where the signal light is located, or several connected regions of larger area may be taken as the region where the signal light is located.
For example, the connected region with the largest area in the binary image may first be identified; assuming its area is max, the first threshold may be set to max/(x+2), filtering out the connected regions with an area smaller than max/(x+2). For example, as shown in fig. 7, connected regions b and c may be filtered out, leaving connected region a. The contour of connected region a corresponds to the contour of the first target object.
The parameter x may take values in the range 0-9 and may be used to adjust the magnitude of the first threshold. The larger the value of x, the fewer small connected regions are filtered out, i.e., the more small connected regions are retained, so that the contour of the first target object corresponds to the contours of more connected regions. For example, if the first target object is a signal lamp, the region where the signal lamp is located spreads outwards (i.e., it may include not only the signal lamp but also the halo generated by the signal lamp). The smaller the value of x, the more small connected regions are filtered out, i.e., the fewer are retained, so that the contour of the first target object corresponds to the contours of fewer connected regions. For example, if the first target object is a signal lamp, the region where the signal lamp is located shrinks inwards (i.e., it includes only the signal lamp and does not include the halo generated by it).
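The effect of x on the first threshold can be sketched as follows. The region areas used here are illustrative values only, chosen to echo the three regions of fig. 6 and fig. 7.

```python
def filter_by_area(areas, x):
    """Keep only regions whose area reaches the first threshold
    max_area / (x + 2), with x in the range 0-9."""
    assert 0 <= x <= 9
    max_area = max(areas)
    first_threshold = max_area / (x + 2)
    return [a for a in areas if a >= first_threshold]

areas = [900, 200, 60]           # e.g. regions a, b, c (illustrative)
print(filter_by_area(areas, 0))  # threshold 450: only the largest survives
print(filter_by_area(areas, 9))  # threshold ~81.8: more small regions kept
```

With x = 0 the threshold is max/2, so only region a survives; with x = 9 the threshold drops to max/11, so region b is also retained, matching the "larger x retains more small regions" behavior described above.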
For another example, if the first image to be processed includes an arrow-shaped indicator light, then after the image processing apparatus determines the first area of the first image as shown in (a) of fig. 8 and performs binarization processing on it, the binary image corresponding to the first area may include four connected regions (d, e, f, and g), as shown in (b) of fig. 8; and, as shown in (c) of fig. 8, the at least one connected region with an area greater than or equal to the first threshold may be the connected regions d and e. That is, the connected regions f and g are filtered out, leaving the connected regions d and e. The contours of the connected regions d and e may be regarded as the contour of the first target object.
306. The color of the at least one connected region is restored.
In the embodiment of the present application, restoring the color of the at least one connected region may also be regarded as correcting the color of the at least one connected region. Restoring the color of the at least one connected region means restoring (or correcting) the color of the overexposed first target object. For example, as shown in fig. 7, restoring the color of the connected region a restores or corrects the color of the overexposed first target object.
For example, when the first target object is a traffic signal, the color information of the traffic signal may be obtained from a signal detector externally connected to the signal lamp, or determined through image recognition of the status of the light.
In one possible design, the color of the at least one connected region may be restored in HSV space. Specifically, when the traffic signal is red, the hue of the pixels of the at least one connected region is adjusted into the red range, and the saturation and brightness of those pixels are respectively increased and decreased; when the traffic signal is yellow, the hue is adjusted into the yellow range, and the saturation and brightness are respectively increased and decreased; and when the traffic signal is green, the hue is adjusted into the green range, and the saturation and brightness are respectively increased and decreased.
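The HSV adjustment for the red case can be sketched as follows. The hue range, saturation gain, and brightness factor used here are illustrative assumptions, not values taken from the application; they only show the pattern of pulling hue into the target range while raising saturation and lowering brightness.

```python
def restore_red_hsv(region):
    """region: list of (h, s, v) pixels, h in [0, 360), s and v in [0, 1].
    Returns the pixels with hue forced into an assumed red range,
    saturation increased, and brightness (value) decreased."""
    restored = []
    for h, s, v in region:
        if not (h <= 20 or h >= 340):  # assumed red hue range
            h = 0.0                    # pull hue into the red range
        s = min(1.0, s * 1.5)          # increase saturation (assumed gain)
        v = max(0.0, v * 0.7)          # decrease brightness (assumed factor)
        restored.append((h, s, v))
    return restored

# An overexposed pixel (washed-out, bright) and an already-reddish pixel.
overexposed = [(55.0, 0.10, 0.95), (350.0, 0.20, 0.90)]
for h, s, v in restore_red_hsv(overexposed):
    print(h, s, v)
```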
In the embodiment of the present application, the increase in saturation and the decrease in brightness of the pixels of the at least one connected region may each be linear or nonlinear. A linear increase in saturation raises the saturation of every pixel in the at least one connected region by the same amount (value), whereas a nonlinear increase raises the saturation of different pixels by different amounts (values). Likewise, a linear decrease in brightness lowers the brightness of every pixel by the same amount (value), whereas a nonlinear decrease lowers the brightness of different pixels by different amounts (values). After the saturation and brightness of the pixels of the at least one connected region are respectively increased and decreased, there is a strong correlation between the saturation and brightness of the at least one connected region; before this adjustment, no such strong correlation exists.
In another possible design, the color of at least one connected region may be restored in RGB space. Specifically, when the traffic signal is (determined to be) red, the red component of the pixel of the at least one connected region may be adjusted to a first preset range, and the blue and green components of the pixel of the at least one connected region may be reduced.
For example, assuming color restoration is performed on an 8-bit encoded RGB image, if the red component (R value) of a given pixel is Rx, the R value of that pixel may be adjusted to R = (200 + 55 × x/9 + Rx)/2 (a first preset range). The parameter x may take values in the range 0-9 and may be used to adjust the magnitude of the R value.
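The formula quoted above can be sketched directly; the input values below are illustrative only.

```python
def adjust_red(rx, x):
    """Adjust an 8-bit red component Rx to R = (200 + 55*x/9 + Rx)/2,
    with x in the range 0-9 tuning how strongly red is boosted."""
    assert 0 <= rx <= 255 and 0 <= x <= 9
    return (200 + 55 * x / 9 + rx) / 2

print(adjust_red(100, 0))  # (200 + 0 + 100) / 2 = 150.0
print(adjust_red(100, 9))  # (200 + 55 + 100) / 2 = 177.5
```

Note that the output always lands in roughly the upper half of the 8-bit range (at least 100 even for Rx = 0), i.e., the "first preset range" is a strongly red value regardless of how washed out the original pixel was.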
Wherein the decreasing of the blue and green components of the pixels of the at least one connected region may be a linear decrease or a non-linear decrease of the blue and green components of the pixels of the at least one connected region. The linear reduction may reduce the blue and green components of each of the pixels in the at least one connected region by the same amount (value) and the non-linear reduction may reduce the blue and green components of different ones of the pixels in the at least one connected region by different amounts (value).
When the traffic signal is determined to be yellow, the red and green components of the pixels of the at least one connected region are increased and the blue component is reduced; when the traffic signal is determined to be green, the green component of the pixels of the at least one connected region is increased and the red and blue components are reduced. The specific process may refer to the processing described above for the red traffic signal and is not repeated here.
307. The first image and the third image are combined.
If the first image was converted from the first space to HSV space in step 302, the color-restored first image may be converted back from HSV space to the first space and then combined with the third image (the third image being the portion of the second image that remains after the first image is cropped out) to obtain a complete image (which may be regarded as the color-restored second image). If no conversion from the first space to HSV space was performed in step 302, the color-restored first image and the third image may be combined directly to obtain a complete image.
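The combining step amounts to pasting the restored cropped region back at its original offset in the full picture. The sketch below treats images as 2-D lists of pixel values; the sizes and offset are illustrative assumptions.

```python
def combine(full, patch, top, left):
    """Paste `patch` (a 2-D list) into a copy of `full` at (top, left),
    i.e., write the restored first image back into the full frame."""
    out = [row[:] for row in full]
    for r, row in enumerate(patch):
        for c, value in enumerate(row):
            out[top + r][left + c] = value
    return out

full = [[0] * 4 for _ in range(3)]   # the rest of the picture (third image)
patch = [[9, 9], [9, 9]]             # the color-restored first image
print(combine(full, patch, 1, 1))
```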
In one possible design, when color restoration is performed on an overexposed first target object in a video, a user may select the overexposed object before the video starts or while it is paused; during subsequent playback, each frame of the video may be processed by the method of the embodiment of the present application, so that the color of the overexposed first target object is restored throughout the video.
According to the method provided by the embodiment of the application, after the first image to be processed is obtained, a first area of the first image (which may be taken as the area where the overexposed first target object is located) can be determined according to the saturation and brightness of the pixels of the first image; binarization processing is then performed on the first area to obtain a corresponding binary image; at least one connected region in the binary image with an area greater than or equal to a first threshold is determined (the contour of the at least one connected region may be taken as the contour of the first target object); and the color of the at least one connected region (namely, the color of the overexposed first target object) is then restored. Therefore, the method provided by the embodiment of the application can accurately restore the color of an overexposed object (such as a traffic signal lamp) in an image. Further, a series of problems caused by overexposure of objects in the image (e.g., difficult evidence collection problems due to overexposure of signal lights in current electrical police monitoring scenarios) may be addressed.
In the prior art, a camera with an ultra-wide dynamic range may be used to eliminate the color distortion of a signal lamp caused by strong light, but such a camera is expensive and its color restoration effect on an overexposed signal lamp under low illumination is limited. The method provided by the embodiment of the application requires no additional equipment, is inexpensive, adapts well to the environment, and achieves good color restoration of an overexposed signal lamp under low illumination. In the prior art, color restoration of the signal lamp may also be performed in the RGB space of the image, or region recognition and color enhancement of the signal lamp in the image may be performed by a deep-learning method, but these approaches may produce a gradient artifact in which the color-restored signal-lamp region is discontinuous with the surrounding image. The method provided by the embodiment of the application is based on HSV space, which better reflects the brightness information of the physical signal lamp, so that the signal-lamp region is identified more stably and accurately, the color-restored region blends more continuously with the surrounding image, and the visual effect is more lifelike. In addition, the method provided by the embodiment of the application is efficient and reliable, and can meet the requirements of signal-lamp color restoration in video.
The above description has been made mainly in terms of the image processing apparatus for the solution provided by the embodiment of the present application. It will be appreciated that the image processing apparatus, in order to achieve the above-described functions, comprises corresponding hardware structures and/or software modules that perform the respective functions. Those skilled in the art will readily appreciate that the algorithm steps described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and software. Whether a function is implemented as hardware or software-driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may divide the functional modules of the image processing apparatus according to the above-described method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
In the case of dividing the respective functional modules with the respective functions, fig. 9 shows a possible structural schematic diagram of the image processing apparatus 9 referred to in the above-described embodiment, which includes an acquisition unit 901, a determination unit 902, and a processing unit 903. The acquisition unit 901 is for supporting the image processing apparatus to execute the process 301 in fig. 3. The determination unit 902 is for supporting the image processing apparatus to execute the processes 303 and 305 in fig. 3. The processing unit 903 is used to support the image processing apparatus to execute the processes 302, 304, and 306 in fig. 3. All relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
The steps of a method or algorithm described in connection with the present disclosure may be embodied in hardware, or may be embodied in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a read-only optical disk, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC). In addition, the ASIC may be located in a core network interface device. Alternatively, the processor and the storage medium may reside as discrete components in a core network interface device.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on an image processing device readable medium. The image processing device readable medium includes an image processing device storage medium and a communication medium, wherein the communication medium includes any medium that facilitates transfer of an image processing device program from one place to another. A storage medium may be any available medium that can be accessed by a general purpose or special purpose image processing device.
The foregoing embodiments are merely intended to illustrate the technical solutions of the present application in further detail and are not to be construed as limiting its scope; any modifications, equivalents, improvements, and the like made on the basis of the teachings of the application shall fall within its protection scope.