
CN111815531B - Image processing method, device, terminal equipment and computer-readable storage medium - Google Patents


Info

Publication number
CN111815531B
CN111815531B (application number CN202010657947.5A)
Authority
CN
China
Prior art keywords
image
images
frames
processed
frame
Prior art date
Legal status (assumption, not a legal conclusion)
Active
Application number
CN202010657947.5A
Other languages
Chinese (zh)
Other versions
CN111815531A (en)
Inventor
赖泽民
Current Assignee (listed assignees may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010657947.5A
Publication of CN111815531A
Application granted
Publication of CN111815531B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/70 - Denoising; Smoothing
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 - Noise processing, e.g. detecting, correcting, reducing or removing noise
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20212 - Image combination
    • G06T2207/20221 - Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application is applicable to the technical field of image processing, and provides an image processing method, an image processing apparatus, a terminal device and a computer-readable storage medium. The method comprises the following steps: acquiring at least two continuously shot frames of images to be processed; acquiring a static area and a moving area of the at least two frames of images to be processed; performing noise reduction on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image; performing noise reduction on the moving area according to the sub-image of the moving area in one frame of candidate image to be processed, to obtain a second sub-image; and splicing the first sub-image and the second sub-image to obtain a target image. By means of the method and the apparatus, noise in an image can be reduced and a high-quality image obtained.

Description

Image processing method, device, terminal equipment and computer readable storage medium
Technical Field
The present application belongs to the technical field of image processing, and in particular, relates to an image processing method, an image processing device, a terminal device, and a computer readable storage medium.
Background
Image noise refers to unnecessary or redundant interference information existing in an image, and the existence of noise seriously affects the quality of the image. During the process of acquiring, transmitting and storing an image, noise in the image may be increased due to the influence of various factors (such as relative motion between the imaging device and the object, nonlinearity of the image sensor, etc.), resulting in degradation of the image quality. Therefore, how to improve the quality of an image is a technical problem to be solved in the field of image processing.
Disclosure of Invention
The present application provides an image processing method, an image processing apparatus, a terminal device and a computer-readable storage medium, so as to suppress noise in an image and obtain a high-quality image.
In a first aspect, the present application provides an image processing method, including:
acquiring at least two frames of images to be processed which are continuously shot;
acquiring a static area and a moving area of the at least two frames of images to be processed;
according to the sub-images of the static area in the at least two frames of images to be processed, carrying out noise reduction processing on the static area to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;
according to the sub-image of the motion area in a frame of candidate to-be-processed image, carrying out noise reduction processing on the motion area to obtain a second sub-image, wherein the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate to-be-processed image is a to-be-processed image with the motion area in the at least two frames of to-be-processed images;
and splicing the first sub-image and the second sub-image to obtain a target image.
In a second aspect, the present application provides an image processing apparatus including:
the image acquisition module is used for acquiring at least two frames of images to be processed which are continuously shot;
the region acquisition module is used for acquiring a static region and a moving region of the at least two frames of images to be processed;
the first noise reduction module is used for carrying out noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed respectively to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;
the second denoising module is used for denoising the motion region according to the sub-image of the motion region in a frame of candidate images to be processed to obtain a second sub-image, wherein the second sub-image is a denoising sub-image corresponding to the motion region, and the candidate images to be processed are images to be processed in which the motion region exists in the at least two frames of images to be processed;
and the target acquisition module is used for splicing the first sub-image and the second sub-image to obtain a target image.
In a third aspect, the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the image processing method according to the first aspect as described above when the computer program is executed.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image processing method according to the first aspect described above.
In a fifth aspect, the present application provides a computer program product for, when run on a terminal device, causing the terminal device to perform the steps of the image processing method as described in the first aspect above.
From the above, the present application acquires the static area and the moving area of at least two continuously shot frames of images to be processed; obtains the noise-reduced first sub-image corresponding to the static area according to the sub-images of the static area in each of the at least two frames; obtains the noise-reduced second sub-image corresponding to the moving area according to the sub-image of the moving area in one frame of candidate image to be processed; and splices the first sub-image and the second sub-image, so that a noise-reduced target image, that is, a high-quality image, is obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flowchart of an implementation of an image processing method according to an embodiment of the present application;
Fig. 2 is an exemplary diagram of image stitching;
Fig. 3 is a schematic implementation flowchart of an image processing method according to the second embodiment of the present application;
Fig. 4 is a schematic implementation flowchart of an image processing method according to the third embodiment of the present application;
Fig. 5 is a schematic structural diagram of an image processing apparatus according to the fourth embodiment of the present application;
Fig. 6 is a schematic structural diagram of a terminal device according to the fifth embodiment of the present application;
Fig. 7 is a schematic structural diagram of a terminal device according to the sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the terminal devices described in the embodiments of the present application include, but are not limited to, portable devices having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad), such as mobile phones, laptop computers, or tablet computers. It should also be appreciated that in some embodiments the device may instead be a non-portable device, such as a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad).
In the following discussion, a terminal device including a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, telephony applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
Various applications that may be executed on the terminal device may use at least one common physical user interface device such as a touch sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal may be adjusted and/or changed between applications and/or within the corresponding applications. In this way, the common physical architecture (e.g., touch-sensitive surface) of the terminal may support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that the sequence number of each step in this embodiment does not mean the sequence of execution, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
Referring to fig. 1, a flowchart of an implementation of an image processing method according to an embodiment of the present application, where the image processing method is applied to a terminal device, as shown in the figure, the image processing method may include the following steps:
step 101, acquiring at least two frames of images to be processed, which are shot continuously.
The at least two continuously shot frames of images to be processed are adjacent in shooting time. For example, suppose a camera integrated in a terminal device continuously shoots five frames of images, denoted A1, A2, A3, A4 and A5 in order of shooting time. Then A1, A2 and A3 are three continuously shot frames, whereas A1 and A3 are not two continuously shot frames, because A1 and A3 are not adjacent in shooting time.
It should be noted that, in the embodiment of the present application, the at least two continuously shot frames of images to be processed may be acquired through a camera integrated in the terminal device (for example, by continuously shooting them with the camera), or may be acquired from another device (for example, by receiving them from a server). The manner of acquiring the at least two continuously shot frames of images to be processed is not limited here.
Step 102, acquiring a static area and a moving area of at least two frames of images to be processed.
In the embodiment of the present application, the static area and the moving area of the at least two frames of images to be processed can be divided according to the differences between all adjacent pairs of frames among the at least two frames of images to be processed. The difference between two adjacent frames may be the offset of matched pixels in the two frames: if the offset of a matched pixel is greater than an offset threshold, the matched pixel is determined to have moved, and the region formed by the pixels that have moved is the moving area; if the offset of a matched pixel is less than or equal to the offset threshold, the matched pixel is determined not to have moved, and the region formed by the pixels that have not moved is the static area. Optionally, the user may set the offset threshold according to actual requirements, or offset thresholds corresponding to different ambient light levels may be set in advance and the threshold selected adaptively according to the ambient light level at shooting time, which is not limited here.
Taking five frames of images to be processed as an example to illustrate "all adjacent pairs of frames": if the five frames are A1, A2, A3, A4 and A5 in order of shooting time, then (A1, A2), (A2, A3), (A3, A4) and (A4, A5) are the adjacent pairs, i.e. the five frames of images to be processed contain four groups of two adjacent frames.
In the embodiment of the present application, after the static area and the moving area of the at least two frames of images to be processed are acquired, for the i-th frame of image to be processed (any one of the at least two frames), the sub-image of the static area in the i-th frame can be extracted from it, so that at least two sub-images of the static area are obtained from the at least two frames. Likewise, the sub-image of the moving area can be extracted from one frame of candidate image to be processed, where a candidate image to be processed is an image, among the at least two frames of images to be processed, in which the moving area exists.
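As a rough sketch of the region division in step 102 (not the patent's exact matching procedure), adjacent frames can be compared pixel by pixel and pixels whose difference exceeds a threshold marked as moving; using absolute intensity difference in place of the offset of matched pixels, and the default threshold value, are assumptions made for illustration:

```python
import numpy as np

def split_static_motion(frames, offset_threshold=10):
    """Return a boolean mask that is True in the moving area.

    Hypothetical illustration: the absolute intensity difference between
    all adjacent frame pairs stands in for the 'offset of matched pixels';
    the threshold value is an assumption, not from the patent.
    """
    motion = np.zeros(frames[0].shape, dtype=bool)
    for prev, curr in zip(frames, frames[1:]):  # all adjacent pairs
        diff = np.abs(curr.astype(np.int32) - prev.astype(np.int32))
        motion |= diff > offset_threshold       # pixels that have moved
    return motion                               # ~motion is the static area
```

The static area is then simply the complement of the returned mask.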
And 103, carrying out noise reduction processing on the static area according to the sub-images of the static area in at least two frames of images to be processed respectively, and obtaining a first sub-image.
The first sub-image refers to a noise-reduced sub-image corresponding to the static area.
In this embodiment of the present application, the still region corresponds to a sub-image in each of the at least two frames of images to be processed, and the noise-reduced sub-image (i.e., the first sub-image) corresponding to the still region may be obtained according to the sub-images of the still region in all the images to be processed. For example, taking five frames of images to be processed as an example, the still area is the area where the sun is located, then the sun images (i.e. the sub-images of the sun area in the images to be processed) exist in the five frames of images to be processed, and according to the sun images of the areas where the sun is located in the five frames of images to be processed respectively, the noise-reduced sun images corresponding to the areas where the sun is located can be obtained.
Noise reduction processing for stationary areas includes, but is not limited to, the following two ways:
In the first way, the sub-images of the static area in the at least two frames of images to be processed are superposed, and the superposed image is the first sub-image; that is, the quality of the first sub-image corresponding to the static area is improved by superposing the at least two sub-images corresponding to the static area.
In the second way, a preset noise reduction algorithm is first used to perform noise reduction on the sub-images of the static area in the at least two frames of images to be processed, and the noise-reduced sub-images are then superposed; the superposed image is the first sub-image. That is, performing noise reduction on the at least two sub-images corresponding to the static area before superposing them can further improve the quality of the first sub-image. The preset noise reduction algorithm may be any noise reduction algorithm set in advance, for example a non-local means filtering algorithm, which is not limited here.
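The superposition in the first way above can be sketched as a per-pixel average of the static-area sub-images; equal-weight averaging is one plausible reading of "superposing" and is an assumption here:

```python
import numpy as np

def denoise_static(sub_images):
    """Superpose (here: average) the static-area sub-images of all frames.

    Averaging K aligned sub-images of the same static scene attenuates
    zero-mean noise while preserving the common signal; the equal weights
    are an illustrative assumption, not the patent's prescription.
    """
    stack = np.stack([s.astype(np.float64) for s in sub_images])
    return stack.mean(axis=0)   # the first sub-image
```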
And 104, carrying out noise reduction processing on the motion region according to the sub-image of the motion region in the frame of candidate image to be processed, and obtaining a second sub-image.
The second sub-image is a noise-reduced sub-image corresponding to the motion area.
In this embodiment of the present application, since the motion region does not necessarily exist in every one of the at least two frames of images to be processed, the candidate images to be processed may first be obtained from the at least two frames, and the noise-reduced sub-image corresponding to the motion region (i.e., the second sub-image) may then be obtained according to the sub-image of the motion region in one candidate frame. The one frame of candidate image in step 104 may be any one of all the candidate images to be processed among the at least two frames; when there are at least two candidate images, it may be any one of them, or it may be the candidate image with the highest definition, which is not limited here.
For example, taking five frames of images to be processed as an example, suppose the motion area is the area where lightning is located, a lightning image exists in two of the five frames, and no lightning image exists in the other three frames. Then the noise-reduced lightning image can be obtained from either of the two frames containing the lightning image, or from whichever of them has the highest definition.
Noise reduction processing of the motion region includes, but is not limited to, the following two ways:
In the first way, the sub-image of the motion area in the one frame of candidate image of step 104 is used directly as the second sub-image; taking the sub-image from a single candidate frame avoids the motion blur that superposing multiple sub-images would cause, thereby improving the quality of the second sub-image corresponding to the motion area.
In the second way, a preset noise reduction algorithm is first used to perform noise reduction on the sub-image of the motion area in the one frame of candidate image of step 104, and the noise-reduced sub-image is used as the second sub-image; that is, performing noise reduction on the sub-image before using it as the second sub-image can further improve the quality of the second sub-image corresponding to the motion area. The preset noise reduction algorithm may be any noise reduction algorithm set in advance, for example a non-local means filtering algorithm, which is not limited here.
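Selecting "the candidate image with the highest definition" can be sketched with a variance-of-Laplacian sharpness score; the patent does not define how definition is measured, so this particular score is an assumption:

```python
import numpy as np

def sharpness(img):
    """Variance of a 4-neighbour Laplacian response as a definition proxy
    (an assumed measure; the patent leaves 'definition' unspecified)."""
    img = img.astype(np.float64)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]    # vertical neighbours
           + img[1:-1, :-2] + img[1:-1, 2:])   # horizontal neighbours
    return lap.var()

def pick_candidate(candidates):
    """Return the candidate frame with the highest definition score."""
    return max(candidates, key=sharpness)
```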
And 105, splicing the first sub-image and the second sub-image to obtain a target image.
In the embodiment of the present application, the first sub-image and the second sub-image may be spliced according to the position distribution, in the image to be processed, of the static area corresponding to the first sub-image and the moving area corresponding to the second sub-image; the spliced image is the target image. Since both the first sub-image and the second sub-image that form the target image are noise-reduced sub-images, the target image is also a noise-reduced image with higher picture quality.
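The splicing in step 105 can be sketched as a mask-based composite that places the second sub-image in the moving area and the first sub-image everywhere else; representing the position distribution as a boolean mask is an assumption for illustration:

```python
import numpy as np

def splice(first_sub, second_sub, motion_mask):
    """Assemble the target image from the two noise-reduced sub-images.

    motion_mask is True where the moving area lies (a hypothetical
    representation of the 'position distribution' in the text).
    """
    return np.where(motion_mask, second_sub, first_sub)
```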
As shown in Fig. 2, an exemplary diagram of image stitching, two frames of images to be processed are shown. The area where the sun is located and the area where the clouds are located are both static areas, and the area where the lightning is located is a moving area. The sun area and the cloud area are obtained from the first frame of image to be processed, and noise reduction is performed on them to obtain a noise-reduced sun image corresponding to the sun area and a noise-reduced cloud image corresponding to the cloud area. The lightning area is obtained from the second frame of image to be processed, and noise reduction is performed on it to obtain a noise-reduced lightning image corresponding to the lightning area. The sun image, the cloud image and the lightning image are then spliced, so that a high-quality image containing all three can be obtained. The black dots in the first and second frames represent noise points; by effectively suppressing noise, i.e. reducing the noise points, the present application obtains a high-quality image.
According to the embodiment of the present application, the static area and the moving area of at least two continuously shot frames of images to be processed are acquired; the noise-reduced first sub-image corresponding to the static area is obtained according to the sub-images of the static area in the at least two frames; the noise-reduced second sub-image corresponding to the moving area is obtained according to the sub-image of the moving area in one frame of candidate image to be processed; and the first sub-image and the second sub-image are spliced, so that a noise-reduced, high-quality target image can be obtained.
Referring to fig. 3, a flowchart of an implementation of an image processing method according to a second embodiment of the present application, where the image processing method is applied to a terminal device integrated with a camera, as shown in the figure, the image processing method may include the following steps:
step 301, when the camera is in the professional mode, if a photographing instruction is received, acquiring an exposure mode of the camera.
The professional mode is a mode in which a user sets photographing parameters according to actual requirements, and the photographing parameters include, but are not limited to, a photometry mode, photosensitivity, exposure time, exposure compensation, a focusing mode, white balance, and the like.
In this embodiment of the present application, when the terminal device detects that the camera application has been started, it may detect whether the camera is in the professional mode. If the camera is in the professional mode, it detects whether a photographing instruction is received; if a photographing instruction is received, it acquires the exposure mode of the camera; otherwise it continues to check for a photographing instruction until one is received or the camera exits the professional mode. The camera may be switched to the professional mode from another photographing mode, or may enter the professional mode from the default photographing mode when the camera application is started, which is not limited here. Likewise, the camera may later be switched from the professional mode to another photographing mode, or may be turned off, which is not limited here.
Step 302, if the exposure mode of the camera is automatic exposure, or if the exposure mode of the camera is manual exposure and the exposure time of the camera is less than a time threshold, control the camera to continuously capture at least two frames of images to be processed.
Wherein, automatic exposure means that the camera replaces manual operation, and exposure time, aperture, photosensitivity and the like are automatically adjusted to control exposure. Manual exposure means that the exposure can be controlled by manually setting an aperture, exposure time, sensitivity, and the like. The time threshold may be set based on empirical values, for example 0.03 seconds.
It should be noted that, the number of at least two frames of images to be processed continuously captured in step 302 may be specifically set according to an empirical value, for example, six frames.
Optionally, the embodiment of the present application further includes:
if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to the time threshold, the camera is controlled to shoot a frame of image to be processed.
In the embodiment of the present application, when the exposure mode of the camera is manual exposure, the exposure time of the camera can be obtained and compared with the time threshold. If the exposure time is less than the time threshold, the camera is controlled to continuously shoot at least two frames of images to be processed, and a high-quality image can be output based on them; if the exposure time is greater than or equal to the time threshold, the camera may be controlled to shoot a single frame of image to be processed and output it, in order to improve the user's photographing experience.
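The branching of steps 301-302 and the single-frame branch above can be sketched as a small decision function; the 0.03-second threshold and six-frame burst are the example values mentioned in the text, while the function name and string constants are illustrative assumptions:

```python
def frames_to_capture(exposure_mode, exposure_time=None,
                      time_threshold=0.03, burst=6):
    """Decide how many frames the camera should shoot in professional mode.

    Example values from the text: time_threshold = 0.03 s, burst = 6.
    """
    if exposure_mode == "auto":
        return burst                      # automatic exposure: multi-frame
    if exposure_mode == "manual" and exposure_time < time_threshold:
        return burst                      # short manual exposure: multi-frame
    return 1                              # long manual exposure: single frame
```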
Step 303, acquiring a still area and a moving area of at least two frames of images to be processed.
The step is the same as step 102, and the detailed description of step 102 is omitted here.
And step 304, carrying out noise reduction processing on the static area according to the sub-images of the static area in at least two frames of images to be processed respectively, and obtaining a first sub-image.
The step is the same as step 103, and specific reference may be made to the related description of step 103, which is not repeated here.
And 305, carrying out noise reduction processing on the motion region according to the sub-image of the motion region in the frame of candidate to-be-processed image, and obtaining a second sub-image.
The step is the same as step 104, and the detailed description of step 104 is omitted here.
And 306, splicing the first sub-image and the second sub-image to obtain a target image.
This step is the same as step 105, and specific reference may be made to the description related to step 105, which is not repeated here.
According to the embodiment of the application, when the camera is in the professional mode, the photographing frame number of the camera is adaptively controlled according to the exposure mode and the exposure time, so that the photographing experience of a user can be improved while a high-quality image is obtained.
Referring to fig. 4, a flowchart of an implementation of an image processing method according to a third embodiment of the present application, where the image processing method is applied to a terminal device integrated with a camera, as shown in the figure, the image processing method may include the following steps:
step 401, when a continuous shooting instruction is received, controlling a camera to continuously shoot M frames of alternative images, and storing the M frames of alternative images in a preset buffer area.
Where M = L + N - 1, L is an integer greater than 1, and N is an odd number greater than 1. The continuous shooting instruction is used for instructing the camera to output at least two frames of continuously shot images.
In the embodiment of the application, when a continuous shooting instruction is received, a camera is controlled to continuously shoot M frames of alternative images, and in the shooting process, the shot alternative images are sequentially stored in a preset cache area.
The M frame candidate images are obtained from a photographing data stream. Compared with a preview data stream, the photographing data stream ensures that the captured images are full-size images, effectively retaining image information.
Step 402, dividing the M frame candidate images into L image groups, where one image group includes N frame candidate images.
The N frame candidate images are continuous in shooting time, one image group finally outputs one frame of target image, and L image groups finally output L frame of target images, namely the L frame of target images are continuous shooting images output by the camera.
Optionally, dividing the M-frame candidate images into L image groups includes:
taking the (N+1)/2-th frame of the M frame candidate images as a reference frame, and arranging the M frame candidate images according to shooting time;

taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame, and determining that the reference frame and the reference images form one image group;

taking the next frame image of the reference frame in the M frame candidate images as the reference frame, and returning to execute the steps of taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form one image group, until the M frame candidate images are traversed, so as to obtain L image groups.
In the embodiment of the application, the M frame candidate images are arranged according to shooting time; for example, the first frame candidate image is the first frame in the arrangement, and the M-th frame candidate image is the last frame.
For example, five frame candidate images B1, B2, B3, B4, and B5 are arranged according to shooting time, where B1 is the first frame candidate image and B5 is the fifth. First, B2 is taken as a reference frame, with B1 and B3 as its reference images, so B1, B2, and B3 constitute one image group; then B3 is taken as a reference frame, with B2 and B4 as its reference images, so B2, B3, and B4 constitute one image group; finally, B4 is taken as a reference frame, with B3 and B5 as its reference images, so B3, B4, and B5 constitute one image group. Three image groups are obtained in total, each including three frame candidate images.
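The sliding-window grouping illustrated by the B1 to B5 example can be sketched as below; the function name is an assumption for illustration:

```python
def group_candidates(frames, n):
    # Split M = L + N - 1 time-ordered candidate frames into L overlapping
    # groups of N frames each; the centre of each group is its reference
    # frame, flanked by (N - 1) / 2 reference images on either side.
    assert n > 1 and n % 2 == 1, "N must be an odd number greater than 1"
    half = (n - 1) // 2
    return [frames[i - half:i + half + 1]
            for i in range(half, len(frames) - half)]

groups = group_candidates(["B1", "B2", "B3", "B4", "B5"], 3)
# three groups: [B1, B2, B3], [B2, B3, B4], [B3, B4, B5]
```

Note that L = M - N + 1 groups come out automatically, matching M = L + N - 1 above.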
Step 403, obtaining at least two frames of images to be processed from the N frames of candidate images in each image group.
For one image group, at least two frames of images to be processed corresponding to the image group can be obtained from N frames of alternative images in the image group.
Optionally, after dividing the M-frame candidate images into L image groups, the method further includes:
determining whether the reference frame in each image group meets an image synthesis condition, where the reference frame meets the image synthesis condition if an image similar to the reference frame exists among the N-1 frame reference images of the image group, and does not meet the image synthesis condition if no such similar image exists;
correspondingly, acquiring at least two frames of images to be processed from the N frames of candidate images in each image group comprises:
and if the reference frames in each image group meet the image synthesis conditions, determining the reference frames in each image group and the reference images similar to the reference frames as at least two frames of images to be processed.
In the embodiment of the application, for any image group, the similarity between the reference frame of the image group and each of its N-1 frame reference images can be obtained, and the obtained N-1 similarities are compared with a similarity threshold respectively. If a reference image with a similarity greater than the similarity threshold exists, it is determined that the reference frame of the image group meets the image synthesis condition; if no reference image with a similarity greater than the similarity threshold exists, it is determined that the reference frame does not meet the image synthesis condition.
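One way to realize this check is sketched below. Normalized cross-correlation and the 0.9 threshold are illustrative choices only; the application does not prescribe a particular similarity measure:

```python
import numpy as np

def similar_references(reference, reference_images, sim_threshold=0.9):
    # Return the reference images whose similarity to the reference frame
    # exceeds the threshold; an empty result means the image synthesis
    # condition is not met and the reference frame is output on its own.
    def similarity(a, b):
        a = a.astype(np.float64).ravel() - a.mean()
        b = b.astype(np.float64).ravel() - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return a @ b / denom if denom else 0.0
    return [img for img in reference_images
            if similarity(reference, img) > sim_threshold]
```

Because the measure subtracts the mean, a pure brightness shift between frames does not reduce the similarity, while uncorrelated or anti-correlated content does.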
Optionally, the embodiment of the present application further includes:
and if the reference frame in each image group does not meet the image synthesis condition, taking the reference frame in each image group as a target image of each image group.
Step 404, acquiring the static area and the motion area of the at least two frames of images to be processed.

This step is the same as step 102; for details, reference may be made to the related description of step 102, which is not repeated here.

Step 405, performing noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image.

This step is the same as step 103; for details, reference may be made to the related description of step 103, which is not repeated here.

Step 406, performing noise reduction processing on the motion area according to the sub-image of the motion area in a candidate image to be processed, to obtain a second sub-image.

This step is the same as step 104; for details, reference may be made to the related description of step 104, which is not repeated here.

Step 407, stitching the first sub-image and the second sub-image to obtain the target image.

This step is the same as step 105; for details, reference may be made to the related description of step 105, which is not repeated here.
According to the embodiment of the application, when the camera continuously shoots images, the images to be processed corresponding to each image group are obtained from the multi-frame candidate images continuously shot by the camera, and when the images to be processed contain a static area and a motion area, noise reduction processing is performed on the static area and the motion area in different manners, so that continuously shot high-quality images with reduced noise can be obtained.
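Steps 404 to 407 taken together can be sketched as follows. Temporal averaging for the static area and a 3x3 box filter for the motion area are stand-in denoisers for steps 103 and 104, whose concrete filters are not fixed in this excerpt; the function name and mask convention are assumptions:

```python
import numpy as np

def fuse_denoise(frames, candidate_idx, motion_mask):
    # frames: list of aligned 2-D arrays; motion_mask: boolean 2-D array,
    # True where the motion area was detected.
    stack = np.stack([f.astype(np.float64) for f in frames])
    static_sub = stack.mean(axis=0)        # first sub-image (static area)
    cand = stack[candidate_idx]            # candidate frame with the motion
    pad = np.pad(cand, 1, mode="edge")     # 3x3 box filter as a simple
    h, w = cand.shape                      # single-frame spatial denoiser
    motion_sub = sum(pad[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    # Stitch: motion area from the second sub-image, rest from the first.
    return np.where(motion_mask, motion_sub, static_sub)
```

Averaging N frames suppresses temporal noise in the static area without ghosting, while the motion area, taken from a single candidate frame, avoids motion blur at the cost of relying on a spatial filter alone.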
Referring to fig. 5, which is a schematic structural diagram of an image processing apparatus provided in the fourth embodiment of the present application, only a portion related to the embodiment of the present application is shown for convenience of explanation.
The image processing device includes:
an image acquisition module 51, configured to acquire at least two frames of images to be processed that are continuously shot;

a region acquisition module 52, configured to acquire the static area and the motion area of the at least two frames of images to be processed;

a first noise reduction module 53, configured to perform noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image, where the first sub-image is a noise-reduced sub-image corresponding to the static area;

a second noise reduction module 54, configured to perform noise reduction processing on the motion area according to the sub-image of the motion area in a candidate image to be processed, to obtain a second sub-image, where the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate image to be processed is an image, among the at least two frames of images to be processed, in which the motion area exists;

and a target acquisition module 55, configured to stitch the first sub-image and the second sub-image to obtain the target image.
Optionally, the image processing apparatus further includes:
The mode acquisition module is used for acquiring an exposure mode of the camera if a photographing instruction is received when the camera is in a professional mode;
correspondingly, the image acquisition module is specifically configured to:
if the exposure mode of the camera is automatic exposure, or the exposure mode of the camera is manual exposure and the exposure time of the camera is less than a time threshold, controlling the camera to continuously shoot at least two frames of images to be processed.
Optionally, the image processing apparatus further includes:
and the camera control module is used for controlling the camera to shoot a frame of image to be processed if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to a time threshold value.
Optionally, the image processing apparatus further includes:
the image storage module is used for controlling the camera to continuously shoot M frames of alternative images when a continuous shooting instruction is received, and storing the M frames of alternative images in a preset buffer area, wherein M=L+N-1, L is an integer greater than 1, and N is an odd number greater than 1;
the image dividing module is used for dividing the M frame candidate images into L image groups, wherein one image group comprises N frame candidate images, and the N frame candidate images are continuous in shooting time;
correspondingly, the image acquisition module is specifically configured to:
and acquiring at least two frames of images to be processed from the N frames of alternative images in each image group.
Optionally, the image dividing module is specifically configured to:
taking the (N+1)/2-th frame of the M frame candidate images as a reference frame, and arranging the M frame candidate images according to shooting time;

taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame, and determining that the reference frame and the reference images form one image group;

taking the next frame image of the reference frame in the M frame candidate images as the reference frame, and returning to execute the steps of taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form one image group, until the M frame candidate images are traversed, so as to obtain L image groups.
Optionally, the image processing apparatus further includes:
a synthesis judgment module, configured to determine whether the reference frame in each image group meets an image synthesis condition, where the reference frame meets the image synthesis condition if an image similar to the reference frame exists among the N-1 frame reference images of the image group, and does not meet the image synthesis condition if no such similar image exists;
Correspondingly, the image acquisition module is specifically configured to:
and if the reference frames in each image group meet the image synthesis conditions, determining the reference frames in each image group and the reference images similar to the reference frames as at least two frames of images to be processed.
Optionally, the image processing apparatus further includes:
and the image determining module is used for taking the reference frame in each image group as a target image of each image group if the reference frame in each image group does not meet the image synthesis condition.
The image processing device provided in the embodiment of the present application may be applied to the foregoing method embodiment, and details refer to descriptions of the foregoing method embodiment, which are not repeated herein.
Fig. 6 is a schematic structural diagram of a terminal device provided in a fifth embodiment of the present application. The terminal device as shown in the figure may include: one or more processors 601 (only one shown in the figure); one or more input devices 602 (only one shown in the figure), one or more output devices 603 (only one shown in the figure), and a memory 604. The processor 601, input device 602, output device 603, and memory 604 are connected by a bus 605. The memory 604 is used for storing instructions, and the processor 601 is used for implementing the steps in the embodiments of the image processing method described above when executing the instructions stored in the memory 604.
It should be appreciated that in embodiments of the present application, the processor 601 may be a central processing unit (Central Processing Unit, CPU), which may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The input device 602 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of a fingerprint), a microphone, a data receiving interface, and the like. The output device 603 may include a display (LCD, etc.), a speaker, a data transmission interface, etc.
The memory 604 may include read only memory and random access memory and provides instructions and data to the processor 601. A portion of memory 604 may also include non-volatile random access memory. For example, the memory 604 may also store information of device type.
In a specific implementation, the processor 601, the input device 602, the output device 603, and the memory 604 described in the embodiments of the present application may perform the implementation described in the embodiments of the image processing method provided in the embodiments of the present application, or may perform the implementation described in the image processing apparatus described in the fourth embodiment, which is not repeated herein.
Fig. 7 is a schematic structural diagram of a terminal device provided in a sixth embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: a processor 70, a memory 71, and a computer program 72 stored in the memory 71 and executable on the processor 70. The processor 70, when executing the computer program 72, implements the steps of the various image processing method embodiments described above.
By way of example, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments describing the execution of the computer program 72 in the terminal device 7. For example, the computer program 72 may be divided into an image acquisition module, a region acquisition module, a first noise reduction module, a second noise reduction module, and a target acquisition module, the specific functions of which are as described in the foregoing embodiments.
The terminal device 7 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, the processor 70 and the memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 7 and does not constitute a limitation of the terminal device 7, which may include more or fewer components than illustrated, combine certain components, or have different components; for example, the terminal device may further include an input-output device, a network access device, a bus, etc.
The processor 70 may be a central processing unit, CPU, or other general purpose processor, digital signal processor, DSP, application specific integrated circuit, ASIC, off-the-shelf programmable gate array, FPGA, or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing the computer program as well as other programs and data required by the terminal device. The memory 71 may also be used for temporarily storing data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis. For parts that are not described or detailed in a certain embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, where the computer program may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be appropriately added or deleted according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. An image processing method, characterized in that the image processing method comprises:
acquiring at least two frames of images to be processed which are continuously shot;
acquiring a static area and a motion area of the at least two frames of images to be processed;

performing noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;
according to the sub-image of the motion area in a frame of candidate to-be-processed image, carrying out noise reduction processing on the motion area to obtain a second sub-image, wherein the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate to-be-processed image is a to-be-processed image with the motion area in the at least two frames of to-be-processed images;
And splicing the first sub-image and the second sub-image to obtain a target image.
2. The image processing method according to claim 1, further comprising, before acquiring at least two frames of images to be processed that are consecutively photographed:
when a camera is in a professional mode, if a photographing instruction is received, acquiring an exposure mode of the camera;
correspondingly, the acquiring at least two frames of images to be processed, which are continuously shot, comprises:
and if the exposure mode of the camera is automatic exposure or the exposure mode of the camera is manual exposure and the exposure time of the camera is smaller than a time threshold, controlling the camera to continuously shoot at least two frames of images to be processed.
3. The image processing method according to claim 2, characterized in that the image processing method further comprises:
and if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to a time threshold, controlling the camera to shoot a frame of image to be processed.
4. The image processing method according to claim 1, further comprising, before acquiring at least two frames of images to be processed that are consecutively photographed:
when a continuous shooting instruction is received, controlling a camera to continuously shoot M frame alternative images, and storing the M frame alternative images in a preset buffer area, wherein M=L+N-1, L is an integer greater than 1, and N is an odd number greater than 1;
Dividing the M frame candidate images into L image groups, wherein one image group comprises N frame candidate images, and the N frame candidate images are continuous in shooting time;
correspondingly, the acquiring at least two frames of images to be processed, which are continuously shot, comprises:
and acquiring the at least two frames of images to be processed from the N frames of candidate images in each image group.
5. The image processing method of claim 4, wherein the dividing the M-frame candidate images into L image groups comprises:
taking the (N+1)/2-th frame of the M frame candidate images as a reference frame, and arranging the M frame candidate images according to shooting time;

taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame, and determining that the reference frame and the reference images form one image group;

taking the next frame image of the reference frame in the M frame candidate images as the reference frame, and returning to execute the steps of taking the (N-1)/2 frame candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frame candidate images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form one image group, until the M frame candidate images are traversed, so as to obtain L image groups.
6. The image processing method according to claim 5, further comprising, after dividing the M-frame candidate image into L image groups:
judging whether the reference frames in each image group meet the image synthesis conditions or not, wherein the reference frames in each image group meet the image synthesis conditions, namely that images similar to the reference frames exist in N-1 frame reference images of each image group, and the reference frames in each image group do not meet the image synthesis conditions, namely that images similar to the reference frames do not exist in N-1 frame reference images of each image group;
correspondingly, the acquiring the at least two frames of images to be processed from the N frames of candidate images in each image group comprises:
and if the reference frames in each image group meet the image synthesis conditions, determining the reference frames in each image group and the reference images similar to the reference frames as the at least two frames of images to be processed.
7. The image processing method according to claim 6, wherein the image processing method further comprises:
and if the reference frame in each image group does not meet the image synthesis condition, taking the reference frame in each image group as a target image of each image group.
8. An image processing apparatus, characterized in that the image processing apparatus comprises:
the image acquisition module is used for acquiring at least two frames of images to be processed which are continuously shot;
the region acquisition module is used for acquiring the static area and the motion area of the at least two frames of images to be processed;

the first noise reduction module is used for carrying out noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;

the second noise reduction module is used for carrying out noise reduction processing on the motion area according to the sub-image of the motion area in a candidate image to be processed, to obtain a second sub-image, wherein the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate image to be processed is an image, among the at least two frames of images to be processed, in which the motion area exists;
And the target acquisition module is used for splicing the first sub-image and the second sub-image to obtain a target image.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the image processing method according to any one of claims 1 to 7 when the computer program is executed.
10. A computer-readable storage medium storing a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the image processing method according to any one of claims 1 to 7.
CN202010657947.5A 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer-readable storage medium Active CN111815531B (en)

Publications (2)

Publication Number Publication Date
CN111815531A CN111815531A (en) 2020-10-23
CN111815531B true CN111815531B (en) 2024-03-01

Family

ID=72842074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010657947.5A Active CN111815531B (en) 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN111815531B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117541497A (en) * 2023-10-19 2024-02-09 惠州Tcl云创科技有限公司 Image processing method, device, storage medium and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9756249B1 (en) * 2016-04-27 2017-09-05 Gopro, Inc. Electronic image stabilization frequency estimator
WO2017205492A1 (en) * 2016-05-25 2017-11-30 Gopro, Inc. Three-dimensional noise reduction
CN107872623A (en) * 2017-12-22 2018-04-03 维沃移动通信有限公司 An image pickup method, mobile terminal and computer-readable storage medium
CN108616687A (en) * 2018-03-23 2018-10-02 维沃移动通信有限公司 A photographing method, device and mobile terminal
CN109120862A (en) * 2018-10-15 2019-01-01 Oppo广东移动通信有限公司 High dynamic range image acquisition method and device and mobile terminal
CN109474787A (en) * 2018-12-28 2019-03-15 维沃移动通信有限公司 A photographing method, terminal device and storage medium
CN111062881A (en) * 2019-11-20 2020-04-24 RealMe重庆移动通信有限公司 Image processing method and device, storage medium and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070248330A1 (en) * 2006-04-06 2007-10-25 Pillman Bruce H Varying camera self-determination based on subject motion
US8428308B2 (en) * 2011-02-04 2013-04-23 Apple Inc. Estimating subject motion for capture setting determination
US8379934B2 (en) * 2011-02-04 2013-02-19 Eastman Kodak Company Estimating subject motion between image frames
US10491832B2 (en) * 2017-08-16 2019-11-26 Qualcomm Incorporated Image capture device with stabilized exposure or white balance
US10643308B2 (en) * 2017-10-11 2020-05-05 Gopro, Inc. Double non-local means denoising
US20200134791A1 (en) * 2018-10-27 2020-04-30 BARS Imaging LLC Spatio-temporal differential synthesis of detail images for high dynamic range imaging


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Minimum Error Seam-Based Efficient Panorama Video Stitching Method Robust to Parallax; JEONHO KANG et al.; IEEE Access; full text *
Simulation of Dynamic Segmentation Based on Multi-Reference-Frame Edge Differences in 3D Motion Images; Li Bo et al.; Computer Simulation (Issue 06); full text *
Research on Moving Target Detection and Tracking Based on Image Fusion; Liu Xiujin et al.; Mechanical Engineering & Automation (Issue 04); full text *
Noise Processing in Multi-Exposure High Dynamic Range Image Synthesis; Liu Zong et al.; Electronic Science and Technology; 2016-11-15 (Issue 11); full text *
Moving Target Detection Combining Improved Background Subtraction with Five-Frame Differencing; Pan Zhengrong et al.; Automation & Instrumentation (Issue 07); full text *


Similar Documents

Publication Publication Date Title
CN111726533B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
WO2020171373A1 (en) Techniques for convolutional neural network-based multi-exposure fusion of multiple image frames and for deblurring multiple image frames
CN108322646B (en) Image processing method, image processing device, storage medium and electronic equipment
CN111028189A (en) Image processing method, device, storage medium and electronic device
US12307683B2 (en) Subject detecting method and device, electronic device, and non-transitory computer-readable storage medium
CN112887602B (en) Camera switching method, device, storage medium and electronic device
CN110084765B (en) An image processing method, an image processing device and a terminal device
CN108737739B (en) Preview picture acquisition method, preview picture acquisition device and electronic equipment
CN110335216B (en) Image processing method, image processing apparatus, terminal device, and readable storage medium
CN109040603A (en) High dynamic range image acquisition method, device and mobile terminal
EP2569934A1 (en) Imaging apparatus, image processing method, and recording medium for recording program thereon
CN106231200B (en) A photographing method and device
CN108513069B (en) Image processing method, image processing device, storage medium and electronic equipment
CN105049695A (en) Video recording method and device
CN108492266A (en) Image processing method, image processing device, storage medium and electronic equipment
CN114390201A (en) Focusing method and device thereof
CN111815531B (en) Image processing method, device, terminal equipment and computer-readable storage medium
CN112367464A (en) Image output method and device and electronic equipment
CN111340722A (en) Image processing method, processing device, terminal device and readable storage medium
CN114723603B (en) Image processing method, image processing device and storage medium
CN114390197A (en) Shooting method and device, electronic equipment and readable storage medium
CN112367470B (en) Image processing method and device and electronic equipment
CN111754411A (en) Image noise reduction method, image noise reduction device and terminal equipment
CN111340736B (en) Image processing method, device, storage medium and electronic equipment
CN115797160A (en) Image generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant