
WO2020029679A1 - Control method, apparatus, imaging device, electronic device and readable storage medium - Google Patents

Control method, apparatus, imaging device, electronic device and readable storage medium

Info

Publication number
WO2020029679A1
WO2020029679A1 · PCT/CN2019/090629 · CN2019090629W
Authority
WO
WIPO (PCT)
Prior art keywords
image
black
exposure
white
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/090629
Other languages
English (en)
French (fr)
Inventor
张弓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of WO2020029679A1 publication Critical patent/WO2020029679A1/zh
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 - Camera processing pipelines; Components thereof
    • H04N 23/70 - Circuitry for compensating brightness variation in the scene
    • H04N 23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/741 - Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/265 - Mixing

Definitions

  • the present disclosure relates to the technical field of electronic devices, and in particular, to a control method, an apparatus, an imaging device, an electronic device, and a readable storage medium.
  • In this way, the edges of moving objects are prevented from being dislocated or blurred.
  • the present disclosure proposes a control method, device, imaging device, electronic device, and readable storage medium, so as to simultaneously retain the high dynamic range of the target image, as well as the edges and details of the target image, thereby avoiding misalignment or blurring of the edges of the moving object.
  • An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device, where the imaging device includes a color camera and a black and white camera, and the control method includes:
  • The control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • An embodiment of still another aspect of the present disclosure provides a control device applied to an imaging device, where the imaging device includes a color camera and a black and white camera, and the control device includes:
  • a first control module configured to control the color camera to capture a color image with a high dynamic range
  • a second control module configured to control the black and white camera to obtain a black and white image with a low dynamic range
  • a synthesis module is configured to synthesize the black and white image and the color image to obtain a target image.
  • The control device first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • An embodiment of another aspect of the present disclosure provides an imaging device.
  • the imaging device includes a color camera and a black and white camera.
  • the imaging device further includes a processor.
  • the processor is configured to:
  • The imaging device of the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • An embodiment of another aspect of the present disclosure provides an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor.
  • When the processor executes the program, the control method proposed in the foregoing embodiment of the present disclosure is implemented.
  • An embodiment of another aspect of the present disclosure provides a computer-readable storage medium having a computer program stored thereon, which is characterized in that when the program is executed by a processor, the control method according to the foregoing embodiment of the present disclosure is implemented.
  • FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure
  • FIG. 2 is a schematic flowchart of a control method provided in Embodiment 2 of the present disclosure
  • FIG. 3 is a schematic flowchart of a control method according to a third embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of an image captured by a color camera according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of a color image in an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of a black and white image in an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of an adjusted black and white image in an embodiment of the present disclosure.
  • FIG. 8 is a schematic flowchart of a control method according to a fourth embodiment of the present disclosure.
  • FIG. 9 is a schematic flowchart of a control method provided in Embodiment 5 of the present disclosure.
  • FIG. 10 is a schematic flowchart of a control method provided in Embodiment 6 of the present disclosure.
  • FIG. 11 is a schematic structural diagram of a control device according to a seventh embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of a control device according to an eighth embodiment of the present disclosure.
  • FIG. 13 is a schematic structural diagram of an electronic device according to a ninth embodiment of the present disclosure.
  • FIG. 14 is a schematic block diagram of an electronic device according to some embodiments of the present disclosure.
  • FIG. 15 is a schematic block diagram of an image processing circuit according to some embodiments of the present disclosure.
  • The present disclosure mainly aims to provide a control method addressing the technical problem in the prior art that, after two differently exposed frames are synthesized by image synthesis technology, details of the synthesized image may be lost or blurred.
  • The control method in the embodiment of the present disclosure first obtains a high dynamic range color image by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • FIG. 1 is a schematic flowchart of a control method according to a first embodiment of the present disclosure.
  • the control method in the embodiment of the present disclosure is applied to an imaging device.
  • the imaging device includes a color camera and a black and white camera.
  • control method includes the following steps:
  • step 101 a color camera is controlled to obtain a color image with a high dynamic range.
  • In order to reduce noise in the image and thereby improve its sharpness and contrast, in the present disclosure the color camera can be controlled to obtain a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality.
  • step 102 a black and white camera is controlled to obtain a black and white image with a low dynamic range.
  • the black and white camera can be controlled to obtain a black and white image with a low dynamic range, so that the black and white image can better retain the edges and details of the image.
  • step 103 a black and white image and a color image are synthesized to obtain a target image.
  • a black and white image and a color image may be synthesized to obtain a target image. Therefore, the high dynamic range of the target image, as well as the edges and details of the target image can be retained at the same time, thereby avoiding misalignment or blurring of the edges of the moving object, and improving the user's shooting experience.
  • For example, each pixel in the color image may be determined, and the pixel in the black and white image corresponding to each pixel in the color image may be determined. Then, using the black and white image as the main body, the color value of each pixel in the color image is applied to the corresponding pixel in the black and white image to obtain the target image.
  • Alternatively, each pixel in the color image may be determined, and the pixel in the black and white image corresponding to each pixel in the color image may be determined. Then, using the color image as the main body, the light intensity of each pixel in the black and white image is compensated to the corresponding pixel in the color image to obtain the target image.
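The light-intensity compensation above can be sketched in a few lines. This is a minimal illustrative sketch, not the disclosed implementation: the function names, the Rec.601 luminance weights, and the gain formula are assumptions made for the example, and the two images are assumed to be already pixel-aligned and the same size.

```python
def fuse_pixel(color_rgb, mono_luma):
    """Fuse one aligned pixel pair: keep the color image's chroma and
    compensate its brightness with the black-and-white pixel's light
    intensity (illustrative formula, not the patent's exact method)."""
    r, g, b = color_rgb
    # Approximate luminance of the color pixel (Rec.601 weights).
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    # Scale the channels so their luminance matches the mono intensity;
    # +1 avoids division by zero for pure black pixels.
    gain = (mono_luma + 1.0) / (luma + 1.0)
    return tuple(min(255, round(c * gain)) for c in (r, g, b))

def synthesize(color_img, mono_img):
    """Apply fuse_pixel over two same-size images stored as 2-D lists
    of pixels; assumes a one-to-one pixel correspondence."""
    return [[fuse_pixel(c, m) for c, m in zip(crow, mrow)]
            for crow, mrow in zip(color_img, mono_img)]
```

A gray pixel whose luminance already equals the mono intensity passes through unchanged, while a darker or brighter mono pixel pulls the color pixel's brightness toward the black-and-white measurement.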
  • The control method in the embodiment of the present disclosure first obtains a high dynamic range color image by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image with a low dynamic range, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • the angle of view of the black and white image may be adjusted according to the angle of view difference between the color camera and the black and white camera. The above process is described in detail below with reference to FIG. 2.
  • FIG. 2 is a schematic flowchart of a control method provided in Embodiment 2 of the present disclosure.
  • control method may include the following steps:
  • step 201 a color camera is controlled to obtain a color image with a high dynamic range.
  • step 202 a black and white camera is controlled to obtain a black and white image with a low dynamic range.
  • For the execution process of steps 201 to 202, refer to the execution process of steps 101 to 102 in the foregoing embodiment; details are not described herein.
  • Step 203 Adjust the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera.
  • the angle of view of the black and white image may be adjusted according to the angle of view difference between the color camera and the black and white camera.
  • the viewing angle adjustment may be performed only for an image area with the same screen content in a black and white image and a color image.
  • Specifically, an image area with the same screen content as that of the color image may be retained from the black and white image, and then the pixels in the image area may be adjusted to obtain an adjusted black and white image, wherein the pixels in the adjusted black and white image correspond one-to-one to the pixels in the color image.
  • image recognition technology can be used to retain the image area with the same screen content as that of the color image from the black and white image according to the difference in perspective.
  • Specifically, according to the difference in perspective, feature extraction can be performed on the picture content in the black and white image and on the picture content in the color image. The extracted features are then compared to determine the image area in which the picture content of the black and white image is the same as that of the color image, and the pixels in that area are adjusted so that the angle of view and position of each pixel in the adjusted black and white image correspond one-to-one with the pixels in the color image.
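In the simplest case, when the baseline between the two cameras reduces the view-angle difference to a roughly constant horizontal pixel offset, retaining the shared image area amounts to a crop. The sketch below makes that simplifying assumption; `crop_common_area` and `offset_x` are hypothetical names, and a real pipeline would use feature matching as described above.

```python
def crop_common_area(mono_img, color_width, offset_x):
    """Keep only the black-and-white image region whose picture content
    also appears in the color image, assuming the view-angle difference
    reduces to a horizontal offset of offset_x pixels (offset_x >= 0).

    mono_img is a 2-D list of pixel rows; after the crop, columns
    [offset_x, offset_x + color_width) line up one-to-one with the
    color image's columns."""
    return [row[offset_x:offset_x + color_width] for row in mono_img]
```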
  • adjusting the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera can improve the matching between the adjusted black and white image and the color image, thereby improving the quality of subsequent composite images.
  • step 204 the adjusted black and white image and a color image are synthesized to obtain a target image.
  • the adjusted black and white image and the color image may be synthesized to obtain a target image. Therefore, the high dynamic range of the target image, as well as the edges and details of the target image can be retained at the same time, thereby avoiding misalignment or blurring of the edges of the moving object, and improving the user's shooting experience.
  • For example, each pixel in the color image can be determined, and the pixel in the adjusted black and white image corresponding to each pixel in the color image can be determined. Then, using the black and white image as the main body, the color value of each pixel in the color image is applied to the corresponding pixel in the adjusted black and white image to obtain the target image.
  • Alternatively, each pixel in the color image may be determined, and the pixel in the adjusted black and white image corresponding to each pixel in the color image may be determined. Then, using the color image as the main body, the light intensity of each pixel in the adjusted black and white image is compensated to the corresponding pixel in the color image to obtain the target image.
  • the control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling a color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black-and-white camera to obtain a black-and-white image with a low dynamic range, the edges and details of the image can be better preserved. Then, by adjusting the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera, the matching between the adjusted black and white image and the color image can be improved, thereby improving the quality of subsequent composite images.
  • Finally, the target image is obtained by combining the adjusted black and white image and the color image, which can simultaneously retain the high dynamic range of the target image and its edges and details, avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • FIG. 3 is a schematic flowchart of a control method provided in Embodiment 3 of the present disclosure.
  • control method may include the following steps:
  • Step 301 Control a color camera to shoot according to the long exposure time, the short exposure time, and the middle exposure time to obtain a color image.
  • The long exposure time is longer than the middle exposure time, and the middle exposure time is longer than the short exposure time, that is, long exposure time > middle exposure time > short exposure time.
  • the long exposure duration, short exposure duration, and medium exposure duration can be preset in a built-in program of the electronic device, or can be set by a user, which is not limited.
  • the color camera can be controlled to shoot with a long exposure time, a short exposure time, and a medium exposure time, respectively, to obtain at least one long exposure image, at least one middle exposure image, and at least one short exposure image, for example, see FIG. 4 is a schematic diagram of an image obtained by a color camera according to an embodiment of the present disclosure.
  • For example, the color camera collects one frame of long exposure image, one frame of middle exposure image, and one frame of short exposure image, respectively.
  • a color image can be obtained according to at least two frames of the acquired images. For example, three frames of images in FIG. 4 may be synthesized, and the obtained color image may be as shown in FIG. 5.
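Per pixel, one way to merge the three frames of FIG. 4 is to normalize each sample by its exposure time and average the estimates with weights that distrust clipped values. This is a generic exposure-fusion sketch under assumed exposure times and weighting, not the specific synthesis used in the disclosure.

```python
def merge_exposures(long_px, mid_px, short_px,
                    t_long=4.0, t_mid=1.0, t_short=0.25):
    """Merge one pixel from the long-, middle- and short-exposure frames
    into a single high-dynamic-range radiance estimate.  The triangle
    weighting and the exposure times are illustrative assumptions."""
    def weight(v):
        # Trust mid-range samples; distrust values near 0 or 255.
        return max(1e-3, 1.0 - abs(v - 127.5) / 127.5)

    samples = ((long_px, t_long), (mid_px, t_mid), (short_px, t_short))
    num = sum(weight(v) * (v / t) for v, t in samples)  # radiance votes
    den = sum(weight(v) for v, t in samples)
    return num / den  # linear value, not clipped to 8 bits
```

When none of the samples is clipped, all three estimate the same radiance and the weights barely matter; when only the short frame keeps a highlight, its vote dominates.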
  • Step 302 Control the black-and-white camera to shoot with a medium exposure time to obtain a black-and-white image with a low dynamic range.
  • Specifically, the black and white camera may be controlled to shoot with the medium exposure time to obtain a black and white image with a low dynamic range.
  • a black-and-white image obtained by a black-and-white camera may be shown in FIG. 6.
  • step 303 an image area with the same screen content as that of the color image is retained from the black and white image according to the difference in viewing angle.
  • the viewing angle adjustment can be performed only for the image area with the same screen content in the black and white image and the color image. Specifically, an image area with the same screen content as that of the color image may be retained from the black and white image according to a difference in the viewing angle between the color camera and the black and white camera.
  • image recognition technology can be used to retain the image area with the same screen content as that of the color image from the black and white image according to the difference in viewing angle.
  • Specifically, according to the difference in viewing angle, feature extraction can be performed on the picture content in the black and white image and on the picture content in the color image. The extracted features are then compared, so that the image region in which the picture content of the black and white image is the same as that of the color image can be determined.
  • Step 304 Adjust the pixels in the image area to obtain an adjusted black-and-white image.
  • the pixels in the adjusted black-and-white image correspond to the pixels in the color image on a one-to-one basis.
  • the pixels in the image area may be adjusted so that the angles of view and positions of the pixels in the adjusted black and white image correspond to the pixels in the color image on a one-to-one basis.
  • FIG. 7 is a schematic diagram of an adjusted black and white image in an embodiment of the present disclosure. It can be seen that the adjusted black and white image matches the color image more closely.
  • Step 305 Combine the adjusted black and white image and the color image to obtain a target image.
  • For the execution process of step 305, refer to the execution process of step 204 in the foregoing embodiment; details are not described herein.
  • In the control method of the embodiment of the present disclosure, a color image with a high dynamic range is first captured by the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality.
  • the black and white image with low dynamic range is then captured by the black and white camera, which can better preserve the edges and details of the image.
  • the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera, the matching between the adjusted black and white image and the color image can be improved, thereby improving the quality of subsequent composite images.
  • Finally, the target image is obtained by combining the adjusted black and white image and the color image, which can simultaneously retain the high dynamic range of the target image and its edges and details, avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
  • In an embodiment of the present disclosure, the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels covered by the same color filter; the multiple photosensitive pixels are used to output original pixel information. Each photosensitive pixel unit includes at least one long exposure pixel, at least one medium exposure pixel, and at least one short exposure pixel.
  • step 301 may specifically include the following sub-steps:
  • Step 401 Control the pixel unit array to output multiple pieces of original pixel information under different exposure times.
  • Specifically, each photosensitive pixel unit in the pixel unit array includes at least one long exposure pixel, at least one medium exposure pixel, and at least one short exposure pixel. A long exposure pixel is a photosensitive pixel whose corresponding exposure time is the long exposure time; a medium exposure pixel is one whose corresponding exposure time is the medium exposure time; and a short exposure pixel is one whose corresponding exposure time is the short exposure time, where long exposure time > medium exposure time > short exposure time. That is, the long exposure time of the long exposure pixel is greater than the medium exposure time of the medium exposure pixel, which in turn is greater than the short exposure time of the short exposure pixel.
  • the long exposure pixels, the middle exposure pixels, and the short exposure pixels are simultaneously exposed.
  • the synchronous exposure means that the exposure duration of the middle exposure pixels and the short exposure pixels is within the exposure duration of the long exposure pixels.
  • For example, the long exposure pixel can be controlled to start exposure first, after which the exposure of the medium and short exposure pixels is started, where the exposure cut-off time of the medium and short exposure pixels is the same as or earlier than the exposure cut-off time of the long exposure pixel. Alternatively, the long, medium, and short exposure pixels are controlled to start exposure at the same time, that is, their exposure start times are identical. In this way, there is no need to control the pixel unit array to perform the long, medium, and short exposures in order, which reduces the shooting time of a color image.
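The two synchronous-exposure schedules just described (start the long exposure first so the medium and short windows end with it, or start all three together) can be expressed as window offsets. The function name and return format below are assumptions for illustration.

```python
def exposure_windows(t_long, t_mid, t_short, align="end"):
    """Return (start, stop) offsets for the long (L), medium (M) and
    short (S) exposures of one synchronous capture, in the same time
    unit as the durations.  The M and S windows always lie inside L.

    align="end"   -> all three exposures stop together;
    align="start" -> all three exposures start together."""
    assert t_long >= t_mid >= t_short > 0
    if align == "start":
        return {"L": (0.0, t_long), "M": (0.0, t_mid), "S": (0.0, t_short)}
    # End-aligned: the long exposure starts first, the others join later.
    return {"L": (0.0, t_long),
            "M": (t_long - t_mid, t_long),
            "S": (t_long - t_short, t_long)}
```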
  • the imaging device first controls the synchronous exposure of the long exposure pixel, the middle exposure pixel, and the short exposure pixel in each photosensitive pixel unit in the pixel unit array.
  • During the first exposure, the exposure time corresponding to the long exposure pixel is the initial long exposure time, the exposure time corresponding to the medium exposure pixel is the initial medium exposure time, and the exposure time corresponding to the short exposure pixel is the initial short exposure time; the initial long, medium, and short exposure times are all preset. After the exposure ends, each photosensitive pixel unit in the pixel unit array outputs multiple pieces of original pixel information at different exposure times.
  • Step 402 Calculate the combined pixel information according to the original pixel information with the same exposure time in the same photosensitive pixel unit.
  • Step 403 Output a color image according to the merged pixel information.
  • For example, when each photosensitive pixel unit includes one long exposure pixel, two medium exposure pixels, and one short exposure pixel, the original pixel information of the only long exposure pixel is the merged pixel information of the long exposure, the sum of the original pixel information of the two medium exposure pixels is the merged pixel information of the medium exposure, and the original pixel information of the only short exposure pixel is the merged pixel information of the short exposure.
  • When each photosensitive pixel unit includes two long exposure pixels, four medium exposure pixels, and two short exposure pixels, the sum of the original pixel information of the two long exposure pixels is the merged pixel information of the long exposure, the sum of the original pixel information of the four medium exposure pixels is the merged pixel information of the medium exposure, and the sum of the original pixel information of the two short exposure pixels is the merged pixel information of the short exposure.
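The merging rule above (sum the raw values of each exposure class inside one photosensitive pixel unit) can be sketched as follows; the `(class, value)` pair encoding is an assumption made for the example.

```python
def merge_unit(pixels):
    """Compute the long-, medium- and short-exposure merged pixel
    information of one photosensitive pixel unit by summing the raw
    values of each exposure class.

    pixels: iterable of (exposure_class, raw_value) pairs, with
    exposure_class one of 'L', 'M', 'S'."""
    totals = {'L': 0, 'M': 0, 'S': 0}
    for cls, value in pixels:
        totals[cls] += value
    return totals['L'], totals['M'], totals['S']
```

For the 2-long / 4-medium / 2-short layout described above, each unit yields one merged value per exposure class.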
  • multiple long-exposure combined pixel information, multiple medium-exposure combined pixel information, and multiple short-exposure combined pixel information of the entire pixel unit array can be obtained.
  • Then, a long exposure sub-image is calculated by interpolation based on the multiple pieces of long-exposure merged pixel information, a medium exposure sub-image is calculated by interpolation based on the multiple pieces of medium-exposure merged pixel information, and a short exposure sub-image is calculated by interpolation based on the multiple pieces of short-exposure merged pixel information. Finally, the long, medium, and short exposure sub-images are fused to obtain a color image with a high dynamic range.
  • It should be noted that the long, medium, and short exposure sub-images are not three traditional frame images; each is the image portion formed by the regions of the long, medium, and short exposure pixels, respectively, within the same frame of image.
  • In a possible implementation, based on the original pixel information output by the long exposure pixels, the original pixel information of the short exposure pixels and of the medium exposure pixels may be superimposed on that of the long exposure pixels. Specifically, the three kinds of original pixel information with different exposure times can each be given a different weight; the original pixel information corresponding to each exposure time is multiplied by its weight, and the three weighted values are then added up as the synthesized pixel information of one photosensitive pixel unit.
  • In a possible implementation, a long exposure histogram can first be calculated based on the original pixel information output by the long exposure pixels, and a short exposure histogram based on the original pixel information output by the short exposure pixels. The initial long exposure time is corrected based on the long exposure histogram to obtain a corrected long exposure time, and the initial short exposure time is corrected based on the short exposure histogram to obtain a corrected short exposure time.
  • Then, the long, medium, and short exposure pixels are controlled to expose synchronously according to the corrected long exposure time, the initial medium exposure time, and the corrected short exposure time, respectively.
  • It should be noted that the correction is not completed in one step; the imaging device needs to perform multiple synchronized long, medium, and short exposures. After each synchronized exposure, the imaging device continues to correct the long and short exposure times according to the newly generated long and short exposure histograms, and uses the corrected long exposure time, the corrected short exposure time, and the original medium exposure time for the next synchronized exposure, from which new long and short exposure histograms are obtained. This cycle continues until there are no under-exposed areas in the image corresponding to the long exposure histogram and no over-exposed areas in the image corresponding to the short exposure histogram; the corrected long and short exposure times at that point are the final ones. After the exposure is finished, the color image is calculated according to the outputs of the long, medium, and short exposure pixels; the calculation method is the same as in the previous embodiment and is not repeated here.
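The iterative correction can be sketched as a loop. The `capture` callback, the clipping thresholds, and the 1.5x adjustment step are illustrative assumptions; the disclosure only specifies that correction repeats until the long exposure histogram shows no under-exposed area and the short exposure histogram no over-exposed area.

```python
def correct_exposures(capture, t_long, t_short, t_mid=1.0, max_rounds=8):
    """Refine the long and short exposure times over repeated
    synchronous exposures.

    capture(t_long, t_mid, t_short) performs one synchronous exposure
    and returns the long- and short-exposure pixel value lists
    (hypothetical interface standing in for the sensor readout)."""
    for _ in range(max_rounds):
        long_px, short_px = capture(t_long, t_mid, t_short)
        # Histogram checks reduced to clipped-pixel fractions.
        under = sum(v < 16 for v in long_px) / len(long_px)
        over = sum(v > 239 for v in short_px) / len(short_px)
        if under == 0 and over == 0:
            break  # both histograms acceptable: times are final
        if under > 0:
            t_long *= 1.5   # lengthen long exposure to lift shadows
        if over > 0:
            t_short /= 1.5  # shorten short exposure to save highlights
    return t_long, t_short
```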
  • the long exposure histogram may be one or more.
  • a long exposure histogram can be generated according to the original pixel information output by all the long exposure pixels.
  • generating a long exposure histogram per divided area serves to improve the accuracy of each correction of the long exposure time and to speed up the correction process.
  • the short exposure histogram may be one or more.
  • a short exposure histogram can be generated based on the original pixel information output by all short exposure pixels.
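The iterative exposure-time correction loop described above can be sketched in Python as follows. The underexposure/overexposure thresholds, the correction step factor, and the `capture` callback are illustrative assumptions, not details from the disclosure:

```python
def correct_exposure_times(capture, t_long, t_mid, t_short,
                           max_iters=8, step=1.25):
    """Iteratively correct the long and short exposure times until the
    long-exposure histogram shows no underexposed area and the
    short-exposure histogram shows no overexposed area.

    `capture(t_long, t_mid, t_short)` is assumed to perform one synchronized
    exposure and return the raw values of the long- and short-exposure
    pixels (8-bit here); thresholds and step factor are illustrative."""
    for _ in range(max_iters):
        long_raw, short_raw = capture(t_long, t_mid, t_short)
        # crude histogram checks: fraction of pixels in the extreme bins
        underexposed = sum(v < 16 for v in long_raw) / len(long_raw) > 0.01
        overexposed = sum(v > 239 for v in short_raw) / len(short_raw) > 0.01
        if not underexposed and not overexposed:
            break
        if underexposed:
            t_long *= step   # lengthen the long exposure to lift shadows
        if overexposed:
            t_short /= step  # shorten the short exposure to save highlights
    return t_long, t_short
```

The medium exposure time is passed through unchanged, matching the text: only the long and short times are corrected between synchronized exposures.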
  • step 402 may specifically include the following sub-steps:
  • step 501 in the same photosensitive pixel unit, original pixel information of long exposure pixels, original pixel information of short exposure pixels, or original pixel information of middle exposure pixels is selected.
  • selecting the original pixel information of the long exposure pixels, of the short exposure pixels, or of the middle exposure pixels means that exactly one of the three is selected: the original pixel information of the long exposure pixels, the original pixel information of the short exposure pixels, or the original pixel information of the middle exposure pixels.
  • for example, if a photosensitive pixel unit includes one long exposure pixel, two middle exposure pixels, and one short exposure pixel, and the original pixel information of the long exposure pixel is 80 while the original pixel information of the two middle exposure pixels and of the short exposure pixel is 255 (saturated), then the original pixel information of the long exposure pixel, 80, can be selected.
  • Step 502 Calculate the combined pixel information according to the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time.
  • the merged pixel information can be calculated from the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time, which expands the dynamic range and yields a high dynamic range image, thereby improving the imaging effect of the color image.
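As a minimal sketch of this calculation: the selected original pixel value is scaled to a common reference exposure by the exposure-time ratio, so that values from short-exposure pixels can represent brightness beyond the sensor's native range. The specific ratios and 8-bit values below are illustrative assumptions:

```python
def merge_pixel(raw, t_selected, t_reference):
    """Scale the selected raw pixel value to the reference exposure using
    the exposure-time ratio; the result may exceed the native 8-bit range,
    which is exactly how the dynamic range is expanded."""
    return raw * (t_reference / t_selected)

# Unit whose mid/short pixels are saturated (255): fall back to the
# unsaturated long-exposure value 80 and rescale to the mid exposure,
# assuming an illustrative 4:1 long/mid exposure ratio.
print(merge_pixel(80, t_selected=4.0, t_reference=1.0))   # → 20.0
# Bright region: a short-exposure value rescaled beyond 255.
print(merge_pixel(200, t_selected=0.25, t_reference=1.0)) # → 800.0
```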
  • the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels that output original pixel information; referring to FIG. 10, on the basis of the embodiment shown in FIG. 3, step 301 may specifically include the following sub-steps:
  • Step 601 Control the pixel unit array to perform multiple exposures using different exposure durations.
  • the pixel unit array is controlled to perform multiple exposures with different exposure durations.
  • the pixel unit array may be controlled to be exposed 3 times with a long exposure duration, a medium exposure duration, and a short exposure duration.
  • each exposure time can be preset in a built-in program of the electronic device, or can be set by a user to improve the flexibility and applicability of the control method.
  • Step 602 At each exposure, control the photosensitive pixels of the pixel unit array to expose with the same exposure time, and output multiple pieces of original pixel information.
  • at each exposure, the exposure time used by all photosensitive pixels in the pixel unit array is the same; for example, during the long exposure, all photosensitive pixels of the pixel unit array are exposed with the long exposure duration.
  • each photosensitive pixel unit in the pixel unit array then outputs multiple pieces of original pixel information at the corresponding exposure time.
  • Step 603 Obtain an exposure image based on the multiple pieces of original pixel information obtained by each exposure.
  • since the exposure durations differ between exposures (long, medium, or short), the original pixel information output by each photosensitive pixel unit may differ from one exposure to the next.
  • a frame of exposure image is generated for each exposure, for example a long exposure image, a medium exposure image, and a short exposure image.
  • within the same frame of exposure image, the exposure time of the pixel unit array is the same; that is, the exposure times of the different photosensitive pixel units of the long exposure image are the same, those of the medium exposure image are the same, and those of the short exposure image are the same.
  • Step 604 Combine the exposure images of each frame to obtain a color image.
  • the exposure images generated by using different exposure durations can be synthesized to obtain a color image.
  • the exposure images generated for different exposure durations may be assigned different weights respectively, and then the exposure images generated using different exposure durations are combined according to the corresponding weights of the exposure images to obtain a color image.
  • the weights corresponding to the exposure images generated at different exposure durations can be preset in a built-in program of the electronic device, or can be set by a user, which is not limited.
  • a long exposure image, a medium exposure image, and a short exposure image are generated according to the original pixel information output by each photosensitive pixel unit. Then, the long exposure image, the middle exposure image, and the short exposure image are synthesized according to the preset weights corresponding to the long exposure image, the middle exposure image, and the short exposure image, and a color image with a high dynamic range can be obtained.
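The weighted synthesis of exposure frames described above might be sketched as follows. The per-frame scalar weights are an illustrative simplification (the disclosure leaves the weights to a built-in program or the user), and real pipelines often weight per pixel rather than per frame:

```python
import numpy as np

def synthesize_hdr(frames, weights):
    """Weighted blend of co-registered exposure frames (e.g. long, mid,
    short) into one high-dynamic-range color image. Each frame is an
    array of the same shape; `weights` holds one scalar per frame."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    acc = sum(w * f for w, f in zip(weights, frames))
    return acc / sum(weights)
```

Usage with two frames instead of three works the same way, matching the remark that two, four, or more exposure frames may also be combined.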
  • the above method of synthesizing a high-dynamic-range color image using three frames of exposure images is merely an example.
  • the number of exposure images may also be two frames, four frames, five frames, six frames, etc. It is not specifically limited here.
  • the pixel unit array can be controlled to use the long exposure time and the short exposure time to perform two exposures, or the pixel unit array can be controlled to use the middle exposure time and the short exposure time to perform two exposures.
  • the pixel unit array is controlled to use the long exposure time and the middle exposure time to perform two exposures.
  • the present disclosure also proposes a control device.
  • FIG. 11 is a schematic structural diagram of a control device according to a seventh embodiment of the present disclosure.
  • the control device 100 includes a first control module 110, a second control module 120, and a synthesis module 130, wherein:
  • the first control module 110 is configured to control a color camera to obtain a color image with a high dynamic range.
  • the first control module 110 is specifically configured to control a color camera to perform shooting according to a long exposure duration, a short exposure duration, and a middle exposure duration to obtain a color image.
  • the second control module 120 is configured to control a black-and-white camera to obtain a black-and-white image with a low dynamic range.
  • the second control module 120 is specifically configured to control the black and white camera to shoot with a medium exposure time to obtain a black and white image with a low dynamic range.
  • a synthesizing module 130 is configured to synthesize a black and white image and a color image to obtain a target image.
  • the synthesis module 130 is specifically configured to: determine each pixel in the color image; determine the pixels in the black and white image corresponding to each pixel in the color image; map the color of each pixel in the color image onto the corresponding pixel in the black and white image; and use the mapped black and white image as the target image.
  • alternatively, the synthesis module 130 is specifically configured to: determine each pixel in the color image; determine the pixels in the black and white image corresponding to each pixel in the color image; compensate the light intensity of each pixel in the black and white image onto the corresponding pixel of the color image; and use the compensated color image as the target image.
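A sketch of the first synthesis approach (mapping each color pixel's color onto the black-and-white base) is shown below. The YCbCr decomposition and the full-range BT.601 coefficients are illustrative assumptions; the disclosure does not specify a color space:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    # full-range BT.601-style conversion (illustrative choice)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return np.stack([r, g, b], axis=-1)

def fuse(color_img, mono_img):
    """Black-and-white image as the base: it supplies luminance (edges and
    detail), while the co-registered color image supplies chrominance —
    the 'color mapping' approach described in the text."""
    _, cb, cr = rgb_to_ycbcr(color_img.astype(np.float64))
    return ycbcr_to_rgb(mono_img.astype(np.float64), cb, cr)
```

The second approach (compensating the mono luminance onto the color image) would swap which image acts as the base but uses the same decomposition.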
  • control device 100 may further include:
  • the adjusting module 140 is configured to adjust the viewing angle of the black and white image according to the viewing-angle difference between the color camera and the black and white camera before the black and white image and the color image are synthesized.
  • the field of view of the black and white camera is greater than the field of view of the color camera
  • the adjustment module 140 includes:
  • the retention submodule 141 is configured to retain an image area with the same screen content as the screen content of the color image from the black and white image according to the difference in viewing angle.
  • the adjustment sub-module 142 is configured to adjust the pixels in the image area to obtain an adjusted black and white image; wherein the pixels in the adjusted black and white image correspond to the pixels in the color image one to one.
  • the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels covered by the same color filter; the multiple photosensitive pixels, which output original pixel information, include at least one long exposure pixel, at least one middle exposure pixel, and at least one short exposure pixel.
  • the first control module 110 includes:
  • a control sub-module 111 is used to control the pixel unit array to output multiple pieces of original pixel information under different exposure times; wherein, the long exposure duration of the long exposure pixels is greater than the middle exposure duration of the middle exposure pixels, and the middle exposure of the middle exposure pixels The duration is greater than the short exposure duration of the short exposure pixels.
  • the calculation sub-module 112 is configured to calculate and obtain the merged pixel information according to the original pixel information with the same exposure time in the same photosensitive pixel unit.
  • the processing sub-module 113 is configured to output a color image according to the merged pixel information.
  • the calculation submodule 112 includes:
  • the selecting unit 1121 is configured to select the original pixel information of the long-exposed pixels, the original pixel information of the short-exposed pixels, or the original pixel information of the middle-exposed pixels in the same photosensitive pixel unit.
  • the calculation unit 1122 is configured to calculate and obtain the merged pixel information according to the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time.
  • the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels that output original pixel information.
  • the control submodule 111 is further configured to control the pixel unit array to perform multiple exposures with different exposure durations, to control the photosensitive pixels of the pixel unit array to expose with the same exposure time at each exposure, and to output multiple pieces of original pixel information.
  • the calculation sub-module 112 is further configured to obtain a frame of an exposure image according to a plurality of original pixel information obtained by each exposure.
  • the processing sub-module 113 is further configured to synthesize the exposure images of each frame to obtain a color image.
  • the control device first obtains a color image with a high dynamic range by controlling a color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
  • the present disclosure also proposes an imaging apparatus.
  • FIG. 13 is a schematic structural diagram of an imaging apparatus according to Embodiment 9 of the present disclosure.
  • the imaging device includes a color camera 10 and a black-and-white camera 20.
  • the imaging device further includes a processor 30.
  • the processor 30 is configured to:
  • the black and white image and the color image are synthesized to obtain a target image.
  • the imaging device of the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling a color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
  • the present disclosure also provides an electronic device including: a memory, a processor, and a computer program stored on the memory and executable on the processor.
  • when the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
  • the present disclosure also proposes a computer-readable storage medium on which a computer program is stored, which is characterized in that when the program is executed by a processor, the control method as proposed in the foregoing embodiment of the present disclosure is implemented.
  • the present disclosure further provides an electronic device 200.
  • the electronic device 200 includes a memory 50 and a processor 60.
  • the memory 50 stores computer-readable instructions.
  • when the computer-readable instructions are executed by the processor 60, the processor 60 is caused to execute the control method of any one of the foregoing embodiments.
  • FIG. 14 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
  • the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
  • the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
  • the computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiment of the present disclosure.
  • the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
  • the internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
  • the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a button, trackball, or touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse.
  • the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
  • the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
  • the electronic device 200 includes an image processing circuit 90.
  • the image processing circuit 90 may be implemented by hardware and/or software components, and includes various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 15 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 15, for convenience of explanation, only various aspects of the image processing technology related to the embodiments of the present disclosure are shown.
  • the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
  • the image data captured by the camera 93 is first processed by the ISP processor 91.
  • the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
  • the camera 93 may include one or more lenses 932 and an image sensor 934.
  • the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
  • the sensor 94 (such as a gyroscope) may provide parameters (such as image stabilization parameters) of the acquired image processing to the ISP processor 91 based on the interface type of the sensor 94.
  • the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
  • the image sensor 934 may also send the original image data to the sensor 94.
  • the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
  • the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
  • the ISP processor 91 may also receive image data from the image memory 95.
  • the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
  • the image memory 95 may be an independent dedicated memory within the memory 50, a part of the memory 50, a storage device, or a dedicated memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
  • the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
  • the processed image data may be sent to the image memory 95 for further processing before being displayed.
  • the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
  • the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
  • the image memory 95 may be configured to implement one or more frame buffers.
  • the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
  • the encoded image data can be saved, and decompressed before being displayed on the display 97.
  • the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
  • the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
  • the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
  • the control logic 92 may include a processing element and / or a microcontroller that executes one or more routines (such as firmware).
  • the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data.
  • for example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
  • the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), and lens 932 shading correction parameters.
  • the following are the steps for implementing the control method by using the processor 60 in FIG. 14 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 15: controlling the color camera to capture a color image with a high dynamic range; controlling the black and white camera to capture a black and white image with a low dynamic range; retaining from the black and white image, according to the viewing-angle difference, the image region whose picture content is the same as that of the color image; and synthesizing the black and white image and the color image to obtain the target image.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present disclosure, "a plurality" means at least two, for example two or three, unless specifically defined otherwise.
  • any process or method description in a flowchart, or otherwise described herein, can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing the steps of a custom logic function or process.
  • the scope of the preferred embodiments of the present disclosure includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present disclosure belong.
  • the logic and/or steps represented in a flowchart or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them).
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • computer-readable media include the following: electrical connections (electronic devices) with one or more wirings, portable computer disk cartridges (magnetic devices), random access memory (RAM), Read-only memory (ROM), erasable and editable read-only memory (EPROM or flash memory), fiber optic devices, and portable optical disk read-only memory (CDROM).
  • the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
  • portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented by any one or a combination of the following techniques known in the art: discrete logic circuits with logic gate circuits for implementing logic functions on data signals, application-specific integrated circuits (ASICs) with suitable combinational logic gate circuits, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium and, when executed, includes one or a combination of the steps of the method embodiments.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.


Abstract

The present disclosure provides a control method, an apparatus, an imaging device, an electronic device, and a readable storage medium. The method is applied to an imaging device that includes a color camera and a black-and-white camera, and includes: controlling the color camera to capture a color image with a high dynamic range; controlling the black-and-white camera to capture a black-and-white image with a low dynamic range; and synthesizing the black-and-white image and the color image to obtain a target image. The method preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.

Description

Control method, apparatus, imaging device, electronic device, and readable storage medium
CROSS-REFERENCE TO RELATED APPLICATIONS
The present disclosure claims priority to Chinese Patent Application No. 201810886728.7, filed on August 6, 2018 by Guangdong OPPO Mobile Telecommunications Corp., Ltd. and entitled "Control Method, Apparatus, Imaging Device, Electronic Device, and Readable Storage Medium".
TECHNICAL FIELD
The present disclosure relates to the technical field of electronic devices, and in particular to a control method, an apparatus, an imaging device, an electronic device, and a readable storage medium.
BACKGROUND
With the continuous development of terminal technology and image processing technology, users demand ever higher quality in captured images. In some scenes, when photographing a moving object, differences in exposure time tend to cause misalignment or blurring at the edges of the moving object in the picture.
In the related art, two color cameras are used to capture two exposure frames, one with a long exposure time and one with a short exposure time, and the two frames are then synthesized to avoid misalignment or blurring at the edges of moving objects.
SUMMARY
The present disclosure provides a control method, an apparatus, an imaging device, an electronic device, and a readable storage medium, to preserve both the high dynamic range of a target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience. This addresses the technical problem in the prior art that, when the photographed object moves, the differing exposure durations cause long trails at the edges of the moving object in the long-exposure frame, so that when the two exposure frames are synthesized by image synthesis techniques, details in the synthesized image may be lost or blurred, resulting in low synthesized image quality.
An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device, the imaging device including a color camera and a black-and-white camera, the control method including:
controlling the color camera to capture a color image with a high dynamic range;
controlling the black-and-white camera to capture a black-and-white image with a low dynamic range;
synthesizing the black-and-white image and the color image to obtain a target image.
With the control method of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image, which better preserves the edges and details of the image. Finally, the black-and-white image and the color image are synthesized to obtain a target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
An embodiment of another aspect of the present disclosure provides a control apparatus applied to an imaging device, the imaging device including a color camera and a black-and-white camera, the control apparatus including:
a first control module configured to control the color camera to capture a color image with a high dynamic range;
a second control module configured to control the black-and-white camera to capture a black-and-white image with a low dynamic range;
a synthesis module configured to synthesize the black-and-white image and the color image to obtain a target image.
With the control apparatus of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image, which better preserves the edges and details of the image. Finally, the black-and-white image and the color image are synthesized to obtain a target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
An embodiment of another aspect of the present disclosure provides an imaging device including a color camera and a black-and-white camera, the imaging device further including a processor configured to:
control the color camera to capture a color image with a high dynamic range;
control the black-and-white camera to capture a black-and-white image with a low dynamic range;
synthesize the black-and-white image and the color image to obtain a target image.
With the imaging device of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image, which better preserves the edges and details of the image. Finally, the black-and-white image and the color image are synthesized to obtain a target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
An embodiment of another aspect of the present disclosure provides an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the program, implements the control method proposed in the foregoing embodiments of the present disclosure.
An embodiment of another aspect of the present disclosure provides a computer-readable storage medium on which a computer program is stored, where the program, when executed by a processor, implements the control method proposed in the foregoing embodiments of the present disclosure.
Additional aspects and advantages of the present disclosure will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
To explain the technical solutions in the embodiments of the present disclosure more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below show some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure;
FIG. 2 is a schematic flowchart of a control method according to Embodiment 2 of the present disclosure;
FIG. 3 is a schematic flowchart of a control method according to Embodiment 3 of the present disclosure;
FIG. 4 is a schematic diagram of images captured by a color camera in an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a color image in an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a black-and-white image in an embodiment of the present disclosure;
FIG. 7 is a schematic diagram of an adjusted black-and-white image in an embodiment of the present disclosure;
FIG. 8 is a schematic flowchart of a control method according to Embodiment 4 of the present disclosure;
FIG. 9 is a schematic flowchart of a control method according to Embodiment 5 of the present disclosure;
FIG. 10 is a schematic flowchart of a control method according to Embodiment 6 of the present disclosure;
FIG. 11 is a schematic structural diagram of a control device according to Embodiment 7 of the present disclosure;
FIG. 12 is a schematic structural diagram of a control device according to Embodiment 8 of the present disclosure;
FIG. 13 is a schematic structural diagram of an electronic device according to Embodiment 9 of the present disclosure;
FIG. 14 is a module schematic diagram of an electronic device according to some embodiments of the present disclosure;
FIG. 15 is a module schematic diagram of an image processing circuit according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
Embodiments of the present disclosure are described in detail below, and examples of the embodiments are shown in the drawings, in which identical or similar reference numerals throughout denote identical or similar elements or elements with identical or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended to explain the present disclosure, and should not be construed as limiting it.
The present disclosure mainly addresses the technical problem in the prior art that, after two exposure frames are synthesized by image synthesis techniques, details in the synthesized image may be lost or blurred, resulting in low synthesized image quality, and provides a control method accordingly.
With the control method of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image, which better preserves the edges and details of the image. Finally, the black-and-white image and the color image are synthesized to obtain a target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
The control method, apparatus, imaging device, electronic device, and readable storage medium of the embodiments of the present disclosure are described below with reference to the drawings.
FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure.
The control method of the embodiments of the present disclosure is applied to an imaging device including a color camera and a black-and-white camera.
As shown in FIG. 1, the control method includes the following steps:
Step 101: control the color camera to capture a color image with a high dynamic range.
In the embodiments of the present disclosure, to reduce noise in the image and thereby improve its sharpness and contrast, the color camera may be controlled to capture a color image with a high dynamic range. Such an image reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality.
Step 102: control the black-and-white camera to capture a black-and-white image with a low dynamic range.
In the embodiments of the present disclosure, since a black-and-white camera admits more light than a color camera (an existing black-and-white camera admits, for example, four times as much light as a color camera), the black-and-white camera may be controlled to capture a black-and-white image with a low dynamic range so as to preserve more of the image's edges and details.
Step 103: synthesize the black-and-white image and the color image to obtain a target image.
In the embodiments of the present disclosure, the black-and-white image and the color image may be synthesized to obtain a target image. This preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
As one possible implementation, each pixel in the color image is determined, together with the pixel in the black-and-white image corresponding to each pixel in the color image; then, taking the black-and-white image as the base, the color of each pixel in the color image is mapped onto the corresponding pixel in the black-and-white image to obtain the target image.
As another possible implementation, each pixel in the color image is determined, together with the pixel in the black-and-white image corresponding to each pixel in the color image; then, taking the color image as the base, the light intensity of each pixel in the black-and-white image is compensated onto the corresponding pixel of the color image to obtain the target image.
With the control method of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image with a low dynamic range, which better preserves the edges and details of the image. Finally, the black-and-white image and the color image are synthesized to obtain a target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
As one possible implementation, to improve the quality of the synthesized target image, in the embodiments of the present disclosure the viewing angle of the black-and-white image may be adjusted according to the viewing-angle difference between the color camera and the black-and-white camera. This process is described in detail below with reference to FIG. 2.
FIG. 2 is a schematic flowchart of a control method according to Embodiment 2 of the present disclosure.
As shown in FIG. 2, the control method may include the following steps:
Step 201: control the color camera to capture a color image with a high dynamic range.
Step 202: control the black-and-white camera to capture a black-and-white image with a low dynamic range.
For the execution of steps 201 to 202, reference may be made to steps 101 to 102 in the above embodiment, which are not repeated here.
Step 203: adjust the viewing angle of the black-and-white image according to the viewing-angle difference between the color camera and the black-and-white camera.
It can be understood that, because the color camera and the black-and-white camera are mounted at different positions, there is a viewing-angle difference between them. To improve the quality of the subsequently synthesized target image, in the embodiments of the present disclosure the viewing angle of the black-and-white image may be adjusted according to this viewing-angle difference.
As one possible implementation, to improve processing efficiency, the viewing-angle adjustment may be applied only to the image region whose picture content is the same in the black-and-white image and the color image. Specifically, according to the viewing-angle difference between the color camera and the black-and-white camera, the image region whose picture content is the same as that of the color image is retained from the black-and-white image, and the pixels in that region are then adjusted to obtain an adjusted black-and-white image, in which the pixels correspond one-to-one to the pixels in the color image.
Specifically, image recognition techniques may be used: according to the viewing-angle difference, features are extracted from the picture content of the black-and-white image and from the picture content of the color image, and the extracted features are compared to determine the image region in the black-and-white image whose picture content is the same as that of the color image; the pixels in that region are then adjusted so that the viewing angle and position of the pixels in the adjusted black-and-white image correspond one-to-one to the pixels in the color image.
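The region-retention step described above could be sketched as follows, assuming the region offset has already been derived (e.g., by the feature extraction and comparison just described) and that the two images share a pixel scale; both assumptions are illustrative:

```python
import numpy as np

def retain_matching_region(mono_img, offset, color_shape):
    """Crop from the wider-FOV black-and-white image the region whose
    picture content matches the color image. `offset` is the (row, col)
    of the matching region's top-left corner, assumed already derived
    (e.g. by feature extraction and comparison)."""
    top, left = offset
    h, w = color_shape
    region = mono_img[top:top + h, left:left + w]
    if region.shape != (h, w):
        raise ValueError("color field of view must lie inside the mono image")
    return region
```

The returned region then has the same dimensions as the color image, so its pixels can be put in one-to-one correspondence with the color image's pixels for synthesis.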
In the embodiments of the present disclosure, adjusting the viewing angle of the black-and-white image according to the viewing-angle difference between the color camera and the black-and-white camera improves the match between the adjusted black-and-white image and the color image, and thereby the quality of the subsequently synthesized image.
Step 204: synthesize the adjusted black-and-white image and the color image to obtain the target image.
In the embodiments of the present disclosure, after the viewing angle of the black-and-white image is adjusted, the adjusted black-and-white image and the color image may be synthesized to obtain the target image. This preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
As one possible implementation, each pixel in the color image is determined, together with the pixel in the adjusted black-and-white image corresponding to each pixel in the color image; then, taking the viewing-angle-adjusted black-and-white image as the base, the color of each pixel in the color image is mapped onto the corresponding pixel of the adjusted black-and-white image to obtain the target image.
As another possible implementation, each pixel in the color image is determined, together with the pixel in the adjusted black-and-white image corresponding to each pixel in the color image; then, taking the color image as the base, the light intensity of each pixel of the adjusted black-and-white image is compensated onto the corresponding pixel of the color image to obtain the target image.
With the control method of the embodiments of the present disclosure, the color camera is first controlled to capture a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality. The black-and-white camera is then controlled to capture a black-and-white image with a low dynamic range, which better preserves the edges and details of the image. Next, adjusting the viewing angle of the black-and-white image according to the viewing-angle difference between the color camera and the black-and-white camera improves the match between the adjusted black-and-white image and the color image, and thereby the quality of the subsequently synthesized image. Finally, the adjusted black-and-white image and the color image are synthesized to obtain the target image, which preserves both the high dynamic range of the target image and its edges and details, thereby avoiding misalignment or blurring at the edges of moving objects and improving the user's shooting experience.
To clearly illustrate the previous embodiment, this embodiment provides another control method. FIG. 3 is a schematic flowchart of a control method according to Embodiment 3 of the present disclosure.
As shown in FIG. 3, the control method may include the following steps:
Step 301: control the color camera to shoot according to a long exposure duration, a short exposure duration, and a middle exposure duration to obtain a color image.
Here, the long exposure duration is greater than the middle exposure duration, and the middle exposure duration is greater than the short exposure duration, i.e., long exposure duration > middle exposure duration > short exposure duration. The long, short, and middle exposure durations may be preset in a built-in program of the electronic device or set by the user, which is not limited here.
Optionally, the color camera may be controlled to shoot with the long, short, and middle exposure durations respectively, obtaining at least one long-exposure frame, at least one middle-exposure frame, and at least one short-exposure frame. For example, see FIG. 4, a schematic diagram of images captured by the color camera in an embodiment of the present disclosure, in which the color camera captures one long-exposure frame, one middle-exposure frame, and one short-exposure frame. A color image can then be obtained from at least two of the captured frames; for example, the three frames in FIG. 4 may be synthesized to obtain the color image shown in FIG. 5.
Step 302: control the black-and-white camera to shoot with the middle exposure duration to obtain a black-and-white image with a low dynamic range.
In the embodiments of the present disclosure, to preserve more image detail, the black-and-white camera may be controlled to shoot with the middle exposure duration to obtain a black-and-white image with a low dynamic range. For example, the black-and-white image captured by the black-and-white camera may be as shown in FIG. 6.
步骤303,根据视角差异,从黑白图像中保留画面内容与彩色图像的画面内容相同的图像区域。
为了提升图像的处理效率,可以仅针对黑白图像与彩色图像中画面内容相同的图像区域进行视角调整。具体地,可以根据彩色摄像头和黑白摄像头的视角差异,从黑白图像中保留画面内容与彩色图像的画面内容相同的图像区域。
具体而言,可以利用图像识别技术,根据视角差异,从黑白图像中保留画面内容与彩色图像的画面内容相同的图像区域,例如,可以根据视角差异,对黑白图像中的画面内容进行特征提取,并对彩色图像中的画面内容进行特征提取,而后将提取的特征进行比较,从而可以确定黑白图像中画面内容与彩色图像的画面内容相同的图像区域。
步骤304,对图像区域中的像素进行调整得到调整后的黑白图像;其中,调整后的黑白图像中的像素与彩色图像中的像素一一对应。
本公开实施例中,可以对图像区域中的像素进行调整,使得调整后的黑白图像中的像素的视角和位置,与彩色图像中的像素一一对应。
举例而言,针对图5中的彩色图像和图6中的黑白图像,可以首先确定黑白图像中与彩色图像中画面内容相同的图像区域,对图像区域中的像素进行调整,得到的黑白图像可以如图7所示,图7为本公开实施例中调整后的黑白图像示意图。可知,调整后的黑白图像与彩色图像更匹配。
步骤305,将调整后的黑白图像与彩色图像进行合成,得到目标图像。
步骤305的执行过程可以参见上述实施例中步骤204的执行过程,在此不做赘述。
本公开实施例的控制方法，首先通过控制彩色摄像头拍摄得到高动态范围的彩色图像，能够减少噪声，并清楚地显示当前的拍摄场景，提升成像效果和成像质量。而后通过控制黑白摄像头拍摄得到低动态范围的黑白图像，能够更好地保留图像的边缘和细节。接着根据彩色摄像头和黑白摄像头的视角差异，对黑白图像进行视角调整，能够提升调整后的黑白图像与彩色图像的匹配度，从而提升后续合成图像的质量。最后将调整后的黑白图像与彩色图像进行合成，得到目标图像，能够同时保留目标图像的高动态范围，以及目标图像的边缘和细节，从而避免移动物体的边缘出现错位或者模糊的情况，改善用户的拍摄体验。
作为一种可能的实现方式,彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括同色滤光片覆盖的多个感光像素;多个感光像素,用于输出原始像素信息,包括至少一个长曝光像素、至少一个中曝光像素和至少一个短曝光像素,则参见图8,在图3所示实施例的基础上,步骤301具体可以包括以下子步骤:
步骤401,控制像素单元阵列输出分别处于不同曝光时间下的多个原始像素信息。
本公开实施例中,像素单元阵列中的每个感光像素单元包括至少一个长曝光像素、至少一个中曝光像素和至少一个短曝光像素,其中,长曝光像素指的是感光像素对应的曝光时间为长曝光时间,中曝光像素指的是感光像素对应的曝光时间为中曝光时间,短曝光像素指的是感光像素对应的曝光时间为短曝光时间,长曝光时间>中曝光时间>短曝光时间,即长曝光像素的长曝光时间大于中曝光像素的中曝光时间,且中曝光像素的中曝光时间大于短曝光像素的短曝光时间。在成像设备工作时,长曝光像素、中曝光像素及短曝光像素同步曝光,同步曝光指的是中曝光像素及短曝光像素的曝光进行时间位于长曝光像素的曝光进行时间以内。
具体地，可以控制长曝光像素最先开始曝光，在长曝光像素的曝光期间内，再控制中曝光像素以及短曝光像素曝光，其中，中曝光像素和短曝光像素的曝光截止时间应与长曝光像素的曝光截止时间相同或位于长曝光像素的曝光截止时间之前；或者，控制长曝光像素、中曝光像素以及短曝光像素同时开始曝光，即长曝光像素、中曝光像素以及短曝光像素的曝光起始时间相同。如此，无需控制像素单元阵列依次进行长曝、中曝和短曝，可缩短彩色图像的拍摄时间。
成像设备首先控制像素单元阵列中的每个感光像素单元中的长曝光像素、中曝光像素及短曝光像素同步曝光,其中长曝光像素对应的曝光时间为初始长曝光时间,中曝光像素对应的曝光时间为初始中曝光时间,短曝光像素对应的曝光时间为初始短曝光时间,初始长曝光时间、初始中曝光时间及初始短曝光时间均为预先设定好的。曝光结束后,像素单元阵列中的每个感光像素单元将输出分别处于不同曝光时间下的多个原始像素信息。
步骤402,根据同一感光像素单元中曝光时间相同的原始像素信息计算得到合并像素信息。
步骤403,根据合并像素信息输出彩色图像。
例如,当每个感光像素单元中包括1个长曝光像素、2个中曝光像素、1个短曝光像素时,唯一的长曝光像素的原始像素信息即为长曝光的合并像素信息,2个中曝光像素的原始像素信息之和即为中曝光的合并像素信息,唯一的短曝光像素的原始像素信息即为短曝光的合并像素信息;当每个感光像素单元中包括2个长曝光像素、4个中曝光像素、2个短曝光像素时,2个长曝光像素的原始像素信息之和即为长曝光的合并像素信息,4个中曝光像素的原始像素信息之和即为中曝光的合并像素信息,2个短曝光像素的原始像素信息之和即为短曝光的合并像素信息。如此,可以获得整个像素单元阵列的多个长曝光的合并像素信息、多个中曝光的合并像素信息、多个短曝光的合并像素信息。
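上段"同一感光像素单元内、同曝光时间的原始像素信息求和得到合并像素信息"的规则，可以直接写成如下示意（字典数据结构为示意性假设）：

```python
def merge_same_exposure(unit):
    """unit: {"long": [...], "mid": [...], "short": [...]}，
    每个键对应同一感光像素单元中同曝光时间的各原始像素信息；
    对每档曝光求和即得到该档的合并像素信息。"""
    return {kind: sum(values) for kind, values in unit.items()}
```

例如对一个包含 2 个长曝光像素、4 个中曝光像素、2 个短曝光像素的感光像素单元，三档分别求和即得三个合并像素信息。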
而后,再根据多个长曝光的合并像素信息插值计算得到长曝光子图像,根据多个中曝光的合并像素信息插值计算得到中曝光子图像,根据多个短曝光的合并像素信息插值计算得到短曝光子图像。最后,将长曝光子图像、中曝光子图像和短曝光子图像融合处理得到高动态范围的彩色图像,其中,长曝光子图像、中曝光子图像和短曝光子图像并非为传统意义上的三帧图像,而是同一帧图像中长、短、中曝光像素对应区域形成的图像部分。
或者，在像素单元阵列曝光结束后，可以以长曝光像素输出的原始像素信息为基准，将短曝光像素的原始像素信息和中曝光像素的原始像素信息叠加到长曝光像素的原始像素信息上。具体地，针对同一感光像素单元，可以对三种不同曝光时间的原始像素信息分别赋予不同的权值，在各曝光时间对应的原始像素信息与权值相乘后，再将三种乘以权值后的原始像素信息相加作为一个感光像素单元的合成像素信息。随后，由于根据三种不同曝光时间的原始像素信息计算得到的每一个合成像素信息的灰度级别会产生变化，因此，在得到合成像素信息后需要对每一个合成像素信息做灰度级别的压缩。压缩完毕后，根据多个压缩后的合成像素信息进行插值计算，即可得到彩色图像。如此，彩色图像中暗部已经由长曝光像素输出的原始像素信息进行补偿，亮部已经由短曝光像素输出的原始像素信息进行压制，因此，彩色图像不存在过曝区域及欠曝区域，具有较高的动态范围和较佳的成像效果。
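上段"加权叠加再做灰度级别压缩"的流程可以简化成下面的示意。其中权值的取值与压缩方式（线性归一化）均为示意性假设，实际实现中的压缩通常采用非线性曲线：

```python
def weighted_merge(raw, weights, max_out=255):
    """raw / weights: {"long": .., "mid": .., "short": ..}。
    先按权值叠加三种曝光的原始像素信息得到合成像素信息，
    再把结果压缩回 0~max_out 的灰度级别（此处用线性归一化示意）。"""
    merged = sum(raw[k] * weights[k] for k in raw)       # 加权叠加
    full = sum(255 * weights[k] for k in weights)        # 叠加结果的理论上限
    return min(max_out, int(merged / full * max_out))    # 灰度级别压缩
```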
进一步地,为进一步改善彩色图像的成像质量,在长曝光像素、中曝光像素及短曝光像素分别根据初始长曝光时间、初始中曝光时间及初始短曝光时间同步曝光后,可以首先根据长曝光像素输出的原始像素信息计算长曝光直方图,根据短曝光像素输出的原始像素信息计算短曝光直方图,并根据长曝光直方图修正初始长曝光时间得到修正长曝光时间,根据短曝光直方图修正初始短曝光时间得到修正短曝光时间。随后,再控制长曝光像素、中曝光像素及短曝光像素分别根据修正长曝光时间、初始中曝光时间及修正短曝光时间同步曝光。其中,修正过程并非一步到位,而是成像设备需要进行多次长、中、短的同步曝光,在每一次的同步曝光后,成像设备会根据生成的长曝光直方图和短曝光直方图继续修正长曝光时间和短曝光时间,并在下一次曝光时采用上一时刻修正好的修正长曝光时间、修正短曝光时间以及原始中曝光时间进行同步曝光,继续获取长曝光直方图和短曝光直方图,如此周而复始,直至长曝光直方图对应的图像中不存在欠曝区域、短曝光直方图对应的图像中不存在过曝区域为止,此时的修正长曝光时间和修正短曝光时间即为最终的修正长曝光时间和修正短曝光时间。曝光结束后再根据长曝光像素、中曝光像素和短曝光像素的输出进行彩色图像的计算,该计算方式与上一实施方式中的计算方式相同,在此不再赘述。
其中，长曝光直方图可以为一个或多个。长曝光直方图为一个时，可以根据所有长曝光像素输出的原始像素信息生成一个长曝光直方图。长曝光直方图为多个时，可以对长曝光像素划分区域，并根据每个区域中的多个长曝光像素的原始像素信息生成一个长曝光直方图，如此，多个区域对应多个长曝光直方图。划分区域的作用是可以提升每一次修正的长曝光时间的准确性，加快长曝光时间的修正进程。同样地，短曝光直方图可以为一个或多个。短曝光直方图为一个时，可以根据所有短曝光像素输出的原始像素信息生成一个短曝光直方图。短曝光直方图为多个时，可以对短曝光像素划分区域，并根据每个区域中的多个短曝光像素的原始像素信息生成一个短曝光直方图，如此，多个区域对应多个短曝光直方图。划分区域的作用是可以提升每一次修正的短曝光时间的准确性，加快短曝光时间的修正进程。
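上述"多次同步曝光、按直方图逐步修正长/短曝光时间"的迭代过程，可以抽象为如下控制循环。其中 capture 为假设的采集函数（返回长曝光直方图中的欠曝占比与短曝光直方图中的过曝占比），步进倍率 1.5 与迭代上限均为示意性假设：

```python
def correct_exposure_times(t_long, t_short, capture, max_iter=8):
    """多次同步曝光并按直方图修正长/短曝光时间的迭代示意。
    当长曝光直方图不再有欠曝区域、短曝光直方图不再有过曝区域时，
    当前的修正长曝光时间与修正短曝光时间即为最终值。"""
    for _ in range(max_iter):
        under, over = capture(t_long, t_short)
        if under == 0 and over == 0:
            break                    # 无欠曝且无过曝，修正完成
        if under > 0:
            t_long *= 1.5            # 仍有欠曝区域：加长长曝光时间
        if over > 0:
            t_short /= 1.5           # 仍有过曝区域：缩短短曝光时间
        # 中曝光时间保持初始值不变
    return t_long, t_short
```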
作为一种可能的实现方式,参见图9,在图8所示实施例的基础上,步骤402具体可以包括以下子步骤:
步骤501,在同一感光像素单元中,选取长曝光像素的原始像素信息、短曝光像素的原始像素信息或中曝光像素的原始像素信息。
本公开实施例中,在同一感光像素单元中,选取长曝光像素的原始像素信息、短曝光像素的原始像素信息或中曝光像素的原始像素信息,即从长曝光像素的原始像素信息、短曝光像素的原始像素信息或中曝光像素的原始像素信息中,选取一个原始像素信息。
例如，当一个感光像素单元中包括1个长曝光像素、2个中曝光像素、1个短曝光像素，且长曝光像素的原始像素信息为80，两个中曝光像素的原始像素信息为255，短曝光像素的原始像素信息为255时，由于255为原始像素信息的上限，因此，选取的可以为长曝光像素的原始像素信息：80。
步骤502,根据选取的原始像素信息,以及长曝光时间、中曝光时间和短曝光时间之间的曝光比,计算得到合并像素信息。
仍以上述例子示例,假设长曝光时间、中曝光时间和短曝光时间之间的曝光比为:16:4:1,则合并像素信息为:80*16=1280。
由于现有技术中原始像素信息的上限为255,通过根据选取的原始像素信息,以及长曝光时间、中曝光时间和短曝光时间之间的曝光比,计算得到合并像素信息,可以扩展动态范围,得到高动态范围图像,从而提升彩色图像的成像效果。
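以上"选取未饱和的原始像素信息并按曝光比换算"的计算，可以写成下面的示意。其中饱和阈值取 255，三档均饱和时的回退策略为示意性假设：

```python
def merge_pixel(long_px, mid_px, short_px, ratio=(16, 4, 1), sat=255):
    """在同一感光像素单元的长、中、短原始像素信息中，
    按长、中、短的顺序选取第一个未达饱和上限 sat 的值，
    乘以其对应的曝光比系数得到合并像素信息，突破 255 的上限。"""
    for value, r in ((long_px, ratio[0]), (mid_px, ratio[1]), (short_px, ratio[2])):
        if value < sat:
            return value * r
    return short_px * ratio[2]   # 三档均饱和时退回短曝光值
```

对应正文示例：长曝光像素的原始像素信息 80 未饱和，合并像素信息为 80×16=1280。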
作为另一种可能的实现方式,彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括输出原始像素信息的多个感光像素;则参见图10,在图3所示实施例的基础上,步骤301具体可以包括以下子步骤:
步骤601,控制像素单元阵列采用不同曝光时长分别进行多次曝光。
本公开实施例中，以不同曝光时长，控制像素单元阵列进行多次曝光，例如可以以长曝光时长、中曝光时长、短曝光时长，控制像素单元阵列进行3次曝光。其中，各曝光时长可以预设在电子设备的内置程序中，或者，还可以由用户进行设置，以提升该控制方法的灵活性及适用性。
步骤602,在每一次曝光时,控制像素单元阵列的感光像素采用相同曝光时间曝光,输出得到多个原始像素信息。
本公开实施例中,在每一次曝光时,像素单元阵列中的感光像素所采用的曝光时间相同,例如,以长曝光时间控制像素单元阵列进行曝光时,像素单元阵列的感光像素均采用相同的长曝光时间曝光。在每次曝光结束后,像素单元阵列中的每个感光像素单元将输出处于对应曝光时间下的多个原始像素信息。
步骤603,根据每一次曝光得到的多个原始像素信息,得到一帧曝光图像。
本公开实施例中,曝光时长不同,例如长曝光时长、中曝光时长或短曝光时长,各感光像素单元输出的原始像素信息不同,针对同一次曝光,可以根据各感光像素单元输出的原始像素信息,生成一帧曝光图像,例如,生成长曝光图像、中曝光图像、短曝光图像。其中,同一帧曝光图像的像素单元阵列的曝光时间相同,即长曝光图像的不同感光像素单元的曝光时间相同,中曝光图像的不同感光像素单元的曝光时间相同,短曝光图像的不同感光像素的曝光时间相同。
步骤604,对各帧曝光图像进行合成,得到彩色图像。
本公开实施例中，在得到采用不同曝光时长生成的曝光图像后，可以对采用不同曝光时长曝光生成的曝光图像进行合成，得到彩色图像。例如，针对不同曝光时长生成的曝光图像，可以分别赋予不同的权值，而后根据各曝光图像对应的权值，对采用不同曝光时长生成的曝光图像进行合成，得到彩色图像。其中，不同曝光时长生成的曝光图像对应的权值可以预设在电子设备的内置程序中，或者可以由用户进行设置，对此不作限制。
例如,当分别以长曝光时长、中曝光时长和短曝光时长,控制像素单元阵列进行三次曝光后,根据各感光像素单元输出的原始像素信息,生成长曝光图像、中曝光图像、短曝光图像。而后根据预先设置的长曝光图像、中曝光图像、短曝光图像对应的权值,对长曝光图像、中曝光图像、短曝光图像进行合成,可以得到高动态范围的彩色图像。
需要说明的是,上述利用三帧曝光图像合成高动态范围的彩色图像的方法仅为示例,在其他实施方式中,曝光图像的数量也可以为两帧、四帧、五帧、六帧等,在此不做具体限定。例如,当曝光图像的数量为两帧时,可以控制像素单元阵列分别采用长曝光时长和短曝光时长进行两次曝光,或者控制像素单元阵列分别采用中曝光时长和短曝光时长进行两次曝光,或者控制像素单元阵列分别采用长曝光时长和中曝光时长进行两次曝光。
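上述按预设权值合成多帧曝光图像的过程，可以用下面的逐像素加权平均来示意。帧数与权值均可配置，示例中的具体权值为示意性假设：

```python
def fuse_frames(frames, weights):
    """frames: {"long": img, "mid": img, "short": img}，img 为二维列表；
    weights 为各帧曝光图像对应的预设权值。
    对各帧逐像素加权平均，得到高动态范围的合成结果。"""
    keys = list(frames)
    h, w = len(frames[keys[0]]), len(frames[keys[0]][0])
    total = sum(weights[k] for k in keys)
    return [[sum(frames[k][i][j] * weights[k] for k in keys) / total
             for j in range(w)]
            for i in range(h)]
```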
为了实现上述实施例,本公开还提出一种控制装置。
图11为本公开实施例七所提供的控制装置的结构示意图。
如图11所示,该控制装置100包括:第一控制模块110、第二控制模块120,以及合成模块130。其中,
第一控制模块110,用于控制彩色摄像头拍摄得到高动态范围的彩色图像。
作为一种可能的实现方式,第一控制模块110,具体用于:根据长曝光时长、短曝光时长和中曝光时长控制彩色摄像头进行拍摄,得到彩色图像。
第二控制模块120,用于控制黑白摄像头拍摄得到低动态范围的黑白图像。
作为一种可能的实现方式,第二控制模块120,具体用于:控制黑白摄像头采用中曝光时长进行拍摄,得到低动态范围的黑白图像。
合成模块130,用于对黑白图像与彩色图像进行合成,得到目标图像。
作为一种可能的实现方式,合成模块130,具体用于:确定彩色图像中的各像素;确定黑白图像中与彩色图像中的各像素对应的像素;将彩色图像中各像素的颜色贴图至黑白图像中对应的像素上;将贴图后的黑白图像作为目标图像。
作为另一种可能的实现方式,合成模块130,具体用于:确定彩色图像中的各像素;确定黑白图像中与彩色图像中的各像素对应的像素;将黑白图像上各像素的光强,补偿到彩色图像对应的像素上;将补偿后的彩色图像作为目标图像。
进一步地，在本公开实施例的一种可能的实现方式中，参见图12，在图11所示实施例的基础上，该控制装置100还可以包括：
调整模块140,用于在所述对所述黑白图像与所述彩色图像进行合成之前,根据彩色摄像头和黑白摄像头的视角差异,对黑白图像进行视角调整。
作为一种可能的实现方式，黑白摄像头的视场角大于彩色摄像头的视场角，调整模块140，包括：
保留子模块141,用于根据视角差异,从黑白图像中保留画面内容与彩色图像的画面内容相同的图像区域。
调整子模块142,用于对图像区域中的像素进行调整得到调整后的黑白图像;其中,调整后的黑白图像中的像素与彩色图像中的像素一一对应。
作为一种可能的实现方式,彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括同色滤光片覆盖的多个感光像素;多个感光像素,用于输出原始像素信息,包括至少一个长曝光像素、至少一个中曝光像素和至少一个短曝光像素。
第一控制模块110,包括:
控制子模块111,用于控制像素单元阵列输出分别处于不同曝光时间下的多个原始像素信息;其中,长曝光像素的长曝光时长大于中曝光像素的中曝光时长,且中曝光像素的中曝光时长大于短曝光像素的短曝光时长。
计算子模块112,用于根据同一感光像素单元中曝光时间相同的原始像素信息计算得到合并像素信息。
处理子模块113,用于根据合并像素信息输出彩色图像。
作为一种可能的实现方式,计算子模块112,包括:
选取单元1121,用于在同一感光像素单元中,选取长曝光像素的原始像素信息、短曝光像素的原始像素信息或中曝光像素的原始像素信息。
计算单元1122,用于根据选取的原始像素信息,以及长曝光时间、中曝光时间和短曝光时间之间的曝光比,计算得到合并像素信息。
作为一种可能的实现方式,彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括输出原始像素信息的多个感光像素。
控制子模块111,还用于:控制像素单元阵列采用不同曝光时长分别进行多次曝光,以及在每一次曝光时,控制像素单元阵列的感光像素采用相同曝光时间曝光,输出得到多个原始像素信息。
计算子模块112,还用于:根据每一次曝光得到的多个原始像素信息,得到一帧曝光图像。
处理子模块113,还用于:对各帧曝光图像进行合成,得到彩色图像。
需要说明的是,前述对控制方法实施例的解释说明也适用于该实施例的控制装置100,此处不再赘述。
本公开实施例的控制装置，首先通过控制彩色摄像头拍摄得到高动态范围的彩色图像，能够减少噪声，并清楚地显示当前的拍摄场景，提升成像效果和成像质量。而后通过控制黑白摄像头拍摄得到黑白图像，能够更好地保留图像的边缘和细节。最后通过对黑白图像与彩色图像进行合成，得到目标图像，能够同时保留目标图像的高动态范围，以及目标图像的边缘和细节，从而避免移动物体的边缘出现错位或者模糊的情况，改善用户的拍摄体验。
为了实现上述实施例,本公开还提出一种成像设备。
图13为本公开实施例九所提供的成像设备的结构示意图。
如图13所示,成像设备包括彩色摄像头10和黑白摄像头20,成像设备还包括处理器30,处理器30用于:
控制彩色摄像头10拍摄得到高动态范围的彩色图像;
控制黑白摄像头20拍摄得到低动态范围的黑白图像;
对黑白图像与彩色图像进行合成,得到目标图像。
本公开实施例的成像设备,首先通过控制彩色摄像头拍摄得到高动态范围的彩色图像,能够减少噪声,并清楚地显示当前的拍摄场景,提升成像效果和成像质量。而后通过控制黑白摄像头拍摄得到黑白图像,能够更好地保留图像的边缘和细节。最后通过对黑白图像与彩色图像进行合成,得到目标图像,能够同时保留目标图像的高动态范围,以及目标图像的边缘和细节,从而避免移动物体的边缘出现错位或者模糊的情况,改善用户的拍摄体验。
为了实现上述实施例,本公开还提出一种电子设备,包括:存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,处理器执行程序时,实现如本公开前述实施例提出的控制方法。
为了实现上述实施例,本公开还提出一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现如本公开前述实施例提出的控制方法。
请参阅图14,本公开还提供一种电子设备200。电子设备200包括存储器50和处理器60。存储器50中存储有计算机可读指令。计算机可读指令被存储器50执行时,使得处理器60执行上述任一实施方式的控制方法。
图14为一个实施例中电子设备200的内部结构示意图。该电子设备200包括通过系统总线81连接的处理器60、存储器50（例如为非易失性存储介质）、内存储器82、显示屏83和输入装置84。其中，电子设备200的存储器50存储有操作系统和计算机可读指令。该计算机可读指令可被处理器60执行，以实现本公开实施方式的控制方法。该处理器60用于提供计算和控制能力，支撑整个电子设备200的运行。电子设备200的内存储器82为存储器50中的计算机可读指令的运行提供环境。电子设备200的显示屏83可以是液晶显示屏或者电子墨水显示屏等，输入装置84可以是显示屏83上覆盖的触摸层，也可以是电子设备200外壳上设置的按键、轨迹球或触控板，也可以是外接的键盘、触控板或鼠标等。该电子设备200可以是手机、平板电脑、笔记本电脑、个人数字助理或穿戴式设备（例如智能手环、智能手表、智能头盔、智能眼镜）等。本领域技术人员可以理解，图14中示出的结构，仅仅是与本公开方案相关的部分结构的示意图，并不构成对本公开方案所应用于其上的电子设备200的限定，具体的电子设备200可以包括比图中所示更多或更少的部件，或者组合某些部件，或者具有不同的部件布置。
请参阅图15,本公开实施例的电子设备200中包括图像处理电路90,图像处理电路90可利用硬件和/或软件组件实现,包括定义ISP(Image Signal Processing,图像信号处理)管线的各种处理单元。图15为一个实施例中图像处理电路90的示意图。如图15所示,为便于说明,仅示出与本公开实施例相关的图像处理技术的各个方面。
如图15所示,图像处理电路90包括ISP处理器91(ISP处理器91可为处理器60)和控制逻辑器92。摄像头93捕捉的图像数据首先由ISP处理器91处理,ISP处理器91对图像数据进行分析以捕捉可用于确定摄像头93的一个或多个控制参数的图像统计信息。摄像头93可包括一个或多个透镜932和图像传感器934。图像传感器934可包括色彩滤镜阵列(如Bayer滤镜),图像传感器934可获取每个成像像素捕捉的光强度和波长信息,并提供可由ISP处理器91处理的一组原始图像数据。传感器94(如陀螺仪)可基于传感器94接口类型把采集的图像处理的参数(如防抖参数)提供给ISP处理器91。传感器94接口可以为SMIA(Standard Mobile Imaging Architecture,标准移动成像架构)接口、其它串行或并行照相机接口或上述接口的组合。
此外,图像传感器934也可将原始图像数据发送给传感器94,传感器94可基于传感器94接口类型把原始图像数据提供给ISP处理器91,或者传感器94将原始图像数据存储到图像存储器95中。
ISP处理器91按多种格式逐个像素地处理原始图像数据。例如,每个图像像素可具有8、10、12或14比特的位深度,ISP处理器91可对原始图像数据进行一个或多个图像处理操作、收集关于图像数据的统计信息。其中,图像处理操作可按相同或不同的位深度精度进行。
ISP处理器91还可从图像存储器95接收图像数据。例如，传感器94接口将原始图像数据发送给图像存储器95，图像存储器95中的原始图像数据再提供给ISP处理器91以供处理。图像存储器95可为存储器50、存储器50的一部分、存储设备、或电子设备内的独立的专用存储器，并可包括DMA(Direct Memory Access,直接存储器存取)特征。
当接收到来自图像传感器934接口或来自传感器94接口或来自图像存储器95的原始图像数据时，ISP处理器91可进行一个或多个图像处理操作，如时域滤波。处理后的图像数据可发送给图像存储器95，以便在被显示之前进行另外的处理。ISP处理器91从图像存储器95接收处理数据，并对处理数据进行原始域中以及RGB和YCbCr颜色空间中的图像数据处理。ISP处理器91处理后的图像数据可输出给显示器97（显示器97可包括显示屏83），以供用户观看和/或由图形引擎或GPU(Graphics Processing Unit,图形处理器)进一步处理。此外，ISP处理器91的输出还可发送给图像存储器95，且显示器97可从图像存储器95读取图像数据。在一个实施例中，图像存储器95可被配置为实现一个或多个帧缓冲器。此外，ISP处理器91的输出可发送给编码器/解码器96，以便编码/解码图像数据。编码的图像数据可被保存，并在显示于显示器97上之前解压缩。编码器/解码器96可由CPU或GPU或协处理器实现。
ISP处理器91确定的统计数据可发送给控制逻辑器92单元。例如,统计数据可包括自动曝光、自动白平衡、自动聚焦、闪烁检测、黑电平补偿、透镜932阴影校正等图像传感器934统计信息。控制逻辑器92可包括执行一个或多个例程(如固件)的处理元件和/或微控制器,一个或多个例程可根据接收的统计数据,确定摄像头93的控制参数及ISP处理器91的控制参数。例如,摄像头93的控制参数可包括传感器94控制参数(例如增益、曝光控制的积分时间、防抖参数等)、照相机闪光控制参数、透镜932控制参数(例如聚焦或变焦用焦距)、或这些参数的组合。ISP控制参数可包括用于自动白平衡和颜色调整(例如,在RGB处理期间)的增益水平和色彩校正矩阵,以及透镜932阴影校正参数。
例如,以下为运用图14中的处理器60或运用图15中的图像处理电路90(具体为ISP处理器91)实现控制方法的步骤:
控制彩色摄像头拍摄得到高动态范围的彩色图像;
控制黑白摄像头拍摄得到低动态范围的黑白图像;
对黑白图像与彩色图像进行合成,得到目标图像。
再例如,以下为运用图14中的处理器或运用图15中的图像处理电路90(具体为ISP处理器)实现控制方法的步骤:
根据视角差异,从黑白图像中保留画面内容与彩色图像的画面内容相同的图像区域;
对图像区域中的像素进行调整得到调整后的黑白图像;其中,调整后的黑白图像中的像素与彩色图像中的像素一一对应。
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本公开的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本公开的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现定制逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分, 并且本公开的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本公开的实施例所属技术领域的技术人员所理解。
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。
应当理解,本公开的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。如,如果用硬件来实现和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。
此外,在本公开各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。
上述提到的存储介质可以是只读存储器,磁盘或光盘等。尽管上面已经示出和描述了本公开的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本公开的限制,本领域的普通技术人员在本公开的范围内可以对上述实施例进行变化、修改、替换和变型。

Claims (20)

  1. 一种控制方法,其特征在于,应用于成像设备,所述成像设备包括彩色摄像头和黑白摄像头,所述控制方法包括以下步骤:
    控制所述彩色摄像头拍摄得到高动态范围的彩色图像;
    控制所述黑白摄像头拍摄得到低动态范围的黑白图像;
    对所述黑白图像与所述彩色图像进行合成,得到目标图像。
  2. 根据权利要求1所述的控制方法,其特征在于,所述对所述黑白图像与所述彩色图像进行合成之前,还包括:
    根据所述彩色摄像头和所述黑白摄像头的视角差异,对所述黑白图像进行视角调整。
  3. 根据权利要求2所述的控制方法,其特征在于,所述黑白摄像头的视场角大于所述彩色摄像头的视场角,所述根据所述彩色摄像头和所述黑白摄像头的视角差异,对所述黑白图像进行视角调整,包括:
    根据所述视角差异,从所述黑白图像中保留画面内容与所述彩色图像的画面内容相同的图像区域;
    对所述图像区域中的像素进行调整得到调整后的黑白图像;其中,调整后的黑白图像中的像素与所述彩色图像中的像素一一对应。
  4. 根据权利要求1-3中任一所述的控制方法,其特征在于,所述控制所述彩色摄像头拍摄得到高动态范围的彩色图像,包括:
    根据长曝光时长、短曝光时长和中曝光时长控制所述彩色摄像头进行拍摄,得到所述彩色图像。
  5. 根据权利要求4所述的控制方法,其特征在于,所述彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括同色滤光片覆盖的多个感光像素;所述多个感光像素,用于输出原始像素信息,包括至少一个长曝光像素、至少一个中曝光像素和至少一个短曝光像素;
    所述根据长曝光时长、短曝光时长和中曝光时长控制所述彩色摄像头进行拍摄,得到所述彩色图像,包括:
    控制所述像素单元阵列输出分别处于不同曝光时间下的多个原始像素信息;其中,所述长曝光像素的长曝光时长大于所述中曝光像素的中曝光时长,且所述中曝光像素的中曝光时长大于所述短曝光像素的短曝光时长;
    根据同一感光像素单元中曝光时间相同的所述原始像素信息计算得到合并像素信息;
    根据所述合并像素信息输出所述彩色图像。
  6. 根据权利要求5所述的控制方法,其特征在于,所述根据同一感光像素单元中曝光时间相同的所述原始像素信息计算得到合并像素信息,包括:
    在同一感光像素单元中,选取长曝光像素的原始像素信息、短曝光像素的原始像素信息或中曝光像素的原始像素信息;
    根据选取的原始像素信息,以及所述长曝光时间、所述中曝光时间和所述短曝光时间之间的曝光比,计算得到所述合并像素信息。
  7. 根据权利要求4所述的控制方法,其特征在于,所述彩色摄像头包括由多个感光像素单元组成的像素单元阵列,每个感光像素单元包括输出原始像素信息的多个感光像素;
    所述根据长曝光时长、短曝光时长和中曝光时长控制所述彩色摄像头进行拍摄,得到所述彩色图像,包括:
    控制所述像素单元阵列采用不同曝光时长分别进行多次曝光;
    在每一次曝光时,控制所述像素单元阵列的感光像素采用相同曝光时间曝光,输出得到多个原始像素信息;
    根据每一次曝光得到的多个原始像素信息,得到一帧曝光图像;
    对各帧曝光图像进行合成,得到所述彩色图像。
  8. 根据权利要求4-7中任一所述的控制方法,其特征在于,所述控制所述黑白摄像头拍摄得到低动态范围的黑白图像,包括:
    控制所述黑白摄像头采用所述中曝光时长进行拍摄,得到所述低动态范围的黑白图像。
  9. 根据权利要求1-8中任一所述的控制方法,其特征在于,所述对所述黑白图像与所述彩色图像进行合成,得到目标图像,包括:
    确定所述彩色图像中的各像素;
    确定所述黑白图像中与所述彩色图像中的各像素对应的像素;
    将所述彩色图像中各像素的颜色贴图至所述黑白图像中对应的像素上;
    将贴图后的黑白图像作为所述目标图像。
  10. 根据权利要求1-8中任一所述的控制方法,其特征在于,所述对所述黑白图像与所述彩色图像进行合成,得到目标图像,包括:
    确定所述彩色图像中的各像素;
    确定所述黑白图像中与所述彩色图像中的各像素对应的像素;
    将所述黑白图像上各像素的光强,补偿到所述彩色图像对应的像素上;
    将补偿后的彩色图像作为所述目标图像。
  11. 一种控制装置,其特征在于,应用于成像设备,所述成像设备包括彩色摄像头和黑白摄像头,所述控制装置包括:
    第一控制模块,用于控制所述彩色摄像头拍摄得到高动态范围的彩色图像;
    第二控制模块,用于控制所述黑白摄像头拍摄得到低动态范围的黑白图像;
    合成模块,用于对所述黑白图像与所述彩色图像进行合成,得到目标图像。
  12. 根据权利要求11所述的控制装置,其特征在于,所述装置还包括:
    调整模块,用于根据所述彩色摄像头和所述黑白摄像头的视角差异,对所述黑白图像进行视角调整。
  13. 根据权利要求12所述的控制装置,其特征在于,所述黑白摄像头的视场角大于所述彩色摄像头的视场角,所述调整模块,包括:
    保留子模块,用于根据所述视角差异,从所述黑白图像中保留画面内容与所述彩色图像的画面内容相同的图像区域;
    调整子模块,用于对所述图像区域中的像素进行调整得到调整后的黑白图像;其中,调整后的黑白图像中的像素与所述彩色图像中的像素一一对应。
  14. 根据权利要求11-13中任一所述的控制装置,其特征在于,所述第一控制模块,具体用于:
    根据长曝光时长、短曝光时长和中曝光时长控制所述彩色摄像头进行拍摄,得到所述彩色图像。
  15. 根据权利要求14所述的控制装置,其特征在于,所述第二控制模块,具体用于:
    控制所述黑白摄像头采用所述中曝光时长进行拍摄,得到所述低动态范围的黑白图像。
  16. 根据权利要求11-15中任一所述的控制装置,其特征在于,所述合成模块,具体用于:
    确定所述彩色图像中的各像素;
    确定所述黑白图像中与所述彩色图像中的各像素对应的像素;
    将所述彩色图像中各像素的颜色贴图至所述黑白图像中对应的像素上;
    将贴图后的黑白图像作为所述目标图像。
  17. 根据权利要求11-15中任一所述的控制装置,其特征在于,所述合成模块,具体用于:
    确定所述彩色图像中的各像素;
    确定所述黑白图像中与所述彩色图像中的各像素对应的像素;
    将所述黑白图像上各像素的光强,补偿到所述彩色图像对应的像素上;
    将补偿后的彩色图像作为所述目标图像。
  18. 一种成像设备,其特征在于,所述成像设备包括彩色摄像头和黑白摄像头,所述成像设备还包括处理器,所述处理器用于:
    控制所述彩色摄像头拍摄得到高动态范围的彩色图像;
    控制所述黑白摄像头拍摄得到低动态范围的黑白图像;
    对所述黑白图像与所述彩色图像进行合成,得到目标图像。
  19. 一种电子设备,其特征在于,包括:存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时,实现如权利要求1-10中任一所述的控制方法。
  20. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,该程序被处理器执行时实现如权利要求1-10中任一所述的控制方法。
PCT/CN2019/090629 2018-08-06 2019-06-11 控制方法、装置、成像设备、电子设备及可读存储介质 Ceased WO2020029679A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810886728.7A CN109005343A (zh) 2018-08-06 2018-08-06 控制方法、装置、成像设备、电子设备及可读存储介质
CN201810886728.7 2018-08-06

Publications (1)

Publication Number Publication Date
WO2020029679A1 true WO2020029679A1 (zh) 2020-02-13

Family

ID=64595905

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090629 Ceased WO2020029679A1 (zh) 2018-08-06 2019-06-11 控制方法、装置、成像设备、电子设备及可读存储介质

Country Status (2)

Country Link
CN (1) CN109005343A (zh)
WO (1) WO2020029679A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114338988A (zh) * 2021-12-29 2022-04-12 Oppo广东移动通信有限公司 图像生成方法、装置、电子设备和计算机可读存储介质
CN114615395A (zh) * 2020-12-04 2022-06-10 中兴通讯股份有限公司 屏下摄像装置、显示设备、屏下摄像装置生成图像的方法
CN114630055A (zh) * 2022-03-21 2022-06-14 广州华欣电子科技有限公司 一种屏下摄像头组件及其摄像头控制方法、设备及介质

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN109005343A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 控制方法、装置、成像设备、电子设备及可读存储介质
CN112449095A (zh) * 2020-11-12 2021-03-05 Oppo广东移动通信有限公司 图像处理方法和装置、电子设备、可读存储介质
CN113810601B (zh) * 2021-08-12 2022-12-20 荣耀终端有限公司 终端的图像处理方法、装置和终端设备
CN116309504A (zh) * 2023-03-28 2023-06-23 苏州海汰池自动化科技有限公司 一种视觉检测图像采集分析方法
CN118138740B (zh) * 2024-03-11 2024-10-25 杭州非白三维科技有限公司 四目相机手持高精度三维扫描阵列结构、视觉方法和系统
CN118488283B (zh) * 2024-07-09 2025-02-11 广东保伦电子股份有限公司 一种字幕显示方法及系统

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103986875A (zh) * 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 一种图像获取装置、方法、终端及视频获取方法
CN107395898A (zh) * 2017-08-24 2017-11-24 维沃移动通信有限公司 一种拍摄方法及移动终端
CN108270977A (zh) * 2018-03-06 2018-07-10 广东欧珀移动通信有限公司 控制方法及装置、成像设备、计算机设备及可读存储介质
CN109005343A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 控制方法、装置、成像设备、电子设备及可读存储介质

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US10827140B2 (en) * 2016-10-17 2020-11-03 Huawei Technologies Co., Ltd. Photographing method for terminal and terminal
CN108605097B (zh) * 2016-11-03 2020-09-08 华为技术有限公司 光学成像方法及其装置

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN103986875A (zh) * 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 一种图像获取装置、方法、终端及视频获取方法
CN107395898A (zh) * 2017-08-24 2017-11-24 维沃移动通信有限公司 一种拍摄方法及移动终端
CN108270977A (zh) * 2018-03-06 2018-07-10 广东欧珀移动通信有限公司 控制方法及装置、成像设备、计算机设备及可读存储介质
CN109005343A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 控制方法、装置、成像设备、电子设备及可读存储介质

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN114615395A (zh) * 2020-12-04 2022-06-10 中兴通讯股份有限公司 屏下摄像装置、显示设备、屏下摄像装置生成图像的方法
CN114338988A (zh) * 2021-12-29 2022-04-12 Oppo广东移动通信有限公司 图像生成方法、装置、电子设备和计算机可读存储介质
CN114630055A (zh) * 2022-03-21 2022-06-14 广州华欣电子科技有限公司 一种屏下摄像头组件及其摄像头控制方法、设备及介质
CN114630055B (zh) * 2022-03-21 2023-08-01 广州华欣电子科技有限公司 一种应用于屏下摄像头组件的摄像头控制方法、设备及介质

Also Published As

Publication number Publication date
CN109005343A (zh) 2018-12-14

Similar Documents

Publication Publication Date Title
JP6911202B2 (ja) 撮像制御方法および撮像装置
CN108322669B (zh) 图像获取方法及装置、成像装置和可读存储介质
CN108989700B (zh) 成像控制方法、装置、电子设备以及计算机可读存储介质
WO2020029732A1 (zh) 全景拍摄方法、装置和成像设备
WO2020029679A1 (zh) 控制方法、装置、成像设备、电子设备及可读存储介质
CN108632537B (zh) 控制方法及装置、成像设备、计算机设备及可读存储介质
CN108712608B (zh) 终端设备拍摄方法和装置
WO2020057199A1 (zh) 成像方法、装置和电子设备
CN109005364A (zh) 成像控制方法、装置、电子设备以及计算机可读存储介质
WO2020038072A1 (zh) 曝光控制方法、装置和电子设备
CN108683862A (zh) 成像控制方法、装置、电子设备及计算机可读存储介质
CN107948519A (zh) 图像处理方法、装置及设备
CN110072052A (zh) 基于多帧图像的图像处理方法、装置、电子设备
CN108833802B (zh) 曝光控制方法、装置和电子设备
CN108683863B (zh) 成像控制方法、装置、电子设备以及可读存储介质
CN108683861A (zh) 拍摄曝光控制方法、装置、成像设备和电子设备
CN108270977A (zh) 控制方法及装置、成像设备、计算机设备及可读存储介质
CN108198152A (zh) 图像处理方法和装置、电子设备、计算机可读存储介质
US11601600B2 (en) Control method and electronic device
CN108156369A (zh) 图像处理方法和装置
WO2020034702A1 (zh) 控制方法、装置、电子设备和计算机可读存储介质
CN109040607A (zh) 成像控制方法、装置、电子设备和计算机可读存储介质
CN108900785A (zh) 曝光控制方法、装置和电子设备
CN108513062B (zh) 终端的控制方法及装置、可读存储介质和计算机设备
CN109005363B (zh) 成像控制方法、装置、电子设备以及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19848362

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19848362

Country of ref document: EP

Kind code of ref document: A1