WO2020029679A1 - Control method and apparatus, imaging device, electronic device, and readable storage medium - Google Patents
Control method and apparatus, imaging device, electronic device, and readable storage medium
- Publication number
- WO2020029679A1 (PCT application PCT/CN2019/090629, CN2019090629W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- black
- exposure
- white
- color
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
Definitions
- the present disclosure relates to the technical field of electronic devices, and in particular, to a control method, an apparatus, an imaging device, an electronic device, and a readable storage medium.
- In particular, the edges of moving objects should be prevented from being dislocated or blurred.
- the present disclosure proposes a control method, device, imaging device, electronic device, and readable storage medium, so as to simultaneously retain the high dynamic range of the target image, as well as the edges and details of the target image, thereby avoiding misalignment or blurring of the edges of the moving object.
- An embodiment of one aspect of the present disclosure provides a control method applied to an imaging device, where the imaging device includes a color camera and a black and white camera, and the control method includes: controlling the color camera to obtain a color image with a high dynamic range; controlling the black and white camera to obtain a black and white image with a low dynamic range; and synthesizing the black and white image and the color image to obtain a target image.
- the control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- An embodiment of still another aspect of the present disclosure provides a control device applied to an imaging device, where the imaging device includes a color camera and a black and white camera, and the control device includes:
- a first control module configured to control the color camera to capture a color image with a high dynamic range
- a second control module configured to control the black and white camera to obtain a black and white image with a low dynamic range
- a synthesis module is configured to synthesize the black and white image and the color image to obtain a target image.
- the control device first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- An embodiment of another aspect of the present disclosure provides an imaging device.
- the imaging device includes a color camera and a black and white camera.
- the imaging device further includes a processor.
- the processor is configured to: control the color camera to obtain a color image with a high dynamic range; control the black and white camera to obtain a black and white image with a low dynamic range; and synthesize the black and white image and the color image to obtain a target image.
- the imaging device of the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- An embodiment of another aspect of the present disclosure provides an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor.
- When the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
- An embodiment of another aspect of the present disclosure provides a computer-readable storage medium having a computer program stored thereon, which is characterized in that when the program is executed by a processor, the control method according to the foregoing embodiment of the present disclosure is implemented.
- FIG. 1 is a schematic flowchart of a control method according to Embodiment 1 of the present disclosure
- FIG. 2 is a schematic flowchart of a control method provided in Embodiment 2 of the present disclosure
- FIG. 3 is a schematic flowchart of a control method according to a third embodiment of the present disclosure.
- FIG. 4 is a schematic diagram of an image captured by a color camera according to an embodiment of the present disclosure
- FIG. 5 is a schematic diagram of a color image in an embodiment of the present disclosure.
- FIG. 6 is a schematic diagram of a black and white image in an embodiment of the present disclosure.
- FIG. 7 is a schematic diagram of an adjusted black and white image in an embodiment of the present disclosure.
- FIG. 8 is a schematic flowchart of a control method according to a fourth embodiment of the present disclosure.
- FIG. 9 is a schematic flowchart of a control method provided in Embodiment 5 of the present disclosure.
- FIG. 10 is a schematic flowchart of a control method provided in Embodiment 6 of the present disclosure.
- FIG. 11 is a schematic structural diagram of a control device according to a seventh embodiment of the present disclosure.
- FIG. 12 is a schematic structural diagram of a control device according to an eighth embodiment of the present disclosure.
- FIG. 13 is a schematic structural diagram of an electronic device according to a ninth embodiment of the present disclosure.
- FIG. 14 is a schematic block diagram of an electronic device according to some embodiments of the present disclosure.
- FIG. 15 is a schematic block diagram of an image processing circuit according to some embodiments of the present disclosure.
- the present disclosure mainly aims to provide a control method that addresses the technical problem in the prior art that, when two frames of differently exposed images are synthesized by image synthesis technology, details of the synthesized image may be lost or blurred.
- the control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- FIG. 1 is a schematic flowchart of a control method according to a first embodiment of the present disclosure.
- the control method in the embodiment of the present disclosure is applied to an imaging device.
- the imaging device includes a color camera and a black and white camera.
- control method includes the following steps:
- step 101 a color camera is controlled to obtain a color image with a high dynamic range.
- In the present disclosure, in order to reduce noise in the image and thereby improve its sharpness and contrast, the color camera can be controlled to obtain a color image with a high dynamic range, which reduces noise, clearly displays the current shooting scene, and improves the imaging effect and imaging quality.
- step 102 a black and white camera is controlled to obtain a black and white image with a low dynamic range.
- the black and white camera can be controlled to obtain a black and white image with a low dynamic range, so that the black and white image can better retain the edges and details of the image.
- step 103 a black and white image and a color image are synthesized to obtain a target image.
- a black and white image and a color image may be synthesized to obtain a target image. Therefore, the high dynamic range of the target image, as well as the edges and details of the target image can be retained at the same time, thereby avoiding misalignment or blurring of the edges of the moving object, and improving the user's shooting experience.
- Optionally, each pixel in the color image may be determined, and the pixel in the black and white image corresponding to each pixel in the color image may be determined. Then, the color of each pixel in the color image is mapped to the corresponding pixel in the black and white image to obtain the target image.
- Alternatively, each pixel in the color image may be determined, and the pixel in the black and white image corresponding to each pixel in the color image may be determined. Then, using the color image as the main body, the light intensity of each pixel in the black and white image is compensated to the corresponding pixel in the color image to obtain the target image.
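- For illustration of the light intensity compensation approach, a minimal sketch in Python/OpenCV is shown below; the luminance blending in YCrCb space and the weight `alpha` are assumptions, not the exact implementation of the disclosure, and both images are assumed to be the same size and already pixel-aligned.

```python
import numpy as np
import cv2  # OpenCV, used here only for color-space conversion


def fuse_luminance(color_bgr: np.ndarray, mono: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Compensate the luminance of the HDR color image with a registered mono image.

    `color_bgr` is the high-dynamic-range color image (uint8, HxWx3), `mono` is the
    low-dynamic-range black and white image (uint8, HxW), and `alpha` is a
    hypothetical blending weight.
    """
    ycrcb = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    y = ycrcb[:, :, 0]
    # Blend the color image's luminance with the mono image's edge/detail information.
    ycrcb[:, :, 0] = np.clip((1.0 - alpha) * y + alpha * mono.astype(np.float32), 0, 255)
    return cv2.cvtColor(ycrcb.astype(np.uint8), cv2.COLOR_YCrCb2BGR)
```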
- the control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image with a low dynamic range, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- the angle of view of the black and white image may be adjusted according to the angle of view difference between the color camera and the black and white camera. The above process is described in detail below with reference to FIG. 2.
- FIG. 2 is a schematic flowchart of a control method provided in Embodiment 2 of the present disclosure.
- control method may include the following steps:
- step 201 a color camera is controlled to obtain a color image with a high dynamic range.
- step 202 a black and white camera is controlled to obtain a black and white image with a low dynamic range.
- For the execution process of steps 201 to 202, refer to the execution process of steps 101 to 102 in the foregoing embodiment, and details are not described herein again.
- Step 203 Adjust the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera.
- the angle of view of the black and white image may be adjusted according to the angle of view difference between the color camera and the black and white camera.
- the viewing angle adjustment may be performed only for an image area with the same screen content in a black and white image and a color image.
- an image area with the same screen content as that of the color image may be retained from the black and white image, and then the pixels in the image area may be adjusted to obtain an adjusted black and white image, wherein the pixels in the adjusted black and white image correspond to the pixels in the color image one-to-one.
- Optionally, image recognition technology can be used to retain, from the black and white image and according to the difference in viewing angle, the image area whose screen content is the same as that of the color image.
- Specifically, feature extraction can be performed on the screen content in the black and white image and on the screen content in the color image according to the difference in viewing angle, and the extracted features are then compared to determine the image area in which the screen content of the black and white image is the same as that of the color image. The pixels in this image area are then adjusted so that the angle of view and position of the pixels in the adjusted black and white image correspond one-to-one with the pixels in the color image.
- adjusting the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera can improve the matching between the adjusted black and white image and the color image, thereby improving the quality of subsequent composite images.
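- A sketch of one possible way to perform this viewing-angle adjustment is shown below; the feature detector (ORB), the homography model, and the function name are assumptions, since the disclosure only requires that the shared screen content be retained and its pixels brought into one-to-one correspondence with the color image.

```python
import numpy as np
import cv2


def align_mono_to_color(mono: np.ndarray, color_bgr: np.ndarray) -> np.ndarray:
    """Warp the black and white image so its pixels correspond one-to-one with the color image."""
    color_gray = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(2000)
    kp_m, des_m = orb.detectAndCompute(mono, None)
    kp_c, des_c = orb.detectAndCompute(color_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_m, des_c)
    matches = sorted(matches, key=lambda m: m.distance)[:500]
    src = np.float32([kp_m[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_c[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = color_gray.shape
    # Content outside the shared field of view falls outside the output frame and is dropped.
    return cv2.warpPerspective(mono, H, (w, h))
```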
- step 204 the adjusted black and white image and a color image are synthesized to obtain a target image.
- the adjusted black and white image and the color image may be synthesized to obtain a target image. Therefore, the high dynamic range of the target image, as well as the edges and details of the target image can be retained at the same time, thereby avoiding misalignment or blurring of the edges of the moving object, and improving the user's shooting experience.
- each pixel in the color image can be determined, and the pixel corresponding to each pixel in the color image in the adjusted black and white image is determined.
- the color of each pixel in the color image is mapped to the corresponding pixel in the adjusted black and white image to obtain the target image.
- Alternatively, each pixel in the color image may be determined, and the pixels in the adjusted black and white image corresponding to each pixel in the color image may be determined; then, using the color image as the main body, the light intensity of each pixel in the adjusted black and white image is compensated to the corresponding pixel in the color image to obtain the target image.
- the control method in the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling a color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black-and-white camera to obtain a black-and-white image with a low dynamic range, the edges and details of the image can be better preserved. Then, by adjusting the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera, the matching between the adjusted black and white image and the color image can be improved, thereby improving the quality of subsequent composite images.
- the target image is obtained by combining the adjusted black and white image and the color image, which can simultaneously retain the high dynamic range of the target image as well as the edges and details of the target image, so as to avoid misalignment or blurring of the edges of moving objects and improve the user's shooting experience.
- FIG. 3 is a schematic flowchart of a control method provided in Embodiment 3 of the present disclosure.
- control method may include the following steps:
- Step 301 Control a color camera to shoot according to the long exposure time, the short exposure time, and the middle exposure time to obtain a color image.
- the long exposure time is longer than the middle exposure time, and the middle exposure time is longer than the short exposure time, that is, long exposure time > middle exposure time > short exposure time.
- the long exposure duration, short exposure duration, and medium exposure duration can be preset in a built-in program of the electronic device, or can be set by a user, which is not limited.
- the color camera can be controlled to shoot with the long exposure time, the short exposure time, and the medium exposure time, respectively, to obtain at least one long exposure image, at least one middle exposure image, and at least one short exposure image. For example, see FIG. 4, which is a schematic diagram of images captured by a color camera according to an embodiment of the present disclosure.
- the color camera collects one frame of long exposure image, one frame of middle exposure image, and one frame of short exposure image, respectively.
- a color image can be obtained according to at least two frames of the acquired images. For example, three frames of images in FIG. 4 may be synthesized, and the obtained color image may be as shown in FIG. 5.
- Step 302 Control the black-and-white camera to shoot with a medium exposure time to obtain a black-and-white image with a low dynamic range.
- the black and white camera may be controlled to shoot with the medium exposure time to obtain a black and white image with a low dynamic range.
- a black-and-white image obtained by a black-and-white camera may be shown in FIG. 6.
- step 303 an image area with the same screen content as that of the color image is retained from the black and white image according to the difference in viewing angle.
- the viewing angle adjustment can be performed only for the image area with the same screen content in the black and white image and the color image. Specifically, an image area with the same screen content as that of the color image may be retained from the black and white image according to a difference in the viewing angle between the color camera and the black and white camera.
- image recognition technology can be used to retain the image area with the same screen content as that of the color image from the black and white image according to the difference in viewing angle.
- Specifically, feature extraction can be performed on the screen content in the black and white image and on the screen content in the color image according to the difference in viewing angle, and the extracted features are then compared, so that the image region in which the screen content of the black and white image is the same as that of the color image can be determined.
- Step 304 Adjust the pixels in the image area to obtain an adjusted black-and-white image.
- the pixels in the adjusted black-and-white image correspond to the pixels in the color image on a one-to-one basis.
- the pixels in the image area may be adjusted so that the angles of view and positions of the pixels in the adjusted black and white image correspond to the pixels in the color image on a one-to-one basis.
- FIG. 7 is a schematic diagram of an adjusted black and white image in an embodiment of the present disclosure. It can be seen that the adjusted black and white image matches the color image more closely.
- Step 305 Combine the adjusted black and white image and the color image to obtain a target image.
- step 305 For the execution process of step 305, refer to the execution process of step 204 in the foregoing embodiment, and details are not described herein.
- In the control method of the embodiment of the present disclosure, by first capturing a color image with a high dynamic range through the color camera, noise can be reduced, the current shooting scene can be clearly displayed, and the imaging effect and imaging quality can be improved.
- the black and white image with low dynamic range is then captured by the black and white camera, which can better preserve the edges and details of the image.
- By adjusting the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera, the matching between the adjusted black and white image and the color image can be improved, thereby improving the quality of the subsequent composite image.
- the target image is obtained by combining the adjusted black and white image and the color image, which can simultaneously retain the high dynamic range of the target image as well as the edges and details of the target image, so as to avoid misalignment or blurring of the edges of moving objects and improve the user's shooting experience.
- the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels covered by the same color filter; the multiple photosensitive pixels are used to output original pixel information and include at least one long exposure pixel, at least one medium exposure pixel, and at least one short exposure pixel.
- step 301 may specifically include the following sub-steps:
- Step 401 Control the pixel unit array to output multiple pieces of original pixel information under different exposure times.
- each photosensitive pixel unit in the pixel unit array includes at least one long exposure pixel, at least one medium exposure pixel, and at least one short exposure pixel. A long exposure pixel is a photosensitive pixel whose corresponding exposure time is the long exposure time, a medium exposure pixel is a photosensitive pixel whose corresponding exposure time is the medium exposure time, and a short exposure pixel is a photosensitive pixel whose corresponding exposure time is the short exposure time, where long exposure time > medium exposure time > short exposure time; that is, the long exposure time of the long exposure pixels is greater than the medium exposure time of the medium exposure pixels, and the medium exposure time of the medium exposure pixels is greater than the short exposure time of the short exposure pixels.
- the long exposure pixels, the middle exposure pixels, and the short exposure pixels are simultaneously exposed.
- the synchronous exposure means that the exposure duration of the middle exposure pixels and the short exposure pixels is within the exposure duration of the long exposure pixels.
- For example, the long exposure pixels can be controlled to start exposure first, and during the exposure of the long exposure pixels the exposure of the middle exposure pixels and the short exposure pixels can be started, with the exposure cut-off times of the middle exposure pixels and the short exposure pixels being the same as, or earlier than, the exposure cut-off time of the long exposure pixels. Alternatively, the long exposure pixels, the middle exposure pixels, and the short exposure pixels are controlled to start exposure at the same time, that is, their exposure start times are the same. In this way, there is no need to control the pixel unit array to perform the long exposure, the medium exposure, and the short exposure in sequence, which can reduce the shooting time of the color image.
- the imaging device first controls the synchronous exposure of the long exposure pixel, the middle exposure pixel, and the short exposure pixel in each photosensitive pixel unit in the pixel unit array.
- the exposure time corresponding to the long exposure pixels is the initial long exposure time, the exposure time corresponding to the middle exposure pixels is the initial medium exposure time, and the exposure time corresponding to the short exposure pixels is the initial short exposure time.
- the initial long exposure time, the initial medium exposure time, and the initial short exposure time are all preset. After the exposure is over, each photosensitive pixel unit in the pixel unit array will output multiple original pixel information at different exposure times.
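- The sketch below only illustrates the synchronized exposure constraint described above, namely that the middle and short exposure windows fall within the long exposure window; the centering of the windows in the staggered-start case is a hypothetical choice, not a requirement of the disclosure.

```python
def exposure_windows(t_long: float, t_mid: float, t_short: float, simultaneous_start: bool = True):
    """Return (start, end) times for the long/middle/short exposure pixels of one capture."""
    assert t_long > t_mid > t_short, "synchronized exposure assumes long > middle > short"
    if simultaneous_start:
        # All three start together; the shorter exposures simply end earlier.
        return {"long": (0.0, t_long), "mid": (0.0, t_mid), "short": (0.0, t_short)}
    # Long exposure starts first; the middle and short windows are nested inside it
    # so that their cut-off times are no later than the long exposure cut-off time.
    return {
        "long": (0.0, t_long),
        "mid": ((t_long - t_mid) / 2.0, (t_long + t_mid) / 2.0),
        "short": ((t_long - t_short) / 2.0, (t_long + t_short) / 2.0),
    }
```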
- Step 402 Calculate the combined pixel information according to the original pixel information with the same exposure time in the same photosensitive pixel unit.
- Step 403 Output a color image according to the merged pixel information.
- For example, when each photosensitive pixel unit includes one long exposure pixel, two middle exposure pixels, and one short exposure pixel, the original pixel information of the only long exposure pixel is the merged pixel information of the long exposure, the sum of the original pixel information of the two middle exposure pixels is the merged pixel information of the middle exposure, and the original pixel information of the only short exposure pixel is the merged pixel information of the short exposure.
- When each photosensitive pixel unit includes two long exposure pixels, four middle exposure pixels, and two short exposure pixels, the sum of the original pixel information of the two long exposure pixels is the merged pixel information of the long exposure, the sum of the original pixel information of the four middle exposure pixels is the merged pixel information of the middle exposure, and the sum of the original pixel information of the two short exposure pixels is the merged pixel information of the short exposure.
- multiple long-exposure combined pixel information, multiple medium-exposure combined pixel information, and multiple short-exposure combined pixel information of the entire pixel unit array can be obtained.
- a long-exposure sub-image is calculated by interpolation based on a plurality of long-exposure merged pixel information
- a mid-exposure sub-image is calculated by interpolation based on a plurality of mid-exposure merged pixel information
- a short exposure sub-image is calculated by interpolation based on a plurality of short-exposure merged pixel information.
- the long exposure sub-image, the middle exposure sub-image, and the short exposure sub-image are processed by fusion to obtain a high dynamic range color image.
- It should be noted that the long exposure sub-image, the middle exposure sub-image, and the short exposure sub-image are not three conventional frame images; each is the image portion formed by the regions corresponding to the long, medium, and short exposure pixels, respectively, in the same frame of image.
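- A simplified sketch of this merging, sub-image construction, and fusion is given below; the 2x2 unit layout, the use of the mean (rather than the sum) for the two middle-exposure pixels, and the normalization and weighting scheme are assumptions rather than the exact method of the disclosure, and the interpolation to full resolution is omitted.

```python
import numpy as np

# Assumed 2x2 layout of one photosensitive pixel unit:
#   [[long,  mid ],
#    [mid,   short]]
def merge_and_fuse(raw: np.ndarray, t_long: float, t_mid: float, t_short: float) -> np.ndarray:
    """Build long/middle/short sub-images from the raw pixel unit array and fuse them.

    `raw` has shape (2H, 2W); each 2x2 block is one photosensitive pixel unit with the
    layout shown above. The result has one HDR value per photosensitive pixel unit.
    """
    long_sub = raw[0::2, 0::2].astype(np.float32)
    mid_sub = 0.5 * (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2].astype(np.float32))
    short_sub = raw[1::2, 1::2].astype(np.float32)

    # Normalize each sub-image to a common (middle-exposure) brightness reference.
    subs = [long_sub * (t_mid / t_long), mid_sub, short_sub * (t_mid / t_short)]
    # Weight down saturated or underexposed samples (hypothetical triangular weighting).
    weights = [1.0 - 2.0 * np.abs(s / 255.0 - 0.5) + 1e-3 for s in (long_sub, mid_sub, short_sub)]
    return sum(w * s for w, s in zip(weights, subs)) / sum(weights)
```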
- the original pixel information of the short exposure pixel and the original pixel information of the middle exposure pixel may be superimposed on the original pixel information of the long exposure pixel based on the original pixel information output by the long exposure pixel.
- three kinds of original pixel information with different exposure times can be given different weights respectively.
- Specifically, the original pixel information corresponding to each exposure time is multiplied by its weight, and the three weighted pieces of original pixel information are then added together as the synthesized pixel information of one photosensitive pixel unit.
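- For example, assuming hypothetical weights of 0.5, 0.3, and 0.2 and original pixel information of 80, 150, and 60 for the long, middle, and short exposure pixels of one photosensitive pixel unit, the synthesized pixel information of that unit would be 0.5 × 80 + 0.3 × 150 + 0.2 × 60 = 97 (the weights and values here are illustrative only and are not taken from the disclosure).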
- Optionally, a long exposure histogram can first be calculated based on the original pixel information output by the long exposure pixels, and a short exposure histogram can be calculated based on the original pixel information output by the short exposure pixels; the initial long exposure time is then corrected based on the long exposure histogram to obtain a corrected long exposure time, and the initial short exposure time is corrected based on the short exposure histogram to obtain a corrected short exposure time.
- Then, the long exposure pixels, the middle exposure pixels, and the short exposure pixels are controlled to perform synchronized exposure according to the corrected long exposure time, the initial medium exposure time, and the corrected short exposure time, respectively.
- It should be noted that the correction is not completed in a single step; the imaging device needs to perform multiple synchronized long, medium, and short exposures. After each synchronized exposure, the imaging device continues to correct the long exposure time and the short exposure time according to the newly generated long exposure histogram and short exposure histogram, and uses the corrected long exposure time, the corrected short exposure time, and the original medium exposure time to perform the next synchronized exposure, again obtaining a long exposure histogram and a short exposure histogram. This cycle continues until there are no underexposed areas in the image corresponding to the long exposure histogram and no overexposed areas in the image corresponding to the short exposure histogram. At that point, the corrected long exposure time and the corrected short exposure time are the final corrected long exposure time and corrected short exposure time. After the exposure is finished, the color image is calculated according to the outputs of the long exposure pixels, the middle exposure pixels, and the short exposure pixels. The calculation method is the same as that in the previous embodiment and is not repeated here.
- the long exposure histogram may be one or more.
- a long exposure histogram can be generated according to the original pixel information output by all the long exposure pixels.
- When there are multiple long exposure histograms, the long exposure pixels can be divided into areas and a long exposure histogram generated for each area. The purpose of dividing into areas is to improve the accuracy of each correction of the long exposure time and to speed up the correction process of the long exposure time.
- the short exposure histogram may be one or more.
- a short exposure histogram can be generated based on the original pixel information output by all short exposure pixels.
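- The sketch below shows the general shape of this correction loop; the `capture` callback, the 2% tail thresholds (standing in for the underexposed and overexposed areas of the histograms), and the 1.25x adjustment step are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np


def refine_exposures(capture, t_long: float, t_mid: float, t_short: float, max_iters: int = 8):
    """Iteratively correct the long and short exposure times until the histograms are acceptable.

    `capture(t_long, t_mid, t_short)` is assumed to trigger one synchronized exposure and
    return the raw values of the long-exposure and short-exposure pixels.
    """
    for _ in range(max_iters):
        long_px, short_px = capture(t_long, t_mid, t_short)
        underexposed = np.mean(long_px < 8) > 0.02     # dark tail of the long exposure histogram
        overexposed = np.mean(short_px > 247) > 0.02   # bright tail of the short exposure histogram
        if not underexposed and not overexposed:
            break                                      # corrections are final
        if underexposed:
            t_long *= 1.25                             # lengthen the long exposure time
        if overexposed:
            t_short /= 1.25                            # shorten the short exposure time
    return t_long, t_short
```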
- step 402 may specifically include the following sub-steps:
- step 501 in the same photosensitive pixel unit, original pixel information of long exposure pixels, original pixel information of short exposure pixels, or original pixel information of middle exposure pixels is selected.
- That is, in the same photosensitive pixel unit, one of the original pixel information of the long exposure pixels, the original pixel information of the short exposure pixels, or the original pixel information of the middle exposure pixels is selected.
- For example, if a photosensitive pixel unit includes one long exposure pixel, two middle exposure pixels, and one short exposure pixel, the original pixel information of the long exposure pixel is 80, the original pixel information of the two middle exposure pixels is 255, and the original pixel information of the short exposure pixel is 255, then the original pixel information of the long exposure pixel, 80, can be selected.
- Step 502 Calculate the combined pixel information according to the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time.
- the merged pixel information can be calculated from the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time, which can expand the dynamic range and obtain a high dynamic range image, thereby improving the imaging effect of the color image.
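- One plausible reading of this calculation is sketched below; the disclosure does not give the exact formula, so the scaling by the exposure ratio and the choice of reference exposure are assumptions.

```python
def merged_pixel_info(value: float, value_exposure: float, reference_exposure: float) -> float:
    """Scale a selected original pixel value to a common exposure reference.

    Dividing out the exposure ratio maps values captured at different exposure times onto
    one radiometric scale, which is what extends the dynamic range of the merged result.
    """
    return value * (reference_exposure / value_exposure)


# Continuing the example above with an assumed exposure ratio of long:middle:short = 4:2:1,
# the selected long-exposure value 80 expressed at the middle-exposure reference becomes
# merged_pixel_info(80, value_exposure=4, reference_exposure=2) == 40.0
```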
- In some embodiments, the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels that output original pixel information. Referring to FIG. 10, step 301 shown in FIG. 3 may specifically include the following sub-steps:
- Step 601 Control the pixel unit array to perform multiple exposures using different exposure durations.
- the pixel unit array is controlled to perform multiple exposures with different exposure durations.
- the pixel unit array may be controlled to be exposed 3 times with a long exposure duration, a medium exposure duration, and a short exposure duration.
- each exposure time can be preset in a built-in program of the electronic device, or can be set by a user to improve the flexibility and applicability of the control method.
- step 602 at each exposure, the photosensitive pixels of the pixel unit array are controlled to be exposed using the same exposure time, and multiple pieces of original pixel information are output.
- That is, during the same exposure, the exposure time used by all photosensitive pixels in the pixel unit array is the same; for example, during an exposure using the long exposure duration, the photosensitive pixels of the pixel unit array are all exposed with the long exposure duration.
- each photosensitive pixel unit in the pixel unit array will output a plurality of original pixel information at a corresponding exposure time.
- Step 603 Obtain an exposure image based on the multiple pieces of original pixel information obtained by each exposure.
- Since the exposure durations used for different exposures are different, such as the long exposure duration, the medium exposure duration, or the short exposure duration, the original pixel information output by each photosensitive pixel unit may also be different, so that a frame of exposure image can be generated from each exposure, for example, a long exposure image, a medium exposure image, and a short exposure image.
- the exposure time of the pixel unit array for the same frame of exposure image is the same, that is, the exposure time of different photosensitive pixel units of the long exposure image is the same, the exposure time of different photosensitive pixel units of the middle exposure image is the same, and the exposure time of different photosensitive pixel units of the short exposure image is the same.
- Step 604 Combine the exposure images of each frame to obtain a color image.
- the exposure images generated by using different exposure durations can be synthesized to obtain a color image.
- the exposure images generated for different exposure durations may be assigned different weights respectively, and then the exposure images generated using different exposure durations are combined according to the corresponding weights of the exposure images to obtain a color image.
- the weights corresponding to the exposure images generated at different exposure durations can be preset in a built-in program of the electronic device, or can be set by a user, which is not limited.
- a long exposure image, a medium exposure image, and a short exposure image are generated according to the original pixel information output by each photosensitive pixel unit. Then, the long exposure image, the middle exposure image, and the short exposure image are synthesized according to the preset weights corresponding to the long exposure image, the middle exposure image, and the short exposure image, and a color image with a high dynamic range can be obtained.
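- A minimal sketch of this weighted synthesis is given below; the weight values and the use of a single weight per frame (rather than per pixel) are assumptions.

```python
import numpy as np


def synthesize_color_image(long_img: np.ndarray, mid_img: np.ndarray, short_img: np.ndarray,
                           weights=(0.3, 0.4, 0.3)) -> np.ndarray:
    """Synthesize the long/middle/short exposure frames into one color image by weighted averaging."""
    stack = np.stack([long_img, mid_img, short_img]).astype(np.float32)
    w = np.asarray(weights, dtype=np.float32).reshape(-1, *([1] * (stack.ndim - 1)))
    return np.clip((w * stack).sum(axis=0) / w.sum(), 0, 255).astype(np.uint8)
```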
- the above method of synthesizing a high-dynamic-range color image using three frames of exposure images is merely an example.
- the number of exposure images may also be two frames, four frames, five frames, six frames, etc. It is not specifically limited here.
- For example, the pixel unit array can be controlled to perform two exposures using the long exposure time and the short exposure time, or using the middle exposure time and the short exposure time, or using the long exposure time and the middle exposure time.
- the present disclosure also proposes a control device.
- FIG. 11 is a schematic structural diagram of a control device according to a seventh embodiment of the present disclosure.
- the control device 100 includes a first control module 110, a second control module 120, and a synthesis module 130.
- the first control module 110 is configured to control a color camera to obtain a color image with a high dynamic range.
- the first control module 110 is specifically configured to control a color camera to perform shooting according to a long exposure duration, a short exposure duration, and a middle exposure duration to obtain a color image.
- the second control module 120 is configured to control a black-and-white camera to obtain a black-and-white image with a low dynamic range.
- the second control module 120 is specifically configured to control the black and white camera to shoot with a medium exposure time to obtain a black and white image with a low dynamic range.
- a synthesizing module 130 is configured to synthesize a black and white image and a color image to obtain a target image.
- the synthesis module 130 is specifically configured to: determine each pixel in the color image; determine the pixel in the black and white image corresponding to each pixel in the color image; map the color of each pixel in the color image to the corresponding pixel in the black and white image; and use the mapped black and white image as the target image.
- the synthesis module 130 is specifically configured to: determine each pixel in the color image; determine the pixel in the black and white image corresponding to each pixel in the color image; compensate the light intensity of each pixel in the black and white image to the corresponding pixel in the color image; and use the compensated color image as the target image.
- control device 100 may further include:
- the adjusting module 140 is configured to adjust the viewing angle of the black and white image according to the difference in the viewing angle between the color camera and the black and white camera before the combining the black and white image and the color image.
- the field of view of the black and white camera is greater than the field of view of the color camera
- the adjustment module 140 includes:
- the retention submodule 141 is configured to retain an image area with the same screen content as the screen content of the color image from the black and white image according to the difference in viewing angle.
- the adjustment sub-module 142 is configured to adjust the pixels in the image area to obtain an adjusted black and white image; wherein the pixels in the adjusted black and white image correspond to the pixels in the color image one to one.
- the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels covered by the same color filter; the multiple photosensitive pixels are used to output original pixel information and include at least one long exposure pixel, at least one middle exposure pixel, and at least one short exposure pixel.
- the first control module 110 includes:
- a control sub-module 111 is used to control the pixel unit array to output multiple pieces of original pixel information under different exposure times, wherein the long exposure duration of the long exposure pixels is greater than the middle exposure duration of the middle exposure pixels, and the middle exposure duration of the middle exposure pixels is greater than the short exposure duration of the short exposure pixels.
- the calculation sub-module 112 is configured to calculate and obtain the merged pixel information according to the original pixel information with the same exposure time in the same photosensitive pixel unit.
- the processing sub-module 113 is configured to output a color image according to the merged pixel information.
- the calculation submodule 112 includes:
- the selecting unit 1121 is configured to select the original pixel information of the long-exposed pixels, the original pixel information of the short-exposed pixels, or the original pixel information of the middle-exposed pixels in the same photosensitive pixel unit.
- the calculation unit 1122 is configured to calculate and obtain the merged pixel information according to the selected original pixel information and the exposure ratio between the long exposure time, the middle exposure time, and the short exposure time.
- the color camera includes a pixel unit array composed of multiple photosensitive pixel units, and each photosensitive pixel unit includes multiple photosensitive pixels that output original pixel information.
- the control submodule 111 is further configured to control the pixel unit array to perform multiple exposures with different exposure durations and, at each exposure, to control the photosensitive pixels of the pixel unit array to be exposed with the same exposure time and output multiple pieces of original pixel information.
- the calculation sub-module 112 is further configured to obtain a frame of an exposure image according to a plurality of original pixel information obtained by each exposure.
- the processing sub-module 113 is further configured to synthesize the exposure images of each frame to obtain a color image.
- the control device first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- the present disclosure also proposes an imaging apparatus.
- FIG. 13 is a schematic structural diagram of an imaging apparatus according to Embodiment 9 of the present disclosure.
- the imaging device includes a color camera 10 and a black-and-white camera 20.
- the imaging device further includes a processor 30.
- the processor 30 is configured to: control the color camera 10 to obtain a color image with a high dynamic range; control the black and white camera 20 to obtain a black and white image with a low dynamic range; and synthesize the black and white image and the color image to obtain a target image.
- the imaging device of the embodiment of the present disclosure first obtains a color image with a high dynamic range by controlling the color camera, which can reduce noise, clearly display the current shooting scene, and improve the imaging effect and imaging quality. Then, by controlling the black and white camera to obtain a black and white image, the edges and details of the image can be better preserved. Finally, the target image is obtained by synthesizing the black and white image and the color image, which can retain the high dynamic range of the target image and the edges and details of the target image at the same time, thereby avoiding misalignment or blurring of the edges of moving objects and improving the user's shooting experience.
- the present disclosure also provides an electronic device including: a memory, a processor, and a computer program stored on the memory and executable on the processor.
- When the processor executes the program, the control method proposed in the foregoing embodiments of the present disclosure is implemented.
- the present disclosure also proposes a computer-readable storage medium on which a computer program is stored, which is characterized in that when the program is executed by a processor, the control method as proposed in the foregoing embodiment of the present disclosure is implemented.
- the present disclosure further provides an electronic device 200.
- the electronic device 200 includes a memory 50 and a processor 60.
- the memory 50 stores computer-readable instructions. When the computer-readable instructions are executed by the processor 60, the processor 60 is caused to execute the control method of any one of the foregoing embodiments.
- FIG. 14 is a schematic diagram of the internal structure of the electronic device 200 in one embodiment.
- the electronic device 200 includes a processor 60, a memory 50 (for example, a non-volatile storage medium), an internal memory 82, a display screen 83, and an input device 84 connected through a system bus 81.
- the memory 50 of the electronic device 200 stores an operating system and computer-readable instructions.
- the computer-readable instructions can be executed by the processor 60 to implement the control method of the embodiment of the present disclosure.
- the processor 60 is used to provide computing and control capabilities to support the operation of the entire electronic device 200.
- the internal memory 82 of the electronic device 200 provides an environment for the execution of the computer-readable instructions in the memory 50.
- the display screen 83 of the electronic device 200 may be a liquid crystal display or an electronic ink display, and the input device 84 may be a touch layer covering the display screen 83, a button, a trackball, or a touchpad provided on the housing of the electronic device 200, or an external keyboard, trackpad, or mouse.
- the electronic device 200 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (for example, a smart bracelet, a smart watch, a smart helmet, or smart glasses).
- the specific electronic device 200 may include more or fewer components than shown in the figure, or some components may be combined, or have different component arrangements.
- the electronic device 200 includes an image processing circuit 90.
- the image processing circuit 90 may be implemented by hardware and/or software components, and may include various processing units that define an ISP (Image Signal Processing) pipeline.
- FIG. 15 is a schematic diagram of an image processing circuit 90 in one embodiment. As shown in FIG. 15, for convenience of explanation, only various aspects of the image processing technology related to the embodiments of the present disclosure are shown.
- the image processing circuit 90 includes an ISP processor 91 (the ISP processor 91 may be the processor 60) and a control logic 92.
- the image data captured by the camera 93 is first processed by the ISP processor 91.
- the ISP processor 91 analyzes the image data to capture image statistical information that can be used to determine one or more control parameters of the camera 93.
- the camera 93 may include one or more lenses 932 and an image sensor 934.
- the image sensor 934 may include a color filter array (such as a Bayer filter). The image sensor 934 may obtain light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 91.
- the sensor 94 (such as a gyroscope) may provide acquired image processing parameters (such as image stabilization parameters) to the ISP processor 91 based on the interface type of the sensor 94.
- the sensor 94 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the foregoing interfaces.
- the image sensor 934 may also send the original image data to the sensor 94.
- the sensor 94 may provide the original image data to the ISP processor 91 based on the interface type of the sensor 94, or the sensor 94 stores the original image data into the image memory 95.
- the ISP processor 91 processes the original image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 91 may perform one or more image processing operations on the original image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth accuracy.
- the ISP processor 91 may also receive image data from the image memory 95.
- the sensor 94 interface sends the original image data to the image memory 95, and the original image data in the image memory 95 is then provided to the ISP processor 91 for processing.
- the image memory 95 may be the memory 50, a part of the memory 50, a storage device, or an independent dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
- the ISP processor 91 may perform one or more image processing operations, such as time-domain filtering.
- the processed image data may be sent to the image memory 95 for further processing before being displayed.
- the ISP processor 91 receives processing data from the image memory 95, and performs processing on the image data in the original domain and in the RGB and YCbCr color spaces.
- the image data processed by the ISP processor 91 may be output to a display 97 (the display 97 may include a display screen 83) for viewing by a user and / or further processing by a graphics engine or a GPU (Graphics Processing Unit).
- the output of the ISP processor 91 can also be sent to the image memory 95, and the display 97 can read image data from the image memory 95.
- the image memory 95 may be configured to implement one or more frame buffers.
- the output of the ISP processor 91 may be sent to an encoder / decoder 96 to encode / decode image data.
- the encoded image data can be saved and decompressed before being displayed on the display 97.
- the encoder / decoder 96 may be implemented by a CPU or a GPU or a coprocessor.
- the statistical data determined by the ISP processor 91 may be sent to the control logic unit 92.
- the statistical data may include image sensor 934 statistical information such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, and lens 932 shading correction.
- the control logic 92 may include a processing element and / or a microcontroller that executes one or more routines (such as firmware).
- the one or more routines may determine the control parameters of the camera 93 and the control parameters of the ISP processor 91 according to the received statistical data. For example, the control parameters of the camera 93 may include sensor 94 control parameters (such as gain, integration time for exposure control, and anti-shake parameters), camera flash control parameters, lens 932 control parameters (such as focal length for focusing or zooming), or a combination of these parameters.
- the ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 932 shading correction parameters.
- the following are the steps of implementing the control method by using the processor 60 in FIG. 14 or the image processing circuit 90 (specifically, the ISP processor 91) in FIG. 15:
- the color camera is controlled to obtain a color image with a high dynamic range;
- the black and white camera is controlled to obtain a black and white image with a low dynamic range;
- the image area with the same screen content as the color image is retained from the black and white image according to the difference in viewing angle, and the pixels in the image area are adjusted to obtain an adjusted black and white image; and
- the black and white image and the color image are synthesized to obtain a target image.
- first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present disclosure, the meaning of "a plurality” is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
- any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing steps of a custom logic function or process
- the scope of the preferred embodiments of the present disclosure includes additional implementations in which the functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present disclosure belong.
- Logic and/or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system that includes a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device).
- a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
- computer-readable media include the following: electrical connections (electronic devices) with one or more wirings, portable computer disk cartridges (magnetic devices), random access memory (RAM), Read-only memory (ROM), erasable and editable read-only memory (EPROM or flash memory), fiber optic devices, and portable optical disk read-only memory (CDROM).
- the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example, by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it if necessary, and then stored in a computer memory.
- portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof.
- multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
- For example, if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: discrete logic circuits having logic gate circuits for implementing logic functions on data signals, application-specific integrated circuits (ASIC) having suitable combinational logic gate circuits, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and the like.
- a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
- the program can be stored in a computer-readable storage medium.
- When the program is executed, one or a combination of the steps of the method embodiments is included.
- each functional unit in each embodiment of the present disclosure may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
- the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
- the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
Abstract
The present invention relates to a control method and apparatus, an imaging device, an electronic device, and a readable storage medium, the method being applied to an imaging device that comprises a color camera and a black and white camera. The control method consists of: controlling the color camera to capture a color image with a high dynamic range; controlling the black and white camera to capture a black and white image with a low dynamic range; and synthesizing the black and white image and the color image to obtain a target image. Using the method, the high dynamic range of a target image as well as the edges and details of the target image can be retained, which avoids misplacement or blurring of the edge of a moving object, thereby improving the user's image capture experience.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810886728.7A CN109005343A (zh) | 2018-08-06 | 2018-08-06 | 控制方法、装置、成像设备、电子设备及可读存储介质 |
| CN201810886728.7 | 2018-08-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020029679A1 true WO2020029679A1 (fr) | 2020-02-13 |
Family
ID=64595905
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/090629 Ceased WO2020029679A1 (fr) | 2018-08-06 | 2019-06-11 | Procédé et appareil de commande, dispositif d'imagerie, dispositif électronique et support de stockage lisible |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN109005343A (fr) |
| WO (1) | WO2020029679A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114338988A (zh) * | 2021-12-29 | 2022-04-12 | Oppo广东移动通信有限公司 | 图像生成方法、装置、电子设备和计算机可读存储介质 |
| CN114615395A (zh) * | 2020-12-04 | 2022-06-10 | 中兴通讯股份有限公司 | 屏下摄像装置、显示设备、屏下摄像装置生成图像的方法 |
| CN114630055A (zh) * | 2022-03-21 | 2022-06-14 | 广州华欣电子科技有限公司 | 一种屏下摄像头组件及其摄像头控制方法、设备及介质 |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109005343A (zh) * | 2018-08-06 | 2018-12-14 | Oppo广东移动通信有限公司 | 控制方法、装置、成像设备、电子设备及可读存储介质 |
| CN112449095A (zh) * | 2020-11-12 | 2021-03-05 | Oppo广东移动通信有限公司 | 图像处理方法和装置、电子设备、可读存储介质 |
| CN113810601B (zh) * | 2021-08-12 | 2022-12-20 | 荣耀终端有限公司 | 终端的图像处理方法、装置和终端设备 |
| CN116309504A (zh) * | 2023-03-28 | 2023-06-23 | 苏州海汰池自动化科技有限公司 | 一种视觉检测图像采集分析方法 |
| CN118138740B (zh) * | 2024-03-11 | 2024-10-25 | 杭州非白三维科技有限公司 | 四目相机手持高精度三维扫描阵列结构、视觉方法和系统 |
| CN118488283B (zh) * | 2024-07-09 | 2025-02-11 | 广东保伦电子股份有限公司 | 一种字幕显示方法及系统 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103986875A (zh) * | 2014-05-29 | 2014-08-13 | 宇龙计算机通信科技(深圳)有限公司 | 一种图像获取装置、方法、终端及视频获取方法 |
| CN107395898A (zh) * | 2017-08-24 | 2017-11-24 | 维沃移动通信有限公司 | 一种拍摄方法及移动终端 |
| CN108270977A (zh) * | 2018-03-06 | 2018-07-10 | 广东欧珀移动通信有限公司 | 控制方法及装置、成像设备、计算机设备及可读存储介质 |
| CN109005343A (zh) * | 2018-08-06 | 2018-12-14 | Oppo广东移动通信有限公司 | 控制方法、装置、成像设备、电子设备及可读存储介质 |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10827140B2 (en) * | 2016-10-17 | 2020-11-03 | Huawei Technologies Co., Ltd. | Photographing method for terminal and terminal |
| CN108605097B (zh) * | 2016-11-03 | 2020-09-08 | 华为技术有限公司 | 光学成像方法及其装置 |
- 2018-08-06: application CN201810886728.7A filed in China; published as CN109005343A (status: pending).
- 2019-06-11: PCT application PCT/CN2019/090629 filed; published as WO2020029679A1 (status: ceased).
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103986875A (zh) * | 2014-05-29 | 2014-08-13 | 宇龙计算机通信科技(深圳)有限公司 | 一种图像获取装置、方法、终端及视频获取方法 |
| CN107395898A (zh) * | 2017-08-24 | 2017-11-24 | 维沃移动通信有限公司 | 一种拍摄方法及移动终端 |
| CN108270977A (zh) * | 2018-03-06 | 2018-07-10 | 广东欧珀移动通信有限公司 | 控制方法及装置、成像设备、计算机设备及可读存储介质 |
| CN109005343A (zh) * | 2018-08-06 | 2018-12-14 | Oppo广东移动通信有限公司 | 控制方法、装置、成像设备、电子设备及可读存储介质 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114615395A (zh) * | 2020-12-04 | 2022-06-10 | 中兴通讯股份有限公司 | 屏下摄像装置、显示设备、屏下摄像装置生成图像的方法 |
| CN114338988A (zh) * | 2021-12-29 | 2022-04-12 | Oppo广东移动通信有限公司 | 图像生成方法、装置、电子设备和计算机可读存储介质 |
| CN114630055A (zh) * | 2022-03-21 | 2022-06-14 | 广州华欣电子科技有限公司 | 一种屏下摄像头组件及其摄像头控制方法、设备及介质 |
| CN114630055B (zh) * | 2022-03-21 | 2023-08-01 | 广州华欣电子科技有限公司 | 一种应用于屏下摄像头组件的摄像头控制方法、设备及介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN109005343A (zh) | 2018-12-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6911202B2 (ja) | 撮像制御方法および撮像装置 | |
| CN108322669B (zh) | 图像获取方法及装置、成像装置和可读存储介质 | |
| CN108989700B (zh) | 成像控制方法、装置、电子设备以及计算机可读存储介质 | |
| WO2020029732A1 (fr) | Procédé et appareil de photographie panoramique, et dispositif d'imagerie | |
| WO2020029679A1 (fr) | Procédé et appareil de commande, dispositif d'imagerie, dispositif électronique et support de stockage lisible | |
| CN108632537B (zh) | 控制方法及装置、成像设备、计算机设备及可读存储介质 | |
| CN108712608B (zh) | 终端设备拍摄方法和装置 | |
| WO2020057199A1 (fr) | Procédé et dispositif d'imagerie, et dispositif électronique | |
| CN109005364A (zh) | 成像控制方法、装置、电子设备以及计算机可读存储介质 | |
| WO2020038072A1 (fr) | Procédé et dispositif de contrôle d'exposition, et dispositif électronique | |
| CN108683862A (zh) | 成像控制方法、装置、电子设备及计算机可读存储介质 | |
| CN107948519A (zh) | 图像处理方法、装置及设备 | |
| CN110072052A (zh) | 基于多帧图像的图像处理方法、装置、电子设备 | |
| CN108833802B (zh) | 曝光控制方法、装置和电子设备 | |
| CN108683863B (zh) | 成像控制方法、装置、电子设备以及可读存储介质 | |
| CN108683861A (zh) | 拍摄曝光控制方法、装置、成像设备和电子设备 | |
| CN108270977A (zh) | 控制方法及装置、成像设备、计算机设备及可读存储介质 | |
| CN108198152A (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
| US11601600B2 (en) | Control method and electronic device | |
| CN108156369A (zh) | 图像处理方法和装置 | |
| WO2020034702A1 (fr) | Procédé de commande, dispositif, équipement électronique et support d'informations lisible par ordinateur | |
| CN109040607A (zh) | 成像控制方法、装置、电子设备和计算机可读存储介质 | |
| CN108900785A (zh) | 曝光控制方法、装置和电子设备 | |
| CN108513062B (zh) | 终端的控制方法及装置、可读存储介质和计算机设备 | |
| CN109005363B (zh) | 成像控制方法、装置、电子设备以及存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19848362; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19848362; Country of ref document: EP; Kind code of ref document: A1 |