
WO2020060081A1 - Electronic device for driving a plurality of cameras on the basis of external illuminance - Google Patents

Electronic device for driving a plurality of cameras on the basis of external illuminance

Info

Publication number
WO2020060081A1
WO2020060081A1 (PCT application PCT/KR2019/011494)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
time
illuminance
image
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2019/011494
Other languages
English (en)
Korean (ko)
Inventor
김학준
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of WO2020060081A1 publication Critical patent/WO2020060081A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/34Microprocessors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Definitions

  • Digital cameras have a problem of acquiring an image with a clear boundary or a very dark image due to movement of a camera or a subject in a low-light environment. Therefore, an electronic device equipped with a digital camera increases the exposure value of a digital camera in a low-light environment, or improves the image quality of a low-light image by applying a brightness / color correction algorithm for an image acquired in a low-light environment.
  • Conventional electronic devices use one camera to improve the image quality of a low-light image.
  • Various embodiments disclosed in this document provide an electronic device capable of improving image quality in a low light environment using a plurality of cameras.
  • An electronic device includes a first camera; a second camera; and a processor, wherein the processor is configured to: acquire an image using the first camera; while acquiring an image using the first camera, check, using the first camera, a time during which the illuminance outside the electronic device falls within a designated illuminance range; activate the second camera based at least on a determination that the time satisfies a first designated time; deactivate the second camera if the time checked after the second camera is activated satisfies a second designated time; and, when the time checked after the second camera is activated satisfies a third designated time, acquire an image using the second camera and deactivate the first camera.
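As an illustration only (not code from the patent; the names and tick thresholds below are assumptions), the claimed two-stage flow could be sketched as:

```python
# Sketch of the claimed two-stage switch (assumed names and thresholds):
# a counter tracks how long external illuminance stays in the designated
# low-light range; the second camera is pre-activated after a first
# threshold and only becomes active (first camera deactivated) after a
# longer second threshold. Leaving the range in between cancels the switch.
FIRST_THRESHOLD = 3    # ticks in low light before pre-activating camera 2
SECOND_THRESHOLD = 6   # ticks in low light before handing over to camera 2

def simulate(low_light_samples):
    active, cam2_ready, ticks = "cam1", False, 0
    for low in low_light_samples:
        if active != "cam1":
            break                          # switch already completed
        if low:
            ticks += 1
            if ticks >= FIRST_THRESHOLD:
                cam2_ready = True          # warm up camera 2 early
            if ticks >= SECOND_THRESHOLD:
                active = "cam2"            # hand over, deactivate camera 1
        else:
            cam2_ready, ticks = False, 0   # left low light: cancel and reset
    return active, cam2_ready

# simulate([True] * 6)           -> ("cam2", True): sustained low light
# simulate([True] * 4 + [False]) -> ("cam1", False): brief dip, cancelled
```

Pre-activating the second camera before the handover gives it time to stabilize, which is the role the description assigns to the gap between the first and second thresholds.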
  • An electronic device includes a housing; a display; a first camera disposed on a first surface of the housing; a second camera disposed on the first surface of the housing so as to be spaced apart from the first camera by a specified distance and having higher image quality than the first camera in a designated illuminance range; and a processor functionally connected to the display, the first camera, and the second camera. The processor outputs an image acquired using the first camera to the display; while outputting the image acquired using the first camera to the display, checks the illuminance outside the electronic device using the first camera; and, if the external illuminance falls within the designated illuminance range, may be set to activate the second camera, output an image acquired using the second camera to the display, and deactivate the first camera.
  • An electronic device includes: a sensor module capable of detecting illuminance outside the electronic device; a first camera; a second camera; and a processor, wherein the processor acquires an image using the first camera and, while acquiring an image using the first camera, checks, using the sensor module, a time during which the illuminance outside the electronic device falls within a designated illuminance range.
  • image quality may be improved in a low-light environment using a plurality of cameras.
  • various effects that can be directly or indirectly identified through this document may be provided.
  • FIG. 1 illustrates the appearance of the first and second surfaces of an electronic device according to an embodiment.
  • FIG. 2 is a block diagram of an electronic device according to an embodiment.
  • FIG. 3 shows the structure of a tetracell image sensor according to an embodiment.
  • FIG. 4 is a schematic flowchart of a camera switching method according to an embodiment.
  • FIG. 5 is a detailed flowchart of a camera switching method according to an embodiment.
  • FIG. 6 is an example of switching from the first camera to the second camera and then back to the first camera, according to an embodiment.
  • FIG. 7 is another example of switching from the first camera to the second camera and then back to the first camera, according to an embodiment.
  • FIG. 8 is an example of a case in which the first camera continues to be used despite a change in external illuminance, according to an embodiment.
  • FIG. 9 is an example of a case in which the second camera continues to be used despite a change in external illuminance, according to an embodiment.
  • FIG. 10 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 1 illustrates the appearance of the first and second surfaces of an electronic device according to an embodiment.
  • the electronic device 100 may include a first camera 110, a second camera 120, and a display 130.
  • the first camera 110 may be provided on the first surface (eg, rear) 102 of the electronic device 100.
  • the first camera 110 may have a first angle of view and a first focal length, and photograph the first shooting range.
  • the second camera 120 may be disposed at a predetermined distance from the first camera 110 on the first surface 102 of the electronic device 100.
  • the second camera 120 has a second angle of view and a second focal length, and may photograph the second shooting range.
  • the second angle of view may be the same as or similar to the first angle of view.
  • the second focal length may be the same or similar to the first focal length.
  • the first shooting range and the second shooting range may overlap at least a designated range (eg, about 70% or more).
  • the first camera 110 and the second camera 120 may be disposed on a first surface on which the display 130 of the electronic device 100 is disposed.
  • the display 130 may be exposed to the outside of the electronic device 100 through at least a portion of the second surface (eg, front) 101 of the electronic device 100.
  • the display 130 may be, for example, a touch screen display.
  • FIG. 2 is a block diagram of an electronic device according to an embodiment.
  • the electronic device 100 may include a first camera 110, a second camera 120, a display 130, a memory 140, and a processor 150.
  • the first camera 110 may have better image quality than the second camera 120 in the first illumination range.
  • the first illuminance range may include, for example, an illuminance range exceeding the critical illuminance.
  • the critical illuminance is the illuminance that serves as the low-light criterion, and may be, for example, about 50 lux.
  • the second camera 120 may have better image quality than the first camera 110 in the second illuminance range.
  • the second illuminance range is a low illuminance range, and may include, for example, an illuminance range below a critical illuminance.
  • the total number of pixels of the first camera 110 may be greater than the total number of pixels of the second camera 120.
  • the first camera 110 may include a 24M pixel image sensor
  • the second camera 120 may include a 16M pixel image sensor.
  • the total number of pixels of the first camera 110 may be less than the total number of pixels of the second camera 120.
  • the first camera 110 may include a 16M pixel image sensor
  • the second camera 120 may include a 24M pixel tetracell image sensor.
  • the display 130 may output (display) an image obtained by using at least one of the first camera 110 or the second camera 120.
  • the display 130 may be, for example, a touch screen display that receives user input.
  • the display 130 may display various contents (for example, text, images, videos, icons, and / or symbols, etc.).
  • the display 130 may include, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, or an electronic paper display.
  • the memory 140 may store, for example, commands or data related to at least one other component of the electronic device 100.
  • the memory 140 may be volatile memory (eg, RAM, etc.), non-volatile memory (eg, ROM, flash memory, etc.) or a combination thereof.
  • the memory 140 may store a first illumination range, a second illumination range, a first threshold time, a second threshold time, a third threshold time, and a fourth threshold time.
  • the memory 140 may store commands for activating or deactivating the first camera 110 and the second camera 120.
  • the processor 150 checks the illuminance outside the electronic device 100 using at least one of the first camera 110 and the second camera 120; activates at least one of the first camera 110 and the second camera 120 depending on whether the external illuminance falls within the first illuminance range or the second illuminance range; acquires an image using the first camera 110 or the second camera 120; and outputs the acquired image to the display 130.
  • the processor 150 may execute operations or data processing related to control and / or communication of at least one other component of the electronic device 100 using instructions stored in the memory 140.
  • the processor 150 may include, for example, at least one of a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor, an application processor, an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), and may have a plurality of cores.
  • the processor 150 may activate the first camera 110, acquire an image using the first camera 110, and output the image to the display 130.
  • the processor 150 activates the first camera 110 regardless of external illuminance, acquires an image using the first camera 110, and outputs the image to the display 130 can do.
  • the processor 150 may check external illuminance using the first camera 110 while acquiring an image using the first camera 110.
  • the processor 150 may check whether the identified external illuminance falls within the first illuminance range or the second illuminance range.
  • the processor 150 may check (measure) a first time (first duration) during which the external illuminance falls within the second illuminance range. For example, while acquiring an image using the first camera 110, the processor 150 may start measuring the first time by driving a timer when it determines that the external illuminance falls within the second illuminance range.
  • the timer may be a component built into the processor 150. Alternatively, the timer may be configured separately from the processor 150.
  • while acquiring an image using the first camera 110, the processor 150 may activate the second camera 120 when the first time in the second illuminance range satisfies a first specified time (time condition). For example, the first time satisfying the first specified time may mean that the first time exceeds a first threshold time.
  • the first threshold time may be determined, for example, based on at least one of the optical characteristics of the first camera 110 or the optical characteristics of the second camera 120.
  • the optical characteristic may include, for example, at least one of an F-number, an exposure value, and an aperture value.
  • while checking the first time during which the external illuminance falls within the second illuminance range, the processor 150 may stop driving the timer and initialize the timer (initialize the first time) when the external illuminance leaves the second illuminance range.
  • the processor 150 may deactivate the second camera 120 if the first time checked after the second camera 120 is activated satisfies a second designated time (time condition). Since the first time checked after the second camera 120 is activated accumulates from before the second camera 120 was activated, it exceeds at least the first threshold time. The first time satisfying the second designated time may mean that the external illuminance leaves the second illuminance range before the first time exceeds a second threshold time.
  • the second threshold time may be set based on at least one of the optical characteristics of the first camera 110 and the optical characteristics of the second camera 120.
  • the second threshold time may be set, based on the optical characteristics of the second camera 120, to exceed the time required for the second camera 120 to stabilize after it is activated.
  • the second threshold time may be set shorter as the difference between the F-number of the first camera 110 and the F-number of the second camera 120 increases.
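This inverse relationship can be illustrated with a hypothetical formula (the function name, base value, and scale value are assumptions for illustration, not values from the document):

```python
# Hypothetical illustration: the second threshold time shrinks as the
# F-number gap between the two cameras grows. base_s and scale_s are
# made-up tuning constants, not values from the document.
def second_threshold_s(f1, f2, base_s=2.0, scale_s=0.5):
    """Return a threshold (seconds) that decreases as |f1 - f2|
    increases, with a small floor so it never reaches zero."""
    return max(0.1, base_s - scale_s * abs(f1 - f2))

# Identical F-numbers keep the full base threshold; a larger gap
# yields a shorter threshold, matching the stated tendency.
```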
  • the processor 150 may acquire an image using the second camera 120 and deactivate the first camera 110.
  • the first time satisfying the third designated time may mean that the first time during which the external illuminance falls within the second illuminance range while acquiring an image using the first camera 110 exceeds the second threshold time.
  • the processor 150 may check the external illuminance using the second camera 120 while acquiring an image using the second camera 120. For example, while acquiring an image using the second camera 120, the processor 150 may check whether the external illuminance falls within the first illuminance range or the second illuminance range.
  • when the processor 150 determines that the external illuminance falls within the first illuminance range, it may check a second time (second duration) during which the illuminance belongs to the first illuminance range. For example, while acquiring an image using the second camera 120, if the processor 150 determines that the external illuminance falls within the first illuminance range, it may start a timer to begin measuring the second time.
  • the processor 150 may activate the first camera 110.
  • the second time may exceed the third threshold time.
  • the third threshold time may be determined, for example, based on at least one of the optical characteristics of the second camera 120 or the optical characteristics of the first camera 110.
  • the processor 150 may deactivate the first camera 110 if the second time checked after activating the first camera 110 satisfies a fifth designated time (time condition). Since the second time checked after the first camera 110 is activated accumulates from before the first camera 110 was activated, it exceeds at least the third threshold time. The second time satisfying the fifth designated time may mean that the external illuminance leaves the first illuminance range before the second time exceeds a fourth threshold time.
  • the fourth threshold time may be set based on the optical characteristics of the first camera 110. For example, the fourth threshold time may be set to exceed the time required for the first camera 110 to stabilize after it is activated.
  • the fourth threshold time may be set shorter as the difference between the F number value of the first camera 110 and the F number value of the second camera 120 increases.
  • the processor 150 may stop driving the timer and initialize the timer (initialize the second time).
  • when the second time satisfies a sixth designated time, the processor 150 may acquire an image using the first camera 110 and deactivate the second camera 120.
  • the fourth threshold time may be set to less than the second threshold time.
  • when a capture request is confirmed while the processor 150 outputs an image acquired using the first camera 110 or the second camera 120 to the display 130, the processor 150 may capture an image using the corresponding camera and store the captured image in the memory 140.
  • the electronic device 100 further includes an illuminance sensor (not shown), and the processor 150 can detect external illuminance using the illuminance sensor.
  • a plurality of illuminance sensors may be provided at a plurality of positions of the electronic device 100 (for example, on the front and rear surfaces of the electronic device 100), and the processor 150 may detect the external illuminance using the illuminance sensor located closer to whichever of the first camera 110 and the second camera 120 is being used to acquire an image.
  • before entering the preview mode, the electronic device 100 may detect the external illuminance using the illuminance sensor, determine, according to the external illuminance, which of the first camera 110 and the second camera 120 is to be used in the preview mode, and activate the determined camera in the preview mode to acquire an image from it.
  • the processor 150 may check the external illuminance, and acquire an image using a camera having higher image quality at the current illuminance among the first camera 110 and the second camera 120. In particular, it is possible to increase user satisfaction with images in a low-light environment in which image quality deteriorates.
  • in addition to the external illuminance, the processor 150 may consider at least one of the optical characteristics of the first camera 110 and the second camera 120, so that when the camera used to acquire an image is switched, a substantial quality improvement is actually obtained from the switch.
  • the electronic device (eg, the electronic device 100 of FIG. 2) includes a first camera (eg, the first camera 110 of FIG. 2); a second camera (eg, the second camera 120 of FIG. 2); and a processor (eg, the processor 150 of FIG. 2). The processor acquires an image using the first camera; while acquiring an image using the first camera, checks, using the first camera, a time during which the illuminance outside the electronic device falls within a specified illuminance range; activates the second camera based at least on a determination that the time satisfies a first specified time; deactivates the second camera if the time checked after the second camera is activated satisfies a second specified time; and, when the time checked after the second camera is activated satisfies a third specified time, may be set to acquire an image using the second camera and deactivate the first camera.
  • the image quality of the image acquired using the second camera may be superior to that of the image acquired using the first camera.
  • the electronic device further includes a display (eg, the display 130), and the processor may be configured to output the image obtained by the first camera or the second camera to the display. have.
  • the designated illuminance range may include an illuminance range below a specified illuminance or an illuminance range exceeding the specified illuminance.
  • the processor may be configured to determine that the time satisfies the first specified time when the time exceeds a first threshold time while acquiring an image using the first camera.
  • while acquiring an image using the first camera, the processor may be set to determine that the time satisfies the second specified time if the illuminance leaves the specified illuminance range in a situation where the time is less than or equal to a second threshold time; the second threshold time exceeds the first threshold time and may be determined based on the optical characteristics of the first camera and the second camera.
  • the processor is set to determine that the time satisfies the third specified time if the time exceeds the second threshold time while acquiring an image using the first camera; the second threshold time exceeds the first threshold time and may be determined based on the optical characteristics of the first camera and the second camera.
  • the second threshold time may be set to exceed the time required for the second camera to stabilize after it is activated.
  • the second threshold time may be set shorter as the difference between the F-number of the first camera and the F-number of the second camera increases.
  • while acquiring an image using the second camera, the processor uses the second camera to check another time during which the external illuminance is outside the specified illuminance range, and may activate the first camera based at least on a determination that the other time satisfies a fourth specified time.
  • the processor deactivates the first camera when the other time checked after the first camera is activated satisfies a fifth specified time, and may acquire an image using the first camera and deactivate the second camera when the other time checked after the first camera is activated satisfies a sixth specified time.
  • the electronic device (eg, 100 in FIG. 2) includes a housing (eg, h100 in FIG. 1); A display (eg, 130 in FIG. 1); A first camera (eg, the first camera 110 of FIG. 1) disposed on a first surface of the housing (eg, the first surface 102 of FIG. 1); A second camera disposed on a first surface of the housing to be spaced apart from the first camera at a specified distance, and having a higher image quality than the first camera in a designated illumination range (eg, the second camera 120 of FIG. 1); And a processor (eg, the processor 150 of FIG. 2) functionally connected to the display, the first camera, and the second camera.
  • the processor outputs an image acquired using the first camera to the display and, while outputting the image acquired using the first camera to the display, checks the illuminance outside the electronic device using the first camera.
  • in the operation of activating the second camera, when the external illuminance falls within the specified illuminance range, the processor checks a first duration during which the external illuminance stays within the specified illuminance range, and may be set to activate the second camera if the first duration exceeds a first threshold time.
  • the processor may be configured to deactivate the second camera when the first duration checked after the second camera is activated exceeds a second threshold time.
  • the second threshold time may exceed the first threshold time and be determined based at least on the optical characteristics of the first camera and the second camera.
  • the processor may be configured to deactivate the second camera when the illuminance is outside the specified illuminance range in a situation in which the first duration exceeds the first threshold time and is less than or equal to the second threshold time.
  • the image quality of the first camera is superior to that of the second camera, and the processor is configured to output the image acquired using the second camera to the display; while doing so, the processor checks the external illuminance using the second camera and, if the external illuminance is outside the specified illuminance range, may be set to activate the first camera, output the image acquired using the first camera to the display, and deactivate the second camera.
  • in the operation of activating the first camera, if the external illuminance is outside the specified illuminance range, the processor checks a second duration during which the external illuminance stays outside the specified illuminance range, and may be set to activate the first camera if the second duration exceeds a third threshold time.
  • the electronic device includes: a sensor module (eg, the sensor module 1076 of FIG. 10) capable of detecting the illuminance outside the electronic device; a first camera (eg, the first camera 110 of FIG. 2); a second camera (eg, the second camera 120 of FIG. 2); and a processor (eg, the processor 150 of FIG. 2). The processor acquires an image using the first camera and, while acquiring an image using the first camera, uses the sensor module to check a time during which the illuminance outside the electronic device falls within a specified illuminance range; activates the second camera based at least on a determination that the time satisfies the first specified time; deactivates the second camera if the time checked after the second camera is activated satisfies the second designated time; and, if the time checked after the second camera is activated satisfies the third designated time, may be set to acquire an image using the second camera and deactivate the first camera.
  • the sensor module includes a plurality of illuminance sensors, and the processor uses the illuminance sensor relatively close to the first camera among the plurality of illuminance sensors while acquiring an image using the first camera. It can be set to detect external illuminance.
  • the designated illuminance range may include an illuminance range below a specified illuminance or an illuminance range exceeding the specified illuminance.
  • FIG. 3 compares the structure of a general image sensor and a tetracell image sensor having the same number of pixels (16 pixels), according to an embodiment.
  • the general image sensor 310 is an image sensor to which one color filter is applied per pixel
  • the tetracell image sensor 320 is an image sensor to which one color filter is applied per 4 pixels
  • using the tetracell image sensor 320, the processor 150 can obtain an image of the same or similar quality as a general image sensor in which each pixel has the size of four pixels. Since an image sensor receives more external light as its pixel size increases, the tetracell image sensor 320 may have better low-light image quality than the general image sensor 310 having the same pixel size.
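The light-gathering advantage of grouping four photosites can be sketched with simple 2x2 binning (illustration only; the helper name is an assumption, and real sensors bin in analog or at readout):

```python
# Sketch of 2x2 pixel binning as in a tetracell sensor: four neighbouring
# same-color photosites are combined into one larger effective pixel, so
# each output pixel gathers roughly 4x the light in low-illuminance scenes.
def bin_2x2(raw):
    """Sum each 2x2 block of an H x W grid (H, W even) into one value."""
    h, w = len(raw), len(raw[0])
    return [[raw[y][x] + raw[y][x + 1] + raw[y + 1][x] + raw[y + 1][x + 1]
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

raw = [[0, 1, 2, 3],
       [4, 5, 6, 7],
       [8, 9, 10, 11],
       [12, 13, 14, 15]]
# bin_2x2(raw) -> [[10, 18], [42, 50]]: a 4x4 grid becomes 2x2,
# with each output value summing one 2x2 block.
```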
  • FIG. 4 is a schematic flowchart of a camera switching method according to an embodiment.
  • the processor 150 may check the time during which the external illuminance falls within the second illuminance range (corresponding to the 'first time' described above).
  • the second illuminance range may include, for example, an illuminance range below a critical illuminance.
  • the critical illuminance is the illuminance that serves as the low-light reference, and may be, for example, about 50 lux.
  • the processor 150 may drive a timer to measure the time during which the external illuminance falls within the second illuminance range.
  • the processor 150 may activate the second camera 120 based at least on the determination that the identified time satisfies the first specified time. For example, when the identified time exceeds the first threshold time, the processor 150 may activate the second camera 120.
  • the processor 150 may deactivate the second camera 120 if the identified time after activating the second camera 120 satisfies the second designated time.
  • the time identified after activating the second camera 120 is an accumulated time from before activating the second camera 120 and may exceed at least the first threshold time.
  • the processor 150 may initialize the timer after deactivating the second camera 120.
  • the processor 150 may acquire an image using the second camera 120 and deactivate the first camera 110.
  • the processor 150 may initialize the timer after deactivating the first camera 110.
  • FIG. 5 is a detailed flowchart of a camera switching method according to an embodiment.
  • while acquiring an image using the first camera 110 (eg, the first camera 110 of FIG. 1), the processor 150 may check the illuminance outside the electronic device 100 using the first camera 110.
  • the processor 150 may output the image acquired using the first camera 110 to the display 130 or store it in the memory 140.
  • the processor 150 may check whether the external illuminance identified while acquiring an image using the first camera 110 is less than or equal to a critical illuminance (eg, about 50 lux).
  • the processor 150 may drive a timer to measure the time during which the external illuminance remains at or below the critical illuminance.
  • the processor 150 may determine whether the time that the external illuminance lasts below the threshold illuminance exceeds the first threshold time using a timer while acquiring an image using the first camera 110.
  • the processor 150 may acquire an image using the first camera 110, and may activate the second camera 120 (eg, the second camera 120 of FIG. 1) when the time during which the external illuminance remains below the critical illuminance exceeds the first threshold time.
  • the processor 150 may determine whether the time during which the external illuminance remains below the critical illuminance exceeds the second threshold time while acquiring an image using the first camera 110.
  • the processor 150 may acquire an image using the second camera 120 and deactivate the first camera 110 when the time during which the external illuminance remains below the critical illuminance exceeds the second threshold time while acquiring an image using the first camera 110. In operation 535, the processor 150 may stop driving the timer and initialize the timer.
  • the processor 150 may monitor whether the external illuminance is below the critical illuminance.
  • the processor 150 may perform operation 520.
  • the processor 150 may stop driving the timer and initialize the timer in operation 545.
  • in operation 555, the processor 150 may monitor whether the external illuminance is below the critical illuminance.
  • the processor 150 may deactivate the second camera 120. In operation 555, the processor 150 may stop driving the timer and initialize the timer.
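The flowchart above can be sketched as a small state machine: a timer accumulates how long the external illuminance has stayed at or below the critical illuminance, the second camera is pre-activated once the first threshold time passes, and image acquisition is handed over (with the timer reset) once the second threshold time passes. The sketch below is illustrative only: the embodiment cites roughly 50 lux for the critical illuminance, but the threshold-time values, the `DualCameraSwitcher` class, and its field names are assumptions, not taken from the patent.

```python
CRITICAL_LUX = 50.0  # critical illuminance (~50 lux in the embodiment)
T1 = 0.5             # first threshold time, seconds (assumed value)
T2 = 1.0             # second threshold time, seconds (assumed value)

class DualCameraSwitcher:
    """Sketch of the low-light switch-over flow for the first camera."""

    def __init__(self):
        self.active = "cam1"   # camera currently supplying images
        self.cam2_on = False   # second camera activated but not yet used
        self.low_since = None  # timer start; None means the timer is idle

    def on_frame(self, lux, now):
        """Handle one illuminance reading measured with the first camera."""
        if self.active != "cam1":
            return
        if lux > CRITICAL_LUX:
            # Illuminance recovered: stop and initialize the timer, and
            # deactivate the pre-activated second camera.
            self.low_since, self.cam2_on = None, False
            return
        if self.low_since is None:
            self.low_since = now       # start timing the low-light period
        elapsed = now - self.low_since
        if elapsed > T1:
            self.cam2_on = True        # pre-activate the second camera
        if elapsed > T2:
            self.active = "cam2"       # hand image acquisition over
            self.low_since = None      # stop and initialize the timer
```

Pre-activating the second camera once the first threshold time passes hides its start-up latency, so the hand-over at the second threshold time is seamless; if the light recovers in between, the pre-activation is simply rolled back.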
  • FIG. 6 is an example of switching from the first camera (eg, the first camera 110 of FIG. 2) to the second camera (eg, the second camera 120 of FIG. 2) and then back to the first camera, according to an embodiment.
  • when the processor 150 (eg, the processor 150 of FIG. 2) enters the preview mode according to a user input, the first camera 110 is activated, and while an image (eg, a preview image or a captured image) is acquired using the first camera 110, the external illuminance may be detected using the first camera 110.
  • the processor 150 has not activated the second camera 120, so the second camera 120 may be in a deactivated state.
  • while acquiring an image using the first camera 110, when the range to which the external illuminance belongs changes from the first illuminance range to the second illuminance range, the processor 150 may measure the time during which the external illuminance belongs to the second illuminance range.
  • the processor 150 may activate (ON) the second camera 120 when the time during which the external illuminance falls within the second illuminance range exceeds the first threshold time T1 while acquiring an image using the first camera 110.
  • the processor 150 may acquire an image using the second camera 120 and deactivate (OFF) the first camera 110 when the time during which the external illuminance belongs to the second illuminance range exceeds the second threshold time T2 while acquiring an image using the first camera 110.
  • the processor 150 may detect external illuminance using the second camera 120 while acquiring an image using the second camera 120. While acquiring an image using the second camera 120, the processor 150 may confirm that the range to which the external illuminance belongs changes from the second illuminance range to the first illuminance range.
  • when the external illuminance falls within the first illuminance range while an image is acquired using the second camera 120, the processor 150 may measure the time during which the external illuminance belongs to the first illuminance range.
  • the processor 150 may activate the first camera 110 when the time during which the external illuminance falls within the first illuminance range exceeds the third threshold time T3 while acquiring an image using the second camera 120.
  • the processor 150 may acquire an image using the first camera 110 and deactivate (OFF) the second camera 120 when the time during which the external illuminance falls within the first illuminance range exceeds the fourth threshold time T4 while acquiring an image using the second camera 120.
  • FIG. 7 is another example of switching from the first camera to the second camera and then back to the first camera, according to an embodiment.
  • when the processor 150 (eg, the processor 150 of FIG. 2) enters the preview mode according to a user input, the first camera 110 is activated, and while an image (eg, a preview image or a captured image) is acquired using the first camera 110, the external illuminance may be detected using the first camera 110.
  • the processor 150 has not activated the second camera 120, so the second camera 120 may be in a deactivated state.
  • the processor 150 may measure the time during which the external illuminance belongs to the second illuminance range while determining that the range to which the external illuminance belongs is the second illuminance range.
  • the processor 150 may activate (ON) the second camera 120 when the time during which the external illuminance falls within the second illuminance range exceeds the first threshold time T1 while acquiring an image using the first camera 110.
  • the processor 150 may acquire an image using the second camera 120 and deactivate (OFF) the first camera 110 when the time during which the external illuminance belongs to the second illuminance range exceeds the second threshold time T2 while acquiring an image using the first camera 110.
  • the processor 150 may detect external illuminance using the second camera 120 while acquiring an image using the second camera 120. While acquiring an image using the second camera 120, the processor 150 may confirm that the range to which the external illuminance belongs changes from the second illuminance range to the first illuminance range.
  • when the external illuminance falls within the first illuminance range while an image is acquired using the second camera 120, the processor 150 may measure the time during which the external illuminance belongs to the first illuminance range.
  • the processor 150 may activate the first camera 110 when the time during which the external illuminance falls within the first illuminance range exceeds the third threshold time T3 while acquiring an image using the second camera 120.
  • the processor 150 may acquire an image using the first camera 110 and deactivate (OFF) the second camera 120 when the time during which the external illuminance falls within the first illuminance range exceeds the fourth threshold time T4 while acquiring an image using the second camera 120.
  • FIG. 8 is an example of a case in which the first camera continues to be used even when the external illuminance changes, according to an embodiment.
  • while acquiring an image using the first camera 110, the processor 150 may confirm that the range to which the external illuminance belongs changes from the first illuminance range to the second illuminance range.
  • the processor 150 may measure the time during which the external illuminance belongs to the second illuminance range.
  • the processor 150 may activate the second camera 120 when the time during which the external illuminance falls within the second illuminance range exceeds the first threshold time T1 while acquiring an image using the first camera 110.
  • while the processor 150 acquires an image using the first camera 110, the processor 150 may confirm that the external illuminance changes back to the first illuminance range at a point when the time during which the external illuminance falls within the second illuminance range is greater than the first threshold time T1 and less than or equal to the second threshold time T2. In this case, the processor 150 may deactivate the second camera 120 again and acquire an image using the first camera 110.
  • FIG. 9 is an example of a case in which the second camera continues to be used even when the external illuminance changes, according to an embodiment.
  • while acquiring an image using the second camera 120, the processor 150 (eg, the processor 150 of FIG. 2) may confirm that the range to which the external illuminance belongs changes from the second illuminance range to the first illuminance range.
  • the processor 150 may measure the time during which the external illuminance belongs to the first illuminance range. The processor 150 may activate the first camera 110 when the time during which the external illuminance falls within the first illuminance range exceeds the third threshold time T3 while acquiring an image using the second camera 120.
  • the processor 150 may confirm that the external illuminance changes back to the second illuminance range at a point when the time during which the external illuminance falls within the first illuminance range is greater than or equal to the third threshold time T3 and less than or equal to the fourth threshold time T4. In this case, the processor 150 may deactivate the first camera 110 again and acquire an image using the second camera 120.
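The behavior of FIGS. 6 to 9 follows a symmetric rule: whichever camera is active, time spent in the other camera's illuminance range first pre-activates the standby camera (after T1 or T3) and then hands acquisition over (after T2 or T4), while a reversion before the hand-over threshold cancels the switch, as in FIGS. 8 and 9. The threshold values and the `simulate` helper below are illustrative assumptions for a simulation, not taken from the patent.

```python
T1, T2 = 0.5, 1.0  # thresholds while camera 1 is active (assumed values)
T3, T4 = 0.5, 1.0  # thresholds while camera 2 is active (assumed values)

def simulate(samples):
    """samples: list of (timestamp, illuminance_range), where range 1 suits
    camera 1 and range 2 suits camera 2. Returns one tuple per sample:
    (timestamp, active_camera, standby_camera_activated)."""
    active, standby_on, since = 1, False, None
    history = []
    for t, rng in samples:
        if rng != active:  # illuminance matches the *other* camera's range
            if since is None:
                since = t                         # start the timer
            pre, full = (T1, T2) if active == 1 else (T3, T4)
            if t - since > pre:
                standby_on = True                 # pre-activate standby camera
            if t - since > full:
                active = 2 if active == 1 else 1  # hand acquisition over
                standby_on, since = False, None   # and initialize the timer
        else:
            # Reverted before the hand-over threshold (FIGS. 8 and 9):
            # deactivate the standby camera and initialize the timer.
            standby_on, since = False, None
        history.append((t, active, standby_on))
    return history
```

For example, the FIG. 8 timeline (illuminance drops into the second range, then recovers after the first threshold time but before the second) ends with camera 1 still active and the briefly pre-activated camera 2 turned off again.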
  • FIG. 10 is a block diagram of an electronic device 1001 (eg, the electronic device 100 of FIG. 2) in a network environment 1000 according to various embodiments.
  • the electronic device 1001 may communicate with the electronic device 1002 through a first network 1098 (eg, a short-range wireless communication network), or may communicate with the electronic device 1004 or the server 1008 through a second network 1099 (eg, a long-range wireless communication network).
  • the electronic device 1001 may communicate with the electronic device 1004 through the server 1008.
  • the electronic device 1001 may include a processor 1020 (eg, the processor 150 of FIG. 2), a memory 1030 (eg, the memory 140 of FIG. 2), an input device 1050, an audio output device 1055, a display device 1060 (eg, the display 130 of FIG. 2), an audio module 1070, a sensor module 1076, an interface 1077, a haptic module 1079, a camera module 1080 (eg, the first camera 110 and the second camera 120 of FIG. 2), a power management module 1088, a battery 1089, a communication module 1090, a subscriber identification module 1096, or an antenna module 1097.
  • in some embodiments, at least one of these components (eg, the display device 1060 or the camera module 1080) may be omitted, or one or more other components may be added to the electronic device 1001.
  • some of these components may be implemented as one integrated circuit.
  • for example, the sensor module 1076 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while embedded in the display device 1060 (eg, a display).
  • the processor 1020 may execute software (eg, the program 1040) to control at least one other component (eg, a hardware or software component) of the electronic device 1001 connected to the processor 1020, and may perform various data processing or operations. According to one embodiment, as at least part of the data processing or operations, the processor 1020 may load instructions or data received from another component (eg, the sensor module 1076 or the communication module 1090) into the volatile memory 1032, process the instructions or data stored in the volatile memory 1032, and store the resulting data in the nonvolatile memory 1034.
  • the processor 1020 may include a main processor 1021 (eg, a central processing unit or an application processor) and an auxiliary processor 1023 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor. Additionally or alternatively, the auxiliary processor 1023 may be configured to use less power than the main processor 1021, or to be specialized for a specified function. The auxiliary processor 1023 may be implemented separately from, or as part of, the main processor 1021.
  • the auxiliary processor 1023 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 1001 (eg, the display device 1060, the sensor module 1076, or the communication module 1090), on behalf of the main processor 1021 while the main processor 1021 is in an inactive (eg, sleep) state, or together with the main processor 1021 while the main processor 1021 is in an active (eg, application execution) state.
  • according to an embodiment, the auxiliary processor 1023 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 1080 or the communication module 1090).
  • the memory 1030 may store various data used by at least one component of the electronic device 1001 (for example, the processor 1020 or the sensor module 1076).
  • the data may include, for example, software (eg, the program 1040) and input data or output data for commands related thereto.
  • the memory 1030 may include a volatile memory 1032 or a nonvolatile memory 1034.
  • the program 1040 may be stored as software in the memory 1030, and may include, for example, an operating system 1042, middleware 1044, or an application 1046.
  • the input device 1050 may receive commands or data to be used for components (eg, the processor 1020) of the electronic device 1001 from outside (eg, a user) of the electronic device 1001.
  • the input device 1050 may include, for example, a microphone, mouse, keyboard, or digital pen (eg, a stylus pen).
  • the sound output device 1055 may output sound signals to the outside of the electronic device 1001.
  • the audio output device 1055 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive an incoming call.
  • the receiver may be implemented separately from, or as part of, a speaker.
  • the display device 1060 may visually provide information to the outside of the electronic device 1001 (for example, a user).
  • the display device 1060 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
  • the display device 1060 may include touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the strength of a force generated by the touch.
  • the audio module 1070 may convert sound into an electrical signal, or vice versa. According to an embodiment, the audio module 1070 may acquire sound through the input device 1050, or may output sound through the audio output device 1055, or through an external electronic device (eg, the electronic device 1002) (eg, a speaker or headphones) directly or wirelessly connected to the electronic device 1001.
  • the sensor module 1076 may detect an operating state (eg, power or temperature) of the electronic device 1001 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 1076 includes, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biological sensor, It may include a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 1077 may support one or more designated protocols that can be used for the electronic device 1001 to be directly or wirelessly connected to an external electronic device (eg, the electronic device 1002).
  • the interface 1077 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 1078 may include a connector through which the electronic device 1001 is physically connected to an external electronic device (eg, the electronic device 1002).
  • the connection terminal 1078 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 1079 may convert electrical signals into mechanical stimuli (eg, vibration or movement) or electrical stimuli that the user can perceive through tactile or motor sensations.
  • the haptic module 1079 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 1080 may capture still images and videos. According to one embodiment, the camera module 1080 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 1088 may manage power supplied to the electronic device 1001.
  • the power management module 1088 may be implemented, for example, as at least a part of a power management integrated circuit (PMIC).
  • the battery 1089 may supply power to at least one component of the electronic device 1001.
  • the battery 1089 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 1090 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 1001 and an external electronic device (eg, the electronic device 1002, the electronic device 1004, or the server 1008), and communication through the established communication channel.
  • the communication module 1090 operates independently of the processor 1020 (eg, an application processor), and may include one or more communication processors supporting direct (eg, wired) communication or wireless communication.
  • the communication module 1090 may include a wireless communication module 1092 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 1094 (eg, a local area network (LAN) communication module or a power line communication module).
  • among these communication modules, the corresponding communication module may communicate with an external electronic device through the first network 1098 (eg, a short-range communication network such as Bluetooth, WiFi direct, or infrared data association (IrDA)) or the second network 1099 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or WAN)).
  • the wireless communication module 1092 may identify and authenticate the electronic device 1001 in a communication network, such as the first network 1098 or the second network 1099, using subscriber information (eg, an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 1096.
  • the antenna module 1097 may transmit a signal or power to the outside (eg, an external electronic device) or receive it from the outside.
  • the antenna module may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • the antenna module 1097 may include a plurality of antennas. In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 1098 or the second network 1099, may be selected from the plurality of antennas by, for example, the communication module 1090.
  • the signal or power may be transmitted or received between the communication module 1090 and an external electronic device through the at least one selected antenna.
  • according to some embodiments, components (eg, an RFIC) other than the radiator may be additionally formed as part of the antenna module 1097.
  • at least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 1001 and the external electronic device 1004 through the server 1008 connected to the second network 1099.
  • Each of the electronic devices 1002 and 1004 may be the same or a different type of device from the electronic device 1001.
  • all or part of the operations executed in the electronic device 1001 may be executed in one or more external devices of the external electronic devices 1002, 1004, or 1008.
  • for example, when the electronic device 1001 needs to perform a function or a service automatically or in response to a request from a user or another device, the electronic device 1001, instead of executing the function or service itself, or in addition thereto, may request one or more external electronic devices to perform at least a portion of the function or the service.
  • the one or more external electronic devices receiving the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and deliver the result of the execution to the electronic device 1001.
  • the electronic device 1001 may process the result, as it is or additionally, and provide it as at least part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • the electronic device may be various types of devices.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • when any (eg, a first) component is referred to as being "coupled" or "connected" to another (eg, a second) component, with or without the term "functionally" or "communicatively", it means that the component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term "module" may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • the module may be an integrally configured component or a minimum unit of the component or a part thereof performing one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 1040) including one or more instructions stored in a storage medium (eg, the internal memory 1036 or the external memory 1038) readable by a machine (eg, the electronic device 1001).
  • for example, a processor (eg, the processor 1020) of the machine (eg, the electronic device 1001) may call at least one of the one or more stored instructions from the storage medium and execute it, which enables the machine to be operated to perform at least one function according to the called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the storage medium readable by the device may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); the term does not distinguish between a case where data is stored semi-permanently in the storage medium and a case where it is stored temporarily.
  • a method according to various embodiments disclosed in this document may be provided as being included in a computer program product.
  • Computer program products are commodities that can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a device-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • in the case of online distribution, at least a portion of the computer program product may be temporarily stored in, or temporarily generated from, a device-readable storage medium such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (eg, module or program) of the above-described components may include a singular or a plurality of entities.
  • one or more components or operations of the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added. Accordingly, the scope of this document should be construed to include all changes or various other embodiments based on the technical spirit of this document.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

An electronic device is disclosed. An embodiment of the present invention relates to an electronic device comprising a first camera, a second camera, and a processor. The processor is configured to: acquire an image by means of the first camera; while acquiring the image by means of the first camera, check, by means of the first camera, a time during which an illuminance outside the electronic device falls within a designated illuminance range; activate the second camera based at least on a determination that the time satisfies a first designated time; deactivate the second camera when the time checked after activation of the second camera satisfies a second designated time; and, when the time checked after activation of the second camera satisfies a third designated time, acquire an image by means of the second camera and deactivate the first camera. Various embodiments defined in the description are also possible.
PCT/KR2019/011494 2018-09-18 2019-09-05 Electronic device for driving a plurality of cameras on the basis of external illuminance Ceased WO2020060081A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0111406 2018-09-18
KR1020180111406A KR102544709B1 (ko) Electronic device for driving plurality of cameras based on external illuminance

Publications (1)

Publication Number Publication Date
WO2020060081A1 true WO2020060081A1 (fr) 2020-03-26

Family

ID=69887477

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/011494 Ceased WO2020060081A1 (fr) 2018-09-18 2019-09-05 Dispositif électronique pour piloter une pluralité de caméras sur la base d'un éclairement externe

Country Status (2)

Country Link
KR (1) KR102544709B1 (fr)
WO (1) WO2020060081A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4391569A4 (fr) 2021-12-10 2024-12-04 Samsung Electronics Co., Ltd. Electronic apparatus for executing application using different camera information according to photographing environment, and control method therefor
WO2023106889A1 (fr) * 2021-12-10 2023-06-15 Samsung Electronics Co., Ltd. Electronic apparatus for executing application using different camera information according to photographing environment, and control method therefor
EP4459971A4 (fr) * 2022-04-20 2025-05-21 Samsung Electronics Co., Ltd. Electronic device, and method for adjusting display area of display on the basis of intensity of light emitted from display
WO2025048191A1 (fr) * 2023-08-31 2025-03-06 Samsung Electronics Co., Ltd. Electronic device, method, and non-transitory computer-readable storage medium for adjusting color temperature of screen

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009147977A (ja) * 2009-03-23 2009-07-02 Sony Corp Camera system and mobile camera system
KR20160087324A (ko) * 2015-01-13 2016-07-21 Samsung Electronics Co., Ltd. Camera activation and illuminance
US20180070009A1 (en) * 2016-09-07 2018-03-08 Samsung Electronics Co., Ltd. Electronic apparatus and controlling method thereof
CN107948508A (zh) * 2017-11-24 2018-04-20 Beijing Tusen Weilai Technology Co., Ltd. Vehicle-mounted image capture system and method
KR20180086942A (ko) * 2017-01-24 2018-08-01 Samsung Electronics Co., Ltd. Method for controlling plurality of cameras and electronic device


Also Published As

Publication number Publication date
KR20200032411A (ko) 2020-03-26
KR102544709B1 (ko) 2023-06-20

Similar Documents

Publication Publication Date Title
WO2020080845A1 (fr) Dispositif électronique et procédé pour obtenir des images
WO2019045497A1 (fr) Dispositif électronique comprenant un affichage et procédé de correction associé
WO2020130654A1 (fr) Module de caméra ayant une structure multi-cellulaire et dispositif de communication portable le comprenant
WO2020060081A1 (fr) Dispositif électronique pour piloter une pluralité de caméras sur la base d'un éclairement externe
WO2019093856A1 (fr) Dispositif et procédé de commande de microphone en fonction de la connexion d'un accessoire externe
EP3925206A1 (fr) Dispositif électronique, procédé et support lisible par ordinateur pour fournir un effet de flou dans une vidéo
WO2019156308A1 (fr) Appareil et procédé d'estimation de mouvement de stabilisation d'image optique
WO2021020900A1 (fr) Dispositif électronique de prévention de l'endommagement d'un dispositif usb et son procédé de fonctionnement
WO2021096219A1 (fr) Dispositif électronique comprenant une caméra et son procédé
WO2020159255A1 (fr) Système permettant de traiter des données d'utilisateur et son procédé de commande
WO2020130729A1 (fr) Dispositif électronique pliable destiné à fournir des informations associées à un événement et son procédé de fonctionnement
WO2020111576A1 (fr) Procédé de compensation de dégradation en fonction d'un écran d'exécution d'application et dispositif électronique mettant en œuvre ce dernier
WO2019172577A1 (fr) Dispositif et procédé de traitement d'images d'un dispositif électronique
WO2020171607A1 (fr) Circuit tactile pour empêcher un toucher erroné dû à un changement de température, dispositif électronique comprenant le circuit tactile et son procédé de fonctionnement
WO2020153738A1 (fr) Dispositif électronique et procédé de connexion d'un nœud de masse à un module de caméra
WO2020171492A1 (fr) Procédé de traitement d'image photographique et dispositif électronique associé
WO2019203425A1 (fr) Dispositif et procédé de traitement de signal de stylo optique à fréquence de résonance modifiée
WO2021060943A1 (fr) Dispositif électronique pour identifier un dispositif électronique externe, et procédé de commande associé
WO2019054610A1 (fr) Dispositif électronique et procédé de commande d'une pluralité de capteurs d'image
WO2021235884A1 (fr) Dispositif électronique et procédé de génération d'image par réalisation d'un awb
WO2020204428A1 (fr) Dispositif électronique et procédé de compensation d'erreur de profondeur en fonction de la fréquence de modulation
WO2019172723A1 (fr) Interface connectée à un capteur d'image et dispositif électronique comprenant des interfaces connectées parmi une pluralité de processeurs
WO2019107968A1 (fr) Dispositif électronique et son procédé de d'acquisition d'image
WO2019151619A1 (fr) Dispositif électronique dans lequel est inséré un plateau de chargement de carte sim et procédé de commande associé
WO2021162241A1 (fr) Procédé et dispositif de commande d'un capteur d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19862117

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19862117

Country of ref document: EP

Kind code of ref document: A1