
WO2020052284A1 - Control method and device, depth camera, electronic device, and computer-readable storage medium

Info

Publication number
WO2020052284A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
scene
scene image
projection distance
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/090020
Other languages
English (en)
Chinese (zh)
Inventor
张学勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of WO2020052284A1


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/40Transceivers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/50Transmitters
    • H04B10/564Power control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present application relates to the field of three-dimensional imaging technology, and in particular, to a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
  • A time-of-flight (TOF) imaging system can calculate the depth information of a measured object from the time difference between the moment the optical transmitter emits an optical signal and the moment the optical receiver receives it.
  • Light emitters typically include a light source and a diffuser. Light from the light source is diffused by the diffuser and projected into the scene as a uniform flood of light. The light emitted by the light source is usually an infrared laser.
  • Embodiments of the present application provide a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
  • the method for controlling an optical transmitter includes: acquiring a scene image of a scene; identifying whether a human face exists in the scene image; when a human face exists in the scene image, controlling the optical transmitter to emit light at a first light emitting power and/or a first on-frequency; and when no human face exists in the scene image, controlling the optical transmitter to emit light at a second light emitting power and/or a second on-frequency.
  • the control device of the light transmitter includes a first acquisition module, an identification module, and a control module.
  • the first acquisition module is configured to acquire a scene image of a scene;
  • the recognition module is configured to recognize whether a human face exists in the scene image;
  • the control module is configured to, when a human face exists in the scene image, control the light emitter to emit light at a first light emitting power and/or a first on-frequency, and when no human face exists in the scene image, control the light emitter to emit light at a second light emitting power and/or a second on-frequency.
  • the time-of-flight depth camera includes a light transmitter and a processor.
  • the processor is configured to obtain a scene image of a scene, identify whether a human face exists in the scene image, control the light transmitter to emit light at a first light emitting power and/or a first on-frequency when a human face exists in the scene image, and control the light transmitter to emit light at a second light emitting power and/or a second on-frequency when no human face exists in the scene image.
  • the electronic device includes the time-of-flight depth camera, one or more processors, a memory, and one or more programs.
  • the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the programs include instructions for performing the control method described above.
  • the computer-readable storage medium of the embodiment of the present application includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
  • FIG. 1 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
  • FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 4 and FIG. 5 are schematic diagrams of the turn-on frequencies of the optical transmitter in some embodiments of the present application.
  • FIG. 6 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 7 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
  • FIGS. 8 and 9 are schematic flowcharts of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 10 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 11 is a schematic block diagram of a second acquisition module in a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 12 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 13 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 14 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 15 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 16 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 17 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 18 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
  • FIG. 19 is a schematic block diagram of a control module in a control device for a light transmitter according to some embodiments of the present application.
  • FIG. 20 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
  • FIG. 21 is a schematic diagram of a three-dimensional structure of a depth camera according to some embodiments of the present application.
  • FIG. 22 is a schematic plan view of a depth camera according to some embodiments of the present application.
  • FIG. 23 is a schematic cross-sectional view of the depth camera in FIG. 22 along the line XXIII-XXIII.
  • FIG. 24 is a schematic structural diagram of a light emitter according to some embodiments of the present application.
  • FIG. 25 is a schematic diagram of a connection between an electronic device and a computer-readable storage medium according to some embodiments of the present application.
  • the present application provides a method for controlling a light transmitter 100.
  • the control method includes: 01: acquiring a scene image of the scene; 03: identifying whether a human face exists in the scene image; 05: when a human face exists in the scene image, controlling the light transmitter 100 to emit light at a first light emitting power and/or a first on-frequency; and 07: when no human face exists in the scene image, controlling the light transmitter 100 to emit light at a second light emitting power and/or a second on-frequency.
  • the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency.
  • the control method further includes: judging an application scenario of the light transmitter.
  • the step of controlling the light transmitter 100 to emit light at the first light emitting power and/or the first on-frequency includes: when the application scene is the first scene and a human face is present in the scene image, controlling the light emitter 100 to emit light at the first light emitting power and the first sub-on-frequency; and when the application scene is the second scene and a human face is present in the scene image, controlling the light emitter 100 to emit light at the first light emitting power and the second sub-on-frequency.
  • the control method further includes: obtaining a projection distance between the user and the light transmitter 100; and calculating the first light emitting power according to the projection distance.
  • the step of obtaining a projection distance between the user and the light transmitter 100 includes: calculating a first proportion of a human face in the scene image; and calculating the projection distance according to the first proportion.
  • the step of calculating the projection distance according to the first scale includes: calculating a second ratio of the preset feature area of the human face in the scene image to the human face; and calculating the projection distance according to the first ratio and the second ratio.
  • the step of calculating the projection distance according to the first ratio includes: judging whether the user is wearing glasses according to the scene image; and calculating the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
  • the step of calculating the projection distance according to the first ratio includes: judging the age of the user according to the scene image; and calculating the projection distance according to the first ratio and age.
  • the step of controlling the light emitter 100 to emit light at the second light emitting power and/or the second on-frequency includes: obtaining a projection distance between a target subject in the scene and the light emitter 100; obtaining the ambient brightness of the scene; and calculating the second luminous power according to the ambient brightness and the projection distance.
  • the present application provides a control device 90 of a light transmitter 100.
  • the control device 90 includes a first acquisition module 91, an identification module 93, and a control module 95.
  • the first acquisition module 91 is configured to acquire a scene image of a scene.
  • the recognition module 93 is used to recognize whether a human face exists in the scene image.
  • the control module 95 is configured to control the light transmitter 100 to emit light at a first light emitting power and/or a first on-frequency when a human face is present in the scene image, and to control the light transmitter 100 to emit light at a second light emitting power and/or a second on-frequency when no human face is present in the scene image.
  • the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency.
  • the control device 90 further includes a determination module 941, which is configured to determine an application scenario of the optical transmitter 100.
  • the control module 95 may also control the light transmitter 100 to emit light at the first light emitting power and the first sub-on-frequency when the application scene is the first scene and a human face is present in the scene image, and control the light transmitter 100 to emit light at the first light emitting power and the second sub-on-frequency when the application scene is the second scene and a human face is present in the scene image.
  • the control device 90 further includes a second obtaining module 942 and a calculation module 943.
  • the second acquisition module 942 is configured to acquire a projection distance between the user and the light transmitter 100.
  • the calculation module 943 is configured to calculate the first light emitting power according to the projection distance.
  • the second obtaining module 942 includes a first calculation unit 9421 and a second calculation unit 9422.
  • the first calculation unit 9421 is configured to calculate a first proportion of a face in a scene image.
  • the second calculation unit 9422 is configured to calculate a projection distance according to the first ratio.
  • the second calculation unit 9422 includes a first calculation sub-unit 9423 and a second calculation sub-unit 9424.
  • the first calculation subunit 9423 is configured to calculate a second proportion of the preset feature area of the human face in the scene image.
  • the second calculation subunit 9424 is configured to calculate a projection distance according to the first scale and the second scale.
  • the second calculation unit 9422 includes a first determination sub-unit 9425 and a third calculation sub-unit 9426.
  • the first judging subunit 9425 is configured to judge whether the user wears glasses according to the scene image.
  • the third calculation subunit 9426 is configured to calculate a projection distance according to the first proportion and the distance coefficient when the user wears glasses.
  • the second calculation unit 9422 includes a second determination sub-unit 9427 and a fourth calculation sub-unit 9428.
  • the second judging subunit 9427 is configured to judge the age of the user according to the scene image.
  • the fourth calculation subunit 9428 is configured to calculate a projection distance according to the first ratio and age.
  • the control module 95 includes a first obtaining unit 951, a second obtaining unit 952, and a third calculation unit 953.
  • the first obtaining unit 951 is configured to obtain a projection distance between a target subject in the scene and the light emitter 100.
  • the second obtaining unit 952 is configured to obtain the ambient brightness of the scene.
  • the third calculation unit 953 is configured to calculate a second light emitting power according to the ambient brightness and the projection distance.
  • the application also provides a time-of-flight depth camera 300.
  • the time-of-flight depth camera 300 includes a light transmitter 100 and a processor.
  • the processor is configured to: obtain a scene image of the scene; identify whether a human face is present in the scene image; and when a human face is present in the scene image, control the light transmitter 100 to emit light at a first light emission power and / or a first on frequency; and in the scene image When the human face does not exist, the light transmitter 100 is controlled to emit light at a second light emitting power and / or a second on frequency.
  • the processor is further configured to: determine an application scenario of the light transmitter 100; when the application scenario is the first scene and a human face is present in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the first sub-on-frequency; and when the application scene is the second scene and a human face is present in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the second sub-on-frequency.
  • the processor is further configured to: obtain a projection distance between the user and the light transmitter 100; and calculate the first light emitting power according to the projection distance.
  • the processor is further configured to: calculate a first proportion of a face in a scene image; and calculate a projection distance according to the first proportion.
  • the processor is further configured to: calculate a second proportion of the preset feature area of the face in the scene image to the human face; and calculate a projection distance according to the first proportion and the second proportion.
  • the processor is further configured to: determine whether the user wears glasses according to the scene image; and calculate the projection distance according to the first scale and the distance coefficient when the user wears glasses.
  • the processor is further configured to: determine the age of the user according to the scene image; and calculate the projection distance according to the first scale and age.
  • the processor is further configured to: obtain a projection distance between the target subject in the scene and the light emitter 100; obtain the ambient brightness of the scene; and calculate the second luminous power according to the ambient brightness and the projection distance.
  • the present application further provides an electronic device 800.
  • the electronic device includes the depth camera 300 according to any one of the foregoing embodiments, one or more processors 805, a memory 806, and one or more programs.
  • One or more programs are stored in the memory 806 and are configured to be executed by one or more processors 805.
  • the program includes instructions for the control method according to any one of the above embodiments.
  • the present application further provides a computer-readable storage medium 900.
  • the computer-readable storage medium 900 includes a computer program used in conjunction with the electronic device 800.
  • the computer program may be executed by the processor 805 to implement the control method according to any one of the foregoing embodiments.
  • the control method includes:
  • 01: acquiring a scene image of the scene;
  • 03: identifying whether a human face exists in the scene image;
  • 05: when a human face is present in the scene image, controlling the light transmitter 100 to emit light at a first light emitting power and/or a first on-frequency; and
  • 07: when no human face is present in the scene image, controlling the light transmitter 100 to emit light at a second light emitting power and/or a second on-frequency.
  • the present application further provides a control device 90 of the optical transmitter 100.
  • the control method according to the embodiment of the present application may be implemented by the control device 90 according to the embodiment of the present application.
  • the control device 90 includes a first acquisition module 91, an identification module 93, and a control module 95.
  • Step 01 may be implemented by the first obtaining module 91.
  • Step 03 may be implemented by the identification module 93.
  • Both steps 05 and 07 can be implemented by the control module 95. That is, the first acquisition module 91 may be configured to acquire a scene image of a scene.
  • the recognition module 93 may be used to recognize whether a human face exists in the scene image.
  • the control module 95 may be configured to control the light emitter 100 to emit light at a first light emitting power and/or a first on-frequency when a human face is present in the scene image, and to control the light emitter 100 to emit light at a second light emitting power and/or a second on-frequency when no human face is present in the scene image.
  • this application further provides a time-of-flight depth camera 300.
  • the control device 90 according to the embodiment of the present application can be applied to the time-of-flight depth camera 300 according to the embodiment of the present application.
  • the time-of-flight depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor.
  • Step 01, step 03, step 05, and step 07 can all be implemented by a processor. That is to say, the processor may be configured to acquire a scene image of the scene, identify whether a human face exists in the scene image, and control the light transmitter 100 to emit light at a first light emitting power and / or a first on frequency when a human face exists in the scene image. And when the human face does not exist in the scene image, the light transmitter 100 is controlled to emit light at the second emission power and / or the second on frequency.
  • the time-of-flight camera 300 according to the embodiment of the present application can be applied to the electronic device 800.
  • the processor in the time-of-flight camera 300 according to the embodiment of the present application and the processor 805 of the electronic device 800 may be the same processor, or may be two independent processors. In a specific embodiment of the present application, the processor in the time-of-flight depth camera 300 and the processor 805 of the electronic device 800 are the same processor.
  • the electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, a smart bracelet, smart glasses, a smart helmet), a drone, or the like, and is not limited herein.
  • the time-of-flight depth camera 300 generally includes a light transmitter 100 and a light receiver 200.
  • the light transmitter 100 is used to project a laser light into the scene, and the light receiver 200 receives the laser light reflected by a person or an object in the scene.
  • the processor 805 controls both the optical transmitter 100 and the optical receiver 200 to be turned on, and inputs a modulation signal of a certain frequency and amplitude to the driver 61 (shown in FIG. 24); the driver 61 converts the modulation signal into a constant current that is supplied to the light source 10 (shown in FIG. 24) of the light emitter 100, so that the light source emits laser light.
  • the laser emitted by the optical transmitter 100 is usually an infrared laser. If the energy of the infrared laser is too high, or the infrared laser is continuously emitted toward one position for a long time, it can easily damage the user's eyes.
  • in the control method, the control device 90, and the time-of-flight depth camera 300 of the embodiments of the present application, when the light transmitter 100 is turned on, a scene image of the scene is first collected. The scene image may be acquired by an infrared camera (which may be the light receiver 200) or by the visible light camera 400.
  • the processor 805 recognizes whether a face exists in the scene image based on the face recognition algorithm.
  • the processor 805 controls the light transmitter 100 to emit light at a first light emitting power and / or a first on frequency.
  • the processor 805 controls the light transmitter 100 to emit light at a second light emitting power and / or a second on frequency.
  • the light emitting power is indirectly characterized by the current output by the driver 61.
  • the turn-on frequency refers to the frequency at which the light transmitter 100 is turned on, not the light-emitting frequency of the light transmitter 100.
  • the turn-on frequency of the light transmitter 100 corresponds to the frame rate at which the light receiver 200 outputs depth images; see FIG. 4 and FIG. 5.
  • the processor 805 controlling the light emitter 100 to emit light at the first light emitting power and / or the first on frequency includes: (1) the processor 805 controls the light emitter 100 to emit light at the first light emitting power; (2) the processor 805 controls the light emitting The transmitter 100 emits light at a first on frequency; (3) The processor 805 controls the light transmitter 100 to emit light at a first light emission power and a first on frequency at the same time.
  • the processor 805 controlling the light emitter 100 to emit light at the second light emitting power and / or the second on frequency includes: (1) the processor 805 controls the light emitter 100 to emit light at the second light emitting power; (2) the processor 805 Control the light transmitter 100 to emit light at a second on frequency; (3) The processor 805 controls the light transmitter 100 to emit light at a second light emission power and a second on frequency simultaneously.
  • in other words, when a human face is present in the scene image, the light transmitter 100 is turned on at a lower first luminous power and a lower first on-frequency.
  • a lower first luminous power reduces the energy of the infrared laser reaching the user's eyes, and a lower first on-frequency reduces the time for which the infrared laser continuously irradiates the user's eyes. In this way, the emitted laser poses a lower risk of injury to the user's eyes, which increases the safety of using the time-of-flight depth camera 300.
  • when no human face is present in the scene image, the light is emitted at a second luminous power and a second on-frequency suited to the current scene. In this way, the accuracy of the depth image output by the light receiver 200 can be improved.
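  • As a rough illustration of the control flow above, the following Python sketch (hypothetical; the patent provides no code, the power and frequency values are placeholders, and an OpenCV Haar cascade stands in for the unspecified face recognition algorithm) selects the emitter settings from a scene image:

```python
import cv2

# Placeholder emitter settings (illustrative values, not from the patent).
FIRST_POWER, FIRST_ON_FREQ = 0.4, 5      # lower power/frequency: face present
SECOND_POWER, SECOND_ON_FREQ = 1.0, 30   # scene-appropriate: no face present

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def choose_emitter_settings(scene_image):
    """Return (power, on_frequency) for the light transmitter."""
    gray = cv2.cvtColor(scene_image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:                   # face present: protect the user's eyes
        return FIRST_POWER, FIRST_ON_FREQ
    return SECOND_POWER, SECOND_ON_FREQ  # no face: favor depth accuracy
```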
  • the first turn-on frequency includes a first sub-on-frequency and a second sub-on-frequency.
  • the control method also includes: 041: judging the application scenario of the light transmitter 100.
  • step 05, controlling the light transmitter 100 to emit light at the first light emitting power and/or the first on-frequency when a human face exists in the scene image, includes: 051: when the application scene is the first scene and a human face is present in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the first sub-on-frequency; and 052: when the application scene is the second scene and a human face is present in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the second sub-on-frequency.
  • the control device 90 further includes a determination module 941.
  • Step 041 may be implemented by the judgment module 941.
  • Both steps 051 and 052 can be implemented by the control module 95. That is to say, the determination module 941 can be used to determine an application scenario of the light transmitter 100.
  • the control module 95 may be configured to control the light emitter 100 to emit light at the first light emitting power and the first sub-on-frequency when the application scene is the first scene and a human face is present in the scene image, and to control the light transmitter 100 to emit light at the first light emitting power and the second sub-on-frequency when the application scene is the second scene and a human face is present in the scene image.
  • step 041, step 051, and step 052 can all be implemented by the processor 805. That is to say, the processor 805 may be configured to determine an application scene of the light transmitter 100; when the application scene is the first scene and a human face exists in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the first sub-on-frequency; and when the application scene is the second scene and a human face is present in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the second sub-on-frequency.
  • the first scene refers to an application scenario in which the usage time of the time-of-flight depth camera 300 is shorter than a preset time, for example, shooting a static three-dimensional image of the scene, unlocking the electronic device 800 based on a three-dimensional face, or making a payment based on a three-dimensional face.
  • the second scene refers to an application scenario in which the usage time of the time-of-flight depth camera 300 is greater than or equal to the preset time, for example, an application scenario in which a user has a three-dimensional video chat with other users.
  • the following description takes the first scene to be the application scenario of unlocking the electronic device 800 based on a three-dimensional face, and the second scene to be the application scenario in which a user has a three-dimensional video chat with other users.
  • when the time-of-flight depth camera 300 is used to capture a user's three-dimensional face to unlock the electronic device 800, the light receiver 200 usually only needs to output a few depth images per second, for example, 3 frames/second, 4 frames/second, or 5 frames/second, and the light transmitter 100 is correspondingly turned on 3 times/second, 4 times/second, or 5 times/second.
  • further, the light receiver 200 may output only one frame of depth image, and the processor 805 may correspondingly set the first sub-on-frequency to 1 time/second.
  • when the time-of-flight depth camera 300 is used to capture a three-dimensional video of a user, so that the user can have a three-dimensional video chat with other users on the electronic device 800, the light receiver 200 usually needs to output more depth images per second, for example, 30 frames/second or 60 frames/second, and the light transmitter 100 is correspondingly turned on 30 times/second, 60 times/second, and so on.
  • alternatively, the light receiver 200 may output only 24 frames per second, and the processor 805 may set the first sub-on-frequency to 24 times/second. It can be understood that when the refresh rate of the picture reaches 24 frames/second, the human eye perceives a smooth picture. Therefore, with the first sub-on-frequency set to 24 times/second, the light transmitter 100 drives the depth image output at the lowest smooth frame rate, which on the one hand reduces the laser exposure of the user's eyes, and on the other hand ensures that the user sees a smooth three-dimensional video picture.
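  • The scenario-dependent choice of sub-on-frequency might look like the following sketch (hypothetical scenario names; the frequency values are the examples from the text):

```python
# Hypothetical mapping from application scenario to first sub-on-frequency,
# using the example values above (1 Hz for unlock, 24 Hz for video chat).
FIRST_SUB_ON_FREQ = {
    "face_unlock": 1,     # one depth frame is enough to unlock
    "3d_video_chat": 24,  # lowest frame rate the eye still sees as smooth
}

def first_on_frequency(scenario: str) -> int:
    """Return the first sub-on-frequency (turn-ons per second) for a scenario."""
    return FIRST_SUB_ON_FREQ.get(scenario, 5)  # fallback: a low, eye-safe rate
```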
  • the control method, after step 03, further includes: 042: obtaining a projection distance between the user and the light transmitter 100; and 043: calculating the first light emitting power according to the projection distance.
  • step 042 includes: 0421: calculating a first proportion of the human face in the scene image; and 0422: calculating the projection distance according to the first proportion.
  • the control device 90 further includes a second acquisition module 942 and a calculation module 943.
  • the second acquisition module 942 includes a first calculation unit 9421 and a second calculation unit 9422.
  • Step 042 may be implemented by the second obtaining module 942.
  • Step 043 may be implemented by the calculation module 943.
  • Step 0421 may be implemented by the first calculation unit 9421, and step 0422 may be implemented by the second calculation unit 9422. That is, the second obtaining module 942 may be used to obtain a projection distance between the user and the light transmitter 100.
  • the calculation module 943 may be configured to calculate the first light emitting power according to the projection distance.
  • the first calculation unit 9421 may be configured to calculate a first proportion of a human face in the scene image.
  • the second calculation unit 9422 may be configured to calculate a projection distance according to the first ratio.
  • step 042, step 043, step 0421, and step 0422 can all be implemented by the processor 805. That is to say, the processor 805 may be further configured to obtain a projection distance between the user and the light transmitter 100 and calculate the first light emitting power according to the projection distance. When obtaining the projection distance between the user and the light transmitter 100, the processor 805 specifically calculates a first proportion of the human face in the scene image and calculates the projection distance according to the first proportion.
  • after recognizing a human face in the scene image, the processor 805 extracts the face and counts the number of pixels it occupies. The processor 805 then divides the number of face pixels by the total number of pixels in the scene image to obtain the first proportion of the face in the scene image, and finally calculates the projection distance based on the first proportion.
  • when the first proportion is larger, the user is closer to the time-of-flight depth camera 300, that is, closer to the light transmitter 100, and the projection distance is smaller. When the first proportion is smaller, the user is farther from the time-of-flight depth camera 300, that is, farther from the light transmitter 100, and the projection distance is larger.
  • in other words, the relationship between the projection distance and the first proportion satisfies that the projection distance increases as the first proportion decreases.
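  • A minimal sketch of the first-proportion computation (a hypothetical helper; assumes a face bounding box from a detector such as the one sketched earlier):

```python
def first_proportion(face_box, image_shape):
    """Fraction of the scene image occupied by the detected face.

    face_box: (x, y, w, h) bounding box in pixels.
    image_shape: (height, width) of the scene image.
    """
    _, _, w, h = face_box
    face_pixels = w * h                # pixels occupied by the face
    total_pixels = image_shape[0] * image_shape[1]
    return face_pixels / total_pixels  # the first proportion R
```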
  • when multiple faces are present, the face with the largest area may be selected to calculate the first proportion; or the average area of the multiple faces may be used to calculate the first proportion; or the face of the owner of the electronic device 800 may be identified among the multiple faces and used to calculate the first proportion.
  • the first proportion has a mapping relationship with the projection distance. The mapping may take several forms: the first proportion is a specific value and the projection distance is also a specific value, with the first proportions corresponding to the projection distances one-to-one; or the first proportion is a range and the projection distance is a specific value, with each range corresponding to one projection distance; or the first proportion is a range and the projection distance is also a range, again in one-to-one correspondence.
  • the mapping relationship between the first scale and the projection distance may be calibrated in advance.
  • the user is instructed to stand at a plurality of predetermined projection distances from the infrared camera or visible light camera 400, respectively, and the infrared camera or visible light camera 400 sequentially captures scene images.
  • the processor 805 calculates, for each scene image, the calibration proportion of the face relative to the scene image, and then stores the correspondence between each calibration proportion and its predetermined projection distance. In subsequent use, the projection distance corresponding to the actually measured first proportion is found in this stored mapping.
  • for example, the user is instructed to stand at projection distances of 10 cm, 20 cm, 30 cm, and 40 cm in turn, the infrared camera or visible light camera 400 captures a scene image at each distance, and the processor 805 calculates the calibration proportions corresponding to 10 cm, 20 cm, 30 cm, and 40 cm as 80%, 60%, 45%, and 30%, respectively. The mapping between calibration proportion and predetermined projection distance (10 cm-80%, 20 cm-60%, 30 cm-45%, 40 cm-30%) is stored in the memory 806 of the electronic device 800 in the form of a mapping table. In subsequent use, the projection distance corresponding to the actually measured first proportion is looked up directly in the mapping table.
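  • A sketch of the table lookup (hypothetical; linear interpolation between calibrated points is one reasonable way to handle proportions that fall between table entries):

```python
# Calibration table from the example above: (calibration proportion, distance in cm).
CALIBRATION = [(0.80, 10), (0.60, 20), (0.45, 30), (0.30, 40)]

def projection_distance_cm(first_proportion: float) -> float:
    """Look up (and linearly interpolate) the projection distance."""
    pts = sorted(CALIBRATION, reverse=True)  # descending proportion
    if first_proportion >= pts[0][0]:
        return float(pts[0][1])
    if first_proportion <= pts[-1][0]:
        return float(pts[-1][1])
    for (r_hi, d_near), (r_lo, d_far) in zip(pts, pts[1:]):
        if r_lo <= first_proportion <= r_hi:
            t = (r_hi - first_proportion) / (r_hi - r_lo)
            return d_near + t * (d_far - d_near)
```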
  • the projection distance and the first ratio are calibrated in advance.
  • the user is directed to stand at a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects scene images.
  • the processor 805 calculates the calibration proportion of the face in that scene image relative to the scene image, and stores the correspondence between the calibration proportion and the predetermined projection distance; in subsequent use, the projection distance is calculated from this correspondence.
  • for example, with the user standing at a predetermined projection distance D0, the processor 805 calculates that the proportion of the human face in the scene image is 45%. In actual measurement, when the first proportion is calculated as R, then by the properties of similar triangles (the pixel-area proportion of a face varies inversely with the square of the distance) 45% × D0² = R × D², that is, D = D0 × √(0.45 / R), where D is the actual projection distance calculated from the actually measured first proportion R.
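  • A sketch of the single-point calibration (the original formula is not preserved in this text; the inverse-square relation below is reconstructed from the area-based definition of the first proportion, and the calibration distance is a hypothetical value):

```python
import math

CAL_PROPORTION = 0.45   # calibration proportion from the example above
CAL_DISTANCE_CM = 30.0  # hypothetical predetermined calibration distance D0

def distance_from_proportion(r: float) -> float:
    """Projection distance D from the measured first proportion R.

    Assumes 0.45 * D0**2 == R * D**2 (area proportion ~ 1 / distance**2).
    """
    return CAL_DISTANCE_CM * math.sqrt(CAL_PROPORTION / r)
```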
  • in this way, the projection distance between the user and the light transmitter 100 can be reflected more objectively, and the light transmitter 100 can emit light at a suitable luminous power: on the one hand, this prevents the luminous power of the light transmitter 100 from being too high and damaging the user's eyes; on the other hand, it prevents the luminous power from being too low, which would make the depth information of the scene inaccurate.
  • calculating the projection distance according to the first proportion in step 0422 includes: 0423: calculating a second proportion of the preset feature area of the human face in the scene image to the human face; and 0424: calculating the projection distance according to the first proportion and the second proportion.
  • the second calculation unit 9422 includes a first calculation sub-unit 9423 and a second calculation sub-unit 9424.
  • Step 0423 may be implemented by the first calculation subunit 9423.
  • Step 0424 may be implemented by the second calculation subunit 9424. That is to say, the first calculation subunit 9423 can be used to calculate a second proportion of the preset feature area of the human face in the scene image to the human face.
  • the second calculation sub-unit 9424 may be configured to calculate the projection distance according to the first scale and the second scale.
  • step 0423 and step 0424 may be implemented by the processor 805. That is to say, the processor 805 can also be used to calculate a second proportion of the preset feature area of the human face in the scene image to the human face, and calculate a projection distance according to the first and second proportions.
  • the second ratio is the ratio of the preset feature area of the human face to the human face.
  • the preset feature area can select a feature area with a small difference between different user individuals.
  • the preset feature area can be the distance between the eyes of the user.
  • specifically, during calibration the user is directed to stand at a predetermined projection distance and a scene image is collected; the first calibration proportion and the second calibration proportion corresponding to that image are calculated, and the correspondence between the predetermined projection distance, the first calibration proportion, and the second calibration proportion is stored, so that in subsequent use the projection distance can be calculated from the actually measured first and second proportions. For example, the user is directed to stand at a projection distance of 25 cm and a scene image is collected; the first calibration proportion of this image is calculated as 50% and the second calibration proportion as 10%.
  • in actual measurement, an initial projection distance D1 is first calculated from the actually measured first proportion R1 using the 25 cm-50% calibration pair; a calibrated projection distance D2 is then further calculated from D1 and the actually measured second proportion R2 (for example, D2 = D1 × 10% / R2, which corrects for the individual's face size), and D2 is used as the final projection distance.
  • the projection distance calculated from the first and second proportions takes into account individual differences between users and is therefore more objective; further, a more accurate first light emitting power can be determined from the more accurate projection distance.
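  • A sketch of the two-proportion estimate under the assumptions just stated (the exact relationships are not preserved in this text, so the formulas below are the reconstructed ones):

```python
import math

# Calibration from the example: 25 cm, first proportion 50%, second proportion 10%.
D_CAL, R1_CAL, R2_CAL = 25.0, 0.50, 0.10

def calibrated_distance(r1: float, r2: float) -> float:
    """Final projection distance D2 from measured proportions R1 and R2."""
    d1 = D_CAL * math.sqrt(R1_CAL / r1)  # initial distance from first proportion
    return d1 * (R2_CAL / r2)            # correct for individual face size
```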
  • calculating the projection distance according to the first proportion in step 0422 may also include: 0425: judging whether the user is wearing glasses according to the scene image; and 0426: calculating the projection distance according to the first proportion and a distance coefficient when the user is wearing glasses.
  • the second calculation unit 9422 further includes a first determination sub-unit 9425 and a third calculation sub-unit 9426.
  • Step 0425 may be implemented by the first judging subunit 9425.
  • Step 0426 may be implemented by the third calculation subunit 9426. That is to say, the first judging subunit 9425 can be used to judge whether the user wears glasses according to the scene image.
  • the third calculation subunit 9426 may be configured to calculate a projection distance according to the first scale and the distance coefficient when the user wears glasses.
  • step 0425 and step 0426 may be implemented by the processor 805. That is to say, the processor 805 may be further configured to determine whether the user wears glasses according to the scene image, and calculate the projection distance according to the first ratio and the distance coefficient when the user wears the glasses.
  • whether the user wears glasses is used to characterize the health of the user's eyes: when the optical transmitter 100 emits laser light toward a user who wears glasses, the light emitting power of the light transmitter 100 needs to be reduced so that the energy of the emitted laser is lower and does not damage the user's eyes.
  • the preset distance coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82, or 0.95. For example, after the initial projection distance is calculated from the first proportion, or the calibrated projection distance is calculated from the first and second proportions, the initial or calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance (see the combined sketch after the age discussion below), and the first luminous power is calculated according to this projection distance. In this way, the emitted laser power is prevented from being so high that it harms a user with an eye condition or poor vision.
  • calculating the projection distance according to the first proportion in step 0422 may further include: 0427: judging the age of the user according to the scene image; and 0428: calculating the projection distance according to the first proportion and the age.
  • the second calculation unit 9422 includes a second determination sub-unit 9427 and a fourth calculation sub-unit 9428.
  • Step 0427 may be implemented by the second judgment subunit 9427
  • step 0428 may be implemented by the fourth calculation subunit 9428. That is to say, the second judging subunit 9427 may be configured to judge the age of the user according to the scene image.
  • the fourth calculation sub-unit 9428 may be configured to calculate the projection distance according to the first ratio and the age.
  • step 0427 and step 0428 may be implemented by the processor 805. That is to say, the processor 805 may be configured to determine the age of the user according to the scene image, and calculate the projection distance according to the first scale and age.
  • specifically, the number, distribution, and area of wrinkle feature points of the face in the scene image can be extracted to judge the user's age; for example, the user's age can be judged from the number of wrinkles at the corners of the eyes, optionally further combined with the number of wrinkles on the user's forehead.
  • the proportion coefficient can be obtained according to the age of the user. Specifically, the correspondence between age and the proportion coefficient can be found in a query table.
  • for example, when the age is under 15, the scale factor is 0.6; when the age is between 15 and 20, the scale factor is 0.8; when the age is between 20 and 45, the scale factor is 1.0; and when the age is 45 or more, the scale factor is 0.8.
  • the initial projection distance calculated from the first proportion, or the calibrated projection distance calculated from the first and second proportions, can then be multiplied by the scale factor to obtain the final projection distance, and the first light emitting power is calculated according to this projection distance. In this way, the emitted laser power is prevented from being high enough to harm younger or older users.
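  • A combined sketch of the glasses and age adjustments (the coefficient values are the examples from the text; the helper names and the choice of 0.78 for the glasses coefficient are hypothetical):

```python
def age_scale_factor(age: int) -> float:
    """Example age-to-scale-factor table from the text."""
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age < 45:
        return 1.0
    return 0.8  # 45 and older

GLASSES_COEFF = 0.78  # one of the example distance coefficients

def adjusted_distance(base_distance: float, wears_glasses: bool, age: int) -> float:
    """Scale the initial/calibrated projection distance before computing power."""
    d = base_distance * age_scale_factor(age)
    if wears_glasses:
        d *= GLASSES_COEFF
    return d
```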
  • step 07, controlling the light emitter 100 to emit light at the second light emitting power and/or the second on-frequency, includes: 071: obtaining a projection distance between a target subject in the scene and the light emitter 100; 072: obtaining the ambient brightness of the scene; and 073: calculating the second luminous power according to the ambient brightness and the projection distance.
  • the control module 95 includes a first obtaining unit 951, a second obtaining unit 952, and a third computing unit 953.
  • Step 071 may be implemented by the first obtaining unit 951.
  • Step 072 may be performed by the second obtaining unit 952.
  • Step 073 may be performed by the third calculation unit 953.
  • the first acquiring unit 951 may be configured to acquire a projection distance between the target subject in the scene and the light emitter 100.
  • the second obtaining unit 952 may be configured to obtain the ambient brightness of the scene.
  • the third calculation unit 953 may be configured to calculate the second light emitting power according to the ambient brightness and the projection distance.
  • steps 071, 072, and 073 can all be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a projection distance between the target subject in the scene and the light emitter 100, obtain the ambient brightness of the scene, and calculate the second light emitting power according to the ambient brightness and the projection distance.
  • the projection distance between the target subject in the scene and the light transmitter 100 may be obtained by the time-of-flight depth camera 300.
  • specifically, the light transmitter 100 first emits laser light at a preset luminous power and a predetermined on-frequency, the light receiver 200 receives the laser light reflected by objects in the scene, and the processor 805 calculates initial depth information of the scene based on the laser light received by the light receiver 200. Subsequently, the processor 805 determines the target subject from the scene.
  • generally, the central area of the field of view of the light receiver 200 can be taken as the area where the target subject is located, and the initial depth information of the pixels in this central area is used as the initial depth information of the target subject.
  • when there are multiple pieces of initial depth information, the processor 805 can calculate their average or median and use the average or median as the projection distance between the light emitter 100 and the target subject. In this way, the projection distance between the light emitter 100 and the target subject is obtained by the time-of-flight depth camera 300 itself.
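  • A sketch of the central-region estimate (hypothetical; assumes the initial depth map is a NumPy array with invalid pixels set to zero):

```python
import numpy as np

def target_projection_distance(depth_map: np.ndarray, frac: float = 0.3) -> float:
    """Median initial depth over the central region of the field of view."""
    h, w = depth_map.shape
    dh, dw = int(h * frac / 2), int(w * frac / 2)
    center = depth_map[h // 2 - dh : h // 2 + dh, w // 2 - dw : w // 2 + dw]
    return float(np.median(center[center > 0]))  # ignore invalid zero depths
```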
  • the ambient brightness can be detected by the light sensor, and the processor 805 reads the detected ambient brightness from the light sensor.
  • the ambient brightness may be detected by an infrared camera (which may be the light receiver 200) or a visible light camera 400.
  • the infrared camera or the visible light camera 400 captures an image of a scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
  • after determining the ambient brightness and the projection distance, the processor 805 calculates the second light emitting power jointly from these two parameters.
  • it can be understood that the optical receiver 200 receives both the infrared laser emitted by the optical transmitter 100 and the infrared light in the ambient light. When the ambient brightness is high, the ambient light contains a larger infrared component. If the luminous power of the infrared laser emitted by the optical transmitter 100 is low, the proportions of the received light that come from the optical transmitter 100's infrared laser and from the ambient infrared light are not very different, which makes the timing or the total amount of light received by the optical receiver 200 inaccurate and further reduces the accuracy of the acquired depth information. The emission power of the infrared laser therefore needs to be increased to reduce the influence of ambient infrared light on the optical receiver 200's reception of the infrared laser from the optical transmitter 100. Conversely, when the ambient brightness is low, the ambient light contains less infrared light; in that case, if the light emitter 100 emitted at a higher luminous power, power consumption would increase unnecessarily.
  • accordingly, when the ambient brightness is high and/or the projection distance is large, the second light emitting power of the light transmitter 100 can be appropriately increased; when the ambient brightness is low and/or the projection distance is small, the second light emitting power of the light transmitter 100 can be appropriately reduced.
  • jointly determining the second light emitting power of the light transmitter 100 from the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand, and improve the accuracy of the acquired depth information of the scene on the other.
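  • One plausible way to combine the two factors (a hypothetical heuristic; the patent does not give a concrete formula, only the monotonic relationships above):

```python
def second_luminous_power(ambient_brightness: float,
                          projection_distance_m: float,
                          base_power: float = 0.5,
                          max_power: float = 1.0) -> float:
    """Increase emitter power with ambient IR level and with distance.

    ambient_brightness: normalized 0..1 (e.g. mean image brightness / 255).
    """
    power = base_power * (1.0 + ambient_brightness) * (1.0 + projection_distance_m)
    return min(power, max_power)  # clamp to the emitter's safe maximum
```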
  • the second turn-on frequency of the light transmitter 100 may likewise be determined according to the application scenario. When the application scenario is unlocking the electronic device 800 based on a three-dimensional face, the light receiver 200 usually only needs to output a few depth images per second, for example, 3 frames/second, 4 frames/second, or 5 frames/second, and the second turn-on frequency of the light transmitter 100 can correspondingly be set to 3 times/second, 4 times/second, 5 times/second, and so on. When the application scenario is the user recording a three-dimensional video, the light receiver 200 usually needs to output more depth images per second, for example, 30 frames/second or 60 frames/second, and the second turn-on frequency of the light transmitter 100 can correspondingly be set to 30 times/second, 60 times/second, and so on. In this way, the turn-on frequency best suited to each application scenario is used, meeting the user's needs.
  • the electronic device 800 includes a casing 801 and a time-of-flight depth camera 300.
  • the housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800.
  • the housing 801 can provide protection for the functional components from dust, drop, and water.
  • the functional components can be a display screen 802, a visible light camera 400, a receiver, and the like.
  • the housing 801 includes a main body 803 and a movable bracket 804.
  • the movable bracket 804 can move relative to the main body 803 under the driving of a driving device.
  • the movable bracket 804 can slide relative to the main body 803, sliding into or out of the main body 803 (as shown in FIG. 20).
  • some functional elements may be installed on the main body 803, while other functional elements (such as the time-of-flight depth camera 300, the visible light camera 400, and the receiver) may be installed on the movable bracket 804; movement of the movable bracket 804 drives these other functional elements to retract into or extend out of the main body 803.
  • the embodiments shown in FIG. 1 and FIG. 21 are merely examples of a specific form of the casing 801, and cannot be understood as a limitation on the casing 801 of the present application.
  • a time-of-flight depth camera 300 is mounted on the housing 801.
  • the casing 801 may be provided with an acquisition window, and the time-of-flight camera 300 is aligned with the acquisition window so that the time-of-flight camera 300 acquires depth information.
  • the time-of-flight depth camera 300 is mounted on a movable bracket 804.
  • the user can trigger the movable bracket 804 to slide out of the main body 803 to drive the time-of-flight depth camera 300 to extend from the main body 803; when the time-of-flight depth camera 300 is not needed, the user can trigger the movable bracket 804 to slide into the main body 803 so that the time-of-flight depth camera 300 retracts into the main body.
  • the time-of-flight depth camera 300 includes a first substrate assembly 71, a pad 72, a light emitter 100 and a light receiver 200.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other.
  • the spacer 72 is disposed on the first substrate 711.
  • the light emitter 100 is used for projecting laser light outward, and the light emitter 100 is disposed on the cushion block 72.
  • the flexible circuit board 712 is bent and one end of the flexible circuit board 712 is connected to the first substrate 711 and the other end is connected to the light emitter 100.
  • the light receiver 200 is disposed on the first substrate 711.
  • the light receiver 200 is configured to receive laser light reflected by a person or an object in a scene.
  • the light receiver 200 includes a housing 741 and an optical element 742 provided on the housing 741.
  • the housing 741 is integrally connected with the pad 72.
  • the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712.
  • the first substrate 711 may be a printed wiring board or a flexible wiring board.
  • the control circuit and the like of the time-of-flight camera 300 may be laid on the first substrate 711.
  • One end of the flexible circuit board 712 may be connected to the first substrate 711.
  • the flexible circuit board 712 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 712 can be selected.
  • the spacer 72 is disposed on the first substrate 711.
  • the spacer 72 is in contact with the first substrate 711 and is carried on the first substrate 711.
  • the spacer 72 may be combined with the first substrate 711 by means of adhesion or the like.
  • the material of the spacer 72 may be metal, plastic, or the like.
  • the surface on which the pad 72 is combined with the first substrate 711 may be flat, and the opposite surface of the pad 72 may also be flat, so that the light emitter 100 has better flatness when disposed on the pad 72.
  • the light receiver 200 is disposed on the first substrate 711, and the contact surface between the light receiver 200 and the first substrate 711 is substantially flush with the contact surface between the pad 72 and the first substrate 711 (that is, the installation starting points of the two are on the same plane).
  • the light receiver 200 includes a housing 741 and an optical element 742.
  • the casing 741 is disposed on the first substrate 711, and the optical element 742 is disposed on the casing 741.
  • the casing 741 may be a lens holder and a lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the casing 741.
  • the light receiver 200 further includes a photosensitive chip (not shown), and the laser light reflected by a person or an object in the scene passes through the optical element 742 and is irradiated into the photosensitive chip, and the photosensitive chip generates a response to the laser.
  • the housing 741 and the cushion block 72 are integrally connected.
  • the casing 741 and the cushion block 72 may be integrally formed; or the materials of the casing 741 and the cushion block 72 are different, and the two are integrally formed by two-color injection molding or the like.
  • the housing 741 and the spacer 72 may also be separately formed, and the two form a matching structure.
  • during assembly, one of the housing 741 and the spacer 72 may first be set on the first substrate 711, and then the other is disposed on the first substrate 711 and the two are connected integrally.
  • the light transmitter 100 is disposed on the pad 72, which can increase the height of the light transmitter 100, thereby increasing the height of the surface on which the laser is emitted by the light transmitter 100.
  • the laser light emitted by the light emitter 100 is therefore not easily blocked by the light receiver 200, so that the laser light can fully irradiate the measured object in the target space.
  • the light emitter 100 includes a second substrate assembly 51, a light emitting assembly 101 and a housing 52.
  • the second substrate assembly 51 is disposed on the pad 72, and the second substrate assembly 51 is connected to the flexible circuit board 712.
  • the light emitting component 101 is disposed on the second substrate component 51, and the light emitting component 101 is used for emitting laser light.
  • the casing 52 is disposed on the second substrate assembly 51.
  • the casing 52 is formed with a receiving space 521, and the receiving space 521 can be used for receiving the light emitting module 101.
  • the flexible circuit board 712 may be detachably connected to the second substrate assembly 51.
  • the light emitting component 101 is connected to the second substrate component 51.
  • the housing 52 may be bowl-shaped as a whole, with its opening facing downward and disposed on the second substrate assembly 51, so that the light emitting assembly 101 is accommodated in the receiving space 521.
  • the housing 52 is provided with a light emitting port 522 corresponding to the light emitting component 101.
  • the laser light emitted from the light emitting component 101 passes through the light emitting port 522 and is emitted.
  • the laser light may pass directly through the light exit port 522, or may pass through the light exit port 522 after its optical path has been changed by other optical devices.
  • the second substrate assembly 51 includes a second substrate 511 and a reinforcing member 512.
  • the second substrate 511 is connected to the flexible circuit board 712.
  • the light emitting component 101 and the reinforcing member 512 are disposed on opposite sides of the second substrate 511.
  • a specific type of the second substrate 511 may be a printed circuit board or a flexible circuit board, and a control circuit may be laid on the second substrate 511.
  • the reinforcing member 512 can be fixedly connected to the second substrate 511 by gluing, riveting, or the like. The reinforcing member 512 can increase the overall strength of the second substrate assembly 51.
  • the reinforcing member 512 can directly contact the pad 72, the second substrate 511 is not exposed to the outside, and does not need to be in direct contact with the pad 72, and the second substrate 511 is not vulnerable Contamination by dust, etc.
  • the reinforcing member 512 and the cushion block 72 are separately formed.
  • the spacer 72 may first be mounted on the first substrate 711.
  • the two ends of the flexible circuit board 712 are then connected to the first substrate 711 and the second substrate 511, respectively, with the flexible circuit board 712 left unbent at first.
  • the flexible circuit board 712 is then bent, so that the reinforcing member 512 is disposed on the cushion block 72.
  • the reinforcing member 512 and the spacer 72 may be integrally formed, for example, integrally formed by a process such as injection molding.
  • in this case, the spacer 72 and the light emitter 100 may be installed on the first substrate 711 together.
  • the light emitting component 101 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, and a driver 61.
  • the lens barrel 30 includes a ring-shaped lens barrel sidewall 33, and the ring-shaped lens barrel sidewall 33 surrounds a receiving cavity 62.
  • the side wall 33 of the lens barrel includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface.
  • the side wall 33 of the lens barrel includes a first surface 31 and a second surface 32 opposite to each other.
  • the receiving cavity 62 penetrates the first surface 31 and the second surface 32.
  • the first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the receiving cavity 62.
  • the bottom surface 35 of the mounting groove 34 is located on a side of the mounting groove 34 remote from the first surface 31.
  • at the end near the first surface 31, the outer surface 332 of the lens barrel side wall 33 has a circular cross-section, and an external thread is formed on the outer surface 332 at that end.
  • the lens barrel 30 is carried on a second substrate 511.
  • the second substrate 511 may be a circuit board 511.
  • the circuit board 511 is in contact with the second surface 32 of the lens barrel 30 to close one end of the receiving cavity 62.
  • the light source 10 is carried on the circuit board 511 and is housed in the receiving cavity 62.
  • the light source 10 is configured to emit laser light toward the first surface 31 (the mounting groove 34) side of the lens barrel 30.
  • the light source 10 may be a single-point light source or a multi-point light source.
  • when the light source 10 is a single-point light source, the light source 10 may specifically be an edge-emitting laser, for example a distributed feedback laser (DFB); when the light source 10 is a multi-point light source, the light source 10 may be a vertical-cavity surface-emitting laser (VCSEL), or a multi-point light source composed of multiple edge-emitting lasers.
  • the vertical-cavity surface-emitting laser has a small height; using it as the light source 10 helps reduce the height of the light emitter 100, which makes it easier to integrate the light emitter 100 into an electronic device 800, such as a mobile phone, with strict requirements on body thickness.
  • compared with the vertical-cavity surface-emitting laser, the edge-emitting laser has a smaller temperature drift, which reduces the influence of temperature on the laser light projected by the light source 10.
  • the driver 61 is carried on the circuit board 511 and is electrically connected to the light source 10. Specifically, the driver 61 may receive a modulation signal from the processor 805, convert the modulation signal into a constant current source, and transmit it to the light source 10, so that the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30 under the action of the constant current source.
  • the driver 61 of this embodiment is provided outside the lens barrel 30. In other embodiments, the driver 61 may be disposed in the lens barrel 30 and carried on the circuit board 511.
  • the diffuser 20 is mounted (supported) in the mounting groove 34 and abuts the mounting groove 34.
  • the diffuser 20 is used to diffuse the laser light passing through it; that is, when the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused and projected to the outside of the lens barrel 30.
  • the protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41.
  • a light through hole 401 is defined in the center of the top wall 41.
  • the protective side wall 42 is disposed around the top wall 41 and the light through hole 401.
  • the top wall 41 and the protection side wall 42 together form a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43.
  • the cross-section of the inner surface of the protective sidewall 42 is circular, and an inner thread is formed on the inner surface of the protective sidewall 42.
  • the internal thread of the protective sidewall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30.
  • the top wall 41 abuts against the diffuser 20, so that the diffuser 20 is sandwiched between the top wall 41 and the bottom surface 35 of the mounting groove 34.
  • in this way, the diffuser 20 is fixed on the lens barrel 30 by installing it in the mounting groove 34 and mounting the protective cover 40 on the lens barrel 30 to clamp the diffuser 20 between the protective cover 40 and the bottom surface 35 of the mounting groove 34, without using glue.
  • this prevents volatilized glue from diffusing and solidifying on the surface of the diffuser 20 and affecting its microstructure, and prevents the diffuser 20 from falling off the lens barrel 30 when the adhesion between the diffuser 20 and the lens barrel 30 decreases due to aging of the glue.
  • the present application further provides an electronic device 800.
  • the electronic device 800 includes the time-of-flight depth camera 300 according to any one of the above embodiments, one or more processors 805, a memory 806, and one or more programs.
  • the one or more programs are stored in the memory 806 and are configured to be executed by the one or more processors 805.
  • the programs include instructions for performing the control method according to any one of the foregoing embodiments.
  • the program includes instructions for performing the following steps:
  • obtaining a scene image of a scene, and recognizing whether a human face exists in the scene image;
  • when a human face exists in the scene image, controlling the light emitter 100 to emit light at a first light-emitting power and/or a first on frequency; and
  • when no human face exists in the scene image, controlling the light emitter 100 to emit light at a second light-emitting power and/or a second on frequency.
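By way of illustration only, the following Python sketch shows one way these steps could be arranged in code. The Emitter interface, the numeric power and frequency values, and the use of an OpenCV Haar-cascade face detector are assumptions introduced for this sketch, not details specified by the present application:

    # Hedged sketch: `Emitter` is a hypothetical control interface; the power
    # and frequency constants are placeholders, not values from this application.
    import cv2

    FIRST_POWER, FIRST_ON_FREQ = 0.6, 30     # face present: lower power (assumed values)
    SECOND_POWER, SECOND_ON_FREQ = 1.0, 60   # no face present (assumed values)

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def control_light_emitter(scene_image, emitter):
        """Select the emitting power and on frequency based on face presence."""
        gray = cv2.cvtColor(scene_image, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            emitter.set_power(FIRST_POWER)
            emitter.set_on_frequency(FIRST_ON_FREQ)
        else:
            emitter.set_power(SECOND_POWER)
            emitter.set_on_frequency(SECOND_ON_FREQ)
        return faces

One natural reading of the branch is eye protection, with lower power used when a face is present; the constants above encode that assumption.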
  • the program further includes instructions for performing the following steps:
  • controlling the light emitter 100 to emit light at the first light-emitting power and the second sub-on frequency.
  • the program further includes instructions for performing the following steps:
  • calculating the projection distance according to the first scale and the second scale.
  • the program further includes instructions for performing the following steps:
  • calculating the projection distance according to the first scale and the distance coefficient when the user wears glasses.
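The projection-distance steps can likewise be illustrated with a hedged sketch. Only the inputs come from the present application (a first scale measured from the face in the scene image, a preset second scale, and a distance coefficient applied when the user wears glasses); the calibration distance, the square-root relation, and the coefficient value below are assumptions for demonstration:

    # Hedged sketch: the calibration distance, the sqrt relation, and the
    # glasses coefficient value are illustrative assumptions only.
    import math

    CALIBRATION_DISTANCE_M = 0.3   # distance at which the preset second scale was measured (assumed)
    GLASSES_COEFFICIENT = 1.1      # distance coefficient when the user wears glasses (assumed)

    def first_scale(face_box, image_size):
        """First scale: ratio of the detected face region to the whole scene image."""
        x, y, w, h = face_box
        img_w, img_h = image_size
        return (w * h) / (img_w * img_h)

    def projection_distance(first, second, wears_glasses=False):
        """Estimate the projection distance from the first and second scales.

        A face's apparent area shrinks roughly with the square of its distance,
        so the distance scales with the square root of the scale ratio.
        """
        distance = CALIBRATION_DISTANCE_M * math.sqrt(second / first)
        if wears_glasses:
            distance *= GLASSES_COEFFICIENT
        return distance

Under these assumptions, a face whose scale is one quarter of the preset scale would be estimated at twice the calibration distance.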
  • the present application further provides a computer-readable storage medium 900.
  • the computer-readable storage medium 900 includes a computer program used in conjunction with the electronic device 800, and the computer program can be executed by the processor 805 to complete the control method according to any one of the foregoing embodiments.
  • the computer program may be executed by the processor 805 to complete the following steps:
  • obtaining a scene image of a scene, and recognizing whether a human face exists in the scene image;
  • when a human face exists in the scene image, controlling the light emitter 100 to emit light at a first light-emitting power and/or a first on frequency; and
  • when no human face exists in the scene image, controlling the light emitter 100 to emit light at a second light-emitting power and/or a second on frequency.
  • the computer program can also be executed by the processor 805 to complete the following steps:
  • controlling the light emitter 100 to emit light at the first light-emitting power and the second sub-on frequency.
  • the computer program may also be executed by the processor 805 to complete the following steps:
  • calculating the projection distance according to the first scale and the second scale.
  • the computer program may also be executed by the processor 805 to complete the following steps:
  • calculating the projection distance according to the first scale and the distance coefficient when the user wears glasses.
  • the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, for example two or three, unless specifically defined otherwise.
  • any process or method description in a flowchart or otherwise described herein can be understood as representing a module, fragment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process.
  • the scope of the preferred embodiments of this application includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved; this should be understood by those skilled in the art to which the embodiments of the present application pertain.
  • the logic and/or steps represented in a flowchart or otherwise described herein, for example an ordered list of executable instructions that may be considered to implement logical functions, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device).
  • a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • more specific examples of computer-readable media include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, and then stored in a computer memory.
  • each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits, application-specific integrated circuits with suitable combinational logic gate circuits, programmable gate arrays (PGA), field programmable gate arrays (FPGA), and so on.
  • a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
  • the program can be stored in a computer-readable storage medium.
  • when the program is executed, it performs one or a combination of the steps of the method embodiments.
  • each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
  • the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a control method, a control device (90), a time-of-flight depth camera (300), an electronic device (800), and a computer-readable storage medium (900). The control method comprises: (01) obtaining a scene image of a scene; (03) recognizing whether a human face exists in the scene image; (05) when a human face exists in the scene image, controlling a light emitter (100) to emit light at a first light-emitting power and/or a first on frequency; and (07) when no human face exists in the scene image, controlling the light emitter (100) to emit light at a second light-emitting power and/or a second on frequency.
PCT/CN2019/090020 2018-09-12 2019-06-04 Control method and device, depth camera, electronic device, and computer-readable storage medium Ceased WO2020052284A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811060690.4 2018-09-12
CN201811060690.4A CN109068036B (zh) Control method and device, depth camera, electronic device and readable storage medium

Publications (1)

Publication Number Publication Date
WO2020052284A1 true WO2020052284A1 (fr) 2020-03-19

Family

ID=64760107

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/090020 Ceased WO2020052284A1 (fr) 2018-09-12 2019-06-04 Control method and device, depth camera, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN109068036B (fr)
WO (1) WO2020052284A1 (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108833889B (zh) * 2018-08-22 2020-06-23 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium
CN109068036B (zh) * 2018-09-12 2020-09-25 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium
CN112351155B (zh) * 2019-08-06 2023-02-17 Oppo(重庆)智能科技有限公司 Electronic device, anti-candid-photography apparatus for an electronic device, and control method thereof
CN110418062A (zh) * 2019-08-29 2019-11-05 上海云从汇临人工智能科技有限公司 Photographing method and apparatus, device, and machine-readable medium
CN113126111B (zh) * 2019-12-30 2024-02-09 Oppo广东移动通信有限公司 Time-of-flight module and electronic device
CN113223209A (zh) * 2020-01-20 2021-08-06 深圳绿米联创科技有限公司 Door lock control method and apparatus, electronic device, and storage medium
CN111487633B (zh) * 2020-04-06 2024-08-23 深圳蚂里奥技术有限公司 Laser safety control device and method
CN111427049B (zh) * 2020-04-06 2024-08-27 深圳蚂里奥技术有限公司 Laser safety device and control method
CN114531541B (zh) * 2022-01-10 2023-06-02 荣耀终端有限公司 Control method and device for a camera module


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016064096A2 (fr) * 2014-10-21 2016-04-28 Lg Electronics Inc. Mobile terminal and control method therefor
WO2018144368A1 (fr) * 2017-02-03 2018-08-09 Microsoft Technology Licensing, Llc Active illumination management through contextual information
CN107607957A (zh) * 2017-09-27 2018-01-19 维沃移动通信有限公司 Depth information acquisition system and method, camera module, and electronic device
CN108281880A (zh) * 2018-02-27 2018-07-13 广东欧珀移动通信有限公司 Control method, control device, terminal, computer device, and storage medium
CN108376251A (zh) * 2018-02-27 2018-08-07 广东欧珀移动通信有限公司 Control method, control device, terminal, computer device, and storage medium
CN108376252A (zh) * 2018-02-27 2018-08-07 广东欧珀移动通信有限公司 Control method, control device, terminal, computer device, and storage medium
CN109068036A (zh) * 2018-09-12 2018-12-21 Oppo广东移动通信有限公司 Control method and device, depth camera, electronic device and readable storage medium

Also Published As

Publication number Publication date
CN109068036A (zh) 2018-12-21
CN109068036B (zh) 2020-09-25

Similar Documents

Publication Publication Date Title
CN109068036B (zh) Control method and device, depth camera, electronic device and readable storage medium
CN112702541B (zh) Control method and device, depth camera, electronic device and readable storage medium
CN108833889B (zh) Control method and device, depth camera, electronic device and readable storage medium
CN109149355B (zh) Light emitting module and control method therefor, TOF depth camera, and electronic device
US11335028B2 (en) Control method based on facial image, related control device, terminal and computer device
CN108281880A (zh) Control method, control device, terminal, computer device, and storage medium
WO2020038060A1 (fr) Laser projection module and control method therefor, image acquisition device, and electronic apparatus
CN109271916B (zh) Electronic device and control method therefor, control device, and computer-readable storage medium
CN108333860B (zh) Control method, control device, depth camera, and electronic device
CN109031252B (zh) Calibration method, calibration controller, and calibration system
CN108227361B (zh) Control method, control device, depth camera, and electronic device
WO2016010481A1 (fr) Modules optoélectroniques permettant de faire la distinction entre des signaux indiquant des réflexions d'un objet d'intérêt et des signaux indiquant une réflexion parasite
TWI684026B (zh) Control method, control device, depth camera, and electronic device
CN108594451B (zh) Control method, control device, depth camera, and electronic device
CN108960061A (zh) Control method, control device, electronic device, computer device, and storage medium
CN108376252B (zh) Control method, control device, terminal, computer device, and storage medium
CN108509867A (zh) Control method, control device, depth camera, and electronic device
CN113325391A (zh) Wide-angle TOF module and application thereof
US10551500B2 (en) Infrared optical element for proximity sensor system
HK40016464A (en) Control method, control apparatus, terminal, computer device, and storage medium
CN120909002A (zh) Smart glasses, occlusion detection device, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19859708

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19859708

Country of ref document: EP

Kind code of ref document: A1