WO2020052284A1 - Control method and device, depth camera, electronic device, and readable storage medium - Google Patents
- Publication number
- WO2020052284A1 (PCT/CN2019/090020)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- scene
- scene image
- projection distance
- frequency
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/40—Transceivers
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/50—Transmitters
- H04B10/564—Power control
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- the present application relates to the field of three-dimensional imaging technology, and in particular, to a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
- A time-of-flight (TOF) imaging system can calculate the depth information of a measured object from the time difference between the moment the optical transmitter emits an optical signal and the moment the optical receiver receives the optical signal.
- Light emitters typically include a light source and a diffuser. Light from the source is diffused by the diffuser and projected into the scene as a uniform area light. The light emitted by the source is usually an infrared laser.
- Embodiments of the present application provide a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
- the method for controlling an optical transmitter includes: acquiring a scene image of a scene; identifying whether a human face exists in the scene image; when a human face exists in the scene image, controlling the optical transmitter to emit light at a first light emitting power and/or a first on frequency; and when no human face exists in the scene image, controlling the optical transmitter to emit light at a second light emitting power and/or a second on frequency.
- the control device of the light transmitter includes a first acquisition module, an identification module, and a control module.
- the first acquisition module is configured to acquire a scene image of a scene;
- the recognition module is configured to recognize whether a human face exists in the scene image;
- the control module is configured to control the light emitter to emit light at a first light emitting power and/or a first on frequency when a human face exists in the scene image, and to control the light emitter to emit light at a second light emitting power and/or a second on frequency when no human face exists in the scene image.
- the time-of-flight depth camera includes a light transmitter and a processor.
- the processor is configured to obtain a scene image of a scene, identify whether a human face exists in the scene image, control the light transmitter to emit light at a first light emitting power and/or a first on frequency when a human face exists in the scene image, and control the light transmitter to emit light at a second light emitting power and/or a second on frequency when no human face exists in the scene image.
- the electronic device includes the time-of-flight depth camera, one or more processors, a memory, and one or more programs.
- the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the programs include instructions for performing the control method described above.
- the computer-readable storage medium of the embodiment of the present application includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
- FIG. 1 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
- FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
- FIG. 4 and FIG. 5 are schematic diagrams of the turn-on frequencies of the optical transmitter in some embodiments of the present application.
- FIG. 6 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 7 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
- FIGS. 8 and 9 are schematic flowcharts of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 10 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
- FIG. 11 is a schematic block diagram of a second acquisition module in a control device for a light transmitter according to some embodiments of the present application.
- FIG. 12 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 13 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
- FIG. 14 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 15 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
- FIG. 16 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 17 is a schematic block diagram of a second computing unit in a control device for a light transmitter according to some embodiments of the present application.
- FIG. 18 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
- FIG. 19 is a schematic block diagram of a control module in a control device for a light transmitter according to some embodiments of the present application.
- FIG. 20 is a schematic three-dimensional structure diagram of an electronic device according to some embodiments of the present application.
- FIG. 21 is a schematic diagram of a three-dimensional structure of a depth camera according to some embodiments of the present application.
- FIG. 22 is a schematic plan view of a depth camera according to some embodiments of the present application.
- FIG. 23 is a schematic cross-sectional view of the depth camera in FIG. 22 along the line XXIII-XXIII.
- FIG. 24 is a schematic structural diagram of a light emitter according to some embodiments of the present application.
- FIG. 25 is a schematic diagram of a connection between an electronic device and a computer-readable storage medium according to some embodiments of the present application.
- the present application provides a method for controlling a light transmitter 100.
- the control method includes: 01: acquiring a scene image of the scene; 03: identifying whether a human face exists in the scene image; 05: when a human face exists in the scene image, controlling the light transmitter 100 to emit light at a first light emitting power and/or a first on frequency; and 07: when no human face exists in the scene image, controlling the light transmitter 100 to emit light at a second light emitting power and/or a second on frequency.
- the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency.
- the control method further includes: judging an application scenario of the light transmitter.
- the step of controlling the light transmitter 100 to emit light at the first light emitting power and/or the first on frequency includes: when the application scene is the first scene and a human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the first sub-on frequency; and when the application scene is the second scene and a human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the second sub-on frequency.
- the control method further includes: obtaining a projection distance between the user and the light transmitter 100; and calculating the first light emitting power according to the projection distance.
- the step of obtaining a projection distance between the user and the light transmitter 100 includes: calculating a first proportion of a human face in the scene image; and calculating the projection distance according to the first proportion.
- the step of calculating the projection distance according to the first ratio includes: calculating a second ratio of a preset feature region of the human face to the human face in the scene image; and calculating the projection distance according to the first ratio and the second ratio.
- the step of calculating the projection distance according to the first ratio includes: judging whether the user is wearing glasses according to the scene image; and calculating the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
- the step of calculating the projection distance according to the first ratio includes: judging the age of the user according to the scene image; and calculating the projection distance according to the first ratio and age.
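The distance-estimation steps above can be sketched as follows. This is a minimal illustration, not the application's actual formula: the base calibration factor, the glasses coefficient, and the child adjustment are all assumed placeholder values.

```python
BASE_K = 0.5          # assumed calibration constant mapping ratio -> metres
GLASSES_COEFF = 1.1   # assumed distance coefficient when glasses are detected

def estimate_projection_distance(first_ratio, second_ratio=None,
                                 wears_glasses=False, age=None):
    """Estimate user-to-emitter distance from the face's share of the image."""
    if first_ratio <= 0:
        raise ValueError("first_ratio must be positive")
    # Distance grows as the face occupies less of the frame.
    distance = BASE_K / first_ratio ** 0.5
    if second_ratio is not None:
        # A larger preset-feature share suggests a closer, larger face;
        # damp the estimate accordingly (illustrative rule).
        distance *= 1.0 / (1.0 + second_ratio)
    if wears_glasses:
        distance *= GLASSES_COEFF
    if age is not None and age < 12:
        # A child's face is smaller, so the same ratio implies a shorter
        # distance (illustrative adjustment).
        distance *= 0.9
    return distance
```

The monotonic rule matches the application's claim that the projection distance increases as the first ratio decreases; the optional arguments mirror the second-ratio, glasses, and age refinements listed above.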
- the step of controlling the light emitter 100 to emit light at the second light emission power and/or the second on frequency includes: obtaining a projection distance between a target subject in the scene and the light emitter 100; obtaining the ambient brightness of the scene; and calculating the second luminous power according to the ambient brightness and the projection distance.
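The second-power calculation in that step can be illustrated with a toy model. The inverse-square scaling, the normalised brightness, and the clamping ceiling are assumptions for illustration, not values from the application:

```python
def second_emission_power(ambient_brightness, projection_distance,
                          base_power=1.0, max_power=4.0):
    """Scale emission power with ambient brightness and distance.

    ambient_brightness is normalised to [0, 1] and projection_distance is
    in metres (both illustrative). A farther target and brighter ambient
    infrared both call for more power so the reflected signal stays above
    the noise floor; power is clamped to an assumed eye-safe ceiling.
    """
    power = base_power * projection_distance ** 2   # inverse-square falloff
    power *= 1.0 + ambient_brightness               # ambient-light headroom
    return min(power, max_power)
```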
- the present application provides a control device 90 of a light transmitter 100.
- the control device 90 includes a first acquisition module 91, an identification module 93, and a control module 95.
- the first acquisition module 91 is configured to acquire a scene image of a scene.
- the recognition module 93 is used to recognize whether a human face exists in the scene image.
- the control module 95 is configured to control the light transmitter 100 to emit light at a first light emitting power and/or a first on frequency when a human face is present in the scene image, and to control the light transmitter 100 to emit light at a second light emitting power and/or a second on frequency when no human face is present in the scene image.
- the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency.
- the control device 90 further includes a determination module 941, which is configured to determine an application scenario of the optical transmitter 100.
- the control module 95 may also control the light transmitter 100 to emit light at the first light emitting power and the first sub-on frequency when the application scene is the first scene and a human face is present in the scene image, and control the light transmitter 100 to emit light at the first light emitting power and the second sub-on frequency when the application scene is the second scene and a human face is present in the scene image.
- the control device 90 further includes a second acquisition module 942 and a calculation module 943.
- the second acquisition module 942 is configured to acquire a projection distance between the user and the light transmitter 100.
- the calculation module 943 is configured to calculate the first light emitting power according to the projection distance.
- the second obtaining module 942 includes a first calculation unit 9421 and a second calculation unit 9422.
- the first calculation unit 9421 is configured to calculate a first proportion of a face in a scene image.
- the second calculation unit 9422 is configured to calculate a projection distance according to the first ratio.
- the second calculation unit 9422 includes a first calculation sub-unit 9423 and a second calculation sub-unit 9424.
- the first calculation subunit 9423 is configured to calculate a second proportion of the preset feature area of the human face in the scene image.
- the second calculation subunit 9424 is configured to calculate a projection distance according to the first scale and the second scale.
- the second calculation unit 9422 includes a first determination sub-unit 9425 and a third calculation sub-unit 9426.
- the first judging subunit 9425 is configured to judge whether the user wears glasses according to the scene image.
- the third calculation subunit 9426 is configured to calculate a projection distance according to the first ratio and the distance coefficient when the user wears glasses.
- the second calculation unit 9422 includes a second determination sub-unit 9427 and a fourth calculation sub-unit 9428.
- the second judging subunit 9427 is configured to judge the age of the user according to the scene image.
- the fourth calculation subunit 9428 is configured to calculate a projection distance according to the first ratio and age.
- the control module 95 includes a first obtaining unit 951, a second obtaining unit 952, and a third computing unit 953.
- the first obtaining unit 951 is configured to obtain a projection distance between a target subject in the scene and the light emitter 100.
- the second obtaining unit 952 is configured to obtain the ambient brightness of the scene.
- the third calculation unit 953 is configured to calculate a second light emitting power according to the ambient brightness and the projection distance.
- the application also provides a time-of-flight depth camera 300.
- the time-of-flight depth camera 300 includes a light transmitter 100 and a processor.
- the processor is configured to: obtain a scene image of the scene; identify whether a human face is present in the scene image; control the light transmitter 100 to emit light at a first light emitting power and/or a first on frequency when a human face is present in the scene image; and control the light transmitter 100 to emit light at a second light emitting power and/or a second on frequency when no human face is present in the scene image.
- the processor is further configured to: determine an application scene of the light transmitter 100; control the light transmitter 100 to emit light at the first light emitting power and the first sub-on frequency when the application scene is the first scene and a human face is present in the scene image; and control the light transmitter 100 to emit light at the first light emitting power and the second sub-on frequency when the application scene is the second scene and a human face is present in the scene image.
- the processor is further configured to: obtain a projection distance between the user and the light transmitter 100; and calculate the first light emitting power according to the projection distance.
- the processor is further configured to: calculate a first proportion of a face in a scene image; and calculate a projection distance according to the first proportion.
- the processor is further configured to: calculate a second proportion of the preset feature area of the face in the scene image to the human face; and calculate a projection distance according to the first proportion and the second proportion.
- the processor is further configured to: determine whether the user wears glasses according to the scene image; and calculate the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
- the processor is further configured to: determine the age of the user according to the scene image; and calculate the projection distance according to the first ratio and the age.
- the processor is further configured to: obtain a projection distance between the target subject and the light emitter in the scene; obtain the ambient brightness of the scene; and calculate the second luminous power according to the ambient brightness and the projection distance .
- the present application further provides an electronic device 800.
- the electronic device includes the depth camera 300 according to any one of the foregoing embodiments, one or more processors 805, a memory 806, and one or more programs.
- One or more programs are stored in the memory 806 and are configured to be executed by one or more processors 805.
- the program includes instructions for the control method according to any one of the above embodiments.
- the present application further provides a computer-readable storage medium 900.
- the computer-readable storage medium 900 includes a computer program used in conjunction with the electronic device 800.
- the computer program may be executed by the processor 805 to implement the control method according to any one of the foregoing embodiments.
- The control method includes:
- 01: acquiring a scene image of the scene;
- 03: identifying whether a human face exists in the scene image;
- 05: when a human face is present in the scene image, controlling the light transmitter 100 to emit light at a first light emitting power and/or a first on frequency; and
- 07: when no human face is present in the scene image, controlling the light transmitter 100 to emit light at a second light emitting power and/or a second on frequency.
- the present application further provides a control device 90 of the optical transmitter 100.
- the control method according to the embodiment of the present application may be implemented by the control device 90 according to the embodiment of the present application.
- the control device 90 includes a first acquisition module 91, an identification module 93, and a control module 95.
- Step 01 may be implemented by the first obtaining module 91.
- Step 03 may be implemented by the identification module 93.
- Both steps 05 and 07 can be implemented by the control module 95. That is, the first acquisition module 91 may be configured to acquire a scene image of a scene.
- the recognition module 93 may be used to recognize whether a human face exists in the scene image.
- the control module 95 may be configured to control the light emitter 100 to emit light at a first light emitting power and/or a first on frequency when a human face is present in the scene image, and to control the light emitter 100 to emit light at a second light emitting power and/or a second on frequency when no human face is present in the scene image.
- this application further provides a time-of-flight depth camera 300.
- the control device 90 according to the embodiment of the present application can be applied to the time-of-flight depth camera 300 according to the embodiment of the present application.
- the time-of-flight depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor.
- Step 01, step 03, step 05, and step 07 can all be implemented by the processor. That is, the processor may be configured to acquire a scene image of the scene, identify whether a human face exists in the scene image, control the light transmitter 100 to emit light at a first light emitting power and/or a first on frequency when a human face exists in the scene image, and control the light transmitter 100 to emit light at a second light emitting power and/or a second on frequency when no human face exists in the scene image.
- the time-of-flight depth camera 300 according to the embodiment of the present application can be applied to the electronic device 800.
- the processor in the time-of-flight depth camera 300 according to the embodiment of the present application and the processor 805 of the electronic device 800 may be the same processor, or may be two independent processors. In a specific embodiment of the present application, the processor in the time-of-flight depth camera 300 and the processor 805 of the electronic device 800 are the same processor.
- the electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, a smart bracelet, smart glasses, a smart helmet), a drone, etc., and is not limited herein.
- the time-of-flight depth camera 300 generally includes a light transmitter 100 and a light receiver 200.
- the light transmitter 100 is used to project a laser light into the scene, and the light receiver 200 receives the laser light reflected by a person or an object in the scene.
- the processor 805 controls both the optical transmitter 100 and the optical receiver 200 to be turned on, and inputs a modulation signal with a certain frequency and amplitude to the driver 61 (shown in FIG. 24); the driver 61 converts the modulation signal into a constant current source and transmits it to the light source (shown in FIG. 24) of the light emitter 100, so that the light source emits laser light.
- the laser emitted by the optical transmitter 100 is usually an infrared laser. If the energy of the infrared laser is too high, or the infrared laser continuously irradiates one position for a long time, it can easily damage the user's eyes.
- In the control method, the control device 90, and the time-of-flight depth camera 300 of the embodiments of the present application, when the light transmitter 100 is turned on, a scene image of the scene is first collected. The scene image may be acquired by an infrared camera (which may be the light receiver 200) or by a visible light camera 400.
- the processor 805 recognizes whether a face exists in the scene image based on the face recognition algorithm.
- the processor 805 controls the light transmitter 100 to emit light at a first light emitting power and / or a first on frequency.
- the processor 805 controls the light transmitter 100 to emit light at a second light emitting power and / or a second on frequency.
- the light emitting power is indirectly characterized by the current output by the driver 61.
- the turn-on frequency refers to the turn-on frequency of the light transmitter 100, not the light-emitting frequency of the light transmitter 100.
- the turn-on frequency of the light transmitter 100 corresponds to the frame rate at which the light receiver 200 outputs a depth image (see FIG. 4 and FIG. 5).
- the processor 805 controlling the light emitter 100 to emit light at the first light emitting power and / or the first on frequency includes: (1) the processor 805 controls the light emitter 100 to emit light at the first light emitting power; (2) the processor 805 controls the light emitting The transmitter 100 emits light at a first on frequency; (3) The processor 805 controls the light transmitter 100 to emit light at a first light emission power and a first on frequency at the same time.
- the processor 805 controlling the light emitter 100 to emit light at the second light emitting power and / or the second on frequency includes: (1) the processor 805 controls the light emitter 100 to emit light at the second light emitting power; (2) the processor 805 Control the light transmitter 100 to emit light at a second on frequency; (3) The processor 805 controls the light transmitter 100 to emit light at a second light emission power and a second on frequency simultaneously.
- when a human face is present in the scene image, the light transmitter 100 is turned on with a lower first luminous power and a lower first on frequency. A lower first luminous power reduces the energy of the infrared laser irradiating the user's eyes, and a lower first turn-on frequency reduces the time the infrared laser continuously irradiates the user's eyes. In this way, the risk of the emitted laser injuring the user's eyes is reduced, increasing the safety of the time-of-flight depth camera 300.
- when no human face is present in the scene image, the light is emitted at a second light emission power and a second on frequency suitable for the current scene. In this way, the accuracy of the depth image output by the light receiver 200 can be improved.
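The two branches (steps 05 and 07) amount to selecting one of two power/frequency presets. A minimal sketch follows; all numeric values are assumed for illustration, since the application leaves the concrete numbers open:

```python
from dataclasses import dataclass

@dataclass
class EmitterSettings:
    power_ma: float   # drive current in mA, standing in for luminous power
    on_freq_hz: int   # turn-on events per second

# Illustrative presets (not values from the application).
FIRST = EmitterSettings(power_ma=50.0, on_freq_hz=5)     # face present: gentler
SECOND = EmitterSettings(power_ma=120.0, on_freq_hz=30)  # no face: full accuracy

def select_settings(face_detected: bool) -> EmitterSettings:
    """Lower power and turn-on frequency whenever a face is in frame."""
    return FIRST if face_detected else SECOND
```

Expressing luminous power as drive current mirrors the note above that the light emitting power is indirectly characterised by the current output by the driver 61.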
- the first turn-on frequency includes a first sub-on frequency and a second sub-on frequency.
- The control method also includes:
- 041: judging an application scene of the light transmitter 100.
- Step 05, controlling the light transmitter 100 to emit light at the first light emitting power and/or the first on frequency when a human face exists in the scene image, includes:
- 051: when the application scene is the first scene and a human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the first sub-on frequency; and
- 052: when the application scene is the second scene and a human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light emitting power and the second sub-on frequency.
- the control device 90 further includes a determination module 941.
- Step 041 may be implemented by the judgment module 941.
- Both steps 051 and 052 can be implemented by the control module 95. That is to say, the determination module 941 can be used to determine an application scenario of the light transmitter 100.
- the control module 95 may be configured to control the light emitter 100 to emit light at a first light emitting power and a first sub-on frequency when the application scene is the first scene and the face is present in the scene image, and when the application scene is the second scene and the scene When the human face is present in the image, the light transmitter 100 is controlled to emit light at a first light emitting power and a second sub-on frequency.
- step 041, step 051, and step 052 can all be implemented by the processor 805. That is, the processor 805 may be configured to determine an application scene of the light transmitter 100; when the application scene is the first scene and a human face exists in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the first sub-on frequency; and when the application scene is the second scene and a human face exists in the scene image, control the light transmitter 100 to emit light at the first light emitting power and the second sub-on frequency.
- the first scene refers to an application scenario in which the usage time of the time-of-flight depth camera 300 is shorter than a preset time, for example, shooting a static three-dimensional image of a scene, unlocking the electronic device 800 based on a three-dimensional face, or making a payment based on a three-dimensional face. The second scene refers to an application scenario in which the usage time of the time-of-flight depth camera 300 is greater than or equal to the preset time, for example, an application scenario in which a user has a three-dimensional video chat with other users.
- Take the first scene as an application scenario of unlocking the electronic device 800 based on a three-dimensional face, and the second scene as an application scenario in which a user has a three-dimensional video chat with other users.
- When the time-of-flight depth camera 300 is used to capture a user's three-dimensional face to unlock the electronic device 800, the light receiver 200 usually only needs to output a few depth images per second, for example, 3 frames per second, 4 frames per second, or 5 frames per second, with the light transmitter 100 correspondingly turned on 3 times/second, 4 times/second, or 5 times/second. In fact, the light receiver 200 can output only one frame of depth image per second, and correspondingly the processor 805 can set the first sub-on frequency to 1 time/second.
- When the time-of-flight depth camera 300 is used to capture a three-dimensional video of a user, so that the user can have a three-dimensional video chat with other users on the electronic device 800, the light receiver 200 usually needs to output more depth images per second, such as 30 frames per second or 60 frames per second, with the light transmitter 100 correspondingly turned on 30 times/second or 60 times/second. In fact, the light receiver 200 can output only 24 frames per second, and the processor 805 can set the second sub-on frequency to 24 times/second. It can be understood that when the screen refresh rate of the electronic device 800 reaches 24 frames/second, the human eye sees a smooth picture. Therefore, with the second sub-on frequency set to 24 times/second, the light transmitter 100 outputs depth images at the lowest smooth frame rate, which on the one hand reduces the damage of the laser to the user's eyes and on the other hand ensures that the user sees a smooth three-dimensional video picture.
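The scene-dependent choice of sub-on frequency described above can be sketched as follows; the preset-time threshold is an assumed placeholder, while the 1 Hz and 24 Hz values come from the examples in the preceding paragraphs:

```python
PRESET_TIME_S = 10.0   # assumed threshold separating the two scenes

def sub_on_frequency(expected_use_time_s: float) -> int:
    """First scene (short use: unlock, payment) -> 1 time/second;
    second scene (long use: 3-D video chat) -> 24 times/second,
    the lowest rate the eye perceives as a smooth picture."""
    if expected_use_time_s < PRESET_TIME_S:
        return 1    # first sub-on frequency
    return 24       # second sub-on frequency
```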
- After step 03, the control method further includes:
- 042: obtaining a projection distance between the user and the light transmitter 100; and
- 043: calculating the first light emitting power according to the projection distance.
- Step 042 includes:
- 0421: calculating a first proportion of the human face in the scene image; and
- 0422: calculating the projection distance according to the first proportion.
- the control device 90 further includes a second acquisition module 942 and a calculation module 943.
- the second acquisition module 942 includes a first calculation unit 9421 and a second calculation unit 9422.
- Step 042 may be implemented by the second obtaining module 942, and step 043 may be implemented by the calculation module 943.
- Step 0421 may be implemented by the first calculation unit 9421, and step 0422 may be implemented by the second calculation unit 9422. That is, the second obtaining module 942 may be used to obtain a projection distance between the user and the light transmitter 100.
- the calculation module 943 may be configured to calculate the first light emitting power according to the projection distance.
- the first calculation unit 9421 may be configured to calculate a first proportion of a human face in the scene image.
- the second calculation unit 9422 may be configured to calculate a projection distance according to the first ratio.
- step 042, step 043, step 0421, and step 0422 can all be implemented by the processor 805. That is, the processor 805 may be further configured to obtain a projection distance between the user and the light transmitter 100, and calculate the first light emitting power according to the projection distance. When obtaining the projection distance between the user and the light transmitter 100, the processor 805 specifically calculates a first proportion of the human face in the scene image and calculates the projection distance according to the first proportion.
- After the processor 805 recognizes the human face in the scene image, it extracts the face and counts the number of pixels the face occupies. The processor 805 then divides the number of face pixels by the total number of pixels in the scene image to obtain the first proportion of the face in the scene image, and finally calculates the projection distance based on the first proportion.
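The pixel-counting step can be written directly. In this sketch the boolean face mask is assumed to come from the preceding face-recognition step:

```python
def face_first_proportion(face_mask):
    """Fraction of scene-image pixels covered by the detected face.

    face_mask is a 2-D list of booleans with the same shape as the scene
    image, marking the pixels of the extracted face; producing the mask
    is the face-recognition step and is outside this sketch.
    """
    face_pixels = sum(sum(1 for p in row if p) for row in face_mask)
    total_pixels = len(face_mask) * len(face_mask[0])
    return face_pixels / total_pixels
```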
- When the first ratio is larger, the user is closer to the time-of-flight depth camera 300, that is, the user is closer to the light transmitter 100, and the projection distance is smaller.
- When the first ratio is smaller, the user is farther from the time-of-flight depth camera 300, that is, the user is farther from the light transmitter 100, and the projection distance is larger.
- the relationship between the projection distance and the first ratio satisfies that the projection distance increases as the first ratio decreases.
- When multiple faces are present in the scene image, the face with the largest area may be selected to calculate the first proportion; or the average of the areas of the multiple faces may be used to calculate the first proportion; or the face of the holder of the electronic device 800 may be identified from the multiple faces and used to calculate the first proportion.
- the first ratio has a mapping relationship with the projection distance.
- When the first ratio is a specific value, the projection distance is also a specific value, and the first ratio corresponds one-to-one to the projection distance.
- When the first ratio is a range, the projection distance is also a range, and each first-ratio range corresponds one-to-one to a projection-distance range.
- the mapping relationship between the first ratio and the projection distance may be calibrated in advance.
- the user is instructed to stand at a plurality of predetermined projection distances from the infrared camera or visible light camera 400, respectively, and the infrared camera or visible light camera 400 sequentially captures scene images.
- The processor 805 calculates the calibration ratio of the face to the scene image in each scene image, and stores the correspondence between the calibration ratio in each scene image and the corresponding predetermined projection distance. In subsequent use, the projection distance corresponding to the actually measured first ratio is found in the above mapping relationship.
- For example, the user is instructed to stand at projection distances of 10 cm, 20 cm, 30 cm, and 40 cm in turn; the infrared camera or visible light camera 400 captures a scene image at each distance, and from these scene images the processor 805 calculates the calibration ratios corresponding to 10 cm, 20 cm, 30 cm, and 40 cm as 80%, 60%, 45%, and 30%, respectively.
- The mapping relationships 10cm-80%, 20cm-60%, 30cm-45%, and 40cm-30% are stored in the memory 806 of the electronic device 800 in the form of a mapping table. In subsequent use, the projection distance corresponding to the first ratio is looked up directly in the mapping table.
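The mapping-table lookup described above can be sketched as follows; the nearest-entry fallback for measured ratios that fall between calibrated values is an assumption, since the text only specifies a stored table:

```python
# Calibrated mapping of first ratio -> predetermined projection distance,
# using the example values from the text (80% -> 10 cm, ..., 30% -> 40 cm).
CALIBRATION_TABLE = {0.80: 10, 0.60: 20, 0.45: 30, 0.30: 40}  # ratio -> distance (cm)

def projection_distance_from_table(first_ratio: float) -> int:
    """Return the projection distance for the calibration ratio closest to
    the actually measured first ratio (nearest-neighbour is illustrative)."""
    closest = min(CALIBRATION_TABLE, key=lambda r: abs(r - first_ratio))
    return CALIBRATION_TABLE[closest]
```

A measured first ratio of 0.58 is nearest to the 60% calibration entry and therefore maps to 20 cm.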
- the projection distance and the first ratio are calibrated in advance.
- the user is directed to stand at a predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects scene images.
- The processor 805 calculates the calibration ratio of the face in the scene image to the scene image, and stores the correspondence between this calibration ratio and the predetermined projection distance. In subsequent use, the projection distance is calculated based on the correspondence between the calibration ratio and the predetermined projection distance.
- For example, during calibration the user stands at a predetermined projection distance D0 and the processor 805 calculates the calibration ratio of the human face in the scene image as 45%. In actual measurement, when the first ratio is calculated as R, then by the properties of similar triangles R / 45% = D0 / D, that is, D = D0 x 45% / R, where D is the actual projection distance calculated from the actually measured first ratio R.
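The single-point similar-triangles relation D = D0 x R0 / R can be sketched as follows; the default calibration values (45% at 30 cm) are drawn from the example figures in the text and are illustrative only:

```python
def projection_distance(measured_ratio: float,
                        calib_ratio: float = 0.45,
                        calib_distance_cm: float = 30.0) -> float:
    """Single-point calibration: the first ratio is taken as inversely
    proportional to distance, so D = D0 * R0 / R (calibration defaults
    are illustrative, not specified together in the source)."""
    return calib_distance_cm * calib_ratio / measured_ratio
```

A measured ratio equal to the calibration ratio reproduces the calibration distance; doubling the ratio halves the estimated distance.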
- the projection distance between the user and the light transmitter 100 can be more objectively reflected.
- the light transmitter 100 can thus emit light at a suitable light emitting power: on the one hand, this prevents the light emitting power of the light transmitter 100 from being so high that it damages the user's eyes; on the other hand, it prevents the light emitting power from being so low that the depth information of the scene becomes inaccurate.
- calculating the projection distance according to the first ratio in step 0422 includes:
- 0423 Calculate a second proportion of a preset feature area of the human face in the scene image to the human face.
- 0424 Calculate the projection distance according to the first ratio and the second ratio.
- the second calculation unit 9422 includes a first calculation sub-unit 9423 and a second calculation sub-unit 9424.
- Step 0423 may be implemented by the first calculation subunit 9423.
- Step 0424 may be implemented by the second calculation subunit 9424. That is to say, the first calculation subunit 9423 can be used to calculate a second proportion of the preset feature area of the human face in the scene image to the human face.
- the second calculation sub-unit 9424 may be configured to calculate the projection distance according to the first scale and the second scale.
- step 0423 and step 0424 may be implemented by the processor 805. That is to say, the processor 805 can also be used to calculate a second proportion of the preset feature area of the human face in the scene image to the human face, and calculate a projection distance according to the first and second proportions.
- the second ratio is the ratio of the preset feature area of the human face to the human face.
- the preset feature area can select a feature area with a small difference between different user individuals.
- the preset feature area can be the distance between the eyes of the user.
- During calibration, the user is directed to stand at a predetermined projection distance, a scene image is collected, and the first calibration ratio and second calibration ratio corresponding to that scene image are calculated. The correspondence among the predetermined projection distance, the first calibration ratio, and the second calibration ratio is then stored, so that in subsequent use the projection distance can be calculated from the actually measured first ratio and second ratio. For example, the user is instructed to stand at a projection distance of 25 cm and a scene image is collected; the first calibration ratio corresponding to the scene image is calculated as 50% and the second calibration ratio as 10%.
- D1 is the initial projection distance calculated from the actually measured first ratio R1; based on the stored correspondence, a calibrated projection distance D2 can then be further calculated from the actually measured second ratio R2, and D2 is used as the final projection distance.
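The two-step estimate can be sketched as below; the exact correction formula is garbled in the source, so applying the same inverse-proportion form to the second ratio is an assumption:

```python
def final_projection_distance(r1: float, r2: float,
                              calib_distance_cm: float = 25.0,
                              calib_r1: float = 0.50,
                              calib_r2: float = 0.10) -> float:
    """Two-step estimate: D1 from the first ratio by similar triangles,
    then D2 correcting D1 with the second ratio (correction form assumed).
    Calibration defaults follow the 25 cm / 50% / 10% example in the text."""
    d1 = calib_distance_cm * calib_r1 / r1  # initial projection distance
    d2 = d1 * calib_r2 / r2                 # calibrated projection distance
    return d2
```

When both measured ratios match the calibration values, the estimate reduces to the calibration distance of 25 cm.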
- The projection distance calculated from the first ratio and the second ratio takes into account the individual differences between users and is therefore more objective; further, a more accurate first light emitting power can be determined from the more accurate projection distance.
- calculating the projection distance according to the first ratio in step 0422 includes:
- 0425 Determine whether the user wears glasses according to the scene image.
- 0426 Calculate the projection distance according to the first ratio and a distance coefficient when the user wears glasses.
- the second calculation unit 9422 further includes a first determination sub-unit 9425 and a third calculation sub-unit 9426.
- Step 0425 may be implemented by the first judging subunit 9425.
- Step 0426 may be implemented by the third calculation subunit 9426. That is to say, the first judging subunit 9425 can be used to judge whether the user wears glasses according to the scene image.
- the third calculation subunit 9426 may be configured to calculate a projection distance according to the first scale and the distance coefficient when the user wears glasses.
- step 0425 and step 0426 may be implemented by the processor 805. That is to say, the processor 805 may be further configured to determine whether the user wears glasses according to the scene image, and calculate the projection distance according to the first ratio and the distance coefficient when the user wears the glasses.
- When the light transmitter 100 emits laser light toward a user wearing glasses, the light emitting power of the light transmitter 100 needs to be reduced so that the energy of the emitted laser is low enough not to damage the user's eyes.
- The preset distance coefficient may be a coefficient between 0 and 1, for example 0.6, 0.78, 0.82, or 0.95. After the initial projection distance is calculated from the first ratio, or the calibrated projection distance is calculated from the first ratio and the second ratio,
- the initial projection distance or the calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance, and the first light emitting power is calculated from that projection distance. In this way, the emitted laser power is prevented from being so high that it harms users with eye disease or poor vision.
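The glasses correction described above is a single multiplication; a minimal sketch, with the coefficient value chosen from the example range in the text:

```python
def distance_with_glasses(initial_distance_cm: float,
                          wears_glasses: bool,
                          distance_coefficient: float = 0.82) -> float:
    """Multiply the initial (or calibrated) projection distance by a preset
    coefficient between 0 and 1 when the user wears glasses, so that the
    first light emitting power computed from it is lowered. 0.82 is one of
    the example values given in the text."""
    if wears_glasses:
        return initial_distance_cm * distance_coefficient
    return initial_distance_cm
```

For a user without glasses the distance passes through unchanged; with glasses, a 50 cm estimate shrinks to 41 cm.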
- calculating the projection distance according to the first ratio in step 0422 includes:
- 0427 Determine the age of the user according to the scene image.
- 0428 Calculate the projection distance according to the first ratio and the age.
- the second calculation unit 9422 includes a second determination sub-unit 9427 and a fourth calculation sub-unit 9428.
- Step 0427 may be implemented by the second judgment subunit 9427
- step 0428 may be implemented by the fourth calculation subunit 9428. That is to say, the second judging subunit 9427 may be configured to judge the age of the user according to the scene image.
- the fourth calculation sub-unit 9428 may be configured to calculate the projection distance according to the first ratio and the age.
- step 0427 and step 0428 may be implemented by the processor 805. That is to say, the processor 805 may be configured to determine the age of the user according to the scene image, and calculate the projection distance according to the first ratio and the age.
- The number, distribution, and area of the feature points of facial wrinkles in the scene image can be extracted to determine the user's age. For example, the number of wrinkles at the corners of the eyes can be used to determine the user's age, optionally further combined with the number of wrinkles on the user's forehead.
- The proportion coefficient can be obtained according to the age of the user; specifically, the correspondence between age and the proportion coefficient can be found in a lookup table.
- For example, when the age is under 15, the proportion coefficient is 0.6; when the age is between 15 and 20, the proportion coefficient is 0.8; when the age is between 20 and 45, the proportion coefficient is 1.0; and when the age is 45 or above, the proportion coefficient is 0.8.
- The initial projection distance calculated from the first ratio, or the calibrated projection distance calculated from the first ratio and the second ratio, can be multiplied by the proportion coefficient to obtain the final projection distance.
- The first light emitting power is then calculated according to this projection distance. In this way, the emitted laser power is prevented from being so high that it harms younger or older users.
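The age-based correction can be sketched as follows, using the coefficient brackets from the text (the handling of exact boundary ages is an assumption):

```python
def age_coefficient(age: int) -> float:
    """Lookup-table mapping from the text: under 15 -> 0.6, 15-20 -> 0.8,
    20-45 -> 1.0, 45 and above -> 0.8 (boundary handling assumed)."""
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age < 45:
        return 1.0
    return 0.8

def distance_for_age(initial_distance_cm: float, age: int) -> float:
    """Final projection distance = initial (or calibrated) distance x coefficient."""
    return initial_distance_cm * age_coefficient(age)
```

A coefficient below 1 shrinks the reported distance, which in turn lowers the first light emitting power computed from it.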
- controlling the light emitter 100 to emit light at the second light emission power and / or the second on frequency includes:
- the control module 95 includes a first obtaining unit 951, a second obtaining unit 952, and a third computing unit 953.
- Step 071 may be implemented by the first obtaining unit 951.
- Step 072 may be performed by the second obtaining unit 952.
- Step 073 may be performed by the third calculation unit 953.
- the first acquiring unit 951 may be configured to acquire a projection distance between the target subject in the scene and the light emitter 100.
- the second obtaining unit 952 may be configured to obtain the ambient brightness of the scene.
- the third calculation unit 953 may be configured to calculate the second light emitting power according to the ambient brightness and the projection distance.
- steps 071, 072, and 073 can all be implemented by the processor 805. That is to say, the processor 805 may be configured to obtain a projection distance between the target subject in the scene and the light emitter 100, obtain the ambient brightness of the scene, and calculate the second light emitting power according to the ambient brightness and the projection distance.
- the projection distance between the target subject in the scene and the light transmitter 100 may be obtained by the time-of-flight depth camera 300.
- the light transmitter 100 emits laser light at a preset light emission power and a predetermined light emission frequency
- the light receiver 200 receives the laser light reflected by an object in the scene
- The processor 805 calculates the initial depth information of the scene based on the laser light received by the light receiver 200. Subsequently, the processor 805 determines the target subject from the scene.
- The central area of the field of view of the light receiver 200 can be used as the area where the target subject is located, and the initial depth information of the pixels in this central area is used as the initial depth information of the target subject.
- The processor 805 can then calculate the average or median of the multiple initial depth values, and use that average or median as the projection distance between the light emitter 100 and the target subject. In this way, the projection distance between the light emitter 100 and the target subject can be obtained by the time-of-flight depth camera 300.
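Reducing the central-region depth values to a single projection distance can be sketched with the standard library (the function name is illustrative):

```python
import statistics

def target_projection_distance(central_depths: list[float],
                               use_median: bool = True) -> float:
    """Collapse the per-pixel initial depth values of the central region to
    one projection distance via the median (robust to outliers) or the mean."""
    if use_median:
        return statistics.median(central_depths)
    return statistics.fmean(central_depths)
```

The median is the more robust choice when a few central pixels see the background rather than the target subject.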
- the ambient brightness can be detected by the light sensor, and the processor 805 reads the detected ambient brightness from the light sensor.
- the ambient brightness may be detected by an infrared camera (which may be the light receiver 200) or a visible light camera 400.
- the infrared camera or the visible light camera 400 captures an image of a scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
- After determining the ambient brightness and the projection distance, the processor 805 calculates the second light emitting power jointly from these two parameters.
- The light receiver 200 receives both the infrared laser emitted by the light transmitter 100 and the infrared light in the ambient light. If the light emitting power of the infrared laser emitted by the light transmitter 100 is low, the proportions of the received light that come from the infrared laser of the light transmitter 100 and from the infrared light of the ambient light do not differ much.
- This makes the timing at which the light receiver 200 receives the light, or the total amount of light it receives, inaccurate, which further reduces the accuracy of the acquired depth information. Therefore, when the ambient brightness is high, the emission power of the infrared laser emitted by the light transmitter 100 needs to be increased to reduce the influence of the ambient infrared light on the light receiver 200 receiving the infrared laser from the light transmitter 100. When the ambient brightness is low, the ambient light contains less infrared light; in this case, if the light transmitter 100 emits light at a higher power, power consumption increases unnecessarily.
- When the ambient brightness is high or the projection distance is large, the second light emitting power of the light transmitter 100 can be appropriately increased.
- When the ambient brightness is low or the projection distance is small, the second light emitting power of the light transmitter 100 can be appropriately reduced.
- jointly determining the second light emitting power of the light transmitter 100 based on the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand, and improve the accuracy of obtaining the depth information of the scene on the other hand.
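A sketch of a joint power rule that is monotone in both ambient brightness and projection distance and clamped to an eye-safe cap; the patent gives no formula, so the functional form and all constants below are illustrative assumptions:

```python
def second_emitting_power(ambient_brightness: float,
                          projection_distance_cm: float,
                          base_power_mw: float = 1.0,
                          brightness_gain: float = 0.002,
                          distance_gain: float = 0.01,
                          max_power_mw: float = 5.0) -> float:
    """Power rises with ambient brightness (to stand out against ambient IR)
    and with projection distance, then is clamped to a safety maximum.
    Constants are placeholders, not values from the patent."""
    power = (base_power_mw
             * (1 + brightness_gain * ambient_brightness)
             * (1 + distance_gain * projection_distance_cm))
    return min(power, max_power_mw)
```

Any rule with these two monotonicity properties and a hard cap reproduces the behaviour the text describes: brighter scenes and farther subjects get more power, bounded for eye safety.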
- the second turning-on frequency of the light transmitter 100 may be determined according to the application scenario.
- the application scenario is to unlock the electronic device 800 based on a three-dimensional face
- the light receiver 200 usually only needs to output a few depth images per second, for example, 3 frames per second, 4 frames per second, or 5 frames per second.
- the second turn-on frequency of the light transmitter 100 can accordingly be set to 3 times/second, 4 times/second, 5 times/second, etc.
- when the application scenario is the user recording a video,
- the light receiver 200 usually needs to output more depth images per second, for example, 30 frames per second or 60 frames per second.
- the second turn-on frequency of the light transmitter 100 can accordingly be set to 30 times/second, 60 times/second, etc. In this way, the turn-on frequency best suited to each application scenario is set based on the scenario, meeting the user's needs.
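The scenario-based frequency selection can be sketched as a lookup; the scenario names and the fallback policy are assumptions, since the text only gives examples:

```python
# Illustrative mapping from application scenario to the second turn-on
# frequency (times per second), using the examples in the text.
SCENARIO_ON_FREQUENCY = {
    "face_unlock": 5,       # a few depth frames per second suffice
    "video_recording": 30,  # one depth frame per video frame
}

def second_on_frequency(scenario: str, default: int = 5) -> int:
    """Pick the turn-on frequency for a scenario; unknown scenarios fall
    back to a low default (fallback policy is an assumption)."""
    return SCENARIO_ON_FREQUENCY.get(scenario, default)
```

Keeping the frequency low outside video scenarios reduces both power consumption and laser exposure.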
- the electronic device 800 includes a casing 801 and a time-of-flight depth camera 300.
- the housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800.
- the housing 801 can provide protection for the functional components from dust, drop, and water.
- the functional components can be a display screen 802, a visible light camera 400, a receiver, and the like.
- the housing 801 includes a main body 803 and a movable bracket 804.
- the movable bracket 804 can move relative to the main body 803 under the driving of a driving device.
- the movable bracket 804 can slide relative to the main body 803 so as to slide into or out of the main body 803 (as shown in FIG. 20).
- Some functional elements can be installed on the main body 803, while other functional elements (such as the time-of-flight depth camera 300, the visible light camera 400, and the receiver) can be installed on the movable bracket 804, so that the movement of the movable bracket 804 drives these other functional elements to retract into or extend out of the main body 803.
- the embodiments shown in FIG. 1 and FIG. 21 are merely examples of a specific form of the casing 801, and cannot be understood as a limitation on the casing 801 of the present application.
- a time-of-flight depth camera 300 is mounted on the housing 801.
- the casing 801 may be provided with an acquisition window, and the time-of-flight camera 300 is aligned with the acquisition window so that the time-of-flight camera 300 acquires depth information.
- the time-of-flight depth camera 300 is mounted on a movable bracket 804.
- The user can trigger the movable bracket 804 to slide out from the main body 803 to drive the time-of-flight depth camera 300 to protrude from the main body 803; when the time-of-flight depth camera 300 is not needed, the user can trigger the movable bracket 804 to slide into the main body 803 so that the time-of-flight depth camera 300 retracts into the main body.
- the time-of-flight depth camera 300 includes a first substrate assembly 71, a pad 72, a light emitter 100 and a light receiver 200.
- the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other.
- the spacer 72 is disposed on the first substrate 711.
- the light emitter 100 is used for projecting laser light outward, and the light emitter 100 is disposed on the cushion block 72.
- the flexible circuit board 712 is bent and one end of the flexible circuit board 712 is connected to the first substrate 711 and the other end is connected to the light emitter 100.
- the light receiver 200 is disposed on the first substrate 711.
- the light receiver 200 is configured to receive laser light reflected by a person or an object in a scene.
- the light receiver 200 includes a housing 741 and an optical element 742 provided on the housing 741.
- the housing 741 is integrally connected with the pad 72.
- the first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712.
- the first substrate 711 may be a printed wiring board or a flexible wiring board.
- the control circuit and the like of the time-of-flight camera 300 may be laid on the first substrate 711.
- One end of the flexible circuit board 712 may be connected to the first substrate 711.
- the flexible circuit board 712 can be bent at a certain angle, so that the relative positions of the devices connected at both ends of the flexible circuit board 712 can be selected.
- the spacer 72 is disposed on the first substrate 711.
- the spacer 72 is in contact with the first substrate 711 and is carried on the first substrate 711.
- the spacer 72 may be combined with the first substrate 711 by means of adhesion or the like.
- the material of the spacer 72 may be metal, plastic, or the like.
- The surface at which the pad 72 is combined with the first substrate 711 may be flat, and the surface of the pad 72 opposite to the combined surface may also be flat, so that the light emitter 100 disposed on the pad 72 has better flatness.
- the light receiver 200 is disposed on the first substrate 711, and the contact surface between the light receiver 200 and the first substrate 711 is substantially flush with the contact surface between the pad 72 and the first substrate 711 (that is, the installation starting points of the two are on the same plane).
- the light receiver 200 includes a housing 741 and an optical element 742.
- the casing 741 is disposed on the first substrate 711, and the optical element 742 is disposed on the casing 741.
- the casing 741 may be a lens holder and a lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the casing 741.
- the light receiver 200 further includes a photosensitive chip (not shown); the laser light reflected by a person or an object in the scene passes through the optical element 742 and is irradiated onto the photosensitive chip, and the photosensitive chip generates a response to the laser.
- the housing 741 and the cushion block 72 are integrally connected.
- the casing 741 and the cushion block 72 may be integrally formed; or the materials of the casing 741 and the cushion block 72 are different, and the two are integrally formed by two-color injection molding or the like.
- the housing 741 and the spacer 72 may also be separately formed, and the two form a matching structure.
- When assembling the time-of-flight depth camera 300, one of the housing 741 and the spacer 72 may first be disposed on the first substrate 711, and then the other is disposed on the first substrate 711 and the two are connected integrally.
- the light transmitter 100 is disposed on the pad 72, which can increase the height of the light transmitter 100, thereby increasing the height of the surface on which the laser is emitted by the light transmitter 100.
- In this way, the laser light emitted by the light transmitter 100 is not easily blocked by the light receiver 200, so that the laser light can be completely irradiated onto the measured object in the target space.
- the light emitter 100 includes a second substrate assembly 51, a light emitting assembly 101 and a housing 52.
- the second substrate assembly 51 is disposed on the pad 72, and the second substrate assembly 51 is connected to the flexible circuit board 712.
- the light emitting component 101 is disposed on the second substrate component 51, and the light emitting component 101 is used for emitting laser light.
- the casing 52 is disposed on the second substrate assembly 51.
- the casing 52 is formed with a receiving space 521, and the receiving space 521 can be used for receiving the light emitting module 101.
- the flexible circuit board 712 may be detachably connected to the second substrate assembly 51.
- the light emitting component 101 is connected to the second substrate component 51.
- the housing 52 may be bowl-shaped as a whole, with its opening facing downward and disposed on the second substrate assembly 51, so as to accommodate the light emitting assembly 101 in the receiving space 521.
- the housing 52 is provided with a light emitting port 522 corresponding to the light emitting component 101.
- the laser light emitted from the light emitting component 101 passes through the light emitting port 522 and is emitted.
- the laser light may pass directly through the light emitting port 522, or pass through the light emitting port 522 after its optical path has been changed by other optical devices.
- the second substrate assembly 51 includes a second substrate 511 and a reinforcing member 512.
- the second substrate 511 is connected to the flexible circuit board 712.
- the light emitting component 101 and the reinforcing member 512 are disposed on opposite sides of the second substrate 511.
- a specific type of the second substrate 511 may be a printed circuit board or a flexible circuit board, and a control circuit may be laid on the second substrate 511.
- the reinforcing member 512 can be fixedly connected to the second substrate 511 by gluing, riveting, or the like. The reinforcing member 512 can increase the overall strength of the second substrate assembly 51.
- The reinforcing member 512 can directly contact the pad 72, so the second substrate 511 is not exposed to the outside, does not need to be in direct contact with the pad 72, and is less vulnerable to contamination by dust and the like.
- the reinforcing member 512 and the cushion block 72 are separately formed.
- the spacer 72 may be mounted on the first substrate 711.
- During installation, the two ends of the flexible circuit board 712 are respectively connected to the first substrate 711 and the second substrate 511, with the flexible circuit board 712 initially left unbent.
- The flexible circuit board 712 is then bent so that the reinforcing member 512 is disposed on the cushion block 72.
- the reinforcing member 512 and the spacer 72 may be integrally formed, for example, integrally formed by a process such as injection molding.
- the spacer 72 and the light emitter 100 may then be installed together on the first substrate 711.
- the light emitting component 101 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, and a driver 61.
- the lens barrel 30 includes a ring-shaped lens barrel sidewall 33, and the ring-shaped lens barrel sidewall 33 surrounds a receiving cavity 62.
- the side wall 33 of the lens barrel includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface.
- the side wall 33 of the lens barrel includes a first surface 31 and a second surface 32 opposite to each other.
- the receiving cavity 62 penetrates the first surface 31 and the second surface 32.
- the first surface 31 is recessed toward the second surface 32 to form a mounting groove 34 communicating with the receiving cavity 62.
- the bottom surface 35 of the mounting groove 34 is located on a side of the mounting groove 34 remote from the first surface 31.
- At the end near the first surface 31, the outer surface 332 of the lens barrel sidewall 33 is circular in cross-section, and an external thread is formed on the outer surface 332 at that end.
- the lens barrel 30 is carried on a second substrate 511.
- the second substrate 511 may be a circuit board 511.
- the circuit board 511 is in contact with the second surface 32 of the lens barrel 30 to close one end of the receiving cavity 62.
- the light source 10 is carried on the circuit board 511 and is housed in the receiving cavity 62.
- the light source 10 is configured to emit laser light toward the first surface 31 (the mounting groove 34) side of the lens barrel 30.
- the light source 10 may be a single-point light source or a multi-point light source.
- When the light source 10 is a single-point light source, the light source 10 may specifically be an edge-emitting laser, for example a distributed feedback laser (Distributed Feedback Laser, DFB); when the light source 10 is a multi-point light source, the light source 10 may be a vertical-cavity surface-emitting laser (Vertical-Cavity Surface Emitting Laser, VCSEL), or the light source 10 may be a multi-point light source composed of multiple edge-emitting lasers.
- The vertical-cavity surface-emitting laser has a small height; using it as the light source 10 is beneficial to reducing the height of the light emitter 100, which facilitates integrating the light emitter 100 into mobile phones and other electronic devices 800 with strict requirements on body thickness.
- Compared with the vertical-cavity surface-emitting laser, the edge-emitting laser has a smaller temperature drift, which can reduce the influence of temperature on the laser light projected by the light source 10.
- The driver 61 is carried on the circuit board 511 and is electrically connected to the light source 10. Specifically, the driver 61 may receive a modulation signal from the processor 805, convert the modulation signal into a constant current source, and transmit it to the light source 10, so that the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30 under the action of the constant current source.
- the driver 61 of this embodiment is provided outside the lens barrel 30. In other embodiments, the driver 61 may be disposed in the lens barrel 30 and carried on the circuit board 511.
- the diffuser 20 is mounted (supported) in the mounting groove 34 and abuts the mounting groove 34.
- the diffuser 20 is used to diffuse the laser light passing through the diffuser 20. That is, when the light source 10 emits laser light toward the first surface 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused or projected outside the lens barrel 30 by the diffuser 20.
- the protective cover 40 includes a top wall 41 and a protective sidewall 42 extending from one side of the top wall 41.
- a light through hole 401 is defined in the center of the top wall 41.
- the protective side wall 42 is disposed around the top wall 41 and the light through hole 401.
- the top wall 41 and the protection side wall 42 together form a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43.
- the cross-section of the inner surface of the protective sidewall 42 is circular, and an inner thread is formed on the inner surface of the protective sidewall 42.
- the internal thread of the protective sidewall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30.
- the interference fit between the top wall 41 and the diffuser 20 causes the diffuser 20 to be sandwiched between the top wall 41 and the bottom surface 35 of the mounting groove 34.
- In this way, the diffuser 20 is installed in the lens barrel 30: the diffuser 20 is placed in the mounting groove 34, and the protective cover 40 is mounted on the lens barrel 30 so as to clamp the diffuser 20 between the protective cover 40 and the mounting groove 34.
- This fixes the diffuser 20 on the lens barrel 30 without using glue, which prevents volatilized glue from diffusing and solidifying on the surface of the diffuser 20 and affecting the microstructure of the diffuser 20, and also prevents the diffuser 20 from falling off the lens barrel 30 when the adhesion between the diffuser 20 and the lens barrel 30 decreases due to aging of the glue.
- the present application further provides an electronic device 800.
- the electronic device includes the time-of-flight depth camera 300 according to any one of the above embodiments, one or more processors 805, a memory 806, and one or more programs.
- One or more programs are stored in the memory 806 and are configured to be executed by the one or more processors 805.
- the programs include instructions for performing the control method according to any one of the foregoing embodiments.
- the program includes instructions for performing the following steps:
- when a human face is present in the scene image, controlling the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency;
- when no human face is present in the scene image, controlling the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency.
- the program further includes instructions for performing the following steps:
- when the application scene is the second scene and a human face is present in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency.
- the program further includes instructions for performing the following steps:
- the program further includes instructions for performing the following steps:
- the program further includes instructions for performing the following steps:
- calculating the projection distance according to the first ratio and the second ratio.
- the program further includes instructions for performing the following steps:
- calculating the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
- the program further includes instructions for performing the following steps:
- the program further includes instructions for performing the following steps:
- the present application further provides a computer-readable storage medium 900.
- the computer-readable storage medium 900 includes a computer program used in conjunction with the electronic device 800, and the computer program can be executed by the processor 805 to complete the control method according to any one of the foregoing embodiments.
- the computer program may be executed by the processor 805 to complete the following steps:
- when a human face is present in the scene image, controlling the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency;
- when no human face is present in the scene image, controlling the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency.
- the computer program can also be executed by the processor 805 to complete the following steps:
- when the application scene is the second scene and a human face is present in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency.
- the computer program may also be executed by the processor 805 to complete the following steps:
- the computer program may also be executed by the processor 805 to complete the following steps:
- the computer program may also be executed by the processor 805 to complete the following steps:
- calculating the projection distance according to the first ratio and the second ratio.
- the computer program may also be executed by the processor 805 to complete the following steps:
- calculating the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
- the computer program may also be executed by the processor 805 to complete the following steps:
- the computer program may also be executed by the processor 805 to complete the following steps:
- the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, for example two or three, unless specifically defined otherwise.
- any process or method description in a flowchart or otherwise described herein can be understood as a module, fragment, or portion of code that includes one or more executable instructions for implementing a particular logical function or step of the process.
- the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present application pertain.
- logic and/or steps represented in a flowchart or otherwise described herein, for example a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from the instruction execution system, apparatus, or device).
- a "computer-readable medium” may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
- computer-readable media include the following: an electrical connection (electronic device) with one or more wirings, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
- the computer-readable medium may even be paper or another suitable medium on which the program can be printed, because the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise suitably processing it, and then stored in a computer memory.
- each part of the application may be implemented by hardware, software, firmware, or a combination thereof.
- multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system.
- for example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits, application-specific integrated circuits with suitable combinational logic gate circuits, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and so on.
- a person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware.
- the program can be stored in a computer-readable storage medium.
- when executed, the program includes one or a combination of the steps of the method embodiments.
- each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module.
- the above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
- the aforementioned storage medium may be a read-only memory, a magnetic disk, or an optical disk.
Abstract
Description
Priority Information
This application claims the priority of and the rights in Chinese patent application No. 201811060690.4, filed with the State Intellectual Property Office of China on September 12, 2018, which is incorporated herein by reference in its entirety.
The present application relates to the field of three-dimensional imaging technology, and in particular to a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
A time-of-flight (TOF) imaging system can calculate the depth information of a measured object from the time difference between the moment the light transmitter emits a light signal and the moment the light receiver receives the light signal. A light transmitter typically includes a light source and a diffuser. The light emitted by the light source is diffused by the diffuser so that uniform surface light is projected into the scene. The light emitted by the light source is usually an infrared laser.
Summary of the Invention
Embodiments of the present application provide a control method, a control device, a time-of-flight depth camera, an electronic device, and a computer-readable storage medium.
The method for controlling a light transmitter according to an embodiment of the present application includes: acquiring a scene image of a scene; identifying whether a human face exists in the scene image; when the human face exists in the scene image, controlling the light transmitter to emit light at a first light-emitting power and/or a first turn-on frequency; and when the human face does not exist in the scene image, controlling the light transmitter to emit light at a second light-emitting power and/or a second turn-on frequency.
The control device for a light transmitter according to an embodiment of the present application includes a first acquisition module, a recognition module, and a control module. The first acquisition module is configured to acquire a scene image of a scene; the recognition module is configured to identify whether a human face exists in the scene image; and the control module is configured to control the light transmitter to emit light at a first light-emitting power and/or a first turn-on frequency when the human face exists in the scene image, and to control the light transmitter to emit light at a second light-emitting power and/or a second turn-on frequency when the human face does not exist in the scene image.
The time-of-flight depth camera according to an embodiment of the present application includes a light transmitter and a processor. The processor is configured to acquire a scene image of a scene, identify whether a human face exists in the scene image, control the light transmitter to emit light at a first light-emitting power and/or a first turn-on frequency when the human face exists in the scene image, and control the light transmitter to emit light at a second light-emitting power and/or a second turn-on frequency when the human face does not exist in the scene image.
The electronic device according to an embodiment of the present application includes the above time-of-flight depth camera, one or more processors, a memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors, and the programs include instructions for performing the control method described above.
The computer-readable storage medium of an embodiment of the present application includes a computer program used in combination with an electronic device, and the computer program can be executed by a processor to complete the control method described above.
Additional aspects and advantages of the present application will be given in part in the following description, and in part will become apparent from the following description or be learned through practice of the present application.
The above and/or additional aspects and advantages of the present application will become apparent and easy to understand from the following description of the embodiments in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic perspective view of an electronic device according to some embodiments of the present application.
FIG. 2 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 3 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
FIG. 4 and FIG. 5 are schematic diagrams of the turn-on frequency of a light transmitter according to some embodiments of the present application.
FIG. 6 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 7 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
FIG. 8 and FIG. 9 are schematic flowcharts of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 10 is a schematic block diagram of a control device for a light transmitter according to some embodiments of the present application.
FIG. 11 is a schematic block diagram of a second acquisition module in a control device for a light transmitter according to some embodiments of the present application.
FIG. 12 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 13 is a schematic block diagram of a second calculation unit in a control device for a light transmitter according to some embodiments of the present application.
FIG. 14 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 15 is a schematic block diagram of a second calculation unit in a control device for a light transmitter according to some embodiments of the present application.
FIG. 16 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 17 is a schematic block diagram of a second calculation unit in a control device for a light transmitter according to some embodiments of the present application.
FIG. 18 is a schematic flowchart of a method for controlling a light transmitter according to some embodiments of the present application.
FIG. 19 is a schematic block diagram of a control module in a control device for a light transmitter according to some embodiments of the present application.
FIG. 20 is a schematic perspective view of an electronic device according to some embodiments of the present application.
FIG. 21 is a schematic perspective view of a depth camera according to some embodiments of the present application.
FIG. 22 is a schematic plan view of a depth camera according to some embodiments of the present application.
FIG. 23 is a schematic cross-sectional view of the depth camera in FIG. 22 along line XXIII-XXIII.
FIG. 24 is a schematic structural diagram of a light transmitter according to some embodiments of the present application.
FIG. 25 is a schematic diagram of the connection between an electronic device and a computer-readable storage medium according to some embodiments of the present application.
Embodiments of the present application are described in detail below. Examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary and are intended to explain the present application; they should not be construed as limiting the present application.
Referring to FIG. 1, the present application provides a method for controlling a light transmitter 100. The control method includes: acquiring a scene image of a scene; identifying whether a human face exists in the scene image; when a human face exists in the scene image, controlling the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency; and when no human face exists in the scene image, controlling the light transmitter 100 to emit light at a second light-emitting power and/or a second turn-on frequency.
In some embodiments, the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency. The control method further includes: judging the application scene of the light transmitter. When a human face exists in the scene image, the step of controlling the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency includes: when the application scene is a first scene and the human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency; and when the application scene is a second scene and the human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency.
In some embodiments, after the step of identifying whether a human face exists in the scene image, the control method further includes: obtaining a projection distance between the user and the light transmitter 100; and calculating the first light-emitting power according to the projection distance.
In some embodiments, the step of obtaining the projection distance between the user and the light transmitter 100 includes: calculating a first ratio occupied by the human face in the scene image; and calculating the projection distance according to the first ratio.
In some embodiments, the step of calculating the projection distance according to the first ratio includes: calculating a second ratio of a preset feature region of the human face to the human face in the scene image; and calculating the projection distance according to the first ratio and the second ratio.
In some embodiments, the step of calculating the projection distance according to the first ratio includes: judging, according to the scene image, whether the user wears glasses; and, when the user wears glasses, calculating the projection distance according to the first ratio and a distance coefficient.
In some embodiments, the step of calculating the projection distance according to the first ratio includes: judging the age of the user according to the scene image; and calculating the projection distance according to the first ratio and the age.
In some embodiments, when no human face exists in the scene image, the step of controlling the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency includes: obtaining a projection distance between a target subject in the scene and the light transmitter 100; obtaining the ambient brightness of the scene; and calculating the second light-emitting power according to the ambient brightness and the projection distance.
Referring to FIGS. 1 and 3, the present application provides a control device 90 for a light transmitter 100. The control device 90 includes a first acquisition module 91, a recognition module 93, and a control module 95. The first acquisition module 91 is configured to acquire a scene image of a scene. The recognition module 93 is configured to identify whether a human face exists in the scene image. The control module 95 is configured to control the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency when a human face exists in the scene image, and to control the light transmitter 100 to emit light at a second light-emitting power and/or a second turn-on frequency when no human face exists in the scene image.
Referring to FIGS. 1 and 7, in some embodiments, the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency. The control device 90 further includes a judgment module 941, which is configured to judge the application scene of the light transmitter 100. The control module 95 may further be configured to control the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency when the application scene is the first scene and the human face exists in the scene image, and to control the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency when the application scene is the second scene and the human face exists in the scene image.
Referring to FIGS. 1 and 10, in some embodiments, the control device 90 further includes a second acquisition module 942 and a calculation module 943. The second acquisition module 942 is configured to obtain the projection distance between the user and the light transmitter 100. The calculation module 943 is configured to calculate the first light-emitting power according to the projection distance.
Referring to FIG. 11, in some embodiments, the second acquisition module 942 includes a first calculation unit 9421 and a second calculation unit 9422. The first calculation unit 9421 is configured to calculate the first ratio occupied by the human face in the scene image. The second calculation unit 9422 is configured to calculate the projection distance according to the first ratio.
Referring to FIG. 13, in some embodiments, the second calculation unit 9422 includes a first calculation subunit 9423 and a second calculation subunit 9424. The first calculation subunit 9423 is configured to calculate the second ratio of the preset feature region of the human face to the human face in the scene image. The second calculation subunit 9424 is configured to calculate the projection distance according to the first ratio and the second ratio.
Referring to FIG. 15, in some embodiments, the second calculation unit 9422 includes a first judgment subunit 9425 and a third calculation subunit 9426. The first judgment subunit 9425 is configured to judge, according to the scene image, whether the user wears glasses. The third calculation subunit 9426 is configured to calculate the projection distance according to the first ratio and the distance coefficient when the user wears glasses.
Referring to FIG. 17, in some embodiments, the second calculation unit 9422 includes a second judgment subunit 9427 and a fourth calculation subunit 9428. The second judgment subunit 9427 is configured to judge the age of the user according to the scene image. The fourth calculation subunit 9428 is configured to calculate the projection distance according to the first ratio and the age.
Referring to FIGS. 1 and 19, in some embodiments, the control module 95 includes a first acquisition unit 951, a second acquisition unit 952, and a third calculation unit 953. The first acquisition unit 951 is configured to obtain the projection distance between the target subject in the scene and the light transmitter 100. The second acquisition unit 952 is configured to obtain the ambient brightness of the scene. The third calculation unit 953 is configured to calculate the second light-emitting power according to the ambient brightness and the projection distance.
Referring to FIG. 1, the present application also provides a time-of-flight depth camera 300. The time-of-flight depth camera 300 includes a light transmitter 100 and a processor. The processor is configured to: acquire a scene image of a scene; identify whether a human face exists in the scene image; when a human face exists in the scene image, control the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency; and when no human face exists in the scene image, control the light transmitter 100 to emit light at a second light-emitting power and/or a second turn-on frequency.
Referring to FIG. 1 again, the processor is further configured to: judge the application scene of the light transmitter 100; when the application scene is the first scene and a human face exists in the scene image, control the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency; and when the application scene is the second scene and a human face exists in the scene image, control the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: obtain the projection distance between the user and the light transmitter 100; and calculate the first light-emitting power according to the projection distance.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: calculate the first ratio occupied by the human face in the scene image; and calculate the projection distance according to the first ratio.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: calculate the second ratio of the preset feature region of the human face to the human face in the scene image; and calculate the projection distance according to the first ratio and the second ratio.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: judge, according to the scene image, whether the user wears glasses; and, when the user wears glasses, calculate the projection distance according to the first ratio and the distance coefficient.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: judge the age of the user according to the scene image; and calculate the projection distance according to the first ratio and the age.
Referring to FIG. 1 again, in some embodiments, the processor is further configured to: obtain the projection distance between the target subject in the scene and the light transmitter; obtain the ambient brightness of the scene; and calculate the second light-emitting power according to the ambient brightness and the projection distance.
Referring to FIG. 1, the present application further provides an electronic device 800. The electronic device includes the depth camera 300 according to any one of the above embodiments, one or more processors 805, a memory 806, and one or more programs. The one or more programs are stored in the memory 806 and configured to be executed by the one or more processors 805. The programs include instructions for performing the control method according to any one of the above embodiments.
Referring to FIG. 25, the present application further provides a computer-readable storage medium 900. The computer-readable storage medium 900 includes a computer program used in conjunction with the electronic device 800. The computer program can be executed by the processor 805 to complete the control method according to any one of the above embodiments.
Referring to FIGS. 1 and 2 together, the present application provides a method for controlling a light transmitter 100. The control method includes:
01: acquiring a scene image of a scene;
03: identifying whether a human face exists in the scene image;
05: when a human face exists in the scene image, controlling the light transmitter 100 to emit light at a first light-emitting power and/or a first turn-on frequency; and
07: when no human face exists in the scene image, controlling the light transmitter 100 to emit light at a second light-emitting power and/or a second turn-on frequency.
Referring to FIGS. 1 and 3 together, the present application also provides a control device 90 for the light transmitter 100. The control method of the embodiments of the present application can be implemented by the control device 90 of the embodiments of the present application. The control device 90 includes a first acquisition module 91, a recognition module 93, and a control module 95. Step 01 can be implemented by the first acquisition module 91. Step 03 can be implemented by the recognition module 93. Both step 05 and step 07 can be implemented by the control module 95. That is, the first acquisition module 91 can be configured to acquire a scene image of a scene; the recognition module 93 can be configured to identify whether a human face exists in the scene image; and the control module 95 can be configured to control the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency when a human face exists in the scene image, and to control the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency when no human face exists in the scene image.
Referring to FIG. 1 again, the present application also provides a time-of-flight depth camera 300. The control device 90 of the embodiments of the present application can be applied to the time-of-flight depth camera 300 of the embodiments of the present application. The time-of-flight depth camera 300 includes a light transmitter 100, a light receiver 200, and a processor. Steps 01, 03, 05, and 07 can all be implemented by the processor. That is, the processor can be configured to acquire a scene image of a scene, identify whether a human face exists in the scene image, control the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency when a human face exists in the scene image, and control the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency when no human face exists in the scene image.
The time-of-flight depth camera 300 of the embodiments of the present application can be applied to an electronic device 800. The processor in the time-of-flight depth camera 300 and the processor 805 of the electronic device 800 may be the same processor, or may be two independent processors. In the specific embodiments of the present application, the processor in the time-of-flight depth camera 300 and the processor 805 of the electronic device 800 are the same processor. The electronic device 800 may be a mobile phone, a tablet computer, a smart wearable device (a smart watch, smart bracelet, smart glasses, or smart helmet), a drone, and the like, which is not limited here.
Specifically, the time-of-flight depth camera 300 generally includes a light transmitter 100 and a light receiver 200. The light transmitter 100 is used to project laser light into a scene, and the light receiver 200 receives the laser light reflected back by people or objects in the scene. When the time-of-flight depth camera 300 works, the processor 805 controls both the light transmitter 100 and the light receiver 200 to turn on, and inputs a modulation signal with a certain frequency and amplitude to a driver 61 (shown in FIG. 24). The driver 61 converts the modulation signal into a constant current source and transmits it to the light source of the light transmitter 100 (shown in FIG. 24), so that the light source emits laser light. The laser light emitted by the light transmitter 100 is usually an infrared laser; if the energy of the infrared laser is too high, or the infrared laser continuously irradiates one position for too long, it can easily damage the user's eyes.
With the control method, the control device 90, and the time-of-flight depth camera 300 of the embodiments of the present application, when the light transmitter 100 is turned on, a scene image of the scene is first captured, for example by an infrared camera (which may be the light receiver 200) or a visible light camera 400. The processor 805 then identifies, based on a face recognition algorithm, whether a human face exists in the scene image. When a human face exists in the scene image, the processor 805 controls the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency. When no human face exists in the scene image, the processor 805 controls the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency.
The light-emitting power is indirectly characterized by the current output by the driver 61: when the driver 61 outputs a larger current, the light-emitting power is higher; when it outputs a smaller current, the light-emitting power is lower. The turn-on frequency refers to the turn-on frequency of the light transmitter 100, not the frequency of the light it emits, and corresponds to the frame rate at which the light receiver 200 outputs depth images. Specifically, referring to FIG. 4, suppose that for the light receiver 200 to output one frame of image, the light transmitter 100 needs to emit laser light for N periods T2, and to output the next frame the light transmitter 100 needs to emit laser light for another N periods. The sum of the N periods then constitutes one turn-on period T1 of the light transmitter 100, that is, T1 = N × T2, and the turn-on frequency f is calculated as f = 1/T1. Referring to FIG. 5, if the first emission of N periods T2 and the second emission of N periods T2 are additionally separated by an interval time t, the turn-on period of the light transmitter 100 is T1 = N × T2 + t, and the turn-on frequency is again calculated as 1/T1.
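The two turn-on-frequency formulas above can be checked with a short numeric sketch; the pulse count N, pulse period T2, and gap t below are illustrative values, not values from the embodiment:

```python
def turn_on_frequency(n_pulses, t2, gap=0.0):
    """Turn-on frequency f = 1 / T1, where one turn-on period T1 is
    N laser pulse periods T2 plus an optional idle interval t (seconds)."""
    t1 = n_pulses * t2 + gap
    return 1.0 / t1

# FIG. 4 case: frames back to back, e.g. N = 1000 pulses of T2 = 50 us
f_no_gap = turn_on_frequency(1000, 50e-6)          # T1 = 0.05 s

# FIG. 5 case: same pulse train followed by a 0.15 s idle interval t
f_with_gap = turn_on_frequency(1000, 50e-6, 0.15)  # T1 = 0.2 s

print(f_no_gap, f_with_gap)
```

Note that the idle interval t lowers the turn-on frequency without changing the laser pulse period T2 itself, which is why the text distinguishes the turn-on frequency from the emission frequency.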
Controlling the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency by the processor 805 includes: (1) the processor 805 controls the light transmitter 100 to emit light at the first light-emitting power; (2) the processor 805 controls the light transmitter 100 to emit light at the first turn-on frequency; or (3) the processor 805 controls the light transmitter 100 to emit light at both the first light-emitting power and the first turn-on frequency.
Similarly, controlling the light transmitter 100 to emit light at the second light-emitting power and/or the second turn-on frequency by the processor 805 includes: (1) the processor 805 controls the light transmitter 100 to emit light at the second light-emitting power; (2) the processor 805 controls the light transmitter 100 to emit light at the second turn-on frequency; or (3) the processor 805 controls the light transmitter 100 to emit light at both the second light-emitting power and the second turn-on frequency.
Because high-energy infrared laser light can easily damage the user's eyes, in the control method of the embodiments of the present application, when a human face exists in the scene image, light is emitted at a lower first light-emitting power and a lower first turn-on frequency. The lower first light-emitting power reduces the energy of the infrared laser light reaching the user's eyes, and the lower first turn-on frequency reduces the time during which the infrared laser continuously irradiates the user's eyes. In this way, the risk that the emitted laser light damages the user's eyes is reduced, and the safety of using the time-of-flight depth camera 300 is improved. When no human face exists in the scene image, light is emitted at a second light-emitting power and a second turn-on frequency suited to the current scene, which improves the accuracy of the depth image output by the light receiver 200.
Referring to FIGS. 1 and 6 together, in some embodiments, the first turn-on frequency includes a first sub-turn-on frequency and a second sub-turn-on frequency. The control method further includes:
041: judging the application scene of the light transmitter 100.
Step 05, controlling the light transmitter 100 to emit light at the first light-emitting power and/or the first turn-on frequency when a human face exists in the scene image, includes:
051: when the application scene is the first scene and the human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency; and
052: when the application scene is the second scene and the human face exists in the scene image, controlling the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency.
Referring to FIGS. 1 and 7 together, in some embodiments, the control device 90 further includes a judgment module 941. Step 041 can be implemented by the judgment module 941. Both step 051 and step 052 can be implemented by the control module 95. That is, the judgment module 941 can be configured to judge the application scene of the light transmitter 100, and the control module 95 can be configured to control the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency when the application scene is the first scene and the human face exists in the scene image, and to control the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency when the application scene is the second scene and the human face exists in the scene image.
Referring to FIG. 1 again, in some embodiments, steps 041, 051, and 052 can all be implemented by the processor 805. That is, the processor 805 can be configured to judge the application scene of the light transmitter 100, to control the light transmitter 100 to emit light at the first light-emitting power and the first sub-turn-on frequency when the application scene is the first scene and the human face exists in the scene image, and to control the light transmitter 100 to emit light at the first light-emitting power and the second sub-turn-on frequency when the application scene is the second scene and the human face exists in the scene image.
The first scene refers to an application scene in which the application time of the time-of-flight depth camera 300 is less than a preset time, for example capturing a static three-dimensional image of a scene, unlocking the electronic device 800 based on a three-dimensional face, making a payment based on a three-dimensional face, and other application scenes in which the time-of-flight module is used only briefly. The second scene refers to an application scene in which the application time of the time-of-flight depth camera 300 is greater than or equal to the preset time, for example an application scene in which the user conducts a three-dimensional video chat with other users.
Take as an example the case where the first scene is unlocking the electronic device 800 based on a three-dimensional face and the second scene is a three-dimensional video chat between the user and other users. When the time-of-flight depth camera 300 is used to capture the user's three-dimensional face to unlock the electronic device 800, the light receiver 200 usually only needs to output a few frames of depth images per second, for example 3, 4, or 5 frames per second, and the corresponding turn-on frequency of the light transmitter 100 is 3, 4, or 5 times per second. In this case, to reduce the damage of the laser to the user's eyes, the light receiver 200 may output only 1 frame, and correspondingly the processor 805 may set the first sub-turn-on frequency to 1 time per second. When the time-of-flight depth camera 300 is used to capture a three-dimensional video of the user so that the user can use the electronic device 800 to conduct a three-dimensional video chat with other users, the light receiver 200 usually needs to output more depth images per second, for example 30 or 60 frames per second, and the corresponding turn-on frequency of the light transmitter 100 is 30 or 60 times per second. In this case, to reduce the damage of the laser to the user's eyes, the light receiver 200 may output only 24 frames, and correspondingly the processor 805 may set the first sub-turn-on frequency to 24 times per second. It can be understood that when the picture refresh rate of the electronic device 800 reaches 24 frames per second, the human eye sees a smooth picture. Therefore, the first sub-turn-on frequency can be set to 24 times per second, so that the light transmitter 100 outputs depth images at the lowest sufficient frame rate, which on the one hand reduces the damage of the laser to the user's eyes and on the other hand ensures that the user sees a smooth three-dimensional video picture.
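The scene-dependent choice of sub-turn-on frequency described above can be sketched as follows, using the example values from the text (1 time per second for a brief unlock-style scene, 24 times per second for a long video-chat-style scene); the preset-time threshold is an illustrative assumption, since the text does not give a number for it:

```python
def first_sub_turn_on_frequency(application_time_s, preset_time_s=10.0):
    """First scene (application time < preset time, e.g. face unlock):
    1 frame per second is enough. Second scene (>= preset time, e.g.
    3-D video chat): 24 frames per second keeps the picture smooth while
    minimizing eye exposure. `preset_time_s` is an assumed threshold."""
    if application_time_s < preset_time_s:
        return 1    # first sub-turn-on frequency, times per second
    return 24       # second sub-turn-on frequency, times per second

print(first_sub_turn_on_frequency(2.0))   # unlock-style scene
print(first_sub_turn_on_frequency(60.0))  # video-chat-style scene
```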
In this way, by distinguishing the application scenes and using different first turn-on frequencies for different application scenes, the risk of the laser harming the user can be reduced while still meeting the user's needs, giving a better user experience.
Referring to FIGS. 1, 8, and 9 together, in some embodiments, the control method further includes, after step 03:
042: obtaining a projection distance between the user and the light transmitter 100; and
043: calculating the first light-emitting power according to the projection distance.
Step 042 includes:
0421: calculating a first ratio occupied by the human face in the scene image; and
0422: calculating the projection distance according to the first ratio.
Referring to FIGS. 1, 10, and 11 together, in some embodiments, the control device 90 further includes a second acquisition module 942 and a calculation module 943. The second acquisition module 942 includes a first calculation unit 9421 and a second calculation unit 9422. Step 042 can be implemented by the second acquisition module 942. Step 043 can be implemented by the calculation module 943. Step 0421 can be implemented by the first calculation unit 9421, and step 0422 can be implemented by the second calculation unit 9422. That is, the second acquisition module 942 can be configured to obtain the projection distance between the user and the light transmitter 100; the calculation module 943 can be configured to calculate the first light-emitting power according to the projection distance; the first calculation unit 9421 can be configured to calculate the first ratio occupied by the human face in the scene image; and the second calculation unit 9422 can be configured to calculate the projection distance according to the first ratio.
Referring to FIG. 1 again, in some embodiments, steps 042, 043, 0421, and 0422 can all be implemented by the processor 805. That is, the processor 805 can further be configured to obtain the projection distance between the user and the light transmitter 100 and to calculate the first light-emitting power according to the projection distance. When obtaining the projection distance between the user and the light transmitter 100, the processor 805 specifically performs the operations of calculating the first ratio occupied by the human face in the scene image and calculating the projection distance according to the first ratio.
具体地，处理器805在场景图像中识别到人脸后，处理器805提取出人脸并计算人脸所占的像素个数，随后，处理器805将人脸的像素个数除以场景图像的总像素个数以得到场景图像中人脸所占的第一比例，最后基于第一比例计算投射距离。一般地，当第一比例较大时，说明用户比较靠近飞行时间深度相机300，也就是用户比较靠近光发射器100，投射距离较小；当第一比例较小时，说明用户与飞行时间深度相机300距离较远，也就是用户距离光发射器100较远，投射距离较大。因此，投射距离与第一比例之间的关系满足投射距离随第一比例的减小而增大。在一个例子中，当场景图像中包含多张人脸时，可以选取多张人脸中面积最大的人脸来计算第一比例；或者，也可以选取多张人脸的面积的平均值来计算第一比例；或者，可以从多张人脸中识别出电子装置800的持有者的人脸，利用持有者的人脸来计算第一比例。Specifically, after the processor 805 recognizes a human face in the scene image, the processor 805 extracts the face and counts the number of pixels it occupies, then divides that number by the total number of pixels of the scene image to obtain the first proportion of the face in the scene image, and finally calculates the projection distance based on the first proportion. Generally, when the first proportion is large, the user is close to the time-of-flight depth camera 300, that is, close to the light emitter 100, and the projection distance is small; when the first proportion is small, the user is far from the time-of-flight depth camera 300, that is, far from the light emitter 100, and the projection distance is large. Therefore, the projection distance increases as the first proportion decreases. In one example, when the scene image contains multiple faces, the face with the largest area may be selected to calculate the first proportion; alternatively, the average of the areas of the multiple faces may be used; or the face of the owner of the electronic device 800 may be recognized among the multiple faces and used to calculate the first proportion.
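The first-proportion computation above, with the largest-face policy for multi-face images, can be sketched as follows. This is an illustrative sketch; the function name and input representation (per-face pixel counts) are assumptions, not identifiers from the patent.

```python
# Sketch of computing the first proportion (face pixels / total image pixels),
# selecting the largest face when several faces are detected, as described above.
def first_proportion(face_pixel_counts, total_pixels):
    return max(face_pixel_counts) / total_pixels

print(first_proportion([1200, 4800], 48000))  # largest face -> 0.1
```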
第一比例与投射距离具有映射关系，例如，第一比例为一个具体值，投射距离也为一个具体值，第一比例与投射距离一一对应；或者，第一比例为一个范围，投射距离为一个具体值，第一比例与投射距离一一对应；或者，第一比例为一个范围，投射距离也为一个范围，第一比例与投射距离一一对应。具体地，第一比例与投射距离之间的映射关系可以预先标定。在标定时，指引用户分别站在距离红外相机或可见光相机400的多个预定投射距离的位置处，红外相机或可见光相机400依次采集场景图像。处理器805计算每张场景图像中人脸占场景图像的标定比例，再存储每张场景图像中的标定比例与预定投射距离之间的对应关系，在后续使用时，基于实际测量的第一比例在上述映射关系中寻找与第一比例对应的投射距离。例如，指引用户在投射距离为10厘米、20厘米、30厘米、40厘米的位置处站立，红外相机或可见光相机400依次采集场景图像，处理器805根据多张场景图像计算出与投射距离10厘米、20厘米、30厘米、40厘米分别对应的标定比例80%、60%、45%、30%，并将标定比例与预定投射距离的映射关系10cm-80%、20cm-60%、30cm-45%、40cm-30%以映射表的形式存储在电子装置800的存储器806中。在后续使用时，直接在映射表中寻找与第一比例对应的投射距离。The first proportion has a mapping relationship with the projection distance. For example, the first proportion may be a specific value and the projection distance a specific value, in one-to-one correspondence; or the first proportion may be a range and the projection distance a specific value, in one-to-one correspondence; or both the first proportion and the projection distance may be ranges, in one-to-one correspondence. Specifically, the mapping relationship between the first proportion and the projection distance may be calibrated in advance. During calibration, the user is instructed to stand at multiple predetermined projection distances from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects scene images in turn. The processor 805 calculates, for each scene image, the calibration proportion of the face in the scene image, and stores the correspondence between each calibration proportion and its predetermined projection distance. In subsequent use, the projection distance corresponding to the actually measured first proportion is found in the above mapping relationship. For example, the user is instructed to stand at projection distances of 10 cm, 20 cm, 30 cm and 40 cm, the infrared camera or visible light camera 400 collects scene images in turn, and the processor 805 calculates from the scene images the calibration proportions 80%, 60%, 45% and 30% corresponding to the projection distances of 10 cm, 20 cm, 30 cm and 40 cm respectively, and stores the mapping relationship between the calibration proportions and the predetermined projection distances (10 cm-80%, 20 cm-60%, 30 cm-45%, 40 cm-30%) in the form of a mapping table in the memory 806 of the electronic device 800. In subsequent use, the projection distance corresponding to the first proportion is looked up directly in the mapping table.
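The mapping-table lookup above can be sketched with the example calibration points (10 cm-80%, 20 cm-60%, 30 cm-45%, 40 cm-30%). This is an illustrative sketch: the nearest-entry matching policy is an assumption, since the text only says the distance corresponding to the first proportion is looked up in the table.

```python
# Calibration points from the example above: (calibration proportion, distance in cm).
CALIBRATION_TABLE = [(0.80, 10), (0.60, 20), (0.45, 30), (0.30, 40)]

def lookup_projection_distance(first_ratio):
    # Return the distance of the calibration entry whose proportion is
    # closest to the measured first proportion (nearest-entry assumption).
    _, distance = min(CALIBRATION_TABLE, key=lambda e: abs(e[0] - first_ratio))
    return distance

print(lookup_projection_distance(0.62))  # nearest calibrated proportion is 60% -> 20
```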
或者，预先对投射距离与第一比例进行标定。在标定时，指引用户站在距离红外相机或可见光相机400的某一个预定投射距离处，红外相机或可见光相机400采集场景图像。处理器805计算场景图像中人脸占场景图像的标定比例，再存储场景图像中的标定比例与预定投射距离之间的对应关系，在后续使用时，基于标定比例与预定投射距离之间的对应关系计算投射距离。例如，指引用户在投射距离为30厘米的位置处站立，红外相机或可见光相机400采集场景图像，处理器805计算到人脸在场景图像中的占比为45%，而在实际测量中，当计算得到第一比例为R时，则依据相似三角形的性质有D=30cm×45%/R，其中，D为依据实际测量的第一比例R计算的实际的投射距离。Alternatively, the projection distance and the first proportion may be calibrated in advance. During calibration, the user is instructed to stand at one predetermined projection distance from the infrared camera or visible light camera 400, and the infrared camera or visible light camera 400 collects a scene image. The processor 805 calculates the calibration proportion of the face in the scene image, and stores the correspondence between the calibration proportion and the predetermined projection distance. In subsequent use, the projection distance is calculated based on this correspondence. For example, the user is instructed to stand at a projection distance of 30 cm, the infrared camera or visible light camera 400 collects a scene image, and the processor 805 calculates that the proportion of the face in the scene image is 45%. In actual measurement, when the first proportion is calculated as R, then according to the properties of similar triangles, D = 30 cm × 45% / R, where D is the actual projection distance calculated according to the actually measured first proportion R.
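The single-point similar-triangle calculation above can be sketched directly. This is an illustrative sketch: the original formula appears only in a figure that is not reproduced here, so the linear inverse-proportion model (distance scales with calibration proportion over measured proportion) is an assumption consistent with the surrounding text.

```python
# Sketch of the single-point calibration: a face proportion of 45% was
# measured at 30 cm; a measured first proportion R then maps to a distance
# that shrinks as R grows (the user gets closer as the face fills the image).
def projection_distance(first_ratio, calib_ratio=0.45, calib_distance=30.0):
    return calib_distance * calib_ratio / first_ratio
```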
如此，依据场景图像中人脸所占的第一比例，可以较为客观地反映用户与光发射器100之间的投射距离。基于投射距离来计算第一发光功率，光发射器100可以以较为合适的发光功率发光，一方面可以避免光发射器100的发光功率过高而对用户的眼睛造成伤害，另一方面也能避免发光功率过低导致获取的场景的深度信息不准确。In this way, the first proportion of the human face in the scene image can objectively reflect the projection distance between the user and the light emitter 100. By calculating the first luminous power based on the projection distance, the light emitter 100 can emit light at a suitable power, which on the one hand prevents the luminous power from being so high that it harms the user's eyes, and on the other hand prevents the luminous power from being so low that the acquired depth information of the scene is inaccurate.
请参阅图12，在某些实施方式中，步骤0422根据第一比例计算投射距离包括：Referring to FIG. 12, in some embodiments, step 0422, calculating the projection distance according to the first proportion, includes:
0423:计算场景图像中人脸的预设特征区域占人脸的第二比例;和0423: Calculate the second proportion of the preset feature area of the human face in the scene image; and
0424:根据第一比例及第二比例计算投射距离。0424: Calculate the projection distance according to the first scale and the second scale.
请参阅图13，在某些实施方式中，第二计算单元9422包括第一计算子单元9423和第二计算子单元9424。步骤0423可以由第一计算子单元9423实现。步骤0424可以由第二计算子单元9424实现。也即是说，第一计算子单元9423可用于计算场景图像中人脸的预设特征区域占人脸的第二比例。第二计算子单元9424可用于根据第一比例及第二比例计算投射距离。Referring to FIG. 13, in some embodiments, the second calculation unit 9422 includes a first calculation subunit 9423 and a second calculation subunit 9424. Step 0423 may be implemented by the first calculation subunit 9423, and step 0424 by the second calculation subunit 9424. That is, the first calculation subunit 9423 may be used to calculate the second proportion of the preset feature region of the face in the scene image to the face, and the second calculation subunit 9424 may be used to calculate the projection distance according to the first proportion and the second proportion.
请再参阅图1，在某些实施方式中，步骤0423和步骤0424均可以由处理器805实现。也即是说，处理器805还可用于计算场景图像中人脸的预设特征区域占人脸的第二比例、以及根据第一比例及第二比例计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0423 and step 0424 may both be implemented by the processor 805. That is, the processor 805 may further be used to calculate the second proportion of the preset feature region of the face in the scene image to the face, and to calculate the projection distance according to the first proportion and the second proportion.
可以理解，不同的用户的人脸大小有差异，使得不同的用户处于同样的距离下时，采集到的场景图像中人脸所占的第一比例有差异。第二比例为人脸的预设特征区域占人脸的比例，预设特征区域可以选择不同用户个体的差异度较小的特征区域，例如预设特征区域可以为用户的双眼间距。当第二比例较大时，说明该用户的人脸较小，仅依据第一比例计算得到的投射距离过大；当第二比例较小时，说明该用户的人脸较大，仅依据第一比例计算得到的投射距离过小。在实际使用中，可以预先对第一比例、第二比例与投射距离进行标定。具体地，指引用户站在预定的投射距离位置处，并采集场景图像，再计算该场景图像对应的第一标定比例及第二标定比例，存储该预定的投射距离与第一标定比例、第二标定比例的对应关系，以便于在后续使用中依据实际的第一比例和第二比例计算投射距离。例如，指引用户站在投射距离为25厘米处，并采集场景图像，再计算该场景图像对应的第一标定比例为50%，第二标定比例为10%，而在实际测量中，当计算得到的第一比例为R1，第二比例为R2时，则依据三角形相似的性质有D1=25cm×50%/R1，其中，D1为依据实际测量的第一比例R1计算得到的初始的投射距离，可以再依据关系式D2=D1×10%/R2求得进一步依据实际测量的第二比例R2计算得到的校准的投射距离D2，D2作为最终的投射距离。如此，依据第一比例和第二比例计算得到的投射距离考虑了不同用户之间的个体差异，能够获得更加客观的投射距离，进一步地可以基于较为准确的投射距离确定出较为准确的第一发光功率。It can be understood that the face sizes of different users differ, so that when different users are at the same distance, the first proportion of the face in the collected scene image differs. The second proportion is the proportion of a preset feature region of the face to the face; for the preset feature region, a feature region with small differences among individual users may be selected, for example the distance between the user's eyes. When the second proportion is large, the user's face is small, and the projection distance calculated from the first proportion alone is too large; when the second proportion is small, the user's face is large, and the projection distance calculated from the first proportion alone is too small. In actual use, the first proportion, the second proportion and the projection distance may be calibrated in advance. Specifically, the user is instructed to stand at a predetermined projection distance, a scene image is collected, the first calibration proportion and the second calibration proportion corresponding to the scene image are calculated, and the correspondence between the predetermined projection distance and the first and second calibration proportions is stored, so that in subsequent use the projection distance can be calculated from the actual first and second proportions.
For example, the user is instructed to stand at a projection distance of 25 cm, a scene image is collected, and the first calibration proportion corresponding to the scene image is calculated to be 50% and the second calibration proportion to be 10%. In actual measurement, when the first proportion is calculated as R1 and the second proportion as R2, then according to the properties of similar triangles, D1 = 25 cm × 50% / R1, where D1 is the initial projection distance calculated according to the actually measured first proportion R1; a calibrated projection distance D2, further calculated according to the actually measured second proportion R2, can then be obtained according to the relationship D2 = D1 × 10% / R2, and D2 is used as the final projection distance. In this way, the projection distance calculated according to the first proportion and the second proportion takes into account individual differences between users and is therefore more objective; further, a more accurate first luminous power can be determined based on the more accurate projection distance.
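The two-step correction above (single-point calibration of 50% first proportion and 10% second proportion at 25 cm) can be sketched as follows. This is an illustrative sketch: the original formulas appear only in figures not reproduced here, so the linear similar-triangle model is an assumption consistent with the surrounding text.

```python
# Sketch of the two-proportion distance correction described above.
def corrected_distance(r1, r2, d0=25.0, c1=0.50, c2=0.10):
    d1 = d0 * c1 / r1   # initial distance from the first proportion alone
    return d1 * c2 / r2  # a larger second proportion means a smaller face,
                         # so the initial distance is scaled down

print(corrected_distance(0.50, 0.10))  # reproduces the calibration point: 25.0
```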
请参阅图14，在某些实施方式中，步骤0422根据第一比例计算投射距离包括：Referring to FIG. 14, in some embodiments, step 0422, calculating the projection distance according to the first proportion, includes:
0425:根据场景图像判断用户是否配戴眼镜;和0425: Determine whether the user is wearing glasses based on the scene image; and
0426:在用户佩戴眼镜时根据第一比例及距离系数计算投射距离。0426: Calculate the projection distance according to the first scale and the distance coefficient when the user wears glasses.
请参阅图15，在某些实施方式中，第二计算单元9422还包括第一判断子单元9425和第三计算子单元9426。步骤0425可以由第一判断子单元9425实现。步骤0426可以由第三计算子单元9426实现。也即是说，第一判断子单元9425可用于根据场景图像判断用户是否配戴眼镜。第三计算子单元9426可用于在用户佩戴眼镜时根据第一比例及距离系数计算投射距离。Referring to FIG. 15, in some embodiments, the second calculation unit 9422 further includes a first judgment subunit 9425 and a third calculation subunit 9426. Step 0425 may be implemented by the first judgment subunit 9425, and step 0426 by the third calculation subunit 9426. That is, the first judgment subunit 9425 may be used to determine, according to the scene image, whether the user wears glasses, and the third calculation subunit 9426 may be used to calculate the projection distance according to the first proportion and the distance coefficient when the user wears glasses.
请再参阅图1，在某些实施方式中，步骤0425和步骤0426均可以由处理器805实现。也即是说，处理器805还可用于根据场景图像判断用户是否配戴眼镜，在用户佩戴眼镜时根据第一比例及距离系数计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0425 and step 0426 may both be implemented by the processor 805. That is, the processor 805 may further be used to determine, according to the scene image, whether the user wears glasses, and to calculate the projection distance according to the first proportion and the distance coefficient when the user wears glasses.
可以理解，用户是否佩戴眼镜可以用于表征用户眼睛的健康状况，具体为用户佩戴眼镜则表明用户的眼睛已经患有相关的眼疾或视力不佳，在光发射器100对佩戴眼镜的用户发射激光时，需要降低光发射器100的发光功率以使得光发射器100发射的激光的能量较小，以免对用户的眼睛造成伤害。预设的距离系数可以是介于0至1的系数，例如0.6、0.78、0.82、0.95等，例如在根据第一比例计算得到初始的投射距离，或者在依据第一比例和第二比例计算得到校准后的投射距离后，再将初始的投射距离或者校准的投射距离乘以距离系数，得到最终的投射距离，并根据该投射距离计算第一发光功率。如此，可以避免发射激光的功率过大伤害患有眼疾或视力不佳的用户。It can be understood that whether the user wears glasses can be used to characterize the health of the user's eyes: if the user wears glasses, the user's eyes may already suffer from a related eye disease or poor eyesight. When the light emitter 100 emits laser light toward a user wearing glasses, the luminous power of the light emitter 100 needs to be reduced so that the energy of the emitted laser light is small, so as not to harm the user's eyes. The preset distance coefficient may be a coefficient between 0 and 1, such as 0.6, 0.78, 0.82 or 0.95. For example, after the initial projection distance is calculated from the first proportion, or the calibrated projection distance is calculated from the first and second proportions, the initial or calibrated projection distance is multiplied by the distance coefficient to obtain the final projection distance, and the first luminous power is calculated according to this projection distance. In this way, users suffering from eye diseases or poor eyesight can be protected from laser light whose power is too high.
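The glasses adjustment above can be sketched as a simple scaling. This is an illustrative sketch; using 0.8 as a default coefficient is an assumption, since the text only gives 0.6, 0.78, 0.82 and 0.95 as examples of values in (0, 1).

```python
# Sketch of the glasses-based distance adjustment described above: if the
# user wears glasses, the distance used to choose the first luminous power
# is scaled down, so a lower power is selected.
def adjusted_distance(distance, wears_glasses, coefficient=0.8):
    return distance * coefficient if wears_glasses else distance
```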
请参阅图16，在某些实施方式中，步骤0422根据第一比例计算投射距离包括：Referring to FIG. 16, in some embodiments, step 0422, calculating the projection distance according to the first proportion, includes:
0427:根据场景图像判断用户的年龄;和0427: judging the user's age based on the scene image; and
0428:根据第一比例及年龄计算投射距离。0428: Calculate the projection distance based on the first ratio and age.
请参阅图17，在某些实施方式中，第二计算单元9422包括第二判断子单元9427和第四计算子单元9428。步骤0427可以由第二判断子单元9427实现，步骤0428可以由第四计算子单元9428实现。也即是说，第二判断子单元9427可用于根据场景图像判断用户的年龄。第四计算子单元9428可用于根据第一比例及年龄计算投射距离。Referring to FIG. 17, in some embodiments, the second calculation unit 9422 includes a second judgment subunit 9427 and a fourth calculation subunit 9428. Step 0427 may be implemented by the second judgment subunit 9427, and step 0428 by the fourth calculation subunit 9428. That is, the second judgment subunit 9427 may be used to determine the user's age according to the scene image, and the fourth calculation subunit 9428 may be used to calculate the projection distance according to the first proportion and the age.
请再参阅图1，在某些实施方式中，步骤0427和步骤0428均可以由处理器805实现。也即是说，处理器805可用于根据场景图像判断用户的年龄，根据第一比例及年龄计算投射距离。Please refer to FIG. 1 again. In some embodiments, step 0427 and step 0428 may both be implemented by the processor 805. That is, the processor 805 may be used to determine the user's age according to the scene image and to calculate the projection distance according to the first proportion and the age.
不同年龄段的人对红外激光的耐受能力不同，例如小孩和老人更容易被激光灼伤等，可能对于成年人而言是合适强度的激光会对小孩造成伤害。本实施方式中，可以提取场景图像中，人脸皱纹的特征点的数量、分布和面积等来判断用户的年龄，例如，提取眼角处皱纹的数量来判断用户的年龄，或者进一步结合用户的额头处的皱纹多少来判断用户的年龄。在判断用户的年龄后，可以依据用户的年龄得到比例系数，具体可以是在查询表中查询得知年龄与比例系数的对应关系，例如，年龄在15岁以下时，比例系数为0.6；年龄在15岁至20岁时，比例系数为0.8；年龄在20岁至45岁时，比例系数为1.0；年龄在45岁以上时，比例系数为0.8。在得知比例系数后，可以将根据第一比例计算得到的初始的投射距离、或者根据第一比例及第二比例计算得到的校准的投射距离乘以比例系数，以得到最终的投射距离，再根据投射距离计算第一发光功率。如此，可以避免发射激光的功率过大而伤害小年龄段或者年龄较大的用户。People of different ages have different tolerances to infrared lasers; for example, children and the elderly are more susceptible to laser burns, and a laser whose intensity is appropriate for adults may harm children. In this embodiment, the number, distribution and area of feature points of facial wrinkles in the scene image may be extracted to determine the user's age; for example, the number of wrinkles at the corners of the eyes may be used, or the number of wrinkles on the user's forehead may further be combined to determine the user's age. After the user's age is determined, a scale factor can be obtained according to the age, specifically by looking up the correspondence between age and scale factor in a lookup table. For example, when the age is under 15, the scale factor is 0.6; from 15 to 20, the scale factor is 0.8; from 20 to 45, the scale factor is 1.0; above 45, the scale factor is 0.8. After the scale factor is known, the initial projection distance calculated from the first proportion, or the calibrated projection distance calculated from the first and second proportions, can be multiplied by the scale factor to obtain the final projection distance, and the first luminous power is then calculated according to the projection distance. In this way, harm to younger or older users from excessive laser power can be avoided.
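The age-to-scale-factor lookup table above can be sketched as follows. This is an illustrative sketch; the behavior at the exact boundaries 15, 20 and 45 is ambiguous in the text, so the half-open intervals used here are an assumption.

```python
# Sketch of the age-based scale factor table described above
# (<15: 0.6, 15-20: 0.8, 20-45: 1.0, >45: 0.8).
def age_scale_factor(age):
    if age < 15:
        return 0.6
    if age < 20:
        return 0.8
    if age <= 45:
        return 1.0
    return 0.8
```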
请参阅图18，在某些实施方式中，步骤07在场景图像中不存在人脸时，控制光发射器100以第二发光功率和/或第二开启频率发光包括：Referring to FIG. 18, in some embodiments, step 07, controlling the light emitter 100 to emit light at the second luminous power and/or the second turn-on frequency when there is no human face in the scene image, includes:
071:获取场景中的目标主体与光发射器100的投射距离；071: Obtain a projection distance between the target subject in the scene and the light emitter 100;
072:获取场景的环境亮度;和072: Get the ambient brightness of the scene; and
073:根据环境亮度及投射距离计算第二发光功率。073: Calculate the second luminous power according to the ambient brightness and the projection distance.
请参阅图19，在某些实施方式中，控制模块95包括第一获取单元951、第二获取单元952和第三计算单元953。步骤071可以由第一获取单元951实现。步骤072可以由第二获取单元952实现。步骤073可以由第三计算单元953实现。也即是说，第一获取单元951可用于获取场景中的目标主体与光发射器100的投射距离。第二获取单元952可用于获取场景的环境亮度。第三计算单元953可用于根据环境亮度及投射距离计算第二发光功率。Referring to FIG. 19, in some embodiments, the control module 95 includes a first acquisition unit 951, a second acquisition unit 952 and a third calculation unit 953. Step 071 may be implemented by the first acquisition unit 951, step 072 by the second acquisition unit 952, and step 073 by the third calculation unit 953. That is, the first acquisition unit 951 may be used to obtain the projection distance between the target subject in the scene and the light emitter 100, the second acquisition unit 952 to obtain the ambient brightness of the scene, and the third calculation unit 953 to calculate the second luminous power according to the ambient brightness and the projection distance.
请再参阅图1，在某些实施方式中，步骤071、步骤072和步骤073均可以由处理器805实现。也即是说，处理器805可用于获取场景中的目标主体与光发射器100的投射距离、获取场景的环境亮度、以及根据环境亮度及投射距离计算第二发光功率。Please refer to FIG. 1 again. In some embodiments, step 071, step 072 and step 073 may all be implemented by the processor 805. That is, the processor 805 may be used to obtain the projection distance between the target subject in the scene and the light emitter 100, obtain the ambient brightness of the scene, and calculate the second luminous power according to the ambient brightness and the projection distance.
其中，场景中的目标主体与光发射器100之间的投射距离可以由飞行时间深度相机300来获取。具体地，光发射器100以预设发光功率和预设发光频率发射激光，光接收器200接收由场景中的物体反射回的激光，处理器805基于光接收器200接收到的激光计算场景的初始深度信息。随后，处理器805从场景中确定目标主体，由于目标主体一般处于光接收器200的视场的中央区域，因此，可以将光接收器200视场的中央区域作为目标主体所在区域，从而将中央区域的这部分像素的初始深度信息作为目标主体的初始深度信息。一般地，目标主体的初始深度信息的值有多个，处理器805可计算出多个初始深度信息的均值或中值，并将均值或中值作为光发射器100与目标主体之间的投射距离。如此，即可通过飞行时间深度相机300获得光发射器100与目标主体之间的投射距离。The projection distance between the target subject in the scene and the light emitter 100 may be obtained by the time-of-flight depth camera 300. Specifically, the light emitter 100 emits laser light at a preset luminous power and a preset emission frequency, the light receiver 200 receives the laser light reflected back by objects in the scene, and the processor 805 calculates initial depth information of the scene based on the laser light received by the light receiver 200. The processor 805 then determines the target subject in the scene. Since the target subject is generally located in the central region of the field of view of the light receiver 200, the central region of the field of view may be taken as the region where the target subject is located, and the initial depth information of the pixels in this central region may be taken as the initial depth information of the target subject. Generally, the initial depth information of the target subject has multiple values; the processor 805 may calculate the mean or median of these values and use it as the projection distance between the light emitter 100 and the target subject. In this way, the projection distance between the light emitter 100 and the target subject can be obtained by the time-of-flight depth camera 300.
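The central-region averaging above can be sketched on a depth map represented as a list of rows. This is an illustrative sketch; defining the "central region" as the middle third of the field of view in each dimension is an assumption, since the text does not specify the region's size.

```python
# Sketch: take the central region of the initial depth map as the target
# subject and use the mean of its depth values as the projection distance.
def target_subject_distance(depth_map):
    rows, cols = len(depth_map), len(depth_map[0])
    center = [depth_map[r][c]
              for r in range(rows // 3, 2 * rows // 3)
              for c in range(cols // 3, 2 * cols // 3)]
    return sum(center) / len(center)  # the text also allows the median
```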
环境亮度可以由光传感器检测，处理器805从光传感器中读取其检测到的环境亮度。或者，环境亮度也可以由红外相机（可以为光接收器200）或可见光相机400来检测。红外相机或可见光相机400拍摄场景的图像，处理器805计算图像的亮度值以作为环境亮度。The ambient brightness may be detected by a light sensor, and the processor 805 reads the detected ambient brightness from the light sensor. Alternatively, the ambient brightness may be detected by an infrared camera (which may be the light receiver 200) or the visible light camera 400: the infrared camera or visible light camera 400 captures an image of the scene, and the processor 805 calculates the brightness value of the image as the ambient brightness.
在确定出环境亮度和投射距离后，处理器805基于环境亮度和投射距离两个参数共同计算第二发光功率。After determining the ambient brightness and the projection distance, the processor 805 calculates the second luminous power based on both parameters together.
可以理解的是，在环境亮度较高时，环境光中包含的红外光成分较多，环境光中的红外光与光发射器100发射的红外激光的波段重合的部分也较多，此时，光接收器200同时会接收到光发射器100发射的红外激光以及环境光中的红外光，若光发射器100发射红外激光的发光功率较低，则光接收器200接收的光中的来自光发射器100的红外激光与来自环境光中的红外光二者的占比相差不大，如此会导致光接收器200接收光的时间点不准确，或者导致光接收器200接收到的光总量不够准确，进一步会降低深度信息的获取精度，因此，需要提升光发射器100发射红外激光的发射功率，以减小环境中的红外光对光接收器200接收来自光发射器100的红外激光的影响；而在环境亮度较低时，环境光线中包含的红外光成分较少，此时光发射器100若采用较高的发光功率发光，则会增加电子装置800的功耗。另外，在投射距离较远时，激光的飞行时间较长，飞行行程较远，激光的损耗较多，会对深度信息的获取精度产生影响。因此，在投射距离较大时，可以适当提升光发射器100的第二发光功率。在投射距离较小时，可以适当降低光发射器100的第二发光功率。It can be understood that when the ambient brightness is high, the ambient light contains more infrared components, and a larger part of this infrared light overlaps the wavelength band of the infrared laser emitted by the light emitter 100. At this time, the light receiver 200 receives both the infrared laser emitted by the light emitter 100 and the infrared light in the ambient light. If the luminous power of the light emitter 100 is low, the proportions of these two components in the light received by the light receiver 200 differ little, which makes the time point at which the light receiver 200 receives the light, or the total amount of light it receives, inaccurate, and further reduces the accuracy of the acquired depth information. Therefore, the power at which the light emitter 100 emits the infrared laser needs to be increased, to reduce the influence of the ambient infrared light on the reception of the infrared laser from the light emitter 100. When the ambient brightness is low, the ambient light contains fewer infrared components, and emitting at a high luminous power would only increase the power consumption of the electronic device 800. In addition, when the projection distance is large, the flight time and flight path of the laser are longer and the laser is attenuated more, which affects the accuracy of the acquired depth information. Therefore, when the projection distance is large, the second luminous power of the light emitter 100 may be appropriately increased; when the projection distance is small, it may be appropriately reduced.
如此，基于环境亮度及投射距离共同确定光发射器100的第二发光功率，一方面可以减小电子装置800的功耗，另一方面可以提升场景的深度信息的获取精度。In this way, jointly determining the second luminous power of the light emitter 100 based on the ambient brightness and the projection distance can reduce the power consumption of the electronic device 800 on the one hand, and improve the accuracy of the acquired depth information of the scene on the other.
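The joint dependence above, where both higher ambient brightness and larger projection distance push the second luminous power up, can be sketched as follows. This is an illustrative sketch: the patent gives no formula, so the linear model and every coefficient value here are assumptions chosen only to show the monotonic behavior.

```python
# Sketch: second luminous power grows with ambient brightness (to overcome
# ambient infrared light) and with projection distance (to overcome
# attenuation). Model form and coefficients are illustrative assumptions.
def second_luminous_power(brightness, distance, base=1.0,
                          k_brightness=0.002, k_distance=0.01):
    return base * (1 + k_brightness * brightness) * (1 + k_distance * distance)
```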
在场景中不存在人脸时，光发射器100的第二开启频率可以根据应用场景来确定。例如，应用场景为基于三维人脸解锁电子装置800时，光接收器200每秒钟通常仅需要输出较少的几帧深度图像，例如，每秒钟输出3帧、每秒钟输出4帧、每秒钟输出5帧等，则此时光发射器100的第二开启频率可对应地设为3次/秒、4次/秒、5次/秒等；应用场景为用户使用电子装置800录制三维视频时，光接收器200每秒钟通常需要输出较多的深度图像，例如每秒钟输出30帧、每秒钟输出60帧等，则此时光发射器100的第二开启频率可对应地设为30次/秒、60次/秒等。如此，基于不同的应用场景，设定最适合于每种应用场景的开启频率，满足用户的使用需求。When there is no human face in the scene, the second turn-on frequency of the light emitter 100 may be determined according to the application scenario. For example, when the application scenario is unlocking the electronic device 800 based on a three-dimensional face, the light receiver 200 usually only needs to output a few depth images per second, for example 3, 4 or 5 frames per second, and the second turn-on frequency of the light emitter 100 may correspondingly be set to 3, 4 or 5 times/second. When the application scenario is recording a three-dimensional video with the electronic device 800, the light receiver 200 usually needs to output more depth images per second, for example 30 or 60 frames per second, and the second turn-on frequency may correspondingly be set to 30 or 60 times/second. In this way, the turn-on frequency best suited to each application scenario is set, meeting the user's needs.
请一并参阅图1和图21，本申请实施方式的电子装置800包括壳体801及飞行时间深度相机300。Please refer to FIG. 1 and FIG. 21 together. The electronic device 800 of the embodiments of the present application includes a housing 801 and a time-of-flight depth camera 300.
壳体801可以作为电子装置800的功能元件的安装载体。壳体801可以为功能元件提供防尘、防摔、防水等保护，功能元件可以是显示屏802、可见光相机400、受话器等。在本申请实施例中，壳体801包括主体803及可动支架804，可动支架804在驱动装置的驱动下可以相对于主体803运动，例如可动支架804可以相对于主体803滑动，以滑入主体803（如图20所示）或从主体803滑出（如图1所示）。部分功能元件（例如显示屏802）可以安装在主体803上，另一部分功能元件（例如飞行时间深度相机300、可见光相机400、受话器）可以安装在可动支架804上，可动支架804运动可带动该另一部分功能元件缩回主体803内或从主体803中伸出。当然，图1和图21所示的实施例仅仅是对壳体801的一种具体形式举例，不能理解为对本申请的壳体801的限制。The housing 801 may serve as a mounting carrier for the functional elements of the electronic device 800 and may provide them with protection such as dust-proofing, drop-proofing and water-proofing; the functional elements may include the display screen 802, the visible light camera 400, a receiver, and the like. In the embodiments of the present application, the housing 801 includes a main body 803 and a movable bracket 804. Driven by a driving device, the movable bracket 804 can move relative to the main body 803; for example, it can slide relative to the main body 803 so as to slide into the main body 803 (as shown in FIG. 20) or slide out of it (as shown in FIG. 1). Some functional elements (such as the display screen 802) may be mounted on the main body 803, while others (such as the time-of-flight depth camera 300, the visible light camera 400 and the receiver) may be mounted on the movable bracket 804, whose movement retracts these elements into, or extends them out of, the main body 803. Of course, the embodiments shown in FIG. 1 and FIG. 21 are merely one specific example of the housing 801 and should not be construed as limiting the housing 801 of the present application.
飞行时间深度相机300安装在壳体801上。具体地，壳体801上可以开设有采集窗口，飞行时间深度相机300与采集窗口对准安装以使飞行时间深度相机300采集深度信息。在本申请的具体实施例中，飞行时间深度相机300安装在可动支架804上。用户在需要使用飞行时间深度相机300时，可以触发可动支架804从主体803中滑出以带动飞行时间深度相机300从主体803中伸出；在不需要使用飞行时间深度相机300时，可以触发可动支架804滑入主体803以带动飞行时间深度相机300缩回主体中。The time-of-flight depth camera 300 is mounted on the housing 801. Specifically, a collection window may be opened in the housing 801, and the time-of-flight depth camera 300 is mounted in alignment with the collection window so that it can collect depth information. In the specific embodiments of the present application, the time-of-flight depth camera 300 is mounted on the movable bracket 804. When the user needs to use the time-of-flight depth camera 300, the movable bracket 804 may be triggered to slide out of the main body 803 so as to extend the camera out of the main body 803; when the camera is not needed, the movable bracket 804 may be triggered to slide into the main body 803 so as to retract the camera into the main body.
请一并参阅图21至图23，飞行时间深度相机300包括第一基板组件71、垫块72、光发射器100及光接收器200。第一基板组件71包括互相连接的第一基板711及柔性电路板712。垫块72设置在第一基板711上。光发射器100用于向外投射激光，光发射器100设置在垫块72上。柔性电路板712弯折且柔性电路板712的一端连接第一基板711，另一端连接光发射器100。光接收器200设置在第一基板711上，光接收器200用于接收被场景中的人或物反射回的激光。光接收器200包括外壳741及设置在外壳741上的光学元件742。外壳741与垫块72连接成一体。Please refer to FIGS. 21 to 23 together. The time-of-flight depth camera 300 includes a first substrate assembly 71, a spacer 72, the light emitter 100 and the light receiver 200. The first substrate assembly 71 includes a first substrate 711 and a flexible circuit board 712 connected to each other. The spacer 72 is disposed on the first substrate 711. The light emitter 100, which is used to project laser light outward, is disposed on the spacer 72. The flexible circuit board 712 is bent, with one end connected to the first substrate 711 and the other end connected to the light emitter 100. The light receiver 200 is disposed on the first substrate 711 and is used to receive the laser light reflected back by a person or object in the scene. The light receiver 200 includes a housing 741 and an optical element 742 disposed on the housing 741. The housing 741 is connected with the spacer 72 as a whole.
具体地，第一基板组件71包括第一基板711及柔性电路板712。第一基板711可以是印刷线路板或柔性线路板。第一基板711上可以铺设有飞行时间深度相机300的控制线路等。柔性电路板712的一端可以连接在第一基板711上。柔性电路板712可以发生一定角度的弯折，使得柔性电路板712的两端连接的器件的相对位置可以有较多选择。Specifically, the first substrate assembly 71 includes the first substrate 711 and the flexible circuit board 712. The first substrate 711 may be a printed circuit board or a flexible circuit board, and control circuits of the time-of-flight depth camera 300 may be laid on it. One end of the flexible circuit board 712 may be connected to the first substrate 711, and the flexible circuit board 712 may be bent at a certain angle, so that there are more choices for the relative positions of the devices connected at its two ends.
垫块72设置在第一基板711上。在一个例子中，垫块72与第一基板711接触且承载在第一基板711上，具体地，垫块72可以通过胶粘等方式与第一基板711结合。垫块72的材料可以是金属、塑料等。在本申请的实施例中，垫块72与第一基板711结合的面可以是平面，垫块72与该结合的面相背的面也可以是平面，使得光发射器100设置在垫块72上时具有较好的平稳性。The spacer 72 is disposed on the first substrate 711. In one example, the spacer 72 is in contact with and carried on the first substrate 711; specifically, the spacer 72 may be bonded to the first substrate 711 by gluing or the like. The material of the spacer 72 may be metal, plastic, etc. In the embodiments of the present application, the surface of the spacer 72 bonded to the first substrate 711 may be flat, and the surface opposite to that bonded surface may also be flat, so that the light emitter 100 is stable when disposed on the spacer 72.
光接收器200设置在第一基板711上，且光接收器200和第一基板711的接触面与垫块72和第一基板711的接触面基本齐平设置（即，二者的安装起点在同一平面上）。具体地，光接收器200包括外壳741及光学元件742。外壳741设置在第一基板711上，光学元件742设置在外壳741上，外壳741可以是光接收器200的镜座及镜筒，光学元件742可以是设置在外壳741内的透镜等元件。进一步地，光接收器200还包括感光芯片（图未示），由场景中的人或物反射回的激光通过光学元件742后照射到感光芯片中，感光芯片对该激光产生响应。在本申请的实施例中，外壳741与垫块72连接成一体。具体地，外壳741与垫块72可以是一体成型；或者外壳741与垫块72的材料不同，二者通过双色注塑等方式一体成型。外壳741与垫块72也可以是分别成型，二者形成配合结构，在组装飞行时间深度相机300时，可以先将外壳741与垫块72中的一个设置在第一基板711上，再将另一个设置在第一基板711上且连接成一体。The light receiver 200 is disposed on the first substrate 711, and the contact surface of the light receiver 200 with the first substrate 711 is substantially flush with the contact surface of the spacer 72 with the first substrate 711 (that is, the mounting starting points of the two are on the same plane). Specifically, the light receiver 200 includes a housing 741 and an optical element 742. The housing 741 is disposed on the first substrate 711 and the optical element 742 is disposed on the housing 741; the housing 741 may be the lens holder and lens barrel of the light receiver 200, and the optical element 742 may be an element such as a lens disposed in the housing 741. Further, the light receiver 200 also includes a photosensitive chip (not shown); the laser light reflected back by a person or object in the scene passes through the optical element 742 and irradiates the photosensitive chip, which responds to the laser light. In the embodiments of the present application, the housing 741 is connected with the spacer 72 as a whole. Specifically, the housing 741 and the spacer 72 may be integrally formed; or they may be made of different materials and integrally formed by two-color injection molding or the like. They may also be formed separately and form a matching structure; when assembling the time-of-flight depth camera 300, one of the housing 741 and the spacer 72 may first be disposed on the first substrate 711, and then the other is disposed on the first substrate 711 and connected into a whole.
如此，将光发射器100设置在垫块72上，垫块72可以垫高光发射器100的高度，进而提高光发射器100出射激光的面的高度，光发射器100发射的激光不易被光接收器200遮挡，使得激光能够完全照射到目标空间中的被测物体上。In this way, by disposing the light emitter 100 on the spacer 72, the spacer 72 raises the height of the light emitter 100 and thus the height of the surface from which its laser exits, so that the laser emitted by the light emitter 100 is not easily blocked by the light receiver 200 and can fully irradiate the measured object in the target space.
请结合图23，光发射器100包括第二基板组件51、光发射组件101和外壳52。第二基板组件51设置在垫块72上，第二基板组件51与柔性电路板712连接。光发射组件101设置在第二基板组件51上，光发射组件101用于发射激光。外壳52设置在第二基板组件51上，外壳52形成有收容空间521，收容空间521可用于收容光发射组件101。柔性电路板712可以是可拆装地连接在第二基板组件51上。光发射组件101与第二基板组件51连接。外壳52整体可以呈碗状，且外壳52的开口向下罩设在第二基板组件51上，以将光发射组件101收容在收容空间521内。在本申请实施例中，外壳52上开设有与光发射组件101对应的出光口522，从光发射组件101发出的激光穿过出光口522后发射出去，激光可以直接从出光口522穿出，也可以经其他光学器件改变光路后从出光口522穿出。With reference to FIG. 23, the light emitter 100 includes a second substrate assembly 51, a light emitting assembly 101 and a housing 52. The second substrate assembly 51 is disposed on the spacer 72 and connected to the flexible circuit board 712. The light emitting assembly 101, which is used to emit laser light, is disposed on the second substrate assembly 51. The housing 52 is disposed on the second substrate assembly 51 and forms a receiving space 521 that can receive the light emitting assembly 101. The flexible circuit board 712 may be detachably connected to the second substrate assembly 51. The light emitting assembly 101 is connected to the second substrate assembly 51. The housing 52 may be bowl-shaped as a whole, and its opening faces downward to cover the second substrate assembly 51 so as to receive the light emitting assembly 101 in the receiving space 521. In the embodiments of the present application, a light exit 522 corresponding to the light emitting assembly 101 is opened in the housing 52; the laser emitted from the light emitting assembly 101 passes through the light exit 522 and is emitted outward, either directly or after its optical path has been changed by other optical devices.
第二基板组件51包括第二基板511及补强件512。第二基板511与柔性电路板712连接。光发射组件101及补强件512设置在第二基板511的相背的两侧。第二基板511的具体类型可以是印刷线路板或柔性线路板等，第二基板511上可以铺设有控制线路。补强件512可以通过胶粘、铆接等方式与第二基板511固定连接，补强件512可以增加第二基板组件51整体的强度。光发射器100设置在垫块72上时，补强件512可以与垫块72直接接触，第二基板511不会暴露在外部，且不需要与垫块72直接接触，第二基板511不易受到灰尘等的污染。The second substrate assembly 51 includes a second substrate 511 and a reinforcing member 512. The second substrate 511 is connected to the flexible circuit board 712. The light emitting assembly 101 and the reinforcing member 512 are disposed on opposite sides of the second substrate 511. The second substrate 511 may specifically be a printed circuit board or a flexible circuit board, and control circuits may be laid on it. The reinforcing member 512 may be fixedly connected to the second substrate 511 by gluing, riveting or the like, and can increase the overall strength of the second substrate assembly 51. When the light emitter 100 is disposed on the spacer 72, the reinforcing member 512 may be in direct contact with the spacer 72; the second substrate 511 is then not exposed to the outside and does not need to contact the spacer 72 directly, so it is less likely to be contaminated by dust and the like.
补强件512与垫块72分体成型。在组装飞行时间深度相机300时，可以先将垫块72安装在第一基板711上，此时柔性电路板712的两端分别连接第一基板711及第二基板511，且柔性电路板712可以先不弯折。然后再将柔性电路板712弯折，使得补强件512设置在垫块72上。当然，在其他实施例中，补强件512与垫块72可以一体成型，例如通过注塑等工艺一体成型，在组装飞行时间深度相机300时，可以将垫块72及光发射器100一同安装在第一基板711上。The reinforcing member 512 and the spacer 72 are formed separately. When assembling the time-of-flight depth camera 300, the spacer 72 may first be mounted on the first substrate 711; at this time, the two ends of the flexible circuit board 712 are respectively connected to the first substrate 711 and the second substrate 511, and the flexible circuit board 712 need not be bent yet. The flexible circuit board 712 is then bent so that the reinforcing member 512 is disposed on the spacer 72. Of course, in other embodiments, the reinforcing member 512 and the spacer 72 may be integrally formed, for example by injection molding; when assembling the time-of-flight depth camera 300, the spacer 72 and the light emitter 100 may then be mounted on the first substrate 711 together.
请结合图24，光发射组件101包括光源10、扩散器20、镜筒30、保护罩40、及驱动器61。Referring to FIG. 24, the light emitting assembly 101 includes a light source 10, a diffuser 20, a lens barrel 30, a protective cover 40, and a driver 61.
镜筒30包括呈环状的镜筒侧壁33，环状的镜筒侧壁33围成收容腔62。镜筒侧壁33包括位于收容腔62内的内表面331及与内表面相背的外表面332。镜筒侧壁33包括相背的第一面31及第二面32。收容腔62贯穿第一面31及第二面32。第一面31朝第二面32凹陷形成与收容腔62连通的安装槽34。安装槽34的底面35位于安装槽34的远离第一面31的一侧。镜筒侧壁33的外表面332在第一面31的一端的横截面呈圆形，镜筒侧壁33的外表面332在第一面31的一端形成有外螺纹。镜筒30承载在第二基板511上，第二基板511具体可为电路板511，电路板511与镜筒30的第二面32接触以封闭收容腔62的一端。The lens barrel 30 includes an annular barrel side wall 33 that encloses a receiving cavity 62. The barrel side wall 33 includes an inner surface 331 located in the receiving cavity 62 and an outer surface 332 opposite to the inner surface, and includes a first face 31 and a second face 32 opposite to each other. The receiving cavity 62 penetrates the first face 31 and the second face 32. The first face 31 is recessed toward the second face 32 to form a mounting groove 34 communicating with the receiving cavity 62. The bottom surface 35 of the mounting groove 34 is located on the side of the mounting groove 34 away from the first face 31. The cross section of the outer surface 332 of the barrel side wall 33 at the end of the first face 31 is circular, and an external thread is formed on the outer surface 332 at the end of the first face 31. The lens barrel 30 is carried on the second substrate 511, which may specifically be a circuit board 511; the circuit board 511 is in contact with the second face 32 of the lens barrel 30 to close one end of the receiving cavity 62.
光源10承载在电路板511上并收容在收容腔62内。光源10用于朝镜筒30的第一面31（安装槽34）一侧发射激光。光源10可以是单点光源，也可是多点光源。在光源10为单点光源时，光源10具体可以为边发射型激光器，例如可以为分布反馈式激光器（Distributed Feedback Laser，DFB）等；在光源10为多点光源时，光源10具体可以为垂直腔面发射激光器（Vertical-Cavity Surface-Emitting Laser，VCSEL），或者光源10也可为由多个边发射型激光器组成的多点光源。垂直腔面发射激光器的高度较小，采用垂直腔面发射激光器作为光源10，有利于减小光发射器100的高度，便于将光发射器100集成到手机等对机身厚度有较高的要求的电子装置800中。与垂直腔面发射激光器相比，边发射型激光器的温漂较小，可以减小温度对光源10的投射激光的效果的影响。The light source 10 is carried on the circuit board 511 and received in the receiving cavity 62. The light source 10 is used to emit laser light toward the first face 31 (mounting groove 34) side of the lens barrel 30. The light source 10 may be a single-point light source or a multi-point light source. When the light source 10 is a single-point light source, it may specifically be an edge-emitting laser, for example a distributed feedback laser (DFB); when the light source 10 is a multi-point light source, it may specifically be a vertical-cavity surface-emitting laser (VCSEL), or a multi-point light source composed of multiple edge-emitting lasers. A vertical-cavity surface-emitting laser has a small height; using one as the light source 10 helps reduce the height of the light emitter 100 and facilitates integrating the light emitter 100 into an electronic device 800 such as a mobile phone with high requirements on body thickness. Compared with a vertical-cavity surface-emitting laser, an edge-emitting laser has less temperature drift, which can reduce the influence of temperature on the effect of the laser projected by the light source 10.
驱动器61承载在电路板511上并与光源10电性连接。具体地，驱动器61可以接收经过处理器805调制的调制信号，并将调制信号转化为恒定的电流源后传输给光源10，以使光源10在恒定的电流源的作用下朝镜筒30的第一面31一侧发射激光。本实施方式的驱动器61设置在镜筒30外。在其他实施方式中，驱动器61可以设置在镜筒30内并承载在电路板511上。The driver 61 is carried on the circuit board 511 and electrically connected to the light source 10. Specifically, the driver 61 may receive a modulation signal modulated by the processor 805, convert the modulation signal into a constant current source, and transmit it to the light source 10, so that the light source 10 emits laser light toward the first face 31 side of the lens barrel 30 under the action of the constant current source. The driver 61 of this embodiment is disposed outside the lens barrel 30; in other embodiments, the driver 61 may be disposed inside the lens barrel 30 and carried on the circuit board 511.
扩散器20安装（承载）在安装槽34内并与安装槽34相抵触。扩散器20用于扩散穿过扩散器20的激光。也即是，光源10朝镜筒30的第一面31一侧发射激光时，激光会经过扩散器20并被扩散器20扩散或投射到镜筒30外。The diffuser 20 is mounted (carried) in the mounting groove 34 and abuts against it. The diffuser 20 is used to diffuse the laser light passing through it. That is, when the light source 10 emits laser light toward the first face 31 side of the lens barrel 30, the laser light passes through the diffuser 20 and is diffused or projected out of the lens barrel 30 by the diffuser 20.
保护罩40包括顶壁41及自顶壁41的一侧延伸形成的保护侧壁42。顶壁41的中心开设有通光孔401。保护侧壁42环绕顶壁41及通光孔401设置。顶壁41与保护侧壁42共同围成安装腔43，通光孔401与安装腔43连通。保护侧壁42的内表面的横截面呈圆形，保护侧壁42的内表面上形成有内螺纹。保护侧壁42的内螺纹与镜筒30的外螺纹螺合以将保护罩40安装在镜筒30上。顶壁41与扩散器20的抵触使得扩散器20被夹持在顶壁41与安装槽34的底面35之间。The protective cover 40 includes a top wall 41 and a protective side wall 42 extending from one side of the top wall 41. A light-passing hole 401 is opened in the center of the top wall 41, and the protective side wall 42 surrounds the top wall 41 and the light-passing hole 401. The top wall 41 and the protective side wall 42 together enclose a mounting cavity 43, and the light-passing hole 401 communicates with the mounting cavity 43. The cross section of the inner surface of the protective side wall 42 is circular, and an internal thread is formed on the inner surface of the protective side wall 42. The internal thread of the protective side wall 42 is screwed with the external thread of the lens barrel 30 to mount the protective cover 40 on the lens barrel 30. The top wall 41 abuts against the diffuser 20 so that the diffuser 20 is clamped between the top wall 41 and the bottom surface 35 of the mounting groove 34.
如此，通过在镜筒30上开设安装槽34，并将扩散器20安装在安装槽34内，以及通过保护罩40安装在镜筒30上以将扩散器20夹持在保护罩40与安装槽34的底面35之间，从而实现将扩散器20固定在镜筒30上。此种方式无需使用胶水将扩散器20固定在镜筒30上，能够避免胶水挥发成气态后，气态的胶水扩散并凝固在扩散器20的表面而影响扩散器20的微观结构，并能够避免扩散器20和镜筒30的胶水因老化而使粘着力下降时扩散器20从镜筒30脱落。In this way, by opening the mounting groove 34 in the lens barrel 30, mounting the diffuser 20 in the mounting groove 34, and mounting the protective cover 40 on the lens barrel 30 so as to clamp the diffuser 20 between the protective cover 40 and the bottom surface 35 of the mounting groove 34, the diffuser 20 is fixed on the lens barrel 30. This approach requires no glue to fix the diffuser 20 on the lens barrel 30, which avoids glue that has volatilized into a gaseous state diffusing and solidifying on the surface of the diffuser 20 and affecting its microstructure, and avoids the diffuser 20 falling off the lens barrel 30 when the adhesive force of the glue between them decreases due to aging.
请再参阅图1，本申请还提供一种电子装置800。电子装置包括上述任意一项实施方式所述的飞行时间深度相机300、一个或多个处理器805、存储器806和一个或多个程序。其中一个或多个程序被存储在存储器806中，并且被配置成由一个或多个处理器805执行，程序包括用于执行上述任意一项实施方式所述的控制方法的指令。Please refer to FIG. 1 again. The present application further provides an electronic device 800. The electronic device includes the time-of-flight depth camera 300 according to any one of the above embodiments, one or more processors 805, a memory 806, and one or more programs. The one or more programs are stored in the memory 806 and configured to be executed by the one or more processors 805, and the programs include instructions for executing the control method according to any one of the above embodiments.
例如,请结合图2,程序包括用于执行以下步骤的指令:For example, in conjunction with Figure 2, the program includes instructions for performing the following steps:
01:获取场景的场景图像;01: Get the scene image of the scene;
03:识别场景图像中是否存在人脸;03: identify whether a human face exists in the scene image;
05:在场景图像中存在人脸时，控制光发射器100以第一发光功率和/或第一开启频率发光；05: when a human face is present in the scene image, controlling the light emitter 100 to emit light at a first luminous power and/or a first turn-on frequency;
07:在场景图像中不存在人脸时，控制光发射器100以第二发光功率和/或第二开启频率发光。07: when no human face is present in the scene image, controlling the light emitter 100 to emit light at a second luminous power and/or a second turn-on frequency.
再例如,请结合图6,程序还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 6, the program further includes instructions for performing the following steps:
041:判断光发射器100的应用场景；041: determine an application scenario of the light emitter 100;
051:在应用场景为第一场景，且场景图像中存在所述人脸时，控制光发射器100以第一发光功率和第一子开启频率发光；和051: when the application scenario is the first scenario and the human face is present in the scene image, controlling the light emitter 100 to emit light at the first luminous power and a first sub turn-on frequency; and
052:在应用场景为第二场景，且场景图像中存在所述人脸时，控制光发射器100以第一发光功率和第二子开启频率发光。052: when the application scenario is the second scenario and the human face is present in the scene image, controlling the light emitter 100 to emit light at the first luminous power and a second sub turn-on frequency.
再例如,请结合图8,程序还包括用于执行以下步骤的指令:As another example, in conjunction with FIG. 8, the program further includes instructions for performing the following steps:
042:获取用户与光发射器100的投射距离；和042: Obtain the projection distance between the user and the light emitter 100; and
043:根据投射距离计算第一发光功率。043: Calculate the first luminous power according to the projection distance.
再例如,请结合图9,程序还包括用于执行以下步骤的指令:As another example, in conjunction with FIG. 9, the program further includes instructions for performing the following steps:
0421:计算场景图像中人脸所占的第一比例;和0421: Calculate the first proportion of faces in the scene image; and
0422:根据第一比例计算投射距离。0422: Calculate the projection distance according to the first scale.
再例如,请结合图12,程序还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 12, the program further includes instructions for performing the following steps:
0423:计算场景图像中人脸的预设特征区域占人脸的第二比例;和0423: Calculate the second proportion of the preset feature area of the human face in the scene image; and
0424:根据第一比例及第二比例计算投射距离。0424: Calculate the projection distance according to the first scale and the second scale.
再例如,请结合图14,程序还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 14, the program further includes instructions for performing the following steps:
0425:根据场景图像判断用户是否配戴眼镜;和0425: Determine whether the user is wearing glasses based on the scene image; and
0426:在用户佩戴眼镜时根据第一比例及距离系数计算投射距离。0426: Calculate the projection distance according to the first scale and the distance coefficient when the user wears glasses.
再例如,请结合图16,程序还包括用于执行以下步骤的指令:For another example, in conjunction with FIG. 16, the program further includes instructions for performing the following steps:
0427:根据场景图像判断用户的年龄;和0427: judging the user's age based on the scene image; and
0428:根据第一比例及年龄计算投射距离。0428: Calculate the projection distance based on the first ratio and age.
再例如,请结合图18,程序还包括用于执行以下步骤的指令:As another example, in conjunction with FIG. 18, the program further includes instructions for performing the following steps:
071:获取场景中的目标主体与光发射器100的投射距离；071: Obtain a projection distance between the target subject in the scene and the light emitter 100;
072:获取场景的环境亮度;和072: Get the ambient brightness of the scene; and
073:根据环境亮度及投射距离计算第二发光功率。073: Calculate the second luminous power according to the ambient brightness and the projection distance.
请参阅图25，本申请还提供一种计算机可读存储介质900。计算机可读存储介质900包括与电子装置800结合使用的计算机程序，计算机程序可被处理器805执行以完成上述任意一项实施方式所述的控制方法。Referring to FIG. 25, the present application further provides a computer-readable storage medium 900. The computer-readable storage medium 900 includes a computer program used in combination with the electronic device 800, and the computer program may be executed by the processor 805 to complete the control method according to any one of the above embodiments.
例如，请结合图2，计算机程序可被处理器805执行以完成以下步骤：For example, in conjunction with FIG. 2, the computer program may be executed by the processor 805 to complete the following steps:
01:获取场景的场景图像;01: Get the scene image of the scene;
03:识别场景图像中是否存在人脸;03: identify whether a human face exists in the scene image;
05:在场景图像中存在人脸时，控制光发射器100以第一发光功率和/或第一开启频率发光；05: when a human face is present in the scene image, controlling the light emitter 100 to emit light at a first luminous power and/or a first turn-on frequency;
07:在场景图像中不存在人脸时，控制光发射器100以第二发光功率和/或第二开启频率发光。07: when no human face is present in the scene image, controlling the light emitter 100 to emit light at a second luminous power and/or a second turn-on frequency.
再例如，请结合图6，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 6, the computer program may also be executed by the processor 805 to complete the following steps:
041:判断光发射器100的应用场景；041: determine an application scenario of the light emitter 100;
051:在应用场景为第一场景，且场景图像中存在所述人脸时，控制光发射器100以第一发光功率和第一子开启频率发光；和051: when the application scenario is the first scenario and the human face is present in the scene image, controlling the light emitter 100 to emit light at the first luminous power and a first sub turn-on frequency; and
052:在应用场景为第二场景，且场景图像中存在所述人脸时，控制光发射器100以第一发光功率和第二子开启频率发光。052: when the application scenario is the second scenario and the human face is present in the scene image, controlling the light emitter 100 to emit light at the first luminous power and a second sub turn-on frequency.
再例如，请结合图8，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 8, the computer program may also be executed by the processor 805 to complete the following steps:
042:获取用户与光发射器100的投射距离；和042: Obtain the projection distance between the user and the light emitter 100; and
043:根据投射距离计算第一发光功率。043: Calculate the first luminous power according to the projection distance.
再例如，请结合图9，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 9, the computer program may also be executed by the processor 805 to complete the following steps:
0421:计算场景图像中人脸所占的第一比例;和0421: Calculate the first proportion of faces in the scene image; and
0422:根据第一比例计算投射距离。0422: Calculate the projection distance according to the first scale.
再例如，请结合图12，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 12, the computer program may also be executed by the processor 805 to complete the following steps:
0423:计算场景图像中人脸的预设特征区域占人脸的第二比例;和0423: Calculate the second proportion of the preset feature area of the human face in the scene image; and
0424:根据第一比例及第二比例计算投射距离。0424: Calculate the projection distance according to the first scale and the second scale.
再例如，请结合图14，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 14, the computer program may also be executed by the processor 805 to complete the following steps:
0425:根据场景图像判断用户是否配戴眼镜;和0425: Determine whether the user is wearing glasses based on the scene image; and
0426:在用户佩戴眼镜时根据第一比例及距离系数计算投射距离。0426: Calculate the projection distance according to the first scale and the distance coefficient when the user wears glasses.
再例如，请结合图16，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 16, the computer program may also be executed by the processor 805 to complete the following steps:
0427:根据场景图像判断用户的年龄;和0427: judging the user's age based on the scene image; and
0428:根据第一比例及年龄计算投射距离。0428: Calculate the projection distance based on the first ratio and age.
再例如，请结合图18，计算机程序还可被处理器805执行以完成以下步骤：For another example, in conjunction with FIG. 18, the computer program may also be executed by the processor 805 to complete the following steps:
071:获取场景中的目标主体与光发射器100的投射距离；071: Obtain a projection distance between the target subject in the scene and the light emitter 100;
072:获取场景的环境亮度;和072: Get the ambient brightness of the scene; and
073:根据环境亮度及投射距离计算第二发光功率。073: Calculate the second luminous power according to the ambient brightness and the projection distance.
在本说明书的描述中,参考术语“一个实施例”、“一些实施例”、“示例”、“具体示例”、或“一些示例”等的描述意指结合该实施例或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施例或示例中。在本说明书中,对上述术语的示意性表述不必须针对的是相同的实施例或示例。而且,描述的具体特征、结构、材料或者特点可以在任一个或多个实施例或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。In the description of this specification, the description with reference to the terms “one embodiment”, “some embodiments”, “examples”, “specific examples”, or “some examples” and the like means specific features described in conjunction with the embodiments or examples , Structure, material, or characteristic is included in at least one embodiment or example of the present application. In this specification, the schematic expressions of the above terms are not necessarily directed to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, without any contradiction, those skilled in the art may combine and combine different embodiments or examples and features of the different embodiments or examples described in this specification.
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个该特征。在本申请的描述中,“多个”的含义是至少两个,例如两个,三个等,除非另有明确具体的限定。In addition, the terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Therefore, the features defined as "first" and "second" may explicitly or implicitly include at least one of the features. In the description of the present application, the meaning of "a plurality" is at least two, for example, two, three, etc., unless it is specifically and specifically defined otherwise.
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式 的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。Any process or method description in a flowchart or otherwise described herein can be understood as a module, fragment, or portion of code that includes one or more executable instructions for implementing a particular logical function or step of a process And, the scope of the preferred embodiments of this application includes additional implementations in which the functions may be performed out of the order shown or discussed, including performing the functions in a substantially simultaneous manner or in the reverse order according to the functions involved, which should It is understood by those skilled in the art to which the embodiments of the present application pertain.
在流程图中表示或在此以其他方式描述的逻辑和/或步骤,例如,可以被认为是用于实现逻辑功能的可执行指令的定序列表,可以具体实现在任何计算机可读介质中,以供指令执行系统、装置或设备(如基于计算机的系统、包括处理器的系统或其他可以从指令执行系统、装置或设备取指令并执行指令的系统)使用,或结合这些指令执行系统、装置或设备而使用。就本说明书而言,"计算机可读介质"可以是任何可以包含、存储、通信、传播或传输程序以供指令执行系统、装置或设备或结合这些指令执行系统、装置或设备而使用的装置。计算机可读介质的更具体的示例(非穷尽性列表)包括以下:具有一个或多个布线的电连接部(电子装置),便携式计算机盘盒(磁装置),随机存取存储器(RAM),只读存储器(ROM),可擦除可编辑只读存储器(EPROM或闪速存储器),光纤装置,以及便携式光盘只读存储器(CDROM)。另外,计算机可读介质甚至可以是可在其上打印所述程序的纸或其他合适的介质,因为可以例如通过对纸或其他介质进行光学扫描,接着进行编辑、解译或必要时以其他合适方式进行处理来以电子方式获得所述程序,然后将其存储在计算机存储器中。Logic and / or steps represented in a flowchart or otherwise described herein, for example, a sequenced list of executable instructions that may be considered to implement a logical function, may be embodied in any computer-readable medium, For use by, or in combination with, an instruction execution system, device, or device (such as a computer-based system, a system that includes a processor, or another system that can fetch and execute instructions from an instruction execution system, device, or device) Or equipment. For the purposes of this specification, a "computer-readable medium" may be any device that can contain, store, communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. More specific examples (non-exhaustive list) of computer-readable media include the following: electrical connections (electronic devices) with one or more wirings, portable computer disk cartridges (magnetic devices), random access memory (RAM), Read-only memory (ROM), erasable and editable read-only memory (EPROM or flash memory), fiber optic devices, and portable optical disk read-only memory (CDROM). In addition, the computer-readable medium may even be paper or other suitable medium on which the program can be printed, because, for example, by optically scanning the paper or other medium, followed by editing, interpretation, or other suitable Processing to obtain the program electronically and then store it in computer memory.
应当理解,本申请的各部分可以用硬件、软件、固件或它们的组合来实现。在上述实施方式中,多个步骤或方法可以用存储在存储器中且由合适的指令执行系统执行的软件或固件来实现。例如,如果用硬件来实现,和在另一实施方式中一样,可用本领域公知的下列技术中的任一项或他们的组合来实现:具有用于对数据信号实现逻辑功能的逻辑门电路的离散逻辑电路,具有合适的组合逻辑门电路的专用集成电路,可编程门阵列(PGA),现场可编程门阵列(FPGA)等。It should be understood that each part of the application may be implemented by hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, it may be implemented using any one or a combination of the following techniques known in the art: Discrete logic circuits, application-specific integrated circuits with suitable combinational logic gate circuits, programmable gate arrays (PGA), field programmable gate arrays (FPGA), etc.
本技术领域的普通技术人员可以理解实现上述实施例方法携带的全部或部分步骤是可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,该程序在执行时,包括方法实施例的步骤之一或其组合。A person of ordinary skill in the art can understand that all or part of the steps carried by the methods in the foregoing embodiments can be implemented by a program instructing related hardware. The program can be stored in a computer-readable storage medium. The program is When executed, one or a combination of the steps of the method embodiment is included.
此外,在本申请各个实施例中的各功能单元可以集成在一个处理模块中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。所述集成的模块如果以软件功能模块的形式实现并作为独立的产品销售或使用时,也可以存储在一个计算机可读取存储介质中。In addition, each functional unit in each embodiment of the present application may be integrated into one processing module, or each unit may exist separately physically, or two or more units may be integrated into one module. The above integrated modules may be implemented in the form of hardware or software functional modules. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium.
The aforementioned storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like. Although the embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present application; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present application.
Claims (26)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201811060690.4 | 2018-09-12 | ||
| CN201811060690.4A CN109068036B (en) | 2018-09-12 | 2018-09-12 | Control method and device, depth camera, electronic device and readable storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020052284A1 true WO2020052284A1 (en) | 2020-03-19 |
Family
ID=64760107
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/090020 Ceased WO2020052284A1 (en) | 2018-09-12 | 2019-06-04 | Control method and device, depth camera, electronic device, and readable storage medium |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN109068036B (en) |
| WO (1) | WO2020052284A1 (en) |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108833889B (en) * | 2018-08-22 | 2020-06-23 | Oppo广东移动通信有限公司 | Control method and device, depth camera, electronic device and readable storage medium |
| CN109068036B (en) * | 2018-09-12 | 2020-09-25 | Oppo广东移动通信有限公司 | Control method and device, depth camera, electronic device and readable storage medium |
| CN112351155B (en) * | 2019-08-06 | 2023-02-17 | Oppo(重庆)智能科技有限公司 | Electronic device, anti-candid camera for electronic device and control method thereof |
| CN110418062A (en) * | 2019-08-29 | 2019-11-05 | 上海云从汇临人工智能科技有限公司 | A kind of image pickup method, device, equipment and machine readable media |
| CN113126111B (en) * | 2019-12-30 | 2024-02-09 | Oppo广东移动通信有限公司 | Time-of-flight module and electronic device |
| CN113223209A (en) * | 2020-01-20 | 2021-08-06 | 深圳绿米联创科技有限公司 | Door lock control method and device, electronic equipment and storage medium |
| CN111487633B (en) * | 2020-04-06 | 2024-08-23 | 深圳蚂里奥技术有限公司 | Laser safety control device and method |
| CN111427049B (en) * | 2020-04-06 | 2024-08-27 | 深圳蚂里奥技术有限公司 | Laser safety device and control method |
| CN114531541B (en) * | 2022-01-10 | 2023-06-02 | 荣耀终端有限公司 | Control method and device for camera module |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016064096A2 (en) * | 2014-10-21 | 2016-04-28 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
| CN107607957A (en) * | 2017-09-27 | 2018-01-19 | 维沃移动通信有限公司 | A kind of Depth Information Acquistion system and method, camera module and electronic equipment |
| CN108281880A (en) * | 2018-02-27 | 2018-07-13 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer device, and storage medium |
| CN108376251A (en) * | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer equipment and storage medium |
| CN108376252A (en) * | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer device, and storage medium |
| WO2018144368A1 (en) * | 2017-02-03 | 2018-08-09 | Microsoft Technology Licensing, Llc | Active illumination management through contextual information |
| CN109068036A (en) * | 2018-09-12 | 2018-12-21 | Oppo广东移动通信有限公司 | Control method and device, depth camera, electronic device and readable storage medium |
- 2018
  - 2018-09-12 CN CN201811060690.4A patent/CN109068036B/en active Active
- 2019
  - 2019-06-04 WO PCT/CN2019/090020 patent/WO2020052284A1/en not_active Ceased
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016064096A2 (en) * | 2014-10-21 | 2016-04-28 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
| WO2018144368A1 (en) * | 2017-02-03 | 2018-08-09 | Microsoft Technology Licensing, Llc | Active illumination management through contextual information |
| CN107607957A (en) * | 2017-09-27 | 2018-01-19 | 维沃移动通信有限公司 | A kind of Depth Information Acquistion system and method, camera module and electronic equipment |
| CN108281880A (en) * | 2018-02-27 | 2018-07-13 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer device, and storage medium |
| CN108376251A (en) * | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer equipment and storage medium |
| CN108376252A (en) * | 2018-02-27 | 2018-08-07 | 广东欧珀移动通信有限公司 | Control method, control device, terminal, computer device, and storage medium |
| CN109068036A (en) * | 2018-09-12 | 2018-12-21 | Oppo广东移动通信有限公司 | Control method and device, depth camera, electronic device and readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| CN109068036A (en) | 2018-12-21 |
| CN109068036B (en) | 2020-09-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109068036B (en) | Control method and device, depth camera, electronic device and readable storage medium | |
| CN112702541B (en) | Control method and device, depth camera, electronic device and readable storage medium | |
| CN108833889B (en) | Control method and device, depth camera, electronic device and readable storage medium | |
| CN109149355B (en) | Light emission module and control method thereof, TOF depth camera and electronic device | |
| US11335028B2 (en) | Control method based on facial image, related control device, terminal and computer device | |
| CN108281880A (en) | Control method, control device, terminal, computer device, and storage medium | |
| WO2020038060A1 (en) | Laser projection module and control method therefor, and image acquisition device and electronic apparatus | |
| CN109271916B (en) | Electronic device, control method thereof, control device, and computer-readable storage medium | |
| CN108333860B (en) | Control method, control device, depth camera and electronic device | |
| CN109031252B (en) | Calibration method, calibration controller and calibration system | |
| CN108227361B (en) | Control method, control device, depth camera and electronic device | |
| WO2016010481A1 (en) | Optoelectronic modules operable to distinguish between signals indicative of reflections from an object of interest and signals indicative of a spurious reflection | |
| TWI684026B (en) | Control method, control device, depth camera and electronic device | |
| CN108594451B (en) | Control method, control device, depth camera, and electronic device | |
| CN108960061A (en) | Control method, control device, electronic device, computer equipment and storage medium | |
| CN108376252B (en) | Control method, control device, terminal, computer device, and storage medium | |
| CN108509867A (en) | Control method, control device, depth camera and electronic device | |
| CN113325391A (en) | Wide-angle TOF module and application thereof | |
| US10551500B2 (en) | Infrared optical element for proximity sensor system | |
| HK40016464A (en) | Control method, control apparatus, terminal, computer device, and storage medium | |
| CN120909002A (en) | Intelligent glasses, shielding detection device and electronic equipment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19859708; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19859708; Country of ref document: EP; Kind code of ref document: A1 |