
HK1262801B - Data processing method, device, computer readable storage medium and electronic equipment

Data processing method, device, computer readable storage medium and electronic equipment

Info

Publication number
HK1262801B
HK1262801B
Authority
HK
Hong Kong
Prior art keywords
processor
image
image processor
camera
interface
Prior art date
Application number
HK19123022.6A
Other languages
Chinese (zh)
Other versions
HK1262801A1 (en)
Inventor
周海涛
郭子青
Original Assignee
Oppo广东移动通信有限公司
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of HK1262801A1
Publication of HK1262801B


Description

Data processing method and device, computer readable storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
As more functions are integrated into a smart terminal, the requirements on its hardware and software increase. For example, a smart terminal may be used to make phone calls, play games, shop, take pictures, and the like. To provide a photographing function, a camera needs to be installed on the terminal; to provide a calling function, a microphone and a receiver are needed. A smart terminal can therefore remain portable only if its hardware resources are highly integrated. Meanwhile, when multiple applications run simultaneously, the terminal's memory and processor are heavily consumed, so reducing the consumption of terminal resources is important.
Disclosure of Invention
Embodiments of the application provide a data processing method and apparatus, a computer-readable storage medium, and an electronic device, which can save resources of the electronic device.
A method of data processing, the method comprising:
when it is detected that the electronic device starts the front camera, controlling the first image processor to disconnect from a second processor interface connected to the rear camera, the first image processor being connected to the front camera through a first processor interface; and
controlling the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
A data processing apparatus, the apparatus comprising:
a start detection module, configured to control the first image processor to disconnect from a second processor interface connected to the rear camera when it is detected that the electronic device starts the front camera, the first image processor being connected to the front camera through a first processor interface; and
an interface connection module, configured to control the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the steps of:
when it is detected that the electronic device starts the front camera, controlling the first image processor to disconnect from a second processor interface connected to the rear camera, the first image processor being connected to the front camera through a first processor interface; and
controlling the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
when it is detected that the electronic device starts the front camera, controlling the first image processor to disconnect from a second processor interface connected to the rear camera, the first image processor being connected to the front camera through a first processor interface; and
controlling the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
With the data processing method and apparatus, the computer-readable storage medium, and the electronic device described above, when the electronic device detects that the front camera has started, the first image processor can be controlled to disconnect from the second processor interface connected to the rear camera and then to connect to the second image processor through that interface. When an electronic device shoots with its cameras, generally only one of the front camera and the rear camera is active at a time. Thus, when the front camera is detected to have started, the second processor interface that normally serves the rear camera can instead connect to the second image processor; the rear camera and the second image processor time-share the second processor interface, saving resources of the electronic device.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a diagram of an application environment of a data processing method in one embodiment;
FIG. 2 is a flow diagram of a data processing method in one embodiment;
FIG. 3 is a schematic diagram showing an internal configuration of an electronic apparatus according to an embodiment;
FIG. 4 is a flowchart of a data processing method in another embodiment;
FIG. 5 is a flowchart of a data processing method in yet another embodiment;
FIG. 6 is a flowchart of a data processing method in yet another embodiment;
FIG. 7 is a flowchart of a data processing method in yet another embodiment;
FIG. 8 is a schematic diagram showing an internal configuration of an electronic apparatus according to another embodiment;
FIG. 9 is a diagram of a software architecture for implementing a data processing method according to one embodiment;
FIG. 10 is a schematic diagram showing the structure of a data processing apparatus according to an embodiment;
FIG. 11 is a schematic diagram showing the structure of a data processing apparatus according to another embodiment;
FIG. 12 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may describe various elements, but these elements are not limited by these terms; the terms serve only to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a diagram of an application environment of a data processing method in one embodiment. As shown in fig. 1, the application environment includes an electronic device 10 on which a front camera 102 and a rear camera 104 are installed; the electronic device 10 further includes a first image processor and a second image processor. The first image processor is connected to the front camera 102 through a first processor interface and to the rear camera 104 through a second processor interface, and the second image processor is connected to the front camera 102. When it is detected that the electronic device 10 starts the front camera 102, the first image processor can be controlled to disconnect from the second processor interface connected to the rear camera 104; the first image processor is then controlled to connect to the second image processor through the second processor interface. The electronic device 10 may be any device on which a front camera and a rear camera are mounted, which is not limited in this embodiment; for example, it may be a personal computer, a mobile terminal, a personal digital assistant, a wearable electronic device, and the like.
FIG. 2 is a flow diagram of a data processing method in one embodiment. As shown in fig. 2, the data processing method includes steps 202 to 204. Wherein:
Step 202, when it is detected that the electronic device starts the front camera, controlling the first image processor to disconnect from a second processor interface connected to the rear camera; the first image processor is connected to the front camera through a first processor interface.
In one embodiment, a camera may be mounted on the electronic device, and images may be acquired through the mounted camera. Cameras can be divided into types such as laser cameras and visible light cameras according to the images they acquire: a laser camera acquires an image formed by laser light irradiating an object, and a visible light camera acquires an image formed by visible light irradiating an object. Several cameras may be installed on the electronic device, and their installation positions are not limited. For example, one camera may be installed on the front panel of the electronic device and two on its back panel; cameras may also be embedded inside the electronic device and opened by rotating or sliding.
An image processor is a processor that can process images acquired by a camera. The image processor is connected to the camera, and images acquired by the camera can be transmitted to the image processor for cropping, brightness adjustment, face detection, and the like. The front camera and the rear camera acquire images from different viewing angles: the front camera from the front of the electronic device, and the rear camera from its back. Both cameras are connected to the image processor, which processes the acquired images.
In the embodiments provided in the present application, the electronic device includes a first image processor and a second image processor, both of which can process images. Specifically, the front camera and the rear camera are both connected to the first image processor, and the front camera is also connected to the second image processor. The first image processor can process images acquired by either camera, while the second image processor generally processes only images acquired by the front camera. A processor interface connects an image processor to other components; the first image processor includes a first processor interface and a second processor interface, through which it is connected to the front camera and the rear camera, respectively.
Fig. 3 is a schematic diagram of an internal structure of an electronic device in one embodiment. As shown in fig. 3, the electronic device includes a front camera 30, a first image processor 31, a second image processor 32, and a rear camera 33; the front camera 30 includes a laser camera 302 and a visible light camera 304. The front camera 30 is connected to the first image processor 31 through a first processor interface 34, the first image processor 31 is connected to the rear camera 33 through a second processor interface 35, and the front camera 30 is connected to the second image processor 32 through a third processor interface 36. When the electronic device detects that the front camera 30 is activated, the first image processor 31 is controlled to disconnect from the second processor interface 35 connected to the rear camera 33, and is then controlled to connect to the second image processor 32 through the second processor interface 35. It is understood that the laser camera 302 can generally acquire a speckle image formed by laser speckles irradiating an object, and a depth image can be calculated from the speckle image. The acquired depth image can be used for verification such as payment and unlocking. The first image processor 31 may be the Central Processing Unit (CPU) of the electronic device, and the second image processor 32 may be an external microcontroller unit (MCU). Being an external processor isolated from the electronic device's system, the external MCU can ensure the security of data processing. When images are acquired by the front camera 30 for operations such as payment and unlocking, the acquired speckle images can be processed by the external MCU (the second image processor 32) to ensure data security; a depth image is calculated there and sent to a trusted operating environment in the central processing unit (the first image processor 31) for subsequent processing.
Step 204, controlling the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
Because the front camera and the rear camera acquire images from different viewing angles, they are generally not started simultaneously when the electronic device captures images. To save interface resources of the image processor, when the electronic device starts the front camera, the first image processor may be disconnected from the second processor interface connected to the rear camera, and the first image processor and the second image processor may then be connected through the second processor interface. After this connection, the front camera is connected to both the first image processor and the second image processor, and the two image processors are connected to each other. An image generated by the front camera can be sent directly to the first image processor for processing, or sent first to the second image processor and then forwarded to the first image processor; that is, the two image processors can jointly complete the processing of the image.
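As a rough illustration of this switching logic, the following minimal sketch models the second processor interface as a two-way switch. All names here (ProcessorInterfaceController, MipiSwitch, the endpoint constants) are illustrative assumptions, not part of the patent's disclosure:

```java
// Illustrative sketch only: models the time-shared second processor interface
// as a switch with two selectable endpoints. Names are assumptions.
interface MipiSwitch {
    enum Endpoint { REAR_CAMERA, SECOND_IMAGE_PROCESSOR }
    void connect(Endpoint endpoint);
    void disconnect(Endpoint endpoint);
}

public class ProcessorInterfaceController {
    private final MipiSwitch secondInterface;

    public ProcessorInterfaceController(MipiSwitch secondInterface) {
        this.secondInterface = secondInterface;
    }

    /** Front camera detected as started: free the interface, then reuse it. */
    public void onFrontCameraStarted() {
        secondInterface.disconnect(MipiSwitch.Endpoint.REAR_CAMERA);
        secondInterface.connect(MipiSwitch.Endpoint.SECOND_IMAGE_PROCESSOR);
    }

    /** Front camera closed or rear camera started: restore the rear camera link. */
    public void onRearCameraNeeded() {
        secondInterface.disconnect(MipiSwitch.Endpoint.SECOND_IMAGE_PROCESSOR);
        secondInterface.connect(MipiSwitch.Endpoint.REAR_CAMERA);
    }
}
```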
In the data processing method provided by the above embodiment, when the electronic device detects that the front camera has started, it can control the first image processor to disconnect from the second processor interface connected to the rear camera and to establish a connection with the second image processor through that interface. When an electronic device shoots with its cameras, generally only one of the front camera and the rear camera is active at a time; thus, when the front camera is detected to have started, the second processor interface connected to the rear camera can instead connect to the second image processor. The rear camera and the second image processor time-share the second processor interface, saving resources of the electronic device.
Fig. 4 is a flowchart of a data processing method in another embodiment. As shown in fig. 4, the data processing method includes steps 402 to 410. Wherein:
Step 402, when it is detected that the electronic device starts the front camera, acquiring an image acquisition instruction.
In one embodiment, the electronic device may monitor the state of the front camera in real time, and when it detects that the front camera has started, it detects in real time whether an image acquisition instruction is received. The state of the front camera can be monitored in software or in hardware. Specifically, when the front camera is started, the system of the electronic device may broadcast the state of the front camera to each application (APP), and an application that has registered a broadcast receiver can monitor that state. In hardware, an electrical signal of the front camera may be detected; if a signal such as a voltage, a current, or an I/O (Input/Output) signal from the front camera is detected, the front camera is considered activated. A sketch of the software approach follows.
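A minimal Android-flavored sketch of the broadcast-based monitoring described above. Note that stock Android does not define a public camera-state broadcast, so the action string and extra key below are hypothetical vendor conventions:

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;

public class FrontCameraStateReceiver extends BroadcastReceiver {
    // Hypothetical vendor broadcast; not a standard Android action.
    static final String ACTION_FRONT_CAMERA_STATE = "com.example.action.FRONT_CAMERA_STATE";

    @Override
    public void onReceive(Context context, Intent intent) {
        boolean started = intent.getBooleanExtra("started", false);
        if (started) {
            // Front camera is active: start watching for image acquisition instructions.
        } else {
            // Front camera closed: e.g., start the reset timer described later.
        }
    }

    static void register(Context context) {
        context.registerReceiver(new FrontCameraStateReceiver(),
                new IntentFilter(ACTION_FRONT_CAMERA_STATE));
    }
}
```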
An application program may be installed on the electronic device and may call the camera to acquire images. An image acquisition instruction is an instruction for acquiring an image through a camera; it may be input by a user or generated automatically by the electronic device. For example, when taking a picture, a user can directly call the camera to shoot. A user can also perform unlocking verification by face: when the user lights up the screen, the electronic device automatically generates an image acquisition instruction and obtains an image according to that instruction for unlocking verification.
After the front camera is activated, it may not capture an image immediately. For example, when taking a picture, the shooting process generally divides into a preview phase and a shooting phase. During the preview phase, the camera acquires the current picture in real time as a preview image, which is displayed on the screen of the electronic device. Preview images collected during previewing are not stored; they let the user adjust the current shooting position and angle in real time. Only after adjusting the position and angle does the user input an image acquisition instruction, upon which the electronic device shoots an image. Therefore, the electronic device need not connect the second processor interface to the second image processor immediately after detecting that the front camera is activated; connecting only after the image acquisition instruction is acquired avoids frequent switching of the processor interface and saves resources.
Step 404, if the acquired image acquisition instruction is a depth image acquisition instruction, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
Specifically, the front camera includes a laser camera and a visible light camera. The laser camera can acquire a speckle image formed by laser speckles irradiating an object, and a depth image can be calculated from the speckle image. The visible light camera can acquire an RGB (Red, Green, Blue) image; both the RGB image and the depth image are composed of pixel points. Generally, the acquired RGB image corresponds to the depth image: a pixel value in the RGB image represents the color of a pixel point, and a pixel value in the depth image represents its depth.
When the front camera is turned on and an image acquisition instruction is acquired, it can be judged whether the instruction is for acquiring an RGB image or a depth image. If an RGB image is to be acquired, the RGB image acquired by the front camera can be sent directly to the first image processor through the first processor interface for processing. If a depth image is to be acquired, it can be assumed that the depth image will be used for processing with higher security requirements, such as payment and unlocking; the speckle image acquired by the front camera can then be sent directly to the second image processor to calculate the depth image, and the second image processor sends the depth image to the first image processor through the second processor interface for processing.
In an embodiment, the image acquisition instruction may include a type identifier indicating the type of image to be captured, and whether the instruction is for an RGB image or a depth image can be determined from that identifier. For example, the type identifier may be "RGBget" or "Depthget", where "RGBget" indicates that the instruction acquires an RGB image and "Depthget" that it acquires a depth image. When the acquired instruction is a depth image acquisition instruction, the first image processor can be controlled to disconnect from the second processor interface connected to the rear camera and to connect to the second image processor through that interface, as in the sketch below.
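A small sketch of this dispatch, using the "RGBget"/"Depthget" identifiers from the example above and the hypothetical ProcessorInterfaceController from the earlier sketch:

```java
// Dispatch on the type identifier carried by an image acquisition instruction.
// "RGBget" and "Depthget" follow the example identifiers in the text above.
public final class CaptureDispatcher {
    private final ProcessorInterfaceController controller;

    public CaptureDispatcher(ProcessorInterfaceController controller) {
        this.controller = controller;
    }

    public void dispatch(String typeIdentifier) {
        if ("Depthget".equals(typeIdentifier)) {
            // Depth capture: reuse the second processor interface for the
            // second image processor so the depth path stays secure.
            controller.onFrontCameraStarted();
        }
        // "RGBget": the first processor interface already carries the RGB
        // image to the first image processor, so no switch is needed.
    }
}
```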
Step 406, controlling the first image processor to connect to the second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
In one embodiment, the first processor interface and the second processor interface may carry data between an image processor and a camera, and also between two image processors. For example, they may be MIPI (Mobile Industry Processor Interface) interfaces. The switching of the processor interface may be implemented in software or in hardware. In software, an interface conversion instruction can be initiated for the first image processor, connecting the second processor interface to the second image processor. In hardware, the second processor interface can be switched to the second image processor directly through a switching circuit. A hedged sketch of one possible software path follows.
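Purely as an illustration of the software route, the sketch below programs a hypothetical vendor-exposed mux node; the actual mechanism (driver command or dedicated switching circuit) is platform-specific and not specified by the patent:

```java
import java.io.FileWriter;
import java.io.IOException;

// Implements the MipiSwitch abstraction from the earlier sketch by writing to
// a hypothetical sysfs node exposed by a vendor MIPI mux driver.
public class SysfsMipiSwitch implements MipiSwitch {
    private static final String MUX_NODE = "/sys/devices/platform/mipi_mux/route"; // assumed path

    @Override
    public void connect(Endpoint endpoint) {
        writeRoute(endpoint.name()); // e.g. "SECOND_IMAGE_PROCESSOR"
    }

    @Override
    public void disconnect(Endpoint endpoint) {
        writeRoute("NONE"); // leave the interface unrouted before reconnecting
    }

    private void writeRoute(String value) {
        try (FileWriter writer = new FileWriter(MUX_NODE)) {
            writer.write(value);
        } catch (IOException e) {
            throw new IllegalStateException("failed to program MIPI mux", e);
        }
    }
}
```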
Step 408, when it is detected that the front camera is closed or the rear camera is started, controlling the first image processor to disconnect from the second processor interface connected to the second image processor.
After the first image processor has been disconnected from the rear camera and connected to the second image processor, the connection between the rear camera and the first image processor needs to be reestablished to keep the rear camera operational. Specifically, the states of the front and rear cameras can be detected, and when the front camera is detected to be closed or the rear camera is detected to be started, the connection between the first image processor and the rear camera is reestablished so that the rear camera works normally.
Step 410, controlling the first image processor to connect to the rear camera through the second processor interface.
In an embodiment, when the acquired image acquisition instruction is detected to be a depth image acquisition instruction, an application identifier contained in the instruction may be acquired, the application program that initiated the instruction may be determined from the identifier, and whether the second processor interface needs to be connected to the second image processor may be decided according to that application program. The method specifically includes the following steps:
Step 502, if the acquired image acquisition instruction is a depth image acquisition instruction, acquiring an application identifier contained in the instruction, where the application identifier identifies the application program that issued the depth image acquisition instruction.
Specifically, an application program may be installed on the electronic device. An application program is software written for a certain purpose, through which the electronic device provides services to the user. For example, the user may play games through a game application, pay for transactions through a payment application, play music through a music application, and so on. The application identifier marks the application program that issued the depth image acquisition instruction, so the issuing application can be identified from it.
Step 504, if the application identifier is a preset application identifier, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
It can be understood that the application operations a user completes through the electronic device are all implemented through application programs, which can be divided into secure applications and non-secure applications. Secure applications have high requirements on data security, while non-secure applications have relatively low requirements. For example, payment applications require high data security, while game applications require relatively little. Whether the depth image needs to be acquired through the secure channel can be determined from the application that initiated the depth image acquisition instruction.
Specifically, if the application identifier is a preset application identifier, the depth image is considered likely to be used for an application operation with high security requirements, and it can be obtained through a secure channel: the first image processor is controlled to disconnect from the second processor interface connected to the rear camera and to connect to the second image processor through that interface. The speckle images acquired by the front camera can then be sent to the second image processor, the depth image is calculated there, and the calculated depth image is sent by the second image processor to the first image processor. Since the second image processor is an external microcontroller unit, the image can be processed securely.
Further, a secure application may acquire a depth image either for a higher-security application operation or for a lower-security one. For example, a payment application may acquire a depth image to complete payment verification, but may also acquire one to implement AR (Augmented Reality) effects. Whether the processor interface needs to be switched can therefore be judged from the specific application operation. After the application identifier contained in the depth image acquisition instruction is judged to be the preset application identifier, the method further includes the following steps:
Step 602, acquiring an operation identifier contained in the depth image acquisition instruction, where the operation identifier identifies the application operation to be completed through the acquired depth image.
In one embodiment, the operation identifier indicates the application operation that needs to be completed through the acquired depth image. The electronic device may preset an operation identifier for each application operation, so the operation can be identified from its identifier. For example, if beautification of a portrait in an RGB image is to be completed with the collected depth image, the operation identifier is the one corresponding to beautification; if a 2D image is to be converted into a 3D image with the acquired depth image, the operation identifier is the one corresponding to 3D conversion.
Step 604, if the operation identifier is a preset operation identifier, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
If the operation identifier is a preset operation identifier, the acquired depth image is considered to be used for an application operation with higher security requirements; the first image processor can be disconnected from the rear camera and connected to the second image processor through the second processor interface, as sketched below.
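A toy gating sketch combining the application-identifier and operation-identifier checks. The identifier values and list contents are illustrative assumptions:

```java
import java.util.Set;

// Decide whether a depth image acquisition instruction must take the secure
// path (i.e., whether to switch the second processor interface to the second
// image processor). Preset identifiers here are made-up examples.
public class SecureCaptureGate {
    private final Set<String> presetAppIds = Set.of("com.example.pay", "com.example.unlock");
    private final Set<String> presetOperationIds = Set.of("PAY_VERIFY", "FACE_UNLOCK");

    public boolean requiresSecurePath(String appId, String operationId) {
        // Both checks must pass: a secure app performing a secure operation.
        return presetAppIds.contains(appId) && presetOperationIds.contains(operationId);
    }
}
```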
After the rear camera has been disconnected from the first image processor, if the front camera is detected to be closed, the rear camera and the first image processor can be reconnected immediately, or after a period of time. The process of reestablishing the connection may specifically include:
and step 702, when the fact that the front camera is turned off is detected, timing is started.
A timer may be set in the electronic device and started when the front camera is detected to be turned off. Taking the Android system as an example, the system may predefine a timer; when it detects that the front camera has been turned off, it clears any previous timing (for example, via the timer's cancel() method) and starts it again.
Step 704, when the timing duration exceeds a duration threshold, controlling the first image processor to disconnect from the second processor interface connected to the second image processor.
When the timing duration exceeds the duration threshold, the first image processor is controlled to reconnect to the rear camera through the second processor interface. The duration threshold may be set by the user or preset by the system. For example, with a threshold of 5 seconds, when the front camera is detected to have been off for more than 5 seconds, the first image processor may be controlled to reconnect to the rear camera. This prevents the processor interface from being switched frequently, and the electronic device's power from being wasted, when a user turns the front camera off by mistake and then turns it back on.
In one embodiment, the electronic device may count the starting frequency of the front camera and adjust the duration threshold according to it. Generally, the higher the starting frequency, the more likely the front camera will be turned on again soon after being turned off, so the second processor interface may be reconnected to the rear camera after a longer interval. Specifically, the starting frequency of the front camera is counted, and the corresponding duration threshold is obtained from it.
The electronic device can record the operating data of the front camera, such as start time, close time, and start duration. It may obtain the recorded historical operating data and count the starting frequency from it; the counted frequency may be the number of starts within a certain period, for example the average number of starts of the front camera per day. A correspondence between starting frequency and duration threshold is then established, and the duration threshold is obtained from the counted starting frequency.
Step 706, controlling the first image processor to connect to the rear camera through the second processor interface.
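A hedged Android-style sketch of this delayed reset, reusing the hypothetical ProcessorInterfaceController from the earlier sketch; the frequency-to-threshold mapping is an invented example (the 5-second case matches the text's example):

```java
import android.os.Handler;
import android.os.Looper;

// After the front camera closes, wait for a duration threshold (derived here
// from an assumed start-frequency mapping) before handing the second
// processor interface back to the rear camera.
public class InterfaceResetTimer {
    private final Handler handler = new Handler(Looper.getMainLooper());
    private Runnable pendingReset;

    public void onFrontCameraClosed(ProcessorInterfaceController controller,
                                    double startsPerDay) {
        pendingReset = controller::onRearCameraNeeded;
        handler.postDelayed(pendingReset, thresholdMillis(startsPerDay));
    }

    /** Cancel the pending reset if the front camera is reopened in time. */
    public void onFrontCameraReopened() {
        if (pendingReset != null) {
            handler.removeCallbacks(pendingReset);
            pendingReset = null;
        }
    }

    // Illustrative mapping: the more often the camera is started, the longer we wait.
    private long thresholdMillis(double startsPerDay) {
        if (startsPerDay > 50) return 15_000L;
        if (startsPerDay > 10) return 10_000L;
        return 5_000L; // matches the 5-second example in the text
    }
}
```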
In the data processing method provided by the above embodiment, when the electronic device detects that the front camera has started, it can, according to the image acquisition instruction, control the first image processor to disconnect from the second processor interface connected to the rear camera and to establish a connection with the second image processor through that interface. Since generally only one of the front and rear cameras is active at a time, the second processor interface connected to the rear camera can be reused to connect the second image processor, realizing time-sharing of the interface and saving resources of the electronic device. Meanwhile, the processor interface is switched only when a depth image is to be collected and processed by the second image processor, so whether switching is needed can be judged more accurately, avoiding wasted resources.
It should be understood that although the steps in the flowcharts of fig. 2, 4, 5, 6, and 7 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in fig. 2, 4, 5, 6, and 7 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and not necessarily sequentially; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 8 is a schematic diagram of the internal structure of the electronic device in another embodiment. As shown in fig. 8, the electronic device may include a front camera module 810, a first image processor 820, a second image processor 830, and the like; the front camera module 810 includes a laser camera 812, a floodlight 814, a visible light camera 816, and a laser light 818. The first image processor 820 includes a first image processor kernel running under a TEE (Trusted Execution Environment) and a first image processor kernel running under a REE (Rich Execution Environment); both the TEE and the REE are running modes of an ARM (Advanced RISC Machines) module. Generally, operations with higher security requirements in the electronic device need to be executed under the TEE, while other operations can be executed under the REE. The second image processor 830 is an external microprocessor and includes a PWM (Pulse Width Modulation) module 832, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) module 834, a RAM (Random Access Memory) module 836, and a Depth Engine module 838. In the embodiment of the application, when the first image processor 820 receives a depth image acquisition instruction from an application program, for example when the application needs face information for unlocking or for payment, the first image processor kernel running under the TEE can send the instruction to the SPI/I2C module 834 in the second image processor 830 through a SECURE SPI/I2C channel. The second image processor 830 can then transmit pulse waves through the PWM module 832 to turn on the floodlight 814 in the front camera module 810 to acquire an infrared image, and to turn on the laser light 818 to acquire a speckle image. The front camera module 810 transmits the collected speckle images to the Depth Engine module 838 in the second image processor 830, which calculates a depth image from the speckle images and sends the infrared image and the depth image to the first image processor 820. The first image processor 820 performs face recognition on the acquired infrared image, detecting whether a face exists in the image and whether the detected face matches a stored face; if the face passes recognition, living-body detection is performed on the infrared image and the depth image to detect whether the face is a living one. In one embodiment, after acquiring the infrared image and the depth image, the first image processor 820 may perform living-body detection before face recognition, or perform the two simultaneously. The first image processor 820 may transmit the results of face recognition and living-body detection to the application program, which performs processing such as unlocking and payment according to those results.
FIG. 9 is a diagram illustrating a software architecture for implementing a data processing method according to an embodiment. As shown in fig. 9, the software architecture includes an application layer 910, an operating system 920, and a secure operating environment 930. The modules in the secure operating environment 930 include a second image processor 931, a camera module 932, a first image processor 933, an encryption module 934, and the like; the operating system 920 includes a security management module 921, a face management module 922, a camera driver 923, and a camera framework 924; the application layer 910 contains an application program 911. The application 911 can initiate an image acquisition instruction and send it to the second image processor 931 for processing. For example, when payment, unlocking, beautification, or AR (Augmented Reality) operations are performed by collecting a face, the application initiates an image acquisition instruction for collecting a face image. It will be appreciated that image instructions initiated by the application 911 may be sent first to the first image processor 933 and then forwarded by the first image processor 933 to the second image processor 931.
After the second image processor 931 receives the image acquisition instruction, if it determines that the corresponding application operation is a secure operation (e.g., a payment or unlocking operation), it controls the camera module 932 according to the instruction to collect an infrared image and a speckle image, and the camera module 932 transmits them to the second image processor 931. The second image processor 931 calculates a depth image containing depth information from the speckle image, calculates a depth parallax image from the depth image, and calculates an infrared parallax image from the infrared image. The depth parallax image and the infrared parallax image are then transmitted to the first image processor 933 through a secure transmission channel. The first image processor 933 corrects the infrared parallax image to obtain a corrected infrared image, and corrects the depth parallax image to obtain a corrected depth image. It then performs face authentication on the corrected infrared image, detecting whether a face exists in it and whether the detected face matches the stored face; if the face passes authentication, living-body detection is performed on the corrected infrared image and the corrected depth image to judge whether the face is a living one. The face recognition result obtained by the first image processor 933 may be sent to the encryption module 934 and, after encryption, to the security management module 921. Generally, each application program 911 has a corresponding security management module 921, which decrypts the encrypted face recognition result and sends the decrypted result to the corresponding face management module 922. The face management module 922 sends the result to the upper-layer application 911, which operates accordingly.
If the application operation corresponding to the image acquisition instruction received by the second image processor 931 is a non-secure operation (e.g., beautification or AR), the second image processor 931 may control the camera module 932 to collect a speckle image, calculate a depth image from it, and then obtain a depth parallax image from the depth image. The second image processor 931 sends the depth parallax image to the camera driver 923 through a non-secure transmission channel; the camera driver 923 corrects it to obtain a corrected depth image and sends that to the camera framework 924, which forwards it to the face management module 922 or the application program 911.
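A compact sketch of this secure/non-secure routing decision; the enum values and channel stubs are stand-ins for the modules named above, not the patent's actual interfaces:

```java
// Route the computed depth parallax image according to the security of the
// requested operation, mirroring the two channels described above.
public class DepthImageRouter {
    public enum Operation { PAYMENT, UNLOCK, BEAUTY, AR }

    public void route(Operation op, byte[] depthParallaxImage) {
        if (op == Operation.PAYMENT || op == Operation.UNLOCK) {
            // Secure operation: hand off to the first image processor (TEE side).
            sendOverSecureChannel(depthParallaxImage);
        } else {
            // Non-secure operation: hand off to the camera driver for correction.
            sendOverNormalChannel(depthParallaxImage);
        }
    }

    private void sendOverSecureChannel(byte[] data) { /* platform-specific stub */ }
    private void sendOverNormalChannel(byte[] data) { /* platform-specific stub */ }
}
```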
FIG. 10 is a block diagram of a data processing apparatus according to an embodiment. As shown in fig. 10, the data processing apparatus 1000 includes a start detection module 1002 and an interface switching module 1004. Wherein:
The start detection module 1002 is configured to control the first image processor to disconnect from the second processor interface connected to the rear camera when it is detected that the electronic device starts the front camera; the first image processor is connected to the front camera through a first processor interface.
The interface switching module 1004 is configured to control the first image processor to connect to a second image processor through the second processor interface, where the second image processor is connected to the front camera.
In the data processing apparatus provided by the above embodiment, when the electronic device detects that the front camera has started, it can control the first image processor to disconnect from the second processor interface connected to the rear camera and to establish a connection with the second image processor through that interface. Since generally only one of the front and rear cameras is active at a time, the second processor interface connected to the rear camera can be reused to connect the second image processor, realizing time-sharing of the interface and saving resources of the electronic device.
Fig. 11 is a schematic structural diagram of a data processing apparatus according to another embodiment. As shown in fig. 11, the data processing apparatus 1100 includes a start detection module 1102 and an interface switching module 1104. Wherein:
The start detection module 1102 is configured to control the first image processor to disconnect from the second processor interface connected to the rear camera when it is detected that the electronic device starts the front camera; the first image processor is connected to the front camera through a first processor interface.
The interface switching module 1104 is configured to control the first image processor to connect to a second image processor through the second processor interface, where the second image processor is connected to the front camera.
The interface reset module 1106 is configured to start timing when it is detected that the front camera is turned off; when the timing duration exceeds a duration threshold, to control the first image processor to disconnect from the second processor interface connected to the second image processor; and to control the first image processor to connect to the rear camera through the second processor interface.
In the data processing apparatus provided by the above embodiment, when the electronic device detects that the front camera has started, it can control the first image processor to disconnect from the second processor interface connected to the rear camera and to establish a connection with the second image processor through that interface. Since generally only one of the front and rear cameras is active at a time, the second processor interface connected to the rear camera can be reused to connect the second image processor, realizing time-sharing of the interface and saving resources of the electronic device.
In one embodiment, the interface switching module 1104 is further configured to acquire an image acquisition instruction when it is detected that the electronic device starts the front camera, and, if the acquired instruction is a depth image acquisition instruction, to control the first image processor to disconnect from the second processor interface connected to the rear camera.
In an embodiment, the interface switching module 1104 is further configured to, if the acquired image acquisition instruction is a depth image acquisition instruction, acquire the application identifier contained in the instruction, where the application identifier identifies the application program that issued the instruction; and, if the application identifier is a preset application identifier, to control the first image processor to disconnect from the second processor interface connected to the rear camera.
In an embodiment, the interface switching module 1104 is further configured to acquire the operation identifier contained in the depth image acquisition instruction, where the operation identifier indicates the application operation to be completed through the acquired depth image; and, if the operation identifier is a preset operation identifier, to control the first image processor to disconnect from the second processor interface connected to the rear camera.
In one embodiment, the interface reset module 1106 is further configured to start timing when it is detected that the front camera is turned off; when the timing duration exceeds a duration threshold, to control the first image processor to disconnect from the second processor interface connected to the second image processor; and to control the first image processor to connect to the rear camera through the second processor interface.
In one embodiment, the interface reset module 1106 is further configured to count the starting frequency of the front camera and obtain the corresponding duration threshold according to that frequency.
The division of the modules in the data processing apparatus is only for illustration, and in other embodiments, the data processing apparatus may be divided into different modules as needed to complete all or part of the functions of the data processing apparatus.
The embodiments of the application also provide a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the data processing method provided by the above embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the data processing method provided by the above embodiments.
The embodiments of the application also provide a mobile terminal. The mobile terminal includes an image processing circuit, which may be implemented using hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 12 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 12, for convenience of explanation, only the aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 12, the image processing circuit includes an ISP processor 1240 and control logic 1250. Image data captured by the imaging device 1210 is first processed by the ISP processor 1240, which analyzes it to capture image statistics that may be used to determine and/or control one or more parameters of the imaging device 1210. The imaging device 1210 may include a camera having one or more lenses 1212 and an image sensor 1214. The image sensor 1214 may include an array of color filters (e.g., Bayer filters); it can acquire the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data processable by the ISP processor 1240. The sensor 1220 (e.g., a gyroscope) may provide image-processing parameters (e.g., anti-shake parameters) to the ISP processor 1240 based on the sensor 1220 interface type. The sensor 1220 interface may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 1214 may also send raw image data to the sensor 1220; the sensor 1220 may provide the raw image data to the ISP processor 1240 based on the sensor 1220 interface type, or store it in the image memory 1230.
The ISP processor 1240 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 1240 may perform one or more image processing operations on the raw image data and collect statistical information about it. The image processing operations may be performed with the same or different bit-depth precision.
The ISP processor 1240 may also receive image data from the image memory 1230. For example, the sensor 1220 interface sends raw image data to the image memory 1230, and the raw image data in the image memory 1230 is then provided to the ISP processor 1240 for processing. The image memory 1230 may be part of a memory device, a storage device, or a separate dedicated memory within the mobile terminal, and may include DMA (Direct Memory Access) features.
ISP processor 1240 may perform one or more image processing operations, such as temporal filtering, upon receiving raw image data from image sensor 1214 interface or from sensor 1220 interface or from image memory 1230. The processed image data may be sent to image memory 1230 for additional processing before being displayed. ISP processor 1240 receives processed data from image memory 1230 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. Image data processed by ISP processor 1240 may be output to display 1270 for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). In addition, the output of ISP processor 1240 can also be sent to image memory 1230 and display 1270 can read image data from image memory 1230. In one embodiment, image memory 1230 may be configured to implement one or more frame buffers. Further, the output of ISP processor 1240 may be transmitted to encoder/decoder 1260 for encoding/decoding of image data. The encoded image data may be saved and decompressed before being displayed on the display 1270 device. The encoder/decoder 1260 may be implemented by a CPU or GPU or coprocessor.
The statistics determined by ISP processor 1240 may be sent to control logic 1250 unit. For example, the statistical data may include image sensor 1214 statistical information such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 1212 shading correction, and the like. Control logic 1250 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 1210 and control parameters of ISP processor 1240 based on the received statistical data. For example, the control parameters of imaging device 1210 may include sensor 1220 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 1212 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 1212 shading correction parameters.
In the embodiments of the present application, the mobile terminal implements the steps of the data processing method described herein when it executes the computer program stored in its memory.
Any reference to memory, storage, database, or other media used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not be construed as limiting the scope of the application. It should be noted that, for those skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of data processing, the method comprising:
when it is detected that an electronic device activates a front camera, controlling a first image processor to disconnect from a second processor interface connected to a rear camera, wherein the first image processor is connected to the front camera through a first processor interface; and
controlling the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
2. The method of claim 1, wherein, when it is detected that the electronic device activates the front camera, controlling the first image processor to disconnect from the second processor interface connected to the rear camera comprises:
when it is detected that the electronic device activates the front camera, acquiring an image capture instruction; and
if the acquired image capture instruction is a depth image capture instruction, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
3. The method of claim 2, wherein, if the acquired image capture instruction is a depth image capture instruction, controlling the first image processor to disconnect from the second processor interface connected to the rear camera comprises:
if the acquired image capture instruction is a depth image capture instruction, acquiring an application identifier contained in the depth image capture instruction, wherein the application identifier marks the application program that issued the depth image capture instruction; and
if the application identifier is a preset application identifier, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
4. The method of claim 3, wherein controlling the first image processor to disconnect from the second processor interface connected to the rear camera comprises:
acquiring an operation identifier contained in the depth image capture instruction, wherein the operation identifier marks the application operation to be completed using the acquired depth image; and
if the operation identifier is a preset operation identifier, controlling the first image processor to disconnect from the second processor interface connected to the rear camera.
5. The method of any of claims 1 to 4, wherein, after controlling the first image processor to connect to the second image processor through the second processor interface, the method further comprises:
when it is detected that the front camera is turned off or that the rear camera is activated, controlling the second processor interface connecting the first image processor and the second image processor to be disconnected; and
controlling the first image processor to connect to the rear camera through the second processor interface.
6. The method of any of claims 1 to 4, wherein, after controlling the first image processor to connect to the second image processor through the second processor interface, the method further comprises:
when it is detected that the front camera is turned off, starting timing;
when the timed duration exceeds a duration threshold, controlling the second processor interface connecting the first image processor and the second image processor to be disconnected; and
controlling the first image processor to connect to the rear camera through the second processor interface.
7. The method of claim 6, further comprising:
counting the activation frequency of the front camera, and acquiring a corresponding duration threshold according to the activation frequency.
8. A data processing apparatus, characterized in that the apparatus comprises:
an activation detection module, configured to control a first image processor to disconnect from a second processor interface connected to a rear camera when it is detected that an electronic device activates a front camera, wherein the first image processor is connected to the front camera through a first processor interface; and
an interface connection module, configured to control the first image processor to connect to a second image processor through the second processor interface, wherein the second image processor is connected to the front camera.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 7.
10. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of the method according to any one of claims 1 to 7.
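For readers who prefer code, the C sketch below models the switching flow recited in claims 1 and 5 to 7: the first image processor's second interface is routed either to the rear camera or to the second image processor, and switched back after a timeout once the front camera closes. Every name in the sketch is hypothetical; the claims above, not this sketch, define the method.

```c
/* Non-authoritative sketch of the interface switching of claims 1 and 5-7.
 * All identifiers are hypothetical illustrations. */
#include <stdbool.h>
#include <time.h>

typedef enum { PORT_REAR_CAMERA, PORT_SECOND_ISP } second_if_target_t;

typedef struct {
    second_if_target_t second_if;      /* where the second interface points */
    time_t front_closed_at;
    bool   timing;
    double duration_threshold_s;       /* per claim 7: derived from the
                                          front camera's activation frequency */
} switch_state_t;

static void on_front_camera_start(switch_state_t *s, bool depth_capture)
{
    if (!depth_capture) return;        /* claim 2: only depth capture switches */
    s->second_if = PORT_SECOND_ISP;    /* disconnect rear camera, attach 2nd ISP */
    s->timing = false;
}

static void on_front_camera_close(switch_state_t *s)
{
    s->front_closed_at = time(NULL);   /* claim 6: start timing */
    s->timing = true;
}

static void on_tick(switch_state_t *s)
{
    if (s->timing &&
        difftime(time(NULL), s->front_closed_at) > s->duration_threshold_s) {
        s->second_if = PORT_REAR_CAMERA;   /* switch back to the rear camera */
        s->timing = false;
    }
}
```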
HK19123022.6A 2019-04-26 Data processing method, device, computer readable storage medium and electronic equipment HK1262801B (en)

Publications (2)

Publication Number    Publication Date
HK1262801A1 (en)      2020-01-17
HK1262801B (en)       2020-11-06


Similar Documents

Publication    Title
CN110225258B (en) Data processing method, apparatus, computer-readable storage medium and electronic device
CN108716983B (en) Optical element detection method and device, electronic device and storage medium
CN109842753B (en) Camera anti-shake system, method, electronic device and storage medium
CN107948519B (en) Image processing method, device and equipment
CN107909686B (en) Method and device for unlocking human face, computer readable storage medium and electronic equipment
CN108600740B (en) Optical element detection method, optical element detection device, electronic equipment and storage medium
CN107592473A (en) Exposure parameter adjustment method, device, electronic device and readable storage medium
CN108716982B (en) Optical element detection method, optical element detection device, electronic equipment and storage medium
CN108924426B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN107872631B (en) Image shooting method and device based on double cameras and mobile terminal
CN110177212B (en) Image processing method and apparatus, electronic device, computer-readable storage medium
CN107172352B (en) Focusing control method and device, computer-readable storage medium and mobile terminal
CN109151303B (en) Image processing method and apparatus, electronic device, computer-readable storage medium
CN107657167A (en) Face unlocking method, device, computer-readable storage medium and electronic device
US11218650B2 (en) Image processing method, electronic device, and computer-readable storage medium
CN109981983B (en) Augmented reality image processing method, device, electronic device and storage medium
CN108760245B (en) Optical element detection method and device, electronic equipment and readable storage medium
CN108805025A (en) Laser output control method and apparatus, electronic device, and storage medium
JP6975144B2 (en) Imaging processing device, electronic device, imaging processing method, imaging processing device control program
CN107465880A (en) focusing method, device, terminal and computer readable storage medium
HK1262801B (en) Data processing method, device, computer readable storage medium and electronic equipment
HK1262801A1 (en) Data processing method, device, computer readable storage medium and electronic equipment
CN109120846B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109600547B (en) Photographing method and device, electronic equipment and storage medium
CN111124218B (en) Method for determining display window of mobile terminal, mobile terminal, and computer storage medium