HK1247010A - Image processing method and related products - Google Patents
Image processing method and related products
- Publication number
- HK1247010A (HK18106010.3A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- image
- iris
- infrared
- visible light
- target
Description
Technical Field
The invention relates to the technical field of mobile terminals, in particular to an image processing method and a related product.
Background
With the widespread use of mobile terminals (mobile phones, tablet computers, and the like), the applications they support and the functions they offer keep growing; mobile terminals are developing towards diversification and personalization and have become indispensable electronic products in users' daily lives.
At present, iris recognition is increasingly favored by mobile terminal manufacturers. However, the iris images currently acquired are all infrared images, while users generally prefer to view color images, so the display effect is poor.
Disclosure of Invention
The embodiment of the invention provides an image processing method and a related product, by which a color iris image can be obtained and the display effect of the iris image can be improved.
In a first aspect, an embodiment of the present invention provides an image processing method, where the method includes:
collecting color information of a target object;
determining a target compensation parameter corresponding to the color information from a plurality of compensation parameters stored in advance;
and when the iris is acquired, compensating the iris based on the target compensation parameter to obtain a color iris image.
In a second aspect, an embodiment of the present invention provides a mobile terminal, including an iris recognition device, a memory, and an application processor AP, where the iris recognition device and the memory are both connected to the AP, where:
the iris recognition device is used for acquiring color information of a target object;
the memory is used for storing a plurality of compensation parameters;
the AP is used for determining a target compensation parameter corresponding to the color information from the plurality of compensation parameters; and when the iris recognition device collects the iris, compensating the iris based on the target compensation parameter corresponding to the color information to obtain a color iris image.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including: an iris recognition device, an application processor AP and a memory; and one or more programs stored in the memory and configured to be executed by the AP, the programs including instructions for performing some or all of the steps described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides an image processing apparatus including a color acquisition unit, a compensation parameter determination unit, and an iris acquisition unit, wherein,
the color acquisition unit is used for acquiring color information of a target object;
the compensation parameter determining unit is used for determining a target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters;
and the iris acquisition unit is used for compensating the iris based on the target compensation parameter to obtain a color iris image when the iris is acquired.
In a fifth aspect, the present invention provides a computer-readable storage medium, where the computer-readable storage medium is used for storing a computer program, where the computer program is used to make a computer perform some or all of the steps described in the first aspect of the present invention.
In a sixth aspect, embodiments of the present invention provide a computer program product, where the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present invention. The computer program product may be a software installation package.
The embodiment of the invention has the following beneficial effects:
it can be seen that, in the embodiment of the present invention, the color information of the target object is collected, the target compensation parameter corresponding to the color information is determined from the plurality of pre-stored compensation parameters, and when the iris is collected, the iris is compensated based on the target compensation parameter to obtain a color iris image. In this way, the compensation parameter corresponding to the color information of the target object can be obtained, the iris is compensated according to that parameter, and a color iris image is finally obtained, so that the display effect is better.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1A is a schematic structural diagram of a smartphone according to an embodiment of the present invention;
Fig. 1B is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 1C is a schematic structural diagram of another mobile terminal according to an embodiment of the present invention;
Fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present invention;
Fig. 3 is a schematic flow chart of another image processing method according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of another mobile terminal according to an embodiment of the present invention;
Fig. 5A is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
Fig. 5B is a schematic structural diagram of the compensation parameter determining unit of the image processing apparatus depicted in Fig. 5A;
Fig. 5C is a schematic structural diagram of the iris acquisition unit of the image processing apparatus depicted in Fig. 5A;
Fig. 5D is a schematic structural diagram of the first image fusion module of the iris acquisition unit depicted in Fig. 5C;
Fig. 5E is a schematic structural diagram of another structure of the image processing apparatus depicted in Fig. 5A;
Fig. 6 is a schematic structural diagram of another mobile terminal according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," and the like in the description and claims of the present invention and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The mobile terminal according to the embodiments of the present invention may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), terminal devices, and the like, which have wireless communication functions. For convenience of description, the above devices are collectively referred to as mobile terminals. Embodiments of the present invention are described in detail below. As shown in Fig. 1A, in an example smartphone 100, the iris recognition device of the smartphone 100 may include an infrared fill light 21, an infrared camera 22 and a front camera 23. During operation of the iris recognition device, light from the infrared fill light 21 strikes the iris and is reflected back to the infrared camera 22, so that an infrared iris image is collected; the front camera 23 may be used to collect a visible light image. The iris image and the visible light image are then fused to obtain a final iris image, which may be displayed to the user.
Referring to Fig. 1B, Fig. 1B is a schematic structural diagram of a mobile terminal 100 according to an embodiment of the present invention. The mobile terminal 100 includes an application processor AP 110, an iris recognition device 130 and a memory 140, where the iris recognition device 130 may be integrated with the touch display screen or may exist independently of the touch display screen. The AP 110 is connected with the touch display screen, the iris recognition device 130 and the memory 140 through a bus 150. Further, referring to Fig. 1C, Fig. 1C is a modified structure of the mobile terminal 100 depicted in Fig. 1B; compared with Fig. 1B, Fig. 1C further includes an environment sensor 160.
In some possible embodiments, the iris recognition device 130 is configured to acquire color information of a target object; the memory 140 is used for storing a plurality of compensation parameters;
the AP110, configured to determine a target compensation parameter corresponding to the color information from the plurality of compensation parameters; and compensating the iris based on the target compensation parameter corresponding to the color information to obtain a color iris image when the iris recognition device 130 collects the iris.
In some possible embodiments, in the determining the target compensation parameter corresponding to the color information from the plurality of compensation parameters, the AP110 is specifically configured to:
analyzing the color information to obtain abnormal color information; determining a target characterization color corresponding to the abnormal color information; and determining a target compensation parameter corresponding to the target representation color from the plurality of compensation parameters according to a preset corresponding relation between the representation color and the compensation parameters.
In some possible embodiments, the iris recognition device 130 includes a visible light camera, an infrared fill-in light, and an infrared camera;
in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image during the iris acquisition, the iris recognition apparatus 130 is specifically configured to:
acquiring an image of the target object by a visible light camera of the iris recognition device 130 according to the target compensation parameter to obtain a visible light image; and acquiring an image of the target object through an infrared fill-in light and an infrared camera of the iris recognition device 130 to obtain an infrared image, and instructing the AP110 to perform image fusion on the visible light image and the infrared image to obtain the color iris image.
In some possible embodiments, in the aspect of performing image fusion on the visible light image and the infrared image to obtain the color iris image, the AP110 is specifically configured to:
carrying out image segmentation on the visible light image to obtain a visible light iris image; carrying out image segmentation on the infrared image to obtain an infrared iris image; performing spatial transformation on the visible light iris image and the infrared iris image so as to enable the size and the angle of the transformed visible light iris image and the size and the angle of the infrared iris image to be consistent; and carrying out image fusion on the transformed visible light iris image and the infrared iris image to obtain the colorful iris image.
In some possible embodiments, the mobile terminal is further provided with an environmental sensor 160;
the environment sensor 160 is used for acquiring environment parameters;
in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image during the iris acquisition, the iris recognition apparatus 130 is specifically configured to:
and when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
Referring to Fig. 2, Fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present invention, applied to a mobile terminal including an iris recognition device, a memory, and an application processor AP; for the physical and structural diagrams of the mobile terminal, reference may be made to Fig. 1A to Fig. 1C. The image processing method includes:
201. color information of the target object is collected.
The target object may be a human face, an iris, or an object including an iris. The color information of the target object can be collected through the iris recognition device: the iris recognition device further comprises a visible light camera, the visible light camera obtains a color image of the target object, and the color information is then extracted from that color image.
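As a non-limiting illustration (not part of the patent disclosure), the color extraction of step 201 could be sketched roughly as follows in Python; the function name and the choice of per-channel mean and standard deviation as the "color information" are assumptions made purely for illustration.

```python
import numpy as np

def collect_color_information(rgb_frame: np.ndarray) -> dict:
    """Summarize an H x W x 3 uint8 RGB frame as simple per-channel color statistics."""
    pixels = rgb_frame.reshape(-1, 3).astype(np.float32)
    return {
        "mean_rgb": pixels.mean(axis=0).tolist(),  # average scene color
        "std_rgb": pixels.std(axis=0).tolist(),    # per-channel spread
    }

# Example: a synthetic reddish preview frame
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[:] = (180, 90, 80)
print(collect_color_information(frame))
```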
202. A target compensation parameter corresponding to the color information is determined from a plurality of compensation parameters stored in advance.
A plurality of compensation parameters can be pre-stored in the memory of the mobile terminal. The compensation parameters may include, but are not limited to: an iris acquisition current, an iris acquisition voltage, an iris acquisition power, a color correction coefficient, and so on, where the iris acquisition current is the current used when acquiring the iris, the iris acquisition voltage is the voltage used when acquiring the iris, and the iris acquisition power is the power used when acquiring the iris. The color correction coefficient is applied to the acquired iris image: the color of the iris image is extracted and then corrected for color cast according to the color correction coefficient. Under normal circumstances the color may be distorted, so the distorted color needs to be restored so that the processed color looks more natural, or lifelike. Accordingly, a mapping relationship between color information and compensation parameters may be set in advance, and the target compensation parameter corresponding to the color information obtained in step 201 can then be determined from the plurality of pre-stored compensation parameters.
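A minimal sketch of how the pre-stored compensation parameters of step 202 might be organized is given below; the field names follow the parameters listed above (acquisition current, voltage, power, color correction coefficient), while the characterization-color keys and the numeric values are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompensationParameter:
    acquisition_current_ma: float   # iris acquisition current
    acquisition_voltage_v: float    # iris acquisition voltage
    acquisition_power_mw: float     # iris acquisition power
    color_correction: tuple         # per-channel gains used to correct color cast

# Pre-stored table: characterization color -> compensation parameter (placeholder values)
COMPENSATION_TABLE = {
    "warm":    CompensationParameter(120.0, 3.3, 400.0, (0.95, 1.00, 1.08)),
    "cool":    CompensationParameter(140.0, 3.3, 460.0, (1.07, 1.00, 0.94)),
    "neutral": CompensationParameter(130.0, 3.3, 430.0, (1.00, 1.00, 1.00)),
}
```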
In one possible example, in the step 202, determining the target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters may include the following steps:
221. analyzing the color information to obtain abnormal color information;
222. determining a target characterization color corresponding to the abnormal color information;
223. and determining a target compensation parameter corresponding to the target representation color from the plurality of compensation parameters according to a preset corresponding relation between the representation color and the compensation parameters.
The color information here is the color information of the whole scene. In practice, color distortion often occurs only in part of a scene, so the color information of that part, namely the abnormal color information, can be extracted. By analyzing the abnormal color information, the color it tends towards, that is, the color the real scene should have, can be determined, yielding a target characterization color, which can be understood as the color information of the real scene. The memory of the mobile terminal can pre-store the correspondence between characterization colors and compensation parameters, and the target compensation parameter corresponding to the target characterization color can then be determined according to this correspondence, where the target compensation parameter is one of the plurality of pre-stored compensation parameters.
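Under simple assumptions, steps 221 to 223 could look like the following sketch: the channel that deviates most from a neutral gray stands in for the abnormal color information, a coarse color bucket stands in for the characterization color, and a small dictionary of per-channel gains stands in for the pre-stored correspondence (a stand-in for the fuller compensation parameters sketched above); the thresholds and gain values are illustrative only.

```python
def determine_target_compensation(mean_rgb, correction_table):
    """Steps 221-223: abnormal color -> characterization color -> compensation parameter."""
    r, g, b = mean_rgb
    gray = (r + g + b) / 3.0
    deviations = {"red": r - gray, "green": g - gray, "blue": b - gray}
    # Step 221: the channel deviating most from neutral gray is treated as the abnormal color.
    abnormal = max(deviations, key=lambda channel: abs(deviations[channel]))
    # Step 222: map the abnormal color to a characterization color (coarse buckets assumed).
    if abs(deviations[abnormal]) < 10.0:
        characterization = "neutral"
    elif abnormal == "red" and deviations["red"] > 0:
        characterization = "warm"
    else:
        characterization = "cool"
    # Step 223: look up the compensation parameter for that characterization color.
    return characterization, correction_table[characterization]

# Illustrative pre-stored correspondence (per-channel color-correction gains only)
correction_table = {
    "warm": (0.95, 1.00, 1.08),
    "cool": (1.07, 1.00, 0.94),
    "neutral": (1.00, 1.00, 1.00),
}
print(determine_target_compensation((180.0, 90.0, 80.0), correction_table))
# ('warm', (0.95, 1.0, 1.08))
```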
203. When the iris is acquired, the iris is compensated based on the target compensation parameter to obtain a color iris image.
The colors of the obtained iris images are different according to different compensation parameters, so that the iris acquisition can be carried out on a target object according to the target compensation parameters to obtain a final iris image which is a color image.
In one possible example, the iris recognition device includes a visible light camera, an infrared fill light, and an infrared camera; in step 203, compensating the iris based on the target compensation parameter to obtain a color iris image when the iris is acquired may include the following steps:
31. acquiring an image of the target object by a visible light camera according to the target compensation parameter to obtain a visible light image;
32. acquiring an image of the target object through an infrared light supplement lamp and an infrared camera of the iris recognition device to obtain an infrared image;
33. and carrying out image fusion on the visible light image and the infrared image to obtain the colorful iris image.
Referring to Fig. 1A to Fig. 1C, the iris recognition device in the embodiment of the present invention may be composed of a visible light camera, an infrared fill light, and an infrared camera. Steps 31 and 32 may be executed in parallel, or step 31 may be executed before step 32, or step 32 before step 31. The visible light camera can be controlled to acquire an image of the target object according to the target compensation parameter to obtain a visible light image, the infrared fill light and the infrared camera of the iris recognition device can be controlled to acquire an image of the target object to obtain an infrared image, and the visible light image and the infrared image are then fused to obtain a color iris image.
Optionally, in the step 33, performing image fusion on the visible light image and the infrared image to obtain the color iris image, may include the following steps:
a1, carrying out image segmentation on the visible light image to obtain a visible light iris image;
a2, carrying out image segmentation on the infrared image to obtain an infrared iris image;
a3, carrying out spatial transformation on the visible light iris image and the infrared iris image so as to make the size and the angle of the transformed visible light iris image and the size and the angle of the infrared iris image consistent;
and A4, carrying out image fusion on the transformed visible light iris image and the infrared iris image to obtain the colorful iris image.
The image segmentation may adopt one of the following methods: a gray-level threshold segmentation method, the maximum between-class variance method (Otsu's method), the peak-valley method of the image gray histogram, the minimum error method, the maximum entropy automatic thresholding method, a graph-theory based segmentation method, and the like. The spatial transformation may be an affine transformation, a rigid transformation, a non-rigid transformation, or the like. The visible light image is segmented to obtain a visible light iris image, and the infrared image is segmented to obtain an infrared iris image; the visible light iris image and the infrared iris image are then spatially transformed so that their sizes and angles are consistent, which facilitates the subsequent image fusion; finally, the transformed visible light iris image and the infrared iris image are fused to obtain a color iris image. In this way, the final iris image is obtained.
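A rough OpenCV sketch of steps A1 to A4 follows, using Otsu thresholding (the maximum between-class variance method mentioned above) as the segmentation step and a resize plus rotation as the spatial transformation; real iris segmentation is considerably more involved, and the rotation angle and blend weights are assumptions for illustration.

```python
import cv2
import numpy as np

def fuse_iris_images(visible_bgr: np.ndarray, infrared_gray: np.ndarray,
                     angle_deg: float = 0.0, alpha: float = 0.6) -> np.ndarray:
    # A1/A2: crude segmentation of the iris region in both images (Otsu thresholding).
    vis_gray = cv2.cvtColor(visible_bgr, cv2.COLOR_BGR2GRAY)
    _, vis_mask = cv2.threshold(vis_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    _, ir_mask = cv2.threshold(infrared_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    vis_iris = cv2.bitwise_and(visible_bgr, visible_bgr, mask=vis_mask)
    ir_iris = cv2.bitwise_and(infrared_gray, infrared_gray, mask=ir_mask)

    # A3: spatial transformation so that size and angle of the two crops agree.
    h, w = ir_iris.shape[:2]
    vis_iris = cv2.resize(vis_iris, (w, h))
    rotation = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    vis_iris = cv2.warpAffine(vis_iris, rotation, (w, h))

    # A4: weighted fusion; the infrared channel is replicated so the shapes match.
    ir_bgr = cv2.cvtColor(ir_iris, cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(vis_iris, alpha, ir_bgr, 1.0 - alpha, 0.0)
```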
Optionally, in the step 33, performing image fusion on the visible light image and the infrared image to obtain the color iris image, may include the following steps:
b1, performing multi-scale transformation on the visible light image to obtain a first low-frequency component and a first high-frequency component set;
b2, carrying out multi-scale decomposition on the infrared image to obtain a second low-frequency component and a second high-frequency component set;
b3, performing weighting operation on the first low-frequency component and the second low-frequency component to obtain a target low-frequency component;
b4, synthesizing the first high-frequency component set and the second high-frequency component set according to the maximum neighborhood energy principle to obtain a target high-frequency component set;
and B5, carrying out multi-scale inverse transformation on the target low-frequency component and the target high-frequency component set to obtain the color iris image.
The visible light image may be subjected to multi-scale transformation using a multi-scale decomposition algorithm to obtain a first low-frequency component and a first high-frequency component set, where the first high-frequency component set may include multiple high-frequency components. The multi-scale decomposition algorithm may include, but is not limited to: the wavelet transform, the Laplacian pyramid transform, the contourlet transform (CT), the non-subsampled contourlet transform (NSCT), the shearlet transform, and the like. Taking the contourlet transform as an example, performing multi-scale decomposition on the visible light image with the contourlet transform yields one low-frequency component and a plurality of high-frequency components; taking NSCT as an example, performing multi-scale decomposition on the visible light image with NSCT yields one low-frequency component and a plurality of high-frequency components, and each of the plurality of high-frequency components has the same size. The high-frequency components contain more of the detail information of the original image. Similarly, a multi-scale decomposition algorithm can be used to decompose the infrared image into a second low-frequency component and a second high-frequency component set. The neighborhood may be a neighborhood of a specified size, e.g., 3 x 3, 5 x 5, 7 x 7, 11 x 11, and so on. Considering a neighborhood makes it possible to take the correlation between surrounding pixels fully into account, so that more detail information is retained. In the process of synthesizing the high-frequency components, the coefficients at corresponding positions are combined according to the maximum neighborhood energy principle, where a first high-frequency component corresponds to a second high-frequency component in position, that is, the two lie at the same decomposition level and the same scale; for example, if the first high-frequency component is the image at the 3rd scale of the 2nd level, the corresponding second high-frequency component is also the image at the 3rd scale of the 2nd level. For example, if at position A the neighborhood energy of the high-frequency component of the infrared image is 10 and the neighborhood energy of the high-frequency component of the visible light image is 11, the high-frequency coefficient of the visible light image is selected at that position. Specifically, steps B1 and B2 may be executed in parallel or sequentially.
In summary, the visible light image is multi-scale transformed to obtain a first low-frequency component and a first high-frequency component set, and the infrared image is multi-scale decomposed to obtain a second low-frequency component and a second high-frequency component set; the AP is controlled to perform a weighted operation on the first low-frequency component and the second low-frequency component to obtain a target low-frequency component, to synthesize the first high-frequency component set and the second high-frequency component set according to the maximum neighborhood energy principle to obtain a target high-frequency component set, and to perform multi-scale inverse transformation on the target low-frequency component and the target high-frequency component set to obtain a color iris image that the user can view more comfortably.
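The following sketch illustrates steps B1 to B5 with a wavelet decomposition standing in for the multi-scale transform (the description equally allows contourlet, NSCT, or shearlet transforms): the low-frequency components are combined by weighting, and at each position the high-frequency coefficient with the larger 3 x 3 neighborhood energy is kept. It operates on same-shape single-channel (e.g., luminance) inputs for simplicity; the wavelet, decomposition level, and weight are assumptions.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def fuse_multiscale(visible: np.ndarray, infrared: np.ndarray,
                    wavelet: str = "db2", level: int = 3, w_vis: float = 0.5) -> np.ndarray:
    """Fuse two same-shape single-channel images via wavelet decomposition."""
    vis = pywt.wavedec2(visible.astype(np.float64), wavelet, level=level)   # B1
    ir = pywt.wavedec2(infrared.astype(np.float64), wavelet, level=level)   # B2

    # B3: weighted combination of the two low-frequency components.
    fused = [w_vis * vis[0] + (1.0 - w_vis) * ir[0]]

    # B4: per-position selection of high-frequency coefficients by 3x3 neighborhood energy.
    for (vh, vv, vd), (ih, iv, id_) in zip(vis[1:], ir[1:]):
        bands = []
        for v_band, i_band in ((vh, ih), (vv, iv), (vd, id_)):
            v_energy = uniform_filter(v_band ** 2, size=3)
            i_energy = uniform_filter(i_band ** 2, size=3)
            bands.append(np.where(v_energy >= i_energy, v_band, i_band))
        fused.append(tuple(bands))

    # B5: multi-scale inverse transformation of the fused coefficients.
    return pywt.waverec2(fused, wavelet)
```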
It can be seen that, in the embodiment of the present invention, color information of a target object is acquired, a target compensation parameter corresponding to the color information is determined from a plurality of pre-stored compensation parameters, and iris acquisition is performed on the target object according to the target compensation parameter to obtain a color iris image. In this way, the compensation parameter can be obtained from the color information of the target object, iris acquisition is performed according to that compensation parameter, and a color iris image is finally obtained, so that the display effect is better.
Referring to Fig. 3, Fig. 3 is a schematic flowchart of another image processing method according to an embodiment of the present invention, applied to a mobile terminal including an iris recognition device, a memory, and an application processor AP; for the physical and structural diagrams of the mobile terminal, reference may be made to Fig. 1C. The image processing method includes:
301. color information of the target object is collected.
302. And acquiring the environmental parameters through an environmental sensor.
The steps 301 and 302 may be executed in parallel, or the step 301 is executed first, and then the step 302 is executed, or the step 302 is executed first, and then the step 301 is executed.
Alternatively, the environment sensor may be an ambient light sensor for detecting the ambient brightness, a magnetic field sensor for detecting the magnetic field strength, a humidity sensor for detecting the ambient humidity, or a temperature sensor for detecting the ambient temperature. A mapping relationship between environment parameters and iris acquisition parameters may be preset, and after the current environment parameter is determined, the iris acquisition parameter corresponding to the current environment parameter may be determined according to this mapping relationship. The iris acquisition parameters may include, but are not limited to: acquisition current, acquisition voltage, and the like.
303. A target compensation parameter corresponding to the color information is determined from a plurality of compensation parameters stored in advance.
304. And when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
A mapping relationship between the target compensation parameter and environment parameter, on the one hand, and the iris acquisition parameter, on the other, may be preset; the iris acquisition parameter corresponding to the target compensation parameter from step 303 and the environment parameter from step 302 is determined according to this mapping relationship, and iris acquisition is then performed according to that iris acquisition parameter to obtain a color iris image. The iris acquisition parameters may include, but are not limited to: acquisition current, acquisition voltage, and the like.
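As an illustration of the preset mapping in step 304, the sketch below scales an acquisition current derived from the target compensation parameter by a factor chosen from the ambient brightness reported by the environment sensor; the bucket boundaries and scale factors are assumptions, not values from the patent.

```python
def derive_acquisition_parameters(compensation: dict, ambient_lux: float) -> dict:
    """Combine a compensation parameter with an environment parameter (ambient brightness)."""
    if ambient_lux < 50:         # dim environment: drive the fill light harder
        scale = 1.2
    elif ambient_lux < 500:      # ordinary indoor lighting
        scale = 1.0
    else:                        # bright environment
        scale = 0.85
    return {
        "acquisition_current_ma": compensation["current_ma"] * scale,
        "acquisition_voltage_v": compensation["voltage_v"],
    }

params = derive_acquisition_parameters({"current_ma": 130.0, "voltage_v": 3.3}, ambient_lux=30.0)
print(params)  # {'acquisition_current_ma': 156.0, 'acquisition_voltage_v': 3.3}
```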
It can be seen that, in the embodiment of the present invention, color information of a target object is acquired, a target compensation parameter corresponding to the color information is determined from a plurality of pre-stored compensation parameters, iris acquisition is performed on the target object according to the target compensation parameter and an environmental parameter to obtain a color iris image, and thus, the corresponding compensation parameter can be acquired through the color information of the target object, and further, iris acquisition is performed according to the compensation parameter to finally obtain the color iris image, so that the display effect is better.
Referring to Fig. 4, Fig. 4 shows a mobile terminal according to an embodiment of the present invention, which mainly includes an application processor AP and a memory, may further include an iris recognition device, and includes one or more programs stored in the memory and configured to be executed by the AP, the programs including instructions for performing the following steps:
collecting color information of a target object;
determining a target compensation parameter corresponding to the color information from a plurality of compensation parameters stored in advance;
and when the iris is acquired, compensating the iris based on the target compensation parameter to obtain a color iris image.
In one possible example, in the determining of the target compensation parameter corresponding to the color information from a plurality of compensation parameters stored in advance, the program includes instructions for:
analyzing the color information to obtain abnormal color information;
determining a target characterization color corresponding to the abnormal color information;
and determining a target compensation parameter corresponding to the target representation color from the plurality of compensation parameters according to a preset corresponding relation between the representation color and the compensation parameters.
In one possible example, the iris recognition device includes a visible light camera, an infrared fill light, and an infrared camera; in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image at the time of iris acquisition, the program includes instructions for:
acquiring an image of the target object by a visible light camera according to the target compensation parameter to obtain a visible light image;
acquiring an image of the target object through an infrared light supplement lamp and an infrared camera to obtain an infrared image;
and carrying out image fusion on the visible light image and the infrared image to obtain the colorful iris image.
In one possible example, in said image fusing said visible light image and said infrared image to obtain said color iris image, said program comprises instructions for performing the steps of:
carrying out image segmentation on the visible light image to obtain a visible light iris image;
carrying out image segmentation on the infrared image to obtain an infrared iris image;
performing spatial transformation on the visible light iris image and the infrared iris image so as to enable the size and the angle of the transformed visible light iris image and the size and the angle of the infrared iris image to be consistent;
and carrying out image fusion on the transformed visible light iris image and the infrared iris image to obtain the colorful iris image.
In one possible example, the mobile terminal is further provided with an environmental sensor; the program further includes instructions for performing the steps of:
acquiring an environmental parameter through the environmental sensor;
in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image at the time of iris acquisition, the program includes instructions for:
and when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
Referring to fig. 5A, fig. 5A is a schematic structural diagram of an image processing apparatus according to the present embodiment. The image processing apparatus is applied to a mobile terminal, and includes a color acquisition unit 501, a compensation parameter determination unit 502, and an iris acquisition unit 503, wherein,
the color acquisition unit 501 is configured to acquire color information of a target object;
the compensation parameter determining unit 502 is configured to determine a target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters;
the iris acquisition unit 503 is configured to compensate the iris based on the target compensation parameter to obtain a color iris image when the iris is acquired.
Alternatively, as shown in fig. 5B, fig. 5B is a detailed structure of the compensation parameter determining unit 502 of the image processing apparatus depicted in fig. 5A, the compensation parameter determining unit 502 may include an analyzing module 5021 and a determining module 5022, which are as follows:
the analysis module 5021 is used for analyzing the color information to obtain abnormal color information;
a determining module 5022, configured to determine a target characterization color corresponding to the abnormal color information;
the determining module 5022 is further configured to determine a target compensation parameter corresponding to the target characterization color from the plurality of compensation parameters according to a preset corresponding relationship between the characterization color and the compensation parameter.
Optionally, as shown in fig. 5C, fig. 5C is a specific detailed structure of the iris collecting unit 503 of the image processing apparatus depicted in fig. 5A, where the iris identifying apparatus includes a visible light camera, an infrared fill-in light, and an infrared camera; the iris acquisition unit 503 may include: the visible light image acquisition module 5031, the infrared image acquisition module 5032 and the first image fusion module 5033 are as follows:
a visible light image acquisition module 5031, configured to perform image acquisition on the target object according to the target compensation parameter through a visible light camera to obtain a visible light image;
the infrared image acquisition module 5032 is configured to acquire an image of the target object through the infrared fill-in light and the infrared camera to obtain an infrared image;
the first image fusion module 5033 is configured to perform image fusion on the visible light image and the infrared image to obtain the color iris image.
Alternatively, as shown in fig. 5D, fig. 5D is a specific refinement structure of the first image fusion module 5033 depicted in fig. 5C, and the first image fusion module 5033 may include: the image segmentation module 601, the spatial transformation module 602, and the second image fusion module 603 are specifically as follows:
the image segmentation module 601 is configured to perform image segmentation on the visible light image to obtain a visible light iris image;
the image segmentation module 601 is further configured to: carrying out image segmentation on the infrared image to obtain an infrared iris image;
a spatial transformation module 602, configured to perform spatial transformation on the visible light iris image and the infrared iris image, so that the size and the angle of the visible light iris image and the size and the angle of the infrared iris image after transformation are consistent;
and a second image fusion module 603, configured to perform image fusion on the converted visible light iris image and the infrared iris image to obtain the color iris image.
Alternatively, as shown in fig. 5E, fig. 5E is a further modified structure of the image processing apparatus depicted in fig. 5A, and compared with fig. 5A, the image processing apparatus depicted in fig. 5E further includes: an environment parameter obtaining unit 504, where the mobile terminal is further provided with an environment sensor; the method comprises the following specific steps:
an environmental parameter obtaining unit 504, configured to obtain an environmental parameter through the environmental sensor;
the iris acquisition unit 503 is specifically configured to: and when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
It can be seen that the image processing apparatus described in the embodiment of the present invention acquires color information of a target object, determines a target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters, and performs iris acquisition on the target object according to the target compensation parameter to obtain a color iris image. In this way, the compensation parameter can be obtained from the color information of the target object, iris acquisition is performed according to that compensation parameter, and a color iris image is finally obtained, so that the display effect is better.
It is to be understood that the functions of each program module of the image processing apparatus of this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not described herein again.
As shown in Fig. 6, for convenience of description, only the parts related to the embodiment of the present invention are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiments of the present invention. The mobile terminal may be any terminal device including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example of the mobile terminal:
fig. 6 is a block diagram illustrating a partial structure of a mobile phone related to a mobile terminal according to an embodiment of the present invention. Referring to fig. 6, the handset includes: radio Frequency (RF) circuit 910, memory 920, input unit 930, sensor 950, audio circuit 960, Wireless Fidelity (WiFi) module 970, application processor AP980, and power supply 990. Those skilled in the art will appreciate that the handset configuration shown in fig. 6 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 6:
the input unit 930 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 930 may include a touch display screen 933, an iris recognition apparatus 931, and other input devices 932. The input unit 930 may also include other input devices 932. In particular, other input devices 932 may include, but are not limited to, one or more of physical keys, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
Wherein the AP980 is operable to perform the steps of:
collecting color information of a target object;
determining a target compensation parameter corresponding to the color information from a plurality of compensation parameters stored in advance;
and when the iris is acquired, compensating the iris based on the target compensation parameter to obtain a colorful iris image.
The AP980 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions and processes of the mobile phone by operating or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby integrally monitoring the mobile phone. Optionally, AP980 may include one or more processing units; preferably, the AP980 may integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the AP 980.
Further, the memory 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
RF circuitry 910 may be used for the reception and transmission of information. In general, the RF circuit 910 includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 910 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The handset may also include at least one sensor 950, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the touch display screen according to the brightness of ambient light, and the proximity sensor may turn off the touch display screen and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The audio circuit 960, the speaker 961 and the microphone 962 may provide an audio interface between the user and the mobile phone. The audio circuit 960 may transmit the electrical signal converted from the received audio data to the speaker 961, where it is converted into a sound signal and played; on the other hand, the microphone 962 converts the collected sound signal into an electrical signal, which is received by the audio circuit 960 and converted into audio data; the audio data is processed by the AP980 and then either sent to another mobile phone via the RF circuit 910 or output to the memory 920 for further processing.
WiFi belongs to short-distance wireless transmission technology, and the mobile phone can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 970, and provides wireless broadband Internet access for the user. Although fig. 6 shows the WiFi module 970, it is understood that it does not belong to the essential constitution of the handset, and can be omitted entirely as needed within the scope not changing the essence of the invention.
The handset also includes a power supply 990 (e.g., a battery) for supplying power to the various components, and preferably, the power supply may be logically connected to the AP980 via a power management system, so that functions such as managing charging, discharging, and power consumption may be performed via the power management system.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiments shown in fig. 2 to fig. 3, the method flows of the steps may be implemented based on the structure of the mobile phone.
In the embodiments shown in fig. 4 and fig. 5A to fig. 5E, the functions of the units may be implemented based on the structure of the mobile phone.
Embodiments of the present invention also provide a computer storage medium for storing a computer program, where the computer program makes a computer execute part or all of the steps of any one of the image processing methods described in the above method embodiments.
Embodiments of the present invention also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the image processing methods as recited in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned memory includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, and other media capable of storing program code.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The above embodiments of the present invention are described in detail, and the principle and the implementation of the present invention are explained by applying specific embodiments, and the above description of the embodiments is only used to help understanding the method of the present invention and the core idea thereof; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.
Claims (13)
1. An image processing method, characterized in that the method comprises:
collecting color information of a target object;
determining a target compensation parameter corresponding to the color information from a plurality of compensation parameters stored in advance;
and when the iris is acquired, compensating the iris based on the target compensation parameter to obtain a colorful iris image.
2. The method according to claim 1, wherein the determining a target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters comprises:
analyzing the color information to obtain abnormal color information;
determining a target characterization color corresponding to the abnormal color information;
and determining a target compensation parameter corresponding to the target representation color from the plurality of compensation parameters according to a preset corresponding relation between the representation color and the compensation parameters.
3. The method of any one of claims 1 or 2, wherein compensating the iris at iris acquisition based on the target compensation parameter to obtain a color iris image comprises:
acquiring an image of the target object by a visible light camera according to the target compensation parameter to obtain a visible light image;
acquiring an image of the target object through an infrared light supplement lamp and an infrared camera to obtain an infrared image;
and carrying out image fusion on the visible light image and the infrared image to obtain the colorful iris image.
4. The method of claim 3, wherein the image fusing the visible light image and the infrared image to obtain the color iris image comprises:
carrying out image segmentation on the visible light image to obtain a visible light iris image;
carrying out image segmentation on the infrared image to obtain an infrared iris image;
performing spatial transformation on the visible light iris image and the infrared iris image so as to enable the size and the angle of the transformed visible light iris image and the size and the angle of the infrared iris image to be consistent;
and carrying out image fusion on the transformed visible light iris image and the infrared iris image to obtain the colorful iris image.
5. The method according to any of claims 1-4, characterized in that the mobile terminal is further provided with an environmental sensor; the method further comprises the following steps:
acquiring an environmental parameter through the environmental sensor;
when the iris is collected, the iris is compensated based on the target compensation parameter to obtain a colorful iris image, and the method comprises the following steps:
and when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
6. A mobile terminal, comprising an iris recognition device, a memory and an Application Processor (AP), wherein the iris recognition device and the memory are connected to the AP, and wherein:
the iris recognition device is used for acquiring color information of a target object;
the memory is used for storing a plurality of compensation parameters;
the AP is used for determining a target compensation parameter corresponding to the color information from the plurality of compensation parameters; and when the iris recognition device collects the iris, compensating the iris based on the target compensation parameter corresponding to the color information to obtain a color iris image.
7. The mobile terminal of claim 6, wherein in said determining a target compensation parameter corresponding to the color information from the plurality of compensation parameters, the AP is specifically configured to:
analyzing the color information to obtain abnormal color information; determining a target characterization color corresponding to the abnormal color information; and determining a target compensation parameter corresponding to the target representation color from the plurality of compensation parameters according to a preset corresponding relation between the representation color and the compensation parameters.
8. The mobile terminal according to any one of claims 6 or 7, wherein the iris recognition device comprises a visible light camera, an infrared fill-in light and an infrared camera;
in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image during the iris acquisition, the iris recognition apparatus is specifically configured to:
acquiring an image of the target object by a visible light camera of the iris recognition device according to the target compensation parameter to obtain a visible light image;
and acquiring an image of the target object through an infrared fill-in light and an infrared camera of the iris recognition device to obtain an infrared image, and indicating the AP to perform image fusion on the visible light image and the infrared image to obtain the colorful iris image.
9. The mobile terminal according to claim 8, wherein in the aspect of image fusion of the visible light image and the infrared image to obtain the color iris image, the AP is specifically configured to:
carrying out image segmentation on the visible light image to obtain a visible light iris image; carrying out image segmentation on the infrared image to obtain an infrared iris image; performing spatial transformation on the visible light iris image and the infrared iris image so as to enable the size and the angle of the transformed visible light iris image and the size and the angle of the infrared iris image to be consistent; and carrying out image fusion on the transformed visible light iris image and the infrared iris image to obtain the colorful iris image.
10. A mobile terminal according to any of claims 6-9, characterized in that the mobile terminal is further provided with an environmental sensor;
the environment sensor is used for acquiring environment parameters;
in the aspect of compensating the iris based on the target compensation parameter to obtain a color iris image during the iris acquisition, the iris recognition apparatus is specifically configured to:
and when the iris is acquired, the iris is compensated based on the target compensation parameter and the environmental parameter to obtain a colorful iris image.
11. A mobile terminal, comprising: an iris recognition device, an application processor AP and a memory; and one or more programs stored in the memory and configured to be executed by the AP, the programs including instructions for performing the method of any of claims 1-5.
12. An image processing apparatus comprising a color acquisition unit, a compensation parameter determination unit, and an iris acquisition unit, wherein,
the color acquisition unit is used for acquiring color information of a target object;
the compensation parameter determining unit is used for determining a target compensation parameter corresponding to the color information from a plurality of pre-stored compensation parameters;
and the iris acquisition unit is used for compensating the iris based on the target compensation parameter to obtain a colorful iris image when the iris is acquired.
13. A computer-readable storage medium for storing a computer program, wherein the computer program causes a computer to perform the method according to any one of claims 1-5.
Publications (3)
| Publication Number | Publication Date |
|---|---|
| HK1247010A1 HK1247010A1 (en) | 2018-09-14 |
| HK1247010A true HK1247010A (en) | 2018-09-14 |
| HK1247010B HK1247010B (en) | 2020-04-24 |