
HK1207725B - Image element brightness adjustment - Google Patents

Image element brightness adjustment

Info

Publication number
HK1207725B
Authority
HK
Hong Kong
Prior art keywords
image
image element
relief print
part relief
value
Prior art date
Application number
HK15108078.1A
Other languages
Chinese (zh)
Other versions
HK1207725A1 (en)
Inventor
弗雷德.弗莱
张东爀
李相坤
Original Assignee
Ib韩国有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US13/557,349 (US8824792B2)
Application filed by Ib韩国有限公司
Publication of HK1207725A1
Publication of HK1207725B

Description

Image element brightness adjustment
Background
The present invention relates to the field of methods and systems for capturing body-part relief print images, such as fingerprint and/or hand relief images, for example. Body-part relief print image capture devices, such as scanners and/or readers, are relatively common in security-related applications and have become readily available to consumers who may wish to protect information and the like. Such devices typically capture an image of the relief print using some type of image capture component (e.g., camera, sensor, etc.), with only the relief print portion of the body-part under consideration being represented in the resulting image.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key elements or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The body-part relief print image device and/or the resulting image may be affected by external (e.g., environmental, locally present dirt, debris, moisture, etc.) and/or internal (e.g., device defects, such as scratches, sensor damage, etc.) conditions, which may provide less than ideal results when attempting to effectively utilize the resulting image. Accordingly, among other things, one or more techniques and/or systems are disclosed for mitigating image defects that may be caused by external and/or internal conditions.
In one embodiment of identifying an adjusted brightness level for image elements representing at least a portion of an image of a body-part relief print, a first weighting factor may be determined for a first image element that may be included in an initial image. Additionally, a body-part relief print weighting value may be determined, wherein the body-part relief print weighting value may be based at least on a combination of the first weighting factor and a second image element brightness value for a second image element that may be included in the body-part relief print image. Additionally, an adjusted brightness level may be determined for the second image element based at least on a combination of the body-part relief print weighting value and the second image element brightness value.
To the accomplishment of the foregoing and related ends, the following description and annexed drawings set forth certain illustrative aspects and embodiments. These are indicative of but a few of the various ways in which one or more aspects may be employed. Other aspects, advantages and novel features of the invention will become apparent from the following detailed description when considered in conjunction with the drawings.
Drawings
Fig. 1 is a component diagram illustrating an example body-part relief print recognition environment in which one or more portions of one or more techniques and/or one or more systems described herein may be implemented.
Fig. 2 is a flow diagram illustrating an exemplary method for identifying an adjusted brightness level for an image element in an image.
Fig. 3 is a flow diagram illustrating an example implementation in which one or more portions of one or more techniques described herein may be implemented.
Fig. 4 is a flow diagram illustrating an example implementation in which one or more portions of one or more techniques described herein may be implemented.
Fig. 5 is a flow diagram illustrating an example implementation in which one or more portions of one or more techniques described herein may be implemented.
Fig. 6 is a flow diagram illustrating an example implementation in which one or more portions of one or more techniques described herein may be implemented.
Fig. 7A and 7B illustrate example implementations in which one or more portions of one or more techniques described herein may be implemented.
FIG. 8 is a component diagram illustrating an exemplary system for identifying adjusted brightness levels for image elements in an image.
FIG. 9 is a component diagram illustrating an example embodiment in which one or more portions of the system described herein may be implemented.
Fig. 10 is an illustration of an exemplary computer-readable medium comprising processor-executable instructions configured to embody one or more of the provisions set forth herein.
FIG. 11 illustrates an exemplary computing environment in which one or more of the provisions set forth herein may be implemented.
Detailed Description
The claimed subject matter is now described with reference to the drawings, wherein like reference numerals are generally used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to facilitate describing the claimed subject matter.
Fig. 1 is a component diagram illustrating an example body-part relief print recognition environment 100 in which one or more portions of one or more techniques and/or one or more systems described herein may be implemented. As shown in example environment 100, an example body-part relief print recognition system, such as a fingerprint recognition system, may include a relief print generator 102, and may further include a sensor arrangement 104. In one implementation, the sensor arrangement 104 may include one type of image capture component, such as an active pixel sensor (e.g., a CMOS sensor and/or a Thin Film Transistor (TFT)), any digital image capture device (e.g., a CCD device), and/or any suitable image capture device (e.g., a light-sensitive film camera).
In one embodiment, relief print generator 102 may include an electrode-based (e.g., single electrode) electroluminescent assembly 106 and/or electrical connections 108 (e.g., a power source, such as an AC power source), which electrical connections 108 may provide electrical connections between relief object 118 and electroluminescent assembly 106. Furthermore, in one embodiment, electrode-based electroluminescent assembly 106 may include transparent electrode 110, light-emitting layer 112, and/or dielectric layer 114. In one embodiment, the relief print generator 102 and the sensor arrangement 104 may be separated from each other by a distance 116, or may be arranged such that the sensor arrangement 104 contacts the relief print generator 102. As one example, when the relief print recognition system is activated (e.g., by placing a finger over the image capture location), light generated by the light-emitting layer 112 is emitted in the corresponding direction. In this example, the emitted light may reflect off the dielectric layer 114 and be directed towards the sensor arrangement 104.
It is proposed herein that a method may be devised for mitigating image defects in body-part relief print images, such as fingerprint images, that may be due to external conditions (e.g., dirt or dust on the capture device and/or body part, environmental conditions, and/or image capture conditions) and/or device defects (e.g., scratches, damage, bad parts, operation outside of device parameters, etc.). As one example, an image defect may include a change in brightness between adjacent image elements (e.g., pixels or sub-pixels), where a first image element expresses a higher brightness value (e.g., in terms of grayscale) than a second image element due to some external and/or internal defect.
Fig. 2 is a flow diagram illustrating an exemplary method 200 for identifying an adjusted brightness level for an image element in an image. The exemplary method 200 begins at 202. At 204, a first weighting factor is determined for a first image element included in the initial image. In one implementation, the initial image may comprise an image captured by a body-part relief print imaging device (e.g., a fingerprint and/or handprint reader) in the absence of a body-part. As one example, an apparatus standard (e.g., a calibration standard, such as a blank body part placement part) may be placed on a scan portion of the apparatus, and an initial image (e.g., of the calibration standard) may be captured. As one example of an imaging device using an electroluminescent imaging film to illuminate relief areas of a body-part print, an initial image may indicate relatively complete coverage of activation of the film within the relief print capture area of the device (e.g., the entire film is illuminated on the capture area). In this way, for example, if a defect is present, it may be easier to identify the defect in the initial image.
In one embodiment, an image may include one or more image elements, an image element comprising the smallest element of the image that can be represented and/or controlled. As one example, an image element may comprise a pixel. Generally, a "pixel" is used to describe a unit of an image, for example, where the unit may include the smallest element of the image that may be represented and/or managed. As another example, a pixel may comprise an addressable screen element (e.g., screen pixel, sub-pixel) of a display device, a single dot in a raster image, and/or a single dot in a printed picture. Further, as an example, a "pixel" may include an "address" corresponding to coordinates of an image and/or display screen (e.g., X, Y coordinates, row and column coordinates, Euclidean space coordinates, etc.). In one embodiment, the image elements may include any type of image "unit" (e.g., pixel, sub-pixel, etc.) that may be represented and/or controlled. Generally, for example, an image may be composed of a plurality of pixels arranged in rows and/or columns (e.g., or some other pattern) to create objects (e.g., relief prints and/or portions of relief prints), colors, shades, tones, etc. within the image.
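As an illustrative sketch of the addressing described above (the nested-list representation and the row/column coordinate order are assumptions for illustration, not part of the patent), an image can be modeled as a grid of brightness values in which each element has an address:

```python
# Model an image as a 5x5 grid of 8-bit grayscale brightness values.
# The nested-list layout and (row, column) addressing are assumed here
# purely for illustration.
ROWS, COLS = 5, 5
image = [[0] * COLS for _ in range(ROWS)]

# Address a single image element ("pixel") by its row/column coordinates.
image[2][3] = 255  # set the element at row 2, column 3 to full brightness
```

Any comparable addressable representation (a flat array with computed offsets, a sub-pixel grid, etc.) would serve the same purpose.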
Additionally, in one embodiment, the first weighting factor may include an indication of a condition of the corresponding image element relative to one possible defect. That is, for example, a portion of an image sensor in an image capture device may include a corresponding image element, and the portion of the sensor may include a defect that results in a reduction in the "brightness" of a captured image. In this implementation, for example, the weighting factor may indicate a reduced "brightness" of the corresponding image element. As another example, a first portion of an electroluminescent imaging film may be illuminated at a lower brightness than a second portion of the film. In this example, image elements made up of multiple regions of the first portion may include different (e.g., higher) weighting factors than image elements made up of respective regions of the second portion.
At 206 in the exemplary method 200 of fig. 2, a body-part relief print weighting value is determined based at least on a combination of the first weighting factor and a second image element brightness value of a second image element included in the body-part relief print image. In one implementation, the body-part relief print image may be captured after (e.g., or before) the initial image is captured. In this way, for example, potential external and/or internal defects that may be present in the initial image capture process (e.g., defects that result in the initial image) may also be present in the relief print image capture process. Further, in this embodiment, the body-part relief print image may comprise one or more image elements (e.g., comprising at least a second image element), for example. In this example, an image element brightness value may be identified for a respective one or more image elements (e.g., containing a second image element).
In one embodiment, the image element brightness values may comprise color values, such as grayscale values. For example, the grayscale may include two hundred fifty-six gradient values (e.g., zero to two hundred fifty-five, inclusive), with the lowest value indicating the lowest brightness level (e.g., no white, only black) of the resulting image (e.g., displayed and/or printed) and the highest value indicating the highest brightness level (e.g., only white, no black) of the resulting image (e.g., displayed and/or printed). As another example, a color scale (e.g., or a portion thereof, such as the red channel of an RGB color scale) may include gradient values that likewise span highest and lowest brightness levels (e.g., or color levels). In this embodiment, as an example, a respective image element in the body-part relief print image may comprise a corresponding image element brightness value that may be identified by comparison to an associated color gradient value.
In one embodiment, an image element brightness value (e.g., a second image element brightness value) may be combined with (e.g., multiplied by) a first weighting factor (e.g., of a first image element) to arrive at a body-part relief print weighting value (e.g., of a second image element). As one example, the first image element may comprise a first location in the initial image and the second image element may comprise a second location in the relief image, and the first location and the second location may comprise the same location when the initial image and the relief image are overlaid. In this example, a first weighting factor from an image element of the initial image is applied to a brightness value of a second image element of the relief image, resulting in a body-part relief print weighting value for the associated image element (e.g., the second image element). Further, a body-part relief print weighting value may be determined for a respective one or more image elements included in the body-part relief print image, e.g., using a corresponding first weighting factor from the initial image and an image element brightness value from the body-part relief print image.
At 208 in the exemplary method 200 of fig. 2, an adjusted brightness level is determined for the second image element based at least on a combination of the body-part relief print weighting value and the second image element brightness value. In one embodiment, the body-part relief print weighting value determined for the second image element may be combined with (e.g., summed with) the second image element brightness value to yield an adjusted brightness level for the second image element included in the body-part relief print image. In this implementation, for example, the adjusted brightness level of the image elements may mitigate improper "brightness" levels due to defects (e.g., internal or external defects) that occur during body-part relief print image capture.
That is, for example, a weighting value (e.g., indicating a "brightness" difference level) may be indicated for a second image element included in a region having an imaging defect (e.g., dirt, a defective portion, etc.). In this example, image elements from an area that has no defects may include a zero weighting value (e.g., no decrease in brightness due to defects). Further, as an example, the weighting value may indicate a "brightness" level difference between image elements from a defective region of the imaging device and image elements from a non-defective region of the imaging device. For example, when the weighting value is combined with the second image element brightness value from the unadjusted relief image, the resulting brightness level of the image element may be more indicative of areas in the image that do not include defects.
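The two combining steps described above, the multiplication at 206 and the summation at 208, can be sketched as follows. The function names and the Python representation are illustrative assumptions, not part of the patent; the arithmetic follows the combinations the text describes:

```python
def relief_print_weighting_value(first_weighting_factor, element_brightness):
    # Step 206: combine (here, multiply) the first weighting factor derived
    # from the initial image with the brightness value of the co-located
    # relief-image element.
    return first_weighting_factor * element_brightness

def adjusted_brightness(weighting_value, element_brightness):
    # Step 208: combine (here, sum) the weighting value with the element's
    # unadjusted brightness to obtain the adjusted brightness level.
    return weighting_value + element_brightness

# An element from a defect-free region carries a zero weighting factor, so
# its brightness is left unchanged; a defect-dimmed element (illustrative
# brightness 100, factor 1.4) is raised toward the defect-free level.
weighting_value = relief_print_weighting_value(1.4, 100)
corrected = adjusted_brightness(weighting_value, 100)
```

Under these assumptions a zero weighting factor leaves brightness untouched, which matches the patent's statement that defect-free areas carry a zero weighting value.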
It should be understood that the "brightness" level is not limited to the above-described embodiments (e.g., in terms of color values). For example, the brightness level may be indicated by a "gain" level in the image signal, where an increase in gain may cause the brightness of at least a portion of the image to increase. Further, it is contemplated that other image signal controls devised by those skilled in the art may be used to adjust, manage and/or control the "brightness" of one or more of the image elements of an image.
After determining the adjusted brightness level of the second image element (e.g., and the corresponding one or more image elements in the body-part relief print image), the exemplary method 200 of fig. 2 ends at 210.
Fig. 3 is a flow diagram illustrating an example implementation 300 in which one or more portions of one or more techniques described herein may be implemented. At 302, an initial image may be captured using a body-part relief print capture device standard in cooperation with a body-part relief print capture device. In one embodiment, the initial image may be stored at 304, such as in a local memory (e.g., volatile and/or non-volatile memory) and/or storage device (e.g., an electronic data storage device such as a disk, flash memory, etc.) and/or a remote memory and/or storage device. Further, in this embodiment, metadata (e.g., type, size, time/date, storage location, etc.) associated with the initial image may be indexed in the database (e.g., for subsequent access).
As one example, a body-part relief print capture device standard may comprise a type of calibration standard part that "replaces" a body part (e.g., a hand or finger) when an initial image is captured using a relief print capture device. In one implementation, the body-part relief print capture device standard may include "white space" (e.g., featureless, unpatterned, smooth, monochromatic, etc.) that provides a type of clean "background" for the initial image. As an example, a blank background image (e.g., used as an initial image) may be used to identify potential deviations from an "ideal" image (e.g., an image without defects) due to external and/or internal conditions (e.g., environment, dirt, defects in the capture device, etc.).
In one implementation, a fingerprint and/or handprint capture device may include an electroluminescent portion that utilizes an electric field to activate a photon emitting substance. For example, in this implementation, the body-part relief print capture device criteria may include a desired degree of dielectric constant, providing a desired electric field that allows the electroluminescent portion to function properly. Furthermore, in this implementation, the body-part relief print capture device standard may include a type of calibration blank, for example, for capturing an initial image (e.g., a blank image) when placed in a "body-part" relief print capture location of the device. In one example, standard placement of a body-part relief print capture device in a "body-part" relief print capture location may activate the corresponding photon emitting substance in the electroluminescent portion, thereby providing an initial image indicative of photons detected from an entire portion of the "body-part" relief print capture location (e.g., the entire initial image is indicative of light from the device).
At 306 in the example implementation 300 of fig. 3, a first image element brightness value may be identified for a first image element. In one embodiment, the first image element brightness value may comprise a color value, such as a gray value (e.g., a value from 0 to 255, inclusive). As one example, a digital image may include a plurality of image elements, such as pixels and/or sub-pixels, where the arrangement and brightness values of the image elements are identified by data included in an image file representing the image. In this embodiment, image element brightness values may be identified for the first image element (e.g., and corresponding other image elements in the initial image), for example, by accessing data included in an image file of the initial image (e.g., or some representation of the image, such as stored locally and/or remotely).
In one embodiment, at 312 in FIG. 3, an image element map may be created for the initial image. In this implementation, for example, the image element map may comprise representations of respective image elements in the image, the image elements being arranged according to their corresponding positions in the image. Further, in this example, the respective image element representations may be associated with their corresponding image element brightness values. As an illustrative example, an image element map may include a grid of twenty-five squares (e.g., five by five), with respective squares representing image elements from the image. Additionally, in this example, the respective square may include (e.g., be linked to) its corresponding image element brightness value. In one embodiment, the image element map for the initial image may be stored (e.g., locally and/or remotely) and/or indexed into a database.
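One simple realization of such an image element map links each element's position to its brightness value. The dictionary representation and function name below are assumptions for illustration:

```python
def build_element_map(image):
    # Map each (row, column) address to that element's brightness value,
    # mirroring the grid-of-squares arrangement described above.
    return {(r, c): value
            for r, row in enumerate(image)
            for c, value in enumerate(row)}

# A hypothetical five-by-five initial image, as in the
# twenty-five-square example; one element is dimmed by an assumed defect.
initial_image = [[240] * 5 for _ in range(5)]
initial_image[1][2] = 100
element_map = build_element_map(initial_image)
```

The map could equally be serialized and indexed in a database for later retrieval, as the paragraph above suggests.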
At 308 in the example implementation 300 of fig. 3, a first image brightness value may be identified. In one embodiment, the first image brightness value may comprise an expected value from a set of values comprising one or more image element brightness values respectively corresponding to an image element in the initial image. In one embodiment, the set of image element brightness values may comprise values corresponding to image elements from at least a portion of the initial image. Further, the expected value may comprise the highest value from the set.
As an example, one or more first image elements in the initial image may indicate the highest (e.g., ideal) possible luminance value, e.g., where the one or more first image elements are from an image region without a noticeable image defect. Additionally, in this example, one or more second image elements may indicate less than a highest possible luminance value, for example where the one or more second image elements are from an image region that includes a significant image defect. In this example, the first image luminance value may be identified as the highest luminance value comprised in the set of values comprising one or more first image element luminance values.
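Selecting the expected value as the highest brightness in the sampled set reduces to a one-line function (the name is assumed for illustration):

```python
def first_image_brightness(element_brightness_values):
    # The expected (reference) value is the highest brightness found among
    # the sampled image elements, i.e. elements from defect-free regions.
    return max(element_brightness_values)
```

For example, a sample containing defect-free elements at 240 and defect-dimmed elements at 100 yields a first image brightness value of 240.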
At 314 in the example implementation 300 of fig. 3, a first weighting factor 350 may be determined for the first image element (e.g., and corresponding other image elements in the initial image). In one embodiment, determining the first weighting factor 350 may include combining the identified first image brightness value of the initial image (e.g., from 308) with the first image element brightness value of the first image element (e.g., from 306).
Fig. 4 is a flow diagram illustrating an example implementation 400 in which one or more portions of one or more techniques described herein may be implemented. In this implementation 400, beginning at 402, a first weighting factor 450 (e.g., 350 of fig. 3, determined at 314) may be determined for a corresponding image element in the initial image by combining the identified first image brightness value with the first image element brightness value. At 404, determining the first weighting factor 450 may include determining a difference between the first image brightness value and the first image element brightness value. Further, at 406, a quotient of the difference value and the first image element brightness value (e.g., the difference value divided by the first image element brightness value) may be determined.
As an illustrative example, where the first image brightness value is two hundred and forty and the first image element brightness value is one hundred, the first weighting factor 350 may be determined as follows:
(240 - 100) / 100 = 1.4 (first weighting factor).
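The calculation at 404 and 406 can be expressed directly as code (the function name is assumed for illustration):

```python
def first_weighting_factor(image_brightness, element_brightness):
    # Difference between the reference (first image) brightness value and
    # this element's brightness, divided by the element's brightness
    # (the steps at 404 and 406).
    return (image_brightness - element_brightness) / element_brightness
```

With the worked values above, a reference of 240 and an element brightness of 100 give a factor of 1.4, and a defect-free element (brightness equal to the reference) gives a factor of zero.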
Further, a first weighting factor 450 may be determined for each corresponding image element in the initial image, wherein a next image element from the initial image may be selected at 408, and processes 402 through 408 are repeated at least until no more image elements are available for the initial image. In one embodiment, at 410, an image element weighting map (e.g., a first weighting map) may be created for the initial image, for example, where the image element weighting map may include the first weighting factor for the respective image element.
Fig. 5 is a flow diagram illustrating an example implementation 500 in which one or more portions of one or more techniques described herein may be implemented. At 502 in example implementation 500, a body-part relief print image may be captured using a body-part relief print capture device (e.g., the same device as in 302 of fig. 3), wherein the body-part relief print image comprises an image of a body-part relief print (e.g., a fingerprint or a hand print) from at least a portion of a body (e.g., a hand or a finger). As one example, prior to capturing the body-part relief print image, an initial image may be captured using the same image capture device. In this way, for example, the initial image and body-part relief print image may include similar image defects (e.g., due to external and/or internal device conditions), if present.
At 504 of fig. 5, a second image element brightness value may be identified for a second image element included in the body-part relief print image. In one embodiment, the second image element brightness value may comprise a color value (e.g., a gray value) for the second image element in the body-part relief print image. As one example, as described above (e.g., at 306 of fig. 3), the second image element brightness values may be identified for the second image elements (e.g., and corresponding other image elements in the body-part relief print image) by accessing data included in the associated image file of the body-part relief print image (e.g., or some representation of its image, such as stored locally and/or remotely). Further, for example, second image element brightness values may be identified for respective image elements in the body-part relief print image.
In one embodiment, at 506, an image element map of the body-part relief print image may be created. In this embodiment, for example, the respective mapped image elements may be associated with their corresponding second image element brightness values. As another example, one or more image elements in a body-part relief print image may include a lower brightness value than other image elements (e.g., from an area having image defects). In this example, the image element map of the body-part relief print image may indicate those areas with and without potential image defects. In one embodiment, the position of a first image element in the initial image may correspond to the same image position as the position of a second image element in the body-part relief print image. That is, for example, the first and second image elements may represent the same pixel in respective images captured from the same device.
As an illustrative example, fig. 7A illustrates an example implementation 700 in which one or more portions of one or more techniques described herein may be implemented. In this example implementation 700, body-part relief print image capture device 704 is used to capture relief print image 706 of body-part 702. As an example, a person may place one of his fingers on a fingerprint capture portion of a fingerprint capture device, which activates an image capture event, resulting in an image of at least a portion of the fingerprint.
Further, in this illustrative example 700, the relief print image 706 may include one or more representations of ridges (e.g., indicated by dark lines in the image, although an inverted version of the image may indicate ridges as light lines, such as when captured using an electroluminescent device) and valleys (e.g., between ridges). In this example, a ridge 708, or a portion of ridge 708, may include multiple image elements, for which an image element map 710 (e.g., at 506 of fig. 5) may be created. The example image element map 710 includes a five-by-five grid comprising twenty-five image elements arranged in a square grid pattern (e.g., although other arrangements may also be used).
In this example 700, the respective square in image element map 710 includes an associated second image element brightness value, which may correspond to a brightness (e.g., or color) level represented by the image element in image 706. In this example, the highest brightness value is fifty, associated with a few image elements, while other values include forty, thirty-one, and twenty. As an example, a value below fifty may indicate an area of the image that is affected by some external and/or internal device condition, resulting in an image defect.
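A concrete version of the five-by-five map of example 700 might look like the following. Only the set of values (fifty, forty, thirty-one, twenty) and the peak of fifty come from the example; the positions of the values within the grid are assumed:

```python
# Hypothetical layout of second image element brightness values for the
# ridge-portion map 710; the arrangement of the values is assumed purely
# for illustration.
relief_map = [
    [50, 50, 40, 31, 20],
    [50, 40, 31, 20, 20],
    [40, 31, 31, 20, 20],
    [50, 40, 31, 31, 20],
    [50, 50, 40, 31, 20],
]

# The highest value in the map can serve as the second image brightness
# value identified at 508 of fig. 5.
second_image_brightness = max(v for row in relief_map for v in row)
```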
Returning to fig. 5, at 508, a second image brightness value 550 may be identified. As described above (e.g., at 308 of fig. 3), in one embodiment, the second image brightness value 550 may comprise an expected value from a set of values comprising one or more second image element brightness values that each correspond to an image element in the body-part relief print image. In one embodiment, the set of image element brightness values may comprise values corresponding to image elements from at least a portion of a body-part relief print image. Further, the expected value may comprise the highest value from the set. In one example, the second image brightness value 550 may be identified as the highest brightness value included in the set of values including the one or more second image element brightness values.
Fig. 6 is a flow diagram illustrating an example implementation 600 in which one or more portions of one or more techniques described herein may be implemented. In this implementation 600, at 602, a gain adjustment between an initial image capture and a body-part relief print image capture may be detected using second image brightness values 550. As one example, the "brightness" level may be adjusted by adjusting a "gain" level for the image capture device, where an increase in gain may cause an increase in brightness of at least a portion of the captured image. For example, a signal amplifier may be used to adjust some portion of the signal level detected by a sensor in an image capture device. In this example, adjusting the brightness gain may adjust the amplification (e.g., either up or down) of the brightness portion of the detected signal level. In one embodiment, the brightness level of the detected signal for the body-part relief print may be different (e.g., lower or higher) than the brightness level of the original image (e.g., either manually adjusted or automatically detected and adjusted). In one example, the brightness level of the detected signal for the body-part relief print may be lower than the initial image, and the gain may be adjusted upward in an attempt to amplify the brightness portion of the detected signal.
At 604 in the example implementation 600 of fig. 6, a gain adjustment function may be applied to the detected gain adjustment. In one embodiment, the gain adjustment function may indicate a relationship between the gain adjustment made between the initial image and the body-part relief print image and a change in weighting values resulting from the gain adjustment. That is, for example, when the brightness gain is adjusted, a corresponding adjustment may be needed in the weighting factor (e.g., the first weighting factor) to account for the difference in gain. In one implementation, the gain adjustment function may be predetermined, for example, based on empirical evidence, and/or may be determined specifically based on existing conditions (e.g., at the time of image capture).
At 606, a weighting adjustment factor is determined as a result of applying a gain adjustment function to the detected gain adjustment. At 608, a first weighting factor (e.g., 350 of fig. 3) may be adjusted for the first image element using the weighting adjustment factor, resulting in a second weighting factor 650 for the first image element. Further, for example, a second weighting factor 650 may be determined for a respective weighting factor associated with the first image element in the initial image using the weighting adjustment factor. As one illustrative example, fig. 7B illustrates an example implementation 750 in which one or more portions of one or more techniques described herein may be implemented. First weighting factor image element map 754 indicates respective first weighting factors (e.g., 0, 0.2, 0.6, 1.4) associated with image elements in the map. In this illustrative example 750, the weighting adjustment factor 756 may be multiplied by the respective first weighting factor to obtain a second weighting factor 650 for the respective image element.
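The adjustment at 604 through 608 can be sketched as follows. The patent does not fix a particular gain adjustment function, so a reciprocal mapping is assumed below purely for illustration; the first weighting factor values (0, 0.2, 0.6, 1.4) follow the fig. 7B example.

```python
def weighting_adjustment_factor(gain_adjustment, gain_function=lambda g: 1.0 / g):
    """Apply a gain adjustment function to a detected gain adjustment.
    The reciprocal default is an assumption for illustration only."""
    return gain_function(gain_adjustment)

def second_weighting_factors(first_factors, adjustment_factor):
    """Multiply each first weighting factor by the weighting adjustment
    factor, yielding second weighting factors (as in fig. 7B)."""
    return [w * adjustment_factor for w in first_factors]

first_factors = [0.0, 0.2, 0.6, 1.4]
# With no net gain change, the weighting adjustment factor is one, so the
# second weighting factors equal the first.
print(second_weighting_factors(first_factors, weighting_adjustment_factor(1.0)))
# -> [0.0, 0.2, 0.6, 1.4]
```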
Returning to fig. 5, at 510, a weighting factor (e.g., first weighting factor 450 or second weighting factor 650) may be combined with a second image element brightness value from the body-part relief print image. As one example, if no gain adjustment is detected (e.g., 602 of fig. 6) between the initial image and the body-part relief print image capture event, the first weighting factor 450 may be combined with the second image element brightness value (e.g., or if the same result as the first weighting factor 450 is produced, a second weighting factor 650, such as one number multiplied by one, may be used). In another embodiment, if a gain adjustment is detected, the second weighting factor 650 may be combined with the second image element luminance value.
As one illustrative example, in fig. 7B (where the weighting adjustment factor 756 is one), the resulting second weighting factor (for the respective associated image element) from the combination 752 (e.g., multiplication) of the weighting adjustment factor 756 and the respective first weighting factor 754 may be represented by the same number indicated in the first weighting factor map 754. In this example, the second image element brightness values for the image elements 710 of the respective body-part relief print image may be combined (e.g., multiplied) with the respective second weighting factor 752, resulting in a body-part relief print weighting value 758 for the respective second image elements of the body-part relief print image. Further, in this example 750, the respective body-part relief print weighting value 758 may be combined (e.g., summed) with its corresponding second image element brightness value 760 from the body-part relief print image. The result of this combination comprises the adjusted brightness level 762 for the respective image element of the body-part relief print image.
Returning to fig. 5, at 512, the adjusted brightness level 552 for the respective image element may be applied to its corresponding second image element in the body-part relief print image, resulting in an adjusted body-part relief print image. As one illustrative example, in fig. 7B, two image elements in the upper left corner of image element map 710, representing at least a portion of a body-part relief print image, comprise a brightness level of twenty. In one example, the reduction in brightness (e.g., relative to the highest brightness level of fifty in the portion of the image 710) may be due to a defect in the image capture device, and/or debris on an image capture surface of the image capture device. In this example, applying a weighting factor (e.g., a first or second weighting factor) to the corresponding two second image element brightness values (e.g., at the upper left corner of 710) results in an adjusted brightness level of forty-eight in adjusted body-part relief print image 762.
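The combination at 510 and 512 can be summarized numerically. The sketch below reproduces the fig. 7B numbers (element brightness of twenty, weighting factor of 1.4, adjusted level of forty-eight); it is a minimal illustration of the multiply-then-sum combination described above, not the patent's implementation, and the function name is an assumption.

```python
def adjusted_brightness_level(element_brightness, weighting_factor):
    """Combine a weighting factor with a second image element brightness
    value: multiply to obtain the body-part relief print weighting value,
    then sum that value with the brightness to obtain the adjusted level."""
    weighting_value = element_brightness * weighting_factor  # e.g., 20 * 1.4 = 28
    return weighting_value + element_brightness              # e.g., 28 + 20 = 48

print(adjusted_brightness_level(20, 1.4))  # approximately 48
```

Note that, in this form, the adjusted level equals the original brightness scaled by one plus the weighting factor, so elements with a zero weighting factor are left unchanged.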
A system may be devised that may mitigate image defects in a body-part relief print image, such as a fingerprint image, produced by an image capture device, such as a fingerprint scanner. As an example, imaging defects may be indicated by brightness variations between adjacent image elements (e.g., pixels or sub-pixels) in the resulting image, where the defects may be due to external and/or internal device conditions.
Fig. 8 is a block diagram illustrating an exemplary system 800 for identifying adjusted brightness levels for image elements in an image. In exemplary implementation 800, weighting factor component 802 is configured to determine a first weighting factor for a first image element 852 included in initial image 850. In addition, a body-part relief print weighting value component 804 is operatively coupled to the weighting factor component 802. The body-part relief print weighting value component 804 is configured to determine a body-part relief print weighting value based at least on the first weighting factor and a second image element brightness value of a second image element 856 included in the body-part relief print image 854.
In the exemplary implementation 800 of fig. 8, an adjusted brightness level component 806 is operably coupled with a body-part relief print weighting value component 804. Adjusted brightness level component 806 is configured to determine an adjusted brightness level for second image element 856 based at least on a combination of the body-part relief print weighting value and the second image element brightness value. Further, at least a portion of the system is implemented at least in part via the processing unit 808.
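As a rough sketch of how the three components of exemplary system 800 might fit together: the class and method names below are hypothetical, and the weighting arithmetic follows the difference-and-quotient form recited in claim 5, used here only as an illustrative assumption.

```python
class WeightingFactorComponent:
    """Determines a first weighting factor for an image element in the
    initial (calibration) image: the quotient of (image brightness minus
    element brightness) and element brightness, per claim 5's example."""
    def first_weighting_factor(self, image_brightness, element_brightness):
        return (image_brightness - element_brightness) / element_brightness

class BodyPartReliefPrintWeightingValueComponent:
    """Combines the weighting factor with the second image element
    brightness value from the body-part relief print image."""
    def weighting_value(self, weighting_factor, element_brightness):
        return weighting_factor * element_brightness

class AdjustedBrightnessLevelComponent:
    """Combines the weighting value with the element brightness value to
    yield the adjusted brightness level."""
    def adjusted_level(self, weighting_value, element_brightness):
        return weighting_value + element_brightness

# Illustrative values: calibration image brightness 60, dimmed element 30,
# then a body-part relief print element brightness of 25.
w = WeightingFactorComponent().first_weighting_factor(60, 30)            # 1.0
v = BodyPartReliefPrintWeightingValueComponent().weighting_value(w, 25)  # 25.0
print(AdjustedBrightnessLevelComponent().adjusted_level(v, 25))          # -> 50.0
```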
Fig. 9 is a component diagram illustrating an example implementation 900 in which one or more portions of the system described herein may be implemented. In this example 900, an expanded form of fig. 8 is provided, and the description of elements, components, etc. described with respect to fig. 8 may not be repeated for simplicity. In example implementation 900, body-part relief print image capture device 910 may be configured to capture initial image 958, where initial image 958 may comprise an image of at least a portion of a body-part relief print capture device standard. Furthermore, body-part relief print image capture device 910 may be configured to capture body-part relief print image 958, where body-part relief print image 958 may comprise an image of a body-part relief print from at least a portion of a body.
In example implementation 900, brightness value identification component 912 may be configured to identify a first image element brightness value 950 for a first image element in initial image 958. In one embodiment, the first image element brightness value 950 may comprise a color value (e.g., a gray value). Moreover, brightness value identification component 912 can be configured to identify second image element brightness value 954 for the second image element in body-part relief print image 958. In one implementation, second image element brightness value 954 may comprise a chrominance value. Additionally, the brightness value identification component 912 may be configured to identify a first image brightness value 952. In one embodiment, the first image brightness value 952 may comprise a first desired value from a first set comprising one or more image element brightness values respectively corresponding to image elements in the initial image. The brightness value identification component 912 can also be configured to identify a second image brightness value 956. In one embodiment, the second image brightness value 956 may comprise a second desired value from a second set comprising one or more image element brightness values respectively corresponding to image elements in at least a portion of the body-part relief print image.
In example implementation 900, the weighting adjustment factor determination component 914 may be configured to determine the weighting adjustment factor 962. The weighting adjustment factor determination component 914 may determine the weighting adjustment factor 962 by applying the detected gain adjustment 960 to a desired weighting adjustment function. Detected gain adjustment 960 may comprise the difference in gain applied to body-part relief print image capture device 910 between capturing initial image 958 and capturing body-part relief print image 958.
In example implementation 900, the weighting factor adjustment component 916 may be configured to adjust a first weighting factor 964 for a first image element using a weighting adjustment factor 962, resulting in a second weighting factor 966 for the first image element. In this implementation, body-part relief print weighting value component 804 may be configured to determine a body-part relief print weighting value by combining second weighting factor 966 with second image element luminance value 954 from body-part relief print image 958.
Example implementation 900 further includes an adjusted body-part relief print image component 918. Adjusted body-part relief print image component 918 may be configured to apply adjusted brightness levels 968 associated with respective second image elements of body-part relief print image 958 to the respective second image elements, resulting in an adjusted body-part relief print image 970. That is, for example, the resulting adjusted body-part relief print image 970 may provide an image of relief print with reduced image defects due to external and/or internal conditions affecting the body-part relief image capture device 910.
In another implementation, a computer-readable medium may include processor-executable instructions that may be configured to implement one or more portions of one or more techniques presented herein. An example computer-readable medium is illustrated in FIG. 10, where an implementation 1000 includes a computer-readable medium 1008 (e.g., CD-R, DVD-R, hard drive, flash drive, non-volatile memory storage component), with computer-readable data 1006 encoded on this computer-readable medium 1008. This computer-readable data 1006, in turn, comprises a set of computer instructions 1004, the set of computer instructions 1004 being configurable to operate in accordance with one or more of the techniques set forth herein. In one such implementation 1002, the processor-executable instructions 1004 may be configured to perform a method, such as at least some of the exemplary method 100 of fig. 1. In another such implementation, the processor-executable instructions 1004 may be configured to implement a system, such as at least some of the exemplary system 800 of fig. 8. Many such computer-readable media may be devised by those of ordinary skill in the art that are configured to operate in accordance with the techniques presented herein.
Fig. 11 and the following discussion provide a brief, general description of a computing environment in/on which embodiments of one or more of the methods and/or systems set forth herein may be implemented. The operating environment of FIG. 11 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Example computing devices include, but are not limited to, personal computers, server computers, hand-held or laptop computer devices, mobile devices (such as mobile phones, mobile consoles, tablets, media players, and the like), multiprocessor systems, consumer electronics, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
Although not required, embodiments are described in the general context of "computer-readable instructions" being executed by one or more computing devices. Computer readable instructions (discussed below) may be distributed via a computer readable medium. Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer readable instructions may be combined or distributed as desired in various environments.
Fig. 11 illustrates an example of a system 1100, the system 1100 including a computing device 1102 configured to implement one or more implementations provided herein. In one configuration, computing device 1102 includes at least one processing unit 1106 and memory 1108. Depending on the exact configuration and type of computing device, memory 1108 may be volatile (such as RAM, for example), non-volatile (such as ROM, flash memory, etc., for example) or some combination of the two. This configuration is illustrated in fig. 11 by dashed line 1104.
In other implementations, the device 1102 may include additional features and/or functionality. For example, device 1102 may also include additional storage devices (e.g., removable and/or non-removable) including, but not limited to, magnetic storage devices, optical storage devices, and the like. Such additional storage is illustrated in fig. 11 by storage 1110. In one embodiment, computer readable instructions to implement one or more embodiments provided herein may be in storage 1110. Storage 1110 may also store other computer readable instructions to implement an operating system, an application program, and the like. Computer readable instructions may be loaded in memory 1108 for execution by processing unit 1106, for example.
The term "computer-readable medium" as used herein includes computer storage media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions or other data. Memory 1108 and storage 1110 are examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 1102. Any such computer storage media may be part of device 1102.
Device 1102 may also include communication connections 1116 that allow device 1102 to communicate with other devices. Communication connection 1116 may include, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection, or other interfaces for connecting computing device 1102 to other computing devices. Communication connection 1116 may comprise a wired connection or a wireless connection. Communication connection(s) 1116 may transmit and/or receive communication media.
The term "computer-readable media" may include communication media. Communication media typically embodies computer readable instructions or other data in a "modulated data signal" such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" may include a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
Device 1102 may include an input device 1114, such as a keyboard, mouse, pen, voice input device, touch input device, infrared camera, video input device, and/or any other input device. Output device(s) 1112 such as one or more displays, speakers, printers, and/or any other output device may also be included in device 1102. The input device 1114 and the output device 1112 may be connected to the device 1102 via a wired connection, a wireless connection, or any combination thereof. In one implementation, an input device or an output device from another computing device may be used as input device 1114 or output device 1112 for computing device 1102.
The components of computing device 1102 may be connected by a number of interconnects, such as a bus. Such interconnects may include Peripheral Component Interconnect (PCI) (e.g., PCI Express), Universal Serial Bus (USB), firewire (IEEE 1394), optical bus structures, and the like. In another embodiment, components of computing device 1102 may be interconnected by a network. For example, memory 1108 may be comprised of multiple physical memory units located in different physical locations interconnected by a network.
Those skilled in the art will realize that storage devices utilized to store computer readable instructions may be distributed across a network. For example, a computing device 1120 accessible via network 1118 may store computer readable instructions to implement one or more implementations provided herein. Computing device 1102 may access computing device 1120 and download a part or all of the computer readable instructions for execution.
Alternatively, computing device 1102 may download pieces of the computer readable instructions, as needed, or some instructions may be executed at computing device 1102 and some at computing device 1120.
Various operations of embodiments are provided herein. In one implementation, one or more of the operations described may constitute computer-readable instructions stored on one or more computer-readable media, which if executed by a computing device, would cause the computing device to perform the operations described. The order in which some or all of the described operations are described should not be construed as to imply that these operations are necessarily order dependent. Alternative orderings will be appreciated by those skilled in the art having the benefit of this description. Further, it will be understood that not all operations are provided in every embodiment provided herein.
Moreover, the word "exemplary" is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as "exemplary" is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word exemplary is intended to present concepts in a concrete fashion. As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A; b is used as X; or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances. Further, at least one of A and B and/or similar expressions generally mean A or B or both A and B. In addition, the articles "a" and "an" as used in this application and the appended claims may generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
As used in this application, the terms "component," "module," "system," "interface," and the like are generally intended to refer to a computer-related entity, either hardware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term "article of manufacture" as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
Also, although the invention has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art upon the reading and understanding of this specification and the annexed drawings. The present invention embraces all such modifications and alterations, and is limited only by the scope of the appended claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary embodiments of the invention. In addition, while a particular feature of the invention may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms "includes," "has," "with," or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising."
Embodiments have been described above. It will be apparent to those skilled in the art that the above methods and apparatus may incorporate changes and modifications without departing from the general scope of the invention. It is intended to embrace all such modifications and changes as fall within the scope of the appended claims or the equivalents thereof.

Claims (20)

1. A method for identifying an adjusted brightness level of an image element representing at least a portion of an image of a body-part relief print, the method comprising:
determining a first calibration weighting factor for a first image element comprised in the calibration image;
determining a body-part relief print weighting value for a second image element included in the body-part relief print image based at least on a combination of the first calibration weighting factor and a second image element brightness value from the second image element, the second image element in the body-part relief print image corresponding to the first image element in the calibration image; and
determining the adjusted brightness level for the second image element in a body-part relief print image based at least on a combination of the body-part relief print weighting value and the second image element brightness value, at least a portion of the method being implemented at least in part via a processor.
2. The method of claim 1, further comprising capturing the calibration image using a body-part relief print capture device calibration standard in conjunction with a body-part relief print capture device.
3. The method of claim 2, further comprising capturing the body-part relief print image using the body-part relief print capture device, wherein the body-part relief print image comprises an image of a body-part relief print from at least a portion of a body, wherein the body-part relief print comprises one or more of:
a fingerprint; and
a hand vein.
4. The method of claim 1, determining a first calibration weighting factor for the first image element comprising combining the identified calibration image luminance value for the calibration image with a first image element luminance value from the first image element.
5. The method of claim 4, combining the identified calibration image luminance values for the calibration image with first image element luminance values from the first image element comprising one or more of:
determining a difference between the luminance value of the calibration image and the luminance value of the first image element; and
a quotient of the difference value and the luminance value of the first image element is determined.
6. The method of claim 4, comprising identifying a calibration image brightness value, comprising identifying a desired value from a set comprising one or more image element brightness values, each of the one or more image element brightness values corresponding to an image element in the calibration image.
7. The method of claim 4, further comprising identifying the first image element brightness value for the first image element, wherein the first image element brightness value comprises a chrominance value.
8. The method of claim 1, comprising one or more of:
adjusting the calibration weighting factor for the first image element using the weighting adjustment factor, resulting in a second calibration weighting factor for the first image element; and
determining a body-part relief print weighting value comprises combining a second calibration weighting factor with a second image element brightness value.
9. The method of claim 8, determining the weighting adjustment factor for the first image element is based at least on a gain adjustment between the calibration image and the body-part relief print image.
10. The method of claim 9, determining the weight adjustment factor for the first image element further comprises applying a weight adjustment function to the gain adjustment.
11. The method of claim 1, further comprising identifying said second image element brightness value, wherein said second image element brightness value comprises a chrominance value for said second image element in said body-part relief print image.
12. The method of claim 11, wherein a position of a first image element in the calibration image corresponds to a same image position as a position of a second image element in the body-part relief print image.
13. The method of claim 1, further comprising applying the adjusted brightness level to the second image element, thereby resulting in an adjusted body-part relief print image.
14. A system for identifying an adjusted brightness level of an image element representing at least a portion of an image of a body-part relief print, the system comprising:
a weighting factor component configured to determine a first calibration weighting factor for a first image element included in a calibration image;
a body-part relief print weighting value component operably coupled with the weighting factor component and configured to determine a body-part relief print weighting value for a second image element included in a body-part relief print image based at least on a combination of a first calibration weighting factor and a second image element brightness value for the second image element, the second image element in the body-part relief print image corresponding to the first image element in the calibration image; and
an adjusted brightness level component operably coupled with the body-part relief print weighting value component and configured to determine an adjusted brightness level for a second image element in a body-part relief print image based at least on a combination of a body-part relief print weighting value and a second image element brightness value, at least a portion of the system being implemented at least in part via a processor.
15. The system of claim 14, comprising a body-part relief print image capture device configured to capture one or more of:
a calibration image comprising an image of at least a portion of a body-part relief print capture device calibration standard; and
a body-part relief print image comprising an image of a body-part relief print from at least a portion of a body, wherein the portion of a body comprises one or more of:
a finger; and
a hand.
16. The system of claim 14, further comprising a brightness value identification component configured to perform one or more of:
identifying a first image element brightness value for the first image element, wherein the first image element brightness value comprises a chrominance value;
identifying the second image element brightness value for the second image element, wherein the second image element brightness value comprises a chroma value for the second image element in the body-part relief print image;
identifying a calibration image luminance value comprising a first desired value from a first set, wherein the first set comprises one or more image element luminance values, each of the one or more image element luminance values corresponding to an image element in the calibration image; and
identifying a second image brightness value comprising a second desired value from a second set, wherein the second set comprises one or more image element brightness values each corresponding to an image element in at least a portion of the body-part relief print image.
17. The system of claim 14,
comprising a weighting factor adjustment component configured to adjust a first calibration weighting factor for a first image element using a weighting adjustment factor resulting in a second calibration weighting factor for the first image element; and
the body-part relief print weighting value component is configured to determine the body-part relief print weighting value by combining the second calibration weighting factor with the second image element brightness value.
18. The system of claim 17, further comprising a weighting adjustment factor determination component configured to determine the weighting adjustment factor by applying a detected gain adjustment between the calibration image and the body-part relief print image to a desired weighting adjustment function.
19. The system of claim 14, further comprising an adjusted body-part relief print image component configured to apply the adjusted brightness level to the second image element to obtain an adjusted body-part relief print image.
20. A computer non-transitory storage medium comprising computer-executable instructions that, when executed via a processor, perform a method for identifying an adjusted brightness level of an image element representing at least a portion of an image of a body-part relief print, the method comprising:
capturing a calibration image using a body-part relief print capture device standard in combination with a body-part relief print capture device;
determining a first calibration weighting factor for a first image element comprised in the calibration image, wherein the determining the first calibration weighting factor for the first image element comprises combining the identified calibration image luminance value for the calibration image with a first image element luminance value for the first image element;
capturing a body-part relief print image using the body-part relief print capture device, wherein the body-part relief print image comprises an image of a body-part relief print from at least a portion of a body;
adjusting the first calibration weighting factor for the first image element using a weighting adjustment factor, wherein the weighting adjustment factor is based at least on a gain adjustment between the calibration image and the body-part relief print image, resulting in a second weighting factor for the first image element;
determining a body-part relief print weighting value based at least on a combination of the second weighting factor and a second image element brightness value identified for a second image element included in the body-part relief print image; and
determining the adjusted brightness level for the second image element based at least on a combination of the body-part relief print weighting value and the second image element brightness value, at least a portion of the method being implemented at least in part via a processor.
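Taken together, the method of claim 20 amounts to a per-element, flat-field-style brightness correction. A minimal end-to-end sketch for a single image element, in which every "combination" left open by the claim is assumed to be a ratio or product, and the rescaling and clamp to an 8-bit range are added illustrative details:

```python
def adjusted_brightness_level(cal_image_luminance, cal_element_luminance,
                              print_element_luminance, weighting_adjustment):
    """Compute the adjusted brightness level for one image element.

    cal_image_luminance     -- desired value identified for the calibration image
    cal_element_luminance   -- the element's luminance in the calibration image
    print_element_luminance -- the element's luminance in the print image
    weighting_adjustment    -- gain-based weighting adjustment factor
    """
    # First calibration weighting factor: combine the calibration-image
    # luminance with the element's own luminance (ratio assumed).
    first_weight = cal_image_luminance / cal_element_luminance
    # Second weighting factor after the gain-based adjustment.
    second_weight = first_weight * weighting_adjustment
    # Body-part relief print weighting value: combine the adjusted
    # weight with the element's print-image brightness (brightness
    # normalised to [0, 1] before weighting; product assumed).
    weighting_value = second_weight * (print_element_luminance / 255.0)
    # Adjusted brightness level, rescaled and clamped to an 8-bit range.
    return max(0, min(255, round(weighting_value * 255.0)))

# Example: an element the calibration image shows as dimmer than the
# calibration-image desired value is brightened proportionally.
level = adjusted_brightness_level(128.0, 100.0, 90.0, 1.0)  # 115
```

Under these assumptions the correction reduces to scaling each print-image brightness by the element's calibration ratio, which is one plausible reading of the claimed combinations, not the patent's definitive formula.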
HK15108078.1A 2012-07-25 2013-07-17 Image element brightness adjustment HK1207725B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/557,349 US8824792B2 (en) 2012-07-25 2012-07-25 Image element brightness adjustment
US13/557,349 2012-07-25
PCT/US2013/050852 WO2014018331A1 (en) 2012-07-25 2013-07-17 Image element brightness adjustment

Publications (2)

Publication Number Publication Date
HK1207725A1 HK1207725A1 (en) 2016-02-05
HK1207725B true HK1207725B (en) 2018-09-21

Similar Documents

Publication Publication Date Title
US8824792B2 (en) Image element brightness adjustment
US11107210B2 (en) Image data generating apparatus generating image data for inspecting external appearance of product
WO2020063111A1 (en) Print detection method and apparatus, print image compensation method and apparatus, and electronic apparatus
CN110009607B (en) Display screen dead pixel detection method and device, computer equipment and storage medium
US8705134B2 (en) Method of processing an image to clarify text in the image
CN108596908B (en) LED display screen detection method and device and terminal
JP2008250950A5 (en)
CN114820683B (en) Data processing method and device, computer equipment and storage medium
CN107018407A (en) Information processor, evaluation figure, evaluation system and method for evaluating performance
US20220130016A1 (en) Optical imaging processing method and storage medium
CN118397664B (en) Method and device for detecting fingerprint residues of shell
JP4964849B2 (en) Image processing apparatus, image processing program, computer-readable recording medium, electronic apparatus, and image processing method
JP2021196451A (en) Image converter, image conversion method, and computer program for image conversion
WO2024179474A1 (en) Fisheye image processing method, electronic device, and storage medium
HK1207725B (en) Image element brightness adjustment
JP6974791B2 (en) Image processing equipment and computer programs
CN107430692A (en) The computation levels computing calculated for computer vision
JP7478628B2 (en) Image processing device, control method, and control program
CN112052700A (en) Image binarization threshold matrix determination and graphic code information identification method and device
JP4935561B2 (en) Image processing device
CN109712126B (en) Picture identification method and device
CN114120876A (en) Color spot repairing method, display panel, electronic device and computer-readable storage medium
US20220178681A1 (en) Identification method, projection method and identification system
CN107392205B (en) Code value table generation method and device of remote controller
JP6905210B2 (en) Image processing equipment and computer programs