GB2356961A - Biometrics system - Google Patents


Info

Publication number
GB2356961A
GB2356961A (application GB0029202A)
Authority
GB
United Kingdom
Prior art keywords
biometric
biometrics
controlled change
fingerprint
resultant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0029202A
Other versions
GB0029202D0 (en)
Inventor
Rudolf Maarten Bolle
Chitra Dorai
Nalini K Ratha
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of GB0029202D0 (en)
Publication of GB2356961A (en)

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/30Writer recognition; Reading and verifying signatures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A resultant biometric is a combination of a traditional biometric and some controlled change to the traditional biometric. Resultant finger- or palm prints, for example, are consecutive print images where the subject exerts force, torque and/or rolling (the controlled change) over an image acquisition interval of time. The physical way the subject distorts the images is the behavioural part of the resultant biometric; the finger or palm print is the physiological part. An undistorted print image, in combination with an expression of the distortion trajectory (which can be computed from the sequence of distorted print images), forms a more compact representation of the resultant fingerprint. A template representing the resultant print biometric is derived from the traditional template representing the finger- or palm print plus a template representing the trajectory. Other traditional biometrics also lend themselves to temporal modification.

Description

BIOMETRICS SYSTEM AND METHOD

This invention relates to the field of biometrics, i.e., physiological or behavioural characteristics of a subject that more or less uniquely relate to the subject's identity.
Fingerprints have been used for identifying persons in a semi-automatic fashion for at least fifty years for law enforcement purposes, and have been used for several decades in automatic authentication applications for access control such as building access and computer login. Signature recognition for automatically authenticating a person's identity has been used for at least fifteen years, mainly for banking applications. In an automatic fingerprint or signature identification system, the first stage is the signal acquisition stage, where a subject's fingerprint or signature is acquired. There are several techniques to acquire fingerprints, including scanning an inked fingerprint and inkless techniques using optical, capacitive and other semiconductor-based sensing techniques. The acquired signal is processed and matched against a stored template that is a machine representation of the fingerprint. The image processing techniques typically locate ridges and valleys in the fingerprint and derive templates from the ridge and valley pattern of a fingerprint image.
Signatures, on the other hand, are typically sensed through the use of pressure sensitive writing pads or with electro-magnetic writing recording devices. More advanced systems use special pens that compute the pen's velocity and acceleration. The recorded signal can be simply a list of (x, y) coordinates, in the case of static signature recognition, or the coordinates can be a function of time (x(t), y(t)) for dynamic signature recognition. The template representing a signature is more directly related to the acquired signal than a fingerprint template is. An example is a representation of a signature in terms of a set of strokes between extremes, where for each stroke the acceleration is encoded. For examples of signature authentication see V. S. Nalwa, "Automatic on-line signature verification," Proceedings of IEEE, pp. 215-239, Feb. 1997.
Recently, biometrics such as fingerprint, signature, face, and voice are increasingly being used for authenticating a user's identity, for example, for access to medical dossiers, ATM access, access to Internet services and other such applications.
With the rapid growth of the Internet, many new e-commerce and e-business applications are being developed and deployed. For example, retail purchasing and travel reservations over the Internet using a credit card are very common commercial applications. Today, users are recognized with a userID and password, for identification and authentication, respectively.
Very soon, more secure methods for authentication and possibly identification involving biometrics, such as fingerprints, signatures, voice prints, iris images and face images, will be replacing these simple methods of identification. An automated biometrics system involves acquisition of a signal from the user that more or less uniquely identifies the user. For example, for fingerprint-based authentication a user's fingerprint needs to be scanned and some representation needs to be computed and stored. Authentication is then achieved by comparing the representation extracted from the user's newly acquired fingerprint image with a stored representation extracted from an image acquired at the time of enrolment. In a speaker verification system a user's speech signal is recorded and some representations computed and stored. Authentication is then achieved by comparing the representation extracted from a speech signal recorded at access or logon time with the stored representation.
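The enrol-then-compare step described above can be sketched in a few lines; this is a minimal illustration that assumes, hypothetically, that templates are plain feature vectors compared by cosine similarity (real matchers use much richer representations):

```python
import math

def similarity(enrolled, fresh):
    """Cosine similarity between two illustrative template vectors."""
    dot = sum(a * b for a, b in zip(enrolled, fresh))
    norm = math.sqrt(sum(a * a for a in enrolled)) * math.sqrt(sum(b * b for b in fresh))
    return dot / norm if norm else 0.0

def authenticate(enrolled, fresh, threshold=0.9):
    """Accept only if the freshly extracted template is close to the enrolled one."""
    return similarity(enrolled, fresh) >= threshold

enrolled = [0.2, 0.8, 0.5, 0.1]    # stored at enrolment time
fresh = [0.21, 0.79, 0.52, 0.1]    # extracted at access or logon time
print(authenticate(enrolled, fresh))  # True
```

The threshold of 0.9 is arbitrary here; in practice it is tuned to trade off false accepts against false rejects.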
Similarly, for signature verification, a template is extracted from the digitized signature and compared to previously computed templates.
Biometrics are distinguished into two broad groups: behavioural and physiological biometrics. Physiological biometrics are the ones that are relatively constant over time, such as fingerprint and iris. Behavioural biometrics, on the other hand, are subject to possibly gradual change over time and/or more abrupt changes in short periods of time. Examples of these biometrics are signature, voice and face. (Face is often regarded as a physiological biometric since the basic features cannot be changed that easily; however, ageing, haircuts, beard growth and facial expressions do change the global appearance of a face.)

One of the main advantages of Internet-based commerce/business solutions is that they are accessible from remote, unattended locations, including users' homes. Hence, the biometrics signal has to be acquired from a remote user in an unsupervised manner. So, a fingerprint or a palm-print reader, a signature digitiser or a camera for acquiring face or iris images is attached to the user's home computer. This, of course, opens up the possibility of fraudulent unauthorised system access attempts. Maliciously intended individuals or organisations may obtain biometrics signals from genuine users by intercepting them from the network or obtaining the signals from other applications where the user uses her/his biometrics. The recorded signals can then be reused for unknown, fraudulent purposes, such as to impersonate a genuine, registered user of an Internet service. The simplest method is that a signal is acquired once and reused several times.
Perturbations can be added to this previously acquired signal to generate a biometrics signal that looks "fresh." If the complete fingerprint or palm print is known to the perpetrator, a more sophisticated method would be to fabricate, from materials like silicone or latex, an artificial ("spoof") three-dimensional copy of the finger or palm. Finger and palm print images of genuine users can then be produced by impostors without much effort. A transaction server, an authentication server or some other computing device then has the burden of ensuring that the biometrics signal transmitted from a client is a current and live signal, and not a previously acquired or otherwise constructed or obtained signal. Using artificial body parts, many fingerprint and palm-print readers produce images that look very authentic to a lay person when the right material is used to fabricate these body parts. The images will, in many cases, also appear real to the component image processing parts of the authentication systems. Hence, it is very difficult to determine whether static fingerprint or palm-print images are produced by a real finger or palm or by spoof copies. Other physiological biometrics suffer from the same limitations: the iris of a genuine user can be photographed and used for unauthorised access. A good iris recognition system detects the rapid fluctuations of the iris diameter, but even that phenomenon can be mimicked with iris image sequences. Similar methods can, of course, be used for the face biometric too.
As discussed, fingerprints and, to a lesser extent, palm prints are used more and more for authenticating a user's identity for access to medical dossiers, ATM access and other such applications. A problem with this prior art method of identification is that it is possible to fabricate three-dimensional spoof fingerprints or palm prints. Silicone, latex, urethane and other materials can be used to fabricate these artificial body parts, and many image acquisition devices simply produce a realistic-looking impression of the ridges on the artificial body parts which is hard to distinguish from a real impression. A contributing factor is that a fingerprint or palm-print impression obtained is the static depiction of the print at some given instant in time. The fingerprint is not a function of time. Similar problems exist with biometrics like face and iris. A problem here is that static two-dimensional or three-dimensional spoof copies of the biometrics can be fabricated and used to spoof biometric security systems, since these biometrics are not functions of time.
Another problem with the prior art is that only one static fingerprint or palm-print image is collected during acquisition of the biometrics signal.
This instant image may be a distorted depiction of the ridges and valleys on the finger or palm because the user exerts force or torque with the finger with respect to the image acquisition device (fingerprint or palm-print reader). A problem is that, without collecting more than one image or modifying the mechanics of the sensor, it cannot be detected whether the image is acquired without distortion. An additional problem with the prior art is that there is only one choice for the image that can be used for person identification. Of course, for non-contact biometrics, like face and iris, distortion of the biometrics pattern cannot be detected or is very hard to detect.
The method presented in US patent 5933515 uses multiple fingers in a sequence which the user remembers and which is known only to the user. If the fingers are indexed, say, from left to right as finger 0 through finger 9, the sequence is nothing more than a PIN. If one were to consider the sequence plus the fingerprint images as a single biometric, the sequence is a changeable and non-static part of the biometric. However, it is not a series of controlled changes to an existing biometric: the pattern of each of the fingers is not changed, but rather the pattern of the sequence can be changed. A problem is that anyone can watch the fingerprint sequence, probably more easily than observing PIN entry, because fingerprint entry is a slower process. Moreover, it requires storing each of the fingerprints of the subject for comparison.
Another problem with the prior art is that in order to assure authenticity of the biometrics signal, the sensor (fingerprint or palm-print reader, face imaging camera) needs to have embedded computational resources for body part authentication and sensor authentication. Body part authentication is commonly achieved by pulse and body temperature measurement. Sensor authentication can be achieved with two-directional communication between the sensor and the authentication server in the form of a challenge and response question session.
A potentially big problem with prior art palm and fingerprints is that if the user somehow loses a fingerprint or palm-print impression, or the template representing the print, and this ends up in the wrong hands, the print is compromised forever, since one cannot change prints. Prints of other fingers can then be used, but that can only be done a few more times.
A problem with prior art systems that use static fingerprints is that there is no additional information associated with the fingerprint that can provide additional discriminating power. That is, individuals whose fingerprints are close in appearance can be confused, because the fingerprints are static and no additional information is available to distinguish between these prints.
Traditional fingerprint databases may be searched by first filtering on fingerprint type (loop, whorl, ...). A problem with this prior art is that there are few fingerprint classes, because the fingerprint images are static snapshots in time and no additional information is associated with the fingerprints.
A final problem with any of the prior art biometrics is that they are not backward compatible with other biometrics. The use of, say, faces for authentication is not backward compatible with fingerprint databases.
An object of the invention is a new type of biometric which is produced by a subject through a series of controlled changes to an existing biometric.
According to the present invention there is provided a biometrics system comprising: an acquisition device for acquiring at least one biometric from a subject over a time period along with a controlled change of the biometric; and a storage process for storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change over a time period being a resultant biometric.
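The storage process of this claim can be sketched as a simple record pairing the biometric with its controlled change; the names `ResultantBiometric`, `enrol` and `store` below are illustrative assumptions, not part of the patent, and the feature-vector and sampled-trajectory representations are stand-ins:

```python
from dataclasses import dataclass

@dataclass
class ResultantBiometric:
    """Hypothetical record pairing a static biometric template with the
    controlled-change trajectory recorded over the acquisition interval."""
    static_template: list   # e.g. fingerprint features (illustrative)
    trajectory: list        # e.g. sampled action a(t) over [0, T]

store = {}

def enrol(subject_id, static_template, trajectory):
    """Storage process: store and associate the biometric and its controlled change."""
    store[subject_id] = ResultantBiometric(static_template, trajectory)

enrol("alice", [0.2, 0.8, 0.5], [0.0, 10.0, -15.0, 0.0])
print(store["alice"].trajectory)  # [0.0, 10.0, -15.0, 0.0]
```

The key point the sketch captures is that the two parts are stored together: the combined pair, not the static template alone, is the resultant biometric.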
Embodiments of the invention will now be described, by way of example, with reference to the following drawings, in which:
Figure 1 gives prior art examples of traditional biometrics.
Figure 2 shows a block diagram of an automated biometrics system for authentication (Figure 2A) and a block diagram of an automated biometrics system for identification (Figure 2B).
Figure 3 shows various possibilities for combining two biometrics at the system level, where Fig. 3A is combining through an ANDing component, Fig. 3B is combining through an ORing component, Fig. 3C is combining through ADDing and Fig. 3D is sequential combining.
Figure 4 is a generic block diagram conceptually showing the combining of one biometric with a user action (another biometric) to obtain a biometric that is modified through a series of user-controlled changes.
Figure 5 is an example of a resultant fingerprint biometrics where the user can rotate the finger on the scanner according to a pattern.
Figure 6 is an example of a resultant fingerprint biometrics where the user has four degrees of freedom to move the finger on the scanner.
Figure 7 is an example of a resultant palm-print image sequence generation.
Figure 8 is an example of a resultant face image sequence generation.
Figure 9 shows a block diagram of the behavioural component extraction of a resultant biometric. As an example, the rotation extraction of resultant rotated fingerprints of Fig. 5 is shown.
Figure 10 shows the local flow computation on a block by block basis from the input resultant fingerprint image sequence.
Figure 11 explains the computation of the curl or the spin of the finger as a function of time, which is the behavioural component of the resultant fingerprint.
Traditional biometrics, such as fingerprints, have been used for (automatic) authentication and identification purposes for several decades.
Signatures have been accepted as a legally binding proof of identity, and automated signature authentication/verification methods have been available for at least 20 years. Figure 1 gives examples of these biometrics. On the top-left, a signature 110 is shown and on the top-right a fingerprint impression 130 is shown. The bottom-left shows a voice print 120; the bottom-right an iris pattern 140.
Biometrics can be used for automatic authentication or identification of a human subject. Typically, the subject is enrolled by offering a sample biometric when opening, say, a bank account or subscribing to an Internet service. From this sample biometric, a template is derived that is stored and used for matching purposes at the time the user wishes to access the account or service. In the present embodiment, a template for a resultant biometric is a combination of a traditional template of the biometric and a template describing the changing appearance of this biometric over time.
Resultant fingerprints and palm prints are described in further detail. A finger- or palm print template is derived from a selected impression in the sequence where there is no force, torque or rolling exerted. The template of the trajectory is a quantitative description of this motion trajectory over the period of time of the resultant fingerprint. Matching of two templates, in turn, is a combination of traditional matching of fingerprint templates plus dynamic string matching of the trajectories, similar to signature matching. This string matching is well known in the prior art.
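The trajectory-matching step can be illustrated with a standard dynamic time warping (DTW) distance, one common form of the dynamic string matching mentioned above; this is a generic textbook sketch, not the patent's specific matcher, and the angle sequences are invented examples:

```python
def dtw_distance(seq_a, seq_b):
    """Classic dynamic time warping distance between two 1-D trajectories,
    used here to compare controlled-change signals such as rotation angles."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Two recordings of the same rotation pattern, one slightly time-shifted:
print(dtw_distance([0, 10, 20, 10, 0], [0, 0, 10, 20, 10, 0]))  # 0.0
# The same pattern mirrored, i.e. a different behavioural trajectory:
print(dtw_distance([0, 10, 20, 10, 0], [0, -10, -20, -10, 0]))  # large
```

DTW tolerates the small timing variations expected when a subject repeats the same motion, which is why it is a natural fit for the behavioural component.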
Resultant fingerprints sensed while the user only exerts torque are described in greater detail.
A biometric more or less uniquely determines a person's identity; that is, given a biometric signal, the signal is either associated with one unique person or narrows down significantly the list of people with whom this biometric is associated. Fingerprints are an excellent biometric, since never in history have two people with the same fingerprints been found; on the other hand, biometrics signals such as shoe size and body weight are poor biometrics signals, since these signals obviously have little discriminatory value. Biometrics can be divided into behavioural biometrics and physiological biometrics. Behavioural biometrics depend on a person's physical and mental state and are subject to change, possibly rapid change, over time. Behavioural biometrics include signatures 110 and voice prints 120 (see Fig. 1). Physiological biometrics, on the other hand, are subject to much less variability. Physiological biometrics include fingerprints 130 and iris images 140. For a fingerprint 130, the basic flow structure of ridges and valleys is essentially unchanged over a person's life span. The circular texture of a subject's iris 140 is believed to be even less variable over a subject's life span. Hence, there exist behavioural biometrics, e.g., 110 and 120, which to a certain extent are under the control of the subject, and physiological biometrics whose appearance cannot be influenced (the iris 140) or can be influenced very little (the fingerprint 130).
Referring now to Fig. 2A, a typical prior-art automatic fingerprint authentication system 200 has a fingerprint image (biometrics signal) as input 210 to the biometrics matching system. This system comprises a signal processing stage 215 for feature extraction, a template extraction stage 220 for extracting a template from the features and a template matching stage 225. Along with the biometrics signal 210, an identifier 212 of the subject is input to the matching system. During the template matching stage 225, the template associated with this particular identifier is retrieved from a database of templates 230 indexed by identifier. Depending on whether there is a Match/No Match between the extracted template 220 and the template retrieved from database 230, a 'Yes/No' answer 240 is output.
Matching is typically based on a similarity measure; if the measure is significantly large, the answer is 'Yes', otherwise the answer is 'No.' The following reference describes examples of the state of the prior art:
N. K. Ratha, S. Chen and A. K. Jain, "Adaptive flow orientation based feature extraction in fingerprint images," Pattern Recognition, vol. 28, no. 11, pp. 1657-1672, Nov. 1995.
Note that system 200 is not limited to fingerprint authentication; this system architecture is valid for any biometric. The biometric signal 210 that is input to the system can be acquired either locally to the application on the client, or remotely with the matching application running on some server. Hence the architecture of system 200 applies to all biometrics and networked or non-networked applications.
System 200 in Fig. 2A is an authentication system; system 250 in Fig. 2B is an identification system. System 250 consists of the same three stages 215, 220 and 225. However, in the case of the identification system 250, only a biometric signal 210 is input to the system. During the template matching stage 225, the extracted template is matched to all (template, identifier) pairs stored in database 230. If there exists a match between the extracted template 220 and a template associated with an identity in database 230, this identity is the output 255 of the identification system 250. If no match can be found in database 230, the output identity 255 could be set to NIL. Again, the biometric signal 210 can be acquired either locally to the application on the client, or remotely with the matching application running on some server. Hence architecture 250 applies to networked or non-networked applications.
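The identification loop just described can be sketched as follows, again assuming toy vector templates compared by cosine similarity; returning `None` stands in for the NIL identity, and all names and thresholds are illustrative:

```python
import math

def similarity(a, b):
    """Cosine similarity between two illustrative template vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(fresh, database, threshold=0.9):
    """Match the extracted template against every (identifier, template) pair
    in the database; return the best-matching identity, or None (NIL)."""
    best_id, best_score = None, threshold
    for identifier, stored in database.items():
        score = similarity(stored, fresh)
        if score >= best_score:
            best_id, best_score = identifier, score
    return best_id

db = {"alice": [0.2, 0.8, 0.5], "bob": [0.9, 0.1, 0.3]}
print(identify([0.21, 0.79, 0.5], db))  # alice
print(identify([0.0, 0.0, 1.0], db))    # None
```

Unlike authentication, no identifier is supplied with the signal; the loop over the whole database is what distinguishes system 250 from system 200.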
Biometric signals can be combined (integrated) at the system level and at the subject level. The latter is the object of this invention. The former is summarised in Fig. 3 for the purposes of comparing the different methods and for designing decision methods for integrated subject-level biometrics (resultant biometrics). Four possibilities for combining (integrating) two biometrics are shown: combining through ANDing 210 (Fig. 3A), combining through ORing 220 (Fig. 3B), combining through ADDing 230 (Fig. 3C) and serial or sequential combining 240 (Fig. 3D). Two biometrics Bx (250) and By (260) of a subject Z are used for authentication as shown in Fig. 3.
However, more than two biometrics of a subject can be combined in a straightforward fashion. These biometrics can be the same, e.g., two fingerprints, or they can be different biometrics, e.g., fingerprint and signature. The corresponding matchers for the biometrics Bx and By are matcher A 202 and matcher B 204 in Fig. 3, respectively. These matchers compare the template of the input biometrics 250 and 260 with stored templates and either give a 'Yes/No' answer 214, as in systems 210 and 220, or score values S1 (231) and S2 (233), as in systems 230 and 240.
System 210, combining through ANDing, takes the two 'Yes/No' answers of matcher A 202 and matcher B 204 and combines the result through the AND gate 212. Hence, only if both matchers 202 and 204 agree is the 'Yes/No' output 216 of system 210 'Yes' (the biometrics both match and subject Z is authenticated); otherwise the output 216 is 'No' (one or both of the biometrics do not match and subject Z is rejected). System 220, combining through ORing, takes the two 'Yes/No' answers of matchers A 202 and B 204 and combines the result through the OR gate 222. Hence, if one of the matchers' 202 and 204 'Yes/No' outputs 214 is 'Yes,' the 'Yes/No' output 216 of system 220 is 'Yes' (one or both of the biometrics match and subject Z is authenticated). Only if both 'Yes/No' outputs 214 of the matchers 202 and 204 are 'No' is the 'Yes/No' output 216 of system 220 'No' (both biometrics do not match and subject Z is rejected).
For system 230, combining through ADDing, matcher A 202 and matcher B 204 produce matching scores S1 (231) and S2 (233), respectively. Score S1 expresses how similar the template extracted from biometric Bx (250) is to the template stored in matcher A 202, while score S2 expresses how similar the template extracted from biometric By (260) is to the template stored in matcher B 204. The ADDer 232 gives as output the sum of the scores 231 and 233, S1 + S2. In 234, this sum is compared to a decision threshold T; if S1 + S2 > T (236), the output is 'Yes' and the subject Z with biometrics Bx and By is authenticated; otherwise the output is 'No' (238) and the subject is rejected.
System 240 in Fig. 3 combines the biometrics Bx (250) and By (260) of a subject Z sequentially. The first biometric Bx (250) is matched against the template stored in matcher A (202), resulting in matching score S1 (231). The resulting matching score is compared to threshold T1 (244), and when test 244 fails and the output is 'No' (238), the subject Z is rejected.
Otherwise biometric By (260) is matched against the template stored in matcher B (204). The output score S2 (233) of this matcher is compared to threshold T2 (246). If the output is 'Yes,' i.e., S2 > T2 (236), subject Z is authenticated. Otherwise, when the output is 'No' (238), subject Z is rejected.
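The four combination schemes of Fig. 3 can be summarised in a few lines of code; the scores and thresholds below are invented examples, and `combine_sequential` takes the second matcher as a callable so that, as in system 240, it is only evaluated when the first test passes:

```python
def combine_and(match_a, match_b):
    """Fig. 3A: authenticate only when both matchers answer 'Yes'."""
    return match_a and match_b

def combine_or(match_a, match_b):
    """Fig. 3B: authenticate when either matcher answers 'Yes'."""
    return match_a or match_b

def combine_add(s1, s2, t):
    """Fig. 3C: score-level fusion; authenticate when S1 + S2 > T."""
    return s1 + s2 > t

def combine_sequential(s1, t1, second_matcher, t2):
    """Fig. 3D: test Bx first; matcher B runs only if the first test passes."""
    if not s1 > t1:
        return False  # early rejection; the second biometric is never evaluated
    return second_matcher() > t2

print(combine_add(0.7, 0.6, 1.0))                      # True
print(combine_sequential(0.4, 0.5, lambda: 0.8, 0.6))  # False (rejected at step 1)
```

Note the trade-off the figure implies: ANDing lowers false accepts, ORing lowers false rejects, ADDing lets a strong score compensate for a weak one, and sequential combining saves the cost of the second matcher on early rejection.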
Figure 4 is a generic flow diagram for combining a biometric with a user action, i.e., combining biometrics at the subject level. The user action, just like the movement of a pen to produce a signature, is the second, behavioural biometric. The user 410 offers a traditional biometric 420 for authentication or identification purposes. Such a biometric could be a fingerprint, iris or face. However, rather than holding the biometric still, as in the case of fingerprint or face, or keeping the eyes open, as in the case of iris recognition, the user performs some specific action 430, a(t), with the biometric. This action is performed over time 432, from time 0 (434) to some time T (436). Hence, the action a(t) is some one-dimensional function of time 430 and acts upon the traditional biometric 420. Note that this biometric is the actual biometric of user 410 and not a machine-readable biometrics signal (i.e., in the case of fingerprints, it is the three-dimensional finger with the print on it). It is specified what the constraints of the action 430 are, but within these constraints the user 410 can define the action. (For example, constraints for putting a signature are that the user can move the pen over the paper in the x- and y-directions but cannot move the pen in the z-direction.) That is, the action 430 in some sense transforms the biometric of the user over time. It is this transformed biometric 450 that is input to the biometric signal recording device 460. The output 470 of this device is a sequence (series) of individually transformed biometrics signals B(t) 480 from time 0 (434) to some time T (436). In the case of fingerprints, these are fingerprint images; in the case of face, these are face images. This output sequence 470 is the input 485 to some extraction algorithm 490. The extraction algorithm computes from the sequence of transformed biometrics the pair (a'(t), B) 495, which is itself a biometric.
The function a'(t) is some behavioural way of transforming biometric B over a time interval [0, T] and is related to the function a(t), which is chosen by the user (very much like a user would select a signature). The biometric B can be computed from the pair (a'(t), B); that is, where a(t) 430 is zero, where there is no action of the user, the output 470 is an undistorted digitisation of biometric 420. In general, it can be computed where in the signal 480 the biometric 420 is not distorted.
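This recovery of the static component from the zero-action instants can be sketched as follows; the frames are stand-in objects, the trajectory is assumed to have been extracted already, and the helper name `extract_resultant` is purely illustrative:

```python
def extract_resultant(frames, trajectory):
    """Recover the pair (a'(t), B): the trajectory is the behavioural
    component, and the static component B is taken from a frame at which
    the recovered action is zero, i.e. where the biometric is undistorted."""
    for frame, action in zip(frames, trajectory):
        if action == 0.0:
            return trajectory, frame
    raise ValueError("no undistorted frame in the sequence")

frames = ["B0", "B1", "B2", "B3"]     # stand-ins for captured images B(t)
trajectory = [5.0, 0.0, -7.5, 0.0]    # recovered action a'(t), sampled
a_prime, B = extract_resultant(frames, trajectory)
print(B)  # B1 — the first frame where the action is zero
```

A real implementation would pick the least-distorted frame by a quality measure rather than an exact zero, but the idea is the same: the undistorted B is read off where the action vanishes.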
Refer now to Fig. 5. This figure is an example of a resultant fingerprint biometric where the user can rotate the finger on the fingerprint reader 510 (without sliding over the glass platen). This rotation can be performed according to some user-defined angle α as a function of time α(t). An example of producing a resultant fingerprint is given in Fig. 5. The user puts the finger 540 on the fingerprint reader 510 in hand position 520.
Then, from time 0 (434) to time T (436), the user rotates finger 540 over the glass platen of fingerprint reader 510 according to some angle α as a function of time α(t). The rotation takes place in the horizontal plane, the plane parallel to the glass platen of the fingerprint reader. The rotation function in this case is the behavioural part of the resultant fingerprint and is defined by the user. (If this portion of the resultant biometric is compromised, the user can redefine this behavioural part of the resultant fingerprint.) First the user rotates by angle 550 to the left, to the hand position 525. Then the user rotates by angle 555 to the right, resulting in final hand position 530. During this operation over the time interval [0, T], the fingerprint reader has as output 470 a series of transformed (distorted) fingerprint images. This output 470 is a sequence of transformed biometrics 480 (fingerprints), as in Fig. 4, which are the input to the extraction algorithm 490 (Fig. 4). This algorithm computes, given the output 470, the angle α as a function of time, α(t) 560, over the time interval 0 (434) to time T (436). The resultant fingerprint in this case is (α(t), F), with F the undistorted fingerprint image. The undistorted fingerprint image is found at times 434, 570 and 436, where the rotation angle α is zero. A preferred method for extracting the rotation angles from the distorted fingerprint images is described in Figs. 9-11.
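One simple way to estimate a per-frame rotation angle, assuming matched feature points (e.g. minutiae) between two frames, is to average the change in angle of each point about the rotation centre; the flow-based method of Figs. 9-11 is the patent's preferred approach, so this is only a hypothetical illustration:

```python
import math

def frame_rotation(pts_a, pts_b, center=(0.0, 0.0)):
    """Estimate the in-plane rotation (degrees) between two frames from
    matched feature points, averaging each point's angle change about the
    centre. Assumes the rotation is small enough to avoid angle wrap-around."""
    cx, cy = center
    deltas = [math.atan2(yb - cy, xb - cx) - math.atan2(ya - cy, xa - cx)
              for (xa, ya), (xb, yb) in zip(pts_a, pts_b)]
    return math.degrees(sum(deltas) / len(deltas))

# Two points rotated 10 degrees about the origin:
a = [(1.0, 0.0), (0.0, 1.0)]
r = math.radians(10.0)
b = [(math.cos(r), math.sin(r)), (-math.sin(r), math.cos(r))]
print(round(frame_rotation(a, b), 3))  # 10.0
```

Applied to consecutive frames of the output sequence and accumulated over time, such per-frame estimates yield the rotation trajectory that forms the behavioural component.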
Figure 6 is an example of a resultant fingerprint biometric where the user has four degrees of freedom, instead of the one degree of freedom in Fig. 5, to move the finger on the scanner. Again, as in Fig. 5, the user has the ability to rotate the finger 540 around the z-axis (the axis perpendicular to the glass platen of the fingerprint reader 510). This is depicted by rotation α, first along 550 to the left and then along 555 to the right.
This brings the hand position from 520 to 525 to the final hand position 530. Also, at any given angle α, the user can perform a rotation β 610 around the axis 620 of the finger. Finally, the user can exert a force f 630 parallel to the glass platen of fingerprint reader 510. This force can be constrained to lie only along the direction of the finger 632, or can be unconstrained 634. In the former case there are three degrees of freedom for moving the finger (without sliding over the glass platen); in the latter there are four. The angles α and β plus the force f can be combined and referred to as a motion m. During the user's operations over the time interval [0, T], the fingerprint reader has as output 470 a sequence of transformed (distorted) fingerprint images. This output 470 is a sequence of transformed biometrics 480 (fingerprints), as in Fig. 4, which are the input to an extraction algorithm 490 (Fig. 4). This algorithm computes, given the output 470, the angles α and β as functions of time over the interval 0 (434) to time T (436). For example, the function for the angle α, α(t), is the function 560 of Fig. 5. Moreover, the algorithm computes the force 630. In the case of a force constrained to be along the finger direction 632, f(t) will be a one-dimensional function. In the case that the force may be exerted along the glass platen of reader 510 in any direction, f(t) will be a two-dimensional function; the force will then have a component in the x-direction and a component in the y-direction. The resultant fingerprint for this case is (α(t), β(t), f(t), F) or (m(t), F), with F the undistorted fingerprint image. The undistorted fingerprint image is found at times 434, 570 and 436 where the reconstructed motion m (690) is zero.
Figure 7 gives an example of the same principle as fingerprints for palm prints. The palm print reader 710 with glass platen 720 can, for example, be mounted next to a door. Only authorised users with matching palm print templates will be allowed access. The user will put his/her hand 730 on the palm print reader platen 720. As with the resultant fingerprints of Figs. 5 and 6, the user will not keep the palm biometric still but rather make movements with the palm. In the case of Fig. 7, rotation of the palm around the axis perpendicular to the glass platen is the behavioural part of the resultant palm print biometric. The user could, for instance, rotate the hand to the right 740, followed by a rotation of the hand to the left 744, followed by a rotation of the hand to the right 748 again. As in Fig. 5, during these operations over some time interval [0, T], the palm print reader has as output a sequence of transformed (distorted) palm print images. This output is a sequence of transformed biometrics 480 (palm prints), as in Fig. 4, which are the input to an extraction algorithm 490, as in Fig. 4. The algorithm computes, given the output of palm print reader 710, the palm rotation angle α as a function of time α(t) 560 over the time interval 0 (434) to time T (436). The resultant palm print in this case will be (α(t), P), with P the undistorted palm print image. The undistorted palm print image is found at times 434, 570 and 436 where the rotation angle α is zero.
Figure 8 describes a facial resultant biometric. Here subject 800 is posing in front of a camera to be identified or authenticated through both recognition of the physiological face biometric plus an additional behavioural component. This behavioural component is introduced by head motion of the subject. This motion produces a sequence of face images as a function of time, Face-Image(t). When the subject's face is in canonical position, the head is embedded in coordinate system 805 with the Y-axis 810 along the length of the head. The X-axis 820 is parallel to the line connecting the ears, while the Z-axis 830 is perpendicular to the frontal plane of the face. The subject now can generate a resultant biometric, Face-Image(t), by panning the face around the Y-axis, resulting in a pan 840 as a function of time, Pan(t). The subject further can tilt 850 the face by bending the head in the plane 860 that is spanned by the Y-axis 810 and the pan direction. This tilting 850 results in another function of time, Tilt(t). Hence, the resultant biometric in this case is a face image at some time t0, Face-Image(t0), a frontal depiction of the face, plus the pan and tilt, Pan(t) and Tilt(t), respectively. The face images in the sequence are mathematical transformations of the image of the face. Other distortions of the face image through other means are envisioned by the present invention.
In Fig. 9, a block diagram 900 of a generic process for extracting the behavioural component from a resultant biometric is given. The input 910 is a sequence of biometric signals B(t). In block 920, subsequent biometric signals, B(t+1) and B(t), are processed through inter-signal analysis.
Block 930 uses this analysis to extract the change, α(t+1) - α(t), in the behavioural component. In turn, this gives the output α(t) as a function of time 940, where α(t) is the behavioural component of the resultant biometric B(t). Added in Fig. 9 are the specific steps (inter-image flow analysis and affine motion parameter estimation) for estimating the finger rotation from a sequence of distorted fingerprint images produced as in Fig. 5. These are further detailed in Figs. 10 and 11.
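The accumulation step of Fig. 9 can be sketched as follows. This is a minimal illustration (function names are ours, not the patent's) assuming the per-step changes α(t+1) - α(t) have already been produced by the inter-signal analysis of blocks 920 and 930:

```python
from itertools import accumulate

def reconstruct_behaviour(deltas):
    """Given the per-step changes a(t+1) - a(t) estimated by the
    inter-signal analysis, integrate them into the behavioural
    component a(t) over the whole interval, taking a(0) = 0."""
    return [0] + list(accumulate(deltas))

# e.g. the user rotates left, holds, then rotates back to zero:
deltas = [5, 5, 0, -5, -5]
print(reconstruct_behaviour(deltas))  # [0, 5, 10, 10, 5, 0]
```

The reconstructed function returns to zero exactly at the times where the biometric is undistorted, as described for Figs. 5-7.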
Rotation from one fingerprint image to the next can be estimated using the steps illustrated in Fig. 10. The images, B(t) and B(t+1), 1010 and 1015, are divided up into 16 × 16 blocks 1020, 1022, 1024, ..., 1028 as given by the MPEG compression standard. Given a fingerprint image sequence B(t), of which two images (1010 and 1015) are shown in Fig. 10, the inter-image flow (u, v) 1040 for each block (of size 16 × 16) 1030 present in an image is computed. This represents the motion that may be present in any image B(t) 1010 with respect to its immediately following image B(t+1) 1015 in the sequence. A flow characterisation [u(x,y), v(x,y)] 1050 as a function of (x, y) 1060 and t 1070 of an image sequence is then a uniform motion representation amenable to consistent interpretation. This motion representation 1050 can be computed from the raw motion vectors encoded in MPEG-1 or MPEG-2 image sequences. If the input is uncompressed, the flow field can be estimated using motion estimation techniques known in the prior art.
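For uncompressed input, one standard block-matching technique (exhaustive sum-of-absolute-differences search) can be sketched as follows. This is an illustration of the general method, not necessarily the estimator of the preferred embodiment; the block and search sizes are parameters, with 16 × 16 blocks as in the text:

```python
import numpy as np

def block_flow(img_t, img_t1, block=16, search=4):
    """Per-block inter-image flow (u, v) between consecutive frames,
    estimated by exhaustive sum-of-absolute-differences (SAD) search,
    in the spirit of MPEG-style 16 x 16 block motion estimation."""
    h, w = img_t.shape
    flow = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = img_t[by:by + block, bx:bx + block].astype(int)
            best_sad, best_uv = None, (0, 0)
            for v in range(-search, search + 1):      # vertical shift
                for u in range(-search, search + 1):  # horizontal shift
                    y, x = by + v, bx + u
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue  # candidate block falls outside the frame
                    cand = img_t1[y:y + block, x:x + block].astype(int)
                    sad = np.abs(ref - cand).sum()
                    if best_sad is None or sad < best_sad:
                        best_sad, best_uv = sad, (u, v)
            flow[(bx, by)] = best_uv  # (u, v) motion vector for this block
    return flow
```

For a frame pair in which the second image is the first shifted right by two pixels and down by one, the estimated flow of each interior block is (2, 1).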
The following references describe the state of the prior art in MPEG compression, an example of prior art optical flow estimation in image sequences, and an example of prior art for directly extracting flow from MPEG-compressed image sequences, respectively:
B.G. Haskell, A. Puri and A.N. Netravali, Digital Video: An Introduction to MPEG-2, Chapman and Hall, 1997.
J. Bergen, P. Anandan, K. Hanna and R. Hingorani, Hierarchical model-based motion estimation, Second European Conference on Computer Vision, pp. 237-252, 1992.
C. Dorai and V. Kobla, Extracting Motion Annotations from MPEG-2 Compressed Video for HDTV Content Management Applications, IEEE International Conference on Multimedia Computing and Systems, pp. 673-678, 1999.
Refer now to Fig. 11. By examining the flow [u(x,y), v(x,y)] 1050 in the blocks 1020, 1022, 1024, ..., 1028, the largest connected component of zero-motion blocks, pictured as the pivotal region 1110 in Fig. 11, is determined. Further analysis is performed on the flow around this region.
Using the flow computed for each image in the given image sequence, motion parameters from the fingerprint region are computed by imposing an affine motion model on the image-to-image flow and sampling the non-zero motion blocks radially around the bounding box 1120 of region 1110. The affine motion A 1130 can transform shape 1140 into shape 1145 in Fig. 11B and quantifies the translation 1150, rotation 1152 and shear 1154 due to image flow. Six parameters, a1, ..., a6, are estimated in this process, where a1 and a2 correspond to translation, a3 and a5 correspond to rotation, and a4 and a6 correspond to shear. These parameters are estimated for each sampling around bounding box 1120. The average curl, C(t) = a3 + a5, is computed in each frame t. The curl in each frame quantitatively provides the extent of rotation, or the spin of the finger skin around the pivotal region. That is, an expression C(t) of the behavioural component of the resultant fingerprint, computed from the flow vectors [u(x,y), v(x,y)] 1050, is obtained.
The magnitude of the average translation vector of the frame, T(t) = (a1, a2), is also computed.
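As a hedged illustration of this step, the affine fit and the curl and translation summaries might be computed as follows. The parameter numbering and sign conventions below are a common textbook choice and are illustrative only; they need not coincide with the a1, ..., a6 indexing used above:

```python
import numpy as np

def fit_affine_flow(points, flow):
    """Least-squares fit of the affine motion model
        u = a1 + a3*x + a4*y,    v = a2 + a5*x + a6*y
    to sampled flow vectors.  Returns the translation magnitude
    |(a1, a2)| and the curl dv/dx - du/dy = a5 - a4, which measures
    the rotational component (spin) of the flow."""
    A, b = [], []
    for (x, y), (u, v) in zip(points, flow):
        A.append([1, 0, x, y, 0, 0]); b.append(u)  # equation for u
        A.append([0, 1, 0, 0, x, y]); b.append(v)  # equation for v
    a1, a2, a3, a4, a5, a6 = np.linalg.lstsq(
        np.array(A, dtype=float), np.array(b, dtype=float), rcond=None)[0]
    return np.hypot(a1, a2), a5 - a4
```

For a small pure rotation by angle θ about the pivotal region, the flow is u = -θy, v = θx, so the fitted curl is 2θ and the translation magnitude is zero; a frame with both quantities near zero is a candidate undistorted fingerprint image.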
For all the resultant biometrics discussed and envisioned, we have a traditional behavioural or physiological biometric. For representation (template) purposes and for matching purposes of that part of the resultant biometric, these traditional biometrics are well understood in the prior art. (See the above Ratha, Chen and Jain reference for fingerprints.) For the other part of the resultant biometric, the behavioural part, we are left with some one-dimensional or higher-dimensional function α(t) of time, a user action. Matching this part amounts to matching this function α(t) with a stored template α′(t). Such matching is again well understood in the prior art and is routinely done in the area of signature verification. The following reference gives examples of matching approaches.
V.S. Nalwa, "Automatic on-line signature verification," Proceedings of the IEEE, pp. 215-239, Feb. 1997.
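One standard technique for matching a recorded behavioural function against a stored template, widely used in signature verification, is dynamic time warping (DTW), which tolerates the user performing the same gesture slightly faster or slower than at enrolment. The following is an illustrative sketch, not the matcher prescribed by the patent:

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between a recorded behavioural
    function a(t) and a stored template a'(t); a small distance means
    the user performed (approximately) the enrolled action."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping steps
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

template = [0, 10, 25, 10, 0, -15, 0]    # enrolled rotation gesture
genuine  = [0, 9, 10, 26, 9, 0, -14, 0]  # same gesture, slightly slower
forgery  = [0, -10, -25, -10, 0, 15, 0]  # mirrored gesture
```

Here the genuine attempt yields a much smaller distance to the template than the mirrored forgery, so thresholding the DTW distance gives the Yes/No answer for the behavioural part.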
Now the resultant biometric, after matching with a stored template, has either two 'Yes/No' answers (214 in Fig. 3) or two scores S1 and S2 (231 and 233 in Fig. 3). Any of the methods for combining two biometrics discussed in Fig. 3 can be used to combine the traditional and user-defined biometrics of a resultant biometric to arrive at a matching decision.
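Two common combination strategies can be sketched as follows (the weight and thresholds below are illustrative placeholders, not values from the disclosure): requiring both scores to clear their own thresholds, corresponding to combining the two Yes/No answers, or thresholding a weighted sum of the two scores.

```python
def combine_scores(s1, s2, t1, t2, mode="and"):
    """Combine the physiological match score s1 and the behavioural
    match score s2 into one accept/reject decision.  'and' demands
    that both scores clear their own thresholds (two Yes/No answers);
    'sum' thresholds a weighted sum of the two scores."""
    if mode == "and":
        return s1 >= t1 and s2 >= t2
    if mode == "sum":
        w = 0.5  # relative weight of the physiological score (illustrative)
        return w * s1 + (1 - w) * s2 >= (t1 + t2) / 2
    raise ValueError("unknown combination mode: " + mode)
```

The 'sum' rule can accept a subject whose behavioural score alone falls just short of its threshold, provided the physiological match is strong, whereas the 'and' rule rejects such a subject.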

Claims (14)

1. A biometrics system comprising:
an acquisition device for acquiring at least one biometric from a subject over a time period along with a controlled change of the biometric; and a storage process for storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change over a time period being a resultant biometric.
2. A system, as claimed in claim 1, where the biometric is any one or more of the following: a physiological biometric, a behavioural biometric, a fingerprint, a face, a palm print, an iris, a retina, a foot print, a gait, a signature, a key stroke pattern, and a voice.
3. A system, as claimed in claim 1 or 2, where the controlled change is performed by the subject.
4. A system, as claimed in claim 1 or 2, where the controlled change is induced by a mechanism external to the subject.
5. A system, as claimed in claim 4, where the mechanism is any one or more of the following: a light change, a light frequency change, and a light intensity change.
6. A system, as claimed in claim 1, where the controlled change includes any one or more of the following: a distortion to the biometric, a force, a pressure, a motion, a torque, a frequency change, a gesture, an energy change, a loudness, an accentuation, and a pattern.
7. A system, as claimed in claim 1, where the biometric is a fingerprint and the controlled change is any one or more of the following: a finger motion, a finger torque, a finger pressure, and a finger force.
8. A system, as claimed in claim 1, where the biometric is a face and the controlled change is a face motion.
9. A system, as claimed in claim 1, where the biometric is a face and the controlled change is a face distortion.
10. A system, as claimed in claim 1, where the biometric is a palm and the controlled change is a palm motion.
11. A system, as claimed in claim 1, where the biometric is a voice and the controlled change is any one or more of the following: a loudness, a frequency, a pattern, and an intonation.
12. A system, as claimed in claim 1, where the biometric is a gait and the controlled change is any one or more of the following: a step pattern, a speed, a sway, a course, a carriage, a hop, a skip, and a stride.
13. A system, as claimed in claim 1, where the biometric is a signature and the controlled change is any one or more of the following: a slant, a loop, a stretch, a size, and a spacing.
14. A method, performed by a biometrics system, comprising the steps of:
acquiring at least one biometric from a subject over a time period along with a controlled change of the biometric; and storing and associating the biometric and the respective controlled change, the combined biometric and the respective controlled change being a resultant biometric.
GB0029202A 1999-12-02 2000-11-30 Biometrics system Withdrawn GB2356961A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16854099P 1999-12-02 1999-12-02
US53583400A 2000-03-28 2000-03-28

Publications (2)

Publication Number Publication Date
GB0029202D0 GB0029202D0 (en) 2001-01-17
GB2356961A true GB2356961A (en) 2001-06-06

Family

ID=26864225

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0029202A Withdrawn GB2356961A (en) 1999-12-02 2000-11-30 Biometrics system

Country Status (2)

Country Link
JP (1) JP2001212112A (en)
GB (1) GB2356961A (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040125990A1 (en) * 2002-12-26 2004-07-01 Motorola, Inc. Method and apparatus for asperity detection
US9306952B2 (en) 2006-10-26 2016-04-05 Cfph, Llc System and method for wireless gaming with location determination
US9411944B2 (en) 2006-11-15 2016-08-09 Cfph, Llc Biometric access sensitivity
CA2669836C (en) * 2006-11-14 2016-07-12 Cfph, Llc Biometric access sensitivity
US20220296999A1 (en) 2010-08-13 2022-09-22 Cfph, Llc Multi-process communication regarding gaming information

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1990003620A1 (en) * 1988-09-23 1990-04-05 Fingermatrix, Inc. Full roll fingerprint apparatus
EP0649116A1 (en) * 1993-10-19 1995-04-19 Enix Corporation Piezoelectric surface pressure input panel
GB2319648A (en) * 1996-11-21 1998-05-27 Nicholas Turville Bullivant Signature verification
EP0889432A2 (en) * 1997-07-03 1999-01-07 Fujitsu Limited Roller fingerprint image capturing system


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2371908A (en) * 2000-10-13 2002-08-07 Omron Tateisi Electronics Co Identity verification
GB2371908B (en) * 2000-10-13 2003-02-26 Omron Tateisi Electronics Co Image comparison apparatus, image comparison method, image comparison center apparatus, and image comparison system
SG106641A1 (en) * 2000-10-13 2004-10-29 Omron Tateisi Electronics Co Image comparison apparatus, image comparison method, image comparison center apparatus, and image comparison system
EP1288845A3 (en) * 2001-08-03 2005-09-07 Nec Corporation User authentication method and user authentication device
WO2005024706A1 (en) * 2003-09-11 2005-03-17 Philips Intellectual Property & Standards Gmbh Fingerprint detection using sweep-type imager with optoelectronic speed sensor
WO2009085338A3 (en) * 2007-12-28 2010-03-18 Apple Inc. Control of electronic device by using a person's fingerprints
EP2166475A1 (en) * 2008-09-05 2010-03-24 Fujitsu Limited Function activating apparatus and function activating method
WO2017007401A1 (en) * 2015-07-03 2017-01-12 Fingerprint Cards Ab Apparatus and computer-implemented method for fingerprint based authentication
US9672409B2 (en) 2015-07-03 2017-06-06 Fingerprint Cards Ab Apparatus and computer-implemented method for fingerprint based authentication

Also Published As

Publication number Publication date
JP2001212112A (en) 2001-08-07
GB0029202D0 (en) 2001-01-17

Similar Documents

Publication Publication Date Title
US7054470B2 (en) System and method for distortion characterization in fingerprint and palm-print image sequences and using this distortion as a behavioral biometrics
de Luis-Garcı́a et al. Biometric identification systems
US6963659B2 (en) Fingerprint verification system utilizing a facial image-based heuristic search method
Zhang Palmprint authentication
Ribarić et al. Multimodal biometric user-identification system for network-based applications
Gamboa et al. An Identity Authentication System Based On Human Computer Interaction Behaviour.
Rajasekar et al. Efficient multimodal biometric recognition for secure authentication based on deep learning approach
Choras Multimodal Biometrics for Person
GB2356961A (en) Biometrics system
Deriche Trends and challenges in mono and multi biometrics
Yin et al. Fusion of face recognition and facial expression detection for authentication: a proposed model
Mansoura et al. Multimodal face and iris recognition with adaptive score normalization using several comparative methods
Singhal et al. Towards an integrated biometric technique
Vinothkanna et al. A novel multimodal biometrics system with fingerprint and gait recognition traits using contourlet derivative weighted rank fusion
Adedeji et al. Clonal Selection Algorithm for feature level fusion of multibiometric systems
CA2340501A1 (en) System, method, and program product for authenticating or identifying a subject through a series of controlled changes to biometrics of the subject
Shubhangi et al. Multi-biometric approaches to face and fingerprint biometrics
Trabelsi et al. A bi-modal palmvein palmprint biometric human identification based on fusing new CDSDP features
Ahmed et al. Intelligent Technique for Human Authentication using Fusion of Finger and Dorsal Hand Veins
Ozkaya et al. Intelligent face mask prediction system
Ozkaya et al. Intelligent face border generation system from fingerprints
Yan et al. Bridging biometrics and forensics
Dey et al. Design and Implementation of Authentication System Using Deep Convoluted Siamese Network
Iloanusi et al. Biometric Recognition: Overview and Applications
Jemimah et al. Web based biometric validation using biological identities: an elaborate survey

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)