US20200143186A1 - Information processing apparatus, information processing method, and storage medium - Google Patents
- Publication number
- US20200143186A1
- Authority
- US
- United States
- Prior art keywords
- user
- face
- information
- unit
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06K9/00906
- G06F18/22—Matching criteria, e.g. proximity measures
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06K9/00288
- G06K9/00335
- G06K9/00912
- G06K9/6201
- G06V20/64—Three-dimensional objects
- G06V40/172—Classification, e.g. identification
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/45—Detection of the body part being alive
- G06V40/67—Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and a storage medium.
- International Publication No. WO 2015/194135 discloses an authentication apparatus that checks whether a user is a real human or not.
- the authentication apparatus disclosed in International Publication No. WO 2015/194135 checks the reality of a user by determining whether or not the user's response to a challenge, that is, information based on which the user to be authenticated inputs information used for an authentication process, is correct.
- an example object of the present invention is to provide an information processing apparatus, an information processing method, and a storage medium that can suitably acquire necessary biometrics information based on a user's movement.
- an information processing method including: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
- a non-transitory storage medium storing a program that causes a computer to perform: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
- FIG. 1 is a block diagram illustrating a configuration of an information processing apparatus according to a first example embodiment of the present invention.
- FIG. 2 is a flowchart illustrating an operation of face authentication of the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 3 is a flowchart illustrating an operation of impersonation determination in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 4 is a flowchart illustrating an operation of face recognition in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 6 is a schematic diagram illustrating a face authentication window at the start of acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 7 is a schematic diagram illustrating one example of a face authentication window during acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 8 is a schematic diagram illustrating one example of a face authentication window at the completion of acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 9 is a schematic diagram illustrating one example of a window after face authentication succeeds and login is completed in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 10 is a schematic diagram illustrating one example of a face authentication window when face authentication fails in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 11 is a block diagram illustrating a configuration of an information processing apparatus according to a second example embodiment of the present invention.
- FIG. 12 is a block diagram illustrating a configuration of an information processing apparatus according to another example embodiment of the present invention.
- An information processing apparatus and an information processing method according to a first example embodiment of the present invention will be described by using FIG. 1 to FIG. 10 .
- FIG. 1 is a block diagram illustrating the configuration of the information processing apparatus according to the present example embodiment.
- an information processing apparatus 10 has a central processing unit (CPU) 12 , a random access memory (RAM) 14 , a storage unit 16 , an input unit 18 , a display unit 20 , a capture unit 22 , an audio output unit 24 , and a communication unit 26 .
- the CPU 12 , the RAM 14 , the storage unit 16 , the input unit 18 , the display unit 20 , the capture unit 22 , the audio output unit 24 , and the communication unit 26 are connected to a common bus 28 .
- the information processing apparatus 10 is a smartphone, for example, while not particularly limited.
- the information processing apparatus 10 may be a tablet type personal computer, a mobile phone, or the like. Further, the information processing apparatus 10 may be a computer apparatus such as a laptop personal computer, a desktop personal computer, or the like, for example.
- the information processing apparatus 10 can execute various application programs in accordance with an execution instruction from a user using the same.
- the CPU 12 operates by executing a program stored in the storage unit 16 and functions as a control unit that controls the operation of the entire information processing apparatus 10 . Further, the CPU 12 performs various processes as the information processing apparatus 10 by executing a program stored in the storage unit 16 .
- the RAM 14 provides a memory field necessary for the operation of the CPU 12 .
- the information processing apparatus 10 performs face authentication for a user when the user logs in to a particular application program stored in the storage unit 16 . Furthermore, the information processing apparatus 10 is configured to be able to determine, in face authentication, impersonation of the user by a non-living object such as a photograph, a moving image, or the like.
- the information processing apparatus 10 can also perform the same face authentication at various timings other than at login to a particular application program.
- the information processing apparatus 10 can also perform the same face authentication for the user at the start of a use of the information processing apparatus 10 , such as at system startup or at unlocking of the information processing apparatus 10 .
- the information processing apparatus 10 can also perform the same face authentication for the user at the start of access to a particular resource such as a particular file, a particular directory, a particular folder, or the like stored in the storage unit 16 or the like.
- the information processing apparatus 10 can be configured to perform face authentication at the start of a particular process for the user who requests that process.
- the CPU 12 functions as each of the following function units used for face authentication for the user by executing a particular application program stored in the storage unit 16 . That is, the CPU 12 functions as an authentication processing unit 12 a , a movement instruction unit 12 b , a face image acquisition unit 12 c , an information provide unit 12 d , an impersonation determination unit 12 e , and a face recognition unit 12 f.
- the authentication processing unit 12 a causes the display unit 20 to display a login window that requests the user for face authentication and accepts a login request from the user.
- the user who is a target of face authentication is able to perform touch entry by pushing a particular region such as a login button, a login icon, or the like in a login window displayed on the display unit 20 , which is configured integrally with the input unit 18 as a touchscreen display.
- This enables the user to input a login request to the information processing apparatus 10 .
- the user is able to input a login request to the information processing apparatus 10 by causing the capture unit 22 to capture his or her own face and inputting the face image to the information processing apparatus 10 , for example.
- the authentication processing unit 12 a causes the display unit 20 to display a face authentication window used for face authentication.
- the authentication processing unit 12 a performs face authentication as authentication for the user based on a result of impersonation determination performed by the impersonation determination unit 12 e and a result of face recognition performed by the face recognition unit 12 f . That is, the authentication processing unit 12 a determines whether or not face authentication of the user is successful based on a result of impersonation determination and a result of face recognition and performs a process in accordance with the determination results.
- the authentication processing unit 12 a determines that the face authentication of the user is successful when it is determined by the impersonation determination unit 12 e that there is no impersonation and it is determined by the face recognition unit 12 f that there is a matching in face recognition, as described later. In response to determining that face authentication of the user is successful, the authentication processing unit 12 a permits login of the user to a particular application program and performs a login process for allowing the user to log in.
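the overall decision described above (authentication succeeds only when the impersonation determination finds no impersonation and the face recognition finds a matching) can be sketched as follows; the function and argument names are illustrative, not terms from the patent:

```python
def face_authentication_succeeds(no_impersonation: bool, face_match: bool) -> bool:
    """Face authentication succeeds only when both checks pass: the
    impersonation determination finds a living face AND the face
    recognition finds a matching with the registered face image."""
    return no_impersonation and face_match


# a login process would then be gated on this combined result
login_permitted = face_authentication_succeeds(no_impersonation=True, face_match=True)
```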
- the movement instruction unit 12 b instructs the user, who is a target of face authentication, to perform a particular movement regarding the head captured by the capture unit 22 .
- the movement instruction unit 12 b can cause the display unit 20 to display the display providing an instruction about a particular movement and instruct the user about the particular movement.
- the movement instruction unit 12 b can cause the display unit 20 to display an instruction message about a particular movement, a visual symbol that moves as an animation with a particular movement, or the like, for example, as the instruction display of a particular movement.
- a visual symbol may be, for example, a face icon that is an icon imitating a face, a pictogram, or the like.
- a particular movement instructed by the movement instruction unit 12 b may be, for example, a movement of shaking the head laterally or vertically, a movement of turning the head around, or the like, while not particularly limited.
- the movement instruction unit 12 b can cause the display unit 20 to display an instruction message “Please shake your head” or cause the display unit 20 to display a face icon that shakes the head in an animation movement, for example.
- the movement instruction unit 12 b can function as a display control unit that controls and changes the display providing an instruction about a particular movement on the display unit 20 , which is display directed to the user, in accordance with the movement state of the user in motion captured by the capture unit 22 .
- the movement instruction unit 12 b can control and change the display providing the instruction about a particular movement on the display unit 20 in accordance with a determination result by the face image acquisition unit 12 c as to whether or not a particular movement of the user is proper as described below.
- the movement instruction unit 12 b controls and changes the display providing the instruction about a particular movement in the following manner, for example.
- the movement instruction unit 12 b can continue the face icon's head-shaking movement displayed on the display unit 20 or change it to a wider head-shaking movement. Further, when the user's particular movement is determined to be improper, the movement instruction unit 12 b may change the face icon display so as to have a negative facial expression indicating that the user's particular movement is improper, for example. Further, when it is determined to be improper, the movement instruction unit 12 b may cause the display unit 20 to display the instruction message “Please shake your head” with emphasis or to display an instruction message “Please shake your head widely.”
- the movement instruction unit 12 b can control and change the display providing the instruction about a particular movement so as to indicate that the user's particular movement is proper. Further, when it is determined that a user's particular movement is improper, the movement instruction unit 12 b can control and change the display providing the instruction of a particular movement so as to indicate that the user's particular movement is improper.
- the movement instruction unit 12 b may instruct the user about a particular movement by outputting a voice of a movement instruction from the audio output unit 24 in addition to or instead of the movement instruction display on the display unit 20 .
- the movement instruction unit 12 b can output a voice uttering an instruction message providing an instruction about a particular movement, such as “Please shake your head”, from the audio output unit 24 .
- the face image acquisition unit 12 c is an information acquisition unit that acquires a face image that is biometrics information on the user from a moving image captured by the capture unit 22 .
- the face image acquisition unit 12 c operates as below to acquire a face image.
- the face image acquisition unit 12 c determines whether or not a particular movement regarding the user's head in a moving image captured by the capture unit 22 is proper.
- When the user's movement is suitable for acquiring the face images needed for impersonation determination, the face image acquisition unit 12 c determines that the user's particular movement is proper.
- Otherwise, the face image acquisition unit 12 c determines that the user's particular movement is improper.
- the face image acquisition unit 12 c may determine that the user's movement is proper even when a particular movement instructed by the movement instruction unit 12 b and the user's movement are not the same. For example, even when the user shakes the head vertically while the movement instruction unit 12 b displays an animation of shaking the head laterally, the user's movement may be a movement which can be used in impersonation determination to acquire a plurality of face images that can be used for estimating the three-dimensional shape of the user's face. In such a case, the face image acquisition unit 12 c can determine that the user's movement is proper.
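one way the propriety determination above could be realized is sketched below; the yaw-angle representation of head movement and the numeric thresholds are illustrative assumptions, not values from the patent:

```python
def movement_is_proper(yaw_angles_deg, min_range_deg=20.0, max_step_deg=15.0):
    """Hypothetical check of a head-shaking movement from per-frame yaw
    angles: the sweep must be wide enough to yield multi-viewpoint face
    images usable for three-dimensional shape estimation, and the
    frame-to-frame motion must be slow enough to avoid motion blur."""
    if len(yaw_angles_deg) < 2:
        return False
    sweep = max(yaw_angles_deg) - min(yaw_angles_deg)
    fastest = max(abs(b - a) for a, b in zip(yaw_angles_deg, yaw_angles_deg[1:]))
    return sweep >= min_range_deg and fastest <= max_step_deg
```

note that a vertical nod could pass an analogous pitch-based check, matching the patent's point that a movement differing from the instructed one may still be proper if it yields usable multi-viewpoint images.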
- In response to determining that the user's particular movement is proper, the face image acquisition unit 12 c detects a face image of the user in a moving image captured by the capture unit 22 . Furthermore, the face image acquisition unit 12 c acquires the detected face image of the user as face information on the user. At this time, the face image acquisition unit 12 c evaluates the quality of the detected face image and acquires a high quality face image having a quality above a predetermined quality. For example, the face image acquisition unit 12 c can evaluate the quality of a face image based on a quality value calculated for each of a predetermined number of face feature points preset for estimating a three-dimensional shape of a face. The face image acquisition unit 12 c is not necessarily required to acquire a face image itself but may extract and acquire a feature amount from a face image as face information.
- the face image acquisition unit 12 c repeatedly acquires a face image of the user as described above until a time limit, which is a predetermined time period from the time when a particular movement is instructed by the movement instruction unit 12 b , has elapsed. Thereby, the face image acquisition unit 12 c attempts acquisition of a predefined number of face images necessary for impersonation determination.
- the time limit used in acquiring face images is preset in accordance with required accuracy of impersonation determination, a time period required to complete authentication, or the like, for example.
- the predefined number of face images necessary for impersonation determination is preset in accordance with required accuracy of impersonation determination, a time period required to complete authentication, or the like, for example.
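the acquisition loop described above (repeat until the predefined number of face images is collected or the time limit elapses) can be sketched as follows; `grab_frame`, `detect_face`, and `quality` are placeholders for camera capture, face detection, and quality evaluation, and the default numbers are illustrative:

```python
import time


def acquire_face_images(grab_frame, detect_face, quality, *,
                        needed=10, min_quality=0.7, time_limit_s=15.0):
    """Repeatedly acquire face images until the predefined number needed
    for impersonation determination is reached or the time limit elapses.
    Only detected faces whose quality meets min_quality are kept."""
    images = []
    deadline = time.monotonic() + time_limit_s
    while len(images) < needed and time.monotonic() < deadline:
        frame = grab_frame()
        face = detect_face(frame)
        if face is not None and quality(face) >= min_quality:
            images.append(face)
    return images
```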
- the face image acquisition unit 12 c functions as a display control unit that causes the display unit 20 to display the progress status of acquisition of face images as display in accordance with the movement status of the user.
- the face image acquisition unit 12 c can cause the display unit 20 to display an acquisition rate of face images in percentage, which is a ratio of the number of acquired face images to the predefined number required for impersonation determination, for example, as the progress status of acquisition of face images in accordance with the movement status of the user.
- the face image acquisition unit 12 c may display the acquisition rate of face images as a numerical value in percentage form or may display the acquisition rate by using a gauge or a progress bar whose length changes in accordance with the acquisition rate of face images, for example.
- the shape of a gauge or a progress bar indicating an acquisition rate of face images is not particularly limited, and a bar-like shape, a ring-like shape, or a frame-like shape may be employed, for example.
- the face image acquisition unit 12 c controls and changes the display indicating the progress status of acquisition of face images, which is display directed to the user, in accordance with the acquisition status of face images.
- the face image acquisition unit 12 c can cause the display unit 20 to display an indication of the progress status of acquisition of face images, such as a gauge, a progress bar, or the like, in accordance with the user's movement status.
- the form of display indicating the progress status of acquisition of face images is not limited to a gauge or a progress bar, and various forms of display may be employed.
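the progress display described above (an acquisition rate shown as a percentage together with a bar whose length tracks the rate) can be sketched as a text rendering; the bar characters and width are illustrative:

```python
def acquisition_progress(acquired: int, needed: int, width: int = 20) -> str:
    """Render the face-image acquisition rate (acquired / needed) as a
    percentage plus a progress bar whose filled length changes in
    accordance with the rate, mirroring the display described above."""
    ratio = min(acquired / needed, 1.0)
    filled = int(ratio * width)
    return f"[{'#' * filled}{'.' * (width - filled)}] {ratio:.0%}"
```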
- the information provide unit 12 d provides guidance information to the user.
- the guidance information is information that may be a reference, a guide, a help, or the like for successful acquisition of the predefined number of face images by the face image acquisition unit 12 c when face authentication is again performed.
- the information provide unit 12 d can cause the display unit 20 to display and provide guidance information to the user, for example. Further, the information provide unit 12 d may output guidance information by voice from the audio output unit 24 and provide the guidance information to the user, for example.
- the information provide unit 12 d can provide, to the user, guidance information in accordance with a reason for being unable to acquire a face image by the face image acquisition unit 12 c .
- the information provide unit 12 d can provide guidance information regarding a user's particular movement.
- the information provide unit 12 d can provide guidance information regarding an authentication environment. Further, it is possible to provide, to the user, guidance information regarding a user's particular movement and guidance information regarding an authentication environment in combination.
- the information provide unit 12 d can provide, as guidance information regarding a user's particular movement, guidance information that guides the user to perform a proper particular movement. For example, for the user instructed to perform a head shaking movement as a particular movement, the information provide unit 12 d can display a guidance message regarding the head shaking movement on the display unit 20 or output the guidance message by voice from the audio output unit 24 as guidance information. In such a case, the guidance message may be “Please shake your head widely”, “Please shake your head slowly”, “Motion is too fast”, or the like.
- the information provide unit 12 d can provide guidance information that guides the user to perform face authentication in a proper authentication environment as guidance information regarding authentication environment.
- An authentication environment where guidance information may be provided is not particularly limited and may correspond to brightness of an authentication place that is a place where authentication is performed, a capturing distance that is a distance at which the user is captured by the capture unit 22 , or the like, for example.
- the information provide unit 12 d can provide guidance information regarding brightness of the authentication place as guidance information.
- the information provide unit 12 d can display a guidance message such as “it is too bright”, “it is too dark”, or the like on the display unit 20 or output the guidance message as a voice from the audio output unit 24 , for example, as guidance information.
- the information provide unit 12 d can provide guidance information regarding a capturing distance by the capture unit 22 .
- the information provide unit 12 d can display a guidance message such as “it is too far”, “it is too close”, or the like on the display unit 20 or output the guidance message as a voice from the audio output unit 24 , for example, as guidance information.
- the information provide unit 12 d can provide guidance information by causing the display unit 20 to display a visual symbol suggesting the content of a guidance message instead of or in addition to the guidance message.
- a visual symbol may be, for example, an icon, a pictogram, or the like.
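the selection of environment guidance described above can be sketched as follows; the numeric ranges are illustrative assumptions (the patent names only the kinds of messages, such as “it is too dark”), and the message strings follow the examples given above:

```python
def guidance_messages(brightness, distance_m, *,
                      bright_range=(0.2, 0.8), dist_range=(0.25, 0.6)):
    """Choose guidance messages from measured authentication-environment
    values: normalized scene brightness and capturing distance in meters.
    Each out-of-range value yields the corresponding guidance message."""
    msgs = []
    if brightness < bright_range[0]:
        msgs.append("it is too dark")
    elif brightness > bright_range[1]:
        msgs.append("it is too bright")
    if distance_m < dist_range[0]:
        msgs.append("it is too close")
    elif distance_m > dist_range[1]:
        msgs.append("it is too far")
    return msgs
```

as the patent notes, movement guidance and environment guidance can also be combined; the returned list naturally supports presenting several messages together.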
- the impersonation determination unit 12 e performs impersonation determination based on a plurality of face images acquired by the face image acquisition unit 12 c . In impersonation determination, the impersonation determination unit 12 e determines whether or not the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like.
- the impersonation determination unit 12 e estimates a three-dimensional shape of the user's face based on a plurality of face images acquired by the face image acquisition unit 12 c in impersonation determination.
- the plurality of face images acquired by the face image acquisition unit 12 c are obtained when the face of the user performing a particular movement is captured by the capture unit 22 and thus are multi-viewpoint images captured from different viewpoints.
- the impersonation determination unit 12 e estimates a three-dimensional shape of the user's face from a plurality of face images, which are multi-viewpoint images.
- a method of estimating the three-dimensional shape is not particularly limited, and various methods can be used.
- the impersonation determination unit 12 e can estimate a three-dimensional shape of a face image of the user from the plurality of face images acquired by the face image acquisition unit 12 c by using a bundle adjustment method.
- the impersonation determination unit 12 e can perform the following process, for example. That is, the impersonation determination unit 12 e acquires face information indicating the respective positions of a predetermined number of face feature points in a face image for each of the plurality of face images acquired by the face image acquisition unit 12 c .
- the number of face feature points used for acquiring face information may be two or more, for example, which may be specifically ten face feature points at the right eye, the left eye, the nose, the center of the mouth, the right corner of the mouth, the left corner of the mouth, the right cheek, the left cheek, the right side of the chin, and the left side of the chin.
- the impersonation determination unit 12 e associates each of the predetermined number of face feature points between a plurality of face images based on respective pieces of face information of the acquired plurality of face images.
- the impersonation determination unit 12 e can estimate a three-dimensional shape of the user's face based on a result of the association of face feature points. According to such a process, each of the predetermined number of face feature points can be uniquely associated between the plurality of face images based on the face information of each face image. No complex process is required for this association. It is therefore possible to perform the association of face feature points between the plurality of face images more easily and more accurately, and to estimate the three-dimensional shape more easily and more accurately.
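the unique association described above works because each feature point carries a semantic identity (the right eye in one image corresponds to the right eye in every other image), so no descriptor matching is needed; a minimal sketch, with the landmark-name representation of face information as an assumption:

```python
def associate_feature_points(face_infos):
    """Associate face feature points across images by semantic identity.
    face_infos is a list (one entry per face image) of dicts mapping a
    landmark name, e.g. 'right_eye', to its (x, y) position. The result
    maps each landmark name to its track of per-image positions, which
    is the input to three-dimensional shape estimation."""
    names = face_infos[0].keys()
    return {name: [info[name] for info in face_infos] for name in names}
```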
- the impersonation determination unit 12 e determines whether or not the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like by evaluating the estimated three-dimensional shape of the user's face. For example, the impersonation determination unit 12 e can determine whether or not there is impersonation by evaluating whether or not the estimated three-dimensional shape is three-dimensional, that is, whether the estimated shape is three-dimensional or two-dimensional. In such a case, the impersonation determination unit 12 e can evaluate whether the shape is three-dimensional or two-dimensional by determining a relationship between an evaluation amount regarding the estimated three-dimensional shape and a preset threshold.
- as the evaluation amount, a distance, an angle, or the like associated with a particular face feature point, or an amount based thereon, may be used, for example, without being particularly limited.
- as the method of evaluating the estimated three-dimensional shape, various methods may be used without being particularly limited.
- when the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape of a face is three-dimensional, the impersonation determination unit 12 e determines that the user's face is a face of a living body and is not impersonated. On the other hand, when the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape is two-dimensional, the impersonation determination unit 12 e determines that the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like.
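one possible evaluation amount for the three-dimensional-versus-two-dimensional decision above is the depth spread of the estimated feature points relative to their spread in the image plane; the camera-aligned z-axis and the threshold value are illustrative assumptions (a real implementation might instead fit a plane and test the residual):

```python
def is_three_dimensional(points, flatness_threshold=0.05):
    """Evaluate whether an estimated face shape is three-dimensional
    (a living face) or flat (a photograph or screen). points is a list
    of (x, y, z) feature-point positions; the evaluation amount is the
    depth (z) extent relative to the in-plane extent."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    extent = max(max(xs) - min(xs), max(ys) - min(ys))
    if extent == 0:
        return False
    depth = max(zs) - min(zs)
    return depth / extent > flatness_threshold
```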
- the face recognition unit 12 f performs face recognition based on one or a plurality of face images acquired by the face image acquisition unit 12 c .
- in face recognition, the face recognition unit 12 f compares a target face image, which is a face image of the user captured by the capture unit 22 and acquired by the face image acquisition unit 12 c , with a registered face image, which is a pre-registered face image of the user, and determines whether or not both face images match.
- the face recognition unit 12 f can use a face image having the highest quality, such as a face image in which the user faces front, as a target face image out of a plurality of face images acquired by the face image acquisition unit 12 c and compare the target face image with a registered face image.
- the face recognition unit 12 f may perform comparison between each of the plurality of face images and a registered face image.
- the registered face image, which is registered biometrics information on the user, is pre-stored and registered in the storage unit 16 , for example.
- the face recognition unit 12 f can calculate a matching score indicating the similarity between a feature amount of a target face image and a feature amount of a registered face image. The higher the similarity between feature amounts of both face images is, the larger the matching score is.
- when the matching score is greater than or equal to a predetermined threshold, the face recognition unit 12 f can determine that there is a matching in the comparison between the target face image and the registered face image, that is, there is a matching in the face recognition.
- when the matching score is less than the predetermined threshold, the face recognition unit 12 f can determine that there is no matching in the comparison between the target face image and the registered face image, that is, there is no matching in the face recognition.
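the matching-score comparison described above can be sketched as follows; cosine similarity between feature amounts is an illustrative choice of score (the patent does not specify one), and the threshold value is an assumption:

```python
import math


def matching_score(feat_a, feat_b):
    """Cosine similarity between the feature amount of a target face
    image and that of a registered face image; the higher the similarity
    between the feature amounts, the larger the score."""
    dot = sum(a * b for a, b in zip(feat_a, feat_b))
    norm_a = math.sqrt(sum(a * a for a in feat_a))
    norm_b = math.sqrt(sum(b * b for b in feat_b))
    return dot / (norm_a * norm_b)


def faces_match(feat_a, feat_b, threshold=0.8):
    """There is a matching in the face recognition when the matching
    score is greater than or equal to the predetermined threshold."""
    return matching_score(feat_a, feat_b) >= threshold
```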
- the face recognition unit 12 f can use an already acquired face image as a target face image and compare that target face image with a registered face image in the same manner as described above. In such a case, at the time when it is determined by the face recognition unit 12 f that there is no matching in comparison, the face image acquisition unit 12 c can stop acquisition of face images even before a predefined number of face images are acquired.
- the functions of respective units of the CPU 12 described above are not necessarily required to be implemented in the CPU 12 of the information processing apparatus 10 that is a single apparatus but may be implemented by other external apparatuses such as a server.
- the functions of the impersonation determination unit 12 e , the face recognition unit 12 f , or the like, among the functions of the respective units of the CPU 12 described above, may be implemented by a CPU of a server communicably coupled to the information processing apparatus 10 via a network.
- The CPU 12 of the information processing apparatus 10 transmits a process request for the function of the impersonation determination unit 12 e , the face recognition unit 12 f , or the like to the server via the network, together with data necessary for the requested process, such as a face image or a feature amount extracted from a face image. Further, the CPU 12 receives, from the server via the network, a process result obtained by the function of the impersonation determination unit 12 e , the face recognition unit 12 f , or the like.
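A minimal sketch of such a request/response exchange is shown below. The patent specifies no protocol, so the JSON payload shape, the field names, and the function names are hypothetical; only the idea of sending a feature amount and receiving a comparison result comes from the text above.

```python
import json

def build_recognition_request(user_id, feature_vector):
    """Serialize a process request carrying the data (here, a feature amount
    extracted from a face image) needed by server-side face recognition.
    All field names are illustrative placeholders."""
    return json.dumps({
        "process": "face_recognition",
        "user_id": user_id,
        "feature": list(feature_vector),
    })

def parse_recognition_response(payload):
    """Extract the process result (matching or not) from the server reply."""
    return bool(json.loads(payload)["match"])
```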
- The storage unit 16 is formed of a storage medium such as a nonvolatile memory (for example, a flash memory), a hard disk drive, or the like.
- the storage unit 16 stores a program such as a particular application program executed by the CPU 12 , data referenced by the CPU 12 in executing the program, or the like.
- the storage unit 16 stores a face image of the user acquired from an image captured by the capture unit 22 .
- the storage unit 16 stores a registered face image of the user used in face recognition, for example.
- the input unit 18 accepts user input of information, an instruction, or the like to the information processing apparatus 10 .
- the user may input various information or input an instruction to perform a process into the information processing apparatus 10 via the input unit 18 .
- the input unit 18 is formed of a touchscreen embedded in the display unit 20 , for example.
- the input unit 18 may be formed of a keyboard, a mouse, or the like, for example.
- the display unit 20 displays various windows such as an execution window of a particular application program under the control of the CPU 12 .
- an execution window of a particular application program includes a login window, a face authentication window, a window after login, or the like.
- the display unit 20 is formed of a liquid crystal display, an organic light emitting diode (OLED) display, or the like and configured as a touchscreen display together with the input unit 18 .
- the capture unit 22 captures an image including a face of the user who performs face authentication.
- The capture unit 22 is formed of a digital camera that can capture a moving image and acquires two-dimensional images forming a moving image at a predetermined framerate.
- the communication unit 26 transmits and receives data to and from an external apparatus such as a server via a network under the control of the CPU 12 .
- the communication standard, the communication scheme, or the like of the communication unit 26 is not particularly limited and may be a wireless scheme or a wired scheme.
- In this manner, the information processing apparatus 10 is configured.
- the movement instruction unit 12 b controls and changes the display providing an instruction about a particular movement on the display unit 20 in accordance with the movement status of the user during a movement captured by the capture unit 22 .
- This enables the user to determine whether or not his/her movement responding to an instruction of a particular movement is proper and appropriately correct his/her movement.
- the face image acquisition unit 12 c causes the display unit 20 to display the progress status of acquisition of face images. The user is able to also use the progress status of acquisition of face images as a material for determining whether or not his/her movement is proper.
- According to the information processing apparatus 10 of the present example embodiment, it is possible to cause the user to move properly in accordance with an instruction of a particular movement, and a face image that is biometrics information necessary for determination of impersonation can be acquired properly based on the user's movement.
- the information provide unit 12 d provides guidance information when acquisition of a predefined number of face images by the face image acquisition unit 12 c fails. Therefore, according to the information processing apparatus 10 of the present example embodiment, it is possible to cause the user to act so that the predefined number of face images can be acquired by the face image acquisition unit 12 c when the user again performs face authentication.
- FIG. 2 is a flowchart illustrating the operation of face authentication in the information processing apparatus 10 .
- FIG. 3 is a flowchart illustrating the operation of impersonation determination in the information processing apparatus 10 .
- FIG. 4 is a flowchart illustrating the operation of face recognition in the information processing apparatus 10 .
- FIG. 5 to FIG. 10 are schematic diagrams illustrating examples of a series of windows on the display unit 20 during a face authentication operation in the information processing apparatus 10 , respectively. In response to the operation of face authentication in the information processing apparatus 10 being performed, an information processing method according to the present example embodiment is performed.
- the authentication processing unit 12 a causes the display unit 20 to display a login window that requests face authentication from the user and accepts a login request of the user as illustrated in FIG. 2 (step S 102 ).
- FIG. 5 illustrates one example of a login window SL that requests face authentication.
- a face icon SL 12 indicating that face authentication is requested in a login operation is displayed.
- the user may input, to the information processing apparatus 10 , a login request that requests login to a particular application program.
- the authentication processing unit 12 a continuously determines whether or not a login request is input by the user (step S 104 ) and stands by until a login request is input (step S 104 , NO).
- In response to determining that a login request is input (step S 104 , YES), the authentication processing unit 12 a causes the display unit 20 to display a face authentication window used for performing face authentication (step S 106 ).
- FIG. 6 illustrates one example of a face authentication window SF at the start of face authentication, that is, at the start of acquisition of a face image by the face image acquisition unit 12 c .
- the face authentication window SF includes a moving image display region SF 12 that displays a moving image captured by the capture unit 22 and a movement instruction display region SF 14 that displays an instruction about a particular movement from the movement instruction unit 12 b .
- the face authentication window SF includes a bar-like gage SF 16 indicating the progress status of acquisition of face images and frame-like gages SF 18 L and SF 18 R similarly indicating the progress status of acquisition of face images.
- the face authentication window SF includes a “Use password” button SF 20 used for switching the authentication scheme to a password scheme and a “Cancel” button SF 22 used for cancelling face authentication.
- the moving image display region SF 12 displays, in substantially real time, a moving image including a face of the user captured by the capture unit 22 .
- the user is able to determine whether or not his/her movement to the instruction of a particular movement from the movement instruction unit 12 b is proper by checking the moving image of his/her face displayed in the moving image display region SF 12 .
- the movement instruction display region SF 14 displays an instruction message SF 142 providing an instruction about a particular movement and a face icon SF 144 similarly providing an instruction about a particular movement.
- the instruction message SF 142 indicates the content of a particular movement instructed by the movement instruction unit 12 b .
- the face icon SF 144 moves as an animation with a particular movement instructed by the movement instruction unit 12 b .
- the movement instruction unit 12 b instructs the user about a particular movement by using the instruction message SF 142 such as “Please shake your head”, for example. Further, the movement instruction unit 12 b instructs the user about a particular movement by using the face icon SF 144 that shakes the head as an animation movement, for example.
- the bar-like gage SF 16 changes the length thereof in accordance with the progress status of acquisition of face images performed by the face image acquisition unit 12 c . Specifically, the bar-like gage SF 16 changes the length thereof so as to extend from the left to the right in accordance with an acquisition rate of face images performed by the face image acquisition unit 12 c . Further, inside the bar-like gage SF 16 , the acquisition rate of face images is displayed as a numerical value in a form of percentage representation.
- the frame-like gages SF 18 L and SF 18 R are arranged so as to form the left half frame portion and the right half frame portion around the moving image display region SF 12 , respectively.
- The frame-like gages SF 18 L and SF 18 R change the lengths thereof in accordance with the progress status of acquisition of face images performed by the face image acquisition unit 12 c , in a similar manner to the bar-like gage SF 16 .
- the frame-like gages SF 18 L and SF 18 R change the lengths thereof so as to extend from the bottom to the top in accordance with the acquisition rate of face images performed by the face image acquisition unit 12 c , respectively.
- the face icon SF 144 is arranged at the center on the upper side, and the bar-like gage SF 16 is arranged at the center on the lower side.
- the frame-like gages SF 18 L and SF 18 R indicate an acquisition rate of face images of 0% at the center of the bar-like gage SF 16 and indicate an acquisition rate of face images of 100% at the reaching point at the face icon SF 144 , respectively.
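The mapping from the acquisition rate to the gauge displays described above can be sketched as follows. The pixel sizes, rounding, and function name are arbitrary placeholders for illustration; only the proportional relationship between the acquisition rate, the bar-like gage SF 16 , and the frame-like gages SF 18 L/SF 18 R comes from the description.

```python
def gauge_lengths(acquired, required, bar_max_px=300, frame_max_px=200):
    """Map the face-image acquisition rate to the extents of the bar-like
    gage SF16 (left to right) and each frame-like gage SF18L/SF18R (bottom
    to top), plus the percentage shown inside SF16."""
    rate = min(acquired / required, 1.0) if required else 1.0
    return {
        "percent": round(rate * 100),            # numeric display inside SF16
        "bar_px": round(rate * bar_max_px),      # SF16 extends left to right
        "frame_px": round(rate * frame_max_px),  # each side extends bottom to top
    }
```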
- the “Use password” button SF 20 is a button used for switching the authentication scheme used for login to a particular application program from a face authentication scheme to a password scheme. The user is able to switch the authentication scheme used for login to a password scheme which requires entry of a password by pushing the “Use password” button SF 20 .
- the “Cancel” button SF 22 is a button used for cancelling face authentication used for login to a particular application program. The user is able to cancel face authentication by pushing the “Cancel” button SF 22 .
- the movement instruction unit 12 b causes the display unit 20 to display an instruction message, which provides an instruction about a particular movement, and a face icon, which moves as an animation with a particular movement, in the face authentication window and instructs the user about the particular movement (step S 108 ).
- the movement instruction unit 12 b can provide an instruction about a movement of shaking the head by using an instruction message, a face icon, or the like, for example.
- the face image acquisition unit 12 c attempts acquisition of a predefined number of user's face images necessary for impersonation determination until the time limit elapses.
- the face image acquisition unit 12 c determines whether or not a user's particular movement in the moving image captured by the capture unit 22 is proper (step S 110 ).
- If it is determined by the face image acquisition unit 12 c that the user's particular movement is improper (step S 110 , NO), unless the time limit has elapsed (step S 112 , NO), the movement instruction unit 12 b controls and changes the display providing an instruction about a particular movement (step S 114 ). In this case, the movement instruction unit 12 b may continue the movement of the face icon shaking the head, may cause the movement of the face icon shaking the head to be wider, or may emphasize and display an instruction message "Please shake your head" on the display unit 20 , for example.
- the face image acquisition unit 12 c then continues to determine whether or not the user's particular movement is proper (step S 110 ). Note that, if the time limit has elapsed (step S 112 , YES), the process proceeds to step S 138 described below.
- If the face image acquisition unit 12 c determines that the user's particular movement is proper (step S 110 , YES), the movement instruction unit 12 b controls and changes the display providing the instruction about the particular movement (step S 116 ).
- In this case, the movement instruction unit 12 b may stop the movement of the face icon shaking the head or may change the display of the face icon to a smiling facial expression or the like suggesting that the user's particular movement is proper, for example.
- Subsequently, the face image acquisition unit 12 c detects a face image of the user in the moving image captured by the capture unit 22 (step S 118 ).
- Next, the face image acquisition unit 12 c evaluates the quality of the detected face image (step S 120 ). If the face image acquisition unit 12 c determines that the quality of the face image is not above a predetermined quality (step S 120 , NO), the process proceeds to step S 110 described above to again attempt acquisition of face images.
- If the face image acquisition unit 12 c determines that the quality of the face image is above the predetermined quality (step S 120 , YES), the face image acquisition unit 12 c acquires the high-quality face image having a quality above the predetermined quality (step S 122 ).
- the face image acquisition unit 12 c controls and changes the display indicating the progress status of acquisition of face images, such as a gage, a progress bar, a numerical value in a form of percentage indicating an acquisition rate of face images, or the like (step S 124 ).
- FIG. 7 illustrates one example of the face authentication window SF when face images are being acquired by the face image acquisition unit 12 c .
- FIG. 7 illustrates the face authentication window SF when the acquisition rate of face images is 50%.
- the bar-like gage SF 16 extends from the left to the right, and the frame-like gages SF 18 L and SF 18 R extend from the bottom to the top as the number of acquired face images increases, as illustrated in FIG. 7 .
- the numerical value of the acquisition rate of face images inside the bar-like gage SF 16 increases as the number of acquired face images increases.
- Next, the face image acquisition unit 12 c determines whether or not the predefined number of face images necessary for impersonation determination have been acquired (step S 126 ). If the face image acquisition unit 12 c determines that the predefined number of face images have not been acquired (step S 126 , NO), unless the time limit has elapsed (step S 128 , NO), the process returns to step S 110 described above to again attempt acquisition of face images. Note that, if the time limit has elapsed (step S 128 , YES), the process proceeds to step S 138 described later.
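The acquisition loop of steps S 110 to S 128 can be sketched as below. The callback names, the required count, and the time limit are placeholders, and the detection/quality predicates stand in for processing the patent leaves unspecified; only the loop structure (movement check, detection, quality gate, count check, deadline) mirrors the flowchart.

```python
import time

def acquire_face_images(detect_face, movement_is_proper, quality_is_good,
                        required_count=10, time_limit_sec=30.0,
                        clock=time.monotonic):
    """Sketch of the acquisition loop (steps S110-S128): collect face images
    whose quality exceeds a threshold while the user's movement is proper,
    until the predefined number is reached or the time limit elapses."""
    images = []
    deadline = clock() + time_limit_sec
    while clock() < deadline:
        if not movement_is_proper():       # step S110 (display is updated,
            continue                       # then the check is retried)
        face = detect_face()               # step S118
        if face is None or not quality_is_good(face):  # step S120
            continue
        images.append(face)                # step S122
        if len(images) >= required_count:  # step S126, YES
            return images
    return None                            # time limit elapsed: step S138 follows
```

An injectable `clock` keeps the deadline logic testable; a monotonic clock is used so that wall-clock adjustments cannot extend or shorten the time limit.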
- If the predefined number of face images have been acquired (step S 126 , YES), the impersonation determination unit 12 e performs impersonation determination (step S 130 ), and the face recognition unit 12 f performs face recognition (step S 132 ).
- the face recognition unit 12 f can perform face recognition by using already acquired face images at the time before the predefined number of face images are acquired by the face image acquisition unit 12 c.
- FIG. 8 illustrates one example of the face authentication window SF at completion of acquisition of the predefined number of face images performed by the face image acquisition unit 12 c .
- the bar-like gage SF 16 and the frame-like gages SF 18 L and SF 18 R have extended at the maximum, respectively.
- The acquisition rate of face images inside the bar-like gage SF 16 is 100%.
- a completion icon SF 24 indicating the completion of acquisition of face images is displayed instead of the face icon SF 144 , and the completion of the process is visually indicated to the user.
- the impersonation determination unit 12 e first estimates a three-dimensional shape of the user's face based on a plurality of face images acquired by the face image acquisition unit 12 c as illustrated in FIG. 3 (step S 202 ).
- Next, the impersonation determination unit 12 e evaluates whether or not the estimated three-dimensional shape is three-dimensional (step S 204 ).
- If the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape is three-dimensional (step S 204 , YES), the impersonation determination unit 12 e determines that the user's face is a face of a living body and is not impersonated (step S 206 ).
- Otherwise (step S 204 , NO), the impersonation determination unit 12 e determines that the user's face is impersonated by a photograph, a moving image, or the like (step S 208 ).
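A minimal sketch in the spirit of steps S 202 to S 208 follows. The patent does not detail how the three-dimensional shape is estimated or evaluated, so the per-landmark depth input (e.g. from structure-from-motion across the multiple face images), the depth-spread criterion, and the threshold are all assumptions for illustration.

```python
def is_live_face(landmark_depths, flatness_threshold=0.05):
    """Illustrative liveness check: given depth values estimated for facial
    landmarks from multiple face images, treat the face as three-dimensional
    (a living body) when the depth spread is non-negligible relative to the
    overall scale. A photograph or screen replay reconstructs as nearly
    planar, i.e. with a depth spread close to zero."""
    if not landmark_depths:
        return False
    spread = max(landmark_depths) - min(landmark_depths)
    scale = max(abs(d) for d in landmark_depths) or 1.0
    return spread / scale > flatness_threshold
```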
- the face recognition unit 12 f first selects a target face image used for comparison with a registered face image from the plurality of face images acquired by the face image acquisition unit 12 c , as illustrated in FIG. 4 (step S 302 ).
- the face recognition unit 12 f reads and acquires a registered face image to be compared with the target face image from the storage unit 16 or the like (step S 304 ).
- the face recognition unit 12 f calculates a matching score indicating the similarity between a feature amount of the target face image and a feature amount of the registered face image (step S 306 ).
- the face recognition unit 12 f determines whether or not the calculated matching score is greater than or equal to a predetermined threshold (step S 308 ).
- If the face recognition unit 12 f determines that the matching score is greater than or equal to the predetermined threshold (step S 308 , YES), the face recognition unit 12 f determines that there is a matching in the comparison between the target face image and the registered face image, that is, there is a matching in the face recognition (step S 310 ).
- If the face recognition unit 12 f determines that the matching score is less than the predetermined threshold (step S 308 , NO), the face recognition unit 12 f determines that there is no matching in the comparison between the target face image and the registered face image, that is, there is no matching in the face recognition (step S 312 ).
- the face recognition unit 12 f may perform the process illustrated in FIG. 4 using already acquired face images even at the time before the predefined number of face images are acquired by the face image acquisition unit 12 c.
- the authentication processing unit 12 a determines whether or not face authentication of the user is successful based on a result of the impersonation determination performed by the impersonation determination unit 12 e and a result of the face recognition performed by the face recognition unit 12 f (step S 134 ).
- If the authentication processing unit 12 a determines that the face authentication of the user is successful (step S 134 , YES), the authentication processing unit 12 a permits the user to log in to a particular application program and performs a login process to allow the user to log in (step S 136 ).
- the authentication processing unit 12 a performs the login process and causes the display unit 20 to display a window after login of the particular application.
- the authentication processing unit 12 a determines that the face authentication is successful if it is determined by the impersonation determination unit 12 e that there is no impersonation (step S 206 ) and it is determined by the face recognition unit 12 f that there is a matching in the face recognition (step S 310 ).
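The decision of step S 134 reduces to a conjunction of the two earlier results, which can be stated as a one-line predicate (the function name is illustrative):

```python
def face_authentication_succeeds(is_live, recognition_match):
    """Step S134 as a predicate: face authentication succeeds only when the
    impersonation determination finds no impersonation (step S206) AND the
    face recognition finds a matching (step S310)."""
    return is_live and recognition_match
```

Either an impersonation finding (step S 208 ) or a recognition mismatch (step S 312 ) alone is therefore enough to fail the authentication.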
- FIG. 9 illustrates one example of a window after login SA of a particular application after face authentication succeeded.
- the window after login SA is a window of the particular application logged in by the user.
- If the authentication processing unit 12 a determines that the face authentication of the user failed (step S 134 , NO), the authentication processing unit 12 a rejects login of the user to the particular application program. If it is determined by the impersonation determination unit 12 e that there is impersonation (step S 208 ) or if it is determined by the face recognition unit 12 f that there is no matching in the face recognition (step S 312 ), the authentication processing unit 12 a determines that the face authentication failed. In this case, the authentication processing unit 12 a performs a failure process such as a process that notifies the user that the face authentication failed (step S 140 ).
- If the time limit has elapsed before the predefined number of face images are acquired, the information provide unit 12 d provides guidance information regarding the user's particular movement or the authentication environment in accordance with the reason for the face images being unable to be acquired by the face image acquisition unit 12 c (step S 138 ).
- the information provide unit 12 d can display guidance information on the display unit 20 or output a voice from the audio output unit 24 to provide the guidance information to the user. Thereby, it is possible to improve the probability of the predefined number of face images being acquired by the face image acquisition unit 12 c within the time limit when face authentication is again performed.
- the authentication processing unit 12 a performs a failure process such as a process to notify the user that the face authentication failed (step S 140 ).
- FIG. 10 illustrates the face authentication window SF at a face authentication failure when the face authentication failed because the predefined number of face images have not been acquired and the time limit has elapsed.
- a failure icon SF 26 such as an exclamation mark icon indicating that the predefined number of face images have not been acquired and thus the face authentication failed is displayed instead of the face icon SF 144 .
- a failure frame SF 28 indicating that the face authentication failed by using a different color or the like from the frame-like gages SF 18 L and SF 18 R is displayed around the moving image display region SF 12 , for example.
- the moving image display region SF 12 displays a grayscale frame or a black and white frame at a particular point of time of a moving image captured by the capture unit 22 when face authentication fails, for example. Thereby, the moving image display region SF 12 indicates that face authentication failed.
- a guidance message SF 30 is displayed as guidance information provided by the information provide unit 12 d .
- the guidance message SF 30 is a message having the content in accordance with a reason for a failure in acquisition of the predefined number of face images, such as “Motion is too fast”, “It is too bright”, or the like.
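The selection of the guidance message SF 30 by failure reason can be sketched as a simple lookup. The reason codes and the fallback message are hypothetical; only the two example messages ("Motion is too fast", "It is too bright") appear in the description above.

```python
# Hypothetical failure-reason codes; the patent only gives the two
# example messages, not an enumeration of reasons.
GUIDANCE_MESSAGES = {
    "motion_too_fast": "Motion is too fast",
    "too_bright": "It is too bright",
}

def guidance_message(failure_reason):
    """Select the guidance message SF30 in accordance with the reason the
    predefined number of face images could not be acquired (step S138)."""
    return GUIDANCE_MESSAGES.get(failure_reason, "Please try again")
```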
- face authentication is performed for the user by the information processing apparatus 10 according to the present example embodiment.
- display providing an instruction about a particular movement on the display unit 20 is controlled and changed, and the progress status of acquisition of face images is displayed on the display unit 20 in accordance with the movement status of the user in the movement captured by the capture unit 22 . Therefore, according to the present example embodiment, a face image that is biometrics information necessary for impersonation determination can be properly acquired based on a user's movement.
- FIG. 11 is a block diagram illustrating a configuration of the information processing apparatus according to the present example embodiment. Note that the same components as those in the first example embodiment described above are labeled with the same references, and the description thereof will be omitted or simplified.
- In the first example embodiment described above, the information processing apparatus 10 is configured to perform face authentication at login to a particular application.
- However, face authentication may be performed at entry to a particular place such as a room, an area, or the like.
- In the present example embodiment, an example will be described in which an information processing apparatus is configured as a control apparatus that controls a door such as an automatic door, a gate, or the like to restrict entry to a particular place and performs face authentication at entry to the particular place.
- an information processing apparatus 210 has a door 30 that restricts entry to a particular place in addition to the configuration of the information processing apparatus 10 according to the first example embodiment illustrated in FIG. 1 .
- the door 30 is formed of an automatic door, a security gate, or the like, for example, and installed at an entrance of a particular place such as a room, an area, or the like where entry of the user is restricted.
- the door 30 performs a door-open operation and a door-close operation under the control of the CPU 12 .
- the information processing apparatus 210 according to the present example embodiment performs the same face authentication as in the information processing apparatus 10 according to the first example embodiment described above when the user enters a particular place where entry is restricted by the door 30 . That is, the information processing apparatus 210 according to the present example embodiment opens the door 30 when face authentication is successful and permits entry of the user to a particular place. On the other hand, the information processing apparatus 210 maintains the door 30 to be closed when face authentication fails and rejects entry of the user to a particular place.
- the information processing apparatus can be configured to perform the same face authentication as the first example embodiment at entry to a particular place.
- face authentication may be performed at login to a server from a user terminal, for example.
- the information processing apparatus can be configured as a server that accepts login from a user terminal via a network, and face authentication can be performed at login from the user terminal.
- an information processing apparatus configured as a server may have the configuration that functions as the CPU 12 , the RAM 14 , and the storage unit 16 that are the same as those of the first example embodiment.
- the user terminal may have the configuration that functions as the input unit 18 , the display unit 20 , the capture unit 22 , and the audio output unit 24 that are the same as those of the first example embodiment, for example.
- an information processing apparatus can be configured to perform the same face authentication as that in the first example embodiment in various scenes that require authentication.
- FIG. 12 is a block diagram illustrating the configuration of the information processing apparatus according to another example embodiment.
- an information processing apparatus 1000 has a movement instruction unit 1002 that instructs the user about a movement and an information acquisition unit 1004 that acquires biometrics information on the user from the user instructed about the movement. Further, the information processing apparatus 1000 has a display control unit 1006 that controls display directed to the user in accordance with the movement status of the user.
- display directed to the user is controlled in accordance with the movement status of the user.
- necessary biometrics information can be suitably acquired based on the user's movement.
- As the biometrics information, a gait image, a fingerprint image, an iris image, a finger vein image, a palm image, a palm vein image, or the like may be acquired other than a face image.
- Biometrics authentication may be performed by using biometrics information acquired from a user instructed to perform a particular movement instead of face authentication using a face image as biometrics information.
- biometrics information such as a face image can be utilized for various purposes.
- The information processing apparatuses 10 and 210 may be configured as a system including one or a plurality of apparatuses.
- each of the example embodiments includes a processing method that stores, in a storage medium, a program that causes the configuration of each of the example embodiments to operate so as to implement the function of each of the example embodiments described above, reads the program stored in the storage medium as a code, and executes the program in a computer. That is, the scope of each of the example embodiments also includes a computer readable storage medium. Further, each of the example embodiments includes not only the storage medium in which the computer program described above is stored but also the computer program itself.
- As the storage medium, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a compact disk-read only memory (CD-ROM), a magnetic tape, a non-volatile memory card, or a ROM, for example, may be used.
- The scope of each example embodiment includes not only those executing a process with a program itself stored in the storage medium but also those operating on an operating system (OS) in cooperation with the function of another piece of software, an extension board, or the like to execute the process.
- Further, a service implemented by the function of each of the example embodiments described above may be provided to the user in a form of Software as a Service (SaaS).
- An example advantage according to the invention is that necessary biometrics information can be suitably acquired based on a user's movement.
- An information processing apparatus comprising:
- a movement instruction unit that instructs a user about a movement
- an information acquisition unit that acquires biometrics information on the user from the user instructed about the movement
- a display control unit that controls display directed to the user in accordance with a movement status of the user.
- the information processing apparatus according to supplementary note 1, wherein the movement instruction unit indicates the movement by causing a display unit to display the display that instructs the user about the movement, and wherein the display control unit controls the display that instructs the user about the movement in accordance with the movement status of the user.
- the information processing apparatus according to supplementary note 1 or 2, wherein the display control unit causes a display unit to display the display indicating progress status of acquisition of the biometrics information in accordance with the movement status of the user.
- the information processing apparatus according to any one of supplementary notes 1 to 3 further comprising an information provide unit that provides guidance information when the information acquisition unit is unable to acquire a predetermined amount of the biometrics information within a predetermined time period.
- the information processing apparatus according to supplementary note 4, wherein the guidance information is information regarding a movement or an environment of the user.
- the information processing apparatus according to any one of supplementary notes 1 to 5, wherein the movement instruction unit instructs the user about the movement related to a head of the user, and wherein the information acquisition unit acquires multiple pieces of face information as the biometrics information from the user.
- the information processing apparatus according to any one of supplementary notes 1 to 6 further comprising an impersonation determination unit that determines, based on the biometrics information, whether or not the user is impersonated.
- the information processing apparatus according to any one of supplementary notes 1 to 7 further comprising a comparison unit that compares the biometrics information acquired by the information acquisition unit with registered biometrics information.
- the information processing apparatus according to supplementary note 8, wherein the information acquisition unit stops acquisition of the biometrics information when there is no matching in comparison performed by the comparison unit.
- the information processing apparatus according to supplementary note 8 or 9 further comprising an authentication processing unit that performs authentication on the user based on a result of determination performed by the impersonation determination unit and a result of comparison performed by the comparison unit.
- An information processing method comprising: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
- A non-transitory storage medium storing a program that causes a computer to perform: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
Abstract
Description
- This application is a Continuation of International Application No. PCT/JP2018/041060, filed Nov. 5, 2018. The entire contents of the above-referenced application are expressly incorporated herein by reference.
- The present invention relates to an information processing apparatus, an information processing method, and a storage medium.
- International Publication No. WO 2015/194135 discloses an authentication apparatus that checks reality to see if a user is a human or not. The authentication apparatus disclosed in International Publication No. WO 2015/194135 checks the reality of a user by determining whether or not the user's response to a challenge is correct, the challenge being information based on which the user to be authenticated inputs information used for an authentication process.
- When it is determined whether or not there is impersonation by using a photograph or the like based on a user's movement in accordance with a particular instruction such as a challenge as with the art disclosed in International Publication No. WO 2015/194135, the user may be unable to move properly in accordance with the particular instruction. Thus, in the art disclosed in International Publication No. WO 2015/194135, it may be difficult to suitably collect necessary information based on the user's movement.
- In view of the problem described above, an example object of the present invention is to provide an information processing apparatus, an information processing method, and a storage medium that can suitably acquire necessary biometrics information based on a user's movement.
- According to one example aspect of the present invention, provided is an information processing apparatus including: a movement instruction unit that instructs a user about a movement; an information acquisition unit that acquires biometrics information on the user from the user instructed about the movement; and a display control unit that controls display directed to the user in accordance with a movement status of the user.
- According to another example aspect of the present invention, provided is an information processing method including: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
- According to yet another example aspect of the present invention, provided is a non-transitory storage medium storing a program that causes a computer to perform: instructing a user about a movement; acquiring biometrics information on the user from the user instructed about the movement; and controlling display directed to the user in accordance with a movement status of the user.
- FIG. 1 is a block diagram illustrating a configuration of an information processing apparatus according to a first example embodiment of the present invention.
- FIG. 2 is a flowchart illustrating an operation of face authentication of the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 3 is a flowchart illustrating an operation of impersonation determination in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 4 is a flowchart illustrating an operation of face recognition in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 5 is a schematic diagram illustrating one example of a login window in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 6 is a schematic diagram illustrating a face authentication window at the start of acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 7 is a schematic diagram illustrating one example of a face authentication window during acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 8 is a schematic diagram illustrating one example of a face authentication window at the completion of acquisition of face images in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 9 is a schematic diagram illustrating one example of a window after face authentication succeeds and login is completed in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 10 is a schematic diagram illustrating one example of a face authentication window when face authentication fails in the information processing apparatus according to the first example embodiment of the present invention.
- FIG. 11 is a block diagram illustrating a configuration of an information processing apparatus according to a second example embodiment of the present invention.
- FIG. 12 is a block diagram illustrating a configuration of an information processing apparatus according to another example embodiment of the present invention.
- An information processing apparatus and an information processing method according to a first example embodiment of the present invention will be described by using FIG. 1 to FIG. 10.
- First, the configuration of the information processing apparatus according to the present example embodiment will be described by using FIG. 1. FIG. 1 is a block diagram illustrating the configuration of the information processing apparatus according to the present example embodiment.
- As illustrated in FIG. 1, an information processing apparatus 10 according to the present example embodiment has a central processing unit (CPU) 12, a random access memory (RAM) 14, a storage unit 16, an input unit 18, a display unit 20, a capture unit 22, an audio output unit 24, and a communication unit 26. The CPU 12, the RAM 14, the storage unit 16, the input unit 18, the display unit 20, the capture unit 22, the audio output unit 24, and the communication unit 26 are connected to a common bus 28. - The
information processing apparatus 10 is, for example, a smartphone, while not particularly limited. The information processing apparatus 10 may be a tablet-type personal computer, a mobile phone, or the like. Further, the information processing apparatus 10 may be a computer apparatus such as a laptop personal computer, a desktop personal computer, or the like, for example. The information processing apparatus 10 can execute various application programs in accordance with an execution instruction from a user using the same. - The
CPU 12 operates by executing a program stored in the storage unit 16 and functions as a control unit that controls the operation of the entire information processing apparatus 10. Further, the CPU 12 performs various processes as the information processing apparatus 10 by executing a program stored in the storage unit 16. The RAM 14 provides a memory area necessary for the operation of the CPU 12. - The
information processing apparatus 10 according to the present example embodiment performs face authentication for a user when the user logs in to a particular application program stored in the storage unit 16. Furthermore, the information processing apparatus 10 is configured to be able to determine, in face authentication, impersonation of the user by using a non-living object such as a photograph, a moving image, or the like. - Note that the
information processing apparatus 10 can also perform the same face authentication at various timings other than at login to a particular application program. For example, the information processing apparatus 10 can also perform the same face authentication for the user at the start of a use of the information processing apparatus 10, such as at system startup or at unlocking of the information processing apparatus 10. Further, for example, the information processing apparatus 10 can also perform the same face authentication for the user at the start of access to a particular resource such as a particular file, a particular directory, a particular folder, or the like stored in the storage unit 16 or the like. In such a way, the information processing apparatus 10 can be configured to perform face authentication at the start of a particular process for the user who requests that process. - The
CPU 12 functions as each of the following function units used for face authentication for the user by executing a particular application program stored in the storage unit 16. That is, the CPU 12 functions as an authentication processing unit 12 a, a movement instruction unit 12 b, a face image acquisition unit 12 c, an information provide unit 12 d, an impersonation determination unit 12 e, and a face recognition unit 12 f. - The
authentication processing unit 12 a causes the display unit 20 to display a login window that requests the user for face authentication and accepts a login request from the user. For example, the user who is a target of face authentication is able to perform touch entry by pushing a particular region such as a login button, a login icon, or the like in a login window displayed on the display unit 20, which is configured integrally with the input unit 18 as a touchscreen display. This enables the user to input a login request to the information processing apparatus 10. Further, the user is able to input a login request to the information processing apparatus 10 by causing the capture unit 22 to capture his or her own face and inputting the face image to the information processing apparatus 10, for example. Once a login request is input by the user, the authentication processing unit 12 a causes the display unit 20 to display a face authentication window used for face authentication. - Further, the
authentication processing unit 12 a performs face authentication as authentication for the user based on a result of impersonation determination performed by the impersonation determination unit 12 e and a result of face recognition performed by the face recognition unit 12 f. That is, the authentication processing unit 12 a determines whether or not face authentication of the user is successful based on a result of impersonation determination and a result of face recognition and performs a process in accordance with the determination results. - The
authentication processing unit 12 a determines that the face authentication of the user is successful when it is determined by the impersonation determination unit 12 e that there is no impersonation and it is determined by the face recognition unit 12 f that there is a matching in face recognition, as described later. In response to determining that face authentication of the user is successful, the authentication processing unit 12 a permits login of the user to a particular application program and performs a login process for allowing the user to log in. - On the other hand, when it is determined by the
impersonation determination unit 12 e that there is impersonation or when it is determined by the face recognition unit 12 f that there is no matching in face recognition, the authentication processing unit 12 a determines that the face authentication of the user is unsuccessful, that is, the face authentication of the user failed. In response to determining that the face authentication of the user failed, the authentication processing unit 12 a denies login of the user to the application program. In this case, the authentication processing unit 12 a can perform a failure process such as a process of notifying the user that face authentication failed. For example, the authentication processing unit 12 a can cause the display unit 20 to display an indication that face authentication failed to notify the user. - In response to receiving entry of a login request from the user, the
movement instruction unit 12 b instructs the user, who is a target of face authentication, to perform a particular movement regarding the head captured by the capture unit 22. The movement instruction unit 12 b can instruct the user about a particular movement by causing the display unit 20 to display an instruction about the particular movement. The movement instruction unit 12 b can cause the display unit 20 to display an instruction message about a particular movement, a visual symbol that moves as an animation with a particular movement, or the like, for example, as the instruction display of a particular movement. A visual symbol may be, for example, a face icon that is an icon imitating a face, a pictogram, or the like. - A particular movement instructed by the
movement instruction unit 12 b may be, for example, a movement of shaking the head laterally or vertically, a movement of turning the head around, or the like, while not particularly limited. When a movement of shaking the head is instructed as a particular movement, the movement instruction unit 12 b can cause the display unit 20 to display an instruction message “Please shake your head” or cause the display unit 20 to display a face icon that shakes the head in an animation movement, for example. - Further, the
movement instruction unit 12 b can function as a display control unit that controls and changes the display providing an instruction about a particular movement on the display unit 20, which is display directed to the user, in accordance with the movement state of the user in motion captured by the capture unit 22. For example, the movement instruction unit 12 b can control and change the display providing the instruction about a particular movement on the display unit 20 in accordance with a determination result by the face image acquisition unit 12 c as to whether or not a particular movement of the user is proper as described below. - For example, specifically, when providing an instruction about a movement of shaking the head as a particular movement, the
movement instruction unit 12 b controls and changes the display providing the instruction about a particular movement in the following manner, for example. - When it is determined by the face
image acquisition unit 12 c that the user's movement of shaking the head is proper, the movement instruction unit 12 b may stop the face icon's movement of shaking the head displayed on the display unit 20, for example. Further, when it is determined to be proper, the movement instruction unit 12 b may change the face icon display so as to have an affirmative facial expression suggesting that the user's particular movement is proper, for example. - On the other hand, when it is determined by the face
image acquisition unit 12 c that the user's movement of shaking the head is improper, the movement instruction unit 12 b can continue the face icon's movement of shaking the head displayed on the display unit 20 or change the face icon's movement of shaking the head to a wider movement. Further, when it is determined to be improper, the movement instruction unit 12 b may also change the face icon display so as to have a negative facial expression suggesting that the user's particular movement is improper, for example. Further, when it is determined to be improper, the movement instruction unit 12 b may cause the display unit 20 to emphasize and display an instruction message “Please shake your head” or may cause the display unit 20 to display an instruction message “Please shake your head widely.” - As discussed above, when it is determined that a user's particular movement is proper, the
movement instruction unit 12 b can control and change the display providing the instruction about a particular movement so as to indicate that the user's particular movement is proper. Further, when it is determined that a user's particular movement is improper, the movement instruction unit 12 b can control and change the display providing the instruction about a particular movement so as to indicate that the user's particular movement is improper. - Further, the
movement instruction unit 12 b may instruct the user about a particular movement by outputting a voice of a movement instruction from the audio output unit 24 in addition to or instead of displaying the movement instruction on the display unit 20. In such a case, the movement instruction unit 12 b can output a voice uttering an instruction message providing an instruction about a particular movement, such as “Please shake your head”, from the audio output unit 24. - The face
image acquisition unit 12 c is an information acquisition unit that acquires a face image that is biometrics information on the user from a moving image captured by the capture unit 22. The face image acquisition unit 12 c operates as below to acquire a face image. - The face
image acquisition unit 12 c determines whether or not a particular movement regarding the user's head in a moving image captured by the capture unit 22 is proper. When the user's particular movement is a movement from which a plurality of face images usable for estimating a three-dimensional shape of the user's head can be acquired for impersonation determination, the face image acquisition unit 12 c determines that the user's particular movement is proper. On the other hand, when the user's particular movement is a movement from which such a plurality of face images are unable to be acquired, the face image acquisition unit 12 c determines that the user's particular movement is improper. For example, when face images of multiple viewpoints are substantially unable to be acquired because the user's movement is too small, when the user's face has not been captured by the capture unit 22, or the like, it is not possible to acquire a plurality of face images that can be used for estimating a three-dimensional shape of the user's face. Therefore, in these cases, the face image acquisition unit 12 c determines that the user's particular movement is improper. - Note that the face
image acquisition unit 12 c may determine that the user's movement is proper even when a particular movement instructed by the movement instruction unit 12 b and the user's movement are not the same. For example, even when the user shakes the head vertically while the movement instruction unit 12 b displays an animation of shaking the head laterally, the user's movement may be a movement from which a plurality of face images usable for estimating the three-dimensional shape of the user's face can be acquired for impersonation determination. In such a case, the face image acquisition unit 12 c can determine that the user's movement is proper. - In response to determining that the user's particular movement is proper, the face
image acquisition unit 12 c detects a face image of the user in a moving image captured by the capture unit 22. Furthermore, the face image acquisition unit 12 c acquires the detected face image as face information on the user. At this time, the face image acquisition unit 12 c evaluates the quality of the detected face image and acquires a high-quality face image whose quality is above a predetermined level. For example, the face image acquisition unit 12 c can evaluate the quality of a face image based on a quality value calculated for each of a predetermined number of face feature points preset for estimating a three-dimensional shape of a face. The face image acquisition unit 12 c is not necessarily required to acquire a face image itself but may extract and acquire a feature amount from a face image as face information. - The face
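The quality evaluation described above can be sketched as follows. This is a hypothetical sketch: the specification only says that quality is evaluated from per-feature-point quality values, so the mean aggregation and the threshold value here are assumptions, not the patented method.

```python
# Hypothetical sketch: aggregate per-feature-point quality values into one
# face-image quality score (mean aggregation is an assumption).
def face_image_quality(point_qualities):
    return sum(point_qualities) / len(point_qualities)

def is_acceptable(point_qualities, threshold=0.6):
    # Keep only face images whose quality is above a predetermined level.
    return face_image_quality(point_qualities) >= threshold
```
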
image acquisition unit 12 c repeatedly acquires a face image of the user as described above until a time limit, which is a predetermined time period from the time when a particular movement is instructed by the movement instruction unit 12 b, has elapsed. Thereby, the face image acquisition unit 12 c attempts acquisition of a predefined number of face images necessary for impersonation determination. Note that the time limit used in acquiring face images is preset in accordance with required accuracy of impersonation determination, a time period required to complete authentication, or the like, for example. Further, the predefined number of face images necessary for impersonation determination is preset in accordance with required accuracy of impersonation determination, a time period required to complete authentication, or the like, for example. - Further, the face
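The time-limited acquisition loop can be sketched as below. The `capture_fn` callable is an illustrative stand-in for the detection-and-quality-check step; the specification does not define such an interface.

```python
import time

def acquire_face_images(capture_fn, required_count, time_limit_s):
    """Repeatedly try to acquire face images until the predefined number is
    collected or the time limit elapses. capture_fn is a hypothetical
    callable returning a face image, or None when no proper face is seen."""
    images = []
    deadline = time.monotonic() + time_limit_s
    while len(images) < required_count and time.monotonic() < deadline:
        image = capture_fn()
        if image is not None:
            images.append(image)
    # Second value reports whether acquisition succeeded within the limit.
    return images, len(images) >= required_count
```
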
image acquisition unit 12 c functions as a display control unit that causes the display unit 20 to display the progress status of acquisition of face images as display in accordance with the movement status of the user. The face image acquisition unit 12 c can cause the display unit 20 to display an acquisition rate of face images in percentage, which is a ratio of the number of acquired face images to the predefined number required for impersonation determination, for example, as the progress status of acquisition of face images in accordance with the movement status of the user. The face image acquisition unit 12 c may display the acquisition rate of face images as a numerical value in a form of percentage expression or may display the acquisition rate of face images by using a gauge or a progress bar whose length changes in accordance with the acquisition ratio of face images, for example. The shape of a gauge or a progress bar indicating an acquisition ratio of face images is not particularly limited, and a bar-like shape, a ring-like shape, or a frame-like shape may be employed, for example. The face image acquisition unit 12 c controls and changes the display indicating the progress status of acquisition of face images, which is display directed to the user, in accordance with the acquisition status of face images. The face image acquisition unit 12 c can cause the display unit 20 to display an indicator of the progress status of acquisition of face images, such as a gauge, a progress bar, or the like, in accordance with the user's movement status. Note that a form of display indicating the progress status of acquisition of face images is not limited to a gauge or a progress bar, and various forms of display may be employed. - When the time limit elapses without the predefined number of face images being acquired by the face
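The percentage shown on such a gauge or progress bar is simply the ratio of acquired images to the predefined number, as a minimal sketch:

```python
def acquisition_progress_percent(acquired, required):
    """Ratio of acquired face images to the predefined number, in percent,
    capped at 100 for display on a gauge or progress bar."""
    return min(100, acquired * 100 // required)
```
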
image acquisition unit 12 c and thus acquisition of the predefined number of face images fails, that is, when a predetermined amount of biometrics information is not acquired within a predetermined time period, the information provide unit 12 d provides guidance information to the user. The guidance information is information that may be a reference, a guide, a help, or the like for successful acquisition of the predefined number of face images by the face image acquisition unit 12 c when face authentication is again performed. The information provide unit 12 d can cause the display unit 20 to display and provide guidance information to the user, for example. Further, the information provide unit 12 d may output guidance information by using a voice from the audio output unit 24 and provide the guidance information to the user, for example. - The information provide
unit 12 d can provide, to the user, guidance information in accordance with a reason for being unable to acquire a face image by the face image acquisition unit 12 c. For example, when a face image is unable to be acquired because of an improper movement of the user, the information provide unit 12 d can provide guidance information regarding a user's particular movement. Further, for example, when a face image is unable to be acquired because of an improper authentication environment that is an environment where face authentication is performed, the information provide unit 12 d can provide guidance information regarding an authentication environment. Further, it is possible to provide, to the user, guidance information regarding a user's particular movement and guidance information regarding an authentication environment in combination. - The information provide
unit 12 d can provide, as guidance information regarding a user's particular movement, guidance information that guides the user to perform a proper particular movement. For example, for the user instructed to perform a head shaking movement as a particular movement, the information provide unit 12 d can display a guidance message regarding a head shaking movement on the display unit 20 or output a guidance message by using a voice from the audio output unit 24 as guidance information. In such a case, the guidance message may be “Please shake your head widely”, “Please shake your head slowly”, “Motion is too fast”, or the like. - Further, the information provide
unit 12 d can provide guidance information that guides the user to perform face authentication in a proper authentication environment as guidance information regarding the authentication environment. An authentication environment where guidance information may be provided is not particularly limited and may correspond to brightness of an authentication place that is a place where authentication is performed, a capturing distance that is a distance at which the user is captured by the capture unit 22, or the like, for example. - For example, when a face image above a certain quality is unable to be acquired by the face
image acquisition unit 12 c because of an excessively bright or excessively dark authentication place, the information provide unit 12 d can provide guidance information regarding brightness of the authentication place as guidance information. In such a case, the information provide unit 12 d can display a guidance message such as “It is too bright”, “It is too dark”, or the like on the display unit 20 or output the guidance message as a voice from the audio output unit 24, for example, as guidance information. - Further, for example, when a face image above a certain quality is unable to be acquired by the face
image acquisition unit 12 c because of an excessively long or excessively short capturing distance by the capture unit 22, the information provide unit 12 d can provide guidance information regarding a capturing distance by the capture unit 22. In such a case, the information provide unit 12 d can display a guidance message such as “It is too far”, “It is too close”, or the like on the display unit 20 or output the guidance message as a voice from the audio output unit 24, for example, as guidance information. - Note that the information provide
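The reason-dependent selection of a guidance message can be sketched as a simple lookup. The reason codes below are invented for illustration; the messages are the ones quoted in the description.

```python
# Hypothetical mapping from a failure reason to a guidance message. The
# reason codes are illustrative assumptions, not terms from the patent.
GUIDANCE_MESSAGES = {
    "movement_too_small": "Please shake your head widely",
    "movement_too_fast": "Please shake your head slowly",
    "too_bright": "It is too bright",
    "too_dark": "It is too dark",
    "too_far": "It is too far",
    "too_close": "It is too close",
}

def guidance_for(reason):
    # Fall back to a generic prompt for an unrecognized reason.
    return GUIDANCE_MESSAGES.get(reason, "Please try again")
```
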
unit 12 d can provide guidance information by causing the display unit 20 to display a visual symbol suggesting the content of a guidance message instead of or in addition to the guidance message. A visual symbol may be, for example, an icon, a pictogram, or the like. - The
impersonation determination unit 12 e performs impersonation determination based on a plurality of face images acquired by the face image acquisition unit 12 c. In impersonation determination, the impersonation determination unit 12 e determines whether or not the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like. - The
impersonation determination unit 12 e estimates a three-dimensional shape of the user's face based on a plurality of face images acquired by the face image acquisition unit 12 c in impersonation determination. The plurality of face images acquired by the face image acquisition unit 12 c are obtained when the face of the user performing a particular movement is captured by the capture unit 22 and thus are multi-viewpoint images captured from different viewpoints. The impersonation determination unit 12 e estimates a three-dimensional shape of the user's face from a plurality of face images, which are multi-viewpoint images. A method of estimating the three-dimensional shape is not particularly limited, and various methods can be used. For example, the impersonation determination unit 12 e can estimate a three-dimensional shape of the face of the user from the plurality of face images acquired by the face image acquisition unit 12 c by using a bundle adjustment method. - In estimation of a three-dimensional shape, the
impersonation determination unit 12 e can perform the following process, for example. That is, the impersonation determination unit 12 e acquires face information indicating respective positions of a predetermined number of face feature points in a face image for each of the plurality of face images acquired by the face image acquisition unit 12 c. The number of face feature points used for acquiring face information may be two or more, for example, which may be specifically ten face feature points at the right eye, the left eye, the nose, the center of the mouth, the right corner of the mouth, the left corner of the mouth, the right cheek, the left cheek, the right side of the chin, and the left side of the chin. - Furthermore, the
impersonation determination unit 12 e associates each of the predetermined number of face feature points between a plurality of face images based on the respective pieces of face information of the acquired plurality of face images. The impersonation determination unit 12 e can estimate a three-dimensional shape of the user's face based on a result of association of face feature points. According to such a process, each of a predetermined number of face feature points can be uniquely associated between a plurality of face images based on the respective pieces of face information of the plurality of face images. No complex process is required for this association. It is therefore possible to perform the association of face feature points between the plurality of face images more easily and more accurately, and it is possible to estimate a three-dimensional shape more easily and more accurately. - The
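Because every feature point is a named landmark, the cross-image association described above reduces to a lookup by name, as in this minimal sketch (the landmark identifiers paraphrase the specification's list of ten points; the data layout is an assumption):

```python
# Ten named feature points paraphrased from the description above. Naming
# makes cross-image association unique: "nose" in one image corresponds to
# "nose" in every other image, with no complex matching step.
LANDMARKS = [
    "right_eye", "left_eye", "nose", "mouth_center", "mouth_right_corner",
    "mouth_left_corner", "right_cheek", "left_cheek", "chin_right", "chin_left",
]

def associate_feature_points(face_infos):
    """face_infos: one dict per face image mapping landmark name -> (x, y).
    Returns, per landmark, its observed position in every image."""
    return {name: [info[name] for info in face_infos] for name in LANDMARKS}
```

The resulting per-landmark tracks are exactly the multi-view correspondences a bundle-adjustment style reconstruction would consume.
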
impersonation determination unit 12 e determines whether or not the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like by evaluating the estimated three-dimensional shape of the user's face. For example, the impersonation determination unit 12 e can determine whether or not there is impersonation by evaluating whether or not the estimated three-dimensional shape is three-dimensional, that is, whether the estimated three-dimensional shape is three-dimensional or two-dimensional. In such a case, the impersonation determination unit 12 e can evaluate whether the three-dimensional shape is three-dimensional or two-dimensional by determining a relationship between an evaluation amount regarding the estimated three-dimensional shape and a preset threshold. As an evaluation amount, a distance, an angle, or the like associated with a particular face feature point or an amount based thereon may be used, for example, without being particularly limited. Note that, as a method of evaluating whether a three-dimensional shape is three-dimensional or two-dimensional, various methods may be used without being particularly limited. - When the
impersonation determination unit 12 e evaluates that the estimated three-dimensional shape of a face is three-dimensional, the impersonation determination unit 12 e determines that the user's face is a face of a living body and is not impersonated. On the other hand, when the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape is two-dimensional, the impersonation determination unit 12 e determines that the user's face is impersonated by a non-living object such as a photograph, a moving image, or the like. - The
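The specification leaves the evaluation amount open; one plausible choice, shown here purely as an illustrative sketch, is the smallest singular value of the centered feature-point cloud, which is near zero exactly when the points lie on a plane (a photograph or a screen):

```python
import numpy as np

def is_planar(points_3d, tol=0.01):
    """Evaluate whether estimated 3D feature points are essentially
    two-dimensional. The smallest singular value of the centered point
    cloud measures out-of-plane spread (an assumed evaluation amount)."""
    pts = np.asarray(points_3d, dtype=float)
    centered = pts - pts.mean(axis=0)
    singular_values = np.linalg.svd(centered, compute_uv=False)
    # Compare against a threshold relative to the largest spread.
    return singular_values[-1] < tol * max(singular_values[0], 1e-12)

def is_living_face(points_3d):
    # A planar "face" suggests impersonation by a non-living object.
    return not is_planar(points_3d)
```
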
face recognition unit 12 f performs face recognition based on one or a plurality of face images acquired by the face image acquisition unit 12 c. In face recognition, the face recognition unit 12 f compares a target face image, which is a face image of the user captured by the capture unit 22 and acquired by the face image acquisition unit 12 c, with a registered face image, which is a pre-registered face image of the user, and determines whether or not the two face images match. For example, the face recognition unit 12 f can use the face image having the highest quality, such as a face image in which the user faces front, as the target face image out of the plurality of face images acquired by the face image acquisition unit 12 c and compare the target face image with the registered face image. Alternatively, the face recognition unit 12 f may compare each of the plurality of face images with the registered face image. Note that the registered face image, which is registered biometrics information on the user, is pre-stored and registered in the storage unit 16, for example.
- In the comparison between the two face images, the face recognition unit 12 f can calculate a matching score indicating the similarity between a feature amount of the target face image and a feature amount of the registered face image. The higher the similarity between the feature amounts of the two face images, the larger the matching score. When the comparison with the registered face image is performed for each of the plurality of face images, the average value, the maximum value, or the like of the matching scores calculated in the respective comparisons may be used as the matching score. When the matching score is greater than or equal to a predetermined threshold, the face recognition unit 12 f can determine that there is a matching in the comparison between the target face image and the registered face image, that is, there is a matching in the face recognition. On the other hand, when the matching score is less than the predetermined threshold, the face recognition unit 12 f can determine that there is no matching in the comparison between the target face image and the registered face image, that is, there is no matching in the face recognition.
- Note that, at the time before the predefined number of face images are acquired by the face image acquisition unit 12 c, the face recognition unit 12 f can use an already acquired face image as the target face image and compare that target face image with the registered face image in the same manner as described above. In such a case, at the time when the face recognition unit 12 f determines that there is no matching in the comparison, the face image acquisition unit 12 c can stop acquisition of face images even before the predefined number of face images are acquired.
- Note that some or all of the functions of the respective units of the
CPU 12 described above are not necessarily required to be implemented in the CPU 12 of the information processing apparatus 10, which is a single apparatus, but may be implemented by an external apparatus such as a server. For example, among the functions of the respective units of the CPU 12 described above, the functions of the impersonation determination unit 12 e, the face recognition unit 12 f, or the like may be implemented by a CPU of a server communicably coupled to the information processing apparatus 10 via a network. In such a case, the CPU 12 of the information processing apparatus 10 transmits a process request for the function of the impersonation determination unit 12 e, the face recognition unit 12 f, or the like to the server via the network, together with data necessary for the requested process, such as a face image or a feature amount extracted from a face image. Further, the CPU 12 receives a process result obtained by the function of the impersonation determination unit 12 e, the face recognition unit 12 f, or the like from the server via the network.
- The
storage unit 16 is formed of a storage medium such as a nonvolatile memory, for example, a flash memory, a hard disk drive, or the like. The storage unit 16 stores a program such as a particular application program executed by the CPU 12, data referenced by the CPU 12 in executing the program, and the like. For example, the storage unit 16 stores a face image of the user acquired from an image captured by the capture unit 22. Further, the storage unit 16 stores a registered face image of the user used in face recognition, for example.
- The input unit 18 accepts user input of information, an instruction, or the like to the information processing apparatus 10. The user may input various information or input an instruction to perform a process into the information processing apparatus 10 via the input unit 18. The input unit 18 is formed of a touchscreen embedded in the display unit 20, for example. Note that the input unit 18 may instead be formed of a keyboard, a mouse, or the like, for example.
- The display unit 20 displays various windows, such as an execution window of a particular application program, under the control of the CPU 12. For example, an execution window of a particular application program includes a login window, a face authentication window, a window after login, or the like. For example, the display unit 20 is formed of a liquid crystal display, an organic light emitting diode (OLED) display, or the like and is configured as a touchscreen display together with the input unit 18.
- The capture unit 22 captures an image including the face of the user who performs face authentication. For example, the capture unit 22 is formed of a digital camera that can capture a moving image and acquires the two-dimensional images forming a moving image at a predetermined framerate.
- The communication unit 26 transmits and receives data to and from an external apparatus such as a server via a network under the control of the CPU 12. The communication standard, the communication scheme, or the like of the communication unit 26 is not particularly limited and may be a wireless scheme or a wired scheme.
- In such a way, the
information processing apparatus 10 according to the present example embodiment is configured.
- In the information processing apparatus 10 according to the present example embodiment, the movement instruction unit 12 b controls and changes the display providing an instruction about a particular movement on the display unit 20 in accordance with the movement status of the user during the movement captured by the capture unit 22. This enables the user to determine whether or not his/her movement responding to the instruction about the particular movement is proper and to correct his/her movement appropriately. Further, the face image acquisition unit 12 c causes the display unit 20 to display the progress status of acquisition of face images. The user is thus also able to use the progress status of acquisition of face images as a material for determining whether or not his/her movement is proper. Therefore, according to the information processing apparatus 10 of the present example embodiment, it is possible to cause the user to move properly in accordance with the instruction about the particular movement, and a face image, which is biometrics information necessary for determination of impersonation, can be acquired properly based on the user's movement.
- Further, in the information processing apparatus 10 according to the present example embodiment, the information provide unit 12 d provides guidance information when acquisition of the predefined number of face images by the face image acquisition unit 12 c fails. Therefore, according to the information processing apparatus 10 of the present example embodiment, it is possible to cause the user to act so that the predefined number of face images can be acquired by the face image acquisition unit 12 c when the user performs face authentication again.
- The operation of face authentication in the information processing apparatus 10 according to the present example embodiment will be further described below by using FIG. 2 to FIG. 10. FIG. 2 is a flowchart illustrating the operation of face authentication in the information processing apparatus 10. FIG. 3 is a flowchart illustrating the operation of impersonation determination in the information processing apparatus 10. FIG. 4 is a flowchart illustrating the operation of face recognition in the information processing apparatus 10. FIG. 5 to FIG. 10 are schematic diagrams respectively illustrating examples of a series of windows on the display unit 20 during the face authentication operation in the information processing apparatus 10. In response to the operation of face authentication in the information processing apparatus 10 being performed, an information processing method according to the present example embodiment is performed.
- First, when a particular application program is executed by the
CPU 12, the authentication processing unit 12 a causes the display unit 20 to display a login window that requests face authentication from the user and accepts a login request of the user, as illustrated in FIG. 2 (step S102).
- FIG. 5 illustrates one example of a login window SL that requests face authentication. As illustrated in FIG. 5, on the login window SL, a face icon SL12 indicating that face authentication is requested in the login operation is displayed. For example, by pushing the face icon SL12, the user may input, to the information processing apparatus 10, a login request that requests login to a particular application program.
- The authentication processing unit 12 a continuously determines whether or not a login request is input by the user (step S104) and stands by until a login request is input (step S104, NO).
- In response to determining that a login request is input (step S104, YES), the authentication processing unit 12 a causes the display unit 20 to display a face authentication window used for performing face authentication (step S106).
- FIG. 6 illustrates one example of a face authentication window SF at the start of face authentication, that is, at the start of acquisition of a face image by the face image acquisition unit 12 c. As illustrated in FIG. 6, the face authentication window SF includes a moving image display region SF12 that displays a moving image captured by the capture unit 22 and a movement instruction display region SF14 that displays an instruction about a particular movement from the movement instruction unit 12 b. Further, the face authentication window SF includes a bar-like gage SF16 indicating the progress status of acquisition of face images and frame-like gages SF18L and SF18R similarly indicating the progress status of acquisition of face images. Furthermore, the face authentication window SF includes a "Use password" button SF20 used for switching the authentication scheme to a password scheme and a "Cancel" button SF22 used for cancelling face authentication.
- The moving image display region SF12 displays, in substantially real time, a moving image including the face of the user captured by the
capture unit 22. The user is able to determine whether or not his/her movement responding to the instruction about a particular movement from the movement instruction unit 12 b is proper by checking the moving image of his/her face displayed in the moving image display region SF12.
- The movement instruction display region SF14 displays an instruction message SF142 providing an instruction about a particular movement and a face icon SF144 similarly providing an instruction about the particular movement. The instruction message SF142 indicates the content of the particular movement instructed by the movement instruction unit 12 b. Further, the face icon SF144 moves as an animation with the particular movement instructed by the movement instruction unit 12 b. The movement instruction unit 12 b instructs the user about the particular movement by using the instruction message SF142 such as "Please shake your head", for example. Further, the movement instruction unit 12 b instructs the user about the particular movement by using the face icon SF144 that shakes the head as an animation, for example.
- The bar-like gage SF16 changes the length thereof in accordance with the progress status of acquisition of face images performed by the face image acquisition unit 12 c. Specifically, the bar-like gage SF16 extends from the left to the right in accordance with the acquisition rate of face images achieved by the face image acquisition unit 12 c. Further, inside the bar-like gage SF16, the acquisition rate of face images is displayed as a numerical value in percentage representation.
- The frame-like gages SF18L and SF18R are arranged so as to form the left half frame portion and the right half frame portion around the moving image display region SF12, respectively. The frame-like gages SF18L and SF18R change the lengths thereof in accordance with the progress status of acquisition of face images performed by the face image acquisition unit 12 c, in a similar manner to the bar-like gage SF16. Specifically, the frame-like gages SF18L and SF18R extend from the bottom to the top in accordance with the acquisition rate of face images.
- On the frame surrounding the moving image display region SF12 formed of the frame-like gages SF18L and SF18R, the face icon SF144 is arranged at the center on the upper side, and the bar-like gage SF16 is arranged at the center on the lower side. The frame-like gages SF18L and SF18R indicate an acquisition rate of face images of 0% at the center of the bar-like gage SF16 and an acquisition rate of face images of 100% at the reaching point at the face icon SF144, respectively.
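The gage behavior described above can be sketched as follows. This is an illustrative sketch only, not part of the disclosed embodiment; the dimension constants and function names are hypothetical.

```python
# Illustrative sketch (not from the embodiment): mapping the face-image
# acquisition rate onto the bar-like gage SF16 and the frame-like gages
# SF18L/SF18R. All names and dimensions here are hypothetical.

BAR_MAX_LEN = 300    # assumed full width of the bar-like gage, in pixels
FRAME_MAX_LEN = 400  # assumed full height of each frame-like gage, in pixels

def acquisition_rate(acquired: int, required: int) -> float:
    """Fraction of the predefined number of face images acquired so far."""
    return min(acquired / required, 1.0)

def gage_lengths(acquired: int, required: int) -> dict:
    rate = acquisition_rate(acquired, required)
    return {
        "bar_px": int(BAR_MAX_LEN * rate),       # extends left to right
        "frame_px": int(FRAME_MAX_LEN * rate),   # each frame extends bottom to top
        "percent_label": f"{int(rate * 100)}%",  # numerical value inside the bar
    }

print(gage_lengths(5, 10))
```

With 5 of 10 face images acquired, the bar and frames sit at half of their maximum lengths and the label reads "50%", matching the state shown in FIG. 7.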
- The “Use password” button SF20 is a button used for switching the authentication scheme used for login to a particular application program from a face authentication scheme to a password scheme. The user is able to switch the authentication scheme used for login to a password scheme which requires entry of a password by pushing the “Use password” button SF20.
- The “Cancel” button SF22 is a button used for cancelling face authentication used for login to a particular application program. The user is able to cancel face authentication by pushing the “Cancel” button SF22.
- Once the face authentication window is displayed, as illustrated in
FIG. 2, the movement instruction unit 12 b causes the display unit 20 to display, in the face authentication window, an instruction message, which provides an instruction about a particular movement, and a face icon, which moves as an animation with the particular movement, and thereby instructs the user about the particular movement (step S108). The movement instruction unit 12 b can provide an instruction about a movement of shaking the head by using an instruction message, a face icon, or the like, for example.
- Once the user is instructed by the movement instruction unit 12 b to perform the particular movement, the face image acquisition unit 12 c attempts acquisition of the predefined number of the user's face images necessary for impersonation determination until the time limit elapses.
- First, the face image acquisition unit 12 c determines whether or not the user's particular movement in the moving image captured by the capture unit 22 is proper (step S110).
- If it is determined by the face
image acquisition unit 12 c that the user's particular movement is improper (step S110, NO), unless the time limit has elapsed (step S112, NO), the movement instruction unit 12 b controls and changes the display providing the instruction about the particular movement (step S114). In this case, the movement instruction unit 12 b may continue the animation of the face icon shaking the head, may make the movement of the face icon shaking the head wider, or may emphasize the instruction message "Please shake your head" on the display unit 20, for example. The face image acquisition unit 12 c then continues to determine whether or not the user's particular movement is proper (step S110). Note that, if the time limit has elapsed (step S112, YES), the process proceeds to step S138 described below.
- On the other hand, if it is determined by the face image acquisition unit 12 c that the user's particular movement is proper (step S110, YES), the movement instruction unit 12 b likewise controls and changes the display providing the instruction about the particular movement (step S116). In this case, the movement instruction unit 12 b may stop the movement of the face icon shaking the head or may change the display of the face icon to a smiling facial expression or the like suggesting that the user's particular movement is proper, for example.
- If the face image acquisition unit 12 c determines that the user's particular movement is proper (step S110, YES), the face image acquisition unit 12 c detects a face image of the user in the moving image captured by the capture unit 22 (step S118).
- Next, the face image acquisition unit 12 c evaluates the quality of the detected face image (step S120). If the face image acquisition unit 12 c determines that the quality of the face image is not above a predetermined quality (step S120, NO), the process proceeds to step S110 described above to attempt acquisition of face images again.
- On the other hand, if the face image acquisition unit 12 c determines that the quality of the face image is above the predetermined quality (step S120, YES), the face image acquisition unit 12 c acquires the high quality face image having a quality above the predetermined quality (step S122).
- In response to acquiring a face image, the face image acquisition unit 12 c controls and changes the display indicating the progress status of acquisition of face images, such as a gage, a progress bar, or a numerical value in percentage representation indicating the acquisition rate of face images (step S124).
-
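The acquisition loop of steps S110 to S128 can be sketched as follows. This is a hedged sketch, not the embodiment's implementation: the callback names, the quality scale, and the reason tags are hypothetical.

```python
# Illustrative sketch (not the embodiment's code) of the acquisition loop of
# steps S110-S128, with hypothetical callbacks supplied by the caller.
import time

def acquire_face_images(required, time_limit_s, *, is_movement_proper,
                        detect_face, quality_of, update_progress,
                        min_quality=0.8):
    """Collect `required` face images above `min_quality` before the deadline.

    Returns (images, reason); reason is None on success, otherwise a short
    tag usable for guidance information (step S138).
    """
    images = []
    deadline = time.monotonic() + time_limit_s
    while len(images) < required:
        if time.monotonic() >= deadline:
            return images, "time_limit_elapsed"           # S112/S128, YES
        if not is_movement_proper():                      # S110
            continue                                      # keep instructing the user
        face = detect_face()                              # S118
        if face is None or quality_of(face) < min_quality:  # S120
            continue                                      # retry acquisition
        images.append(face)                               # S122
        update_progress(len(images), required)            # S124
    return images, None                                   # S126, YES
```

A caller would pass the moving-image-based checks of the face image acquisition unit 12 c as the callbacks; on a `"time_limit_elapsed"` result, control would pass to the guidance step S138.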
FIG. 7 illustrates one example of the face authentication window SF while face images are being acquired by the face image acquisition unit 12 c. Note that FIG. 7 illustrates the face authentication window SF when the acquisition rate of face images is 50%. In the face authentication window SF during acquisition of face images, the bar-like gage SF16 extends from the left to the right, and the frame-like gages SF18L and SF18R extend from the bottom to the top, as the number of acquired face images increases, as illustrated in FIG. 7. Further, the numerical value of the acquisition rate of face images inside the bar-like gage SF16 increases as the number of acquired face images increases.
- Next, as illustrated in FIG. 2, the face image acquisition unit 12 c determines whether or not the predefined number of face images necessary for impersonation determination have been acquired (step S126). If the face image acquisition unit 12 c determines that the predefined number of face images have not been acquired (step S126, NO), unless the time limit has elapsed (step S128, NO), the process returns to step S110 described above to attempt acquisition of face images again. Note that, if the time limit has elapsed (step S128, YES), the process proceeds to step S138 described later.
- On the other hand, if the face image acquisition unit 12 c determines that the predefined number of face images have been acquired (step S126, YES), the impersonation determination unit 12 e performs impersonation determination (step S130), and the face recognition unit 12 f performs face recognition (step S132). Note that either of the impersonation determination by the impersonation determination unit 12 e and the face recognition by the face recognition unit 12 f may be performed before the other, or they may be performed in parallel. Further, the face recognition unit 12 f can perform face recognition by using already acquired face images at the time before the predefined number of face images are acquired by the face image acquisition unit 12 c.
- Note that FIG. 8 illustrates one example of the face authentication window SF at completion of acquisition of the predefined number of face images performed by the face image acquisition unit 12 c. As illustrated in FIG. 8, in the face authentication window SF at the completion of acquisition of face images, the bar-like gage SF16 and the frame-like gages SF18L and SF18R have extended to their maximum, respectively. Further, the acquisition rate of face images inside the bar-like gage SF16 is 100%. Furthermore, in the face authentication window SF at the completion of acquisition of face images, a completion icon SF24 indicating the completion of acquisition of face images is displayed instead of the face icon SF144, and the completion of the process is visually indicated to the user.
- The
impersonation determination unit 12 e first estimates a three-dimensional shape of the user's face based on the plurality of face images acquired by the face image acquisition unit 12 c, as illustrated in FIG. 3 (step S202).
- Next, the impersonation determination unit 12 e evaluates whether or not the estimated three-dimensional shape is three-dimensional (step S204).
- If the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape is three-dimensional (step S204, YES), the impersonation determination unit 12 e determines that the user's face is a face of a living body and is not impersonated (step S206).
- On the other hand, if the impersonation determination unit 12 e evaluates that the estimated three-dimensional shape is not three-dimensional, that is, is two-dimensional (step S204, NO), the impersonation determination unit 12 e determines that the user's face is impersonated by a photograph, a moving image, or the like (step S208).
- On the other hand, the
face recognition unit 12 f first selects a target face image to be used for comparison with the registered face image from the plurality of face images acquired by the face image acquisition unit 12 c, as illustrated in FIG. 4 (step S302).
- Further, the face recognition unit 12 f reads and acquires the registered face image to be compared with the target face image from the storage unit 16 or the like (step S304).
- Next, the face recognition unit 12 f calculates a matching score indicating the similarity between a feature amount of the target face image and a feature amount of the registered face image (step S306).
- Next, the face recognition unit 12 f determines whether or not the calculated matching score is greater than or equal to a predetermined threshold (step S308).
- If the face recognition unit 12 f determines that the matching score is greater than or equal to the predetermined threshold (step S308, YES), the face recognition unit 12 f determines that there is a matching in the comparison between the target face image and the registered face image, that is, there is a matching in the face recognition (step S310).
- On the other hand, if the face recognition unit 12 f determines that the matching score is less than the predetermined threshold (step S308, NO), the face recognition unit 12 f determines that there is no matching in the comparison between the target face image and the registered face image, that is, there is no matching in the face recognition (step S312).
- Note that the face recognition unit 12 f may perform the process illustrated in FIG. 4 using already acquired face images even at the time before the predefined number of face images are acquired by the face image acquisition unit 12 c.
- Next, as illustrated in
FIG. 2, the authentication processing unit 12 a determines whether or not the face authentication of the user is successful based on the result of the impersonation determination performed by the impersonation determination unit 12 e and the result of the face recognition performed by the face recognition unit 12 f (step S134).
- If the authentication processing unit 12 a determines that the face authentication of the user is successful (step S134, YES), the authentication processing unit 12 a permits the user to log in to the particular application program and performs a login process to allow the user to log in (step S136). The authentication processing unit 12 a performs the login process and causes the display unit 20 to display a window after login of the particular application. The authentication processing unit 12 a determines that the face authentication is successful if the impersonation determination unit 12 e determines that there is no impersonation (step S206) and the face recognition unit 12 f determines that there is a matching in the face recognition (step S310).
-
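The determination of step S134, which combines the impersonation determination and the face recognition, can be sketched as follows. The embodiment deliberately leaves the evaluation amount of step S204 and the comparison algorithm open; as one possibility only, this sketch fits a plane to the estimated three-dimensional feature points and treats a near-zero residual as a two-dimensional (impersonated) shape. The thresholds are hypothetical.

```python
# Illustrative sketch (one possible realization; the embodiment leaves the
# evaluation amount and the matching algorithm open). A plane is fitted to
# the estimated 3D face feature points; a near-zero residual suggests a flat
# (two-dimensional) presentation such as a photograph.
import numpy as np

def planarity_residual(points_3d: np.ndarray) -> float:
    """RMS distance of the points from their best-fit plane (via SVD)."""
    centered = points_3d - points_3d.mean(axis=0)
    # Smallest singular value measures the spread along the plane normal.
    return np.linalg.svd(centered, compute_uv=False)[-1] / np.sqrt(len(points_3d))

def face_authentication(points_3d, matching_score,
                        depth_threshold=0.05, score_threshold=0.8):
    is_live = planarity_residual(points_3d) > depth_threshold  # step S204
    is_match = matching_score >= score_threshold               # step S308
    return bool(is_live and is_match)                          # step S134
```

Under this sketch, authentication succeeds only when the estimated shape has depth (no impersonation) and the matching score of step S306 clears its threshold, mirroring the two conditions of steps S206 and S310.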
FIG. 9 illustrates a window after login SA of the particular application after the face authentication has succeeded. As illustrated in FIG. 9, the window after login SA is a window of the particular application to which the user has logged in.
- On the other hand, if the authentication processing unit 12 a determines that the face authentication of the user failed (step S134, NO), the authentication processing unit 12 a rejects login of the user to the particular application program. The authentication processing unit 12 a determines that the face authentication failed if the impersonation determination unit 12 e determines that there is impersonation (step S208) or if the face recognition unit 12 f determines that there is no matching in the face recognition (step S312). In this case, the authentication processing unit 12 a performs a failure process such as a process that notifies the user that the face authentication failed (step S140).
- As described above, unlike the case where the predefined number of face images are acquired by the face image acquisition unit 12 c and face authentication is performed, face authentication may also fail because the predefined number of face images were not obtained before the time limit elapsed (step S112, YES or step S128, YES). In these cases, the information provide unit 12 d provides guidance information regarding the user's particular movement or the authentication environment in accordance with the reason the face images were unable to be acquired by the face image acquisition unit 12 c (step S138). For example, the information provide unit 12 d can display the guidance information on the display unit 20 or output a voice from the audio output unit 24 to provide the guidance information to the user. Thereby, it is possible to improve the probability that the predefined number of face images will be acquired by the face image acquisition unit 12 c within the time limit when face authentication is performed again.
- Next, the
authentication processing unit 12 a performs a failure process such as a process to notify the user that the face authentication failed (step S140). -
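The guidance provision of step S138 can be sketched as a lookup from a failure reason to guidance information. The reason tags and the fallback message here are hypothetical; the two quoted messages are those named in the embodiment.

```python
# Illustrative sketch (reason tags and fallback are hypothetical): selecting
# guidance information in accordance with the reason the predefined number
# of face images could not be acquired (step S138).
GUIDANCE = {
    "movement_too_fast": "Motion is too fast",
    "too_bright": "It is too bright",
}

def guidance_message(reason: str) -> str:
    # Fall back to a generic retry prompt for reasons without a dedicated message.
    return GUIDANCE.get(reason, "Please try again")

print(guidance_message("too_bright"))
```

The selected message would be displayed as the guidance message SF30 on the display unit 20, or output as a voice from the audio output unit 24.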
FIG. 10 illustrates the face authentication window SF at a face authentication failure in which the face authentication failed because the predefined number of face images were not acquired before the time limit elapsed. As illustrated in FIG. 10, in the face authentication window SF at the face authentication failure, a failure icon SF26, such as an exclamation mark icon, indicating that the predefined number of face images were not acquired and thus the face authentication failed is displayed instead of the face icon SF144. Further, instead of the frame-like gages SF18L and SF18R, a failure frame SF28 indicating that the face authentication failed by using a color or the like different from that of the frame-like gages SF18L and SF18R is displayed around the moving image display region SF12, for example. Further, while the moving image display region SF12 displays a color moving image captured by the capture unit 22 while face images are being acquired, it displays a grayscale or black and white frame at a particular point of time of the moving image captured by the capture unit 22 when the face authentication fails, for example. Thereby, the moving image display region SF12 indicates that the face authentication failed.
- Furthermore, in the moving image display region SF12, a guidance message SF30 is displayed as the guidance information provided by the information provide unit 12 d. For example, the guidance message SF30 is a message whose content is in accordance with the reason for the failure in acquisition of the predefined number of face images, such as "Motion is too fast", "It is too bright", or the like.
- In such a way, face authentication is performed for the user by the
information processing apparatus 10 according to the present example embodiment.
- As discussed above, according to the present example embodiment, the display providing an instruction about a particular movement on the display unit 20 is controlled and changed, and the progress status of acquisition of face images is displayed on the display unit 20, in accordance with the movement status of the user in the movement captured by the capture unit 22. Therefore, according to the present example embodiment, a face image, which is biometrics information necessary for impersonation determination, can be properly acquired based on the user's movement.
- An information processing apparatus according to a second example embodiment of the present invention will be described by using FIG. 11. FIG. 11 is a block diagram illustrating a configuration of the information processing apparatus according to the present example embodiment. Note that the same components as those in the first example embodiment described above are labeled with the same references, and the description thereof will be omitted or simplified.
- While the case where the
information processing apparatus 10 is configured to perform face authentication at login to a particular application has been described in the first example embodiment, the invention is not limited thereto. For example, face authentication may be performed at entry to a particular place such as a room, an area, or the like. In the present example embodiment, a case will be described where an information processing apparatus is configured as a control apparatus that controls a door, such as an automatic door, a gate, or the like, to restrict entry to a particular place and performs face authentication at entry to the particular place.
- As illustrated in FIG. 11, an information processing apparatus 210 according to the present example embodiment has a door 30 that restricts entry to a particular place, in addition to the configuration of the information processing apparatus 10 according to the first example embodiment illustrated in FIG. 1.
- The door 30 is formed of an automatic door, a security gate, or the like, for example, and is installed at an entrance of a particular place, such as a room, an area, or the like, where entry of the user is restricted. The door 30 performs a door-open operation and a door-close operation under the control of the CPU 12.
- The information processing apparatus 210 according to the present example embodiment performs the same face authentication as the information processing apparatus 10 according to the first example embodiment described above when the user enters the particular place where entry is restricted by the door 30. That is, the information processing apparatus 210 according to the present example embodiment opens the door 30 when the face authentication is successful and permits entry of the user to the particular place. On the other hand, the information processing apparatus 210 keeps the door 30 closed when the face authentication fails and rejects entry of the user to the particular place.
- As with the present example embodiment, the information processing apparatus can be configured to perform the same face authentication as in the first example embodiment at entry to a particular place.
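The door control of the second example embodiment can be sketched as follows; the `Door` interface and its method names are hypothetical, since the embodiment only states that the door 30 opens and closes under the control of the CPU 12.

```python
# Illustrative sketch (hypothetical Door interface): gating entry to a
# particular place with the face authentication result, as in the second
# example embodiment.
class Door:
    """Stand-in for the door 30 (an automatic door, security gate, etc.)."""
    def __init__(self) -> None:
        self.is_open = False

    def open_door(self) -> None:
        self.is_open = True

    def close_door(self) -> None:
        self.is_open = False

def control_entry(door: Door, authentication_succeeded: bool) -> None:
    if authentication_succeeded:
        door.open_door()    # permit entry to the particular place
    else:
        door.close_door()   # keep the door closed and reject entry
```

The same face authentication flow as in the first example embodiment would supply `authentication_succeeded`; only the action taken on the result differs between login and entry control.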
- Further, in addition to the above, face authentication may be performed at login to a server from a user terminal, for example. Accordingly, the information processing apparatus can be configured as a server that accepts login from a user terminal via a network, and face authentication can be performed at login from the user terminal. In such a case, an information processing apparatus configured as a server may have the configuration that functions as the
CPU 12, the RAM 14, and the storage unit 16 that are the same as those of the first example embodiment. The user terminal may have the configuration that functions as the input unit 18, the display unit 20, the capture unit 22, and the audio output unit 24 that are the same as those of the first example embodiment, for example. - As discussed above, an information processing apparatus can be configured to perform the same face authentication as that in the first example embodiment in various scenes that require authentication.
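The server/terminal split described above can be sketched as two cooperating objects: the server side holds the processing and storage functions (corresponding to the CPU 12, the RAM 14, and the storage unit 16), while the terminal side holds the I/O functions (corresponding to the input unit 18, the display unit 20, the capture unit 22, and the audio output unit 24). The class names, byte-equality matcher, and result strings are illustrative assumptions.

```python
class AuthServer:
    """Server side: stores registered biometrics and performs matching."""

    def __init__(self, registered_face: bytes):
        self.registered_face = registered_face  # held in the storage function

    def authenticate(self, face_image: bytes) -> bool:
        # Placeholder comparison; a real system would use a face matcher.
        return face_image == self.registered_face


class UserTerminal:
    """Terminal side: captures the face image and shows the result."""

    def __init__(self, server: AuthServer):
        self.server = server

    def login(self, captured_face: bytes) -> str:
        # In practice this call would travel over the network.
        ok = self.server.authenticate(captured_face)
        return "login succeeded" if ok else "login failed"
```

The point of the split is that the terminal never stores registered biometrics; it only captures and forwards, leaving matching to the server.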
- The information processing apparatus described in the above example embodiments may be configured as illustrated in
FIG. 12 according to another example embodiment. FIG. 12 is a block diagram illustrating the configuration of the information processing apparatus according to another example embodiment. - As illustrated in
FIG. 12, an information processing apparatus 1000 according to another example embodiment has a movement instruction unit 1002 that instructs the user about a movement and an information acquisition unit 1004 that acquires biometrics information on the user from the user instructed about the movement. Further, the information processing apparatus 1000 has a display control unit 1006 that controls display directed to the user in accordance with the movement status of the user. - According to another example embodiment, display directed to the user is controlled in accordance with the movement status of the user. Thereby, necessary biometrics information can be suitably acquired based on the user's movement.
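The three units of the apparatus 1000 can be sketched as one class with one method per unit. The unit names mirror the description (movement instruction unit 1002, information acquisition unit 1004, display control unit 1006); the method bodies, instruction text, and progress representation are assumptions for illustration only.

```python
class InformationProcessingApparatus:
    """Sketch of the apparatus 1000 in FIG. 12."""

    def instruct_movement(self) -> str:
        # Movement instruction unit 1002: tell the user what movement to make.
        return "Please turn your face slowly to the left and right."

    def acquire_biometrics(self, frames: list) -> list:
        # Information acquisition unit 1004: collect biometrics information
        # (e.g. face images) from the user performing the movement,
        # discarding frames in which no face was captured.
        return [frame for frame in frames if frame is not None]

    def control_display(self, progress: float) -> str:
        # Display control unit 1006: update the display directed to the user
        # according to the movement status, summarized here as a 0.0-1.0
        # progress value.
        return f"Capturing... {int(progress * 100)}% complete"
```

Tying the display to the movement status is what lets the user see, mid-movement, whether enough biometrics information has been gathered.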
- The present invention is not limited to the example embodiments described above, and various modifications are possible.
- For example, while the above example embodiments have described the case where a plurality of face images are acquired as the biometrics information used for impersonation determination, the invention is not limited thereto. Other than a face image, a gait image, a fingerprint image, an iris image, a finger vein image, a palm image, a palm vein image, or the like may be acquired as biometrics information. Instead of face authentication using a face image as biometrics information, biometrics authentication may be performed using biometrics information acquired from a user instructed to perform a particular movement.
- Further, while the case where face images acquired as biometrics information are used to perform impersonation determination and face recognition has been described as an example in the above example embodiments, the invention is not limited thereto. The acquired biometrics information such as a face image can be utilized for various purposes.
- Further, the information processing apparatuses 10 and 210 according to the example embodiments described above may be configured as a system including one or a plurality of apparatuses.
- Further, the scope of each of the example embodiments includes a processing method that stores, in a storage medium, a program that causes the configuration of each of the example embodiments to operate so as to implement the function of each of the example embodiments described above, reads the program stored in the storage medium as a code, and executes the program in a computer. That is, the scope of each of the example embodiments also includes a computer readable storage medium. Further, each of the example embodiments includes not only the storage medium in which the computer program described above is stored but also the computer program itself.
- As the storage medium, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a compact disk-read only memory (CD-ROM), a magnetic tape, a non-volatile memory card, or a ROM, for example, may be used. Further, the scope of each example embodiment includes not only those executing a process with a program itself stored in the storage medium but also those operating on an operating system (OS) in cooperation with the function of another software or an extension board to execute the process.
- Services realized by the function of each example embodiment described above can be provided to the user in a form of Software as a Service (SaaS).
- An example advantage according to the invention is that necessary biometrics information can be suitably acquired based on a user's movement.
- The whole or part of the example embodiments disclosed above can be described as, but not limited to, the following supplementary notes.
- (Supplementary Note 1)
- An information processing apparatus comprising:
- a movement instruction unit that instructs a user about a movement;
- an information acquisition unit that acquires biometrics information on the user from the user instructed about the movement; and
- a display control unit that controls display directed to the user in accordance with a movement status of the user.
- (Supplementary Note 2)
- The information processing apparatus according to supplementary note 1,
- wherein the movement instruction unit indicates the movement by causing a display unit to display the display that instructs the user about the movement, and
- wherein the display control unit controls the display that instructs the user about the movement in accordance with the movement status of the user.
- (Supplementary Note 3)
- The information processing apparatus according to supplementary note 1 or 2, wherein the display control unit causes a display unit to display the display indicating progress status of acquisition of the biometrics information in accordance with the movement status of the user.
- (Supplementary Note 4)
- The information processing apparatus according to any one of supplementary notes 1 to 3 further comprising an information provide unit that provides guidance information when the information acquisition unit is unable to acquire a predetermined amount of the biometrics information within a predetermined time period.
- (Supplementary Note 5)
- The information processing apparatus according to supplementary note 4, wherein the guidance information is information regarding a movement or an environment of the user.
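Supplementary notes 4 and 5 can be sketched as a time-bounded acquisition loop that falls back to guidance information. This is an illustrative reading, not the claimed implementation: the function signature, the timeout mechanism, and the guidance text are all assumptions.

```python
import time

# Guidance regarding the user's movement or environment (supplementary note 5).
GUIDANCE = "Please move more slowly and face a brighter area."


def acquire_with_guidance(capture_one, required: int, timeout_s: float):
    """Try to acquire `required` biometrics samples within `timeout_s` seconds.

    Returns (samples, guidance); guidance is None when the predetermined
    amount was acquired within the predetermined time period, and is the
    guidance information otherwise (supplementary note 4).
    """
    samples = []
    deadline = time.monotonic() + timeout_s
    while len(samples) < required and time.monotonic() < deadline:
        sample = capture_one()  # e.g. one face image from the capture unit
        if sample is not None:
            samples.append(sample)
    if len(samples) < required:
        return samples, GUIDANCE
    return samples, None
```

Using a monotonic clock for the deadline keeps the predetermined time period stable even if the system clock is adjusted during acquisition.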
- (Supplementary Note 6)
- The information processing apparatus according to any one of supplementary notes 1 to 5,
- wherein the movement instruction unit instructs the user about the movement related to a head of the user, and
- wherein the information acquisition unit acquires multiple pieces of face information as the biometrics information from the user.
- (Supplementary Note 7)
- The information processing apparatus according to any one of supplementary notes 1 to 6 further comprising an impersonation determination unit that determines, based on the biometrics information, whether or not the user is impersonated.
- (Supplementary Note 8)
- The information processing apparatus according to supplementary note 7 further comprising a comparison unit that compares the biometrics information acquired by the information acquisition unit with registered biometrics information.
- (Supplementary Note 9)
- The information processing apparatus according to supplementary note 8, wherein the information acquisition unit stops acquisition of the biometrics information when there is no matching in comparison performed by the comparison unit.
- (Supplementary Note 10)
- The information processing apparatus according to supplementary note 8 or 9 further comprising an authentication processing unit that performs authentication on the user based on a result of determination performed by the impersonation determination unit and a result of comparison performed by the comparison unit.
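The flow in supplementary notes 7 to 10 can be condensed into one function: the comparison unit matches each acquired sample against the registered biometrics information, acquisition stops when there is no match (note 9), the impersonation determination unit checks liveness (note 7), and the authentication processing unit combines both results (note 10). The matcher and liveness check are passed in as placeholder callables; this is a reading of the notes, not the claimed implementation.

```python
def authenticate(samples, registered, matches, is_live) -> bool:
    """Authenticate a user from acquired biometrics samples."""
    for sample in samples:
        if not matches(sample, registered):
            # No match in the comparison: stop acquiring further
            # biometrics information and fail authentication (note 9).
            return False
    if not is_live(samples):
        # Impersonation (e.g. a photograph held to the camera) detected.
        return False
    return True
```

Stopping on the first mismatch means a non-registered user is rejected before the more expensive impersonation determination runs.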
- (Supplementary Note 11)
- An information processing method comprising:
- instructing a user about a movement;
- acquiring biometrics information on the user from the user instructed about the movement; and
- controlling display directed to the user in accordance with a movement status of the user.
- (Supplementary Note 12)
- A non-transitory storage medium storing a program that causes a computer to perform:
- instructing a user about a movement;
- acquiring biometrics information on the user from the user instructed about the movement; and
- controlling display directed to the user in accordance with a movement status of the user.
- While the present invention has been described with reference to the example embodiments, the present invention is not limited to the example embodiments described above. Various modifications that can be understood by those skilled in the art can be made to the configuration or the details of the present invention within the scope of the present invention.
Claims (20)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/227,707 US20210256282A1 (en) | 2018-11-05 | 2021-04-12 | Information processing apparatus, information processing method, and storage medium |
| US18/381,382 US20240045937A1 (en) | 2018-11-05 | 2023-10-18 | Information processing apparatus, information processing method, and storage medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2018/041060 WO2020095350A1 (en) | 2018-11-05 | 2018-11-05 | Information processing device, information processing method, and recording medium |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/041060 Continuation WO2020095350A1 (en) | 2018-11-05 | 2018-11-05 | Information processing device, information processing method, and recording medium |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/227,707 Continuation US20210256282A1 (en) | 2018-11-05 | 2021-04-12 | Information processing apparatus, information processing method, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20200143186A1 true US20200143186A1 (en) | 2020-05-07 |
Family
ID=70459944
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/388,239 Abandoned US20200143186A1 (en) | 2018-11-05 | 2019-04-18 | Information processing apparatus, information processing method, and storage medium |
| US17/227,707 Abandoned US20210256282A1 (en) | 2018-11-05 | 2021-04-12 | Information processing apparatus, information processing method, and storage medium |
| US18/381,382 Pending US20240045937A1 (en) | 2018-11-05 | 2023-10-18 | Information processing apparatus, information processing method, and storage medium |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/227,707 Abandoned US20210256282A1 (en) | 2018-11-05 | 2021-04-12 | Information processing apparatus, information processing method, and storage medium |
| US18/381,382 Pending US20240045937A1 (en) | 2018-11-05 | 2023-10-18 | Information processing apparatus, information processing method, and storage medium |
Country Status (5)
| Country | Link |
|---|---|
| US (3) | US20200143186A1 (en) |
| EP (1) | EP3879419B1 (en) |
| JP (4) | JPWO2020095350A1 (en) |
| SG (1) | SG11202104685WA (en) |
| WO (1) | WO2020095350A1 (en) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11200305B2 (en) * | 2019-05-31 | 2021-12-14 | International Business Machines Corporation | Variable access based on facial expression configuration |
| US20220019771A1 (en) * | 2019-04-19 | 2022-01-20 | Fujitsu Limited | Image processing device, image processing method, and storage medium |
| US11423878B2 (en) * | 2019-07-17 | 2022-08-23 | Lg Electronics Inc. | Intelligent voice recognizing method, apparatus, and intelligent computing device |
| US20220343673A1 (en) * | 2019-09-27 | 2022-10-27 | Nec Corporation | Information processing apparatus, information processing method and storage medium |
| US20220414193A1 (en) * | 2021-06-28 | 2022-12-29 | Capital One Services, Llc | Systems and methods for secure adaptive illustrations |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114022926B (en) * | 2021-10-18 | 2025-10-21 | 中国银联股份有限公司 | Face recognition method, device, equipment and storage medium |
| JPWO2025141668A1 (en) * | 2023-12-25 | 2025-07-03 |
Citations (35)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050212654A1 (en) * | 2003-09-29 | 2005-09-29 | Fuji Photo Film Co., Ltd. | Authentication system and program |
| US20080192980A1 (en) * | 2007-02-14 | 2008-08-14 | Samsung Electronics Co., Ltd. | Liveness detection method and apparatus of video image |
| US20090135188A1 (en) * | 2007-11-26 | 2009-05-28 | Tsinghua University | Method and system of live detection based on physiological motion on human face |
| US8254647B1 (en) * | 2012-04-16 | 2012-08-28 | Google Inc. | Facial image quality assessment |
| US20130015946A1 (en) * | 2011-07-12 | 2013-01-17 | Microsoft Corporation | Using facial data for device authentication or subject identification |
| US8441548B1 (en) * | 2012-06-15 | 2013-05-14 | Google Inc. | Facial image quality assessment |
| US8457367B1 (en) * | 2012-06-26 | 2013-06-04 | Google Inc. | Facial recognition |
| US8542879B1 (en) * | 2012-06-26 | 2013-09-24 | Google Inc. | Facial recognition |
| US8582835B2 (en) * | 2011-07-11 | 2013-11-12 | Accenture Global Services Limited | Liveness detection |
| US20140150091A1 (en) * | 2010-02-12 | 2014-05-29 | Apple Inc. | Biometric sensor for human presence detection and associated methods |
| US20140270412A1 (en) * | 2012-01-20 | 2014-09-18 | Cyberlink Corp. | Liveness detection system based on face behavior |
| US20140337930A1 (en) * | 2013-05-13 | 2014-11-13 | Hoyos Labs Corp. | System and method for authorizing access to access-controlled environments |
| US20150154392A1 (en) * | 2013-11-29 | 2015-06-04 | International Business Machines Corporation | Secure face authentication with liveness detection for mobile |
| US20160063235A1 (en) * | 2014-08-28 | 2016-03-03 | Kevin Alan Tussy | Facial Recognition Authentication System Including Path Parameters |
| US20160063314A1 (en) * | 2014-09-03 | 2016-03-03 | Samet Privacy, Llc | Image processing apparatus for facial recognition |
| US9444924B2 (en) * | 2009-10-28 | 2016-09-13 | Digimarc Corporation | Intuitive computing methods and systems |
| US20160342851A1 (en) * | 2015-05-22 | 2016-11-24 | Yahoo! Inc. | Computerized system and method for determining authenticity of users via facial recognition |
| US20160366129A1 (en) * | 2015-06-10 | 2016-12-15 | Alibaba Group Holding Limited | Liveness detection method and device, and identity authentication method and device |
| US20170053174A1 (en) * | 2015-08-18 | 2017-02-23 | Beijing Kuangshi Technology Co., Ltd. | Liveness detection apparatus and liveness detection method |
| US9619723B1 (en) * | 2016-02-17 | 2017-04-11 | Hong Kong Applied Science and Technology Research Institute Company Limited | Method and system of identification and authentication using facial expression |
| US20170124312A1 (en) * | 2014-06-19 | 2017-05-04 | Nec Corporation | Authentication device, authentication system, and authentication method |
| US20170344793A1 (en) * | 2014-10-22 | 2017-11-30 | Veridium Ip Limited | Systems and methods for performing iris identification and verification using mobile devices |
| US20180048645A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for determining user liveness and verifying user identities |
| US20180046853A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for determining user liveness and verifying user identities |
| US20180046852A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for enhancing user liveness detection |
| US9934443B2 (en) * | 2015-03-31 | 2018-04-03 | Daon Holdings Limited | Methods and systems for detecting head motion during an authentication transaction |
| US20180101721A1 (en) * | 2013-07-02 | 2018-04-12 | Robert Frank Nienhouse | System and method for locating and determining substance use |
| US20180173980A1 (en) * | 2016-12-15 | 2018-06-21 | Beijing Kuangshi Technology Co., Ltd. | Method and device for face liveness detection |
| US20180181737A1 (en) * | 2014-08-28 | 2018-06-28 | Facetec, Inc. | Facial Recognition Authentication System Including Path Parameters |
| US20180189960A1 (en) * | 2014-12-31 | 2018-07-05 | Morphotrust Usa, Llc | Detecting Facial Liveliness |
| US20180211096A1 (en) * | 2015-06-30 | 2018-07-26 | Beijing Kuangshi Technology Co., Ltd. | Living-body detection method and device and computer program product |
| US20180349682A1 (en) * | 2017-05-31 | 2018-12-06 | Facebook, Inc. | Face liveness detection |
| US20180373923A1 (en) * | 2014-11-13 | 2018-12-27 | Intel Corporation | Spoofing detection in image biometrics |
| US20190080070A1 (en) * | 2017-09-09 | 2019-03-14 | Apple Inc. | Implementation of biometric authentication |
| US10262204B2 (en) * | 2014-10-15 | 2019-04-16 | Samsung Electronics Co., Ltd. | User terminal apparatus and iris recognition method thereof |
Family Cites Families (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4221808B2 (en) * | 1999-03-30 | 2009-02-12 | コニカミノルタセンシング株式会社 | Three-dimensional data input method and apparatus |
| JP3974375B2 (en) * | 2001-10-31 | 2007-09-12 | 株式会社東芝 | Person recognition device, person recognition method, and traffic control device |
| JP2003178306A (en) * | 2001-12-12 | 2003-06-27 | Toshiba Corp | Personal authentication device and personal authentication method |
| JP2003317100A (en) * | 2002-02-22 | 2003-11-07 | Matsushita Electric Ind Co Ltd | Information terminal device, authentication system, and registration / authentication method |
| JP2004362079A (en) * | 2003-06-02 | 2004-12-24 | Fuji Photo Film Co Ltd | Personal identification device |
| CN101379528B (en) * | 2006-03-01 | 2012-07-04 | 日本电气株式会社 | Face authentication device, face authentication method |
| JP4929828B2 (en) * | 2006-05-10 | 2012-05-09 | 日本電気株式会社 | Three-dimensional authentication method, three-dimensional authentication device, and three-dimensional authentication program |
| JP4753097B2 (en) * | 2008-02-06 | 2011-08-17 | コニカミノルタビジネステクノロジーズ株式会社 | Control system, control method, and control program |
| US8860795B2 (en) * | 2008-10-28 | 2014-10-14 | Nec Corporation | Masquerading detection system, masquerading detection method, and computer-readable storage medium |
| JP2010218039A (en) * | 2009-03-13 | 2010-09-30 | Nec Corp | System and method for authenticating face |
| JP5159950B2 (en) * | 2009-05-28 | 2013-03-13 | 株式会社東芝 | Image processing apparatus, method, and program |
| FR2997211B1 (en) * | 2012-10-18 | 2021-01-01 | Morpho | PROCESS FOR AUTHENTICATION OF AN IMAGE CAPTURE OF A THREE-DIMENSIONAL ENTITY |
| JP6265592B2 (en) * | 2012-12-10 | 2018-01-24 | セコム株式会社 | Facial feature extraction apparatus and face authentication system |
| US9408076B2 (en) * | 2014-05-14 | 2016-08-02 | The Regents Of The University Of California | Sensor-assisted biometric authentication for smartphones |
| US10853625B2 (en) * | 2015-03-21 | 2020-12-01 | Mine One Gmbh | Facial signature methods, systems and software |
| JP6507046B2 (en) * | 2015-06-26 | 2019-04-24 | 株式会社東芝 | Three-dimensional object detection device and three-dimensional object authentication device |
| US10657362B2 (en) * | 2015-06-30 | 2020-05-19 | Nec Corporation Of America | Facial recognition system |
| CN106897658B (en) * | 2015-12-18 | 2021-12-14 | 腾讯科技(深圳)有限公司 | Method and device for identifying living body of human face |
| WO2017170384A1 (en) * | 2016-03-28 | 2017-10-05 | 日本電気株式会社 | Biodata processing device, biodata processing system, biodata processing method, biodata processing program, and recording medium for storing biodata processing program |
| US10339367B2 (en) * | 2016-03-29 | 2019-07-02 | Microsoft Technology Licensing, Llc | Recognizing a face and providing feedback on the face-recognition process |
| JP6785611B2 (en) * | 2016-10-11 | 2020-11-18 | 株式会社日立製作所 | Biometric device and method |
| US10679083B2 (en) * | 2017-03-27 | 2020-06-09 | Samsung Electronics Co., Ltd. | Liveness test method and apparatus |
| JP2018173731A (en) * | 2017-03-31 | 2018-11-08 | ミツミ電機株式会社 | Face authentication apparatus and face authentication method |
| WO2018226265A1 (en) * | 2017-09-09 | 2018-12-13 | Apple Inc. | Implementation of biometric authentication |
| KR102301599B1 (en) * | 2017-09-09 | 2021-09-10 | 애플 인크. | Implementation of biometric authentication |
| JP6859970B2 (en) * | 2018-03-09 | 2021-04-14 | 京セラドキュメントソリューションズ株式会社 | Login support system |
| JP2019212156A (en) * | 2018-06-07 | 2019-12-12 | パナソニックIpマネジメント株式会社 | Facial image registration system, facial image registration method, moving body terminal, and face authentication image registration device |
| WO2020022034A1 (en) * | 2018-07-25 | 2020-01-30 | 日本電気株式会社 | Information processing device, information processing method, and information processing program |
| US10402553B1 (en) * | 2018-07-31 | 2019-09-03 | Capital One Services, Llc | System and method for using images to authenticate a user |
- 2018
- 2018-11-05 JP JP2020556377A patent/JPWO2020095350A1/en active Pending
- 2018-11-05 EP EP18939355.6A patent/EP3879419B1/en active Active
- 2018-11-05 SG SG11202104685WA patent/SG11202104685WA/en unknown
- 2018-11-05 WO PCT/JP2018/041060 patent/WO2020095350A1/en not_active Ceased
- 2019
- 2019-04-18 US US16/388,239 patent/US20200143186A1/en not_active Abandoned
- 2021
- 2021-04-12 US US17/227,707 patent/US20210256282A1/en not_active Abandoned
- 2023
- 2023-02-16 JP JP2023022081A patent/JP2023063314A/en active Pending
- 2023-10-18 US US18/381,382 patent/US20240045937A1/en active Pending
- 2024
- 2024-05-17 JP JP2024080580A patent/JP2024103525A/en active Pending
- 2025
- 2025-08-29 JP JP2025142893A patent/JP2025170042A/en active Pending
Patent Citations (44)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050212654A1 (en) * | 2003-09-29 | 2005-09-29 | Fuji Photo Film Co., Ltd. | Authentication system and program |
| US20080192980A1 (en) * | 2007-02-14 | 2008-08-14 | Samsung Electronics Co., Ltd. | Liveness detection method and apparatus of video image |
| US20090135188A1 (en) * | 2007-11-26 | 2009-05-28 | Tsinghua University | Method and system of live detection based on physiological motion on human face |
| US9444924B2 (en) * | 2009-10-28 | 2016-09-13 | Digimarc Corporation | Intuitive computing methods and systems |
| US20140150091A1 (en) * | 2010-02-12 | 2014-05-29 | Apple Inc. | Biometric sensor for human presence detection and associated methods |
| US8582835B2 (en) * | 2011-07-11 | 2013-11-12 | Accenture Global Services Limited | Liveness detection |
| US20130015946A1 (en) * | 2011-07-12 | 2013-01-17 | Microsoft Corporation | Using facial data for device authentication or subject identification |
| US20150310259A1 (en) * | 2011-07-12 | 2015-10-29 | Microsoft Technology Licensing, Llc | Using facial data for device authentication or subject identification |
| US20140270412A1 (en) * | 2012-01-20 | 2014-09-18 | Cyberlink Corp. | Liveness detection system based on face behavior |
| US8254647B1 (en) * | 2012-04-16 | 2012-08-28 | Google Inc. | Facial image quality assessment |
| US8441548B1 (en) * | 2012-06-15 | 2013-05-14 | Google Inc. | Facial image quality assessment |
| US8542879B1 (en) * | 2012-06-26 | 2013-09-24 | Google Inc. | Facial recognition |
| US8457367B1 (en) * | 2012-06-26 | 2013-06-04 | Google Inc. | Facial recognition |
| US20140016837A1 (en) * | 2012-06-26 | 2014-01-16 | Google Inc. | Facial recognition |
| US20140307929A1 (en) * | 2012-06-26 | 2014-10-16 | Google, Inc. | Facial recognition |
| US20140337930A1 (en) * | 2013-05-13 | 2014-11-13 | Hoyos Labs Corp. | System and method for authorizing access to access-controlled environments |
| US20180101721A1 (en) * | 2013-07-02 | 2018-04-12 | Robert Frank Nienhouse | System and method for locating and determining substance use |
| US20150154392A1 (en) * | 2013-11-29 | 2015-06-04 | International Business Machines Corporation | Secure face authentication with liveness detection for mobile |
| US20170124312A1 (en) * | 2014-06-19 | 2017-05-04 | Nec Corporation | Authentication device, authentication system, and authentication method |
| US20180181737A1 (en) * | 2014-08-28 | 2018-06-28 | Facetec, Inc. | Facial Recognition Authentication System Including Path Parameters |
| US20160063235A1 (en) * | 2014-08-28 | 2016-03-03 | Kevin Alan Tussy | Facial Recognition Authentication System Including Path Parameters |
| US20160307030A1 (en) * | 2014-09-03 | 2016-10-20 | Samet Privacy, Llc | Image processing apparatus for facial recognition |
| US20160063314A1 (en) * | 2014-09-03 | 2016-03-03 | Samet Privacy, Llc | Image processing apparatus for facial recognition |
| US10628670B2 (en) * | 2014-10-15 | 2020-04-21 | Samsung Electronics Co., Ltd. | User terminal apparatus and iris recognition method thereof |
| US10262204B2 (en) * | 2014-10-15 | 2019-04-16 | Samsung Electronics Co., Ltd. | User terminal apparatus and iris recognition method thereof |
| US20170344793A1 (en) * | 2014-10-22 | 2017-11-30 | Veridium Ip Limited | Systems and methods for performing iris identification and verification using mobile devices |
| US20180373923A1 (en) * | 2014-11-13 | 2018-12-27 | Intel Corporation | Spoofing detection in image biometrics |
| US20180189960A1 (en) * | 2014-12-31 | 2018-07-05 | Morphotrust Usa, Llc | Detecting Facial Liveliness |
| US9934443B2 (en) * | 2015-03-31 | 2018-04-03 | Daon Holdings Limited | Methods and systems for detecting head motion during an authentication transaction |
| US10430679B2 (en) * | 2015-03-31 | 2019-10-01 | Daon Holdings Limited | Methods and systems for detecting head motion during an authentication transaction |
| US20160342851A1 (en) * | 2015-05-22 | 2016-11-24 | Yahoo! Inc. | Computerized system and method for determining authenticity of users via facial recognition |
| US20160366129A1 (en) * | 2015-06-10 | 2016-12-15 | Alibaba Group Holding Limited | Liveness detection method and device, and identity authentication method and device |
| US20180211096A1 (en) * | 2015-06-30 | 2018-07-26 | Beijing Kuangshi Technology Co., Ltd. | Living-body detection method and device and computer program product |
| US20170053174A1 (en) * | 2015-08-18 | 2017-02-23 | Beijing Kuangshi Technology Co., Ltd. | Liveness detection apparatus and liveness detection method |
| US9619723B1 (en) * | 2016-02-17 | 2017-04-11 | Hong Kong Applied Science and Technology Research Institute Company Limited | Method and system of identification and authentication using facial expression |
| US20180046850A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for enhancing user liveness detection |
| US20180046852A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for enhancing user liveness detection |
| US20180046853A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for determining user liveness and verifying user identities |
| US20180048645A1 (en) * | 2016-08-09 | 2018-02-15 | Mircea Ionita | Methods and systems for determining user liveness and verifying user identities |
| US20180173980A1 (en) * | 2016-12-15 | 2018-06-21 | Beijing Kuangshi Technology Co., Ltd. | Method and device for face liveness detection |
| US20180349682A1 (en) * | 2017-05-31 | 2018-12-06 | Facebook, Inc. | Face liveness detection |
| US20190080072A1 (en) * | 2017-09-09 | 2019-03-14 | Apple Inc. | Implementation of biometric authentication |
| US20190080071A1 (en) * | 2017-09-09 | 2019-03-14 | Apple Inc. | Implementation of biometric authentication |
| US20190080070A1 (en) * | 2017-09-09 | 2019-03-14 | Apple Inc. | Implementation of biometric authentication |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20220019771A1 (en) * | 2019-04-19 | 2022-01-20 | Fujitsu Limited | Image processing device, image processing method, and storage medium |
| US12033429B2 (en) * | 2019-04-19 | 2024-07-09 | Fujitsu Limited | Image processing device of determining authenticity of object, image processing method of determining authenticity of object, and storage medium storing program of determining authenticity of object |
| US11200305B2 (en) * | 2019-05-31 | 2021-12-14 | International Business Machines Corporation | Variable access based on facial expression configuration |
| US11423878B2 (en) * | 2019-07-17 | 2022-08-23 | Lg Electronics Inc. | Intelligent voice recognizing method, apparatus, and intelligent computing device |
| US20220343673A1 (en) * | 2019-09-27 | 2022-10-27 | Nec Corporation | Information processing apparatus, information processing method and storage medium |
| US12223765B2 (en) * | 2019-09-27 | 2025-02-11 | Nec Corporation | Information processing apparatus, information processing method and storage medium |
| US20220414193A1 (en) * | 2021-06-28 | 2022-12-29 | Capital One Services, Llc | Systems and methods for secure adaptive illustrations |
| US12189735B2 (en) * | 2021-06-28 | 2025-01-07 | Capital One Services, Llc | Systems and methods for secure adaptive illustrations |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2024103525A (en) | 2024-08-01 |
| US20240045937A1 (en) | 2024-02-08 |
| EP3879419A4 (en) | 2021-11-03 |
| EP3879419A1 (en) | 2021-09-15 |
| EP3879419B1 (en) | 2026-01-28 |
| WO2020095350A1 (en) | 2020-05-14 |
| JP2023063314A (en) | 2023-05-09 |
| SG11202104685WA (en) | 2021-06-29 |
| JP2025170042A (en) | 2025-11-14 |
| JPWO2020095350A1 (en) | 2021-09-24 |
| US20210256282A1 (en) | 2021-08-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20240045937A1 (en) | Information processing apparatus, information processing method, and storage medium | |
| US10902104B2 (en) | Biometric security systems and methods | |
| US10205883B2 (en) | Display control method, terminal device, and storage medium | |
| US20200064916A1 (en) | Method for Automatically Identifying at least one User of an Eye Tracking Device and Eye Tracking Device | |
| US20190005222A1 (en) | Face-Controlled Liveness Verification | |
| US12217548B2 (en) | Authentication device, authentication method, and recording medium | |
| CN110114777B (en) | Identification, authentication and/or guidance of users using gaze information | |
| US20120320181A1 (en) | Apparatus and method for security using authentication of face | |
| CN107995979A (en) | User identification and/or authentication using gaze information | |
| US20150043792A1 (en) | Biometric authentication device and method | |
| US10586031B2 (en) | Biometric authentication of a user | |
| CN105279479A (en) | Face authentication device and face authentication method | |
| CN105279409A (en) | Handheld identity verification device, identity verification method and identity verification system | |
| US12547687B2 (en) | System and method for biometric authentication | |
| JP6267025B2 (en) | Communication terminal and communication terminal authentication method | |
| JP2017162302A (en) | Biometric authentication device, biometric authentication method, and biometric authentication program | |
| KR101680598B1 (en) | Apparatus and method for face authenticating which provides with optimum face guidance | |
| JP2018128785A (en) | Biometric authentication apparatus, biometric authentication method, and biometric authentication program | |
| WO2022147411A1 (en) | Facial expression to augment face id and presentation attack detection | |
| JP6798285B2 (en) | Biometric device, biometric method and program | |
| US12283131B2 (en) | Information presenting system, information presenting method, computer program, and authentication system | |
| CN107577929B (en) | A biometric-based access control method for different systems and electronic equipment | |
| US20200285724A1 (en) | Biometric authentication device, biometric authentication system, and computer program product | |
| WO2020133405A1 (en) | Method and device for controlling ground remote control robot | |
| JP2025075214A (en) | Face Recognition Device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: NEC CORPORATION, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KATO, SANAE;KUMAZAKI, JUNICHI;KAWASE, NOBUAKI;SIGNING DATES FROM 20190308 TO 20190320;REEL/FRAME:048949/0077 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |