WO2024095362A1 - Information processing system, information processing device, information processing method, and recording medium - Google Patents
Information processing system, information processing device, information processing method, and recording medium
- Publication number
- WO2024095362A1 (application PCT/JP2022/040864)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- camera
- information processing
- degradation
- authentication
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
Definitions
- This disclosure relates to an information processing system, an information processing device, an information processing method, and a recording medium.
- Patent Document 1 describes measuring the actual distance between the subject and the camera from the facial component distance, and acquiring an eye image from a person image of the subject who has been confirmed to be in the iris photography space. Patent Document 1 further describes measuring the quality of the acquired eye image and acquiring an image for iris recognition that meets a standard quality level.
- Patent Document 2 describes evaluating the quality of an iris image before it is processed for iris recognition. Patent Document 2 also describes that an evaluation of the iris image is provided according to blur, defocus, eye closure, obscuration, and the like.
- Patent Document 3 describes how, in walk-through authentication, the focus is fixed and burst imaging is performed on a subject who passes through a fixed focus position, and in the case of re-authentication, the focus is scanned and burst imaging is performed on a subject who is standing still. By doing this, the technology in Patent Document 3 makes it possible to extract an iris image focused on the subject's iris.
- This disclosure aims to improve upon the techniques described in the prior art documents mentioned above.
- One aspect of this disclosure provides an information processing system comprising: a camera capable of capturing an image of a target; an acquisition means for acquiring state information indicating at least a state of the target; an estimation means for generating, using the state information, degradation information relating to degradation estimated to occur in a target image obtained by capturing an image of the target with the camera; and a control means for outputting control information corresponding to the degradation information at least one of before the target is imaged by the camera and when the target is imaged by the camera.
- Another aspect of this disclosure provides an information processing device comprising: an acquisition means for acquiring state information indicating at least a state of a target; an estimation means for generating, using the state information, degradation information relating to degradation estimated to occur in a target image obtained by capturing an image of the target with a camera; and a control means for outputting control information corresponding to the degradation information at least one of before the target is imaged by the camera and when the target is imaged by the camera.
- Another aspect of this disclosure provides an information processing method executed by one or more computers, the method comprising: acquiring state information indicating at least a state of a target; generating, using the state information, degradation information relating to degradation estimated to occur in a target image obtained by capturing an image of the target with a camera; and outputting control information corresponding to the degradation information at least one of before the target is imaged by the camera and when the target is imaged by the camera.
- Another aspect of this disclosure provides a computer-readable recording medium having a program recorded thereon, the program causing a computer to function as: an acquisition means for acquiring state information indicating at least a state of a target; an estimation means for generating, using the state information, degradation information relating to degradation estimated to occur in a target image obtained by capturing an image of the target with a camera; and a control means for outputting control information corresponding to the degradation information at least one of before the target is imaged by the camera and when the target is imaged by the camera.
- FIG. 1 is a diagram showing an overview of an information processing system according to a first embodiment.
- FIG. 2 is a diagram showing an overview of an information processing device according to the first embodiment.
- FIG. 3 is a diagram showing an overview of an information processing method according to the first embodiment.
- FIG. 4 is a diagram illustrating an example of a usage environment of the information processing system according to the first embodiment.
- FIG. 5 is a block diagram illustrating a functional configuration of the information processing system according to the first embodiment.
- FIG. 6 is a flowchart illustrating the flow of processing performed by the information processing system according to the first embodiment.
- FIG. 7 is a diagram illustrating an example of a deterioration estimation model used by an estimation unit to generate deterioration information.
- FIG. 8 is a diagram illustrating a computer for realizing the information processing device.
- FIG. 9 is a diagram illustrating a functional configuration of an information processing system according to a second embodiment.
- FIG. 10 is a flowchart illustrating the flow of processing executed by an information processing system according to a third embodiment.
- FIG. 11 is a diagram illustrating an example of a quality estimation model used by a control unit to generate a quality score.
- FIG. 12 is a block diagram illustrating a functional configuration of an information processing system according to a fourth embodiment.
- FIG. 13 is a diagram for explaining a method in which an estimation unit according to a sixth embodiment generates degradation information.
- FIG. 14 is a block diagram illustrating a functional configuration of an information processing system according to a seventh embodiment.
- FIG. 15 is a diagram illustrating an example of a usage environment of the information processing system according to the seventh embodiment.
- FIG. 16 is a block diagram illustrating a functional configuration of an information processing system according to an eighth embodiment.
- FIG. 1 is a diagram showing an overview of an information processing system 50 according to a first embodiment.
- the information processing system 50 includes a camera 20, an acquisition unit 110, an estimation unit 130, and a control unit 150.
- the camera 20 is capable of capturing an image of an object.
- the acquisition unit 110 acquires state information indicating at least a state of the object.
- the estimation unit 130 generates degradation information using the state information.
- the degradation information is information on degradation estimated to occur in an object image obtained by capturing an image of the object with the camera 20.
- the control unit 150 outputs control information according to the degradation information at least one of before the object is captured by the camera 20 and when the object is captured by the camera 20.
- This information processing system 50 can estimate the degradation of the target image before capturing the image of the target with the camera 20, and take measures to obtain a good image.
- FIG. 2 is a diagram showing an overview of the information processing device 10 according to this embodiment.
- the information processing device 10 includes an acquisition unit 110, an estimation unit 130, and a control unit 150.
- the acquisition unit 110 acquires status information indicating at least the status of the target.
- the estimation unit 130 generates degradation information using the status information.
- the degradation information is information about degradation that is estimated to occur in a target image obtained by capturing an image of the target with the camera 20.
- the control unit 150 outputs control information corresponding to the degradation information at least either before the target is captured by the camera 20 or when the target is captured by the camera 20.
- This information processing device 10 can estimate the degradation of the target image before capturing the image of the target with the camera 20, and take measures to obtain a good image.
- the information processing system 50 according to this embodiment can be configured to include the information processing device 10 according to this embodiment.
- FIG. 3 is a diagram showing an overview of the information processing method according to this embodiment.
- the information processing method according to this embodiment is executed by one or more computers.
- the information processing method according to this embodiment includes steps S10, S20, and S30.
- In step S10, state information indicating at least the state of the target is acquired.
- In step S20, degradation information is generated using the state information.
- The degradation information is information about degradation estimated to occur in the target image obtained by capturing an image of the target with the camera 20.
- In step S30, control information corresponding to the degradation information is output at least either before the target is captured by the camera 20 or when the target is captured by the camera 20.
- This information processing method makes it possible to estimate the degradation of the target image before capturing an image of the target with the camera 20, and to take measures to obtain a good image.
- the information processing method according to this embodiment can be executed by the information processing device 10 according to this embodiment.
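- As a concrete illustration of steps S10 to S30, the following Python sketch wires the three steps together. All names (StateInfo, acquire_state, etc.) and the toy heuristic in estimate_degradation are assumptions for illustration, not part of this disclosure, which instead uses a trained estimation model.

```python
from dataclasses import dataclass

@dataclass
class StateInfo:
    face_yaw_deg: float   # face direction, left/right angle
    speed_m_per_s: float  # moving speed

@dataclass
class DegradationInfo:
    motion_blur: float    # estimated degree, 0.0 (none) to 1.0 (severe)
    off_angle: float

def acquire_state() -> StateInfo:  # S10
    # Stub: in the system this would come from the state measurement unit 30.
    return StateInfo(face_yaw_deg=12.0, speed_m_per_s=1.4)

def estimate_degradation(s: StateInfo) -> DegradationInfo:  # S20
    # Stub: the disclosure uses a trained model; a toy heuristic stands in here.
    return DegradationInfo(motion_blur=min(1.0, s.speed_m_per_s / 2.0),
                           off_angle=min(1.0, abs(s.face_yaw_deg) / 45.0))

def output_control(d: DegradationInfo) -> dict:  # S30
    control = {}
    if d.motion_blur >= 0.5:
        control["exposure_time_ms"] = 2.0  # shorten exposure before/at capture
    return control

print(output_control(estimate_degradation(acquire_state())))
```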
- FIG. 4 is a diagram illustrating an example of a usage environment of the information processing system 50 according to this embodiment.
- the information processing system 50 relates to, for example, a walk-through authentication system.
- the target image captured by the camera 20 is an image used to authenticate the target 90.
- The target 90 is moving, for example, so as to approach point P2, which is the point where the target image is captured.
- The state of the target 90 at point P1, which is a position farther from the camera 20 than point P2, is measured using the state measurement unit 30.
- the estimation unit 130 of the information processing device 10 estimates the deterioration of the target image obtained by capturing an image of the target 90 with the camera 20 at point P2. Then, based on the estimation result, the control unit 150 of the information processing device 10 outputs control information for suppressing the deterioration of the target image.
- The techniques of Patent Documents 1 to 3 cannot estimate image degradation and take countermeasures before capturing an image.
- In contrast, the control unit 150 outputs control information corresponding to the degradation information at least either before the target is imaged by the camera 20 or when the target is imaged by the camera 20. Therefore, appropriate control for suppressing degradation is possible, and a good image can be obtained.
- It is therefore particularly preferable to use the information processing system 50, information processing device 10, and information processing method according to this embodiment to implement measures for improving the quality of the target image before or while capturing it.
- Point P2 is, for example, the focal position of the camera 20, and point P1 is a point farther away from the camera 20 than point P2. However, these points are not limited to the example of the walk-through system shown in FIG. 4.
- FIG. 5 is a block diagram illustrating the functional configuration of the information processing system 50 according to this embodiment.
- FIG. 6 is a flowchart illustrating the flow of processing performed by the information processing system 50 according to this embodiment. Detailed examples of the information processing system 50, information processing device 10, and information processing method according to this embodiment will be described with reference to FIGS. 5 and 6.
- the target 90 is, for example, a human. However, the target 90 may be a living thing other than a human, or a non-living object. It is preferable that the target 90 moves closer to the camera 20 before the target image is captured.
- the target image captured by camera 20 is the image used to authenticate target 90.
- Camera 20 is focused on approximately point P2.
- Information processing system 50 is capable of performing, for example, iris authentication of target 90.
- Camera 20 is, for example, an iris imaging camera for imaging the iris of target 90.
- the target image is an image used for iris authentication.
- camera 20 is provided so as to be able to image the iris of target 90 located at point P2.
- the target image is an image including the eye.
- the target image may be an image including both eyes, or an image including only one of the right eye or the left eye.
- the target image may be an image including not only the eye, but also the area around the eye.
- the authentication performed by the information processing system 50 is not limited to iris authentication.
- the information processing system 50 may be a system capable of performing facial authentication of the target 90.
- the camera 20 may be a camera for capturing an image to be used for facial authentication.
- the target image may be an image to be used for facial authentication.
- In this case, the camera 20 is arranged to be capable of capturing an image of the face of the target 90 located at point P2, and the target image is an image that includes the face.
- the information processing system 50 further includes one or more status measurement units 30.
- Status information is generated based on the measurement results by the status measurement units 30.
- the status information indicates at least the status of the object 90 at the time when the object 90 is located at a point farther away than the focal point of the camera 20.
- the control unit 150 outputs control information at least one of before the object 90 reaches the focal point of the camera 20 and when the object 90 reaches the focal point of the camera 20. In this way, it is possible to perform imaging that takes degradation into account without performing imaging with the camera 20 in advance, and obtain a good image of the object.
- Examples of the state measurement unit 30 include a camera, LiDAR, and an infrared sensor.
- The camera is not particularly limited, and may be, for example, a depth camera, a wide-angle camera, a visible light camera, or a near-infrared camera.
- If the state measurement unit 30 is a depth camera or LiDAR, the distance from the state measurement unit 30 to the target 90 and three-dimensional information of the target 90 can be obtained.
- If the state measurement unit 30 is a wide-angle camera, it can capture an image of a wider range than the camera 20, for example, the entire target 90.
- In step S101, the state measurement unit 30 measures the state of the target 90 at a point farther away from the camera 20 than point P2. Note that if the state measurement unit 30 is a camera of some type, the state measurement unit 30 measuring the state of the target 90 includes the state measurement unit 30 capturing an image of the target 90.
- When there are multiple state measurement units 30, the point P1 may or may not be the same for each of them. In other words, the point P1 can be set independently for each state measurement unit 30.
- Point P1 is a point that the target 90 passes before reaching point P2.
- the distance between points P1 and P2 is not particularly limited, but is, for example, 50 cm or more and 2 m or less.
- In step S102, state information is generated using the measurement results from the state measurement unit 30.
- the status information includes information related to the target 90 or items worn by the target 90.
- the status information indicates one or more of the face direction, body posture, gaze direction, movement speed, and whether or not glasses are worn.
- the status information is not limited to these examples as long as it includes information related to the target 90 or items worn by the target 90. By using such status information, the deterioration factors can be accurately estimated.
- the facial orientation of the subject 90 can be determined by analyzing information obtained, for example, from a depth camera, a lidar, and/or a wide-angle camera using existing techniques.
- the status information can include, for example, the angle of the facial orientation (left/right angle and up/down angle) based on a specified direction.
- the body posture of the subject 90 can be identified by analyzing information obtained, for example, from at least one of a depth camera, a lidar, and a wide-angle camera using existing methods.
- The state information can include, for example, a body posture label indicating the body posture, such as "walking", "standing upright", or "walking with a hunched back".
- the body posture label can be a predetermined numerical value.
- the gaze direction of the subject 90 can be determined, for example, by analyzing information obtained from at least one of a depth camera and a wide-angle camera using existing techniques.
- the status information can include, for example, the angle of the gaze direction (left/right angle and up/down angle) based on a specified direction.
- the moving speed of the target 90 can be determined by analyzing information obtained, for example, by at least one of a depth camera, a lidar, and an infrared sensor using existing methods. For example, if the status measurement unit 30 is an infrared sensor, multiple infrared sensors detect the passing of the target 90 at multiple points. Then, the speed of the target 90 can be calculated based on the distance between the multiple points and the difference in the passing timing.
- the status information can include, for example, information indicating the direction and speed of movement of the target 90.
- Whether or not the subject 90 is wearing glasses can be determined by, for example, analyzing information obtained from at least one of a depth camera, a lidar, and a wide-angle camera using existing techniques.
- the status information can include, for example, information indicating whether or not the subject 90 is wearing glasses.
- the state information can be a vector whose elements are each of these multiple types of information.
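- The following Python sketch shows one possible encoding of such a state vector, including the moving-speed calculation from two infrared sensors described above. The element order, numeric posture labels, and function names are illustrative assumptions, not definitions from this disclosure.

```python
import numpy as np

def speed_from_ir(sensor_gap_m: float, t1: float, t2: float) -> float:
    # Speed from two infrared sensors a known distance apart: the distance
    # between the detection points divided by the difference in passing timing.
    return sensor_gap_m / (t2 - t1)

# Hypothetical numeric encoding of a body posture label.
POSTURE = {"walking": 0.0, "standing upright": 1.0, "walking with a hunched back": 2.0}

def build_state_vector(face_lr, face_ud, gaze_lr, gaze_ud,
                       posture, speed, glasses) -> np.ndarray:
    # Face angles, gaze angles, posture label, speed, glasses flag in one vector.
    return np.array([face_lr, face_ud, gaze_lr, gaze_ud,
                     POSTURE[posture], speed, 1.0 if glasses else 0.0])

state = build_state_vector(10.0, -5.0, 8.0, -2.0, "walking",
                           speed_from_ir(1.0, 0.0, 0.8), glasses=False)
```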
- the method of generating state information is not limited to the above example.
- a trained neural network may be used to generate state information.
- the process of generating status information from the measurement results of the status measurement unit 30 may be performed by the status measurement unit 30, may be performed by the information processing device 10, or may be performed by an analysis device different from the information processing device 10 and the status measurement unit 30.
- The analysis device or the information processing device 10 acquires, from one or more state measurement units 30, the measurement results and the information necessary for generating the state information, and uses them to generate the state information.
- the devices generating the multiple types of information do not all need to be the same.
- some of the information may be generated by the status measurement unit 30, and other information may be generated by the information processing device 10.
- the information processing device 10 can generate the status information by combining multiple types of information.
- the acquisition unit 110 acquires state information.
- the acquisition unit 110 may acquire the state information from the state measurement unit 30, or from another analysis device.
- the acquisition unit 110 may acquire state information generated by the information processing device 10 using information acquired from one or more state measurement units 30 and analysis devices.
- the estimation unit 130 generates degradation information using the status information acquired by the acquisition unit 110.
- the degradation information indicates, for example, one or more degradation factors occurring in the target image and the degree of degradation for each of the one or more degradation factors.
- In other words, the degradation information indicates one or more combinations of a degradation factor and a degree of degradation.
- the one or more degradation factors include, for example, one or more of focus blur, motion blur, occlusion, and off-angle. If the target image is an image that includes an iris, it is preferable that the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle. By targeting such degradation factors, it is possible to accurately estimate the degradation of the target image.
- the degradation factor can also be said to be the type of degradation that is predicted to occur in the target image.
- Focus blur is blurring of the target 90 or a specific part of the target 90 due to a shift in focus.
- the specific part of the target 90 is, for example, a part of the target image used for authentication.
- Motion blur is blurring due to a change in the relative position between the camera 20 and the target 90 or a specific part of the target 90.
- the degree of degradation may indicate the state of a specific part in the target image. For example, if the target image is an image that includes an iris, the specific part is the iris. The greater the degree of blurring of the iris region in the target image, the higher the degree of degradation may be.
- Occlusion means that at least a part of the target 90 or a predetermined part of the target 90 is not visible in the target image.
- eyelid occlusion means that at least a part of the iris is hidden by the eyelid.
- illumination reflection occlusion means that the brightness of at least a part of the iris region of the target image increases due to illumination being reflected off the pupil. If there is a part of the iris region that is too bright (for example, a blown-out highlight), the iris pattern used for iris authentication cannot be detected.
- If the object image is an image that includes an iris, the greater the proportion of the iris that is hidden, the greater the degree of degradation may be.
- Off-angle means that the orientation of the object 90 or the orientation of a specific part of the object 90 deviates from the optical axis of the camera 20. If the object image is an image that includes an iris, this corresponds, for example, to the face direction or gaze direction of the target 90 deviating from the optical axis of the camera 20.
- the deterioration information may, for example, include a numerical value indicating the degree of deterioration for each deterioration factor.
- The deterioration information may be a vector whose elements are the degrees of deterioration for the multiple deterioration factors.
- the method by which the estimation unit 130 generates the deterioration information is not particularly limited, but the estimation unit 130 can generate the deterioration information, for example, by using a trained neural network that has been trained using the state information and the correct deterioration information. In this way, it is possible to generate deterioration information with high estimation accuracy.
- FIG. 7 is a diagram illustrating a deterioration estimation model 131 that the estimation unit 130 uses to generate deterioration information.
- the input data of the deterioration estimation model 131 includes state information, and the output data of the deterioration estimation model 131 includes deterioration information.
- the deterioration estimation model 131 includes a neural network. This neural network is a trained neural network that has undergone machine learning in advance using state information and correct deterioration information as training data.
- the state information used in this machine learning is preferably state information obtained in the information processing system 50 according to this embodiment or in a system similar to the information processing system 50 according to this embodiment.
- the correct degradation information used in this machine learning is preferably information on degradation that has occurred in an image of the object 90 obtained by the camera 20 in the information processing system 50 according to this embodiment or in a system similar to the information processing system 50 according to this embodiment.
- the correct degradation information is information on degradation that has occurred in an image obtained without any notification or control of the object 90 and the camera 20 between the measurement by the state measurement unit 30 and the image capture by the camera 20.
- the type and form of information included in the correct degradation information are the same as the type and form of information in the degradation information described above.
- the degradation estimation model 131 may be a model prepared for specific imaging conditions.
- In this case, it is preferable that the imaging conditions under which the state information and correct degradation information used for training the degradation estimation model 131 were obtained are the same as the imaging conditions when the degradation estimation model 131 is used to estimate degradation information.
- The imaging conditions are, for example, one or more imaging parameters of the camera 20, the lighting state for the target 90, or a combination of these.
- the information processing system 50 includes a lighting unit 40, which can be controlled by the information processing device 10.
- the lighting unit 40 is, for example, a lighting device capable of adjusting the direction and intensity of light.
- a deterioration estimation model 131 may be prepared for each of the multiple imaging conditions and stored in advance in a storage device accessible by the estimation unit 130.
- In this case, the estimation unit 130 selects, from the multiple deterioration estimation models 131, the one that corresponds to the imaging condition at the time when the state information was obtained, and uses it.
- the information processing device 10 can obtain information for identifying the imaging condition at the time when the status information was obtained from the camera 20 and the lighting unit 40.
- the estimation unit 130 inputs state information to the deterioration estimation model 131.
- the estimation unit 130 then obtains the deterioration information output from the deterioration estimation model 131.
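- The following Python sketch illustrates this inference step with a toy stand-in for the deterioration estimation model 131, including selection of a model per imaging condition. The network shape, factor names, condition key, and random weights are assumptions; real weights would come from the training on state information and correct deterioration information described above.

```python
import numpy as np

class DegradationEstimationModel:
    # Toy stand-in for model 131: a one-hidden-layer network mapping a state
    # vector to per-factor degradation degrees.
    FACTORS = ["focus_blur", "motion_blur", "eyelid_occlusion",
               "reflection_occlusion", "off_angle"]

    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict(self, state_vec: np.ndarray) -> dict:
        h = np.maximum(0.0, self.w1 @ state_vec + self.b1)    # ReLU hidden layer
        out = 1.0 / (1.0 + np.exp(-(self.w2 @ h + self.b2)))  # degrees in [0, 1]
        return dict(zip(self.FACTORS, out))

# One model per imaging condition, selected at estimation time (untrained
# random weights here, purely so the sketch runs).
rng = np.random.default_rng(0)
models = {
    "near_infrared_lighting_A": DegradationEstimationModel(
        rng.normal(size=(16, 7)), np.zeros(16),
        rng.normal(size=(5, 16)), np.zeros(5)),
}

state = np.array([10.0, -5.0, 8.0, -2.0, 0.0, 1.25, 0.0])  # 7-element state vector
degradation_info = models["near_infrared_lighting_A"].predict(state)
```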
- In step S105, the control unit 150 outputs control information corresponding to the degradation information generated by the estimation unit 130.
- For example, the control unit 150 outputs control information for controlling the imaging conditions of the camera 20 based on the degradation information. In this way, a target image with less degradation can be obtained.
- the control information output by the control unit 150 is information for controlling the imaging conditions of the camera 20 so as to suppress the deterioration indicated in the deterioration information.
- the control unit 150 outputs the control information to at least one of the camera 20 and the lighting unit 40.
- the distance between the above-mentioned points P1 and P2 is not particularly limited, but is preferably, for example, 50 cm or more and 70 cm or less.
- By making the distance between points P1 and P2 50 cm or more, time to control the imaging conditions can be secured.
- By making the distance between points P1 and P2 70 cm or less, the accuracy of estimating deterioration can be improved.
- An example of the processing performed by the control unit 150 in this embodiment is described below.
- Degradation due to motion blur or focus blur can be reduced, for example, by shortening the exposure time of the camera 20 or by adjusting the focus position of the camera 20.
- For example, when the degree of degradation due to motion blur indicated in the degradation information is equal to or greater than a predetermined standard a, the control unit 150 outputs control information to the camera 20 for shortening the exposure time when obtaining the target image. At this time, the control unit 150 may output control information that shortens the exposure time more as the degree of degradation due to motion blur increases.
- If the degradation information indicates degradation due to focus blur, the control unit 150 may output control information to the camera 20 for changing the focal length while capturing the target image. Specifically, the control unit 150 outputs control information for causing the camera 20 to capture the target image while reducing the focal length.
- Alternatively, the control unit 150 outputs control information for, for example, narrowing the aperture of the lens of the camera 20 to increase the depth of field.
- Alternatively, the control unit 150 outputs control information for changing the focal position of the camera 20.
- If the degree of degradation due to lighting reflection occlusion indicated in the degradation information is equal to or greater than a predetermined standard c, the control unit 150 outputs control information for changing at least one of the position of the camera 20, the orientation of the camera 20, the position of the lighting relative to the target 90, and the orientation of the lighting, so that lighting reflection does not occur on the target 90 or a predetermined part of the target 90.
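- The control rules above can be summarized in a sketch like the following. Only standards a and c are named in the text; the focus-blur threshold, all numeric values, and the parameter names are illustrative assumptions.

```python
def build_control_info(deg: dict) -> dict:
    control = {}
    # Standard a: shorten exposure, more so the larger the motion-blur degree.
    if deg["motion_blur"] >= 0.5:
        control["exposure_time_ms"] = max(0.5, 4.0 * (1.0 - deg["motion_blur"]))
    # Focus blur: sweep focus during capture, narrow the aperture to increase
    # depth of field, or shift the focal position.
    if deg["focus_blur"] >= 0.5:
        control["sweep_focus_while_capturing"] = True
        control["aperture_f_number"] = 8.0
    # Standard c: reposition or reorient the camera/lighting so the
    # reflection does not fall on the target.
    if deg["reflection_occlusion"] >= 0.5:
        control["illumination_pan_deg"] = -10.0
    return control

print(build_control_info({"motion_blur": 0.7, "focus_blur": 0.2,
                          "eyelid_occlusion": 0.1,
                          "reflection_occlusion": 0.6, "off_angle": 0.0}))
```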
- In step S106, the camera 20 generates a target image by capturing an image of the target 90 located at point P2.
- the camera 20 may capture an image based on a control signal output from the control unit 150. Also, as described above, if the control unit 150 controls the imaging parameters by outputting control information to the camera 20, the control unit 150 may output the control information when capturing the target image. In other words, steps S105 and S106 may be performed simultaneously.
- the degree of deterioration of the target image actually obtained in step S106 is expected to be lower than the degree of deterioration indicated in the deterioration information, i.e., the estimated degree of deterioration.
- Each functional component of the information processing device 10 (the acquisition unit 110, the estimation unit 130, and the control unit 150) may be realized by hardware that realizes each functional component (e.g., hardwired electronic circuitry), or by a combination of hardware and software (e.g., a combination of electronic circuitry and a program that controls it).
- FIG. 8 is a diagram illustrating a computer 1000 for realizing the information processing device 10.
- the computer 1000 is any computer.
- For example, the computer 1000 is an SoC (System on Chip), a personal computer (PC), a server machine, a tablet terminal, or a smartphone.
- the computer 1000 may be a dedicated computer designed to realize the information processing device 10, or may be a general-purpose computer.
- the information processing device 10 may be realized by one computer 1000, or may be realized by a combination of multiple computers 1000.
- the computer 1000 has a bus 1020, a processor 1040, a memory 1060, a storage device 1080, an input/output interface 1100, and a network interface 1120.
- the bus 1020 is a data transmission path through which the processor 1040, the memory 1060, the storage device 1080, the input/output interface 1100, and the network interface 1120 transmit and receive data to and from each other.
- the method of connecting the processor 1040 and other components to each other is not limited to bus connection.
- the processor 1040 is one of various processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or an FPGA (Field-Programmable Gate Array).
- the memory 1060 is a main storage device realized using a RAM (Random Access Memory) or the like.
- the storage device 1080 is an auxiliary storage device realized using a hard disk, an SSD (Solid State Drive), a memory card, or a ROM (Read Only Memory) or the like.
- the input/output interface 1100 is an interface for connecting the computer 1000 to an input/output device.
- an input device such as a keyboard and an output device such as a display are connected to the input/output interface 1100.
- the input/output interface 1100 may be connected to the input device or output device by a wireless connection or a wired connection.
- The network interface 1120 is an interface for connecting the computer 1000 to a communication network.
- This communication network is, for example, a LAN (Local Area Network) or a WAN (Wide Area Network).
- the method for connecting the network interface 1120 to the network may be a wireless connection or a wired connection.
- the camera 20, one or more state measurement units 30, and the lighting unit 40 are each connected to the information processing device 10 via an input/output interface 1100 or a network interface 1120, and can communicate with the information processing device 10.
- the storage device 1080 stores program modules that realize each functional component of the information processing device 10.
- the processor 1040 reads each of these program modules into the memory 1060 and executes them to realize the function corresponding to each program module.
- As described above, according to this embodiment, the control unit 150 outputs control information corresponding to the degradation information at least one of before the target is imaged by the camera 20 and when the target is imaged by the camera 20. Therefore, it is possible to estimate the degradation of the target image before the target is imaged by the camera 20 and take measures to obtain a good image.
- (Second Embodiment) FIG. 9 is a diagram illustrating a functional configuration of an information processing system 50 according to the second embodiment.
- the information processing system 50, the information processing device 10, and the information processing method according to the present embodiment are the same as the information processing system 50, the information processing device 10, and the information processing method according to the first embodiment, respectively, except for the points described below.
- the control unit 150 outputs control information for executing a notification when the deterioration information satisfies a predetermined condition C. By doing so, it is possible to encourage the target 90 to take action to avoid the estimated deterioration.
- the information processing system 50 further includes one or more notification units 70.
- the control unit 150 outputs control information to the notification unit 70 to cause the notification unit 70 to execute a notification.
- Examples of the notification unit 70 include a display, a speaker, and a light-emitting device. If the notification unit 70 is a display, the notification may be, for example, the display of a message, a diagram, or the like. If the notification unit 70 is a speaker, the notification may be a sound output such as a message or an alarm. If the notification unit 70 is a light-emitting device, the notification may be light emission.
- the notification unit 70 is connected to the information processing device 10 via the input/output interface 1100 or the network interface 1120.
- Each notification unit 70 is provided so that the notification by that notification unit 70 can be recognized by an object 90 located between points P1 and P2.
- The control unit 150 may output control information for executing a notification instead of, or in addition to, outputting control information for controlling the imaging environment to at least one of the camera 20 and the lighting unit 40.
- An example of the processing performed by the control unit 150 according to this embodiment is described below. Note that in the following example, a degree of deterioration being equal to or greater than a predetermined standard set for that degree of deterioration corresponds to satisfying the predetermined condition C.
- For example, if the degree of deterioration due to motion blur indicated in the deterioration information is equal to or greater than a predetermined standard, the control unit 150 outputs control information for causing one or more notification units 70 to issue a notification urging the target 90 to slow down its moving speed.
- In this way, a message such as "Please slow down your walking speed" or "Please stop once at a specified position" is displayed on the display or output as sound from the speaker.
- The control unit 150 also outputs control information to cause one or more notification units 70 to issue a notification urging the target 90 to change the direction of his/her face or gaze.
- a message such as "Look ahead" may be displayed on the display or emitted from the speaker.
- the gaze of the target 90 may be guided by emitting light from a light-emitting device.
- If the degree of deterioration due to eyelid occlusion indicated in the deterioration information is equal to or greater than a predetermined standard e, the control unit 150 outputs control information to cause one or more notification units 70 to issue a notification urging the subject 90 to open his/her eyes. In this way, for example, a message such as "Please open your eyes wide" may be displayed on the display or output from the speaker.
- The control unit 150 also outputs control information to cause one or more notification units 70 to issue a notification urging the subject 90 to change at least one of the facial orientation, position, and gaze direction so as to direct at least one of the face and gaze toward the camera 20.
- a message such as "Look ahead” or “Move a little more to the right and then gaze towards us” may be displayed on the display or emitted from the speaker.
- a light-emitting device may emit light to guide the gaze of the subject 90.
- The control unit 150 may also output control information based on the state information. For example, if the state information indicates that the subject 90 is wearing glasses, the control unit 150 outputs control information to cause one or more notification units 70 to issue a notification urging the subject 90 to remove the glasses. In this way, a message such as "Please remove your glasses" may be displayed on the display or output from the speaker.
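- A minimal sketch of this notification logic follows, assuming the same degradation-factor names as earlier. Only standard e is named in the text, so the other thresholds, and the mapping of each factor to a message, are placeholders.

```python
def build_notifications(deg: dict, state: dict) -> list:
    msgs = []
    if deg["motion_blur"] >= 0.5:
        msgs.append("Please slow down your walking speed")
    if deg["eyelid_occlusion"] >= 0.5:  # standard e in the text
        msgs.append("Please open your eyes wide")
    if deg["off_angle"] >= 0.5:
        msgs.append("Look ahead")
    if state.get("wearing_glasses"):    # rule driven by the state information
        msgs.append("Please remove your glasses")
    return msgs

print(build_notifications({"motion_blur": 0.6, "focus_blur": 0.0,
                           "eyelid_occlusion": 0.1,
                           "reflection_occlusion": 0.0, "off_angle": 0.7},
                          {"wearing_glasses": True}))
```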
- the distance between points P1 and P2 is not particularly limited, but is preferably, for example, 1 m or more and 2 m or less. By making the distance between points P1 and P2 1 m or more, it is possible to ensure time for the object 90 to change its state in response to the alarm. By making the distance between points P1 and P2 2 m or less, it is possible to ensure sufficient accuracy in estimating the deterioration.
- As described above, according to this embodiment, the control unit 150 outputs control information for executing a notification when the deterioration information satisfies the predetermined condition C. Therefore, it is possible to encourage the target 90 to take action to avoid the estimated deterioration.
- (Third Embodiment) FIG. 10 is a flowchart illustrating the flow of processing executed by the information processing system 50 according to the third embodiment.
- the information processing system 50, the information processing device 10, and the information processing method according to this embodiment are the same as the information processing system 50, the information processing device 10, and the information processing method according to at least one of the first embodiment and the second embodiment, respectively, except for the points described below.
- the control unit 150 estimates the quality of the target image, which indicates its suitability as an image to be used for authentication, using the degradation information.
- the control unit 150 then outputs control information according to the quality. In this way, it is possible to obtain a target image suitable for authentication processing.
- the success or failure of authentication depends on the quality of the image used for authentication. For example, it may happen that an object 90 that should be authenticated is not recognized due to low image quality. Furthermore, the quality of the image required for authentication may differ depending on the performance and characteristics of the authentication device.
- the control unit 150 in this embodiment generates, for example, a quality score as a quality indicator of the suitability of the image for use in authentication.
- steps S101 to S104 and step S106 are as described in the first embodiment.
- In step S204, the control unit 150 estimates the quality using the degradation information. For example, the control unit 150 generates a quality score. The lower the suitability of an image for use in authentication, the lower the quality score of the target image.
- the method by which the control unit 150 generates the quality score is not particularly limited, and may be a method based on linear regression or a method using a neural network. The method by which the control unit 150 generates the quality score will be described in detail later.
- The control unit 150 then outputs control information according to the quality score.
- the control unit 150 may output control information for controlling the imaging conditions of the camera 20, as described in the first embodiment, only if the quality score is equal to or less than a predetermined standard g.
- the control unit 150 may output control information for executing an alert, as described in the second embodiment, only if the quality score is equal to or less than a predetermined standard g.
- The following describes a first example and a second example of how the control unit 150 generates a quality score.
- <First Example> In the first example, the control unit 150 calculates the quality score by linear regression.
- the quality score is calculated as a weighted sum of the degrees of deterioration for one or more deterioration factors.
- the weights for each degree of deterioration can be determined in advance, for example, as follows.
- multiple images with different deterioration states are prepared.
- the objects captured in these images may be the same or different.
- an authentication process is performed using these images to confirm whether the authentication is successful or not.
- the degree of deterioration related to each deterioration factor is identified for each image. This degree of deterioration is consistent with (i.e., comparable to) the content of the deterioration information generated by the estimation unit 130.
- a weight for each degree of deterioration is determined.
- a formula for determining a weighted sum using these weights is defined as a formula for calculating the quality score.
- an authentication score indicating the likelihood of identity in authentication for each image may be determined by other methods.
- the authentication score may be assigned to each image by a specific authentication algorithm, or may be assigned by human judgment.
- a weight for each degree of degradation is determined so that a quality score equivalent to the authentication score is obtained using multiple combinations of the authentication score and one or more degrees of degradation.
- An example of an authentication algorithm is described below.
- In iris authentication, for example, an iris code extracted from an iris image is used as a feature.
- an authentication algorithm can be used that determines whether or not the person is the real person based on the Hamming distance between the feature of the person registered in advance and the feature extracted from the image captured by the camera 20.
- There is no particular limitation on the authentication algorithm, and various algorithms can be used. However, it is preferable that the multiple authentication scores prepared when determining one formula for calculating the quality score are prepared using the same algorithm.
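- As a sketch of the Hamming-distance decision described above (the bit length, decision threshold, and function names are illustrative assumptions, not values from this disclosure):

```python
import numpy as np

def fractional_hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    # Fraction of differing bits between two binary iris codes.
    return np.count_nonzero(code_a ^ code_b) / code_a.size

def is_same_person(enrolled: np.ndarray, probe: np.ndarray,
                   threshold: float = 0.32) -> bool:
    # Accept as the registered person when the distance is below the threshold.
    return fractional_hamming_distance(enrolled, probe) < threshold

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, size=2048, dtype=np.uint8)
probe = enrolled.copy()
probe[:100] ^= 1                        # flip 100 bits: distance is about 0.05
print(is_same_person(enrolled, probe))  # True
```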
- the one or more weights used in the formula may be generated by a neural network. That is, the degree of deterioration related to one or more deterioration factors and the authentication score are used as input data for the neural network, and the one or more weights are used as output data for the neural network. Then, the neural network is trained so that the quality score calculated using the output weights approaches the authentication score as the correct answer. The one or more weights output from the trained neural network thus obtained are used as weights to be used in the formula for calculating the quality score.
- The control unit 150 calculates the quality score by substituting the degrees of degradation indicated in the degradation information generated by the estimation unit 130 into the predetermined formula as described above.
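- The first example can be sketched as follows: one weight per degradation factor (plus a bias) is fitted by least squares so that the weighted sum of degradation degrees reproduces the authentication scores. The numeric data below are placeholders purely to show the fitting step, not data from the disclosure.

```python
import numpy as np

# Degrees of degradation per image (rows; columns = degradation factors) and
# an authentication score per image serving as the correct answer.
D = np.array([[0.1, 0.2, 0.0],
              [0.7, 0.1, 0.3],
              [0.4, 0.6, 0.2],
              [0.0, 0.0, 0.1]])
auth_scores = np.array([0.8, 0.3, 0.4, 0.9])

# Fit weights and a bias so the weighted sum approximates the authentication score.
X = np.hstack([D, np.ones((D.shape[0], 1))])
w, *_ = np.linalg.lstsq(X, auth_scores, rcond=None)

def quality_score(degrees: np.ndarray) -> float:
    # Weighted sum of the estimated degradation degrees, as in the formula.
    return float(np.append(degrees, 1.0) @ w)

print(quality_score(np.array([0.2, 0.1, 0.0])))
```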
- <Second Example> FIG. 11 is a diagram illustrating a quality estimation model 151 used by the control unit 150 to generate a quality score.
- Input data of the quality estimation model 151 includes degradation information, and output data of the quality estimation model 151 includes a quality score.
- the quality estimation model 151 includes a neural network. This neural network is a trained neural network that has been trained by machine learning using multiple combinations of an authentication score (corresponding to a correct answer of the quality score) and one or more degrees of degradation, as described in the first example.
- The control unit 150 inputs each degree of degradation indicated in the degradation information generated by the estimation unit 130 to the quality estimation model 151.
- the control unit 150 then obtains a quality score output from the quality estimation model 151.
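- A toy stand-in for the quality estimation model 151 is sketched below. The architecture and the random weights are assumptions; trained weights would come from the combinations of authentication scores and degradation degrees described in the first example.

```python
import numpy as np

class QualityEstimationModel:
    # Stand-in for model 151: maps degradation degrees to one quality score.
    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def predict(self, deg_vec: np.ndarray) -> float:
        h = np.tanh(self.w1 @ deg_vec + self.b1)  # small hidden layer
        return float(self.w2 @ h + self.b2)

rng = np.random.default_rng(1)
model_151 = QualityEstimationModel(rng.normal(size=(8, 5)), np.zeros(8),
                                   rng.normal(size=8), 0.0)
# Five degradation degrees in, one quality score out.
score = model_151.predict(np.array([0.1, 0.3, 0.0, 0.2, 0.4]))
```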
- As described above, according to this embodiment, the control unit 150 uses the degradation information to estimate the quality of the target image, which indicates its suitability as an image to be used for authentication. Therefore, it is possible to obtain a target image suitable for authentication processing.
- (Fourth Embodiment) The information processing system 50, information processing device 10, and information processing method according to the fourth embodiment are the same as those according to at least any of the first to third embodiments, except for the points described below.
- the control unit 150 stops the camera 20 from capturing an image of the target 90 when the degradation information satisfies a predetermined condition D. This makes it possible to reduce unnecessary processing when there is a low probability of obtaining a good target image.
- Condition D is a condition that indicates that it is unlikely that a good target image will be obtained.
- Condition D may be a condition on the degradation information itself, or may be a condition on the quality (quality score) obtained from the degradation information as in the third embodiment. In that case, if it is estimated that the target image does not meet the quality required for authentication, the imaging of the target 90 by the camera 20 is stopped.
- For example, condition D is that, of the one or more degrees of deterioration indicated in the deterioration information, a predetermined number or more are equal to or higher than the predetermined standard set for each degree of deterioration.
- Alternatively, condition D is that the quality score obtained based on the deterioration information is equal to or lower than a predetermined score.
- condition D is not limited to these examples.
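- Both example forms of condition D above can be sketched as follows; the count and score thresholds are illustrative assumptions.

```python
def satisfies_condition_d(deg: dict, standards: dict,
                          min_exceeded: int = 2,
                          quality: float = None,
                          min_quality: float = 0.5) -> bool:
    # Variant 1: a predetermined number or more of the degradation degrees
    # are at or above the standard set for each degree.
    exceeded = sum(1 for k, v in deg.items() if v >= standards[k])
    if exceeded >= min_exceeded:
        return True
    # Variant 2: the quality score derived from the degradation information
    # is at or below a predetermined score.
    return quality is not None and quality <= min_quality

print(satisfies_condition_d({"motion_blur": 0.8, "focus_blur": 0.6},
                            {"motion_blur": 0.5, "focus_blur": 0.5}))  # True
```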
- FIG. 12 is a block diagram illustrating the functional configuration of an information processing system 50 according to this embodiment.
- the information processing system 50 according to this embodiment includes one or more notification units 70, similar to the information processing system 50 according to the second embodiment.
- the control unit 150 according to this embodiment stops the camera 20 from capturing an image of the target 90 and outputs control information for executing a notification.
- the control unit 150 according to this embodiment does not need to output control information for controlling the imaging conditions of the camera 20.
- If the deterioration information satisfies the predetermined condition D, the control unit 150 outputs control information to cause one or more notification units 70 to issue a notification to the target 90, for example, encouraging the target 90 to go back and start the movement again. In this way, a message such as "Please go back a little and repeat the route" may be displayed on the display or output from the speaker.
- Such a notification is expected to cause the target 90 to try moving again.
- the information processing system 50 can then repeat measurements at point P1, generate status information, generate degradation information, and so on. According to this example, these repeats can be performed without capturing an image of the target 90 with the camera 20, so a good image of the target can be obtained in a short time compared to a case in which the target 90 is captured by the camera 20 and then the user is prompted to try again.
- the information processing system 50 further includes an alternative camera 22.
- the alternative camera 22 is a camera provided separately from the camera 20, and is a camera for capturing an image of the target 90 in place of the camera 20 to generate a target image.
- the hardware configuration of the computer that realizes the information processing device 10 according to this embodiment is shown in FIG. 8, for example, similar to the information processing device 10 according to the first embodiment.
- the alternative camera 22 is connected to the information processing device 10 via an input/output interface 1100 or a network interface 1120.
- If the degradation information satisfies the predetermined condition D, the control unit 150 outputs control information to cause one or more notification units 70 to issue a notification urging the target 90 to stop in the imaging area of the alternative camera 22. In this way, a message such as "Please stop in front of the camera on the right side" is displayed on the display or output from the speaker.
- This notification is expected to cause the target 90 to stop at a position where the alternative camera 22 can capture a good image of the target 90.
- the information processing system 50 then captures an image of the target 90 using the alternative camera 22, and obtains a good image of the target.
- imaging using the alternative camera 22 can be performed without capturing an image of the target 90 using the camera 20, so a good image of the target can be obtained in a short time compared to the case where the camera 20 captures an image of the target 90 and then prompts the user to capture the image again.
- As described above, according to this embodiment, the control unit 150 stops the camera 20 from capturing an image of the target 90 when the degradation information satisfies the predetermined condition D. Therefore, it is possible to reduce unnecessary processing when there is a low possibility of obtaining a good target image.
- (Fifth Embodiment) The information processing system 50, information processing device 10, and information processing method according to the fifth embodiment are the same as those according to at least any of the first to fourth embodiments, except for the points described below.
- the status information further indicates the imaging conditions of the camera 20.
- the imaging conditions include the imaging parameters of the camera 20 and the environmental conditions.
- the status information indicates one or more of the exposure time of the camera 20, the lighting conditions for the target 90, the focal position of the camera 20, the lens aperture of the camera 20, and the brightness of the imaging area captured by the camera 20.
- As described in the first embodiment, the degradation estimation model 131 may be a model trained on the premise of specific imaging conditions.
- However, the actual imaging conditions do not necessarily match the imaging conditions assumed by the degradation estimation model 131.
- For example, the imaging parameters of the camera 20 may not be fixed, and may be automatically adjusted according to the brightness of the imaging area, etc.
- the status information indicates, for example, one or more of the exposure time of the camera 20, the lighting conditions for the target 90, the focal position of the camera 20, the lens aperture of the camera 20, and the brightness of the area captured by the camera 20.
- Examples of the lighting conditions for the target 90 include at least one of the direction and intensity of the lighting by the lighting unit 40.
- Examples of the focal position of the camera 20 include at least one of the focal length and the coordinates of the focal position.
- the information processing device 10 can obtain information indicating the exposure time of the camera 20, the focal position of the camera 20, and the lens aperture of the camera 20 from the camera 20.
- the information processing device 10 can obtain information indicating the lighting state of the target 90 from the lighting unit 40.
- the information processing device 10 can obtain information indicating the brightness of the area captured by the camera 20 from an illuminance sensor or the like provided in the image capturing area.
- the information processing device 10 acquires one or more pieces of information indicating the imaging conditions of the camera 20 as described above. Then, the information processing device 10 generates status information including both the information indicating the state of the target 90 and the information indicating the imaging conditions of the camera 20.
- the status information can be a vector whose elements are each of multiple types of information (face direction, body posture, exposure time of the camera 20, focal position of the camera 20, etc.). Then, the acquisition unit 110 acquires the generated status information.
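- A sketch of such an extended state vector follows; the element order, units, and parameter names are assumptions for illustration.

```python
import numpy as np

def build_extended_state_vector(target_state: np.ndarray, exposure_ms: float,
                                focal_pos_m: float, f_number: float,
                                scene_lux: float) -> np.ndarray:
    # Target-state elements followed by imaging-condition elements
    # (exposure time, focal position, lens aperture, area brightness).
    return np.concatenate([target_state,
                           [exposure_ms, focal_pos_m, f_number, scene_lux]])

target_state = np.array([10.0, -5.0, 8.0, -2.0, 0.0, 1.25, 0.0])
state = build_extended_state_vector(target_state, exposure_ms=4.0,
                                    focal_pos_m=0.6, f_number=5.6,
                                    scene_lux=300.0)
```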
- the estimation unit 130 can generate degradation information using the degradation estimation model 131.
- In this embodiment, the state information includes information indicating the imaging conditions of the camera 20, so it can be said that the input of the degradation estimation model 131 also includes information indicating the imaging conditions of the camera 20.
- the degradation estimation model 131 according to this embodiment can be trained in the same way as the degradation estimation model 131 according to the first embodiment.
- the state information used in the machine learning of the degradation estimation model 131 according to this embodiment includes information indicating the imaging conditions of the camera 20. This information indicating the imaging conditions is information indicating the imaging conditions when the image that is the source of the correct degradation information was captured.
- the degradation estimation model 131 is a model that can be used regardless of the imaging conditions.
- the estimation unit 130 generates degradation information in the same manner as described in the first embodiment, using state information including information indicating the imaging conditions.
- the control unit 150 outputs control information in the same manner as the control unit 150 according to at least any one of the first to fourth embodiments.
- the status information further indicates the imaging conditions of the camera 20. Therefore, the cause of deterioration and the degree of deterioration can be estimated with high accuracy.
- Sixth Embodiment. Fig. 13 is a diagram for explaining a method in which the estimation unit 130 according to the sixth embodiment generates degradation information.
- the information processing system 50, the information processing device 10, and the information processing method according to this embodiment are the same as the information processing system 50, the information processing device 10, and the information processing method according to at least any one of the first embodiment to the fifth embodiment, except for the points described below.
- the estimation unit 130 uses the state information to generate image-capture state information indicating the result of estimating the state of the target 90 at the time when the target 90 reaches the focus of the camera 20.
- the estimation unit 130 then generates degradation information using the image-capture state information.
- The estimation unit 130 according to this embodiment performs degradation estimation in two stages: a first stage that estimates the state at the time of image capture from the state at point P1, and a second stage that estimates the degradation of the target image from the state at the time of image capture. This allows the information processing system 50 to adapt to changes in the state between point P1 and point P2, caused by changes in the usage environment of the information processing system 50, without changing the processing of the second stage (e.g., the estimation model).
- the configuration of the image capture state information is the same as the configuration of the state information. That is, the image capture state information indicates at least the state of the object.
- the image capture state information may further indicate the image capture conditions of the camera 20.
- the image capture state information may be a vector whose elements are each of multiple types of information.
- The estimation unit 130 generates degradation information from state information using, for example, an image-capture state estimation model 132 and a degradation estimation model 133.
- The image-capture state estimation model 132 and the degradation estimation model 133 are each trained models including a neural network.
- The image-capture state estimation model 132 is a model for estimating the state at the time of image capture based on the state at point P1, and the degradation estimation model 133 is a model for estimating the degradation of the target image based on the state at the time of image capture.
- The input data of the image-capture state estimation model 132 includes state information, and its output data includes state information at the time of image capture.
- The image-capture state estimation model 132 can be prepared in advance by performing machine learning using state information and correct state information as learning data.
- The correct state information is information obtained by measuring the state at point P2, i.e., at the time of image capture. In other words, the state at point P2 can be measured, and the correct state information generated, in the same manner as the information processing system 50 generates state information at point P1.
- The state information used for the machine learning of the image-capture state estimation model 132 may be the same as the state information used for the machine learning of the degradation estimation model 131. It is preferable that the configuration of the correct state information be the same as that of the state information.
- the input data of the degradation estimation model 133 includes state information at the time of imaging, and the output data of the degradation estimation model 133 includes degradation information.
- the degradation estimation model 133 can be prepared in advance by performing machine learning using the above-mentioned correct state information and the above-mentioned correct degradation information as learning data.
- The estimation unit 130 inputs the state information acquired by the acquisition unit 110 to the image-capture state estimation model 132, and obtains the image capture state information output from that model. The estimation unit 130 then inputs the image capture state information to the degradation estimation model 133, and obtains the degradation information output from that model, as sketched below.
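A minimal sketch of this two-stage inference, again assuming PyTorch. The dimensions and layer sizes are illustrative assumptions, and the randomly initialized networks stand in for the trained models 132 and 133.

```python
# Minimal sketch of the two-stage estimation described above.
import torch
import torch.nn as nn

STATE_DIM, N_FACTORS = 9, 5

capture_state_model = nn.Sequential(   # model 132: state at P1 -> state at P2
    nn.Linear(STATE_DIM, 32), nn.ReLU(),
    nn.Linear(32, STATE_DIM),
)
degradation_model = nn.Sequential(     # model 133: state at P2 -> degradation
    nn.Linear(STATE_DIM, 32), nn.ReLU(),
    nn.Linear(32, N_FACTORS), nn.Sigmoid(),
)

def estimate_degradation(status_p1: torch.Tensor) -> torch.Tensor:
    capture_state = capture_state_model(status_p1)   # first stage
    return degradation_model(capture_state)          # second stage

print(estimate_degradation(torch.randn(1, STATE_DIM)))
```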
- the control unit 150 outputs control information, similar to the control unit 150 according to at least any one of the first to fourth embodiments.
- the status information may or may not include information indicating the imaging conditions of the camera 20. If the status information includes information indicating the imaging conditions of the camera 20, the imaging time status information and the correct status information also include information indicating the imaging conditions of the camera 20.
- An image-capture state estimation model 132 and a degradation estimation model 133 may be prepared for each of multiple imaging conditions. In that case, the models for each of the multiple imaging conditions are stored in advance in a storage device accessible by the estimation unit 130. Then, as in the first embodiment, the estimation unit 130 uses the image-capture state estimation model 132 and the degradation estimation model 133 that correspond to the imaging conditions at the time the status information was obtained.
- the estimation unit 130 may generate image capture state information based on a predetermined rule.
- the estimation unit 130 can generate image capture state information from state information based on the following rules.
- Time T1: the time when the state measurement unit 30 measures the state of the target 90.
- Time T2: the time when the target 90 reaches the focus of the camera 20.
- the estimation unit 130 estimates the body posture and moving speed at time T2, for example, based on the body posture and moving speed of the subject 90 at time T1.
- For this estimation, the walking model described in "Comprehensive Analysis Model and Simulation of Bipedal Walking" by Yamazaki Nobuhisa (Biomechanism, Vol. 3, 1975, pp. 261-269) can be used, for example.
- The estimation unit 130 may also estimate a lower moving speed at time T2 than at time T1. The amount of the speed reduction can be determined in advance based on the results of prior research or experiments (for example, as an average over multiple subjects 90).
- the estimation unit 130 takes the facial direction and gaze direction of the target 90 at time T1 as the estimated results of the facial direction and gaze direction of the target 90 at time T2. However, if guidance regarding the face and gaze direction is always performed, the estimation unit 130 may take the direction toward the guidance destination as the estimated results of the facial direction and gaze direction of the target 90 at time T2.
- the estimation unit 130 uses the presence or absence of glasses at time T1 as the estimation result of the presence or absence of glasses at time T2. However, if the subject 90 is constantly guided to remove his or her glasses, the estimation unit 130 may estimate that the subject 90 is not wearing glasses at time T2.
- In this way, the estimation unit 130 can generate image capture state information from state information based on these rules, as sketched below. Then, by inputting the generated image capture state information to the degradation estimation model 133, degradation information can be obtained.
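A minimal sketch of the rule-based generation of image capture state information. The field names, the fixed speed reduction, and the guidance flags are illustrative assumptions; the patent specifies the rules only qualitatively.

```python
# Minimal sketch of the rules described above for estimating the state at
# time T2 from the state measured at time T1.
from dataclasses import dataclass, replace

@dataclass
class TargetState:
    face_direction_deg: float   # 0.0 = facing the camera (assumed convention)
    gaze_direction_deg: float
    moving_speed_mps: float
    wears_glasses: bool

# Predetermined speed reduction between T1 and T2, e.g. an average measured
# over multiple subjects in prior experiments (illustrative value).
SPEED_REDUCTION_MPS = 0.2

def estimate_state_at_t2(state_t1: TargetState, guides_gaze: bool,
                         guides_glasses_removal: bool) -> TargetState:
    state_t2 = replace(state_t1)  # start from the state at T1
    # Rule: the moving speed at T2 may be lower by a predetermined amount.
    state_t2.moving_speed_mps = max(
        0.0, state_t1.moving_speed_mps - SPEED_REDUCTION_MPS)
    # Rule: if face/gaze guidance is always performed, assume the guided direction.
    if guides_gaze:
        state_t2.face_direction_deg = 0.0
        state_t2.gaze_direction_deg = 0.0
    # Rule: if glasses removal is always requested, assume no glasses at T2.
    if guides_glasses_removal:
        state_t2.wears_glasses = False
    return state_t2
```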
- the estimation unit 130 uses the state information to generate image capture state information indicating the result of estimating the state of the object 90 at the time when the object 90 reaches the focus of the camera 20. Then, the estimation unit 130 uses the image capture state information to generate degradation information. Therefore, it is possible to respond to changes in state between point P1 and point P2 with minor changes.
- Seventh Embodiment. Fig. 14 is a block diagram illustrating a functional configuration of an information processing system 50 according to the seventh embodiment.
- Fig. 15 is a diagram illustrating a usage environment of the information processing system 50 according to the present embodiment.
- the information processing system 50, the information processing device 10, and the information processing method according to the present embodiment are the same as the information processing system 50, the information processing device 10, and the information processing method according to at least any one of the first embodiment to the sixth embodiment, except for the points described below.
- the information processing device 10 further includes a first authentication unit 170. That is, the information processing system 50 according to this embodiment further includes a first authentication unit 170.
- the first authentication unit 170 performs authentication using a target image of the target 90 generated by the camera 20.
- the authentication process performed by the first authentication unit 170 is, for example, iris authentication or face authentication.
- the authentication process performed by the first authentication unit 170 is not limited to these examples and can be any authentication process.
- the first authentication unit 170 can perform authentication processing using an existing method. For example, the first authentication unit 170 obtains feature information by detecting a predetermined area from the target image and extracting features of the detected area. When the first authentication unit 170 performs iris authentication processing, the predetermined area is an area corresponding to the iris. When the first authentication unit 170 performs face authentication processing, the predetermined area is an area corresponding to the face.
- the authentication information storage unit 100 holds multiple pieces of authentication information in advance, in which identification information and feature information are associated with each other.
- the identification information is information for individually identifying multiple targets 90. If the target 90 is a person, the identification information is, for example, personal identification information.
- the authentication information storage unit 100 may be included in the information processing device 10, or may be provided outside the information processing device 10. However, the first authentication unit 170 can access the authentication information storage unit 100. The first authentication unit 170 compares the feature information obtained from the target image with each piece of feature information stored in the authentication information storage unit 100. The first authentication unit 170 then identifies the feature information that has the highest degree of match with the feature information obtained from the target image from among the multiple pieces of feature information stored in the authentication information storage unit 100. The first authentication unit 170 then identifies the identification information associated with the identified feature information as the identification information of the target 90 captured by the camera 20.
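A minimal sketch of the 1:N comparison just described. Cosine similarity on feature vectors is an illustrative assumption; iris systems often compare iris codes by Hamming distance instead, as the score discussion later in this document notes.

```python
# Minimal sketch: compare probe features against every enrolled feature and
# return the identification information with the highest degree of match.
import numpy as np

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray]):
    """gallery maps identification information -> enrolled feature vector."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {person_id: cosine(probe, feat) for person_id, feat in gallery.items()}
    best_id = max(scores, key=scores.get)
    return best_id, scores[best_id]
```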
- the first authentication unit 170 outputs the identification information of the identified target 90.
- the first authentication unit 170 may, for example, display the identification information of the target 90 on a display, or may transmit it to another device.
- If authentication does not succeed, the first authentication unit 170 may output information to that effect.
- the first authentication unit 170 may also control the passage of the target 90 according to the identification information of the target 90.
- the information processing system 50 further includes a gate 80.
- the authentication information stored in the authentication information storage unit 100 includes information associated with the identification information indicating whether or not the target 90 is allowed to pass through.
- When the first authentication unit 170 identifies the identification information of the target 90, it reads out the information, associated with that identification information, indicating whether or not the target 90 is allowed to pass. If information indicating that the target 90 is allowed to pass is associated with the identification information, the first authentication unit 170 puts the gate 80 in a state where the target 90 can pass through; otherwise, it does not put the gate 80 in such a state.
- The first authentication unit 170 also does not put the gate 80 in a state where the target 90 can pass through if none of the feature information stored in the authentication information storage unit 100 matches the feature information obtained from the target image to a degree exceeding a predetermined standard. In this way, the information processing system 50 can control the passage of the target 90, as sketched below.
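A minimal sketch of this gate control. The threshold value and the gate interface (open/keep_closed) are illustrative assumptions.

```python
# Minimal sketch: open the gate 80 only if the best match exceeds the
# predetermined standard and the identified person is allowed to pass.
MATCH_THRESHOLD = 0.8  # illustrative value for the predetermined standard

def control_gate(best_id: str, best_score: float, allowed_ids: set, gate) -> None:
    if best_score >= MATCH_THRESHOLD and best_id in allowed_ids:
        gate.open()         # put the gate in a state where the target can pass
    else:
        gate.keep_closed()  # authentication failed or passage not permitted
```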
- the hardware configuration of the computer that realizes the information processing device 10 according to this embodiment is shown in FIG. 8, for example, similar to the information processing device 10 according to the first embodiment.
- the storage device 1080 of the computer 1000 that realizes the information processing device 10 according to this embodiment further stores a program module that realizes the functions of the first authentication unit 170 of this embodiment.
- If the authentication information storage unit 100 is provided inside the information processing device 10, it is realized using, for example, the storage device 1080.
- the gate 80 is connected to the information processing device 10 via an input/output interface 1100 or a network interface 1120.
- the information processing device 10 according to this embodiment further includes a first authentication unit 170. Therefore, authentication can be performed using a target image.
- Eighth Embodiment. Fig. 16 is a block diagram illustrating a functional configuration of an information processing system 50 according to the eighth embodiment.
- the information processing system 50, the information processing device 10, and the information processing method according to this embodiment are the same as the information processing system 50, the information processing device 10, and the information processing method according to at least any one of the first embodiment to the seventh embodiment, except for the points described below.
- the information processing system 50 includes a first authentication unit 170.
- the first authentication unit 170 performs authentication using a target image.
- the information processing system 50 according to this embodiment further includes a second authentication unit 190.
- the second authentication unit 190 performs authentication using information different from the target image. If the quality based on the degradation information satisfies a predetermined condition A, the control unit 150 increases the importance of authentication by the second authentication unit 190. This can increase the possibility of authentication by the second authentication unit 190 even if a good target image cannot be obtained by the camera 20.
- the first authentication unit 170 performs, for example, iris authentication processing using an image of the target obtained by the camera 20.
- the second authentication unit 190 performs, for example, face authentication using an image obtained by a second authentication camera 60 different from the camera 20.
- the camera 20 and the second authentication camera 60 are controlled independently to obtain an image of the target 90.
- the authentication processing performed by the first authentication unit 170 and the second authentication unit 190 is as described for the first authentication unit 170 in the seventh embodiment.
- the information processing system 50 can obtain both the authentication result by the first authentication unit 170 and the authentication result by the second authentication unit 190.
- the information processing system 50 may output both of these authentication results, or may output only the authentication result with the higher reliability.
- the authentication result with the higher authentication score may be output as the authentication result with the higher reliability.
- Examples of authentication scores include the degree of match with the feature information in the authentication information described above, and a score based on the Hamming distance described above.
- The information processing system 50 may instead determine and output a final authentication result obtained by integrating the two authentication results and their authentication scores. For example, the final authentication result may be determined and output based on the sum, product, or average of the two authentication scores, as sketched below.
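A minimal sketch of this integration. Which rule to apply, and the score scales, are left open by the text; the rules below simply mirror the listed alternatives.

```python
# Minimal sketch: combine the two authentication scores into a final score.
def fuse_scores(iris_score: float, face_score: float, rule: str = "average") -> float:
    if rule == "sum":
        return iris_score + face_score
    if rule == "product":
        return iris_score * face_score
    if rule == "average":
        return (iris_score + face_score) / 2.0
    if rule == "max":  # output only the result with the higher score
        return max(iris_score, face_score)
    raise ValueError(f"unknown fusion rule: {rule}")
```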
- The control unit 150 in this embodiment uses the degradation information to estimate a quality score of the target image, which indicates its suitability as an image to be used for authentication.
- If the quality score satisfies the predetermined condition A, the control unit 150 increases the importance of authentication by the second authentication unit 190. That is, the control unit 150 controls the second authentication camera 60 to improve the quality of the image of the target 90 that it obtains. For example, the control unit 150 outputs control information for increasing the resolution of the second authentication camera 60 when imaging the target 90. Alternatively, the control unit 150 outputs control information for causing the second authentication camera 60 to image the target 90 at a timing when it obtains an image in which a predetermined part of the target 90 appears larger.
- the predetermined part of the target 90 is a part that the second authentication unit 190 uses for authentication, such as the face.
- If the quality score does not satisfy the predetermined condition A, the control unit 150 does not increase the importance of authentication by the second authentication unit 190; a sketch of this conditional control follows below.
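A minimal sketch of this conditional control. The quality threshold standing in for condition A, and the camera interface, are illustrative assumptions.

```python
# Minimal sketch: if the estimated quality score of the target image is low
# (condition A), raise the importance of the second authentication by
# reconfiguring the second authentication camera 60.
QUALITY_THRESHOLD = 0.5  # illustrative stand-in for condition A

def adjust_second_camera(quality_score: float, second_camera) -> None:
    if quality_score < QUALITY_THRESHOLD:
        # e.g. request higher resolution, or trigger capture at a timing when
        # the face appears larger in the frame (assumed camera API).
        second_camera.set_resolution("high")
        second_camera.set_trigger("face_appears_large")
    # Otherwise the second camera is left unchanged.
```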
- the hardware configuration of the computer that realizes the information processing device 10 according to this embodiment is shown in FIG. 8, for example, similar to the information processing device 10 according to the first embodiment.
- the storage device 1080 of the computer 1000 that realizes the information processing device 10 according to this embodiment further stores program modules that realize the functions of the first authentication unit 170 and the second authentication unit 190 of this embodiment.
- If the authentication information storage unit 100 is provided inside the information processing device 10, it is realized using, for example, the storage device 1080.
- the second authentication camera 60 is connected to the information processing device 10 via an input/output interface 1100 or a network interface 1120.
- As described above, the control unit 150 increases the importance of authentication by the second authentication unit 190 when the quality based on the degradation information satisfies the predetermined condition A. Therefore, when it is estimated that a good target image cannot be obtained by the camera 20, the possibility of successful authentication by the second authentication unit 190 can be increased.
- This modification is a modification of the eighth embodiment.
- the information processing system 50, the information processing device 10, and the information processing method according to this modification are the same as the information processing system 50, the information processing device 10, and the information processing method according to the eighth embodiment, respectively, except for the points described below.
- the information processing system 50 includes a first authentication unit 170 and a second authentication unit 190, similar to the information processing system 50 according to the eighth embodiment.
- In this modification, the control unit 150 increases the importance of authentication by the second authentication unit 190 when the degradation information satisfies a predetermined condition B.
- Unlike condition A, which concerns the quality estimated from the degradation information, condition B is a condition on the degradation information itself.
- Condition B is, for example, that among the one or more degrees of degradation indicated in the degradation information, a predetermined number or more are equal to or above a standard set for each degree of degradation (see the sketch below).
- When condition B is satisfied, the control unit 150 increases the importance of authentication by the second authentication unit 190.
- An example of the method in which the control unit 150 increases the importance of authentication by the second authentication unit 190 is as described in the eighth embodiment.
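A minimal sketch of condition B as described above. The per-factor standards and the required count are illustrative assumptions.

```python
# Minimal sketch: condition B holds when a predetermined number or more of
# the degrees of degradation meet or exceed their per-factor standards.
PER_FACTOR_STANDARDS = {
    "focus_blur": 0.6, "motion_blur": 0.6, "eyelid_occlusion": 0.5,
    "lighting_reflection_occlusion": 0.5, "off_angle": 0.4,
}
REQUIRED_COUNT = 2  # the predetermined number (illustrative value)

def condition_b_satisfied(degradation: dict[str, float]) -> bool:
    exceeded = sum(1 for factor, degree in degradation.items()
                   if degree >= PER_FACTOR_STANDARDS.get(factor, 1.0))
    return exceeded >= REQUIRED_COUNT
```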
- 1-1. An information processing system comprising: a camera capable of capturing an image of a target; an acquisition means for acquiring status information indicating at least a status of the target; an estimation means for generating deterioration information relating to deterioration that is estimated to occur in an object image obtained by capturing an image of the object with the camera, using the state information; and a control means for outputting control information corresponding to the degradation information at least one of before the object is imaged by the camera and when the object is imaged by the camera.
- 1-2. The state information indicates at least a state of the object at a time when the object is located at a point farther away than the focal point of the camera, and the control means outputs the control information at least one of before the object reaches the focus of the camera and when the object reaches the focus of the camera.
- 1-3. The estimation means generates image capture state information indicating an estimation result of a state of the object at a time when the object reaches the focal point of the camera using the state information, and generates the deterioration information using the image capture state information.
- 1-4. In the information processing system according to any one of 1-1., the degradation information indicates one or more degradation factors occurring in the target image and a degree of degradation for each of the one or more degradation factors.
- 1-5. The target image is an image including an iris, and the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle.
- 1-6. The control means estimates a quality of the target image, which indicates suitability as an image to be used for authentication, using the degradation information, and outputs the control information according to the quality.
- 1-9. The state information indicates one or more of a face direction, a body posture, a gaze direction, a moving speed, and whether or not glasses are worn.
- 1-10. The status information further indicates an imaging condition of the camera.
- 1-12. The estimation means generates the degradation information using a trained neural network trained using the state information and correct degradation information.
- 1-13. The control means outputs the control information for executing a notification when the deterioration information satisfies a predetermined condition C.
- 1-14. The control means stops the camera from capturing an image of the target when the deterioration information satisfies a predetermined condition D.
- 1-15. In the information processing system according to any one of 1-1., the control means outputs the control information for controlling the imaging conditions of the camera based on the deterioration information.
- 2-2. The state information indicates at least a state of the object at a time when the object is located at a point farther away than the focal point of the camera, and the control means outputs the control information at least one of before the object reaches the focus of the camera and when the object reaches the focus of the camera.
- 2-3. The estimation means generates image capture state information indicating an estimation result of a state of the object at a time when the object reaches the focal point of the camera using the state information, and generates the degradation information using the image capture state information.
- 2-4. The degradation information indicates one or more degradation factors occurring in the target image and a degree of degradation of each of the one or more degradation factors.
- 2-5. The target image is an image including an iris, and the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle.
- 2-6. The control means estimates a quality of the target image, which indicates suitability as an image to be used for authentication, using the degradation information, and outputs the control information according to the quality.
- 2-9. The state information indicates one or more of a face direction, a body posture, a gaze direction, a moving speed, and whether or not glasses are worn.
- 2-10. The status information further indicates an imaging condition of the camera.
- 2-11. The status information indicates one or more of the exposure time of the camera, the lighting condition for the object, the focal position of the camera, the lens aperture of the camera, and the brightness of the area captured by the camera.
- 2-12. The estimation means generates the degradation information using a trained neural network trained using the state information and correct degradation information.
- 2-13. The control means outputs the control information for executing a notification when the deterioration information satisfies a predetermined condition C.
- 2-14. In the information processing device according to any one of 2-1., the control means stops the camera from capturing an image of the target when the deterioration information satisfies a predetermined condition D.
- 2-15. The control means outputs the control information for controlling the imaging conditions of the camera based on the deterioration information.
- 3-1. An information processing method in which one or more computers: obtain status information indicating at least a status of the target; generate deterioration information regarding deterioration that is estimated to occur in an object image obtained by capturing an image of the object with a camera, using the state information; and output control information corresponding to the degradation information at least one of before the object is imaged by the camera and when the object is imaged by the camera.
- 3-2. The state information indicates at least a state of the object at a time when the object is located at a point farther away than the focal point of the camera, and the one or more computers output the control information at least one of before the object reaches the focus of the camera and when the object reaches the focus of the camera.
- 3-3. In the information processing method described in 3-2., the one or more computers generate image capture state information indicating an estimation result of a state of the object at a time when the object reaches the focal point of the camera using the state information, and generate the degradation information using the image capture state information.
- 3-4. In the information processing method according to any one of 3-1., the degradation information indicates one or more degradation factors occurring in the target image and a degree of degradation for each of the one or more degradation factors.
- 3-5. The target image is an image including an iris, and the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle.
- 3-6. The one or more computers estimate a quality of the target image, which indicates suitability as an image to be used for authentication, using the degradation information, and output the control information according to the quality.
- 3-7. The one or more computers further perform authentication using the target image and perform authentication using information different from the target image, and increase the importance of the authentication using information different from the target image when the quality satisfies a predetermined condition A.
- 3-8. The one or more computers further perform authentication using the target image and perform authentication using information different from the target image, and increase the importance of the authentication using information different from the target image when the degradation information satisfies a predetermined condition B.
- 3-9. In the information processing method according to any one of 3-1., the state information indicates one or more of a face direction, a body posture, a gaze direction, a moving speed, and whether or not glasses are worn.
- 3-10. The status information further indicates an imaging condition of the camera.
- 3-11. The status information indicates one or more of the exposure time of the camera, the lighting condition for the object, the focus position of the camera, the lens aperture of the camera, and the brightness of the area captured by the camera.
- 3-13. The one or more computers output the control information for executing a notification when the degradation information satisfies a predetermined condition C.
- 3-15. The one or more computers output the control information for controlling an imaging condition of the camera based on the degradation information.
- 4-1. A computer-readable recording medium having a program recorded thereon, the program causing a computer to function as: an acquisition means that acquires status information indicating at least a status of an object; an estimation means for generating, using the status information, degradation information relating to degradation that is estimated to occur in an object image obtained by imaging the object with a camera; and a control means for outputting control information corresponding to the degradation information at least one of before the object is imaged with the camera and when the object is imaged with the camera.
- 4-2. The state information indicates at least a state of the object at a time when the object is located at a point farther away than the focal point of the camera, and the control means outputs the control information at least one of before the object reaches the focus of the camera and when the object reaches the focus of the camera.
- 4-3. The estimation means generates image capture state information indicating an estimation result of a state of the object at a time when the object reaches the focal point of the camera using the state information, and generates the deterioration information using the image capture state information.
- 4-4. The degradation information indicates one or more degradation factors occurring in the target image and the degree of degradation of each of the one or more degradation factors.
- 4-5. The target image is an image including an iris, and the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle.
- 4-6. The control means estimates a quality of the target image, which indicates suitability as an image to be used for authentication, using the degradation information, and outputs the control information according to the quality.
- 4-7. The program further causes the computer to function as: a first authentication means for performing authentication using the target image; and a second authentication means for performing authentication using information different from the target image, and the control means increases the importance of authentication by the second authentication means when the quality satisfies a predetermined condition A.
- 4-8. In the recording medium according to any one of 4-1. to 4-7., the program further causes the computer to function as: a first authentication means for performing authentication using the target image; and a second authentication means for performing authentication using information different from the target image, and the control means increases the importance of authentication by the second authentication means when the degradation information satisfies a predetermined condition B.
- 4-9. The state information indicates one or more of a face direction, a body posture, a gaze direction, a moving speed, and whether or not glasses are worn.
- 4-11. In the recording medium according to 4-10., the status information indicates one or more of the exposure time of the camera, the lighting conditions for the object, the focal position of the camera, the lens aperture of the camera, and the brightness of the area captured by the camera.
- 4-12. The estimation means generates the degradation information using a trained neural network trained using the state information and correct degradation information.
- 4-13. In the recording medium according to any one of 4-1. to 4-12., the control means outputs the control information for executing a notification when the deterioration information satisfies a predetermined condition C.
- 4-14. In the recording medium according to any one of 4-1. to 4-13., the control means stops the camera from capturing an image of the object when the deterioration information satisfies a predetermined condition D.
- 5-2. The state information indicates at least a state of the object at a time when the object is located at a point farther away than the focal point of the camera, and the control means outputs the control information at least one of before the object reaches the focus of the camera and when the object reaches the focus of the camera.
- 5-3. The estimation means generates image capture state information indicating an estimation result of a state of the object at a time when the object reaches the focal point of the camera using the state information, and generates the deterioration information using the image capture state information.
- 5-4. The degradation information indicates one or more degradation factors occurring in the target image and the degree of degradation of each of the one or more degradation factors.
- 5-5. The target image is an image including an iris, and the one or more degradation factors include one or more of focus blur, motion blur, eyelid occlusion, lighting reflection occlusion, and off-angle.
- 5-6. The control means estimates a quality of the target image, which indicates suitability as an image to be used for authentication, using the degradation information, and outputs the control information according to the quality.
- 5-7. The program causes the computer to further function as: a first authentication means for performing authentication using the target image; and a second authentication means for performing authentication using information different from the target image, and the control means increases the importance of authentication by the second authentication means when the quality satisfies a predetermined condition A.
- 5-8. The program causes the computer to further function as: a first authentication means for performing authentication using the target image; and a second authentication means for performing authentication using information different from the target image, and the control means increases the importance of authentication by the second authentication means when the degradation information satisfies a predetermined condition B.
- 5-9. The state information indicates one or more of face direction, body posture, gaze direction, movement speed, and whether or not glasses are worn.
- 5-10. The status information further indicates imaging conditions of the camera.
- 5-11. The status information indicates one or more of the exposure time of the camera, the lighting conditions for the object, the focus position of the camera, the lens aperture of the camera, and the brightness of the area captured by the camera.
- 5-12. The estimation means generates the degradation information using a trained neural network that has been trained using the state information and correct degradation information.
- 5-13. The control means outputs the control information for executing a notification when the deterioration information satisfies a predetermined condition C.
- 5-14. In the program according to any one of 5-1., the control means stops the camera from capturing an image of the target when the deterioration information satisfies a predetermined condition D.
- 5-15. The control means outputs the control information for controlling the imaging conditions of the camera based on the deterioration information.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
An information processing system (50) comprises a camera (20), an acquisition unit (110), an estimation unit (130), and a control unit (150). The camera (20) can capture an image of a target. The acquisition unit (110) acquires status information indicating at least a status of the target. The estimation unit (130) generates deterioration information using the status information. The deterioration information relates to deterioration estimated to occur in the target image obtained by capturing an image of the target with the camera (20). The control unit (150) outputs control information corresponding to the deterioration information before the image of the target is captured by the camera (20) and/or when the image of the target is captured by the camera (20).
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/040864 WO2024095362A1 (fr) | 2022-11-01 | 2022-11-01 | Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement |
| JP2024553980A JPWO2024095362A1 (fr) | 2022-11-01 | 2022-11-01 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2022/040864 WO2024095362A1 (fr) | 2022-11-01 | 2022-11-01 | Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024095362A1 true WO2024095362A1 (fr) | 2024-05-10 |
Family
ID=90930106
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/040864 Ceased WO2024095362A1 (fr) | 2022-11-01 | 2022-11-01 | Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JPWO2024095362A1 (fr) |
| WO (1) | WO2024095362A1 (fr) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006157428A (ja) * | 2004-11-29 | 2006-06-15 | Fuji Photo Film Co Ltd | 撮影装置及び撮影方法 |
| JP2008028434A (ja) * | 2006-07-18 | 2008-02-07 | Matsushita Electric Ind Co Ltd | 撮影装置、認証装置および撮影方法 |
- 2022
- 2022-11-01 WO PCT/JP2022/040864 patent/WO2024095362A1/fr not_active Ceased
- 2022-11-01 JP JP2024553980A patent/JPWO2024095362A1/ja active Pending
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006157428A (ja) * | 2004-11-29 | 2006-06-15 | Fuji Photo Film Co Ltd | 撮影装置及び撮影方法 |
| JP2008028434A (ja) * | 2006-07-18 | 2008-02-07 | Matsushita Electric Ind Co Ltd | 撮影装置、認証装置および撮影方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2024095362A1 (fr) | 2024-05-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10878237B2 (en) | Systems and methods for performing eye gaze tracking | |
| JP6582604B2 (ja) | 瞳孔検出プログラム、瞳孔検出方法、瞳孔検出装置および視線検出システム | |
| EP3153092B1 (fr) | Système de détection de pupille, système de détection du regard, procédé de détection de pupille, et programme de détection de pupille | |
| US8477996B2 (en) | Method and device for finding and tracking pairs of eyes | |
| JP5024067B2 (ja) | 顔認証システム、方法及びプログラム | |
| JP5106459B2 (ja) | 立体物判定装置、立体物判定方法及び立体物判定プログラム | |
| US10417782B2 (en) | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program | |
| US10248852B2 (en) | Method for recognizing facial expression of headset wearing user and apparatus enabling the same | |
| JP6822482B2 (ja) | 視線推定装置、視線推定方法及びプログラム記録媒体 | |
| US10817722B1 (en) | System for presentation attack detection in an iris or face scanner | |
| US10402996B2 (en) | Distance measuring device for human body features and method thereof | |
| CN107203743B (zh) | 一种人脸深度跟踪装置及实现方法 | |
| WO2020079741A1 (fr) | Dispositif d'authentification d'iris, procédé d'authentification d'iris et support d'enregistrement | |
| US20170243061A1 (en) | Detection system and detection method | |
| WO2020195732A1 (fr) | Dispositif de traitement d'image, procédé de traitement d'image, et support d'enregistrement dans lequel un programme est stocké | |
| Rigas et al. | Gaze estimation as a framework for iris liveness detection | |
| WO2024095362A1 (fr) | Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement | |
| US12243351B2 (en) | Gaze estimation apparatus, gaze estimation method, model generation apparatus, and model generation method | |
| US12424025B2 (en) | Personal authentication apparatus and control method | |
| CN112528713A (zh) | 一种注视点估计方法、系统、处理器及设备 | |
| JP2016045707A (ja) | 特徴点検出システム、特徴点検出方法、および特徴点検出プログラム | |
| JP2017202038A (ja) | 判別装置、判別方法、および判別プログラム | |
| US20250104468A1 (en) | Information processing apparatus, information processing method, and recording medium | |
| CN112528714A (zh) | 基于单光源的注视点估计方法、系统、处理器及设备 | |
| JP6003277B2 (ja) | 診断支援装置および診断支援方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22964384; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024553980; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22964384; Country of ref document: EP; Kind code of ref document: A1 |