WO2012111664A1 - Authentication Device, Authentication Program, and Authentication Method - Google Patents
Authentication device, authentication program, and authentication method
- Publication number
- WO2012111664A1 (PCT/JP2012/053395)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- authentication
- unit
- image
- surface information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1347—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
- G06V10/993—Evaluation of the quality of the acquired pattern
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/60—Static or dynamic means for assisting the user to position a body part for biometric acquisition
- G06V40/67—Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/34—Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
Definitions
- the present invention relates to an authentication apparatus, an authentication program, and an authentication method for performing personal authentication using biometric information.
- the human body includes biological information that can identify an individual, and some of them are used as information for identifying and authenticating the individual.
- biometric information that can be used for authentication includes fingerprints, eye retinas and irises, faces, blood vessels, and DNA (Deoxyribo Nucleic Acid).
- biometric authentication is performed by comparing biometric information (registration template) collected during registration with biometric information acquired during authentication.
- there is a biometric authentication apparatus that guides the living body to an appropriate position in order to acquire biometric information that can be compared with a registered template.
- a biometric authentication device using a palm vein detects a palm position shift and guides the palm to a position overlapping with a registered template (see, for example, Patent Document 1).
- if guidance to the proper posture of the living body occurs frequently, the biometric authentication device may be judged by the user to be hard to use.
- near the boundary at which guidance becomes necessary, the user may be guided to a posture different from the one the user believes to be proper, or conversely may receive no guidance even while doubting whether the posture is proper. Such experiences make it difficult for the user to learn the true proper posture.
- a biometric authentication device that allows such experiences is not easy to use, so it is desirable to reduce the frequency of guidance near the boundary at which guidance to an appropriate posture of the living body becomes necessary.
- the present invention has been made in view of these points, and an object thereof is to provide an authentication device, an authentication program, and an authentication method that can expand the allowable range of the posture of a living body.
- the authentication device includes a surface information correction unit and an authentication unit.
- the surface information correction unit corrects the surface information regarding the surface of the living body extracted from the image information obtained by photographing the living body.
- the authentication unit performs biometric authentication using the corrected surface information.
- an authentication program for causing a computer to execute a process of personal authentication using characteristics of a living body corrects surface information, relating to the surface of the living body, extracted from image information obtained by photographing the living body, and performs biometric authentication using the corrected surface information.
- an authentication method, executed by a computer, for performing personal authentication using characteristics of a living body corrects surface information relating to the surface of the living body extracted from image information obtained by photographing the living body, and performs biometric authentication using the corrected surface information.
- the allowable range of the posture of the living body can be expanded.
- FIG. 1 is a diagram illustrating a configuration of an authentication apparatus according to the first embodiment.
- the authentication device 1 is an authentication device that performs personal authentication using a living body.
- the living body used by the authentication device 1 for personal authentication is a palm vein.
- the authentication device 1 performs personal authentication by comparing and collating image information, obtained by the sensor unit photographing the palm, with a template registered in advance.
- the sensor unit includes an image sensor, shoots a living body as a subject, generates image information of the living body, and outputs the image information to the authentication device 1.
- the image information is biological image data (for example, including position information and luminance information), and is generated in a predetermined image format.
- the template is data collected in advance from a living body for use in matching of the living body.
- the authentication device 1 includes a surface information extraction unit 2, a surface information analysis unit 3, a surface information correction unit 4, and an authentication unit 5.
- the surface information extraction unit 2 extracts surface information that can evaluate the unevenness of the palm surface (biological surface) from image information obtained by photographing the palm (biological body).
- the surface information is information that can evaluate unevenness on the surface of the living body, and is, for example, luminance information.
- the surface information analysis unit 3 analyzes the unevenness of the palm surface from the surface information.
- the surface information correction unit 4 corrects the surface information for unevenness of the palm surface. For example, the surface information correction unit 4 performs a correction that reduces the unevenness of the palm surface in the surface information.
- the authentication unit 5 performs biometric authentication using the corrected surface information.
- when the surface information cannot be corrected, the authentication device 1 re-acquires palm image information.
- the authentication device 1 performs notification for guiding the palm to the proper posture.
- the person to be authenticated receives the notification and corrects the posture of the palm.
- because the surface information correction unit 4 performs the correction, the authentication unit 5 can more often perform biometric authentication using the corrected surface information, and the authentication device 1 less often needs to notify the person to be authenticated in order to guide the palm to an appropriate posture.
- the authentication device 1 can reduce the guidance frequency in the vicinity of the boundary whether or not guidance to the appropriate posture of the palm is necessary.
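The pipeline of the first embodiment can be sketched as follows. This is a minimal, illustrative model (all function names, the toy matcher, and the `CORRECTABLE_RANGE` limit are my own assumptions, not taken from the patent): extract luminance as surface information, analyze its unevenness, correct it when the unevenness is within a correctable range, and only fall back to re-acquisition with guidance otherwise.

```python
# Hypothetical sketch of the first-embodiment flow; luminance values are
# given as a nested list of numbers (rows of pixels).

def extract_surface_information(image):
    """Surface information here is simply the per-pixel luminance."""
    return [row[:] for row in image]

def analyze_unevenness(surface):
    """Measure unevenness as the spread between brightest and darkest pixels."""
    flat = [v for row in surface for v in row]
    return max(flat) - min(flat)

def correct_surface(surface):
    """Reduce unevenness by pulling every pixel toward the mean luminance."""
    flat = [v for row in surface for v in row]
    mean = sum(flat) / len(flat)
    return [[(v + mean) / 2 for v in row] for row in surface]

def authenticate(surface, template, threshold=10.0):
    """Toy matcher: mean absolute luminance difference against the template."""
    diff = sum(abs(a - b) for ra, rb in zip(surface, template)
               for a, b in zip(ra, rb))
    n = sum(len(row) for row in surface)
    return diff / n < threshold

CORRECTABLE_RANGE = 120  # assumed limit; beyond this the image is re-acquired

def process(image, template):
    surface = extract_surface_information(image)
    if analyze_unevenness(surface) > CORRECTABLE_RANGE:
        return "reacquire"           # guide the user and capture again
    corrected = correct_surface(surface)
    return "match" if authenticate(corrected, template) else "mismatch"
```

A mildly uneven palm image is corrected and matched; a severely uneven one triggers re-acquisition, which is exactly how the correction reduces the frequency of guidance near the boundary.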
- since biometric information used in biometric authentication is acquired from a living body, it fluctuates under the influence of biological activity. That is, biometric information is not always constant and can vary within a certain range. Specifically, it is affected by respiration, muscle tension and relaxation, blood vessel dilation and contraction, and the like. Even if the person to be authenticated tries to remain still, the biometric information varies slightly and unconsciously under the influence of such biological activity.
- the biological information is affected by the environmental change when the biological information is acquired.
- the environment for acquiring biometric information differs each time it is acquired, and cannot be said to be constant.
- the ambient light when acquiring biometric information is affected by outdoor light when outdoors; even indoors, it changes with the lighting cycle of indoor lights such as fluorescent lamps and with the movement of people nearby.
- the authentication device 1 corrects the biometric information to reduce such fluctuation of the biometric information and the influence of environmental changes. The authentication device 1 thereby reduces the opportunities for notifying the user in order to guide the living body to an appropriate posture.
- FIG. 2 is a diagram illustrating a configuration of the authentication system according to the second embodiment.
- a system in which the authentication system 10 performs authentication using a palm vein is exemplified, but the present invention is not limited to this, and the present invention can also be applied to a system that performs authentication at another feature detection site of a living body.
- the authentication system 10 is a system that recognizes characteristics of a living body and identifies and authenticates an individual, and can be used for information system log-on, entrance / exit management, and the like.
- the authentication system 10 includes an authentication device 20, an authentication device 30, an authentication device 40, an authentication server 50, and a network 51.
- the authentication device 20, the authentication device 30, and the authentication device 40 are devices that perform biometric authentication when authenticating a user.
- the authentication device 20 is a computer that performs user authentication, and is, for example, a business terminal device that requires a high security level.
- the authentication device 30 is an ATM (Automated Transaction Machine) installed by a financial institution.
- the authentication device 40 is a security area entry / exit management device.
- the authentication server 50 associates and stores identification information for identifying the user and biometric information (template) registered in advance before biometric authentication.
- the identification information for identifying the user is a unique ID (IDentification) given to the user directly (for example, a user number) or indirectly (for example, an account number).
- the biometric information registered in advance is feature information obtained by extracting a feature portion from image information, encoded information obtained by encoding image information or feature information, and the like.
- the authentication server 50 is a database server of the authentication system 10 and includes various databases (for example, an NG biometric information database, an NG environment information database, and a guidance information database described later).
- the network 51 connects the authentication device 20, the authentication device 30, the authentication device 40, and the authentication server 50 so that they can communicate with each other by wire or wirelessly.
- the various databases may be included in the authentication device 20, the authentication device 30, and the authentication device 40 instead of the authentication server 50.
- the authentication device 20 includes a processing device 21, a display 22, and a sensor unit built-in mouse 24.
- the authentication device 20 includes a keyboard 23, an IC (Integrated Circuit) card reader / writer 25, and the like as necessary.
- the sensor unit built-in mouse 24 incorporates a sensor unit.
- the sensor unit includes an imaging device, captures the palm of the user, and outputs a captured image to the processing device 21.
- the IC card reader / writer 25 reads and writes information on the user's IC card 26.
- the keyboard 23 and the sensor unit built-in mouse 24 accept input operations.
- a user who requests authentication inputs identification information (for example, a user ID) for identifying the user through the keyboard 23, the sensor unit built-in mouse 24, or the IC card reader / writer 25.
- the authentication device 20 guides the user to input biometric information through display using the display 22.
- the user inputs biological information by holding his / her hand over the sensor unit built-in mouse 24.
- the authentication device 20 that has input a palm vein image as biometric information collates the input vein image (biometric information) with the registered template.
- the registration template can be recorded in the storage unit of the processing device 21, the storage unit of the authentication server 50, or the storage unit of the user's IC card 26.
- the authentication device 30 includes a sensor unit 31.
- the sensor unit 31 includes an imaging device and photographs the palm of the user.
- the authentication device 30 authenticates the user using the captured image.
- the authentication device 30 includes an IC card reader / writer (not shown) and performs verification using a registration template stored in an IC card (for example, an IC chip built-in type cash card).
- the authentication device 40 includes a numeric keypad 41, an IC card reader / writer 42, and a sensor unit 43.
- the numeric keypad 41 is used to input a personal identification number when authentication with a personal identification number is used together.
- the IC card reader / writer 42 reads and writes information on a user's IC card (not shown).
- the sensor unit 43 includes an imaging device and photographs the palm of the user.
- the authentication device 40 authenticates the user using the registered template stored in the IC card and the photographed image, and controls the opening / closing of the door 44.
- FIG. 3 is a diagram illustrating a configuration of the authentication device according to the second embodiment.
- the authentication device 20 includes a control unit 200, a storage unit 201, a notification unit 202, and a communication unit 203. Further, the authentication device 20 includes an image input unit 204, an object extraction unit 205, a palm determination unit 206, a palm cutout unit 207, an outer shape correction unit 208, and a surface information extraction unit 209. Furthermore, the authentication device 20 includes a surface information analysis unit 210, a surface information correction unit 211, an NG information acquisition unit 212, a guidance method selection unit 213, a biological information extraction unit 214, and a verification unit 215.
- the control unit 200 performs overall control of each processing unit and performs user authentication.
- the storage unit 201 stores and holds image information acquired from the sensor unit built-in mouse 24, various databases, and the like.
- the notification unit 202 generates required display messages, such as guidance for the user on how to hold the palm over the sensor unit built-in mouse 24 and notification of the success or failure of collation, and displays them on the display 22.
- the notification unit 202 also generates required voice messages, such as guidance on how to hold the palm over the sensor unit built-in mouse 24 and notification of the success or failure of collation, and outputs them from a speaker (not shown).
- the communication unit 203 performs communication with the sensor unit built in the sensor unit built-in mouse 24, communication with the IC chip built in the IC card reader / writer 25, and communication with the computer connected to the network 51.
- the image input unit 204 inputs a captured image of the living body from the sensor unit built-in mouse 24.
- the object extraction unit 205 removes the background from the captured image and extracts the subject.
- the palm determination unit 206 determines whether or not the subject is a palm. If the palm determination unit 206 determines that the subject is not a palm, the image input unit 204 again inputs a captured image of the living body from the sensor unit built-in mouse 24. At this time, the notification unit 202 may guide the operation of holding the palm.
- the palm cutout unit 207 cuts out a palm (including a finger or a wrist) from the subject that the palm determination unit 206 has determined to be a palm.
- the outer shape correction unit 208 corrects the extracted palm image to the correct position (front/back/left/right position correction), size (up/down height correction), and orientation (rotation correction).
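The position and size corrections can be pictured as mapping the palm contour into a canonical frame. The sketch below is an assumed, simplified version (the patent does not give the algorithm): translate the contour so its centroid sits at the origin, then scale it to a fixed extent; rotation correction would follow the same pattern with an additional rotation step.

```python
def normalize_contour(points, target_extent=100.0):
    """Translate a contour's centroid to the origin and scale to a fixed size.

    points: list of (x, y) tuples describing the palm outline.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    centered = [(x - cx, y - cy) for x, y in points]
    # Largest coordinate magnitude defines the current extent.
    extent = max(max(abs(x) for x, _ in centered),
                 max(abs(y) for _, y in centered)) or 1.0
    s = target_extent / extent
    return [(x * s, y * s) for x, y in centered]
```

Because the centroid is subtracted first, a palm held anywhere over the sensor normalizes to the same contour, which is the point of position correction.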
- the surface information extraction unit 209 extracts surface information from the palm image corrected by the external shape correction unit 208. Specifically, the surface information extraction unit 209 extracts luminance (luminance information) from the palm image as surface information.
- the surface information extraction unit 209 is not limited to the luminance as the surface information, but may be brightness extracted from a palm image.
- the surface information extraction unit 209 may acquire distance information from the sensor unit built-in mouse 24 as information accompanying the palm image, and may acquire the distance between the distance measuring sensor and the palm surface as surface information.
- the surface information analysis unit 210 analyzes the uneven part of the palm from the surface information extracted by the surface information extraction unit 209.
- the surface information correction unit 211 corrects the uneven portion in the correctable range among the uneven portions analyzed by the surface information analysis unit 210.
- the surface information analysis unit 210 obtains a palm image in which the uneven portion is corrected or a palm image in which the correction is not required for the uneven portion. If the surface information correction unit 211 determines that the uneven portion is not within the correctable range, the image input unit 204 again inputs a captured image of the living body from the sensor unit built-in mouse 24. At this time, the notification unit 202 may guide the operation of holding the palm.
- the biological information extraction unit 214 extracts biological information used for collation from the palm image obtained by the surface information analysis unit 210. Specifically, the biometric information extraction unit 214 extracts a vein pattern in the palm image or verification information included in the vein pattern.
- the matching information includes, for example, the feature points (vein end points and branch points) contained in the vein pattern, the number of veins crossed by a straight line connecting a feature point to a neighboring feature point, and a small image centered on a feature point.
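On a thinned (one-pixel-wide) vein skeleton, the end points and branch points mentioned above can be located by counting 8-neighbours; this is a standard technique and an illustration on my part, not the patent's stated algorithm: a skeleton pixel with exactly one neighbour is an end point, and one with three or more is a branch point.

```python
def find_feature_points(skeleton):
    """Return (end_points, branch_points) of a binary skeleton.

    skeleton: nested list of 0/1 values, 1 marking a vein-skeleton pixel.
    """
    h, w = len(skeleton), len(skeleton[0])
    ends, branches = [], []
    for y in range(h):
        for x in range(w):
            if not skeleton[y][x]:
                continue
            # Count set pixels among the 8-neighbours (clipped at the border).
            n = sum(skeleton[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))
                    if (yy, xx) != (y, x))
            if n == 1:
                ends.append((y, x))
            elif n >= 3:
                branches.append((y, x))
    return ends, branches
```

A straight skeleton segment yields its two tips as end points, and a crossing yields its centre as a branch point.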
- the collation unit 215 compares the biometric information (collation information) extracted by the biometric information extraction unit 214 with a registered template registered in advance.
- the authentication device 20 corrects slight fluctuations in the biological information and environmental changes by correcting the uneven portions.
- the authentication device 20 reduces the chance of performing a notification for guiding the living body to an appropriate posture by reducing fluctuations of biological information and the influence of environmental changes.
- the authentication device 20 has a function for executing a process for performing more appropriate guidance when the surface information correction unit 211 determines that the uneven portion is not within a correctable range.
- the NG information acquisition unit 212 registers a photographed image that could not be used for collation as an NG image in the NG biological information database.
- the NG information acquisition unit 212 registers environment information when shooting a captured image that could not be used for collation in the NG environment information database.
- the guidance method selection unit 213 refers to the guidance information database and selects a guidance method corresponding to a captured image that could not be used for collation. The guidance method selected by the guidance method selection unit 213 is notified to the user by the notification unit 202.
- the authentication device 20 can notify the user by presenting an appropriate guidance method when the captured image cannot be used for verification. Accumulation of the cause of failure in the NG biological information database and the NG environment information database contributes to enhancement of the guidance method selected by the guidance method selection unit 213.
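The guidance selection can be thought of as a table lookup keyed by the diagnosed cause of failure. The cause labels and messages below are purely illustrative assumptions (the patent does not enumerate them); the guidance information database would play the role of the table.

```python
# Hypothetical guidance-information table: failure cause -> message to show.
GUIDANCE_TABLE = {
    "too_dark":   "Hold your palm closer to the sensor.",
    "too_bright": "Move your palm away from the sensor.",
    "tilted":     "Hold your palm flat over the sensor.",
}

def select_guidance(cause):
    """Look up the guidance message for a diagnosed cause, with a fallback."""
    return GUIDANCE_TABLE.get(cause, "Hold your palm over the sensor again.")
```

Accumulating NG images and their environment information lets the table grow more specific causes over time, which is the enhancement the passage above describes.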
- FIG. 4 is a diagram illustrating a hardware configuration example of the authentication apparatus according to the second embodiment.
- the authentication device 20 includes a processing device 21, a display 22, a keyboard 23, a sensor unit built-in mouse 24, and an IC card reader / writer 25.
- the entire processing apparatus 21 is controlled by a CPU (Central Processing Unit) 101.
- a RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphic processing device 105, and an input / output interface 106 are connected to the CPU 101 via a bus 107.
- the RAM 102 temporarily stores at least part of an OS (Operating System) program and application programs to be executed by the CPU 101.
- the RAM 102 stores various data necessary for processing by the CPU 101.
- the HDD 103 stores an OS and application programs.
- a display 22 is connected to the graphic processing device 105.
- the graphic processing device 105 displays an image on the screen of the display 22 in accordance with a command from the CPU 101.
- a keyboard 23, a sensor unit built-in mouse 24, and an IC card reader / writer 25 are connected to the input / output interface 106.
- the input / output interface 106 can be connected to a portable recording medium interface that can write information to the portable recording medium 110 and read information from the portable recording medium 110.
- the input / output interface 106 transmits signals sent from the keyboard 23, the sensor unit built-in mouse 24, the IC card reader / writer 25, and the portable recording medium interface to the CPU 101 via the bus 107.
- the input / output interface 106 may be connected to the sensor unit built-in mouse 24 by USB (Universal Serial Bus).
- the USB connection is desirably USB 2.0 or later, which can be connected in the high speed mode, because the processing device 21 receives a captured image from the sensor unit.
- the communication interface 104 is connected to the network 51.
- the communication interface 104 transmits / receives data to / from the authentication server 50.
- the processing functions of the present embodiment can be realized.
- the authentication device 30, the authentication device 40, and the authentication server 50 can also be realized with the same hardware configuration.
- the processing device 21 can be configured to include modules each composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), or the like, or can be configured without the CPU 101.
- the processing device 21 includes a nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory), flash memory, or a flash-memory-type memory card) that stores the module firmware.
- the nonvolatile memory can write firmware via the portable recording medium 110 or the communication interface 104.
- the processing device 21 can also update the firmware by rewriting the firmware stored in the nonvolatile memory.
- FIG. 5 is a diagram illustrating a configuration of a sensor unit according to the second embodiment.
- the sensor unit 31 and the sensor unit 43 can have the same configuration as the sensor unit 24a.
- the sensor unit 24a is built into the sensor unit built-in mouse 24.
- the sensor unit 24a includes a control unit 24b, a photographing unit 24c, a distance measuring unit 24d, a storage unit 24e, and a communication unit 24f.
- the control unit 24b comprehensively controls each processing unit.
- the imaging unit 24c acquires image information from a living body that is a subject.
- the imaging unit 24c includes an image sensor (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of near-infrared light-emitting elements (for example, LEDs: Light Emitting Diodes) that irradiate the subject.
- the near-infrared light emitting element is provided, for example, around the image sensor, emits near-infrared light toward the subject direction (upward), and the image sensor photographs the subject irradiated with the near-infrared ray.
- the image sensor can capture the palm of the subject in a wide range of the shooting range through a condenser lens (wide angle lens).
- the distance measuring unit 24d acquires distance information with respect to the living body that is the subject.
- the sensor unit 24a can photograph the palm within a predetermined range by measuring the photographing timing with the distance measuring sensor.
- the imaging unit 24c may continuously shoot at a predetermined timing (for example, shoot 15 frames per second), and use one or a plurality of captured images for collation.
- the storage unit 24e stores the image information acquired by the photographing unit 24c and the distance information acquired by the distance measuring unit 24d in association with the image information.
- the communication unit 24f is connected to the communication unit 203 of the processing device 21, and receives an instruction from the processing device 21, and transmits image information and distance information.
- the image photographed by the sensor unit 24a is obtained by irradiating the living body (palm) as the subject with near-infrared light and photographing the reflected light. Because hemoglobin in the red blood cells flowing through the veins has lost its oxygen, this hemoglobin (reduced hemoglobin) absorbs near-infrared light in the vicinity of 700 nm to 1000 nm. Therefore, when near-infrared light is applied to the palm, only the portions where veins are present reflect less light, and the positions of the veins can be recognized from the intensity of the reflected near-infrared light. Although the use of a specific light source makes the captured image achromatic, it also makes the characteristic information easy to extract.
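The principle above suggests a very simple way to locate vein candidates, shown here as an illustrative sketch (real systems use far more robust extraction): mark the pixels whose luminance falls well below the palm's average reflection, since vein regions reflect less near-infrared light. The `ratio` parameter is an assumed tuning constant.

```python
def vein_candidates(image, ratio=0.6):
    """Return (row, col) positions whose luminance is far below the mean.

    image: nested list of luminance values from an NIR capture.
    ratio: fraction of the mean luminance used as the darkness threshold.
    """
    flat = [v for row in image for v in row]
    threshold = ratio * (sum(flat) / len(flat))
    return [(y, x)
            for y, row in enumerate(image)
            for x, v in enumerate(row)
            if v < threshold]
```

On a bright palm with one dark (vein) pixel, only that pixel is returned as a candidate.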
- FIG. 6 is a flowchart of authentication processing according to the second embodiment.
- the processing device 21 receives an authentication request from the user, executes an authentication process, and obtains an authentication result.
- the processing device 21 acquires a registered template.
- the registration template can be acquired from the storage unit (HDD 103) of the processing device 21, the storage unit of the authentication server 50, or the storage unit of the user's IC card 26.
- the processing device 21 acquires a registration template corresponding to the user based on information (for example, a user ID, a card ID, etc.) that can uniquely identify the registration template. If the user is not specified at the start of the authentication process, the registration template may be acquired immediately before verification.
- Step S12 The processing device 21 acquires a captured image used for authentication from the sensor unit built-in mouse 24 (image input unit 204).
- Step S13 The processing device 21 extracts a subject from the acquired captured image (object extraction unit 205), determines whether or not the extracted subject is a palm (palm determination unit 206), cuts out the palm from the subject determined to be a palm (palm cutout unit 207), and corrects the extracted palm image to the normal position (outer shape correction unit 208).
- Step S14 The processing device 21 determines whether or not the palm image corrected to the normal position can be used for collation. This determination is performed by comparing the palm image corrected to the normal position and the image of the registered template, or comparing the palm image corrected to the normal position and the model. The processing device 21 proceeds to step S19 if it determines that the palm image corrected to the normal position can be used for collation, and proceeds to step S15 if it determines that it cannot be used for collation.
- Step S15 The processing device 21 extracts surface information from the palm image corrected to the normal position (surface information extraction unit 209).
- the processing device 21 analyzes the uneven portion of the palm from the extracted surface information and determines whether or not the surface information can be corrected (surface information analysis unit 210). If it is determined that the surface information can be corrected, the processing device 21 proceeds to step S18, and if it is not determined that the surface information can be corrected, the processing device 21 proceeds to step S16.
- the processing device 21 executes an NG information acquisition process (NG information acquisition unit 212).
- the NG information acquisition process is a process of registering an NG image in the NG biological information database and registering environment information when the NG image is captured in the NG environment information database.
- Step S17 The processing device 21 notifies the user that re-acquisition is to be performed in order to re-acquire the authentication image. Further, the processing device 21 provides guidance for placing the user's palm in the normal position in order to reacquire an appropriate authentication image (guidance method selection unit 213, notification unit 202).
- the processing device 21 corrects the surface information when the surface information is within a correctable range (surface information correction unit 211).
- the processing device 21 extracts biometric information used for collation from a palm image in which surface information is corrected or a palm image that does not require correction of surface information (biological information extraction unit 214).
- Step S20 The processing device 21 compares the extracted biometric information with a registered template registered in advance (verification unit 215).
- Step S21 The processing device 21 proceeds to step S22 as a collation match if the collation matches, that is, if the degree of coincidence is evaluated to exceed a predetermined threshold value, and proceeds to step S23 as a collation mismatch otherwise.
- Step S22 The processing device 21 receives the collation match, determines the identity of the person, and ends the authentication process after executing the required process associated with the successful authentication.
- Step S23 The processing device 21 receives the collation mismatch, rejects the person, and ends the authentication process after executing the required process associated with the authentication failure.
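The branch structure of steps S14 to S23 described above can be sketched as follows. This is an illustrative Python sketch only; the predicate and processing functions are hypothetical stand-ins for the units of the processing device 21 and are passed in as parameters.

```python
def authenticate(image, *, usable, extract_surface, correctable, correct,
                 extract_features, collate, template, threshold=0.8):
    """Illustrative sketch of the branch structure of steps S14-S23."""
    if not usable(image):                        # step S14: usable for collation?
        surface = extract_surface(image)         # step S15: extract surface info
        if not correctable(surface):             # step S15: analyze correctability
            # step S16 (NG information acquisition) and step S17 (guidance)
            # would run here before re-acquiring the authentication image
            return "retry"
        image = correct(image, surface)          # step S18: correct surface info
    features = extract_features(image)           # step S19: extract biometric info
    score = collate(features, template)          # step S20: collate with template
    return "match" if score > threshold else "mismatch"  # steps S21-S23
```

For example, with trivial stand-in functions, an image judged usable and identical to the template yields "match", while an uncorrectable image yields "retry".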
- FIG. 7 is a flowchart of surface information extraction processing according to the second embodiment.
- FIG. 8 is a diagram illustrating an evaluation unit of surface information according to the second embodiment.
- FIG. 9 is a diagram illustrating normalization of the luminance distribution according to the second embodiment.
- FIG. 10 is a diagram illustrating an example of surface information extraction according to the second embodiment. The surface information extraction process is executed prior to the determination of whether or not the surface information can be corrected in step S15 of the authentication process.
- the surface information extraction unit 209 acquires luminance information of the entire palm from the palm image.
- the region from which the surface information extraction unit 209 obtains luminance information is the entire palm region 61 corresponding to the palm portion of the hand 60 (see FIG. 8).
- the entire palm area 61 is, for example, a circular area, but is not limited thereto, and may be an area surrounded by an ellipse, a rectangle, another polygon, a free closed curve, or the like.
- the surface information extraction unit 209 divides the entire palm region 61 into a plurality of partial regions 62.
- the arrangement position of each partial area 62 is set in advance.
- Each partial area 62 is arranged so as to partially overlap the adjacent partial areas (see FIG. 8).
- the surface information extraction unit 209 acquires luminance information for one partial region 62.
- the surface information extraction unit 209 normalizes the acquired luminance information.
- the surface information extraction unit 209 normalizes the luminance information by shifting the luminance distribution and correcting the luminance distribution range. For example, the surface information extraction unit 209 generates a luminance distribution 64 in a gray scale (0 to 255) range from the acquired luminance information.
- the luminance distribution 64 has a distribution range lower limit Th01, a distribution range upper limit Th02, and a distribution range W0.
- the surface information extraction unit 209 corrects and normalizes the luminance distribution 64 to the distribution range lower limit Th11, the distribution range upper limit Th12, and the distribution range W1 of the model luminance distribution 65 (see FIG. 9).
- This normalization makes it possible to objectively evaluate the state of the palm surface information even if there are variations among subjects, such as palm color and gloss, and variations in the shooting environment, such as outdoors or indoors.
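The shift-and-rescale of the distribution range described above can be sketched as follows, under the assumption (not stated in the text) that the distribution range limits Th01 and Th02 are taken as the minimum and maximum of the observed luminance values.

```python
def normalize_luminance(values, model_lo=0, model_hi=255):
    """Shift and rescale observed luminance so that its distribution range
    (Th01, Th02) coincides with the model range (Th11, Th12)."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # flat distribution: nothing to rescale
        return [float(model_lo)] * len(values)
    scale = (model_hi - model_lo) / (hi - lo)
    return [model_lo + (v - lo) * scale for v in values]
```

For instance, luminance values spanning 40 to 120 are mapped onto the full 0 to 255 gray scale.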
- the surface information extraction unit 209 divides the partial region 62 into a plurality of minute partial regions 63.
- the arrangement position of each minute partial region 63 is set in advance.
- Each minute partial region 63 is arranged so as to partially overlap the adjacent minute partial regions (see FIG. 8).
- the surface information extraction unit 209 acquires corrected luminance information for one minute partial region 63.
- the surface information extraction unit 209 evaluates the magnitude of the luminance and the size of the luminance distribution for the one minute partial region 63 from which luminance information has been acquired. For example, the surface information extraction unit 209 evaluates the magnitude of the luminance (brightness) and the size of the luminance distribution by comparison with ideal values. More specifically, the surface information extraction unit 209 evaluates the size of the luminance distribution based on the luminance distribution range of a predetermined range including the mode value, and evaluates the magnitude of the luminance based on the average luminance of the distribution range under evaluation.
- the evaluation of the magnitude of the brightness includes “pretty dark”, “dark”, “standard”, “bright”, and “pretty bright” as compared with the ideal value.
- the evaluation of the size of the luminance distribution includes “wide”, “narrow”, and “just right” compared to the ideal value.
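The five-level brightness labels and three-level distribution labels above can be assigned, for example, by threshold comparison against ideal values. The ideal values and label boundaries below are illustrative assumptions; the embodiment does not specify them.

```python
def evaluate_brightness(mean, ideal=128, step=30):
    """Classify average luminance relative to an assumed ideal value."""
    d = mean - ideal
    if d < -2 * step:
        return "pretty dark"
    if d < -step:
        return "dark"
    if d <= step:
        return "standard"
    if d <= 2 * step:
        return "bright"
    return "pretty bright"

def evaluate_spread(width, ideal_width=60, tolerance=15):
    """Classify the luminance distribution range against an assumed ideal width."""
    if width < ideal_width - tolerance:
        return "narrow"
    if width > ideal_width + tolerance:
        return "wide"
    return "just right"
```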
- Step S38 The surface information extraction unit 209 determines whether or not the evaluation of the luminance size and the luminance distribution has been completed for all the minute partial regions 63 for the partial region 62 under evaluation. The surface information extraction unit 209 proceeds to step S36 if the evaluation has not been completed for all the minute partial regions 63, and proceeds to step S39 if it has been completed.
- the surface information extraction unit 209 evaluates the corresponding partial area 62 from the evaluation of all the minute partial areas 63.
- the surface information extraction unit 209 calculates the luminance distribution size and luminance size of the partial region 62 based on the evaluation of the partial region 62.
- Step S40 The surface information extraction unit 209 determines whether or not the evaluation has been completed for all the partial areas 62 for the entire palm area 61. The surface information extraction unit 209 proceeds to step S33 if the evaluation has not been completed for all the partial regions 62, and ends the surface information extraction processing if it has been completed.
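The nested loop of steps S31 to S40 (partial regions over the entire palm region, minute partial regions within each partial region) can be sketched as follows; the region representation and the helper functions are illustrative assumptions.

```python
def extract_surface_information(palm, partial_regions, minute_regions_of,
                                normalize, evaluate):
    """Nested evaluation loop over the entire palm region (illustrative).

    For every partial region, the luminance is normalized and each of its
    minute partial regions is evaluated; the per-minute-region evaluations
    together form the evaluation of that partial region (step S39).
    """
    surface_info = {}
    for region in partial_regions:                   # outer loop: partial regions
        values = normalize(palm[region])             # normalize this region's luminance
        surface_info[region] = [evaluate(values, m)  # inner loop: minute regions
                                for m in minute_regions_of(region)]
    return surface_info
```

A toy example with one partial region split into two minute regions, using the mean luminance as the evaluation:

```python
palm = {"center": [40, 80, 120, 160]}
info = extract_surface_information(
    palm, ["center"],
    minute_regions_of=lambda r: [slice(0, 2), slice(2, 4)],
    normalize=lambda v: v,
    evaluate=lambda vals, m: sum(vals[m]) / len(vals[m]))
```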
- FIG. 10 shows examples in which the surface information extraction unit 209 extracts surface information from the entire palm region in this way. When the luminance distribution is within a predetermined range of the model (ideal values), the surface information extraction unit 209 extracts a uniform luminance distribution as the surface information for the entire palm region 61a, as shown in FIG. 10 (1). When the luminance distribution is non-uniform compared with the model, the surface information extraction unit 209 extracts the non-uniform luminance distribution as the surface information for the entire palm region 61b, as shown in FIG. 10 (2). The entire palm region 61b has a luminance distribution 67a and a luminance distribution 67b as non-uniform luminance distributions.
- the luminance distribution 67a is evaluated as “dark”, and the luminance distribution 67b is evaluated as “bright”. Since a dark portion indicates a concave portion and a bright portion indicates a convex portion, it can be understood that the palm in the entire palm region 61b is slightly rounded.
- the surface information extracting unit 209 extracts non-uniform luminance distribution as surface information for the entire palm region 61c as shown in FIG. 10 (3).
- the entire palm region 61c has a luminance distribution 67c and a luminance distribution 67d as non-uniform luminance distributions.
- the luminance distribution 67c is evaluated as “pretty dark”, and the luminance distribution 67d is evaluated as “pretty bright”.
- a considerably dark portion indicates a strong concave portion, and a considerably bright portion indicates a strong convex portion. Therefore, it can be understood that the palm in the entire palm region 61c is strongly rounded.
- FIG. 11 is a diagram illustrating a modified example of the surface information evaluation unit according to the second embodiment.
- the entire palm area 68 is an area of the hand 60 from which the surface information extraction unit 209 acquires luminance information.
- the entire palm area 68 is an area in which a plurality of partial areas 69 are gathered.
- the arrangement position of each partial area 69 is set in advance.
- The partial regions 69 do not overlap one another and are arranged adjacent to each other.
- One partial region 69 is a region in which a plurality of minute partial regions 70 are assembled.
- the arrangement position of each minute partial area 70 is set in advance.
- The minute partial areas 70 do not overlap one another and are arranged adjacent to each other.
- this modified example of the surface information evaluation unit differs from the second embodiment in that the entire palm region 68, the partial regions 69, and the minute partial regions 70 each have different shapes, whereas in the second embodiment the entire palm region 61, the partial regions 62, and the minute partial regions 63 each have similar shapes.
- FIG. 12 is a flowchart of surface information analysis processing according to the second embodiment.
- FIG. 13 is a diagram illustrating an example of surface information analysis according to the second embodiment. The surface information analysis process is executed after the surface information extraction in step S15 of the authentication process.
- the surface information analysis unit 210 acquires the surface information (the level of unevenness and its range) extracted by the surface information extraction unit 209 for each part.
- the part refers to a region obtained by dividing the palm into a plurality of parts, and is defined in advance.
- Each part has a corresponding relationship with one or more partial regions 62 described above.
- each part includes a central part 91 located at the center of the palm of the hand 60, an upper part 95 and a lower part 93 located above and below the central part 91, and a thumb part 94 and a little finger part 92 located on the left and right of the central part 91.
- each part may be defined corresponding to a human skeleton or muscle, for example, divided into six parts from a first metacarpal part to a fifth metacarpal part and a carpal part.
- Step S42 The surface information analysis unit 210 determines whether one of the plurality of parts has a strong convex range or a strong concave range. This determination is made by comparison with a predetermined threshold value. The surface information analysis unit 210 proceeds to step S43 when it determines that the part to be determined has either a strong convex range or a strong concave range, and proceeds to step S44 when it determines that the part has neither a strong convex range nor a strong concave range.
- Step S43 The surface information analysis unit 210 sets the retry flag and terminates the surface information analysis process, assuming that the region to be determined cannot be corrected.
- the authentication device 20 re-acquires the authentication image by setting the retry flag.
- Step S44 The surface information analysis unit 210 determines whether or not the part to be determined has a weak convex range. This determination is made by comparison with a predetermined threshold value. The surface information analysis unit 210 proceeds to step S45 when it determines that the part has a weak convex range, and proceeds to step S46 when it determines that it does not.
- Step S45 The surface information analysis unit 210 sets the convex correction flag for the part to be determined.
- Step S46 The surface information analysis unit 210 determines whether or not the part to be determined has a weak concave range. This determination is made by comparison with a predetermined threshold value. The surface information analysis unit 210 proceeds to step S47 when it determines that the part has a weak concave range, and proceeds to step S48 when it determines that it does not.
- the surface information analysis unit 210 sets a concave correction flag in the region to be determined.
- the surface information analysis unit 210 determines whether or not the analysis has been completed for all of the plurality of parts. If the analysis has not been completed for all of the plurality of parts, the surface information analysis unit 210 proceeds to step S42 in order to analyze the part that has not been analyzed yet. The surface information analysis unit 210 ends the surface information analysis process when the analysis is completed for all of the plurality of parts.
- the surface information analysis unit 210 analyzes the surface information (unevenness information on the palm surface) by determining, for all of the plurality of parts, whether the convex correction flag and the concave correction flag are set. Further, the surface information analysis unit 210 sets the retry flag if at least one of the plurality of parts has a strong convex range or a strong concave range, and determines that the image to be analyzed is not suitable for authentication.
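The flag-setting logic of steps S41 to S48 can be condensed as follows. The signed per-part unevenness level and the strong/weak thresholds are illustrative assumptions; the embodiment only states that each determination is made against a predetermined threshold.

```python
def analyze_surface(parts, strong=40, weak=15):
    """Sketch of the flag-setting logic of steps S41-S48.

    `parts` maps a part name (center, thumb, little finger, ...) to a
    signed unevenness level: positive = convex, negative = concave.
    """
    flags = {"retry": False, "convex": set(), "concave": set()}
    for name, level in parts.items():
        if abs(level) >= strong:       # step S42: strong unevenness found
            flags["retry"] = True      # step S43: not correctable, retry
            break                      # analysis terminates on the retry flag
        if level >= weak:              # step S44: weak convex range
            flags["convex"].add(name)  # step S45
        elif level <= -weak:           # step S46: weak concave range
            flags["concave"].add(name) # step S47
    return flags
```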
- the hand 60a is an example of surface information analysis having a strong convex portion 71a and a strong concave portion 72a. It can be evaluated that the hand 60a is in a state where the palm is depressed.
- the hand 60b is an example of surface information analysis having a strong convex portion 71b and a strong concave portion 72b. It can be evaluated that the hand 60b is in a state where the thumb is lowered with respect to the palm.
- the hand 60c is an example of surface information analysis having a strong convex portion 71c and a strong concave portion 72c.
- the hand 60c is in a state where the little finger is lowered with respect to the palm.
- since the analysis result of the surface information analysis unit 210 can evaluate the state of the hand 60, it is possible to provide simple guidance that points out an appropriate posture when the retry flag is set and the authentication image is reacquired.
- FIG. 14 is a flowchart of surface information correction processing according to the second embodiment.
- the surface information correction process is executed in step S18 of the authentication process.
- Step S51 The surface information correction unit 211 determines whether or not the convex correction flag is set for each part analyzed by the surface information analysis unit 210.
- the surface information correction unit 211 proceeds to step S52 when it is determined that the convex correction flag is set, and proceeds to step S53 when it is determined that the convex correction flag is not set.
- Step S52 The surface information correction unit 211 executes a surface reflection removal process for removing the surface reflection of the palm image. Details of the surface reflection removal processing will be described later with reference to FIGS.
- the surface information correction unit 211 determines whether there is comparison target data to be compared with palm images.
- the surface information correction unit 211 proceeds to step S56 when there is comparison target data, and proceeds to step S54 when there is no comparison target data.
- the comparison target data is, for example, a user registration template.
- the registration template can be acquired based on the user ID input by the user.
- when there is no registered template, the surface information correction unit 211 can use a standard model as the comparison target data.
- the comparison target data in the case of using the registration template may be not the registration template itself but a limit model provided with a predetermined margin range based on the registration template.
- Step S54 The surface information correction unit 211 determines whether or not the concave correction flag is set for each part analyzed by the surface information analysis unit 210.
- the surface information correction unit 211 proceeds to step S55 when it is determined that the concave correction flag is set, and ends the surface information correction processing when it is determined that the concave correction flag is not set.
- the surface information correction unit 211 performs luminance correction in units of parts when there is no comparison data of palm images. For example, the surface information correction unit 211 performs the luminance correction by offsetting the average value of the luminance for each part so as to be aligned with the overall average. Note that the luminance correction is not limited to the average value of the luminance for each part, and a median value or a mode value may be used. Further, the luminance correction for each part may be performed according to different rules for each part. The surface information correction unit 211 ends the surface information correction process after performing the luminance correction in units of parts.
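The per-part offset correction of step S55 (aligning each part's average luminance with the overall average) can be sketched as follows; the representation of a part as a list of luminance values is an illustrative assumption.

```python
def correct_parts(parts):
    """Offset each part's luminance so its mean matches the overall mean.

    `parts` maps a part name to its list of luminance values; a new,
    corrected mapping is returned (sketch of step S55).
    """
    all_values = [v for values in parts.values() for v in values]
    overall_mean = sum(all_values) / len(all_values)
    corrected = {}
    for name, values in parts.items():
        # offset so that this part's average aligns with the overall average
        offset = overall_mean - sum(values) / len(values)
        corrected[name] = [v + offset for v in values]
    return corrected
```

As the text notes, the median or mode could be used in place of the average, or different rules could be applied per part.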
- Step S56 When there is comparison target data for the palm image, the surface information correction unit 211 evaluates the unevenness match over the entire palm. The surface information correction unit 211 ends the surface information correction process if the unevenness match evaluation is within a predetermined threshold range. On the other hand, the surface information correction unit 211 proceeds to step S57 if the unevenness match evaluation is not within the predetermined threshold range.
- the surface information correction unit 211 performs luminance correction on the range where the unevenness match evaluation is low, extending the correction of the average luminance to the peripheral area.
- the surface information correction unit 211 performs the luminance correction by offsetting the average value of the luminance of the part to be corrected and the surrounding area so as to be aligned with the overall average.
- Step S58 The surface information correction unit 211 re-evaluates the unevenness match over the entire palm.
- the surface information correction unit 211 ends the surface information correction processing if the reevaluation of the unevenness match is within a predetermined threshold range.
- the surface information correction unit 211 proceeds to step S59 if the reevaluation of the unevenness match is not within the predetermined threshold range.
- Step S59 The surface information correction unit 211 sets the retry flag and ends the surface information correction process, assuming that the palm image cannot be corrected.
- the authentication device 20 re-acquires the authentication image by setting the retry flag.
- the surface information correction unit 211 corrects slight fluctuations in the biological information and environmental changes within the allowable range by correcting the uneven portion. Thereby, the authentication apparatus 20 reduces the opportunity to re-acquire the authentication image.
- FIG. 15 is a flowchart of surface reflection removal processing according to the second embodiment.
- FIG. 16 is a diagram illustrating surface reflection removal from the luminance distribution according to the second embodiment.
- FIG. 17 is a diagram illustrating an example of the surface reflection removal process of the second embodiment. The surface reflection removal process is executed in step S52 of the surface information correction process.
- the surface information correction unit 211 estimates a palm brightness model from the palm image (FIG. 17 (1)).
- the surface information correction unit 211 obtains an estimated luminance model 74 by a spline function from the luminance graph 73 obtained by scanning the palm surface luminance in the horizontal direction.
- the surface information correction unit 211 uses a low-order (for example, second-order) spline function so as to remove high-frequency components, such as convex portions that are brighter than their surroundings and veins that are darker than their surroundings.
- the surface information correction unit 211 estimates the luminance model of the entire palm from a plurality of estimated luminance models 74 whose positions are shifted in the vertical direction.
- note that the luminance model in a palm image can be estimated using known methods, such as estimation from the luminance.
- the surface information correction unit 211 extracts convex portions (for example, convex portions 71d and 71e) having surface reflection from the difference between the luminance graph 73 and the estimated luminance model 74 (for example, convex portion extraction image 75). (FIG. 17 (2))).
- the surface information correction unit 211 obtains a surface reflection enhanced image (for example, an enhanced image 76 (FIG. 17 (3))) in which surface reflection is enhanced from the palm image.
- the surface information correction unit 211 estimates the surface reflection component included in the surface reflection enhanced image.
- Step S64 The surface information correction unit 211 removes the surface reflection component from the surface reflection enhanced image and generates a surface reflection relaxation image (for example, a relaxation image 77 (FIG. 17 (4))) in which the surface reflection is reduced.
- the surface information correction unit 211 corrects the luminance of the palm image based on the generated surface reflection mitigation image, and ends the surface reflection removal process.
- for the surface reflection removal, a known method such as removal of the effect of non-uniform illumination can be used.
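The scan-line processing described above can be approximated as follows. Here a low-order polynomial fit stands in for the low-order spline function of the embodiment, and the reflection threshold is an illustrative assumption.

```python
import numpy as np

def remove_surface_reflection(scanline, order=2, threshold=10.0):
    """Approximate surface reflection removal for one horizontal luminance scan.

    A low-order polynomial fit stands in for the embodiment's low-order
    spline: it follows the slow curvature of the palm but not high-frequency
    components such as specular highlights.  Positive residuals above
    `threshold` are treated as surface reflection and subtracted.
    """
    x = np.arange(len(scanline), dtype=float)
    y = np.asarray(scanline, dtype=float)
    model = np.polyval(np.polyfit(x, y, order), x)  # estimated luminance model
    residual = y - model                            # difference from the model
    reflection = np.where(residual > threshold, residual, 0.0)
    return y - reflection                           # reflection-mitigated scan
```

The full processing would apply this to many scan lines shifted in the vertical direction, as the text describes for the estimated luminance models 74.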
- FIG. 18 is a flowchart of NG information acquisition processing according to the second embodiment.
- FIG. 19 is a diagram illustrating an example of NG biometric information according to the second embodiment.
- FIG. 20 is a diagram illustrating an example of an NG image according to the second embodiment.
- FIG. 21 is a diagram illustrating an example of NG environment information according to the second embodiment.
- FIG. 22 is a diagram illustrating an example of a shooting environment image according to the second embodiment.
- the NG information acquisition process is executed in step S16 of the authentication process when the surface information cannot be corrected.
- the NG information acquisition unit 212 outputs, as NG biometric information, a captured image that cannot be corrected for surface information and information related to the captured image, and updates the NG biometric information database 52.
- the NG information acquisition unit 212 determines whether or not there is a palm in the shooting range.
- the NG information acquisition unit 212 determines whether or not there is a palm in the imaging range by acquiring the current captured image from the sensor unit built-in mouse 24.
- the NG information acquisition unit 212 proceeds to step S73 when it is determined that there is a palm in the shooting range, and proceeds to step S74 when it is determined that there is no palm in the shooting range.
- the NG information acquisition unit 212 instructs the notification unit 202 to notify the user to retract the palm from the shooting range.
- the notification unit 202 notifies the user by display or sound so as to retract the palm from the shooting range. In this way, the NG information acquisition unit 212 waits for the palm to evacuate from the shooting range.
- the NG information acquisition unit 212 acquires a video (environment video) without a palm in the shooting range.
- the NG information acquisition unit 212 acquires the current captured image from the sensor unit built-in mouse 24, and acquires an image without a palm in the imaging range.
- the NG information acquisition unit 212 outputs, as NG environment information, information related to the acquired environment video and to the environment (NG environment) that resulted in a captured image whose surface information could not be corrected, and updates the NG environment information database 53.
- the NG information acquisition unit 212 ends the NG information acquisition process after updating the NG environment information database 53.
- NG biometric information 300 is an example of NG biometric information updated in step S71.
- the NG biological information 300 is an example of NG biological information managed by the NG biological information database 52.
- the NG biometric information is information related to the palm image that has become NG in the collation, such as an identification number, date, time, user ID, retry (number of retries), NG biometric image, and NG reason.
- the identification number is identification information that uniquely identifies the verification that has become NG.
- the date and time are the date and time of collation that is determined to be NG, respectively. The time may be information in units of seconds as well as hours and minutes.
- the user ID is identification information that uniquely identifies the user.
- Retry is the cumulative number of times the retry flag has been set.
- the NG biometric image is the palm image used for the collation.
- the palm image 95 is an NG biometric image having a strong convex portion 71f (FIG. 20 (1)).
- the palm image 96 is an NG biometric image having a strong convex portion 71g and a strong concave portion 72g (FIG. 20 (2)).
- the reason for NG is the reason that the palm image is judged as NG in the collation.
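The NG biometric information enumerated above can be represented, for example, by the following record type. The field names, types, and the sample values are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NgBiometricRecord:
    """One entry of the NG biometric information database (illustrative)."""
    identification_number: int  # uniquely identifies the collation that became NG
    timestamp: datetime         # date and time of the collation
    user_id: str                # uniquely identifies the user
    retries: int                # cumulative number of times the retry flag was set
    ng_image: bytes             # the palm image used for the collation
    ng_reason: str              # reason the palm image was judged NG
```

A hypothetical entry might look like `NgBiometricRecord(1, datetime(2012, 2, 14, 9, 30, 0), "user01", 2, b"\x00", "strong convex portion")`.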
- the NG environment information 310 is an example of NG environment information updated in step S75.
- the NG environment information 310 is an example of NG environment information managed by the NG environment information database 53.
- the NG environment information is information related to the environment in which the collation became NG, such as an identification number, date, time, user ID, NG environment image, sensor ID, lighting, temperature, humidity, physical condition, and the like.
- the identification number is identification information that uniquely identifies the verification that has become NG.
- the date and time are the date and time of collation that is determined to be NG, respectively. The time may be information in units of seconds as well as hours and minutes.
- the user ID is identification information that uniquely identifies the user.
- the NG environment image is a shooting environment image at the time of palm shooting (collation).
- the shooting environment image 78 is a good shooting environment image (FIG. 22 (1)).
- the shooting environment image 79 is an NG environment image in which the fluorescent lamp 80 is reflected (FIG. 22 (2)).
- the shooting environment image 81 is an NG environment image in which strong external light 82 is reflected ((3) in FIG. 22).
- the shooting environment image 83 is an NG environment image in which the unknown light 84 is reflected (FIG. 22 (4)). These NG environment images do not necessarily include the reason for collation failure in the shooting environment.
- the sensor ID is identification information that uniquely identifies the ID of the sensor unit that has captured the palm image.
- the illumination is information indicating ON / OFF of illumination in the surrounding environment of the authentication device 20 when taking a palm image. Air temperature and humidity are the temperature and humidity of the surrounding environment of the authentication device 20, respectively.
- the physical condition is the physical condition of the user.
- the authentication device 20 can be provided with sensors that measure temperature, humidity, illuminance, and the like (not shown).
- the authentication device 20 may acquire ambient environment information by communication from a management device that manages lighting and air conditioning (not shown).
- the authentication apparatus 20 can acquire a user's physical condition from the input device which is not shown in figure.
- the authentication device 20 may acquire other ambient environment information, such as the weather or the user's self-evaluation of the hand-holding operation.
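In parallel with the NG biometric information, the NG environment information above can be represented by a record type such as the following. The field names and types are illustrative assumptions; optional fields reflect that temperature, humidity, and physical condition are only available when the corresponding sensors or inputs exist.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class NgEnvironmentRecord:
    """One entry of the NG environment information database (illustrative)."""
    identification_number: int         # uniquely identifies the collation that became NG
    timestamp: datetime                # date and time of the collation
    user_id: str                       # uniquely identifies the user
    sensor_id: str                     # sensor unit that captured the palm image
    lighting_on: bool                  # ON/OFF of ambient illumination
    temperature_c: Optional[float]     # ambient temperature, if measured
    humidity_pct: Optional[float]      # ambient humidity, if measured
    physical_condition: Optional[str]  # user's reported physical condition
    ng_environment_image: bytes        # environment image without the palm
```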
- the authentication device 20 makes it possible to guide the palm more appropriately by enhancing the information stored in the NG biological information database 52 and the NG environment information database 53. Further, the enhancement of information stored in the NG biological information database 52 and the NG environment information database 53 contributes to the improvement of extraction, analysis, and correction of surface information by the authentication device 20.
- FIG. 23 is a flowchart of guidance method selection processing according to the second embodiment.
- FIG. 24 is a diagram illustrating an example of the guidance information database according to the second embodiment.
- FIG. 25 is a diagram illustrating an example of a guidance display screen according to the second embodiment.
- the guidance method selection process is executed in step S17 of the authentication process.
- Step S81 The guidance method selection unit 213 determines whether there is a registered template corresponding to the user.
- the guidance method selection unit 213 proceeds to step S82 if there is a registered template corresponding to the user, and proceeds to step S83 if not.
- Step S82 The guidance method selection unit 213 sets the comparison target of palm images to the registered template.
- Step S83 The guidance method selection unit 213 sets the comparison target of palm images to the standard model.
- the guidance method selection unit 213 identifies the inclination and deformation of the palm from the unevenness information (surface information).
- as the unevenness information used for identification, the unevenness information analyzed by the surface information analysis unit 210 may be used, or the analysis may be performed again in accordance with the identified comparison target.
- the guidance method selection unit 213 acquires guidance information corresponding to the specified palm inclination and deformation from the guidance information database 54.
- the guidance method selection unit 213 proceeds to step S87 when guidance information corresponding to the specified palm inclination and deformation can be obtained from the guidance information database 54.
- the guidance method selection unit 213 proceeds to step S88 when the guidance information corresponding to the identified palm inclination and deformation cannot be obtained from the guidance information database 54.
- the guidance method selection unit 213 performs guidance notification according to the guidance information acquired from the guidance information database 54, and ends the guidance method selection process.
- the guidance method selection unit 213 performs notification for guiding a normal position where the palm is held over, and ends the guidance method selection process.
- Guidance information 320 is an example of guidance information acquired in step S85.
- the guidance information 320 is an example of guidance information managed by the guidance information database 54.
- the guidance information includes an identification number and information for discriminating the guidance method, such as the concave state or convex state of each part (part 1, part 2, ...) or the total state overlooking a plurality of parts.
- the guidance information 320 is information related to guidance notification such as a status message, a guidance message, and a retry message corresponding to the total status.
- the identification number is identification information that uniquely identifies the guidance information.
- the combination of the uneven state for each region specifies the total state.
- the total state corresponds to the uneven state by one or more parts.
- the status message is a message that indicates the palm status of the user.
- the guidance message is a message for guiding the palm to the normal position.
- the retry message is a message indicating the degree of guidance compared to the previous shooting posture.
- the authentication device 20 performs notification such as the guidance display screen 85.
- the guidance display screen 85 displays a guidance display 86 and a guidance display 87, which are image-based guidance displays, and a message display 88, which is a message-based guidance display.
- the guidance display 86 is an image of the palm as viewed from above, and displays the posture at the normal position (standard model) and the posture at the photographing position so that they can be compared.
- the guidance display 86 only needs to be able to grasp a horizontal shift, and may be an image of the palm viewed from below.
- the guidance display 87 is an image of the palm viewed from the side, and displays the normal position posture and the photographing position posture (estimated from analysis of surface information) in a comparable manner.
- the guidance display 87 only needs to be able to grasp a vertical shift.
- for the guidance display 86, a photographed image can be used, but CG (computer graphics) may be used instead.
- the guidance display 87 can use CG.
- the guidance display 86 and the guidance display 87 display the outer shape of the normal position and posture with a solid line and the outer shape of the shooting position and posture with a broken line to facilitate comparison.
- the message display 88 includes a status message 89, a guidance message A90, and a guidance message B91.
- the status message 89 indicates the posture that caused the verification failure in order to correct the posture that the user is not aware of. For example, the status message 89 points out that “the finger is slightly bent”.
- the guidance message A90 is a message that alerts the user to their shooting posture in order to correct posture wobble that the user is not aware of. For example, the guidance message A90 indicates "Please relax."
- the guidance message B91 is a message that specifically points out an incorrect posture of the user. For example, the guidance message B91 guides “Please hold your hand so that the entire palm is flat when viewed from the side”. Note that the authentication device 20 may perform voice notification in addition to or instead of the message display 88.
- since the authentication device 20 gives the user feedback on how the hand is held (notification of the status message 89), improvement in the user's skill in holding the hand can be expected.
- since the authentication device 20 alerts the user to their shooting posture (notification of the guidance message A90), correction of posture wobble that the user is not aware of can be expected.
- since the authentication apparatus 20 makes specific indications about the user's incorrect posture (notification of the guidance display 86, the guidance display 87, and the guidance message B91), it can be expected that the user will correct the posture accurately.
- when performing guidance notification based on comparison with the standard model, the authentication device 20 does not need to access the registered template. As a result, the authentication device 20 can apply the guidance notification even when performing preliminary collation against the standard model before collation using the registered template.
- FIG. 26 is a flowchart of guidance method selection processing according to the third embodiment.
- the guidance method selection process of the third embodiment is different from that of the second embodiment in that guidance notification according to the number of retries is performed.
- the guidance method selection unit 213 identifies the inclination and deformation of the palm from the unevenness information (surface information) analyzed by the surface information analysis unit 210. [Step S92] The guidance method selection unit 213 acquires guidance information corresponding to the identified palm inclination and deformation from the guidance information database 54.
- the guidance method selection unit 213 acquires the cumulative number of times that the retry flag has been set (the number of retries). [Step S94] The guidance method selection unit 213 determines whether or not the number of retries is equal to or less than a predetermined value. The guidance method selection unit 213 proceeds to step S96 if the number of retries is equal to or less than the specified value, and proceeds to step S95 if the number exceeds the specified value.
- the guidance method selection unit 213 identifies the frequently occurring failure cause from among a plurality of failure causes, and gives a notification calling attention to that high-frequency failure cause.
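The retry-dependent branching described above (steps S93 to S95) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name `select_guidance`, the cause labels, and the `retry_limit` threshold are all assumptions introduced for the example.

```python
from collections import Counter

def select_guidance(failure_causes, retry_limit=3):
    """failure_causes: one hypothetical cause label per failed attempt,
    oldest first. Returns a (mode, cause) pair for the notification."""
    if len(failure_causes) <= retry_limit:
        # within the limit: ordinary guidance for the latest failure
        return ("guide", failure_causes[-1] if failure_causes else None)
    # past the limit: call attention to the most frequent failure cause
    most_common_cause, _ = Counter(failure_causes).most_common(1)[0]
    return ("warn_frequent", most_common_cause)
```

With this sketch, a user who repeatedly tilts the palm would, after exceeding the retry limit, receive a notification targeting the tilt rather than only the most recent failure.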
- the guidance method selection unit 213 proceeds to step S97 if guidance information corresponding to the specified palm inclination and deformation can be obtained from the guidance information database 54.
- the guidance method selection unit 213 proceeds to step S98 when guidance information corresponding to the specified palm inclination and deformation cannot be obtained from the guidance information database 54.
- the guidance method selection unit 213 performs guidance notification according to the guidance information acquired from the guidance information database 54, and ends the guidance method selection process.
- the guidance method selection unit 213 performs notification for guiding the normal position where the hand is held over, and ends the guidance method selection process.
- the authentication device 20 can accurately point out not only the cause of the previous failure but also the user's failure tendency.
- the authentication apparatus according to the fourth embodiment differs from the second embodiment in that surface information of the photographed image can be corrected without requiring prior extraction and analysis of the surface information.
- FIG. 27 is a diagram illustrating a configuration of the authentication device according to the fourth embodiment.
- components identical to those of the second embodiment are given the same reference numerals, and their description is omitted.
- the authentication device 20a includes a control unit 200, a storage unit 201, a notification unit 202, a communication unit 203, an image input unit 204, an object extraction unit 205, a palm determination unit 206, a palm extraction unit 207, and an outer shape correction unit 208. Furthermore, the authentication device 20a includes a surface information correction unit 211a, an NG information acquisition unit 212, a guidance method selection unit 213, a biological information extraction unit 214, and a verification unit 215.
- the surface information correction unit 211a performs correction to remove surface reflected light and high frequency components from the palm image corrected by the outer shape correction unit 208.
- the surface information correction unit 211a may perform correction to remove the surface reflected light and the high-frequency component from the captured image of the palm.
- FIG. 28 is a flowchart of surface information correction processing according to the fourth embodiment.
- FIG. 29 is a flowchart of palm brightness model generation processing according to the fourth embodiment.
- FIG. 30 is a diagram illustrating an example of a surface information correction process according to the fourth embodiment.
- FIG. 31 is a diagram illustrating a luminance graph of the fourth embodiment and a luminance corrected image obtained by removing surface reflection and high frequency from the luminance graph.
- the surface information correction unit 211a acquires a captured image of the palm (for example, a palm image 350).
- the palm image 350 is an image obtained by photographing the palm by irradiating near infrared rays.
- the palm image 350 is, for example, an image with a gray scale of 256 gradations from 0 to 255.
- FIGS. 30(1) to 30(6) schematically represent a 256-gradation grayscale image (luminance distribution) using dividing lines at predetermined gradation intervals. In the luminance distributions shown in FIGS. 30(1) to 30(6), the background portion is dark and the palm portion is bright.
- the surface information correction unit 211a executes palm luminance model generation processing for generating a palm luminance model (for example, luminance model 351) from a palm captured image (for example, palm image 350). Details of the palm brightness model generation process will be described later.
- the surface information correction unit 211a extracts surface reflected light (for example, surface reflected light 352) from a captured image of the palm (for example, palm image 350) and a luminance model (for example, luminance model 351).
- the surface reflected light can be extracted from the difference between the captured image of the palm and the luminance model. More specifically, surface reflected light can be extracted by obtaining the maximum value of the luminance of the captured image of the palm and the luminance of the luminance model for each pixel and subtracting the luminance of the luminance model from the maximum value.
- the surface reflected light includes specular reflected light obtained by specularly reflecting light from a light source (including external light and the like, not limited to a light source that irradiates the palm when photographing the palm).
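The per-pixel rule described above (take the maximum of the captured luminance and the model luminance, then subtract the model luminance) can be written compactly with NumPy. This is a minimal sketch; the function name and array handling are assumptions introduced for illustration.

```python
import numpy as np

def extract_surface_reflection(image, model):
    """Per-pixel: max(captured, model) - model. Only the positive excess
    of the captured luminance over the smooth luminance model survives,
    which corresponds to the surface (specular) reflection component."""
    image = image.astype(np.float64)
    model = model.astype(np.float64)
    return np.maximum(image, model) - model
```

Because the result is clipped at zero by construction, pixels darker than the model (such as vein portions) contribute nothing to the extracted reflection.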
- the surface information correction unit 211a identifies a high-frequency component (for example, a high-frequency component 353) included in the surface reflected light (for example, the surface reflected light 352).
- the high-frequency component here refers to a frequency component higher than the frequency component of biometric information (palm veins) used for biometric authentication.
- the high-frequency component is, for example, a noise component derived from sensitivity characteristics unique to an image sensor that captures (captures) a palm, the surrounding environment, and the like.
- the surface information correction unit 211a can specify the high-frequency component by, for example, top-hat by reconstruction processing, which is a kind of grayscale morphology processing based on an opening operation using a structuring element.
- the surface information correction unit 211a can identify a high frequency component from the processing result by selecting a structural element having a shape highly correlated with the high frequency component.
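A hedged sketch of top-hat by reconstruction, the grayscale morphology operation named above, using SciPy's `ndimage` primitives: the opening of the image serves as a marker that is rebuilt under the original by iterated geodesic dilation, and the residue is the high-frequency detail. The structuring-element sizes and the iteration cap are illustrative assumptions, not values from the patent.

```python
import numpy as np
from scipy import ndimage

def tophat_by_reconstruction(image, size=5, max_iter=200):
    """Isolates small bright details (high-frequency residue) that a
    plain grayscale opening would remove."""
    # marker: grayscale opening with a flat structuring-element window
    marker = ndimage.grey_opening(image, size=size)
    # reconstruction by iterated geodesic dilation under the original image
    for _ in range(max_iter):
        dilated = np.minimum(ndimage.grey_dilation(marker, size=3), image)
        if np.array_equal(dilated, marker):
            break
        marker = dilated
    # residue = image minus its reconstructed smooth background
    return image - marker
```

As the text notes, choosing a structuring element whose shape correlates with the expected noise is what makes the residue track the high-frequency component rather than legitimate image structure.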
- the surface information correction unit 211a removes the surface reflected light (for example, the surface reflected light 352) and the high frequency component (for example, the high frequency component 353) from the captured image of the palm (for example, the palm image 350).
- the surface information correction unit 211a further generates a relaxed image (for example, the relaxed image 354) in which the steep change is relaxed based on the brightness model (for example, the brightness model 351).
- the removal of surface reflected light and high-frequency components from the palm photographed image need not include both; either one alone may be performed. The removal also need not be complete; a predetermined ratio may be removed instead. Furthermore, the removal ratio may be determined separately for the surface reflected light and for the high-frequency components. The removal ratio may be set in advance, or may be set according to predetermined conditions (for example, for each type of image sensor included in the authentication device, or for each individual device).
- the surface information correction unit 211a generates a luminance correction image (for example, the luminance correction image 355) in which the surface luminance of the relaxed image (for example, the relaxed image 354) is made uniform, and ends the surface information correction processing.
- the surface information correction unit 211a alleviates a sharp change in luminance in the relaxed image and uniformizes the surface luminance of the relaxed image based on the luminance model.
- the surface brightness of the relaxed image can be made uniform by reducing the brightness by a predetermined amount or a predetermined ratio based on the difference between the brightness model and the relaxed image.
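The uniformization rule described above (reduce the luminance based on the difference between the luminance model and the relaxed image) might look like the following sketch. Using the model's own mean as the target level is an assumption for illustration; the patent only specifies a predetermined amount or ratio.

```python
import numpy as np

def uniformize_surface(relaxed, model):
    """Subtract the model's deviation from its own mean, flattening the
    slow luminance gradient while keeping local (vein) contrast intact.
    The choice of the model mean as the target level is an assumption."""
    target = model.mean()
    return relaxed - (model - target)
```

Because only the smooth model term is subtracted, a local dip such as a vein keeps its depth relative to its surroundings after the gradient is flattened.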
- the surface information correction unit 211a performs sampling (downsampling) by reducing the sampling frequency of the palm-captured image, thereby reducing the information amount of the captured image. That is, the surface information correction unit 211a generates a first converted image from the palm image.
- the sampling frequency may be set in advance or may be determined according to the frequency component included in the captured image.
- the surface information correction unit 211a masks the palm portion of the first converted image and fills (pads) the non-mask area (background portion) with the average luminance of the masked area. That is, the surface information correction unit 211a generates a second converted image from the first converted image.
- the surface information correction unit 211a removes a steep change in luminance of the second converted image by smoothing using a spline function (smoothing spline). That is, the surface information correction unit 211a generates a third converted image from the second converted image.
- the surface information correction unit 211a increases the information amount of the third converted image by increasing the sampling frequency of the third converted image and performing sampling (upsampling) to generate a luminance model. That is, the surface information correction unit 211a generates a luminance model (fourth converted image) from the third converted image.
- the surface information correction unit 211a can thereby generate a palm luminance model from the captured image of the palm.
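The four-step pipeline above (downsample, pad the background, smooth with a spline, upsample) can be sketched as below. The decimation factor, the smoothing weight, the use of `RectBivariateSpline` as the smoothing spline, and filling the background with the mean palm luminance are all illustrative assumptions rather than values taken from the patent.

```python
import numpy as np
from scipy import interpolate, ndimage

def luminance_model(palm_img, palm_mask, factor=4, smooth=1e4):
    """Sketch of steps S111-S114: build a smooth luminance model."""
    # S111: downsample by simple decimation to cut the information content
    small = palm_img[::factor, ::factor].astype(np.float64)
    small_mask = palm_mask[::factor, ::factor]
    # S112: pad the background with the mean luminance of the masked palm
    # area so the smoothing step is not dragged toward the dark background
    filled = np.where(small_mask, small, small[small_mask].mean())
    # S113: smoothing spline removes steep luminance changes
    h, w = filled.shape
    spline = interpolate.RectBivariateSpline(
        np.arange(h), np.arange(w), filled, s=smooth)
    smoothed = spline(np.arange(h), np.arange(w))
    # S114: upsample back to the original resolution -> luminance model
    model = ndimage.zoom(smoothed, factor, order=1)
    return model[:palm_img.shape[0], :palm_img.shape[1]]
```

The downsample-smooth-upsample structure is what makes the model cheap to compute while still suppressing frequency components finer than the palm's overall shading.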
- the surface information correction unit 211a can remove surface reflected light and high-frequency components from the captured image of the palm. Further, the surface information correction unit 211a can generate a luminance correction image in which the surface luminance of the image from which the surface reflected light and the high-frequency components have been removed (the relaxed image) is made uniform.
- the generation of the luminance correction image from the captured image of the palm, performed by the surface information correction unit 211a in this way, will be described using the graphs shown in FIG. 31.
- the graphs shown in FIGS. 31 (1) and 31 (2) are graphs obtained when a photographed image of the palm is scanned in the horizontal direction.
- the x-axis is the horizontal position of the palm, and the y-axis is the luminance.
- the surface information correction unit 211a obtains a captured image showing a luminance distribution as shown in the luminance graph 356.
- the luminance graph 356 is a graph schematically showing the luminance distribution when the palm image 350 is scanned in the horizontal direction. In the luminance graph 356, the luminance increases along the outer shape of the palm, and the luminance decreases in the background portion.
- the luminance graph 356 includes a portion where the luminance changes sharply due to specular reflection (specular reflection portion), a vein portion where the luminance decreases compared to the surroundings, and a high-frequency component that causes noise.
- the surface information correction unit 211a executes a palm model generation process from the luminance graph 356 to obtain a luminance model showing a luminance distribution as shown in the luminance graph 357 (FIG. 31 (1)).
- the luminance graph 357 is a graph schematically showing the luminance distribution when the luminance model 351 is scanned in the horizontal direction. In the luminance graph 357, the specular reflection portion, the vein portion, and the high-frequency component that are in the luminance graph 356 are removed.
- the surface reflection light removed by the surface information correction unit 211a is a specular reflection part of the luminance graph 356, and corresponds to, for example, the surface reflection light 352.
- the veins removed by the surface information correction unit 211a are vein portions of the luminance graph 356.
- the high frequency component removed by the surface information correction unit 211a corresponds to the high frequency component 353, for example.
- the surface information correction unit 211a removes the surface reflected light and high frequency components from the luminance graph 356, and relieves a steep change based on the luminance model, thereby obtaining a relaxed image showing the luminance distribution as in the luminance graph 358.
- the luminance graph 358 is a graph schematically showing the luminance distribution when the relaxed image 354 is scanned in the horizontal direction. In the luminance graph 358, the specular reflection portion and the high-frequency component that are present in the luminance graph 356 are removed.
- the surface information correction unit 211a uniformizes the surface luminance of the luminance graph 358 and obtains a luminance correction image showing the luminance distribution as shown in the luminance graph 359 (FIG. 31 (2)).
- the luminance graph 359 is a graph schematically showing the luminance distribution when the luminance correction image 355 is scanned in the horizontal direction. In the luminance graph 359, the luminance gradient that remained in the luminance graph 358 is made uniform.
- the authentication device 20a can obtain a brightness correction image that can easily extract biometric information.
- the permissible range of the posture of the living body can be expanded.
- the authentication device 20a can specify the surface reflected light by using only the grayscale image as an input, and can realize high-speed processing compared to the case where the surface reflected light is specified by the color image.
- the authentication device 20a can satisfactorily remove high-frequency components contained in the surface reflected light.
- the authentication system 10 shown in FIG. 2 includes an authentication device 20, an authentication device 30, and an authentication device 40.
- Each of the devices includes an image sensor that images a living body, and naturally has inherent sensitivity characteristics.
- the authentication device 20 includes a CMOS image sensor as the image sensor
- the authentication device 30 includes a CCD image sensor as the image sensor.
- the authentication device 20a can remove the noise based on the sensitivity characteristic unique to the image sensor and extract the stable biological information (vein feature). Thereby, the authentication device 20a can maintain high authentication accuracy even when different authentication devices are used at the time of registration and verification of biometric information. And the authentication system comprised including the authentication apparatus 20a can improve the compatibility and interoperability between different authentication apparatuses.
- the above processing functions can be realized by a computer.
- a program describing the processing contents of the functions that the authentication device 20, the authentication device 30, the authentication device 40, the authentication server 50, and the authentication device 20a should have is provided.
- the program describing the processing contents can be recorded on a computer-readable recording medium (including a portable recording medium).
- examples of the computer-readable recording medium include a magnetic recording device, an optical disc, a magneto-optical recording medium, and a semiconductor memory.
- examples of the magnetic recording device include a hard disk drive (HDD), a flexible disk (FD), and a magnetic tape.
- Optical discs include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM, CD-R (Recordable) / RW (ReWritable), and the like.
- Magneto-optical recording media include MO (Magneto-Optical disk).
- a portable recording medium such as a DVD or CD-ROM in which the program is recorded is sold. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
- the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.
- the palm is exemplified and described as the living body surface.
- the present invention is not limited thereto, and any living body surface may be used.
- the surface of the living body may be the sole of a foot, fingers of limbs, insteps of limbs, wrists, arms, and the like.
- when a vein is used for authentication, the biological surface only needs to be a site where the vein can be photographed.
- biometric information used for authentication is not limited to veins, but may be fingerprints, palm prints, or other information. Further, the above-described embodiment can be variously modified within a range not departing from the gist of the embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Collating Specific Patterns (AREA)
- Image Input (AREA)
Abstract
Description
The above and other objects, features, and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings, which illustrate preferred embodiments of the present invention by way of example.
[First Embodiment]
First, the authentication device of the first embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating the configuration of the authentication device of the first embodiment.
[Second Embodiment]
FIG. 2 is a diagram illustrating the configuration of the authentication system of the second embodiment. As the second embodiment, an authentication system 10 that performs authentication using palm veins is illustrated; however, the invention is not limited to this and is also applicable to systems that perform authentication using other feature detection sites of a living body.
The authentication device 20 includes a control unit 200, a storage unit 201, a notification unit 202, and a communication unit 203. The authentication device 20 further includes an image input unit 204, an object extraction unit 205, a palm determination unit 206, a palm extraction unit 207, an outer shape correction unit 208, and a surface information extraction unit 209. The authentication device 20 further includes a surface information analysis unit 210, a surface information correction unit 211, an NG information acquisition unit 212, a guidance method selection unit 213, a biological information extraction unit 214, and a collation unit 215.
The authentication device 20 includes a processing device 21, a display 22, a keyboard 23, a mouse 24 with a built-in sensor unit, and an IC card reader/writer 25.
With the hardware configuration described above, the processing functions of the present embodiment can be realized. The authentication device 30, the authentication device 40, and the authentication server 50 can also be realized with a similar hardware configuration.
[Step S13] The processing device 21 extracts the subject from the acquired captured image (object extraction unit 205). The processing device 21 determines whether the extracted subject is a palm (palm determination unit 206). The processing device 21 cuts out the palm from the subject determined to be a palm (palm extraction unit 207). The processing device 21 corrects the cut-out palm image to the normal position (outer shape correction unit 208).
[Step S19] The processing device 21 extracts biometric information used for collation from the palm image whose surface information has been corrected, or from the palm image that did not require correction of the surface information (biological information extraction unit 214).
[Step S21] If the collation matches, that is, if the degree of matching is evaluated as exceeding a predetermined threshold, the processing device 21 proceeds to step S22 as a collation match; if the collation does not match, the processing device 21 proceeds to step S23 as a collation mismatch.
[Step S23] In response to the collation mismatch, the processing device 21 determines that the person is rejected and, after executing the processing required upon authentication failure, ends the authentication processing.
[Step S34] The surface information extraction unit 209 normalizes the acquired luminance information. The surface information extraction unit 209 performs this normalization of the luminance information by shifting the luminance distribution and correcting the luminance distribution range. For example, the surface information extraction unit 209 generates a luminance distribution 64 in the grayscale range (0 to 255) from the acquired luminance information. The luminance distribution 64 has a distribution range lower limit Th01, a distribution range upper limit Th02, and a distribution range W0. The surface information extraction unit 209 normalizes the luminance distribution 64 by correcting it to the distribution range lower limit Th11, distribution range upper limit Th12, and distribution range W1 of the model luminance distribution 65 (see FIG. 9).
[Step S37] The surface information extraction unit 209 evaluates the magnitude of luminance and the size of the luminance distribution for the one minute partial area 63 from which the luminance information was acquired. For example, the surface information extraction unit 209 evaluates the magnitude of luminance (brightness) and the size of the luminance distribution by comparison with ideal values. More specifically, the surface information extraction unit 209 evaluates the size of the luminance distribution from the distribution range of luminance within a predetermined range including the mode value, and evaluates the magnitude of luminance from the average luminance of the distribution range subject to evaluation. The evaluation of the magnitude of luminance, in comparison with the ideal value, is one of "considerably dark", "dark", "standard", "bright", and "considerably bright". The evaluation of the size of the luminance distribution, in comparison with the ideal value, is one of "wide", "narrow", and "just right".
Here, a modification of the surface information evaluation unit described with reference to FIG. 8 will be described with reference to FIG. 11. FIG. 11 is a diagram illustrating a modification of the surface information evaluation unit of the second embodiment.
[Step S46] The surface information analysis unit 210 determines whether the part subject to determination contains a weak concave range. The determination of whether there is a weak concave range is made by comparison with a predetermined threshold. The surface information analysis unit 210 proceeds to step S47 when it determines that the part subject to determination contains a weak concave range, and proceeds to step S48 when it determines that it does not.
[Step S48] The surface information analysis unit 210 determines whether analysis has been completed for all of the plurality of parts. When analysis has not been completed for all of the plurality of parts, the surface information analysis unit 210 proceeds to step S42 in order to analyze a part not yet analyzed. When analysis has been completed for all of the plurality of parts, the surface information analysis unit 210 ends the surface information analysis processing.
[Step S64] The surface information correction unit 211 removes the surface reflection component from the surface-reflection-enhanced image and generates a surface-reflection-relaxed image in which the surface reflection is relaxed (for example, the relaxed image 77 (FIG. 17(4))).
Next, the NG information acquisition processing executed by the NG information acquisition unit 212 will be described in detail with reference to FIGS. 18 to 22. FIG. 18 is a flowchart of the NG information acquisition processing of the second embodiment. FIG. 19 is a diagram illustrating an example of NG biometric information of the second embodiment. FIG. 20 is a diagram illustrating an example of an NG image of the second embodiment. FIG. 21 is a diagram illustrating an example of NG environment information of the second embodiment. FIG. 22 is a diagram illustrating an example of a shooting environment image of the second embodiment. The NG information acquisition processing is executed in step S16 of the authentication processing when the surface information cannot be corrected.
[Step S83] The guidance method selection unit 213 sets the standard model as the comparison target for the palm image.
[Step S86] The guidance method selection unit 213 proceeds to step S87 when guidance information corresponding to the identified palm inclination and deformation could be acquired from the guidance information database 54. On the other hand, the guidance method selection unit 213 proceeds to step S88 when such guidance information could not be acquired from the guidance information database 54.
[Step S88] The guidance method selection unit 213 performs notification guiding the user to the normal position for holding the palm, and ends the guidance method selection processing.
[Third Embodiment]
FIG. 26 is a flowchart of the guidance method selection processing of the third embodiment. The guidance method selection processing of the third embodiment differs from that of the second embodiment in that guidance notification according to the number of retries is performed.
[Step S92] The guidance method selection unit 213 acquires guidance information corresponding to the identified palm inclination and deformation from the guidance information database 54.
[Step S94] The guidance method selection unit 213 determines whether the number of retries is equal to or less than a predetermined specified value. The guidance method selection unit 213 proceeds to step S96 if the number of retries is equal to or less than the specified value, and proceeds to step S95 if it exceeds the specified value.
[Step S96] The guidance method selection unit 213 proceeds to step S97 when guidance information corresponding to the identified palm inclination and deformation could be acquired from the guidance information database 54. On the other hand, the guidance method selection unit 213 proceeds to step S98 when such guidance information could not be acquired from the guidance information database 54.
[Step S98] The guidance method selection unit 213 performs notification guiding the user to the normal position for holding the hand, and ends the guidance method selection processing.
[Fourth Embodiment]
Next, the fourth embodiment will be described with reference to FIGS. 27 to 31. The authentication device of the fourth embodiment differs from that of the second embodiment in that surface information of the captured image can be corrected without requiring prior extraction and analysis of the surface information.
[Step S111] The surface information correction unit 211a reduces the information content of the captured image of the palm by lowering the sampling frequency and performing sampling (downsampling). That is, the surface information correction unit 211a generates a first converted image from the captured image of the palm. The sampling frequency may be set in advance, or may be determined according to the frequency components contained in the captured image.
For example, the surface of the living body may be the sole of a foot, the fingers of the limbs, the backs of the hands or feet, a wrist, an arm, or the like.
A biological surface from which the biometric-information acquisition site can be identified is advantageous for authentication. For example, with a palm or a face, the site can be identified from the acquired image.
The above-described embodiments can be modified in various ways without departing from the gist of the embodiments.
2 surface information extraction unit
3 surface information analysis unit
4 surface information correction unit
5 authentication unit
10 authentication system
20, 30, 40, 20a authentication device
21 processing device
22 display
23 keyboard
24 mouse with built-in sensor unit
24a, 31, 43 sensor unit
24b control unit
24c imaging unit
24d distance measuring unit
24e storage unit
24f communication unit
25, 42 IC card reader/writer
26 IC card
41 numeric keypad
44 door
50 authentication server
51 network
52 NG biometric information database
53 NG environment information database
54 guidance information database
60 hand
61, 68 whole palm area
62, 69 partial area
63, 70 minute partial area
101 CPU
102 RAM
103 HDD
104 communication interface
105 graphics processing device
106 input/output interface
107 bus
110 portable recording medium
200 control unit
201 storage unit
202 notification unit
203 communication unit
204 image input unit
205 object extraction unit
206 palm determination unit
207 palm extraction unit
208 outer shape correction unit
209 surface information extraction unit
210 surface information analysis unit
211, 211a surface information correction unit
212 NG information acquisition unit
213 guidance method selection unit
214 biological information extraction unit
215 collation unit
Claims (15)
- a surface information correction unit that corrects surface information relating to a living body surface, extracted from image information obtained by photographing the living body; and
an authentication unit that performs biometric authentication using the corrected surface information,
an authentication device characterized by comprising the above. - The surface information correction unit includes:
a luminance model generation unit that generates a luminance model of the living body from the luminance distribution of the image information;
a surface reflected light extraction unit that extracts surface reflected light of the living body from the luminance model;
a high-frequency component specifying unit that specifies a high-frequency component contained in the surface reflected light that is higher than the frequency components of the biometric information used for the biometric authentication; and
a relaxed image generation unit that generates a relaxed image in which steep changes are relaxed by removing at least part of the surface reflected light and the high-frequency component from the image information,
the authentication device according to claim 1, characterized by comprising the above. - The authentication device according to claim 2, characterized in that the surface information correction unit includes a luminance correction unit that performs correction to uniformize the surface luminance of the relaxed image.
- The authentication device according to claim 2 or claim 3, characterized in that the relaxed image generation unit removes the surface reflected light and the high-frequency component from the image information at respective predetermined ratios.
- The surface information enables evaluation of unevenness of the living body surface, and the device includes:
a surface information extraction unit that extracts, from the image information, surface information enabling evaluation of the unevenness of the living body surface; and
a surface information analysis unit that analyzes the unevenness of the living body surface from the surface information,
the device comprising the above,
the authentication device according to claim 1, characterized in that the surface information correction unit corrects the unevenness of the living body surface with respect to the surface information. - The surface information enables evaluation of unevenness of the living body surface, and the device includes:
a surface information extraction unit that extracts luminance information from the image information as the surface information enabling evaluation of the unevenness of the living body surface; and
a surface information analysis unit that specifies the range and brightness of uneven portions of the living body surface from the surface information and determines whether the range and the degree of brightness of the uneven portions of the living body surface are within a predetermined correctable range,
the device comprising the above,
the authentication device according to claim 1, characterized in that the surface information correction unit corrects the unevenness of the living body surface with respect to the surface information. - The surface information analysis unit
divides the living body surface into a plurality of partial areas and specifies the range and brightness of the uneven portions for each partial area; the authentication device according to claim 6 is characterized by this. - The authentication device according to claim 7, characterized in that the partial areas are arranged while being allowed to have mutually overlapping areas.
- The authentication device according to claim 8, characterized in that the surface information correction unit corrects the brightness of each part of the surface information.
- The authentication device according to any one of claims 6 to 9, characterized by comprising a notification unit that, when the image information cannot be corrected, performs notification guiding the living body to the normal position in order to reacquire image information.
- The authentication device according to claim 10, characterized in that the notification unit reports a message evaluating the posture of the living body based on the image information that could not be corrected.
- The authentication device according to any one of claims 6 to 9, characterized by comprising an NG information acquisition unit that acquires the image information that could not be corrected as NG biometric information.
- The authentication device according to claim 12, characterized in that the NG information acquisition unit acquires, as NG environment information, information relating to the environment in which the image information that could not be corrected was photographed.
- An authentication program that causes a computer to execute processing for performing personal authentication using features of a living body,
the program causing the computer to:
correct surface information relating to the living body surface, extracted from image information obtained by photographing the living body, and
perform biometric authentication using the corrected surface information,
an authentication program characterized by causing the computer to execute the above processing. - An authentication method executed by a computer for performing personal authentication using features of a living body, the method comprising:
correcting surface information relating to the living body surface, extracted from image information obtained by photographing the living body, and
performing biometric authentication using the corrected surface information,
an authentication method characterized by the above.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012557971A JP5685272B2 (ja) | 2011-02-15 | 2012-02-14 | 認証装置、認証プログラム、および認証方法 |
EP12746896.5A EP2677490B1 (en) | 2011-02-15 | 2012-02-14 | Authentication device, authentication program, and authentication method |
BR112013020468A BR112013020468A2 (pt) | 2011-02-15 | 2012-02-14 | aparelho de autenticação, programa de autenticação, e método de autenticação |
US13/951,964 US9245168B2 (en) | 2011-02-15 | 2013-07-26 | Authentication apparatus, authentication program, and authentication method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011053187 | 2011-02-15 | ||
JPPCT/JP2011/053187 | 2011-02-15 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/951,964 Continuation US9245168B2 (en) | 2011-02-15 | 2013-07-26 | Authentication apparatus, authentication program, and authentication method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012111664A1 true WO2012111664A1 (ja) | 2012-08-23 |
Family
ID=46672582
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/053395 WO2012111664A1 (ja) | 2011-02-15 | 2012-02-14 | 認証装置、認証プログラム、および認証方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9245168B2 (ja) |
EP (1) | EP2677490B1 (ja) |
BR (1) | BR112013020468A2 (ja) |
WO (1) | WO2012111664A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014180435A (ja) * | 2013-03-19 | 2014-09-29 | Fujitsu Ltd | 生体情報入力装置、生体情報入力プログラム、生体情報入力方法 |
JP2016522487A (ja) * | 2013-05-10 | 2016-07-28 | イーリス−ゲゼルシャフト ミット ベシュレンクテル ハフツング インフラレッド アンド インテリジェント センサーズ | 手の静脈パターンを記録するセンサシステム及び方法 |
JP2016181753A (ja) * | 2015-03-23 | 2016-10-13 | 株式会社リコー | 情報処理装置、情報処理方法、プログラムおよびシステム |
EP2975844A4 (en) * | 2013-03-13 | 2016-11-09 | Fujitsu Frontech Ltd | IMAGE PROCESSING DEVICE, IMAGE PROCESSING AND PROGRAM |
CN107463885A (zh) * | 2017-07-19 | 2017-12-12 | 广东欧珀移动通信有限公司 | 生物识别模式控制方法及相关产品 |
JP2018067206A (ja) * | 2016-10-20 | 2018-04-26 | 富士通株式会社 | 撮影装置 |
EP3330888A1 (en) | 2016-11-30 | 2018-06-06 | Fujitsu Limited | Biometric authentication apparatus, biometric authentication method, and biometric authentication program |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5691669B2 (ja) * | 2011-03-08 | 2015-04-01 | Fujitsu Ltd | Biometric information processing device, biometric information processing method, and biometric information processing program |
CN104205165B (zh) * | 2012-03-28 | 2017-06-23 | Fujitsu Ltd | Biometric authentication device, biometric authentication method, and biometric authentication program |
CN104778465B (zh) * | 2015-05-06 | 2018-05-15 | Beihang University | Target tracking method based on feature point matching |
US10963159B2 (en) * | 2016-01-26 | 2021-03-30 | Lenovo (Singapore) Pte. Ltd. | Virtual interface offset |
JP6623832B2 (ja) * | 2016-02-26 | 2019-12-25 | Fujitsu Ltd | Image correction device, image correction method, and computer program for image correction |
JP6712247B2 (ja) * | 2017-06-09 | 2020-06-17 | Hitachi Ltd | Biometric signature system and biometric signature method |
CN107736874B (zh) | 2017-08-25 | 2020-11-20 | Baidu Online Network Technology (Beijing) Co Ltd | Liveness detection method, apparatus, device, and computer storage medium |
JP7056052B2 (ja) * | 2017-09-22 | 2022-04-19 | Fujitsu Ltd | Image processing program, image processing method, and image processing device |
WO2019235773A1 (en) * | 2018-06-08 | 2019-12-12 | Samsung Electronics Co., Ltd. | Proximity based access control in a communication network |
KR20200100481A (ko) * | 2019-02-18 | 2020-08-26 | Samsung Electronics Co Ltd | Electronic device for authenticating biometric information and operating method thereof |
JP7292050B2 (ja) * | 2019-02-19 | 2023-06-16 | Japan Display Inc | Detection device and authentication method |
CN110070006B (zh) * | 2019-04-01 | 2021-09-21 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Fingerprint recognition method, electronic device, and computer-readable medium |
US11003957B2 (en) | 2019-08-21 | 2021-05-11 | Advanced New Technologies Co., Ltd. | Method and apparatus for certificate identification |
EP3836015A1 (de) * | 2019-12-09 | 2021-06-16 | Iris-Gmbh Infrared & Intelligent Sensors | Sensor system for checking hand vein patterns |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003214823A (ja) * | 2002-01-22 | 2003-07-30 | Casio Comput Co Ltd | Photosensor system and drive control method thereof |
JP2005353014A (ja) * | 2004-05-14 | 2005-12-22 | Sony Corp | Imaging device |
JP2007219625A (ja) * | 2006-02-14 | 2007-08-30 | Canon Inc | Blood vessel image input device and personal authentication system |
JP2010218258A (ja) * | 2009-03-17 | 2010-09-30 | Ricoh Co Ltd | Biometric authentication device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3578321B2 (ja) | 1999-03-16 | 2004-10-20 | Victor Company of Japan Ltd | Image normalization device |
JP4039039B2 (ja) * | 2001-11-08 | 2008-01-30 | Sony Corp | Personal authentication device |
JP3770241B2 (ja) * | 2003-03-04 | 2006-04-26 | Hitachi Ltd | Personal authentication device and personal authentication method |
JP4247691B2 (ja) * | 2006-05-17 | 2009-04-02 | Sony Corp | Registration device, verification device, registration method, verification method, and program |
JP4786483B2 (ja) | 2006-09-14 | 2011-10-05 | Fujitsu Ltd | Biometric guidance control method for a biometric authentication device, and biometric authentication device |
JP4379500B2 (ja) * | 2007-07-30 | 2009-12-09 | Sony Corp | Biometric imaging device |
JP4957514B2 (ja) | 2007-11-12 | 2012-06-20 | Fujitsu Ltd | Guide device, imaging device, imaging system, and guide method |
JP5053889B2 (ja) | 2008-02-29 | 2012-10-24 | Glory Ltd | Image matching device, identity authentication device, corresponding point search device, corresponding point search method, and corresponding point search program |
JP5098973B2 (ja) * | 2008-11-27 | 2012-12-12 | Fujitsu Ltd | Biometric authentication device, biometric authentication method, and biometric authentication program |
JP5501210B2 (ja) * | 2010-12-16 | 2014-05-21 | Fujifilm Corp | Image processing device |
2012
- 2012-02-14 EP EP12746896.5A patent/EP2677490B1/en active Active
- 2012-02-14 WO PCT/JP2012/053395 patent/WO2012111664A1/ja active Application Filing
- 2012-02-14 BR BR112013020468A patent/BR112013020468A2/pt not_active IP Right Cessation
2013
- 2013-07-26 US US13/951,964 patent/US9245168B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2677490A4 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2975844A4 (en) * | 2013-03-13 | 2016-11-09 | Fujitsu Frontech Ltd | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM |
EP3217659A1 (en) * | 2013-03-13 | 2017-09-13 | Fujitsu Frontech Limited | Image processing apparatus, image processing method, and program |
US9818177B2 (en) | 2013-03-13 | 2017-11-14 | Fujitsu Frontech Limited | Image processing apparatus, image processing method, and computer-readable recording medium |
US10210601B2 (en) | 2013-03-13 | 2019-02-19 | Fujitsu Frontech Limited | Image processing apparatus, image processing method, and computer-readable recording medium |
JP2014180435A (ja) * | 2013-03-19 | 2014-09-29 | Fujitsu Ltd | Biometric information input device, biometric information input program, and biometric information input method |
EP2781185B1 (en) * | 2013-03-19 | 2021-10-20 | Fujitsu Limited | Biometric information input apparatus, biometric information input program and biometric information input method |
JP2016522487A (ja) * | 2013-05-10 | 2016-07-28 | Iris-GmbH Infrared & Intelligent Sensors | Sensor system and method for recording a hand vein pattern |
JP2016181753A (ja) * | 2015-03-23 | 2016-10-13 | Ricoh Co Ltd | Information processing device, information processing method, program, and system |
JP2018067206A (ja) * | 2016-10-20 | 2018-04-26 | Fujitsu Ltd | Imaging device |
JP2018092272A (ja) * | 2016-11-30 | 2018-06-14 | Fujitsu Ltd | Biometric authentication device, biometric authentication method, and program |
EP3330888A1 (en) | 2016-11-30 | 2018-06-06 | Fujitsu Limited | Biometric authentication apparatus, biometric authentication method, and biometric authentication program |
US10528805B2 (en) | 2016-11-30 | 2020-01-07 | Fujitsu Limited | Biometric authentication apparatus, biometric authentication method, and computer-readable storage medium |
CN107463885A (zh) * | 2017-07-19 | 2017-12-12 | Guangdong Oppo Mobile Telecommunications Corp Ltd | Biometric recognition mode control method and related products |
Also Published As
Publication number | Publication date |
---|---|
EP2677490A4 (en) | 2016-11-30 |
EP2677490A1 (en) | 2013-12-25 |
EP2677490B1 (en) | 2021-04-28 |
US9245168B2 (en) | 2016-01-26 |
BR112013020468A2 (pt) | 2016-10-18 |
US20130308834A1 (en) | 2013-11-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012111664A1 (ja) | Authentication device, authentication program, and authentication method | |
US20220165087A1 (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
US10038691B2 (en) | Authorization of a financial transaction | |
CN103460244B (zh) | Biometric authentication device, biometric authentication system, and biometric authentication method | |
US10425814B2 (en) | Control of wireless communication device capability in a mobile device with a biometric key | |
CN110326001A (zh) | System and method for performing fingerprint-based user authentication using imagery captured using a mobile device | |
JP5622928B2 (ja) | Verification device, verification program, and verification method | |
JP6751072B2 (ja) | Biometric authentication system | |
JP6160148B2 (ja) | Biometric information input device, biometric information input program, and biometric information input method | |
US9111152B2 (en) | Verification object specifying apparatus, verification object specifying program, and verification object specifying method | |
EP2610820A2 (en) | Authentication apparatus, authentication program, and method of authentication | |
JP5685272B2 (ja) | Authentication device, authentication program, and authentication method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12746896; Country of ref document: EP; Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2012557971; Country of ref document: JP; Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012746896; Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR; Ref legal event code: B01A; Ref document number: 112013020468; Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112013020468; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20130812 |