WO2013114806A1 - Biometric authentication device and biometric authentication method - Google Patents

Biometric authentication device and biometric authentication method

Info

Publication number
WO2013114806A1
Authority
WO
WIPO (PCT)
Prior art keywords
predetermined
biometric
information
living body
feature data
Prior art date
Application number
PCT/JP2013/000217
Other languages
English (en)
Japanese (ja)
Inventor
一秀 梅田
Original Assignee
九州日本電気ソフトウェア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 九州日本電気ソフトウェア株式会社
Publication of WO2013114806A1

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G06V 40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107: Static hand or arm
    • G06V 40/117: Biometrics derived from hands

Definitions

  • the present invention relates to biometric authentication technology.
  • Motion capture technology is used in various fields, such as CG (Computer Graphics) character generation for movies and games, and sports analysis.
  • Motion capture technology is also used in security fields such as personal authentication. Along with this, an authentication method using the gesture of the person to be authenticated has also been proposed.
  • In Patent Document 1, detection of a specific action (an eye blink) of a predetermined part of a living body is used as the trigger for starting a face image collation process, so that impersonation using a photograph is prevented and device operations such as entering an ID number become unnecessary.
  • Patent Document 2 proposes a method for permitting the use of a device such as a vehicle based on the results of both authentication using a vehicle key owned by the person to be authenticated and authentication using a gesture of that person.
  • the present invention has been made in view of such circumstances, and provides a technique for efficiently performing biometric authentication using a biological action.
  • the first aspect relates to a biometric authentication device.
  • The biometric authentication device includes: an information acquisition unit that sequentially acquires three-dimensional information including biometric information of a target living body; a motion detection unit that detects a predetermined motion pattern of a predetermined part of the target living body from the three-dimensional information acquired by the information acquisition unit; an extraction unit that extracts, from a biometric information storage unit storing a plurality of pieces of biometric feature data, the biometric feature data specified using the motion position information of the predetermined motion pattern detected by the motion detection unit; a feature data generation unit that generates current biometric feature data of the target living body from the three-dimensional information acquired by the information acquisition unit; and a collation unit that collates the biometric feature data extracted by the extraction unit with the current biometric feature data generated by the feature data generation unit.
  • the second aspect relates to a biometric authentication method.
  • The biometric authentication method according to the second aspect sequentially acquires three-dimensional information including biometric information of a target living body; detects a predetermined motion pattern of a predetermined part of the target living body from the acquired three-dimensional information; extracts, from a biometric information storage unit storing a plurality of pieces of biometric feature data, the biometric feature data specified using the motion position information of the detected motion pattern; generates current biometric feature data of the target living body from the acquired three-dimensional information; and collates the extracted biometric feature data with the generated current biometric feature data.
  • Another aspect relates to a biometric authentication system including the biometric authentication device according to the first aspect, a three-dimensional sensor that sends three-dimensional information to the information acquisition unit, and the biometric information storage unit.
  • Yet another aspect may be a computer program that causes a computer to realize the configuration of the first aspect, or a computer-readable recording medium storing such a program.
  • This recording medium includes a non-transitory tangible medium.
  • FIG. 1 is a diagram conceptually illustrating a hardware configuration example of the face image authentication apparatus in the first embodiment.
  • FIG. 2 is a diagram conceptually illustrating a processing configuration example of the face image authentication apparatus in the first embodiment.
  • FIG. 3 is a diagram illustrating an example of the identification data storage unit.
  • FIG. 4 is a diagram illustrating an example of the biological information storage unit.
  • FIG. 5 is a flowchart illustrating an operation example of the face image authentication apparatus according to the first embodiment.
  • FIG. 6 is a diagram illustrating a transition example of the display image in the embodiment.
  • FIG. 7 is a diagram illustrating a transition example of the display image in the embodiment.
  • FIG. 8 is a diagram illustrating a transition example of the display image in the embodiment.
  • The biometric authentication device according to the present embodiment includes: an information acquisition unit that sequentially acquires three-dimensional information including biometric information of a target living body; a motion detection unit that detects a predetermined motion pattern of a predetermined part of the target living body from the three-dimensional information acquired by the information acquisition unit; an extraction unit that extracts, from a biometric information storage unit storing a plurality of pieces of biometric feature data, the biometric feature data specified using the motion position information of the predetermined motion pattern detected by the motion detection unit; a feature data generation unit that generates current biometric feature data of the target living body from the three-dimensional information acquired by the information acquisition unit; and a collation unit that collates the biometric feature data extracted by the extraction unit with the current biometric feature data generated by the feature data generation unit.
  • The present embodiment also includes a biometric authentication method comprising the operations executed by these processing units, a program that causes a computer to realize the processing units, and a recording medium storing the program.
  • the information acquisition unit may generate 3D information by itself as a 3D sensor, or may acquire the 3D information from an external 3D sensor.
  • The biometric information included in the acquired three-dimensional information may be information on the person's whole body, on only the upper body or lower body, or on a predetermined part such as the head, a hand, or a foot.
  • a living body image of the target living body can be generated from the three-dimensional information.
  • the biometric information storage unit may be included in the biometric authentication device according to the present embodiment, or may be included in another device (such as a server device).
  • the biometric feature data and the current biometric feature data may be any information that can be acquired from the three-dimensional information, such as a face, eyes (iris), and hands (fingerprints and veins).
  • In the present embodiment, a predetermined motion pattern of a predetermined part of the target living body is detected, and one-to-one matching is performed between the biometric feature data specified in the biometric information storage unit by the motion position information of that pattern and the current biometric feature data extracted from the target living body. That is, in the present embodiment, the biometric feature data for one-to-one matching is specified using the motion position information of a predetermined motion pattern, i.e., a predetermined gesture, as a key. This can be realized by uniquely determining, for each target living body, the motion position of the predetermined motion pattern to be detected.
  • In conventional one-to-one matching, an input operation such as entering a user ID or password is required to specify the one piece of biometric feature data to be verified; in the present embodiment, such an input operation is unnecessary.
  • Moreover, authentication processing can be performed faster and more accurately than with one-to-many matching.
  • In the present embodiment, a predetermined action of the living body is effectively used to execute one-to-one matching of biometric information, so biometric authentication using an action of the living body can be executed efficiently and with high accuracy.
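  • The core idea above can be sketched in code. The following is an illustrative sketch, not part of the patent disclosure: the position at which a predetermined gesture is performed acts as the key that selects a single enrolled template, so only one-to-one matching is needed. All table entries, IDs, and position labels are hypothetical.

```python
# Hypothetical enrollment table: gesture position key -> personal ID.
GESTURE_KEY_TO_ID = {
    ("head",):  "user001",   # palm pushed out above the head
    ("face",):  "user002",   # palm pushed out in front of the face
    ("chest",): "user003",   # palm pushed out in front of the chest
}

def select_template(gesture_positions, templates):
    """Return the single enrolled template keyed by the gesture positions,
    or None if no user enrolled that gesture position (no 1:N search)."""
    personal_id = GESTURE_KEY_TO_ID.get(tuple(gesture_positions))
    if personal_id is None:
        return None
    return templates.get(personal_id)

templates = {"user002": "face-feature-vector-of-user002"}
print(select_template(["face"], templates))  # the one 1:1 candidate
```

  • Because the key is derived from the gesture itself, no user ID or password entry is needed, in contrast with conventional one-to-one matching.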
  • FIG. 1 is a diagram conceptually illustrating a hardware configuration example of a face image authentication device (hereinafter simply referred to as an authentication device) 10 according to the first embodiment.
  • the authentication device 10 according to the first embodiment is a so-called computer, and includes, for example, a CPU (Central Processing Unit) 2, a memory 3, an input / output interface (I / F) 4, and the like that are connected to each other via a bus 5.
  • the memory 3 is a RAM (Random Access Memory), a ROM (Read Only Memory), a hard disk, a portable storage medium, or the like.
  • the input / output I / F 4 is connected to the three-dimensional sensor 7, the display unit 9, and the like.
  • The display unit 9 is a display device such as a monitor. Note that this embodiment does not limit the devices connected to the input/output I/F 4; it may also be connected to an input unit that accepts user operations, such as a keyboard and a mouse, a printer, or a communication device that communicates with other computers via a network (not shown).
  • the hardware configuration of the authentication device 10 is not limited.
  • the three-dimensional sensor 7 detects three-dimensional information including the subject's biological information.
  • The detected three-dimensional information includes information on the two-dimensional image of the subject obtained with visible light and information on the distance from the three-dimensional sensor 7.
  • the axis indicating the distance from the three-dimensional sensor 7 is denoted as the Z axis.
  • the three-dimensional sensor 7 is realized by a visible light camera and a distance image sensor like Kinect (registered trademark), for example.
  • A distance image sensor, also called a depth sensor, projects a near-infrared laser light pattern onto the subject and captures it with a camera that detects near-infrared light; the distance from the sensor to the subject is calculated from the captured pattern.
  • The method for realizing the three-dimensional sensor 7 is not limited; it may be realized by a three-dimensional scanner method using a plurality of visible light cameras. Although only one three-dimensional sensor 7 is illustrated in FIG. 1, it may be composed of a plurality of devices, such as a visible light camera that captures a two-dimensional image of the subject and a sensor that detects the distance to the subject.
  • FIG. 2 is a diagram conceptually illustrating a processing configuration example of the authentication device 10 according to the first embodiment.
  • The authentication device 10 according to the first embodiment includes an information acquisition unit 11, a motion detection unit 13, an extraction unit 15, a feature data generation unit 17, a collation unit 19, an identification data storage unit 21, a biometric information storage unit 23, a display processing unit 25, and the like. Each of these processing units is realized, for example, by the CPU 2 executing a program stored in the memory 3.
  • The program may be installed from a portable recording medium such as a CD (Compact Disc) or a memory card, or from another computer on a network via the input/output I/F 4, and stored in the memory 3.
  • the information acquisition unit 11 sequentially acquires three-dimensional information including biological information of the subject (target biological body) from the three-dimensional sensor 7.
  • As long as the three-dimensional information includes the biometric information of the subject and the information on the distance from the three-dimensional sensor 7, its form of realization is not limited.
  • The acquisition rate of the three-dimensional information is arbitrary, but it determines the allowable speed of the subject's motion: the higher the acquisition rate, the faster the subject is allowed to move.
  • The motion detection unit 13 detects a predetermined motion pattern of a predetermined part of the subject from the three-dimensional information acquired by the information acquisition unit 11. Specifically, the motion detection unit 13 recognizes the predetermined part of the subject from the three-dimensional information, and detects, as the predetermined motion pattern, a movement of that part in a predetermined direction along a predetermined dimension of the three-dimensional space indicated by the sequentially acquired three-dimensional information.
  • Since a general image recognition (pattern recognition) technique may be used to recognize the predetermined part, its description is simplified here.
  • a technique for recognizing a human body, face, eyes, mouth and the like from an image is well known.
  • the motion detection unit 13 recognizes a predetermined part of the subject from the three-dimensional information of the subject using such a well-known method.
  • The motion detection unit 13 may recognize a predetermined reference part of the subject based on the biometric image of the subject included in the three-dimensional information, and recognize the predetermined part based on its positional relationship with the reference part. For example, the motion detection unit 13 first recognizes the face as the predetermined reference part, then recognizes regions closer to the three-dimensional sensor 7 on the Z axis than the face as the subject's hands, and recognizes the center of each palm as the predetermined part.
  • the predetermined reference portion and the predetermined portion are not limited as long as the predetermined portion of the subject can be recognized.
  • As another example, both feet may be recognized from the biometric image, and the parts of the feet that the distance information shows to be closest to the three-dimensional sensor 7 may be recognized as the subject's toes.
  • A general motion capture technique may be used to detect the movement as the predetermined motion pattern. Since motion capture techniques are well known, their description is omitted here.
  • For example, the motion detection unit 13 detects that the subject's palm has moved a predetermined distance (for example, 20 centimeters (cm) in the real world) along the Z axis of the three-dimensional information, in the direction approaching the three-dimensional sensor 7. The predetermined motion pattern in this example is thus a motion in which the subject pushes the palm forward (toward the three-dimensional sensor 7) from a certain position.
  • The motion detection unit 13 detects this motion as the predetermined motion pattern regardless of where on the plane coordinates orthogonal to the Z axis the palm is pushed out. For example, the motion is detected whether the palm is pushed out above the head, in front of the face, or in front of the chest.
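  • The palm-push detection described above can be sketched as follows. This is an illustrative assumption-laden sketch, not the patent's implementation: it tracks the palm's distance (Z) across sequential frames and reports a push once the palm has approached the sensor by the predetermined distance (here 0.20 m, matching the 20 cm example).

```python
def detect_push(z_samples, threshold_m=0.20):
    """Detect a palm 'push' toward the sensor: the palm's distance (Z)
    decreases by at least `threshold_m` from a recent local maximum.
    `z_samples` are sequential palm-centre distances in metres."""
    start_z = None
    for z in z_samples:
        if start_z is None or z > start_z:
            start_z = z            # track the farthest recent palm position
        elif start_z - z >= threshold_m:
            return True            # moved >= threshold toward the sensor
    return False

# Palm starts 0.85 m away and ends 0.60 m away: a 0.25 m push.
print(detect_push([0.85, 0.80, 0.72, 0.66, 0.60]))  # True
```

  • Note that only the Z displacement matters here; the XY position where the push happens is recorded separately as motion position information.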
  • the motion detection unit 13 may set a detection space for such a predetermined motion pattern.
  • The motion detection unit 13 sets a detection space of a predetermined size at a predetermined position in the three-dimensional space indicated by the three-dimensional information acquired by the information acquisition unit 11, excludes motions of the predetermined part of the subject outside the detection space, and detects the predetermined motion pattern of the predetermined part of the subject within the detection space.
  • To set the detection space, the motion detection unit 13 acquires coordinate information indicating the space. Note that the position and size of the detection space are not limited as long as the predetermined part of the subject can reach it.
  • the motion detection unit 13 sets only the motion of the subject in the detection space set in this way as the detection target of the predetermined motion pattern.
  • The motion detection unit 13 may detect a predetermined part of the subject from the three-dimensional information and set the detection space at a position having a predetermined positional relationship with the detected part or with another predetermined part of the subject. For example, when the predetermined part is the center of the palm, the motion detection unit 13 sets, at approximately the same height as the face recognized as the other predetermined part, a spatial region positioned within arm's reach from the face in the direction of the three-dimensional sensor 7 and sized so that both hands can reach it.
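  • A face-relative detection space of this kind might be computed as below. This is a minimal sketch under stated assumptions: the box dimensions (arm's reach, width, height) are illustrative values chosen here, not figures from the patent.

```python
def detection_space_from_face(face_x, face_y, face_z,
                              reach_m=0.6, half_width_m=0.8, half_height_m=0.3):
    """Hypothetical detection space: an axis-aligned box in front of the
    face, roughly at face height, within arm's reach toward the sensor.
    All dimensions are illustrative assumptions."""
    return {
        "x": (face_x - half_width_m, face_x + half_width_m),
        "y": (face_y - half_height_m, face_y + half_height_m),
        "z": (face_z - reach_m, face_z),  # closer to the sensor than the face
    }

def in_detection_space(point, box):
    """Motions outside this box are excluded from pattern detection."""
    x, y, z = point
    return (box["x"][0] <= x <= box["x"][1]
            and box["y"][0] <= y <= box["y"][1]
            and box["z"][0] <= z <= box["z"][1])

box = detection_space_from_face(0.0, 1.5, 2.0)
print(in_detection_space((0.1, 1.5, 1.6), box))  # palm in front of the face
```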
  • the predetermined operation pattern may be one predetermined operation or a combination of two or more predetermined operations.
  • the combination of two or more predetermined movements may be a combination of a plurality of different movement forms, a plurality of the same movement forms, or a combination thereof.
  • For example, the predetermined motion pattern may be the palm-pushing motion performed a predetermined number of times, or a combination such as pushing the palm forward twice and pulling the palm back three times.
  • When the motion detection unit 13 detects a predetermined motion pattern in the detection space, it holds the motion position information of the detected pattern.
  • The motion position information indicates the position at which each predetermined motion included in the predetermined motion pattern was performed; when the pattern consists of a plurality of motions, it includes the position information of each motion. For example, when the predetermined motion pattern is a motion of pushing the palm out three times in the Z-axis direction, the motion position information indicates the three positions on the plane coordinates orthogonal to the Z axis at which the motion was performed.
  • FIG. 3 is a diagram illustrating an example of the identification data storage unit 21.
  • the identification data storage unit 21 stores the correspondence between the motion position information of the predetermined motion pattern and each biometric identification data.
  • In the example of FIG. 3, the biometric identification data is written as a personal ID (Identification), and first through fifth position information entries are stored for each personal ID.
  • The motion position information of the predetermined motion pattern may indicate one position, or a plurality of positions (five or fewer in this example).
  • FIG. 4 is a diagram illustrating an example of the biological information storage unit 23.
  • the biometric information storage unit 23 stores a plurality of biometric feature data and biometric identification data associated with each biometric feature data.
  • the biometric identification data is written as a personal ID.
  • Each piece of biometric feature data stored in the biometric information storage unit 23 is biometric information legitimately acquired from the corresponding individual in advance in order to establish that individual's legitimacy.
  • the biometric feature data is, for example, data indicating facial features acquired from each individual's facial image.
  • The method for extracting biometric feature data from each individual and the structure of the biometric feature data itself may be any well-known ones used in existing biometric authentication techniques, so their description is omitted here.
  • The extraction unit 15 acquires, from the identification data storage unit 21, the biometric identification data corresponding to the motion position information of the predetermined motion pattern detected by the motion detection unit 13, and extracts the biometric feature data associated with the acquired biometric identification data from the biometric information storage unit 23.
  • In the examples of FIG. 3 and FIG. 4, the extraction unit 15 extracts, from the identification data storage unit 21, the personal ID corresponding to the motion position information (the plurality of position information entries) of the predetermined motion pattern, and extracts the biometric feature data associated with that personal ID from the biometric information storage unit 23.
  • The extraction unit 15 may identify, in the identification data storage unit 21, the records whose position information falls within a predetermined allowable range of the positions indicated by the motion position information, and extract the personal ID from those records.
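  • A tolerance-based lookup of this kind might look like the following sketch. The table contents, tolerance value, and coordinate convention (XY positions on the plane orthogonal to the Z axis, in metres) are illustrative assumptions, not figures from the patent.

```python
def lookup_personal_id(observed_positions, id_table, tolerance_m=0.15):
    """Find the personal ID whose registered gesture positions are all
    within `tolerance_m` of the observed ones (order-sensitive).
    `id_table` maps personal ID -> list of registered (x, y) positions."""
    def close(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= tolerance_m

    for personal_id, registered in id_table.items():
        if len(registered) == len(observed_positions) and all(
                close(r, o) for r, o in zip(registered, observed_positions)):
            return personal_id
    return None

table = {
    "user001": [(0.0, 1.8), (0.3, 1.5), (-0.3, 1.5)],  # three-push pattern
    "user002": [(0.0, 1.2)],                           # single push at chest
}
print(lookup_personal_id([(0.02, 1.79), (0.28, 1.52), (-0.31, 1.48)], table))
```

  • The tolerance absorbs the natural variation in where the subject performs each push, as the allowable-range matching above describes.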
  • the feature data generation unit 17 generates biometric feature data of the target living body from the three-dimensional information acquired by the information acquisition unit 11.
  • The biometric feature data generated by the feature data generation unit 17 is referred to as current biometric feature data to distinguish it from the biometric feature data stored in the biometric information storage unit 23.
  • the feature data generation unit 17 generates current biological feature data of the target living body from a two-dimensional biological image included in the three-dimensional information.
  • the biometric feature data generation method itself by the feature data generation unit 17 may be a well-known method used in a well-known biometric authentication technique, and thus detailed description thereof is omitted here.
  • The collation unit 19 collates the biometric feature data extracted from the biometric information storage unit 23 with the current biometric feature data of the target living body generated by the feature data generation unit 17. Since a well-known biometric authentication method may be used for collation between biometric feature data, its description is omitted here.
  • the collation unit 19 generates output data indicating the collation result, and outputs the collation result to the display unit 9 and other output devices via the input / output I / F 4.
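  • As a stand-in for the well-known matcher left unspecified above, a one-to-one collation step could be sketched as a similarity comparison between feature vectors. The cosine-similarity measure and threshold below are illustrative assumptions; real matchers use matcher-specific scores.

```python
def collate(enrolled, current, threshold=0.9):
    """One-to-one collation sketch: cosine similarity between an enrolled
    feature vector and the current feature vector, accepted above a
    threshold. Purely illustrative, not the patent's matcher."""
    dot = sum(a * b for a, b in zip(enrolled, current))
    norm = (sum(a * a for a in enrolled) ** 0.5) * \
           (sum(b * b for b in current) ** 0.5)
    score = dot / norm if norm else 0.0
    return score >= threshold, score

# Nearly identical vectors collate successfully.
ok, score = collate([0.2, 0.8, 0.1], [0.21, 0.79, 0.12])
print(ok)  # True
```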
  • the display processing unit 25 causes the display unit 9 to display a predetermined display image by sending screen data via the input / output I / F 4.
  • The display processing unit 25 causes the display unit 9 to display a display image including a biometric image of the target living body generated from the three-dimensional information acquired by the information acquisition unit 11, or a living body replicated image that replicates it, together with a guide image that guides the subject to perform the predetermined motion pattern. In the display image, at least a part of the guide image is arranged in a region corresponding to the detection space set by the motion detection unit 13.
  • The living body replicated image is, for example, an animated image modeled on the subject.
  • the display processing unit 25 may display the guide image in the display image at a predetermined timing.
  • The predetermined timing may be, for example, the timing at which the motion detection unit 13 detects a predetermined part of the subject in the three-dimensional space indicated by the three-dimensional information acquired by the information acquisition unit 11, or the timing at which the motion detection unit 13 detects a predetermined pose of the subject that is set as the trigger for starting detection of the predetermined motion pattern. Further, when the motion detection unit 13 detects the predetermined motion pattern, the display processing unit 25 may change the guide image into a shape corresponding to that pattern.
  • As the guide image, a sentence explaining the predetermined motion pattern may be displayed, or an image of a user interface element that the subject can virtually operate in the display image, such as a button, a dial, or a screw, may be displayed.
  • For example, the display processing unit 25 may display a guide image in which a plurality of buttons are arranged in the display image.
  • The display processing unit 25 may change the color of at least one of the plurality of buttons, or change it to a pressed state, in response to detection of a predetermined motion pattern corresponding to a movement of the subject's biometric image or of the living body replicated image reflecting the subject's motion.
  • FIG. 5 is a flowchart illustrating an operation example of the authentication device 10 according to the first embodiment.
  • the authentication device 10 sequentially acquires three-dimensional information including the subject's biological information (S51).
  • This three-dimensional information may be acquired from the information detected by the three-dimensional sensor 7 as described above, or may be acquired from another device.
  • The authentication device 10 detects a predetermined motion pattern of a predetermined part of the subject based on the sequentially acquired three-dimensional information of the subject (S52). At this time, the authentication device 10 may set a detection space of a predetermined size at a predetermined position in the three-dimensional space indicated by the three-dimensional information and use this detection space as the detection region of the predetermined motion pattern. Further, the authentication device 10 may cause the display unit 9 to display a display image including a biometric image of the subject, or a living body replicated image that replicates it, together with a guide image that guides the subject to perform the predetermined motion pattern. In this case, the authentication device 10 may deform the guide image in the display image in response to detection of the subject's predetermined motion pattern.
  • After detecting the predetermined motion pattern, the authentication device 10 extracts the biometric identification data corresponding to the detected motion position information from the identification data storage unit 21 (S53).
  • When the predetermined motion pattern consists of a plurality of motions, the biometric identification data corresponding to the combination of the positions at which each motion was performed is extracted.
  • the authentication device 10 extracts biometric feature data corresponding to the biometric identification data extracted in (S53) from the biometric information storage unit 23 that stores biometric feature data of a plurality of persons (S54).
  • the authentication device 10 generates current biological feature data of the subject based on the three-dimensional information of the subject acquired sequentially (S55).
  • The authentication device 10 collates the biometric feature data extracted from the biometric information storage unit 23 in (S54) with the current biometric feature data generated in (S55) (S56), and outputs the collation result (S57).
  • (S55) may be executed before (S53), or may be executed between (S53) and (S54).
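  • The overall flow (S51) through (S57) can be sketched as a pipeline. This is an illustrative composition of the steps, with every processing unit injected as a hypothetical callable rather than the patent's actual implementation.

```python
def authenticate(frames, detect_pattern, extract_id, fetch_template,
                 make_features, match):
    """Sketch of the flow S51-S57: the acquired 3D frames (S51) are scanned
    for the gesture (S52), its positions resolve a personal ID (S53), the
    one enrolled template is fetched (S54), current features are built
    (S55), and a one-to-one match is run (S56) to produce a result (S57).
    All callables are injected stand-ins for the processing units."""
    positions = detect_pattern(frames)       # S52: gesture + its positions
    if positions is None:
        return "no gesture detected"
    personal_id = extract_id(positions)      # S53: positions -> personal ID
    if personal_id is None:
        return "unknown gesture positions"
    template = fetch_template(personal_id)   # S54: enrolled feature data
    current = make_features(frames)          # S55: current feature data
    return "match" if match(template, current) else "no match"  # S56/S57

result = authenticate(
    frames=["frame0", "frame1"],
    detect_pattern=lambda f: [(0.0, 1.5)],
    extract_id=lambda p: "user001",
    fetch_template=lambda pid: "tmpl",
    make_features=lambda f: "tmpl",
    match=lambda a, b: a == b,
)
print(result)  # match
```

  • As noted above, the feature-generation step (S55) is independent of (S53) and (S54) and may be reordered freely among them.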
  • As described above, in the first embodiment, the predetermined motion pattern (predetermined gesture) of the subject is detected from the three-dimensional information sequentially obtained by the three-dimensional sensor 7 or the like; the biometric identification data corresponding to the information on the positions at which the pattern was performed (the motion position information) is specified; and the biometric feature data corresponding to that biometric identification data is extracted. One-to-one matching is then performed between this biometric feature data and the current biometric feature data reflecting the subject's current biometric information.
  • Thus, one-to-one collation of biometric information can be performed in response to a predetermined gesture of the subject, without requiring any input operation by the subject. That is, according to the first embodiment, biometric authentication using an action of the living body can be executed efficiently and with high accuracy.
  • In the first embodiment, a guide image that prompts the subject to perform the predetermined motion pattern is displayed, and the guide image is deformed according to that pattern; if the subject follows the guide, the predetermined motion pattern (a pushing motion, a pulling motion, and so on) can be executed naturally.
  • Further, by using the detection space, detection of the predetermined motion pattern can be prevented from being hindered by other parts of the subject, by other people, or by objects in the three-dimensional space indicated by the three-dimensional information. That is, the subject can cause the face image authentication device 10 to perform face authentication with a natural motion.
  • In the first embodiment, biometric identification data (the personal ID in FIGS. 3 and 4) is used, but the biometric identification data need not be used.
  • the authentication device 10 need not have the identification data storage unit 21; instead, the biometric information storage unit 23 may store the motion position information otherwise held in the identification data storage unit 21 in association with each piece of biometric feature data.
  • in that case, (S53) and (S54) shown in FIG. 5 are replaced with processing for extracting the biometric feature data corresponding to the detected motion position information of the predetermined motion pattern.
  • the combination of the guide image and the predetermined motion pattern may be switched arbitrarily. Whether the predetermined motion pattern is a motion of the subject pushing the palm forward or a motion of pulling the palm back, the two motions yield the same motion position information if each is performed at the same position. Therefore, as long as a combination of guide image and predetermined motion pattern is used that leaves the motion position information for a given subject unchanged, collation is possible as in the first embodiment even if the combination is switched arbitrarily.
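The position-invariance argument above can be illustrated with a small sketch: reducing a gesture trajectory to its (x, y) position discards the Z-axis direction that distinguishes a push from a pull. The event layout and averaging scheme are assumptions for illustration:

```python
# Different gesture patterns performed at the same (x, y) position reduce to
# the same motion position information used for the template lookup.

def motion_position_info(gesture_events):
    """Average the (x, y) coordinates of a trajectory; z (push vs. pull) is ignored."""
    xs = [e["x"] for e in gesture_events]
    ys = [e["y"] for e in gesture_events]
    return (round(sum(xs) / len(xs)), round(sum(ys) / len(ys)))
```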
  • the authentication device 10 may further include a pattern switching unit that instructs the motion detection unit 13 and the display processing unit 25 to switch the guide image and the predetermined motion pattern at an arbitrary timing. This makes it difficult for a third party to learn the predetermined motion pattern used for authentication.
  • in the embodiment above, the authentication device 10 includes the identification data storage unit 21 and the biometric information storage unit 23. However, these storage units may instead be provided in another device, with the authentication device 10 accessing the identification data storage unit 21 and the biometric information storage unit 23 in that other device.
  • the verification result is used for door lock authentication.
  • a display image as shown in the examples of FIGS. 6, 7, and 8 is displayed on the display unit 9.
  • FIGS. 6, 7, and 8 are diagrams illustrating examples of display image transitions in the embodiment.
  • an image of a three-dimensional space generated by sequentially acquired three-dimensional information is displayed on the display unit 9.
  • the three-dimensional sensor 7 captures an image of the subject and the surrounding space and detects distance information from the three-dimensional sensor 7; the three-dimensional information including the image and the distance information is obtained by the information acquisition unit 11. The display processing unit 25 causes the display unit 9 to display the video generated from this three-dimensional information.
  • upon entering the detection space shown in the display image, the subject takes a posture (hereinafter referred to as a start gesture) in which the palms of both hands are put together and held still in front of the face.
  • the motion detection unit 13 detects the face, which is easy to recognize in an image, as a predetermined reference part in the display image, and recognizes regions closer to the three-dimensional sensor 7 on the Z axis than the face as the two hands.
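The face-as-reference heuristic above can be sketched as follows, assuming each detected point carries a z distance from the 3D sensor (smaller z = nearer). The field names and point representation are assumptions:

```python
# The face depth serves as a reference plane: points nearer to the sensor
# than the face are treated as hand candidates.

def find_hand_points(points, face_z):
    """Return the points closer to the sensor (smaller z) than the face."""
    return [p for p in points if p["z"] < face_z]
```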
  • the motion detection unit 13 then recognizes the center of each palm as a predetermined part from the shape of the hands put together.
  • the display processing unit 25 displays the transparent keyboard 31 as a guidance image in front of the subject in the display image.
  • the transparent keyboard 31 is arranged on a two-dimensional plane orthogonal to the Z axis in the three-dimensional space indicated by the three-dimensional information.
  • the motion detection unit 13 recognizes the position information in the two-dimensional plane as the position information of each button of the transparent keyboard 31.
  • the predetermined motion pattern in the present embodiment is a motion in which the subject's palm moves by a predetermined width along the Z axis in the direction approaching the three-dimensional sensor 7.
  • while viewing the video shown on the display unit 9, the subject moves so as to press a button of the transparent keyboard 31 (guide image) displayed in front of the subject image in the display image. Since this motion matches the predetermined motion pattern of the present embodiment, the motion detection unit 13 detects it as the predetermined motion pattern.
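A hedged sketch of the press detection described above: the palm approaching the sensor by at least a predetermined width along the Z axis counts as a virtual button press. The trajectory representation and threshold value are illustrative assumptions:

```python
# Detect a press from a sequence of palm z-distances (z decreases as the
# palm approaches the sensor): a press occurs when the palm has advanced
# by at least press_width from its farthest observed position.

def detect_press(z_trajectory, press_width=50.0):
    """True if the palm approached the sensor by press_width or more."""
    max_seen = float("-inf")
    for z in z_trajectory:
        max_seen = max(max_seen, z)
        if max_seen - z >= press_width:
            return True
    return False
```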
  • the display processing unit 25 changes the color of the virtually operated button in accordance with the predetermined motion pattern, as shown in FIG. 8.
  • FIG. 8 shows a state where the button indicating “1” on the transparent keyboard 31 is pressed.
  • each time the predetermined motion pattern, i.e., a virtual button press, is detected, the motion detection unit 13 holds the position information of the virtually operated button as motion position information.
  • the operation detection unit 13 continues to detect the predetermined operation pattern until an operation of virtually pressing the “OK” button of the transparent keyboard 31 is detected.
  • the operation position information of the predetermined operation pattern indicates a plurality of positions.
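The accumulation loop above can be sketched as follows: button positions are held in press order until the "OK" button is pressed, at which point the accumulated motion position information (a plurality of positions) is returned. The representation of button positions is an assumption:

```python
# Hold each virtually pressed button position until "OK" is pressed,
# then return the accumulated motion position information.

def collect_button_presses(press_positions, ok_position):
    """press_positions: iterable of button positions in press order."""
    held = []
    for pos in press_positions:
        if pos == ok_position:
            return held
        held.append(pos)
    return held  # "OK" never pressed: return what was held so far
```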
  • the extraction unit 15 extracts the biometric feature data corresponding to the motion position information of the predetermined motion pattern from the biometric information storage unit 23, and the collation unit 19 performs one-to-one matching between that biometric feature data and the current biometric feature data extracted from the video illustrated in FIGS. 6, 7, and 8.
  • the display processing unit 25 displays this collation result in the display image as “OK”, “NG”, “○”, “×”, or the like.
  • the collation unit 19 in this embodiment sends the collation result to the door lock control unit. Thereby, when the collation result indicates success, the door lock is released by the control unit, and when the collation result indicates failure, the door lock remains locked.
  • in the above, the motion detection unit 13 recognizes the position information in the two-dimensional plane as the position information of each button of the transparent keyboard 31. Instead, the motion detection unit 13 may recognize the character information associated with each button (the numbers displayed on the buttons of the transparent keyboard 31), and this character information may be used as biometric identification data.
  • the motion detection unit 13 identifies a number associated with the button corresponding to the motion position of the predetermined motion pattern, and extracts biometric feature data corresponding to this number from the biometric information storage unit 23.
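The modification above can be sketched as follows: the characters associated with the pressed buttons are concatenated and used directly as biometric identification data. The keyboard layout mapping is hypothetical:

```python
# Map pressed button positions to their displayed characters and use the
# concatenated string as the biometric identification data.

BUTTON_CHARS = {(0, 0): "1", (1, 0): "2", (2, 0): "3"}  # illustrative layout

def biometric_id_from_presses(pressed_positions):
    """Concatenate the characters of the pressed buttons into an ID string."""
    return "".join(BUTTON_CHARS[p] for p in pressed_positions)
```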
  • An information acquisition unit for sequentially acquiring three-dimensional information including biological information of the target biological body;
  • An operation detection unit for detecting a predetermined operation pattern of a predetermined part of the target living body from the three-dimensional information acquired by the information acquisition unit;
  • An extraction unit that extracts biometric feature data specified by using the motion position information of the predetermined motion pattern detected by the motion detection unit from a biometric information storage unit that stores a plurality of biometric feature data;
  • a feature data generation unit that generates current biological feature data of the target living body from the three-dimensional information acquired by the information acquisition unit;
  • a collation unit for collating the biometric feature data extracted by the extraction unit with the current biometric feature data of the target living body generated by the feature data generation unit;
  • a biometric authentication device comprising the units described above (Appendix 1).
  • the motion detection unit sets a detection space of a predetermined size at a predetermined position in the three-dimensional space indicated by the three-dimensional information, excludes motions of the predetermined part of the target living body outside the detection space, and detects the predetermined motion pattern of the predetermined part of the target living body within the detection space. The biometric authentication device according to Appendix 1.
  • (Appendix 3) The biometric authentication device according to Appendix 2, further comprising a display processing unit that causes the display unit to display a display image including a living body image of the target living body generated from the three-dimensional information (or a replicated image of that living body image) and a guide image that guides the target living body to perform the predetermined motion pattern, with at least a part of the guide image arranged in a space corresponding to the detection space in the display image.
  • (Appendix 4) The biometric authentication device according to Appendix 3, wherein the display processing unit changes the guide image when the predetermined motion pattern is detected by the motion detection unit.
  • the predetermined motion pattern is a movement of the predetermined part of the target living body by a predetermined width in a predetermined direction along a predetermined dimension in the three-dimensional space indicated by the three-dimensional information,
  • the motion position information of the predetermined motion pattern used to specify the biometric feature data indicates the position, in the two-dimensional plane excluding the predetermined dimension, at which the movement was performed,
  • the biometric information storage unit stores biometric identification data associated with each of the plurality of pieces of biometric feature data,
  • the extraction unit includes an identification data storage unit that stores the correspondence between the motion position information of the predetermined motion pattern and each piece of biometric identification data,
  • the extraction unit acquires from the identification data storage unit the biometric identification data corresponding to the motion position information of the predetermined motion pattern detected by the motion detection unit, and extracts the biometric feature data associated with the acquired biometric identification data from the biometric information storage unit,
  • the biometric authentication device according to any one of Appendices 1 to 4.
  • the predetermined part of the target living body is a part of the left hand or the right hand of the target living body,
  • the guide image is an image in which a plurality of buttons are arranged, and the position information in the two-dimensional plane used to acquire the biometric identification data corresponds to the position of at least one of the plurality of buttons in the display image,
  • the display processing unit changes at least one of the plurality of buttons in response to detection of the predetermined operation pattern.
  • (Appendix 8) A biometric authentication method comprising: sequentially acquiring three-dimensional information including biometric information of a target living body; detecting, from the acquired three-dimensional information, a predetermined motion pattern of a predetermined part of the target living body; extracting, from a biometric information storage unit that stores a plurality of pieces of biometric feature data, the biometric feature data specified using the motion position information of the detected predetermined motion pattern; generating current biometric feature data of the target living body from the acquired three-dimensional information; and collating the extracted biometric feature data with the generated current biometric feature data of the target living body.
  • the biometric authentication method according to Appendix 8, further comprising: setting a detection space of a predetermined size at a predetermined position in the three-dimensional space indicated by the three-dimensional information; excluding motions of the predetermined part of the target living body outside the detection space; and detecting the predetermined motion pattern of the predetermined part of the target living body within the detection space.
  • the predetermined motion pattern is a movement of the predetermined part of the target living body by a predetermined width in a predetermined direction along a predetermined dimension in the three-dimensional space indicated by the three-dimensional information,
  • the motion position information of the predetermined motion pattern used to specify the biometric feature data indicates the position, in the two-dimensional plane excluding the predetermined dimension, at which the movement was performed,
  • the biometric information storage unit stores biometric identification data associated with each of the plurality of pieces of biometric feature data,
  • the extracting of the biometric feature data includes acquiring, from an identification data storage unit that stores the correspondence between the motion position information of the predetermined motion pattern and each piece of biometric identification data, the biometric identification data corresponding to the detected motion position information of the predetermined motion pattern, and extracting the biometric feature data associated with the acquired biometric identification data from the biometric information storage unit,
  • the biometric authentication method according to any one of Appendices 8 to 11.
  • the predetermined part of the target living body is a part of the left hand or the right hand of the target living body,
  • the guide image is an image in which a plurality of buttons are arranged, and the position information in the two-dimensional plane used to acquire the biometric identification data corresponds to the position of at least one of the plurality of buttons in the display image,
  • the display image is displayed with at least one of the plurality of buttons changed in response to detection of the predetermined motion pattern.
  • (Appendix 15) The program according to Appendix 14, further causing the at least one computer to: set a detection space of a predetermined size at a predetermined position in the three-dimensional space indicated by the three-dimensional information; exclude motions of the predetermined part of the target living body outside the detection space; and detect the predetermined motion pattern of the predetermined part of the target living body within the detection space.
  • (Appendix 16) The program according to Appendix 15, further causing the at least one computer to display on the display unit a display image that includes a living body image of the target living body generated from the three-dimensional information (or a replicated image of that living body image) and a guide image that guides the target living body to perform the predetermined motion pattern, with at least a part of the guide image arranged in a space corresponding to the detection space in the display image.
  • (Appendix 17) The program according to Appendix 16, further causing the at least one computer to change the guide image when the predetermined motion pattern is detected.
  • the predetermined motion pattern is a movement of the predetermined part of the target living body by a predetermined width in a predetermined direction along a predetermined dimension in the three-dimensional space indicated by the three-dimensional information,
  • the motion position information of the predetermined motion pattern used to specify the biometric feature data indicates the position, in the two-dimensional plane excluding the predetermined dimension, at which the movement was performed,
  • the biometric information storage unit stores biometric identification data associated with each of the plurality of pieces of biometric feature data,
  • the extracting of the biometric feature data includes acquiring, from an identification data storage unit that stores the correspondence between the motion position information of the predetermined motion pattern and each piece of biometric identification data, the biometric identification data corresponding to the detected motion position information, and extracting the biometric feature data associated with the acquired biometric identification data from the biometric information storage unit. The program according to any one of Appendices 14 to 17.
  • the predetermined part of the target living body is a part of the left hand or the right hand of the target living body,
  • the guide image is an image in which a plurality of buttons are arranged, and the position information in the two-dimensional plane used to acquire the biometric identification data corresponds to the position of at least one of the plurality of buttons in the display image,
  • the display image is displayed with at least one of the plurality of buttons changed in response to detection of the predetermined motion pattern.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Input (AREA)
  • Image Analysis (AREA)

Abstract

This biometric authentication device comprises: an acquisition unit that sequentially acquires three-dimensional information including biometric information of a living body to be authenticated; a detection unit that detects a predetermined motion pattern of a predetermined part of the living body on the basis of the three-dimensional information acquired by the acquisition unit; an extraction unit that extracts, from a storage unit storing a plurality of pieces of biometric feature data, the biometric feature data identified using motion position information of the predetermined motion pattern detected by the detection unit; a generation unit that generates current biometric feature data of the living body to be authenticated on the basis of the three-dimensional information acquired by the acquisition unit; and a matching unit that matches the biometric feature data extracted by the extraction unit against the current biometric feature data, generated by the generation unit, of the living body to be authenticated.
PCT/JP2013/000217 2012-01-30 2013-01-18 Biometric authentication device and biometric authentication method WO2013114806A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-016933 2012-01-30
JP2012016933 2012-01-30

Publications (1)

Publication Number Publication Date
WO2013114806A1 true WO2013114806A1 (fr) 2013-08-08

Family

ID=48904858

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/000217 WO2013114806A1 (fr) 2012-01-30 2013-01-18 Biometric authentication device and biometric authentication method

Country Status (2)

Country Link
JP (1) JPWO2013114806A1 (fr)
WO (1) WO2013114806A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016115199A (ja) * 2014-12-16 2016-06-23 Kagoshima University Authentication processing device and authentication processing method
CN106236061A (zh) * 2015-06-04 2016-12-21 Panasonic Intellectual Property Management Co., Ltd. Human body detection device
WO2020022014A1 (fr) * 2018-07-25 2020-01-30 NEC Corporation Information processing device, method, and program
WO2020022034A1 (fr) * 2018-07-25 2020-01-30 NEC Corporation Information processing device, information processing method, and information processing program
JP2020115361A (ja) * 2015-09-03 2020-07-30 NEC Corporation Authentication device, security system, control method for authentication device, and program
US10974537B2 (en) 2019-08-27 2021-04-13 Advanced New Technologies Co., Ltd. Method and apparatus for certificate identification
US11003957B2 (en) 2019-08-21 2021-05-11 Advanced New Technologies Co., Ltd. Method and apparatus for certificate identification
JP2021168142A (ja) * 2020-03-23 2021-10-21 NEC Corporation Information processing device, security system, information processing method, and program
US11227038B2 (en) 2016-10-13 2022-01-18 Advanced New Technologies Co., Ltd. User identity authentication using virtual reality
JP2022058211A (ja) * 2020-09-30 2022-04-11 Dai Nippon Printing Co., Ltd. Personal authentication system, server, server program, transaction device, and device program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NO347923B1 (en) * 2017-09-15 2024-05-13 Elliptic Laboratories Asa User Authentication Control

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004078316A (ja) * 2002-08-09 2004-03-11 Honda Motor Co Ltd Posture recognition device and autonomous robot
JP2005292994A (ja) * 2004-03-31 2005-10-20 Toshiba Corp Person recognition device and passage control device
JP2009151424A (ja) * 2007-12-19 2009-07-09 Sony Ericsson Mobilecommunications Japan Inc Information processing device, information processing method, information processing program, and mobile terminal device
JP2010541398A (ja) * 2007-09-24 2010-12-24 GestureTek, Inc. Enhanced interface for voice and video communications


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016115199A (ja) * 2014-12-16 2016-06-23 Kagoshima University Authentication processing device and authentication processing method
US11030739B2 (en) 2015-06-04 2021-06-08 Panasonic Intellectual Property Management Co., Ltd. Human detection device equipped with light source projecting at least one dot onto living body
CN106236061A (zh) * 2015-06-04 2016-12-21 Panasonic Intellectual Property Management Co., Ltd. Human body detection device
JP2017000742A (ja) * 2015-06-04 2017-01-05 Panasonic Intellectual Property Management Co., Ltd. Human body detection device
CN106236061B (zh) * 2015-06-04 2021-09-21 Panasonic Intellectual Property Management Co., Ltd. Human body detection device
JP2020115361A (ja) 2015-09-03 2020-07-30 NEC Corporation Authentication device, security system, control method for authentication device, and program
US11227038B2 (en) 2016-10-13 2022-01-18 Advanced New Technologies Co., Ltd. User identity authentication using virtual reality
CN112424791A (zh) 2018-07-25 2021-02-26 NEC Corporation Information processing device, information processing method, and information processing program
JP7363785B2 (ja) 2018-07-25 2023-10-18 NEC Corporation Information processing device, information processing method, and information processing program
CN112424791B (zh) 2018-07-25 2024-03-19 NEC Corporation Information processing device, information processing method, and information processing program
JPWO2020022034A1 (ja) 2018-07-25 2021-08-05 NEC Corporation Information processing device, information processing method, and information processing program
JPWO2020022014A1 (ja) 2018-07-25 2021-08-12 NEC Corporation Information processing device, information processing method, and information processing program
WO2020022034A1 (fr) * 2018-07-25 2020-01-30 NEC Corporation Information processing device, information processing method, and information processing program
JP7363786B2 (ja) 2018-07-25 2023-10-18 NEC Corporation Information processing device, information processing method, and information processing program
WO2020022014A1 (fr) * 2018-07-25 2020-01-30 NEC Corporation Information processing device, method, and program
US11670111B2 (en) 2018-07-25 2023-06-06 Nec Corporation Information processing apparatus, information processing method, and information processing program
US11600118B2 (en) 2018-07-25 2023-03-07 Nec Corporation Information processing apparatus, information processing method, and information processing program
US11003957B2 (en) 2019-08-21 2021-05-11 Advanced New Technologies Co., Ltd. Method and apparatus for certificate identification
US10974537B2 (en) 2019-08-27 2021-04-13 Advanced New Technologies Co., Ltd. Method and apparatus for certificate identification
JP7151830B2 (ja) 2020-03-23 2022-10-12 NEC Corporation Information processing device, security system, information processing method, and program
JP2021168142A (ja) * 2020-03-23 2021-10-21 NEC Corporation Information processing device, security system, information processing method, and program
JP7484040B2 (ja) 2020-03-23 2024-05-16 NEC Corporation Information processing method, security system, information processing device, and program
JP7226496B2 (ja) 2020-09-30 2023-02-21 Dai Nippon Printing Co., Ltd. Personal authentication system, server, server program, transaction device, and device program
JP2022058211A (ja) * 2020-09-30 2022-04-11 Dai Nippon Printing Co., Ltd. Personal authentication system, server, server program, transaction device, and device program

Also Published As

Publication number Publication date
JPWO2013114806A1 (ja) 2015-05-11

Similar Documents

Publication Publication Date Title
WO2013114806A1 (fr) Biometric authentication device and biometric authentication method
US11755137B2 (en) Gesture recognition devices and methods
Tian et al. KinWrite: Handwriting-Based Authentication Using Kinect.
Lai et al. A gesture-driven computer interface using Kinect
US9734393B2 (en) Gesture-based control system
JP5205187B2 (ja) Input system and input method
KR20130099317A (ko) Interactive augmented reality implementation system and augmented reality implementation method
CN114077726A (zh) System, method, and machine-readable medium for authenticating a user
Maisto et al. An accurate algorithm for the identification of fingertips using an RGB-D camera
KR20110080327A (ko) Face recognition apparatus and method
KR20150034257A (ko) Input device, apparatus, input method, and recording medium
JP6674683B2 (ja) Authentication processing device and authentication processing method
CN114220130A (zh) Contactless identity recognition system and method fusing gestures with palm print and palm vein
JP5964603B2 (ja) Data input device and display device
WO2019037257A1 (fr) Password input control device and method, and computer-readable storage medium
Halarnkar et al. Gesture recognition technology: A review
KR101286750B1 (ko) Password determination system using gestures
KR101525011B1 (ko) NUI-based immersive virtual space display control device and control method
JP2007156768A (ja) Personal authentication device, personal authentication information registration device, personal authentication method, personal authentication information registration method, and computer program
JP2020107038A (ja) Information processing device, information processing method, and program
JP2020107037A (ja) Information processing device, information processing method, and program
Ducray et al. Authentication based on a changeable biometric using gesture recognition with the kinect™
WO2022180890A1 (fr) Biometric authentication system, authentication terminal, and authentication method
Kanev et al. A human computer interactions framework for biometric user identification
Abdrabou et al. How Unique do we Move? Understanding the Human Body and Context Factors for User Identification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13743660

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013556238

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13743660

Country of ref document: EP

Kind code of ref document: A1