WO2018133584A1 - Identity verification method and apparatus (一种身份验证方法及装置) - Google Patents

Identity verification method and apparatus (一种身份验证方法及装置)

Info

Publication number
WO2018133584A1
WO2018133584A1; PCT/CN2017/115684; CN2017115684W
Authority
WO
WIPO (PCT)
Prior art keywords
data
eye
user
living body
verification
Prior art date
Application number
PCT/CN2017/115684
Other languages
English (en)
French (fr)
Inventor
徐强华
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2018133584A1 publication Critical patent/WO2018133584A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification

Definitions

  • The present disclosure relates to the field of communications technologies, and in particular to an identity verification method and apparatus.
  • Eyeball texture recognition mainly extracts and recognizes information from the white of the eye.
  • What is extracted is the blood-vessel structure of the white of the eye; the pattern of the blood vessels on each person's eyeball is unique, even between twins.
  • Eyeball texture recognition exploits precisely this uniqueness of the blood-vessel texture distribution on the white of the eye.
  • Although the eye may become bloodshot because of allergies, red eye or a hangover, this does not change the arrangement of the blood vessels on the white of the eye, so eyeball texture recognition is sufficiently stable.
  • Like fingerprint recognition, however, eyeball texture recognition can be fooled: a high-definition photograph of the eyes may pass the check, and such misrecognition may cause loss to the user's property.
  • To address this risk of misrecognition, the present disclosure provides an identity verification method and apparatus.
  • An embodiment of the present disclosure provides an identity verification method, which is applied to a terminal device, and the method includes:
  • when it is detected that both eyes of a user to be verified move in a first preset direction within a first scan frame on a display interface of the terminal device, acquiring eyeprint verification data and living body verification data of the user to be verified; the eyeprint verification data includes a first eyeball texture data set acquired at a first position within the first scan frame and a second eyeball texture data set acquired at at least one second position within the first scan frame; the living body verification data includes a first eye feature data set acquired at the first position and a second eye feature data set of the user to be verified acquired at the at least one second position;
  • authenticating the user to be verified according to the eyeprint verification data and the living body verification data.
  • In an exemplary embodiment, the step of authenticating the user according to the eyeprint verification data and the living body verification data includes: acquiring pre-stored eyeprint template data when it is determined, according to the eyeprint verification data and the living body verification data, that the eyeprint verification data is living eyeprint data; and passing the authentication when it is determined that the eyeprint verification data matches the eyeprint template data.
  • In an exemplary embodiment, the step of determining that the eyeprint verification data is living eyeprint data includes: determining that the eyeprint verification data is living eyeprint data when the first eyeball texture data set and the second eyeball texture data set are not equal but their intersection is non-empty, and the first and second eye feature data sets are judged to be living eye feature data.
  • In an exemplary embodiment, the step of obtaining the eyeprint verification data and the living body verification data of the user to be verified includes: displaying, within the first scan frame, a first living body detection ball that moves in the first preset direction and prompting the user to be verified to gaze at it with both eyes; photographing the user's eyes while the ball moves; and obtaining the eyeprint verification data and the living body verification data from the photographs taken.
  • In an exemplary embodiment, the step of determining that the eyeprint verification data matches the eyeprint template data includes: calculating the similarity between the eyeprint template data and the eyeprint verification data according to a preset similarity algorithm; and determining that the match is successful when the similarity is greater than or equal to a preset similarity threshold.
  • In an exemplary embodiment, the method further includes: entering eyeprint template data of a first user.
  • In an exemplary embodiment, the step of entering the eyeprint template data of the first user includes:
  • when it is detected that both eyes of the first user move in a second preset direction within a second scan frame on the display interface of the terminal device, acquiring eyeprint entry data and living body entry data of the first user;
  • the eyeprint entry data includes a third eyeball texture data set acquired at a third position within the second scan frame and a fourth eyeball texture data set acquired at at least one fourth position within the second scan frame;
  • the living body entry data includes a third eye feature data set acquired at the third position and a fourth eye feature data set of the user to be verified acquired at the at least one fourth position;
  • when it is determined, according to the eyeprint entry data and the living body entry data, that the eyeprint entry data is living eyeprint data, storing the eyeprint entry data as the eyeprint template data of the first user.
  • In an exemplary embodiment, the step of determining that the eyeprint entry data is living eyeprint data includes: determining that the eyeprint entry data is living eyeprint data when the third eyeball texture data set and the fourth eyeball texture data set are not equal but their intersection is non-empty, and the third and fourth eye feature data sets are judged to be living eye feature data.
  • In an exemplary embodiment, the step of acquiring the eyeprint entry data and the living body entry data of the first user includes: displaying, within the second scan frame, a second living body detection ball that moves in the second preset direction and prompting the first user to gaze at it with both eyes; photographing the first user's eyes while the ball moves; and acquiring the eyeprint entry data and the living body entry data from the photographs taken.
  • An embodiment of the present disclosure further provides an identity verification apparatus, which is applied to a terminal device and includes:
  • a detecting module configured to acquire, when it detects that both eyes of a user to be verified move in the first preset direction within the first scan frame on the display interface of the terminal device, eyeprint verification data and living body verification data of the user to be verified, where the eyeprint verification data includes a first eyeball texture data set acquired at a first position within the first scan frame and a second eyeball texture data set acquired at at least one second position within the first scan frame, and the living body verification data includes a first eye feature data set acquired at the first position and a second eye feature data set of the user to be verified acquired at the at least one second position;
  • a verification module configured to authenticate the user to be verified according to the eyeprint verification data and the living body verification data.
  • In an exemplary embodiment, the verification module includes:
  • an obtaining sub-module configured to acquire pre-stored eyeprint template data when it is determined, according to the eyeprint verification data and the living body verification data, that the eyeprint verification data is living eyeprint data;
  • a verification sub-module configured to pass the authentication when it is determined that the eyeprint verification data matches the eyeprint template data.
  • In an exemplary embodiment, the obtaining sub-module is configured to determine that the eyeprint verification data is living eyeprint data when the first eyeball texture data set and the second eyeball texture data set are not equal but their intersection is non-empty, and the first and second eye feature data sets are judged to be living eye feature data.
  • In an exemplary embodiment, the detecting module includes:
  • a first display sub-module configured to display, within the first scan frame, the first living body detection ball moving in the first preset direction, and to prompt the user to be verified to gaze at the first living body detection ball with both eyes;
  • a first shooting sub-module configured to photograph the eyes of the user to be verified while the first living body detection ball moves;
  • a first obtaining sub-module configured to obtain the eyeprint verification data and the living body verification data of the user to be verified from the photographs taken.
  • In an exemplary embodiment, the verification sub-module includes:
  • a calculating unit configured to calculate the similarity between the eyeprint template data and the eyeprint verification data according to a preset similarity algorithm;
  • a matching unit configured to determine that the eyeprint verification data matches the eyeprint template data when the similarity is greater than or equal to the preset similarity threshold.
  • In an exemplary embodiment, the apparatus further includes:
  • a template entry module configured to enter the eyeprint template data of the first user.
  • In an exemplary embodiment, the template entry module includes:
  • an entry detection sub-module configured to acquire, when it detects that both eyes of the first user move in the second preset direction within the second scan frame on the display interface of the terminal device, eyeprint entry data and living body entry data of the first user, where the eyeprint entry data includes a third eyeball texture data set acquired at a third position within the second scan frame and a fourth eyeball texture data set acquired at at least one fourth position within the second scan frame, and the living body entry data includes a third eye feature data set acquired at the third position and a fourth eye feature data set of the user to be verified acquired at the at least one fourth position;
  • a template storage sub-module configured to store the eyeprint entry data as the eyeprint template data of the first user when it is determined, according to the eyeprint entry data and the living body entry data, that the eyeprint entry data is living eyeprint data.
  • In an exemplary embodiment, the template storage sub-module is configured to determine that the eyeprint entry data is living eyeprint data when the third eyeball texture data set and the fourth eyeball texture data set are not equal but their intersection is non-empty, and the third and fourth eye feature data sets are judged to be living eye feature data.
  • In an exemplary embodiment, the entry detection sub-module includes:
  • a second display sub-module configured to display, within the second scan frame, the second living body detection ball moving in the second preset direction, and to prompt the first user to gaze at the second living body detection ball with both eyes;
  • a second shooting sub-module configured to photograph the first user's eyes while the second living body detection ball moves;
  • a second obtaining sub-module configured to acquire the eyeprint entry data and the living body entry data of the first user from the photographs taken.
  • Embodiments of the present disclosure also provide a computer readable storage medium storing computer executable instructions that, when executed by a processor, implement the methods described above.
  • In the identity verification method and apparatus, eyeprint verification data and living body verification data of the user to be verified are acquired during authentication, so that living body detection is performed at the same time as eyeball texture detection and the collected data is guaranteed to come from a living body.
  • This prevents spoofing with high-definition eyeball images during authentication and improves its security; in particular, during payment on an electronic device, the user's property can be effectively protected.
  • A living body detection step is likewise added to the eyeprint template entry process to ensure the validity of the entered eyeprint data.
  • The present disclosure thus addresses the problem that eyeball texture recognition may produce misrecognition.
  • FIG. 1 is a flow chart showing the basic steps of an identity verification method provided by a first example of the present disclosure
  • FIG. 2 is a schematic diagram showing a scene of an eyeprint data entry process provided by a second example of the present disclosure
  • FIG. 3 is a flow chart showing the basic steps of the eyeprint data entry process provided by the second example of the present disclosure
  • FIG. 4 is a second schematic diagram of a scene of an eyeprint data entry process provided by a second example of the present disclosure
  • FIG. 5 is a schematic diagram showing a scenario of an eyeprint data verification process provided by a second example of the present disclosure
  • FIG. 6 is a flow chart showing the basic steps of the eyeprint data verification process provided by the second example of the present disclosure.
  • FIG. 7 shows a block diagram of an identity verification apparatus provided by a third example of the present disclosure.
  • A first example of the present disclosure provides an identity verification method, which is applied to a terminal device and includes the following steps.
  • Step 101: when it is detected that both eyes of the user to be verified move in a first preset direction within a first scan frame on the display interface of the terminal device, acquire the eyeprint verification data and living body verification data of the user to be verified. The eyeprint verification data includes a first eyeball texture data set acquired at a first position within the first scan frame and a second eyeball texture data set acquired at at least one second position within the first scan frame; the living body verification data includes a first eye feature data set acquired at the first position and a second eye feature data set of the user to be verified acquired at the at least one second position.
  • To authenticate the user to be verified, the user is guided so that both eyes fall within the first scan frame and then move along the first preset direction. During this movement, the first eyeball texture data set and the first eye feature data set are acquired at the first position, and the second eyeball texture data set and the second eye feature data set are acquired at the second position.
  • An eyeball texture data set is a set of eyeball texture data of the user to be verified; the eyeball texture data is produced by eyeball texture detection. The principle of eyeball texture detection (eyeball texture recognition) is to read the distribution pattern of the blood-vessel texture on the white of the user's eye as photographed by the front camera of the electronic device, thereby identifying the user.
  • An eye feature data set is a set of eye feature data of the user to be verified; in the present example the eye feature data may include human-eye feature parameters such as eye socket length and width, eye corners and pupil.
  • The living body verification data is data generated by a living organism in an active state. (A sketch of how these data sets could be represented follows below.)
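  • To make the relationship between these data sets concrete, the following is a minimal sketch of how the verification payload described above could be represented in code. The class and field names are illustrative assumptions and are not defined by the disclosure.

```python
from dataclasses import dataclass
from typing import FrozenSet, Tuple

# Hypothetical container types; names and fields are illustrative only.
@dataclass(frozen=True)
class EyeFeatures:
    """Eye feature data captured at one position (liveness cues)."""
    socket_size: Tuple[float, float]   # eye socket length/width in pixels
    eye_corners: Tuple[float, ...]     # eye-corner coordinates
    pupil_center: Tuple[float, float]  # pupil centre coordinates

@dataclass(frozen=True)
class EyeprintCapture:
    """Data acquired at a single position inside the scan frame."""
    texture_set: FrozenSet[str]  # eyeball (sclera) texture descriptors
    features: EyeFeatures        # eye feature data set at this position

@dataclass
class VerificationData:
    """Eyeprint verification data plus living body verification data."""
    first_position: EyeprintCapture                # data at the first position
    second_positions: Tuple[EyeprintCapture, ...]  # at least one second position
```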
  • Step 102: authenticate the user to be verified according to the eyeprint verification data and the living body verification data.
  • In this example, step 102 may include: acquiring pre-stored eyeprint template data when it is determined, according to the eyeprint verification data and the living body verification data, that the eyeprint verification data is living eyeprint data; and passing the authentication when it is determined that the eyeprint verification data matches the eyeprint template data.
  • The second eye feature data set in the living body verification data is compared with the first eye feature data set to verify that the collected data is living data, which prevents spoofing with a high-definition eyeprint picture. The first eyeball texture data set and the second eyeball texture data set in the eyeprint verification data are used to verify that the data collected at the first position and the data collected at the second position belong to the same user. Together these checks establish that the eyeprint verification data is living eyeprint data of a single user, ensuring the validity of the acquired eyeprint verification data of the user to be verified.
  • When the acquired eyeprint verification data is determined to be living eyeprint data of the same user, the pre-stored eyeprint template data is matched against the eyeprint verification data of the user to be verified; if the match succeeds, the identity verification of the user passes, otherwise it fails.
  • In this example, the step of acquiring the pre-stored eyeprint template data when the eyeprint verification data is determined to be living eyeprint data may include: determining that the eyeprint verification data is living eyeprint data when the first eyeball texture data set and the second eyeball texture data set are not equal but their intersection is non-empty, and the first and second eye feature data sets are judged to be living eye feature data.
  • The condition that the first and second eyeball texture data sets are not equal while their intersection is non-empty means that the two sets share several identical elements but are not congruent; if they were congruent, the first and second eyeball texture data sets would have come from the same static picture. (A minimal sketch of this check follows.)
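  • As a minimal sketch of this check, assuming the eyeball texture data sets are modelled as plain Python sets of hashable feature descriptors (an assumption for illustration only):

```python
def texture_sets_indicate_liveness(first_set: set, second_set: set) -> bool:
    """Return True when the two eyeball texture data sets are not equal
    but share at least one element, i.e. they plausibly come from the same
    eye observed at two positions rather than from one static picture."""
    return first_set != second_set and len(first_set & second_set) > 0

# Two captures of the same eye share descriptors but are not identical.
assert texture_sets_indicate_liveness({"v1", "v2", "v3"}, {"v2", "v3", "v4"})
# A replayed static picture yields identical sets and is rejected.
assert not texture_sets_indicate_liveness({"v1", "v2"}, {"v1", "v2"})
```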
  • For example, eye tracking technology may be used to judge whether the first and second eye feature data sets are living eye feature data. Eye tracking mainly follows the movement of the eyeball: when a person looks in different directions, the eyes change in subtle ways, and these changes produce features that can be extracted through image capture or scanning and tracked in real time.
  • When eye tracking determines that the second eye feature data set corresponds to the first eye feature data set after motion has occurred, i.e. the user's eyeball has rotated relative to the first position, the data collected at the first and second positions is judged to be living data, and the eyeprint verification data is therefore living eyeprint data. (A simplified sketch of such a motion check follows.)
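  • A simplified sketch of such a motion check, assuming the eye feature data reduces to a pupil-centre coordinate per position; the function name and the pixel threshold are illustrative assumptions, not the disclosed algorithm:

```python
import math

def eyes_moved(pupil_first: tuple, pupil_second: tuple, min_shift_px: float = 3.0) -> bool:
    """Return True when the pupil centre at the second position differs enough
    from the first position to indicate real eyeball rotation; a static
    photograph would show essentially no shift between the two captures."""
    dx = pupil_second[0] - pupil_first[0]
    dy = pupil_second[1] - pupil_first[1]
    return math.hypot(dx, dy) >= min_shift_px

print(eyes_moved((120.0, 80.0), (126.5, 81.2)))  # True: eyeball rotated
print(eyes_moved((120.0, 80.0), (120.2, 80.1)))  # False: treat as static
```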
  • In this example, step 101 may include: displaying, within the first scan frame, the first living body detection ball moving in the first preset direction and prompting the user to be verified to gaze at it with both eyes; photographing the eyes of the user to be verified while the ball moves; and obtaining the eyeprint verification data and living body verification data of the user to be verified from the photographs taken.
  • The first living body detection ball moves in the first preset direction within the first scan frame, and photographs of the eyes of the user to be verified are taken at different positions during this movement. While the ball moves, the user is reminded to keep both eyes on the first living body detection ball, ensuring that the eyes stay within the first scan frame. The data corresponding to the first position and the data corresponding to the second position are then extracted from the photographs taken. (A sketch of this guided capture loop follows.)
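  • A rough sketch of the guided capture loop is shown below; `camera`, `ui` and their methods (`capture`, `move_detection_ball`, `eyes_inside_frame`, `show_prompt`) are hypothetical placeholders for the terminal's camera and interface APIs and are not part of the disclosure:

```python
def capture_verification_photos(camera, ui, num_positions: int = 2):
    """Move the first living body detection ball along the preset direction,
    prompt the user to follow it with both eyes, and photograph the eyes
    at each position inside the first scan frame.

    `camera` and `ui` are hypothetical device abstractions passed in by the caller.
    """
    photos = []
    ui.show_prompt("Please keep both eyes on the moving ball")
    for step in range(num_positions):
        ui.move_detection_ball(step)         # ball advances along the preset direction
        frame = camera.capture()             # high-speed photo of the user's eyes
        if not ui.eyes_inside_frame(frame):  # both eyes must stay in the scan frame
            ui.show_prompt("Please keep your eyes inside the scan frame")
            continue
        photos.append(frame)
    return photos  # later processed into texture and eye feature data sets
```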
  • In this example, in step 102, determining that the eyeprint verification data matches the eyeprint template data may include: calculating the similarity between the eyeprint template data and the eyeprint verification data according to a preset similarity algorithm; and determining that the eyeprint verification data matches the eyeprint template data when the similarity is greater than or equal to a preset similarity threshold.
  • The preset similarity algorithm computes the similarity between the eyeprint template data and the eyeprint verification data; the higher the similarity, the closer the two data sets are. When the similarity reaches the preset similarity threshold, the match is determined to be successful; otherwise the match fails. (An illustrative sketch follows.)
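  • The disclosure leaves the similarity algorithm open. As one illustrative assumption, the template and verification data could be compared as feature vectors using cosine similarity against a preset threshold:

```python
import math

def cosine_similarity(a: list, b: list) -> float:
    """Cosine similarity between two equally sized feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def eyeprint_matches(template: list, verification: list, threshold: float = 0.95) -> bool:
    """Match succeeds when the similarity reaches the preset threshold."""
    return cosine_similarity(template, verification) >= threshold

print(eyeprint_matches([0.9, 0.1, 0.4], [0.88, 0.12, 0.41]))  # True: close vectors
print(eyeprint_matches([0.9, 0.1, 0.4], [0.1, 0.9, 0.2]))     # False: dissimilar
```
  • In practice the feature representation, the similarity measure and the threshold value would all depend on the chosen eyeprint algorithm; the sketch only shows the thresholding structure described above.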
  • In this example, the method may further include: entering the eyeprint template data of a first user.
  • The eyeprint template data of the first user is entered so that data to be verified later can be compared against it.
  • The step of entering the eyeprint template data of the first user includes:
  • a first step of acquiring, when it is detected that both eyes of the first user move in a second preset direction within a second scan frame on the display interface of the terminal device, eyeprint entry data and living body entry data of the first user, where the eyeprint entry data includes a third eyeball texture data set acquired at a third position within the second scan frame and a fourth eyeball texture data set acquired at at least one fourth position within the second scan frame, and the living body entry data includes a third eye feature data set acquired at the third position and a fourth eye feature data set of the user to be verified acquired at the at least one fourth position;
  • a second step of storing the eyeprint entry data as the eyeprint template data of the first user when it is determined, according to the eyeprint entry data and the living body entry data, that the eyeprint entry data is living eyeprint data.
  • The process of entering the eyeprint template data thus consists of acquiring, while the first user's eyes move in the second preset direction within the second scan frame, the data at the third position and the data at the fourth position. The third and fourth eye feature data sets in the living body entry data are compared to verify that the entered data is living data, and the third and fourth eyeball texture data sets in the eyeprint entry data are used to verify that the data collected at the third and fourth positions belong to the same user, ensuring the validity of the acquired eyeprint entry data of the first user.
  • In this example, the second step above may include: determining that the eyeprint entry data is living eyeprint data when the third eyeball texture data set and the fourth eyeball texture data set are not equal but their intersection is non-empty, and the third and fourth eye feature data sets are judged to be living eye feature data.
  • The condition that the third and fourth eyeball texture data sets are not equal while their intersection is non-empty means that the two sets share several identical elements but are not congruent; if they were congruent, the third and fourth eyeball texture data sets would have come from the same static picture.
  • In this example, the first step above may include: displaying, within the second scan frame, the second living body detection ball moving in the second preset direction and prompting the first user to gaze at it with both eyes; photographing the first user's eyes while the ball moves; and acquiring the eyeprint entry data and living body entry data of the first user from the photographs taken.
  • The second living body detection ball moves in the second preset direction within the second scan frame, and photographs of the first user's eyes are taken at different positions during this movement. While the ball moves, the user is reminded to keep both eyes on the second living body detection ball, ensuring that the first user's eyes stay within the second scan frame. The data corresponding to the third position and the data corresponding to the fourth position are then extracted from the photographs taken.
  • In the above example, eyeprint verification data and living body verification data of the user to be verified are acquired during authentication, so that living body detection is performed at the same time as eyeball texture detection and the collected data is guaranteed to be living data.
  • This prevents spoofing with high-definition eyeball images during identity verification and improves its security; in particular, during payment on an electronic device, the user's property can be effectively protected.
  • A living body detection step is likewise added to the eyeprint template data entry process to ensure the validity of the entered eyeprint data.
  • The present disclosure thus addresses the problem that eyeball texture recognition may produce misrecognition.
  • The second example illustrates the identity verification method provided by the present disclosure.
  • The eyeprint data entry process mainly includes the following steps.
  • Step 301: verify whether the user's eyes fall within the second scan frame.
  • The user is asked to keep a suitable distance between the eyes and the terminal device.
  • The user is guided by the second living body detection ball on the terminal interface so that both eyes fall within the second scan frame; this position is the initial position for eyeprint entry.
  • Step 302: extract the user's third eyeball texture data set and third eye feature data set at the third position, and extract the user's fourth eyeball texture data set and fourth eye feature data set at the fourth position.
  • The camera scans both eyes and the interface shows the scanning progress.
  • After scanning, the terminal interface guides the eyes to move; while moving, the eyes must remain within the second scan frame.
  • The scanning process is a process of high-speed photography.
  • The camera takes a series of pictures of the eyes at different positions. After algorithm processing, the third eyeball texture data set and third eye feature data set are obtained for the initial position, i.e. the third position, and the fourth eyeball texture data set and fourth eye feature data set are obtained for the fourth position. If there are several fourth positions, eyeprint feature data sets and eye feature sets are obtained for the other positions as well. All of the eyeprint feature data together constitutes one complete eyeprint feature data set, defined as the eyeprint entry data (the eyeprint template data).
  • After algorithm processing of the third eyeball texture data set and the fourth eyeball texture data set, it can be inferred whether the two sets are related in the expected way, namely that they have a non-empty intersection but are not identical; otherwise the eyeprints recorded at the third and fourth positions belong to different people's eyes or to a static picture. For example, at the third position the third eyeball texture data set f3 is obtained through scanning, image segmentation and feature extraction; at the fourth position, after the eyes have shifted their gaze, the fourth eyeball texture data set f4 is collected. For a living eyeprint, f3 ≠ f4, f3 ∩ f4 ≠ ∅, and f3 ∪ f4 constitutes one complete eyeprint feature data set; if f3 = f4, the source is a static picture.
  • The third eye feature data set and the fourth eye feature data set are likewise processed to infer whether they are related, from which it can be inferred whether this eyeprint feature data set is living eyeprint data.
  • The eye feature data set includes human-eye feature parameters such as the eye socket, eye corners and pupil. Using eye tracking, if the fourth position is estimated to result from a rotation of the eyeball relative to the third position, the eyeprint is living; if the eye feature data at the third and fourth positions is identical, the source is a static picture. (A combined sketch of these enrollment checks follows.)
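  • Combining the two enrollment checks, a minimal sketch (with the sets f3 and f4 modelled as Python sets and the eye feature comparison reduced to an equality test, both simplifying assumptions) could look like this:

```python
def validate_enrollment(f3: set, f4: set, features3, features4):
    """Check the relationships described above before storing template data.

    f3, f4: eyeball texture data sets from the third and fourth positions.
    features3, features4: eye feature data at the two positions.
    Returns (is_valid, template_set); template_set is f3 | f4 when valid.
    """
    if f3 == f4 or not (f3 & f4):
        # Identical sets -> same static picture; disjoint sets -> different eyes.
        return False, None
    if features3 == features4:
        # No eyeball rotation between positions -> treat as a static picture.
        return False, None
    return True, f3 | f4  # the union forms the complete eyeprint entry data

ok, template = validate_enrollment({"a", "b", "c"}, {"b", "c", "d"},
                                   (120.0, 80.0), (126.0, 82.0))
print(ok, sorted(template))  # True ['a', 'b', 'c', 'd']
```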
  • Step 303: scanning is completed and the eyeprint entry data is saved.
  • The eyeprint data verification process mainly includes the following steps.
  • Step 601: verify whether the user's eyes fall within the first scan frame.
  • The user keeps a suitable distance between the eyes and the terminal device and is guided by the first living body detection ball on the terminal interface so that both eyes fall within the first scan frame.
  • Step 602: extract the user's first eyeball texture data set and first eye feature data set at the first position, and extract the user's second eyeball texture data set and second eye feature data set at the second position.
  • The eyeprint feature data set obtained here is defined as the eyeprint verification data, and it is acquired in the same way as in step 302. The eye feature data at the first and second positions is also acquired, and from the eyeprint feature data and eye feature data at the two positions it can be estimated whether the eyeprint verification data is living eyeprint verification data.
  • Step 603: scanning is completed, and similarity matching is performed between the eyeprint verification data and the eyeprint entry data; if the similarity meets the preset similarity threshold, the verification passes.
  • In this example, the similarity threshold can be set according to the required security level: for a high security level the threshold can be set higher, and for a low security level it can be set lower. (A small configuration sketch follows.)
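  • A small illustration of that configuration choice follows; the level names and numeric threshold values are assumptions for illustration only:

```python
# Hypothetical mapping from security level to similarity threshold.
SIMILARITY_THRESHOLDS = {
    "low": 0.85,     # e.g. unlocking a low-risk application
    "medium": 0.92,  # e.g. unlocking the device
    "high": 0.97,    # e.g. confirming a payment
}

def threshold_for(level: str) -> float:
    """Pick the preset similarity threshold for the requested security level."""
    return SIMILARITY_THRESHOLDS.get(level, SIMILARITY_THRESHOLDS["high"])

print(threshold_for("high"))  # 0.97
```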
  • In the above example, eyeprint verification data and living body verification data of the user to be verified are acquired during authentication, so that living body detection is performed at the same time as eyeball texture detection and the collected data is guaranteed to be living data.
  • This prevents spoofing with high-definition eyeball images during identity verification and improves its security; in particular, during payment on an electronic device, the user's property can be effectively protected.
  • A living body detection step is likewise added to the eyeprint template data entry process to ensure the validity of the entered eyeprint data.
  • The present disclosure thus addresses the problem that eyeball texture recognition may produce misrecognition.
  • A third example of the present disclosure provides an identity verification apparatus, which is applied to a terminal device and includes:
  • a detecting module 701 configured to acquire, when it detects that both eyes of the user to be verified move in the first preset direction within the first scan frame on the display interface of the terminal device, eyeprint verification data and living body verification data of the user to be verified, where the eyeprint verification data includes a first eyeball texture data set acquired at a first position within the first scan frame and a second eyeball texture data set acquired at at least one second position within the first scan frame, and the living body verification data includes a first eye feature data set acquired at the first position and a second eye feature data set of the user to be verified acquired at the at least one second position;
  • a verification module 702 configured to authenticate the user to be verified according to the eyeprint verification data and the living body verification data.
  • In this example, the verification module 702 may include:
  • an obtaining sub-module configured to acquire pre-stored eyeprint template data when it is determined, according to the eyeprint verification data and the living body verification data, that the eyeprint verification data is living eyeprint data;
  • a verification sub-module configured to pass the authentication when it is determined that the eyeprint verification data matches the eyeprint template data.
  • In this example, the obtaining sub-module may be configured to determine that the eyeprint verification data is living eyeprint data when the first eyeball texture data set and the second eyeball texture data set are not equal but their intersection is non-empty, and the first and second eye feature data sets are judged to be living eye feature data.
  • In this example, the detecting module 701 may include:
  • a first display sub-module configured to display, within the first scan frame, the first living body detection ball moving in the first preset direction, and to prompt the user to be verified to gaze at the first living body detection ball with both eyes;
  • a first shooting sub-module configured to photograph the eyes of the user to be verified while the first living body detection ball moves;
  • a first obtaining sub-module configured to obtain the eyeprint verification data and living body verification data of the user to be verified from the photographs taken.
  • In this example, the verification sub-module may include:
  • a calculating unit configured to calculate the similarity between the eyeprint template data and the eyeprint verification data according to a preset similarity algorithm;
  • a matching unit configured to determine that the eyeprint verification data matches the eyeprint template data when the similarity is greater than or equal to the preset similarity threshold.
  • In this example, the apparatus may further include:
  • a template entry module configured to enter the eyeprint template data of the first user.
  • In this example, the template entry module may include:
  • an entry detection sub-module configured to acquire, when it detects that both eyes of the first user move in the second preset direction within the second scan frame on the display interface of the terminal device, eyeprint entry data and living body entry data of the first user, where the eyeprint entry data includes a third eyeball texture data set acquired at a third position within the second scan frame and a fourth eyeball texture data set acquired at at least one fourth position within the second scan frame, and the living body entry data includes a third eye feature data set acquired at the third position and a fourth eye feature data set of the user to be verified acquired at the at least one fourth position;
  • a template storage sub-module configured to store the eyeprint entry data as the eyeprint template data of the first user when it is determined, according to the eyeprint entry data and the living body entry data, that the eyeprint entry data is living eyeprint data.
  • In this example, the template storage sub-module may be configured to determine that the eyeprint entry data is living eyeprint data when the third eyeball texture data set and the fourth eyeball texture data set are not equal but their intersection is non-empty, and the third and fourth eye feature data sets are judged to be living eye feature data.
  • In this example, the entry detection sub-module may include:
  • a second display sub-module configured to display, within the second scan frame, the second living body detection ball moving in the second preset direction, and to prompt the first user to gaze at the second living body detection ball with both eyes;
  • a second shooting sub-module configured to photograph the first user's eyes while the second living body detection ball moves;
  • a second obtaining sub-module configured to acquire the eyeprint entry data and living body entry data of the first user from the photographs taken.
  • The identity verification apparatus provided by the embodiments of the present disclosure is an apparatus that applies the above method; all embodiments of the above method therefore apply to the apparatus and achieve the same or similar beneficial effects.
  • Embodiments of the present disclosure also provide a computer readable storage medium storing computer executable instructions that, when executed by a processor, implement the methods described above.
  • Computer storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information, such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • Communication media typically carry computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.
  • According to the above embodiments, living body detection is performed while the eyeball texture is detected, ensuring that the collected data is living data.
  • This prevents spoofing with high-definition eyeball images during identity verification and improves its security; in particular, during payment on an electronic device, the user's property can be effectively protected.
  • A living body detection step is likewise added to the eyeprint template data entry process to ensure the validity of the entered eyeprint data.
  • The present disclosure addresses the problem that eyeball texture recognition may produce misrecognition.
  • The present disclosure therefore has industrial applicability.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Provided are an identity verification method and apparatus. The method is applied to a terminal device and includes: when it is detected that both eyes of a user to be verified move in a first preset direction within a first scan frame on a display interface of the terminal device, acquiring eyeprint verification data and living body verification data of the user to be verified, the eyeprint verification data including a first eyeball texture data set acquired at a first position within the first scan frame and a second eyeball texture data set acquired at at least one second position within the first scan frame; and authenticating the user to be verified according to the eyeprint verification data and the living body verification data.

Description

一种身份验证方法及装置 技术领域
本公开涉及通信技术领域,特别涉及一种身份验证方法及装置。
背景技术
随着通信技术的发展,通过电子设备进行身份验证的方式也逐渐兴起,比如移动支付业务;通过电子设备进行身份验证的方式存在多种,比如数字密码,图形密码,指纹密码,人脸识别,眼球纹理识别等;其中眼球纹理识别作为新起技术,开始慢慢地普及。
眼球纹理识别主要是对眼白部分的信息进行提取与识别,这里要提取的是眼白部分的血管构成,即眼白部分的血管,每个人的眼球上血管的形状都是独一无二的,即使是双胞胎也不相同,眼球纹理识别正是利用了眼白上血管纹理分布的唯一性。虽然人的眼球会因为过敏、红眼或者熬夜宿醉等情况发生充血的状况,但这些并不会影响眼白上血管排布,所以眼球纹理识别的稳定性也是足够的。然而,眼球纹理识别和指纹识别类似,存在误识别的情况,比如在眼球纹理识别时,高清眼睛照片,便可能蒙混过关,一旦误识,则有可能给用户的财产造成损失。
发明内容
以下是对本文详细描述的主题的概述。本概述并非是为了限制权利要求的保护范围。
针对眼球纹理识别可能会存在误识别的情况,本公开提供了一种身份验证方法及装置。
本公开的实施例提供了一种身份验证方法,应用于一终端设备,该方法包括:
检测到待验证用户的双眼在终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取待验证用户的眼纹验证数据和活体验证数据,眼纹 验证数据包括:在第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;活体验证数据包括在第一位置上所获取的第一眼部特征数据集以及在至少一个第二位置上所获取的待验证用户的第二眼部特征数据集;
根据眼纹验证数据和活体验证数据,对待验证用户进行身份验证。
在示例性实施例中,根据眼纹验证数据和活体验证数据,对待验证用户进行身份验证的步骤,包括:
根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
在确定眼纹验证数据与眼纹模板数据匹配成功时,身份验证通过。
在示例性实施例中,根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据的步骤,包括:
当第一眼球纹理数据集与第二眼球纹理数据集不相等且交集为非空集,且判断第一眼部特征数据集、第二眼部特征数据集为活体眼部特征数据时,确定眼纹验证数据为活体眼纹数据。
在示例性实施例中,获取待验证用户的眼纹验证数据和活体验证数据的步骤,包括:
在第一扫描框内显示沿第一预设方向移动的第一活体检测球,并提示待验证用户双眼注视第一活体检测球;
在第一活体检测球移动的过程中,为待验证用户的双眼拍摄照片;
根据所拍摄的照片,获取待验证用户的眼纹验证数据和活体验证数据。
在示例性实施例中,确定眼纹验证数据与眼纹模板数据匹配成功的步骤,包括:
根据预设的相似度算法,计算眼纹模板数据与眼纹验证数据的相似度;
当相似度大于或等于预设的相似度阈值时,确定眼纹验证数据与眼纹模板数据匹配成功。
在示例性实施例中,该方法还包括:
录入第一用户的眼纹模板数据。
在示例性实施例中,录入第一用户的眼纹模板数据的步骤,包括:
检测到第一用户双眼在终端设备的显示界面上第二扫描框内沿第二预设方向移动时,获取第一用户的眼纹录入数据和活体录入数据;眼纹录入数据包括:在第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;活体录入数据包括在第三位置上所获取的第三眼部特征数据集以及在至少一个第四位置上所获取的待验证用户的第四眼部特征数据集;
根据眼纹录入数据和活体录入数据,确定眼纹录入数据为活体眼纹数据时,将眼纹录入数据存储为第一用户的眼纹模板数据。
在示例性实施例中,根据眼纹录入数据和活体录入数据,确定眼纹录入数据为活体眼纹数据的步骤,包括:
当第三眼球纹理数据集与第四眼球纹理数据集不相等且交集为非空集,且判断第三眼部特征数据集、第四眼部特征数据集为活体眼部特征数据时,确定眼纹录入数据为活体眼纹数据。
在示例性实施例中,获取第一用户的眼纹录入数据和活体录入数据的步骤,包括:
在第二扫描框内显示沿第二预设方向移动的第二活体检测球,并提示第一用户双眼注视第二活体检测球;
在第二活体检测球移动的过程中,为第一用户双眼拍摄照片;
根据所拍摄的照片,获取第一用户的眼纹录入数据和活体录入数据。
为了实现上述目的,本公开的实施例还提供了一种身份验证装置,应用于一终端设备,包括:
检测模块,配置为检测到待验证用户的双眼在终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取待验证用户的眼纹验证数据和活体验证数据,眼纹验证数据包括:在第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;活体验证数据包括在第一位置上所获取的第一眼部特征数据集以及在至少一个第二位置上所获取的待验证用户的第二眼部特征数据集;
验证模块,配置为根据眼纹验证数据和活体验证数据,对待验证用户进行身份验证。
在示例性实施例中,验证模块包括:
获取子模块,配置为根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
验证子模块,配置为在确定眼纹验证数据与眼纹模板数据匹配成功时,身份验证通过。
在示例性实施例中,获取子模块配置为:
当第一眼球纹理数据集与第二眼球纹理数据集不相等且交集为非空集,且判断第一眼部特征数据集、第二眼部特征数据集为活体眼部特征数据时,确定眼纹验证数据为活体眼纹数据。
在示例性实施例中,检测模块包括:
第一显示子模块,配置为在第一扫描框内显示沿第一预设方向移动的第一活体检测球,并提示待验证用户双眼注视第一活体检测球;
第一拍摄子模块,配置为在第一活体检测球移动的过程中,为待验证用户的双眼拍摄照片;
第一获取子模块,配置为根据所拍摄的照片,获取待验证用户的眼纹验证数据和活体验证数据。
在示例性实施例中,验证子模块包括:
计算单元,配置为根据预设的相似度算法,计算眼纹模板数据与眼纹验证数据的相似度;
匹配单元,配置为当相似度大于或等于预设的相似度阈值时,确定眼纹验证数据与眼纹模板数据匹配成功。
在示例性实施例中,该装置还包括:
模板录入模块,配置为录入第一用户的眼纹模板数据。
在示例性实施例中,模板录入模块包括:
录入检测子模块,配置为检测到第一用户双眼在终端设备的显示界面上第二扫描框内沿第二预设方向移动时,获取第一用户的眼纹录入数据和活体录入数据;眼纹录入数据包括:在第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;活体录入数据包括在第三位置上所获取的第三眼部特征数据 集以及在至少一个第四位置上所获取的待验证用户的第四眼部特征数据集;
模板存储子模块,配置为根据眼纹录入数据和活体录入数据,确定眼纹录入数据为活体眼纹数据时,将眼纹录入数据存储为第一用户的眼纹模板数据。
在示例性实施例中,模板存储子模块配置为:
当第三眼球纹理数据集与第四眼球纹理数据集不相等且交集为非空集,且判断第三眼部特征数据集、第四眼部特征数据集为活体眼部特征数据时,确定眼纹录入数据为活体眼纹数据。
在示例性实施例中,录入检测子模块包括:
第二显示子模块,配置为在第二扫描框内显示沿第二预设方向移动的第二活体检测球,并提示第一用户双眼注视第二活体检测球;
第二拍摄子模块,配置为在第二活体检测球移动的过程中,为第一用户双眼拍摄照片;
第二获取子模块,配置为根据所拍摄的照片,获取第一用户的眼纹录入数据和活体录入数据。
本公开实施例还提供了一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时实现以上描述的方法。
本公开的上述方案至少包括以下有益效果:
本公开提供的身份验证方法及装置,在对用户进行身份验证的过程中,通过获取待验证用户的眼纹验证数据和活体验证数据,在眼球纹理检测的同时进行活体检测,确保所采集的数据为活体数据,避免在身份验证的过程中,通过高清眼球图片实现虚假验证,以提高身份验证的安全性;特别是在电子设备的支付过程中,可有效地保护用户的财产安全;且眼纹模板数据录入的过程中,也加入了活体检测环节,确保眼纹录入的有效性。本公开解决了眼球纹理识别可能会存在误识别的情况的问题。
在阅读并理解了附图和详细描述后,可以明白其他方面。
附图说明
图1表示本公开的第一示例提供的身份验证方法的基本步骤流程图;
图2表示本公开的第二示例提供的眼纹数据录入过程的场景示意图之一;
图3表示本公开的第二示例提供的眼纹数据录入过程的基本步骤流程图;
图4表示本公开的第二示例提供的眼纹数据录入过程的场景示意图之二;
图5表示本公开的第二示例提供的眼纹数据验证过程的场景示意图;
图6表示本公开的第二示例提供的眼纹数据验证过程的基本步骤流程图;
图7表示本公开的第三示例提供的身份验证装置的框图。
具体实施方式
下面将结合附图及具体示例进行详细描述。
第一示例
参见图1,本公开的第一示例提供了一种身份验证方法,应用于一终端设备,包括:
步骤101,检测到待验证用户的双眼在终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取待验证用户的眼纹验证数据和活体验证数据,眼纹验证数据包括:在第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;活体验证数据包括在第一位置上所获取的第一眼部特征数据集以及在至少一个第二位置上所获取的待验证用户的第二眼部特征数据集。
其中,对待验证用户进行身份验证,引导用户双眼落在第一扫描框内,并沿第一预设方向移动;在移动过程中,获取第一位置上的第一眼球纹理数据集和第一眼部特征数据集,以及第二位置上的第二眼球纹理数据集和第二眼部特征数据集;眼球纹理数据集即待验证用户的眼球纹理数据的集合,眼球纹理数据是通过眼球纹理检测得出的数据,眼球纹理检测(眼球纹理识别) 的原理是读取电子设备的前置摄像头拍摄的使用者眼睛眼白上的血管纹理的分布形态,从而达到身份识别的作用。眼部特征数据集即待验证用户的眼部特征数据的集合;在本示例中,眼部特征数据可包括:眼眶长宽、眼角、瞳孔等人眼特征参数。
其中,活体验证数据即生物体在活动状态下产生的数据。
步骤102,根据眼纹验证数据和活体验证数据,对待验证用户进行身份验证。
在本示例中,步骤102可包括:
根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
在确定眼纹验证数据与眼纹模板数据匹配成功时,身份验证通过。
其中,通过活体验证数据中的第二眼部特征数据集与第一眼部特征数据集进行比对,验证所采集的第一眼部特征数据集为活体数据,避免通过高清眼纹图片达到虚假验证的效果;通过眼纹验证数据中的第一眼球纹理数据集与第二眼球纹理数据集验证在第一位置上所采集的数据以及第二位置上所采集的数据为同一用户的数据,进而确定眼纹验证数据为活体眼纹数据且为同一用户的数据,保证获取的待验证用户的眼纹验证数据的有效性。
当确定获取的眼纹验证数据为同一用户的活体眼纹数据时,将预先存储的眼纹模板数据与待验证用户的眼纹验证数据进行匹配,匹配成功后,确定对待验证用户的身份验证通过;否则,对待验证用户的身份验证不通过。
在本示例中,根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据的步骤,可包括:
当第一眼球纹理数据集与第二眼球纹理数据集不相等且交集为非空集,且判断第一眼部特征数据集、第二眼部特征数据集为活体眼部特征数据时,确定眼纹验证数据为活体眼纹数据。
其中,第一眼球纹理数据集与第二眼球纹理数据集不相等且交集为非空集,即第一眼球纹理数据集与第二眼球纹理数据集中有若干个相同的元素,但是不能全等,若全等的话,则说明第一眼球纹理数据集与第二眼球纹理数据集来自同一张静态图片。
例如,可通过眼球追踪技术判断第一眼部特征数据集、第二眼部特征数据集为活体眼部特征数据。眼球追踪技术主要是追踪眼球的运动信息;当人的眼睛看向不同方向时,眼部会有细微的变化,这些变化会产生可以提取的特征,通过图像捕捉或扫描提取这些特征,从而实时追踪眼睛的变化;当通过眼球追踪技术判断第二眼部特征数据集为第一眼部特征数据集产生运动之后的数据时,即用户眼球在第一位置上产生了转动,确定在第一位置以及第二位置上采集的数据为活体数据,即眼纹验证数据为活体眼纹数据。
在本示例中,步骤101可包括:
在第一扫描框内显示沿第一预设方向移动的第一活体检测球,并提示待验证用户双眼注视第一活体检测球;
在第一活体检测球移动的过程中,为待验证用户的双眼拍摄照片;
根据所拍摄的照片,获取待验证用户的眼纹验证数据和活体验证数据。
其中,第一活体检测球在第一扫描框内沿第一预设方向移动,移动的过程中拍摄待验证用户的双眼在不同位置的照片;且移动的过程中,提醒待验证用户的双眼注视第一活体检测球,确保待验证用户的双眼落在第一扫描框内。根据所拍摄的照片提取第一位置对应的数据和第二位置对应的数据。
在本示例中,在步骤102中,确定眼纹验证数据与眼纹模板数据匹配成功可包括:
根据预设的相似度算法,计算眼纹模板数据与眼纹验证数据的相似度;
当相似度大于或等于预设的相似度阈值时,确定眼纹验证数据与眼纹模板数据匹配成功。
例如,通过预设的相似度算法,计算眼纹模板数据与眼纹验证数据的相似度,相似度越高,表明二者的数据越接近;当相似度大于预设的相似度阈值时,确定匹配成功;否则,匹配不成功。
在本示例中,该方法还可包括:
录入第一用户的眼纹模板数据。
其中,录入第一用户的眼纹模板数据,用于对待验证的数据进行比对。
录入第一用户的眼纹模板数据的步骤,包括:
第一步,检测到第一用户双眼在终端设备的显示界面上第二扫描框内沿 第二预设方向移动时,获取第一用户的眼纹录入数据和活体录入数据;眼纹录入数据包括:在第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;活体录入数据包括在第三位置上所获取的第三眼部特征数据集以及在至少一个第四位置上所获取的待验证用户的第四眼部特征数据集。
第二步,根据眼纹录入数据和活体录入数据,确定眼纹录入数据为活体眼纹数据时,将眼纹录入数据存储为第一用户的眼纹模板数据。
例如,录入眼纹模板数据的过程,包括获取第一用户双眼在第二扫描框内沿第二预设方向移动的过程中,第二扫描框内的第三位置的数据和第四位置的数据。通过其中,通过活体录入数据中的第三眼部特征数据集与第四眼部特征数据集进行比对,确保所录入的第三眼部特征数据集为活体数据;通过眼纹录入数据中的第三眼球纹理数据集与第四眼球纹理数据集录入在第三位置上所采集的数据以及第四位置上所采集的数据为同一用户的数据,进而确定眼纹录入数据为活体眼纹数据且为同一用户的数据,保证获取的第一用户的眼纹录入数据的有效性。
在本示例中,上述第二步,可包括:
当第三眼球纹理数据集与第四眼球纹理数据集不相等且交集为非空集,且判断第三眼部特征数据集、第四眼部特征数据集为活体眼部特征数据时,确定眼纹录入数据为活体眼纹数据。
其中,第三眼球纹理数据集与第四眼球纹理数据集不相等且交集为非空集,即第三眼球纹理数据集与第四眼球纹理数据集中有若干个相同的元素,但是不能全等,若全等的话,则说明第三眼球纹理数据集与第四眼球纹理数据集来自同一张静态图片。
在本示例中,上述第一步,可包括:
在第二扫描框内显示沿第二预设方向移动的第二活体检测球,并提示第一用户双眼注视第二活体检测球;
在第二活体检测球移动的过程中,为第一用户双眼拍摄照片;
根据所拍摄的照片,获取第一用户的眼纹录入数据和活体录入数据。
其中,第二活体检测球在第二扫描框内沿第二预设方向移动,移动的过 程中拍摄第一用户的双眼在不同位置的照片;且移动的过程中,提醒待验证用户双眼注视第二活体检测球,确保第一用户双眼落在第二扫描框内。根据所拍摄的照片提取第三位置对应的数据和第四位置对应的数据。
本公开的上述示例中,在对用户进行身份验证的过程中,通过获取待验证用户的眼纹验证数据和活体验证数据,在眼球纹理检测的同时进行活体检测,确保所采集的数据为活体数据,避免在身份验证的过程中,通过高清眼球图片实现虚假验证,以提高身份验证的安全性;特别是在电子设备的支付过程中,可有效地保护用户的财产安全;且眼纹模板数据录入的过程中,也加入了活体检测环节,确保眼纹录入的有效性。本公开解决了眼球纹理识别可能会存在误识别的情况的问题。
第二示例
参见图2-图6,第二示例介绍了本公开提供的身份验证方法。
一、眼纹数据录入过程:
参见图2-图4,眼纹数据录入过程主要包括以下步骤:
步骤301,验证用户双眼是否落在第二扫描框内;
确保用户眼睛和终端设备保持合适距离,在终端界面通过第二活体检测球引导用户,双眼落在第二扫描框内,这个位置为眼纹录入初始位置。
步骤302,在第三位置提取用户的第三眼球纹理数据集以及第三眼部特征数据集;在第四位置提取用户的第四眼球纹理数据集以及第四眼部特征数据集;
例如,摄像头扫描双眼,界面显示扫描进度,扫描完成后,终端界面引导双眼移动,双眼移动的过程中,需落在第二扫描框内。扫描过程,就是高速拍照的过程,摄像头拍摄了双眼在不同位置的一系列图片,经过算法处理,分别得到初始位置即第三位置的第三眼球纹理数据集和第三眼部特征数据集,得到第四位置的第四眼球纹理数据和第四眼部特征数据集,如有多个第四位置,得到其它位置的眼纹特征数据集和人眼特征集,所有的眼纹特征数据构成了一组完整的眼纹特征数据集,定义为眼纹录入数据(眼纹模板数据)。
第三眼球纹理数据集和第四眼球纹理数据,经过算法处理,可以推断第 三眼球纹理数据集和第四眼球纹理数据有一定关联关系,即有交集且不全等,否则表明第三位置和第四位置录入的眼纹是不同人的眼纹,或者是静态图片。
例如,在第三位置,通过扫描,图像分割,特征提取,获取第三眼球纹理数据集f3;在第四位置,人眼进行了视线下移,采集到第四眼球纹理数据f4,如果是活体眼纹,f3和f4有如下关系:f3≠f4,f3∩f4≠Ф,且f3∪f4构成了一组完整的眼纹特征数据集。如果f3=f4,则说明是静态图片。
第三眼部特征数据集和第四眼部特征数据集,经过算法处理,可以推断第三眼部特征数据集和第四眼部特征数据集有一定关联关系,从而推断这组眼纹特征数据集是活体的眼纹数据。眼部特征数据集包括眼眶、眼角、瞳孔等人眼特征参数,利用眼球追踪技术,推算第四位置是在第三位置基础上眼球产生了转动,则说明是活体眼纹。如果第三位置和第四位置的人眼特征数据一样的,则说明是静态图片。
步骤303,扫描完成,保存眼纹录入数据。
二、眼纹数据验证过程:
参见图5及图6,眼纹数据验证过程主要包括以下步骤:
步骤601,验证用户双眼是否落在第一扫描框内;
确保用户眼睛和终端设备保持合适距离,在终端界面通过第一活体检测球引导用户,双眼落在第一扫描框内。
步骤602,在第一位置提取用户的第一眼球纹理数据集以及第一眼部特征数据集;在第二位置提取用户的第二眼球纹理数据集以及第二眼部特征数据集;
例如,引导双眼注视第二位置的第二活体检测提示球,摄像头扫描双眼,界面显示扫描进度,扫描完毕。此处获取的眼纹特征数据集定义为眼纹验证数据,获取眼纹验证数据的过程同步骤302;同时也获取了第一位置和第二位置的眼部特征数据,同样根据第一位置和第二位置的眼纹特征数据和眼部特征数据,可以推算眼纹验证数据是否是活体的眼纹验证数据。
步骤603,扫描完成,把眼纹验证数据和眼纹录入数据进行相似度匹配,若相似度在预设的相似度阈值内,则验证通过。
在本示例中,可以根据安全等级高低设置相似度阈值,安全等级高,相似度阈值可以设置得高一些,安全等级低,相似度阈值可设置低一些。
本公开的上述示例中,在对用户进行身份验证的过程中,通过获取待验证用户的眼纹验证数据和活体验证数据,在眼球纹理检测的同时进行活体检测,确保所采集的数据为活体数据,避免在身份验证的过程中,通过高清眼球图片实现虚假验证,以提高身份验证的安全性;特别是在电子设备的支付过程中,可有效地保护用户的财产安全;且眼纹模板数据录入的过程中,也加入了活体检测环节,确保眼纹录入的有效性。本公开解决了眼球纹理识别可能会存在误识别的情况的问题。
第三示例
参见图7,本公开的第三示例提供了一种身份验证装置,应用于一终端设备,包括:
检测模块701,配置为检测到待验证用户的双眼在终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取待验证用户的眼纹验证数据和活体验证数据,眼纹验证数据包括:在第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;活体验证数据包括在第一位置上所获取的第一眼部特征数据集以及在至少一个第二位置上所获取的待验证用户的第二眼部特征数据集;
验证模块702,配置为根据眼纹验证数据和活体验证数据,对待验证用户进行身份验证。
在本示例中,验证模块702可包括:
获取子模块,配置为根据眼纹验证数据和活体验证数据,确定眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
验证子模块,配置为在确定眼纹验证数据与眼纹模板数据匹配成功时,身份验证通过。
在本示例中,获取子模块可配置为:
当第一眼球纹理数据集与第二眼球纹理数据集不相等且交集为非空集,且判断第一眼部特征数据集、第二眼部特征数据集为活体眼部特征数据时, 确定眼纹验证数据为活体眼纹数据。
在本示例中,检测模块701可包括:
第一显示子模块,配置为在第一扫描框内显示沿第一预设方向移动的第一活体检测球,并提示待验证用户双眼注视第一活体检测球;
第一拍摄子模块,配置为在第一活体检测球移动的过程中,为待验证用户的双眼拍摄照片;
第一获取子模块,配置为根据所拍摄的照片,获取待验证用户的眼纹验证数据和活体验证数据。
在本示例中,验证子模块可包括:
计算单元,配置为根据预设的相似度算法,计算眼纹模板数据与眼纹验证数据的相似度;
匹配单元,配置为当相似度大于或等于预设的相似度阈值时,确定眼纹验证数据与眼纹模板数据匹配成功。
在本示例中,该装置还可包括:
模板录入模块,配置为录入第一用户的眼纹模板数据。
在本示例中,模板录入模块可包括:
录入检测子模块,配置为检测到第一用户双眼在终端设备的显示界面上第二扫描框内沿第二预设方向移动时,获取第一用户的眼纹录入数据和活体录入数据;眼纹录入数据包括:在第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;活体录入数据包括在第三位置上所获取的第三眼部特征数据集以及在至少一个第四位置上所获取的待验证用户的第四眼部特征数据集;
模板存储子模块,配置为根据眼纹录入数据和活体录入数据,确定眼纹录入数据为活体眼纹数据时,将眼纹录入数据存储为第一用户的眼纹模板数据。
在本示例中,模板存储子模块可配置为:
当第三眼球纹理数据集与第四眼球纹理数据集不相等且交集为非空集,且判断第三眼部特征数据集、第四眼部特征数据集为活体眼部特征数据时,确定眼纹录入数据为活体眼纹数据。
在本示例中,录入检测子模块可包括:
第二显示子模块,配置为在第二扫描框内显示沿第二预设方向移动的第二活体检测球,并提示第一用户双眼注视第二活体检测球;
第二拍摄子模块,配置为在第二活体检测球移动的过程中,为第一用户双眼拍摄照片;
第二获取子模块,配置为根据所拍摄的照片,获取第一用户的眼纹录入数据和活体录入数据。
本公开实施例提供的身份验证装置是应用上述方法的装置,即上述方法的所有实施例均适用于该装置,且均能达到相同或相似的有益效果。
本公开实施例还提供了一种计算机可读存储介质,存储有计算机可执行指令,所述计算机可执行指令被处理器执行时实现以上描述的方法。
本领域普通技术人员可以理解,上文中所公开方法中的全部或某些步骤、***、装置中的功能模块/单元可以被实施为软件、固件、硬件及其适当的组合。在硬件实施方式中,在以上描述中提及的功能模块/单元之间的划分不一定对应于物理组件的划分;例如,一个物理组件可以具有多个功能,或者一个功能或步骤可以由若干物理组件合作执行。某些组件或所有组件可以被实施为由处理器,如数字信号处理器或微处理器执行的软件,或者被实施为硬件,或者被实施为集成电路,如专用集成电路。这样的软件可以分布在计算机可读介质上,计算机可读介质可以包括计算机存储介质(或非暂时性介质)和通信介质(或暂时性介质)。如本领域普通技术人员公知的,术语计算机存储介质包括在用于存储信息(诸如计算机可读指令、数据结构、程序模块或其他数据)的任何方法或技术中实施的易失性和非易失性、可移除和不可移除介质。计算机存储介质包括但不限于RAM、ROM、EEPROM、闪存或其他存储器技术、CD-ROM、数字多功能盘(DVD)或其他光盘存储、磁盒、磁带、磁盘存储或其他磁存储装置、或者可以用于存储期望的信息并且可以被计算机访问的任何其他的介质。此外,本领域普通技术人员公知的是,通信介质通常包含计算机可读指令、数据结构、程序模块或者诸如载波或其他传输机制之类的调制数据信号中的其他数据,并且可包括任何信息递 送介质。
以上所述是本公开的示例性实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本公开所述原理的前提下,还可以作出若干改进和润饰,这些改进和润饰也应视为本公开的保护范围。
工业实用性
根据本公开的上述实施例,在对用户进行身份验证的过程中,通过获取待验证用户的眼纹验证数据和活体验证数据,在眼球纹理检测的同时进行活体检测,确保所采集的数据为活体数据,避免在身份验证的过程中,通过高清眼球图片实现虚假验证,以提高身份验证的安全性;特别是在电子设备的支付过程中,可有效地保护用户的财产安全;且眼纹模板数据录入的过程中,也加入了活体检测环节,确保眼纹录入的有效性。本公开解决了眼球纹理识别可能会存在误识别的问题。因此本公开具有工业实用性。

Claims (18)

  1. 一种身份验证方法,应用于一终端设备,包括:
    检测到待验证用户的双眼在所述终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取所述待验证用户的眼纹验证数据和活体验证数据,所述眼纹验证数据包括:在所述第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在所述第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;所述活体验证数据包括在所述第一位置上所获取的第一眼部特征数据集以及在至少一个所述第二位置上所获取的所述待验证用户的第二眼部特征数据集;
    根据所述眼纹验证数据和所述活体验证数据,对所述待验证用户进行身份验证。
  2. 根据权利要求1所述的方法,其中,所述根据所述眼纹验证数据和所述活体验证数据,对所述待验证用户进行身份验证的步骤,包括:
    根据所述眼纹验证数据和所述活体验证数据,确定所述眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
    在确定所述眼纹验证数据与所述眼纹模板数据匹配成功时,身份验证通过。
  3. 根据权利要求2所述的方法,其中,所述根据所述眼纹验证数据和所述活体验证数据,确定所述眼纹验证数据为活体眼纹数据的步骤,包括:
    当所述第一眼球纹理数据集与所述第二眼球纹理数据集不相等且交集为非空集,且判断所述第一眼部特征数据集、所述第二眼部特征数据集为活体眼部特征数据时,确定所述眼纹验证数据为活体眼纹数据。
  4. 根据权利要求1所述的方法,其中,所述获取所述待验证用户的眼纹验证数据和活体验证数据的步骤,包括:
    在所述第一扫描框内显示沿所述第一预设方向移动的第一活体检测球,并提示所述待验证用户双眼注视所述第一活体检测球;
    在所述第一活体检测球移动的过程中,为所述待验证用户的双眼拍摄照片;
    根据所拍摄的照片,获取所述待验证用户的眼纹验证数据和活体验证数据。
  5. 根据权利要求2所述的方法,其中,所述确定所述眼纹验证数据与所述眼纹模板数据匹配成功的步骤,包括:
    根据预设的相似度算法,计算所述眼纹模板数据与所述眼纹验证数据的相似度;
    当所述相似度大于或等于预设的相似度阈值时,确定所述眼纹验证数据与所述眼纹模板数据匹配成功。
  6. 根据权利要求1所述的方法,其中,所述方法还包括:
    录入第一用户的眼纹模板数据。
  7. 根据权利要求6所述的方法,其中,所述录入第一用户的眼纹模板数据的步骤,包括:
    检测到所述第一用户双眼在所述终端设备的显示界面上第二扫描框内沿第二预设方向移动时,获取所述第一用户的眼纹录入数据和活体录入数据;所述眼纹录入数据包括:在所述第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在所述第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;所述活体录入数据包括在所述第三位置上所获取的第三眼部特征数据集以及在至少一个第四位置上所获取的所述待验证用户的第四眼部特征数据集;
    根据所述眼纹录入数据和所述活体录入数据,确定所述眼纹录入数据为活体眼纹数据时,将所述眼纹录入数据存储为所述第一用户的眼纹模板数据。
  8. 根据权利要求7所述的方法,其中,所述根据所述眼纹录入数据和所述活体录入数据,确定所述眼纹录入数据为活体眼纹数据的步骤,包括:
    当所述第三眼球纹理数据集与所述第四眼球纹理数据集不相等且交集为非空集,且判断所述第三眼部特征数据集、所述第四眼部特征数据集为活体眼部特征数据时,确定所述眼纹录入数据为活体眼纹数据。
  9. 根据权利要求7所述的方法,其中,所述获取所述第一用户的眼纹录入数据和活体录入数据的步骤,包括:
    在所述第二扫描框内显示沿所述第二预设方向移动的第二活体检测球,并提示所述第一用户双眼注视所述第二活体检测球;
    在所述第二活体检测球移动的过程中,为所述第一用户双眼拍摄照片;
    根据所拍摄的照片,获取所述第一用户的眼纹录入数据和活体录入数据。
  10. 一种身份验证装置,应用于一终端设备,包括:
    检测模块(701),配置为检测到待验证用户的双眼在所述终端设备的显示界面上第一扫描框内沿第一预设方向移动时,获取所述待验证用户的眼纹验证数据和活体验证数据,所述眼纹验证数据包括:在所述第一扫描框内的第一位置上所获取的第一眼球纹理数据集和在所述第一扫描框内的至少一个第二位置上所获取的第二眼球纹理数据集;所述活体验证数据包括在所述第一位置上所获取的第一眼部特征数据集以及在至少一个所述第二位置上所获取的所述待验证用户的第二眼部特征数据集;
    验证模块(702),配置为根据所述眼纹验证数据和所述活体验证数据,对所述待验证用户进行身份验证。
  11. 根据权利要求10所述的装置,其中,所述验证模块(702)包括:
    获取子模块,配置为根据所述眼纹验证数据和所述活体验证数据,确定所述眼纹验证数据为活体眼纹数据时,获取预先存储的眼纹模板数据;
    验证子模块,配置为在确定所述眼纹验证数据与所述眼纹模板数据匹配成功时,身份验证通过。
  12. 根据权利要求11所述的装置,其中,所述获取子模块配置为:
    当所述第一眼球纹理数据集与所述第二眼球纹理数据集不相等且交集为非空集,且根据判断所述第一眼部特征数据集、所述第二眼部特征数据集为活体眼部特征数据时,确定所述眼纹验证数据为活体眼纹数据。
  13. 根据权利要求10所述的装置,其中,所述检测模块(701)包括:
    第一显示子模块,配置为在所述第一扫描框内显示沿所述第一预设方向移动的第一活体检测球,并提示所述待验证用户双眼注视所述第一活体检测球;
    第一拍摄子模块,配置为在所述第一活体检测球移动的过程中,为所述 待验证用户的双眼拍摄照片;
    第一获取子模块,配置为根据所拍摄的照片,获取所述待验证用户的眼纹验证数据和活体验证数据。
  14. 根据权利要求11所述的装置,其中,所述验证子模块包括:
    计算单元,配置为根据预设的相似度算法,计算所述眼纹模板数据与所述眼纹验证数据的相似度;
    匹配单元,配置为当所述相似度大于或等于预设的相似度阈值时,确定所述眼纹验证数据与所述眼纹模板数据匹配成功。
  15. 根据权利要求10所述的装置,其中,所述装置还包括:
    模板录入模块,配置为录入第一用户的眼纹模板数据。
  16. 根据权利要求15所述的装置,其中,所述模板录入模块包括:
    录入检测子模块,配置为检测到所述第一用户双眼在所述终端设备的显示界面上第二扫描框内沿第二预设方向移动时,获取所述第一用户的眼纹录入数据和活体录入数据;所述眼纹录入数据包括:在所述第二扫描框内的第三位置上所获取的第三眼球纹理数据集和在所述第二扫描框内的至少一个第四位置上所获取的第四眼球纹理数据集;所述活体录入数据包括在所述第三位置上所获取的第三眼部特征数据集以及在至少一个第四位置上所获取的所述待验证用户的第四眼部特征数据集;
    模板存储子模块,配置为根据所述眼纹录入数据和所述活体录入数据,确定所述眼纹录入数据为活体眼纹数据时,将所述眼纹录入数据存储为所述第一用户的眼纹模板数据。
  17. 根据权利要求16所述的装置,其中,所述模板存储子模块配置为:
    当所述第三眼球纹理数据集与所述第四眼球纹理数据集不相等且交集为非空集,且判断所述第三眼部特征数据集、所述第四眼部特征数据集为活体眼部特征数据时,确定所述眼纹录入数据为活体眼纹数据。
  18. 根据权利要求16所述的装置,其中,所述录入检测子模块包括:
    第二显示子模块,配置为在所述第二扫描框内显示沿所述第二预设方向移动的第二活体检测球,并提示所述第一用户双眼注视所述第二活体检测球;
    第二拍摄子模块,配置为在所述第二活体检测球移动的过程中,为所述第一用户双眼拍摄照片;
    第二获取子模块,配置为根据所拍摄的照片,获取所述第一用户的眼纹录入数据和活体录入数据。
PCT/CN2017/115684 2017-01-17 2017-12-12 一种身份验证方法及装置 WO2018133584A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710030837.4A CN108319830A (zh) 2017-01-17 2017-01-17 一种身份验证方法及装置
CN201710030837.4 2017-01-17

Publications (1)

Publication Number Publication Date
WO2018133584A1 true WO2018133584A1 (zh) 2018-07-26

Family

ID=62890672

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/115684 WO2018133584A1 (zh) 2017-01-17 2017-12-12 一种身份验证方法及装置

Country Status (2)

Country Link
CN (1) CN108319830A (zh)
WO (1) WO2018133584A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110460580B (zh) * 2019-07-11 2022-02-22 ***股份有限公司 图像采集装置、服务器及加、解密方法
CN110532957B (zh) * 2019-08-30 2021-05-07 北京市商汤科技开发有限公司 人脸识别方法及装置、电子设备和存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184277A (zh) * 2015-09-29 2015-12-23 杨晴虹 活体人脸识别方法以及装置
CN105184246A (zh) * 2015-08-28 2015-12-23 北京旷视科技有限公司 活体检测方法和活体检测***
WO2016078429A1 (zh) * 2014-11-19 2016-05-26 中兴通讯股份有限公司 一种身份识别的方法和装置
CN106203372A (zh) * 2016-07-19 2016-12-07 奇酷互联网络科技(深圳)有限公司 基于眼睛的活体检测方法、装置和终端设备
CN106203297A (zh) * 2016-06-30 2016-12-07 北京七鑫易维信息技术有限公司 一种身份识别方法及装置

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7327860B2 (en) * 2005-05-04 2008-02-05 West Virginia University Conjunctival scans for personal identification
CN105825102A (zh) * 2015-01-06 2016-08-03 中兴通讯股份有限公司 一种基于眼纹识别的终端解锁方法和装置
CN105824403A (zh) * 2015-09-21 2016-08-03 维沃移动通信有限公司 一种对终端进行操作的方法及终端
CN105174246B (zh) * 2015-10-08 2017-06-27 南京理工大学 一种毫米级多级孔碳球的制备方法
CN105205379A (zh) * 2015-10-28 2015-12-30 广东欧珀移动通信有限公司 一种终端应用的控制方法、装置和终端
CN105472147A (zh) * 2015-11-23 2016-04-06 努比亚技术有限公司 基于眼纹识别的应用锁处理方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016078429A1 (zh) * 2014-11-19 2016-05-26 中兴通讯股份有限公司 一种身份识别的方法和装置
CN105184246A (zh) * 2015-08-28 2015-12-23 北京旷视科技有限公司 活体检测方法和活体检测***
CN105184277A (zh) * 2015-09-29 2015-12-23 杨晴虹 活体人脸识别方法以及装置
CN106203297A (zh) * 2016-06-30 2016-12-07 北京七鑫易维信息技术有限公司 一种身份识别方法及装置
CN106203372A (zh) * 2016-07-19 2016-12-07 奇酷互联网络科技(深圳)有限公司 基于眼睛的活体检测方法、装置和终端设备

Also Published As

Publication number Publication date
CN108319830A (zh) 2018-07-24

Similar Documents

Publication Publication Date Title
JP2020170569A (ja) 認証システム、認証方法及びプログラム
CN110443016B (zh) 信息防泄露方法、电子装置及存储介质
US20180034852A1 (en) Anti-spoofing system and methods useful in conjunction therewith
CN107392137B (zh) 人脸识别方法及装置
JP5076563B2 (ja) 顔照合装置
KR101810190B1 (ko) 얼굴 인식을 이용한 사용자 인증 방법 및 그 장치
JP7269711B2 (ja) 生体認証システム、生体認証方法およびプログラム
CN111144277B (zh) 一种带活体检测功能的人脸验证方法和***
US11682236B2 (en) Iris authentication device, iris authentication method and recording medium
KR101954763B1 (ko) 얼굴 인식 출입 통제 장치 및 이의 동작 방법
CN106022216A (zh) 身份自动识别方法
JP2019197426A (ja) 顔認証装置、顔認証方法および顔認証システム
JP7318833B2 (ja) 画像処理デバイス、画像処理方法、およびプログラム
EP4343689A1 (en) Body part authentication system and authentication method
WO2018133584A1 (zh) 一种身份验证方法及装置
KR101680598B1 (ko) 얼굴유도용 최적 가이드를 제공하는 얼굴인증 처리 장치 및 얼굴인증 처리 방법
JP6446676B1 (ja) 本人認証システム、方法およびプログラム
CN114491128A (zh) 图像数据存储方法、装置及电子设备
US11507646B1 (en) User authentication using video analysis
CN110088765B (zh) 对照装置和对照方法
JP2008000464A (ja) 認証装置および認証方法
CN112069915A (zh) 一种具有人脸识别***的atm机
CN111353388A (zh) 活体检测方法、装置、电子设备及存储介质
KR20210050649A (ko) 모바일 기기의 페이스 인증 방법
KR102072168B1 (ko) 생체 구분이 가능한 홍채 인식 방법 및 이를 이용한 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17892280

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17892280

Country of ref document: EP

Kind code of ref document: A1